Physicists develop technology to transform information from microwaves to optical light

Physicists at the University of Alberta have developed technology that can translate data from microwaves to optical light—an advance that has promising applications in the next generation of super-fast quantum computers and secure fiber-optic telecommunications.

“Many quantum computer technologies work in the microwave regime, while many quantum communications channels, such as fiber and satellite, work with optical light,” explained Lindsay LeBlanc, who holds the Canada Research Chair in Ultracold Gases for Quantum Simulation. “We hope that this platform can be used in the future to transduce quantum signals between these two regimes.”

The new technology works by introducing a coupling between microwave radiation and an atomic gas. The microwaves are then modulated, encoding information into the microwave field. This modulation is passed on to the gas atoms, which are then optically probed, encoding the signal into the light.
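To make that signal chain concrete, here is a toy numerical sketch in Python. The carrier frequencies, the amplitude-modulation scheme, and the assumption that the atoms transfer the modulation envelope perfectly onto the optical probe are all illustrative choices for this sketch, not the Alberta group's actual physical model.

```python
import numpy as np

# Toy illustration of the transduction chain described above.
# Frequencies and the ideal "envelope transfer" step are assumptions.

fs = 1e11                            # sample rate (Hz), resolves the microwave carrier
t = np.arange(0, 2e-6, 1 / fs)

# 1. Information signal: a slow 1 MHz "message" tone.
f_signal = 1e6
message = np.sin(2 * np.pi * f_signal * t)

# 2. Encode it onto a microwave carrier by amplitude modulation.
f_microwave = 6.8e9                  # GHz-scale carrier, illustrative value
microwave = (1 + 0.5 * message) * np.cos(2 * np.pi * f_microwave * t)

# 3. Idealized atomic ensemble: assume the atoms pick up the modulation
#    envelope of the microwave field (perfect, noiseless transfer).
envelope = np.abs(microwave)         # crude envelope proxy
atomic_response = envelope / envelope.max()

# 4. Optical probe: the light leaving the gas carries the same modulation.
#    (The ~100s-of-THz optical carrier itself is not simulated; only the
#    modulation riding on it is tracked.)
optical_signal = atomic_response

# 5. Recover the message from the optical modulation with a simple
#    moving-average low-pass filter and compare it to the input.
kernel = np.ones(200) / 200
recovered = np.convolve(optical_signal, kernel, mode="same")
recovered -= recovered.mean()
corr = np.corrcoef(recovered, message)[0, 1]
print(f"correlation between input and recovered signal: {corr:.2f}")
```

Under these idealized assumptions the recovered waveform tracks the original message almost perfectly; in a real device, transfer efficiency, noise, and preservation of quantum coherence are the hard parts.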

LIVE: Progress cargo spacecraft launches to International Space Station to deliver supplies

Venus in a Minute

Venus could serve as a model for many exoplanets soon to be discovered in the upcoming era of new space telescopes, such as the James Webb Space Telescope.

So how did our sister planet evolve from a past “habitable” state to its present one, and how does that help us understand our own destiny?

Watch for more.

Watch Russia launch a new cargo ship to the International Space Station Thursday

A Russian Soyuz rocket will launch a robotic cargo ship packed with tons of supplies to the International Space Station Thursday (July 29), and you can watch the launch live.

Roscosmos, Russia’s space agency, will launch the uncrewed Progress 76 supply ship to the station at 10:26 a.m. EDT (1426 GMT) from Baikonur Cosmodrome in Kazakhstan, where the local time will be 7:26 p.m. You can watch the launch live here and on the Space.com homepage, courtesy of NASA TV.

Quantum Computing: Looking Ahead To Endless Possibilities

However, to dismiss the subject as fantastical or unnecessary would be akin to telling scientists 100 years ago that landing on the moon was irrelevant.

This is because, for pioneers and champions of artificial intelligence, quantum computing is the holy grail. It’s not a make-believe fantasy; rather, it’s a tangible area of science that will take our probability-driven world into a whole new dimension.

Generative Feedback Explains Distinct Brain Activity Codes for Seen and Mental Images

The relationship between mental imagery and vision is a long-standing problem in neuroscience. Currently, it is not known whether differences between the activity evoked during vision and reinstated during imagery reflect different codes for seen and mental images. To address this problem, we modeled mental imagery in the human brain as feedback in a hierarchical generative network. Such networks synthesize images by feeding abstract representations from higher to lower levels of the network hierarchy. When higher processing levels are less sensitive to stimulus variation than lower processing levels, as in the human brain, activity in low-level visual areas should encode variation in mental images with less precision than seen images. To test this prediction, we conducted an fMRI experiment in which subjects imagined and then viewed hundreds of spatially varying naturalistic stimuli. To analyze these data, we developed imagery-encoding models. These models accurately predicted brain responses to imagined stimuli and enabled accurate decoding of their position and content. They also allowed us to compare, for every voxel, tuning to seen and imagined spatial frequencies, as well as the location and size of receptive fields in visual and imagined space. We confirmed our prediction, showing that, in low-level visual areas, imagined spatial frequencies in individual voxels are reduced relative to seen spatial frequencies and that receptive fields in imagined space are larger than in visual space. These findings reveal distinct codes for seen and mental images and link mental imagery to the computational abilities of generative networks.
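As a rough illustration of the encoding-model idea in this abstract (predicting voxel responses from a receptive field and comparing fitted receptive-field sizes between seen and imagined conditions), here is a minimal Gaussian receptive-field sketch in Python. It is not the authors' imagery-encoding model; the grid resolution, random stand-in stimuli, fixed receptive-field center, and grid-search fit are simplifying assumptions.

```python
import numpy as np

# Minimal Gaussian receptive-field (RF) encoding-model sketch.
# NOT the authors' code; all parameters below are illustrative.

GRID = 32  # visual field discretized to GRID x GRID pixels


def gaussian_rf(x0, y0, size, grid=GRID):
    """2-D Gaussian receptive field centered at (x0, y0) with width `size`."""
    xs = np.linspace(-1, 1, grid)
    X, Y = np.meshgrid(xs, xs)
    rf = np.exp(-((X - x0) ** 2 + (Y - y0) ** 2) / (2 * size ** 2))
    return rf / rf.sum()


def predict_response(stimuli, rf):
    """Predicted voxel response: overlap of each stimulus with the RF."""
    return stimuli.reshape(len(stimuli), -1) @ rf.ravel()


def fit_rf_size(stimuli, responses, x0, y0, sizes=np.linspace(0.05, 0.8, 30)):
    """Grid search over RF size, keeping the RF center fixed for simplicity."""
    resp = responses - responses.mean()
    errs = []
    for s in sizes:
        pred = predict_response(stimuli, gaussian_rf(x0, y0, s))
        pred = pred - pred.mean()
        gain = (pred @ resp) / (pred @ pred)       # least-squares gain
        errs.append(np.sum((resp - gain * pred) ** 2))
    return sizes[int(np.argmin(errs))]


if __name__ == "__main__":
    rng = np.random.default_rng(0)
    stimuli = rng.random((300, GRID, GRID))  # stand-in for naturalistic images

    # Simulate one voxel: same RF center, but a broader RF during imagery,
    # as the abstract reports for low-level visual areas.
    seen_resp = predict_response(stimuli, gaussian_rf(0.2, -0.1, 0.15))
    imag_resp = predict_response(stimuli, gaussian_rf(0.2, -0.1, 0.35))
    seen_resp += 0.01 * rng.standard_normal(len(seen_resp))
    imag_resp += 0.01 * rng.standard_normal(len(imag_resp))

    print("fitted RF size (seen):    ", fit_rf_size(stimuli, seen_resp, 0.2, -0.1))
    print("fitted RF size (imagined):", fit_rf_size(stimuli, imag_resp, 0.2, -0.1))
```

In this toy setup the fitted receptive field comes out larger for the "imagined" responses than for the "seen" ones, mirroring the direction of the effect the abstract reports, though the real models also fit RF position, spatial-frequency tuning, and stimulus content.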

Keywords: encoding models; fMRI; generative network; mental imagery; receptive fields; vision.

