
Optical Computing: Solving Problems at the Speed of Light

Optical computing, which uses photons instead of electrons, has been one of the great promises of computing for decades.


According to Moore’s law —actually more like a forecast, formulated in 1965 by Intel co-founder Gordon Moore— the number of transistors in a microprocessor doubles about every two years, boosting the power of the chips without increasing their energy consumption. For half a century, Moore’s prescient vision has presided over the spectacular progress made in the world of computing. However, in 2015 the engineer himself predicted that current technology was reaching a saturation point. Today, quantum computing holds out hope for a new technological leap, but there is another option on which many are pinning their hopes: optical computing, which replaces electronics (electrons) with light (photons).

The end of Moore’s law is a natural consequence of physics: to pack more transistors into the same space they have to be shrunk down, which increases their speed while simultaneously reducing their energy consumption. The miniaturisation of silicon transistors has succeeded in breaking the 7-nanometre barrier, which used to be considered the limit, but this reduction cannot continue indefinitely. And although more powerful systems can always be obtained by increasing the number of transistors, doing so reduces processing speed and raises the heat generated by the chips.
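
For a rough feel for what “doubles about every two years” implies, here is a minimal sketch in C that simply compounds the rule over five decades. The 1971 starting point (the Intel 4004’s roughly 2,300 transistors) is a reference chosen for illustration, not a figure from the article.

/* Illustrative only: projects transistor counts under an idealised
 * "double every two years" rule, starting from the Intel 4004 (1971).
 * Ten years = five doublings = a factor of 32. */
#include <stdio.h>

int main(void) {
    double transistors = 2300.0;                      /* Intel 4004, 1971 */
    for (int year = 1971; year <= 2021; year += 10) {
        printf("%d: ~%.0f transistors\n", year, transistors);
        transistors *= 32.0;                          /* five doublings per decade */
    }
    return 0;
}

Compounded over fifty years, the rule lands in the tens of billions of transistors per chip, which is roughly where today’s largest processors sit.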

The hybridization of electronics and optics

Hence the promise of optical computing: photons move at the speed of light, faster than electrons in a wire. Optical technology is also not a newcomer to our lives: the vast global traffic on the information highways today travels on fibre optic channels, and for years we have used optical readers to burn and read our CDs, DVDs and Blu-Ray discs. However, in the guts of our systems, the photons coming through the fibre optic cable must be converted into electrons in the microchips, and in turn these electrons must be converted to photons in the optical readers, slowing down the process.

Extreme events in quantum cascade lasers

Extreme events occur in many observable contexts. Nature is a prolific source: rogue water waves surging high above the swell, monsoon rains, wildfires, etc. From climate science to optics, physicists have classified the characteristics of extreme events, extending the notion to their respective domains of expertise. For instance, extreme events can take place in telecommunication data streams. In fiber-optic communications, where a vast number of spatio-temporal fluctuations can occur in transoceanic systems, a sudden surge is an extreme event that must be suppressed, as it can potentially alter components associated with the physical layer or disrupt the transmission of private messages.
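
The summary does not give a technical definition, but a common working criterion, borrowed from oceanography, flags an event as “extreme” or “rogue” when it exceeds roughly twice the significant height, i.e. the mean of the largest third of recorded events. The C sketch below applies that screening to a series of pulse heights; the sample values and the factor of two are illustrative assumptions, not numbers from the article.

/* Minimal rogue-event screening sketch: flag any sample that exceeds
 * twice the "significant height" (mean of the largest third of samples).
 * The data below are made up purely for illustration. */
#include <stdio.h>
#include <stdlib.h>

static int cmp_desc(const void *a, const void *b) {
    double x = *(const double *)a, y = *(const double *)b;
    return (x < y) - (x > y);                 /* sort in descending order */
}

int main(void) {
    double pulses[] = {1.1, 0.9, 1.3, 1.0, 6.2, 1.2, 0.8, 1.4, 1.1, 0.95};
    size_t n = sizeof pulses / sizeof pulses[0];

    double sorted[sizeof pulses / sizeof pulses[0]];
    for (size_t i = 0; i < n; i++) sorted[i] = pulses[i];
    qsort(sorted, n, sizeof(double), cmp_desc);

    size_t top = n / 3 ? n / 3 : 1;           /* largest third of the record */
    double significant = 0.0;
    for (size_t i = 0; i < top; i++) significant += sorted[i];
    significant /= (double)top;

    double threshold = 2.0 * significant;
    for (size_t i = 0; i < n; i++)
        if (pulses[i] > threshold)
            printf("extreme event at sample %zu: %.2f (threshold %.2f)\n",
                   i, pulses[i], threshold);
    return 0;
}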

Recently, extreme events have been observed in quantum cascade lasers, as reported by researchers from Télécom Paris (France) in collaboration with UC Los Angeles (USA) and TU Darmstadt (Germany). The giant pulses that characterize these extreme events can contribute the sudden, sharp bursts necessary for communication in neuromorphic systems inspired by the brain’s powerful computational abilities. Based on a quantum cascade laser (QCL) emitting mid-infrared light, the researchers developed a basic optical neuron system operating 10,000× faster than biological neurons. Their report is published in Advanced Photonics.
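
The following is not the researchers’ QCL model, only a generic leaky integrate-and-fire toy included to show how an excitable element emits the kind of sharp pulse the article describes once its input crosses a threshold. The 100-nanosecond time constant is an assumed value, chosen to represent an element roughly 10,000× faster than a millisecond-scale biological neuron.

/* Toy leaky integrate-and-fire neuron -- NOT the QCL system from the paper.
 * The state relaxes with a 100 ns time constant and fires a sharp pulse
 * (then resets) whenever it crosses the threshold. */
#include <stdio.h>

int main(void) {
    const double tau = 100e-9;                /* relaxation time: 100 ns  */
    const double dt = 1e-9;                   /* integration step: 1 ns   */
    const double threshold = 1.0;             /* firing threshold (a.u.)  */
    double v = 0.0;                           /* internal state           */

    for (int step = 0; step < 1000; step++) {
        /* weak constant drive, with a strong kick between 300 and 320 ns */
        double input = (step >= 300 && step < 320) ? 8.0 : 0.5;
        v += dt * (input - v) / tau;          /* dv/dt = (input - v)/tau  */
        if (v >= threshold) {
            printf("spike at t = %d ns\n", step);
            v = 0.0;                          /* reset after the pulse    */
        }
    }
    return 0;
}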

New Chrome 0-day Under Active Attacks – Update Your Browser Now

Attention readers: if you are using the Google Chrome browser on your Windows, Mac, or Linux computer, you need to update your web browsing software immediately to the latest version, which Google released earlier today.

Google released Chrome version 86.0.4240.111 today to patch several high-severity security issues, including a zero-day vulnerability that has been exploited in the wild by attackers to hijack targeted computers.

Tracked as CVE-2020-15999, the actively exploited vulnerability is a heap buffer overflow, a type of memory-corruption flaw, in FreeType, a popular open-source software development library for rendering fonts that comes packaged with Chrome.
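
For readers unfamiliar with the flaw class: a heap buffer overflow occurs when code writes past the end of a heap allocation and corrupts whatever the memory allocator placed next to it. The C fragment below is a deliberately broken, generic illustration of that pattern, not the actual FreeType bug behind CVE-2020-15999; running it under a tool such as AddressSanitizer reports the out-of-bounds write.

/* Generic heap-buffer-overflow illustration -- NOT the FreeType bug itself. */
#include <stdlib.h>
#include <string.h>

int main(void) {
    char *glyph = malloc(16);                 /* 16 bytes reserved on the heap */
    if (glyph == NULL) return 1;

    const char *untrusted = "AAAAAAAAAAAAAAAAAAAAAAAAAAAAAAAA";   /* 32 bytes */

    /* BUG: copies 32 bytes plus a terminator into a 16-byte buffer.
     * The extra bytes overwrite adjacent heap data, which attackers can
     * often turn into memory corruption and code execution. A safe copy
     * would bound the write, e.g. snprintf(glyph, 16, "%s", untrusted). */
    strcpy(glyph, untrusted);

    free(glyph);
    return 0;
}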

Scientists borrow solar panel tech to create new ultrahigh-res OLED display

Ultra-high-res displays for gadgets and TV sets may be coming. 😃


By expanding on existing designs for electrodes of ultra-thin solar panels, Stanford researchers and collaborators in Korea have developed a new architecture for OLED—organic light-emitting diode—displays that could enable televisions, smartphones and virtual or augmented reality devices with resolutions of up to 10,000 pixels per inch (PPI). (For comparison, the resolutions of new smartphones are around 400 to 500 PPI.)
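
The density figures translate directly into pixel size: one inch is 25,400 micrometres, so dividing by the pixels per inch gives the pixel pitch. A quick C calculation (using just the PPI values quoted above) makes the comparison concrete.

/* Pixel pitch in micrometres for a given pixel density (PPI). */
#include <stdio.h>

static double pitch_um(double ppi) {
    return 25400.0 / ppi;                     /* 1 inch = 25,400 micrometres */
}

int main(void) {
    const double examples[] = {400.0, 500.0, 10000.0};
    for (int i = 0; i < 3; i++)
        printf("%6.0f PPI -> %6.2f um per pixel\n",
               examples[i], pitch_um(examples[i]));
    return 0;
}

At 400 to 500 PPI each pixel is roughly 51 to 64 micrometres across; at 10,000 PPI it shrinks to about 2.5 micrometres.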

Such high-pixel-density displays will be able to provide stunning images with true-to-life detail—something that will be even more important for headset displays designed to sit just centimeters from our faces.

The advance is based on research by Stanford University materials scientist Mark Brongersma in collaboration with the Samsung Advanced Institute of Technology (SAIT). Brongersma was initially put on this research path because he wanted to create an ultra-thin solar panel design.

Free Brain Computer Interfaces? Kernel Livestream Supercut

Applications are available for 10 free brain-computer interfaces from Kernel! Noninvasive, helmet-like design using near-infrared light.


Han from WrySci HX puts together a supercut from the Kernel Livestream. Find out how it works, what you can use it for and how to apply for a chance at a free brain computer interface. More below ↓↓↓


Biomimicry has positive impact on planet, says architect Michael Pawlyn

In the second video of our Design for Life collaboration with Dassault Systèmes, Exploration Architecture founder Michael Pawlyn explains how computational design tools allow architects to mimic the natural world.

Pawlyn is the second designer to feature in the Design for Life collaboration between Dezeen and Dassault Systèmes, which highlights designers who are using technology and research to build a better world.

“Biomimicry is innovation inspired by nature,” explained Pawlyn, founder of the biomimicry-focussed practice Exploration Architecture, in the video, which Dezeen filmed at his home studio in London.

An integrated circuit of pure magnons

Researchers led by Technische Universität Kaiserslautern (TUK) and the University of Vienna have successfully constructed a basic building block of computer circuits, using magnons in place of electrons to convey information. The ‘magnonic half-adder’, described in Nature Electronics, requires just three nanowires and far less energy than the latest computer chips.
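
For reference, a half-adder is the smallest useful arithmetic circuit: it adds two one-bit numbers, producing a sum bit (the XOR of the inputs) and a carry bit (their AND). The short C snippet below prints that truth table; it captures only the logical function, not how the magnonic device realises it in nanowires.

/* Truth table of a half-adder: sum = A XOR B, carry = A AND B. */
#include <stdio.h>

int main(void) {
    printf(" A B | sum carry\n");
    for (int a = 0; a <= 1; a++)
        for (int b = 0; b <= 1; b++)
            printf(" %d %d |  %d    %d\n", a, b, a ^ b, a & b);
    return 0;
}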

A team of physicists has marked a milestone in the quest for smaller and more energy-efficient computing: they have developed an integrated circuit that uses magnetic material and magnons to transmit binary data, the 1s and 0s that form the foundation of today’s computers and smartphones.

The new circuit is extremely tiny, with a streamlined, 2-D design that requires about 10 times less energy than the most advanced computer chips available today, which use CMOS technology. While the current magnon configuration is not as fast as CMOS, the successful demonstration can now be explored further for other applications, such as quantum or neuromorphic computing.
