
Photonic computers: The future of computing is… analogue

The future is optical. Photonic processors promise blazing fast calculation speeds with much lower power demands, and they could revolutionise machine learning.

Photonic computing is, as the name suggests, computing in which optical light pulses rather than electrical transistors form the basis of the logic gates. If it can be made to work in such a way that processors can be mass-produced at a practical size, it has the potential to revolutionise machine learning and other specific types of computing task. The emphasis is on the word if. However, there are some intriguing-sounding products close to coming to market that could change things drastically.

The idea behind photonic computers is not a new one; optical matrix multiplication was first demonstrated in the 1970s, but nobody has yet solved the many roadblocks to making it work at a practical level that can be integrated as easily as transistor-based systems. Using photons is an obvious choice for speeding things up. After all, all new homes in the UK are built with fibre to the home for a reason: fibre-optic cables are superior to aluminium or copper wires for the modern world of digital data communication, transmitting more information, faster, and over longer distances without signal degradation. However, transmitting data from A to B is a whole different kettle of fish from putting such optical pipelines onto a chip in a way that allows matrix processing, even though some data centres already use optical cables for faster internal data transfer over short distances.
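
To make the machine-learning link concrete, the workload photonic chips are chiefly aimed at is ordinary matrix multiplication, the operation that dominates neural-network layers. Below is a rough sketch in plain NumPy; the sizes are illustrative and nothing here is specific to any particular photonic product.

```python
# A minimal sketch of the operation photonic processors aim to accelerate:
# the matrix-vector multiplications at the heart of a neural-network layer.
# Sizes and values are illustrative, not taken from the article.
import numpy as np

rng = np.random.default_rng(0)

weights = rng.standard_normal((512, 784))   # one dense layer's weight matrix
activations = rng.standard_normal(784)      # incoming activations

# On an electronic chip this is roughly 512 x 784 multiply-accumulate
# operations carried out by transistor-based logic.
output = weights @ activations

# A photonic processor would encode the same numbers in properties of light
# (such as intensity or phase) and let interference perform the
# multiply-accumulates in the analogue optical domain as the light propagates.
print(output.shape)  # (512,)
```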

From 3 Day Weekends Through To A Life Of Leisure. The Future Is Looking Good

From 3 day weekends to a future where all the jobs you do not want to do are automated, leaving you to spend your time as you desire.
Learning and research.
Sport and recreation.
Tourism and adventure.
Or whatever takes your fancy…

Well, 4-day working weeks are already arriving, and I then show the rate of breakthroughs in Artificial Intelligence in just the last decade, which are removing the repetition and boredom from our jobs so we can spend more time on the bits that matter and that are of interest.

What do you think?

Let us know your thoughts in the comments.

If you want to see more about the decade of disruption that is heading our way, then check out this next…


Amazon Patented a New Delivery System That Could Have Your Block Crawling With Robots

Last week, Amazon patented a delivery system involving self-driving trucks carrying several small robots that deliver packages to homes.


Once all the small delivery bots are back on board, the truck (which would have a human driver in the near future but likely be autonomous in the less-near future) drives off to the next block—its fleet of mini-me’s restocking with new packages en route—and the scene repeats itself.

Cool/creepy? Good/bad? Depends on your perspective. On the one hand, employing fewer humans would bring Amazon more cost savings in the long run, which it would ideally pass on to customers and re-invest in other parts of the business, leading to hiring more people in a virtuous circle.

But on the other hand, it’s not hard to imagine the secondary vehicles going awry; there would be plenty of obstacles for them to get around (dogs, bikes, sprinklers, and children are just a few that come to mind), and given how hard it’s been to bring self-driving cars to market, Amazon may be underestimating the challenge of maneuvering the small delivery vehicles even 100 feet from truck to doorstep.

Lambda raises $24.5M for AI-optimized hardware infrastructure

Lambda, an AI infrastructure company, this week announced it raised $15 million in a venture funding round from 1517, Gradient Ventures, Razer, Bloomberg Beta, Georges Harik, and others, plus a $9.5 million debt facility. The $24.5 million investment brings the company’s total raised to $28.5 million, following an earlier $4 million seed tranche.

In 2013, San Francisco, California-based Lambda controversially launched a facial recognition API for developers working on apps for Google Glass, Google’s ill-fated heads-up augmented reality display. The API — which soon expanded to other platforms — enabled apps to do things like “remember this face” and “find your friends in a crowd,” Lambda CEO Stephen Balaban told TechCrunch at the time. The API has been used by thousands of developers and was, at least at one point, seeing over 5 million API calls per month.

Since then, however, Lambda has pivoted to selling hardware systems designed for AI, machine learning, and deep learning applications. Among these are the TensorBook, a laptop with a dedicated GPU, and a workstation product with up to four desktop-class GPUs for AI training. Lambda also offers servers, including one designed to be shared between teams and a server cluster, called Echelon, that Balaban describes as “datacenter-scale.”

Cruise past Jupiter and Ganymede in a new Juno animation: the imagery was captured last month

A dazzling new animation puts you aboard NASA’s robotic Juno spacecraft during its epic flybys last month of Jupiter and the huge moon Ganymede.

On June 7, Juno zoomed within just 645 miles (1038 kilometers) of Ganymede, the largest moon in the solar system. It was the closest a probe had gotten to the icy, heavily cratered world since May 2000, when NASA’s Galileo spacecraft flew by at a distance of about 620 miles (1000 km).

World’s first 3D-printed steel bridge opens in Amsterdam

The first ever 3D-printed steel bridge has opened in Amsterdam, the Netherlands. It was created by robotic arms using welding torches to deposit the structure of the bridge layer by layer, and is made of 4500 kilograms of stainless steel.

The 12-metre-long MX3D Bridge was built by four commercially available industrial robots and took six months to print. The structure was transported to its location over the Oudezijds Achterburgwal canal in central Amsterdam last week and is now open to pedestrians and cyclists.

More than a dozen sensors attached to the bridge after the printing was completed will monitor strain, movement, vibration and temperature across the structure as people pass over it and the weather changes. This data will be fed into a digital model of the bridge.
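
As a rough illustration of what feeding that sensor data into a digital model might look like, here is a hypothetical sketch; the class names, sensor identifiers and units are assumptions made for illustration, not details of the MX3D system.

```python
# A hypothetical sketch of collecting bridge sensor readings into a digital
# model ("digital twin"). All names and values here are illustrative.
from dataclasses import dataclass, field
from typing import List, Optional


@dataclass
class SensorReading:
    sensor_id: str
    quantity: str      # e.g. "strain", "vibration", "temperature"
    value: float
    timestamp: float   # seconds since the epoch


@dataclass
class BridgeDigitalModel:
    readings: List[SensorReading] = field(default_factory=list)

    def ingest(self, reading: SensorReading) -> None:
        # Store a new reading so the model reflects the latest measured state.
        self.readings.append(reading)

    def latest(self, quantity: str) -> Optional[SensorReading]:
        # Return the most recent reading of a given kind, if any exists.
        matching = [r for r in self.readings if r.quantity == quantity]
        return max(matching, key=lambda r: r.timestamp) if matching else None


model = BridgeDigitalModel()
model.ingest(SensorReading("strain-03", "strain", 1.2e-4, 1626300000.0))
print(model.latest("strain"))
```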

New artificial intelligence software can compute protein structures in 10 minutes

Scientists have waited months for access to highly accurate protein structure prediction since DeepMind presented remarkable progress in this area at the 2020 Critical Assessment of Structure Prediction, or CASP14, conference. The wait is now over.

Researchers at the Institute for Protein Design at the University of Washington School of Medicine in Seattle have largely recreated the performance achieved by DeepMind on this important task. These results will be published online by the journal Science on Thursday, July 15.

Unlike DeepMind, the UW Medicine team’s method, which they dubbed RoseTTAFold, is freely available. Scientists from around the world are now using it to build models to accelerate their own research. Since July, the program has been downloaded from GitHub by over 140 independent research teams.

Adding logical qubits to Sycamore quantum computer reduces error rate

The Google Quantum AI team has found that adding physical qubits to the logical qubits on the company’s quantum computer reduced the logical qubit error rate exponentially. In their paper published in the journal Nature, the group describes their work with logical qubits as an error-correction technique and outlines what they have learned so far.

One of the hurdles standing in the way of usable quantum computers is figuring out how to either prevent errors from occurring or fix them before they are used as part of a computation. On traditional computers the problem is mostly solved by adding a parity bit, but that approach will not work with quantum computers because of the different nature of qubits: attempts to measure them destroy the data. Prior research has suggested that one possible solution is to group physical qubits into clusters called logical qubits. In this new effort, the team at Google Quantum AI tested this idea on Google’s Sycamore quantum computer.
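
For comparison, the classical parity-bit trick mentioned above takes only a few lines; the data word below is an arbitrary example, not anything from the paper.

```python
# A minimal sketch of the classical parity-bit idea: append one extra bit so
# that any single bit flip can be detected when the parity is re-checked.
def parity(bits):
    # Return 0 if the number of 1s is even, 1 if it is odd.
    return sum(bits) % 2

data = [1, 0, 1, 1, 0, 1, 0]
stored = data + [parity(data)]   # append the parity bit before storing

# A single bit flip during storage or transmission...
corrupted = stored.copy()
corrupted[2] ^= 1

# ...is detected because the overall parity no longer comes out even.
print(parity(stored) == 0)       # True: consistent
print(parity(corrupted) == 0)    # False: an error occurred
```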

Sycamore works with 54 physical qubits. In their work, the researchers created logical qubits of different sizes, ranging from 5 to 21 physical qubits, to see how each would perform. In doing so, they found that adding qubits reduced error rates exponentially. They were able to measure the extra qubits in a way that did not involve collapsing their state but still provided enough information for them to be used for computations.
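
Classically, the grouping described here behaves like a repetition code: one logical bit is stored across several physical bits and recovered by majority vote. The sketch below is only that classical analogy, with a made-up physical error rate, not the stabiliser measurements Google used on Sycamore, but it shows why the logical error rate should fall rapidly as more physical (qu)bits are added.

```python
# Classical repetition-code analogy for logical qubits: store one logical bit
# in n physical copies, flip each copy with some probability, then recover the
# logical bit by majority vote. The flip probability is an arbitrary example.
import random

def encode(bit, n):
    # Store one logical bit in n physical copies.
    return [bit] * n

def noisy(copies, flip_prob):
    # Independently flip each physical bit with probability flip_prob.
    return [b ^ (random.random() < flip_prob) for b in copies]

def decode(copies):
    # Recover the logical bit by majority vote.
    return int(sum(copies) > len(copies) / 2)

random.seed(0)
flip_prob = 0.2        # illustrative physical error rate
trials = 100_000
for n in (5, 9, 21):   # sizes in the range Google tested (5 to 21 qubits)
    failures = sum(decode(noisy(encode(0, n), flip_prob)) != 0
                   for _ in range(trials))
    print(f"n={n:2d}  logical error rate ~ {failures / trials:.5f}")
```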