For decades, one material has so dominated the production of computer chips and transistors that the tech capital of the world—Silicon Valley—bears its name. But silicon’s reign may not last forever.

MIT researchers have found that an alloy called InGaAs (indium gallium arsenide) could enable smaller and more energy-efficient transistors. Previously, researchers thought that the performance of InGaAs transistors deteriorated at small scales. But the new study shows this apparent deterioration is not an intrinsic property of the material itself.

The finding could one day help push computing power and efficiency beyond what’s possible with silicon. “We’re really excited,” said Xiaowei Cai, the study’s lead author. “We hope this result will encourage the community to continue exploring the use of InGaAs as a channel material for transistors.”

Researchers have found a way to protect highly fragile quantum systems from noise, which could aid in the design and development of new quantum devices, such as ultra-powerful quantum computers.

The researchers, from the University of Cambridge, have shown that microscopic particles can remain intrinsically linked, or entangled, over long distances even if there are random disruptions between them. Using the mathematics of quantum theory, they discovered a simple setup where entangled particles can be prepared and stabilized even in the presence of noise by taking advantage of a previously unknown symmetry in quantum systems.

Their results, reported in the journal Physical Review Letters, open a new window into the mysterious quantum world that could revolutionize future technology by preserving entanglement in the presence of noise, which is the single biggest hurdle for developing such technology. Harnessing this capability will be at the heart of ultrafast quantum computers.

Quantum computing startup IonQ today announced its road map for the next few years — following a similar move from IBM in September — and it’s quite ambitious, to say the least.

At our Disrupt event earlier this year, IonQ CEO and president Peter Chapman suggested that we were only five years away from having desktop quantum computers. That’s not something you’ll likely hear from the company’s competitors — who also often use a very different kind of quantum technology — but IonQ now says that it will be able to sell modular, rack-mounted quantum computers for the data center in 2023 and that by 2025, its systems will be powerful enough to achieve broad quantum advantage across a wide variety of use cases.

In an interview ahead of today’s announcement, Chapman showed me a prototype of the hardware the company is working on for 2021, which fits on a workbench. The actual quantum chip is currently the size of a half-dollar and the company is now working on essentially putting the core of its technology on a single chip, with all of the optics that make its system work integrated.

For the longest time, Google’s new Fuchsia operating system remained a bit of a mystery — with little information in terms of the company’s plans for it, even as the team behind it brought the code to GitHub under a standard open-source license. These days, we know that it’s Google’s first attempt at developing a completely new kernel and general purpose operating system that promises to be more than just an experiment (or a retention project to keep senior engineers from jumping ship). For the most part, though, Google has remained pretty mum about the subject.

It seems like Google is ready to start talking about Fuchsia a bit more now. The company today announced that it is expanding the Fuchsia open-source community and opening it up to contributions from the public. Typically, companies start opening up their open-source projects to outside contributors once they feel they have achieved a stable foundation that others can build on.

“Starting today, we are expanding Fuchsia’s open source model to make it easier for the public to engage with the project,” the team writes. “We have created new public mailing lists for project discussions, added a governance model to clarify how strategic decisions are made, and opened up the issue tracker for public contributors to see what’s being worked on. As an open source effort, we welcome high-quality, well-tested contributions from all. There is now a process to become a member to submit patches, or a committer with full write access.”

WASHINGTON: Scientists are a step closer to restoring vision for the blind, after building an implant that bypasses the eyes and allows monkeys to perceive artificially induced patterns in their brains.

The technology, developed by a team at the Netherlands Institute for Neuroscience (NIN), was described in the journal Science on Thursday.

It builds on an idea first conceived decades ago: electrically stimulating the brain so it “sees” lit dots known as phosphenes, akin to pixels on a computer screen.

It’s been four years, and we still don’t really know what Google intends to do with this OS.
It’s been over four years since we first found out that Google is developing a new operating system called Fuchsia. It’s unique because it’s not based on a Linux kernel; instead, it uses a microkernel called Zircon. It’s also unique because, despite being developed “in the open” on publicly browsable repositories, nobody really understands what the OS is for, and Google executives have been remarkably coy about it all.

Today, that mix of trends continues as the company announces that it’s opening up a little more by asking for more public contributors from outside its organization. Google says it has “created new public mailing lists for project discussions, added a governance model to clarify how strategic decisions are made, and opened up the issue tracker for public contributors to see what’s being worked on.”

It’s been a while since we’ve seen a dive into the code and documentation Google has made available, though there are some early UI examples. Google’s post today emphasizes that “Fuchsia is not ready for general product development or as a development target,” but it’s likely that the announcement will spur another round of analysis.

What a tough time for Cyberpunk 2077 to be launching. The newest graphics cards are unavailable unless you’re willing to overpay a scalper, and older GPUs are also hard to buy at a reasonable price because of the shortage of new ones.

The good news is that the official Cyberpunk 2077 minimum specifications are surprisingly modest, especially if you’re OK with playing at 1080p. If you want to slide everything to high at that resolution, then you’re looking at a Core i7 4790 or AMD Ryzen 3 3200G, with a GeForce GTX 1060/1660 Super or Radeon RX 470, and 12GB of RAM. That’s really not too demanding, especially from the processor perspective.

Columbia team discovers 6-nanometer-long single-molecule circuit with enormous on/off ratio due to quantum interference; finding could enable faster, smaller, and more energy-efficient devices.

Researchers, led by Columbia Engineering Professor Latha Venkataraman, report today that they have discovered a new chemical design principle for exploiting destructive quantum interference. They used their approach to create a six-nanometer single-molecule switch where the on-state current is more than 10,000 times greater than the off-state current, the largest change in current achieved for a single-molecule circuit to date.

This new switch relies on a type of quantum interference that has not, up to now, been explored. The researchers used long molecules with a special central unit to enhance destructive quantum interference between different electronic energy levels. They demonstrated that their approach can be used to produce very stable and reproducible single-molecule switches at room temperature that can carry currents exceeding 0.1 microamps in the on-state. The length of the switch is similar to the size of the smallest computer chips currently on the market and its properties approach those of commercial switches. The study is published today in Nature Nanotechnology.
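The reported figures permit a quick back-of-the-envelope check of what the off-state current must be. A minimal sketch in Python, using only the numbers quoted above (and treating the result as an upper bound, since the ratio is described as "more than 10,000"):

```python
# Figures reported for the single-molecule switch
on_state_current = 0.1e-6   # amps: on-state currents exceeding 0.1 microamps
on_off_ratio = 10_000       # on-state current is >10,000x the off-state current

# Implied upper bound on the off-state current
off_state_current = on_state_current / on_off_ratio

print(f"Off-state current <= {off_state_current:.1e} A")
# i.e. on the order of 10 picoamps or less
```

This is why the on/off ratio matters for switching devices: an off-state leakage in the picoamp range is what makes a switch useful as a low-power logic element.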