
Truthfully, it has been some time since Moore's law, the observation that processors double in transistor count roughly every two years, has held entirely true. The fundamental properties of silicon are beginning to limit development and will significantly curtail future performance gains. With 50 years and billions of dollars invested, it seems preposterous that any 'beyond-silicon' technology could power the computers of tomorrow. And yet Nano (formally RV16X-NANO) might do just that: it can be designed and built like a regular silicon wafer, while using carbon to achieve a theoretical threefold performance gain at one-third the power.

Nano began life much like any processor: a 150 mm wafer with a pattern carved out of it by a regular chip fab. Dipped into a solution of carbon nanotubes tangled together like microscopic spaghetti, it re-emerged with semiconducting carbon nanotubes stuck to the pattern of transistors and logic gates already etched on it. It then underwent a process called 'RINSE' (removal of incubated nanotubes through selective exfoliation), in which the wafer is coated with a polymer and then dipped in a solvent. This reduces the CNT layer to a single tube in thickness, removing the large clumps of CNTs that stick together over 250 times more effectively than previous methods.

One of the challenges facing CNT processors has been the difficulty of fabricating both N-type and P-type transistors, which switch "on" for a 1 bit and "off" for a 0 bit, and the reverse, respectively. The distinction is essential for binary computing, and to perfect it the researchers introduced 'MIXED' (metal interface engineering crossed with electrostatic doping). After RINSE, small platinum or titanium contacts are added to each transistor, and the wafer is then coated in an oxide that acts as a sealant and improves performance. After that, Nano was just about done.
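Why both transistor types matter can be seen in a CMOS inverter, the basic building block of binary logic: the two types respond to a gate voltage in opposite ways, so exactly one conducts for any input. The following is a minimal illustrative Python sketch of that idealized behavior; the function names are our own and have nothing to do with the researchers' fabrication tooling.

```python
# Idealized model of complementary transistor behavior in a CMOS inverter.
# (Illustrative sketch only; real transistors are analog devices.)

def n_type_conducts(gate: int) -> bool:
    """An N-type transistor conducts when its gate is driven to 1."""
    return gate == 1

def p_type_conducts(gate: int) -> bool:
    """A P-type transistor conducts when its gate is driven to 0."""
    return gate == 0

def cmos_inverter(a: int) -> int:
    """Exactly one of the complementary pair conducts for any input."""
    if p_type_conducts(a):   # pull-up path connects output to VDD
        return 1
    if n_type_conducts(a):   # pull-down path connects output to GND
        return 0
    raise ValueError("input must be 0 or 1")

print([cmos_inverter(b) for b in (0, 1)])  # → [1, 0]
```

Because one transistor of the pair is always off, an ideal CMOS gate draws no static current, which is part of why mixing up N-type and P-type behavior is fatal to a working design.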

Ultra-low-loss metal films with high-quality single crystals are in demand as the perfect surface for nanophotonics and quantum information processing applications. Silver is by far the most preferred material due to its low loss at optical and near-infrared (near-IR) frequencies. In a recent study published in Scientific Reports, Ilya A. Rodionov and an interdisciplinary research team in Germany and Russia reported a two-step approach for electron-beam evaporation of atomically smooth single-crystalline metal films. They proposed a method to establish thermodynamic control of the film growth kinetics at the atomic level in order to deposit state-of-the-art metal films.

The researchers deposited 35 to 100 nm thick single-crystalline silver films with sub-100-picometer (pm) surface roughness and theoretically limited optical losses, suitable for ultrahigh-Q nanophotonic devices. They experimentally estimated the contributions of material purity, grain boundaries, surface roughness, and crystallinity to the optical properties of metal films. The team demonstrated a fundamental two-step approach for single-crystalline growth of silver, gold, and aluminum films, opening new possibilities in nanophotonics, biotechnology, and superconducting quantum technologies. The research team intends to adapt the method to synthesize other extremely low-loss single-crystalline metal films.

Optoelectronic devices with plasmonic effects for near-field manipulation, amplification, and sub-wavelength integration can open new frontiers in nanophotonics, quantum optics, and quantum information. Yet the ohmic losses associated with metals are a considerable obstacle to developing a variety of useful plasmonic devices. Materials scientists have devoted considerable effort to clarifying how metal film properties influence performance in order to develop high-performance material platforms. Single-crystalline platforms and nanoscale structural refinements can mitigate this problem by eliminating material-induced scattering losses. While silver is one of the best known plasmonic metals at optical and near-IR frequencies, it can be challenging to grow as a single-crystalline film.

Credit: MIT

Engineers from MIT and Analog Devices have created the most complex chip design yet that uses transistors made of carbon nanotubes instead of silicon. The chip was manufactured using new technologies proven to work in a commercial chip-manufacturing facility.

The researchers appear to have chosen the RISC-V instruction set architecture (ISA) for the chip, presumably because its open-source nature avoids licensing restrictions and costs. The processor handles 32-bit instructions and 16-bit memory addressing. The chip is not meant for mainstream devices quite yet, but it's a strong proof of concept that can already run "hello world"-type applications.

One advantage carbon nanotube transistors have over silicon transistors is that they can be manufactured in multiple layers, allowing for very dense 3D chip designs. DARPA also believes that carbon nanotubes may enable future 3D chips with performance similar to or better than silicon chips, manufactured at much lower cost.

It’s the most complex integration of carbon nanotube-based CMOS logic so far, with nearly 15,000 transistors, and it was done using technologies that have already been proven to work in a commercial chip-manufacturing facility. The processor, called RV16X-NANO, is a milestone in the development of beyond-silicon technologies, its inventors say.

Unlike silicon transistors, nanotube devices can easily be made in multiple layers with dense 3D interconnections. The Defense Advanced Research Projects Agency is hoping this 3D aspect will lead to commercial carbon nanotube (CNT) chips with the performance of today’s cutting-edge silicon but without the high design and manufacturing cost.

Some of the same researchers created a modest one-bit, 178-transistor processor back in 2013. In contrast, the new one, which is based on the open source RISC-V instruction set, is capable of working with 16-bit data and 32-bit instructions. Naturally, the team, led by MIT assistant professor Max Shulaker, tested the chip by running a version of the obligatory “Hello, World!” program. They reported the achievement this week in Nature.

Aug. 27 (UPI) — Built-in night vision may not be far off. Scientists have developed nanoparticles that allow mice to see near-infrared light.

Researchers are scheduled to describe the technological breakthrough on Tuesday at 12:30 p.m. ET at the American Chemical Society’s fall meeting, held this week in San Diego. Their presentation will be streamed live online.

“When we look at the universe, we see only visible light,” lead researcher Gang Han, a materials scientist and biochemist at the University of Massachusetts Medical School, said in a news release. “But if we had near-infrared vision, we could see the universe in a whole new way. We might be able to do infrared astronomy with the naked eye, or have night vision without bulky equipment.”

Carbon isn’t just the stuff life is made of—it’s also the stuff our future is being built on.

Carbon—a versatile element that frequently trades off its electrons to create various forms of itself—has been gaining an exciting reputation in tech thanks to the successful exfoliation of graphene, a sheet of carbon that’s just one atom thick and has remarkable chemical properties.

But carbon nanotubes, a sort of cousin to graphene, have been quietly staking out their own place in the world of materials science.