
Analog AI processor company Mythic launched its M1076 Analog Matrix Processor today to provide low-power AI processing.

The company builds its processor from analog circuits rather than digital ones, which makes it easier to integrate memory into the processor and lets the device operate on roughly one-tenth the power of a typical system-on-chip or graphics processing unit (GPU).

The M1076 AMP can support up to 25 trillion operations per second (TOPS) of AI compute in a 3-watt power envelope. It is targeted at edge AI applications, but the company said it can scale from the edge to server applications, addressing multiple vertical markets including smart cities, industrial applications, enterprise applications, and consumer devices.
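
For a rough sense of what those numbers imply, here is a quick back-of-the-envelope efficiency calculation; the 30-watt comparison device is an assumed placeholder for a conventional SoC/GPU, not a figure published by Mythic.

```python
# Back-of-envelope efficiency comparison (illustrative figures only).
# The M1076 numbers come from the announcement; the 30 W device below
# is an assumed stand-in for a typical edge SoC/GPU at the same throughput.

def tops_per_watt(tops: float, watts: float) -> float:
    """Efficiency in trillions of operations per second per watt."""
    return tops / watts

m1076_efficiency = tops_per_watt(tops=25.0, watts=3.0)    # ~8.3 TOPS/W
soc_efficiency   = tops_per_watt(tops=25.0, watts=30.0)   # hypothetical 30 W device

print(f"M1076: {m1076_efficiency:.1f} TOPS/W")
print(f"Hypothetical 30 W SoC/GPU: {soc_efficiency:.1f} TOPS/W")
print(f"Power ratio: {30.0 / 3.0:.0f}x")
```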

Designing an autonomous, learning smart garden.


In the first episode of Build Out, Colt and Reto — tasked with designing the architecture for a “Smart Garden” — supplied two very different concepts that nevertheless featured many overlapping elements. Take a look at the video to see what they came up with, then continue reading to see how you can learn from their explorations to build your very own Smart Garden.

Both solutions aim to optimize plant care using sensors, weather forecasts, and machine learning. Watering and fertilizing routines for the plants are updated regularly to guarantee the best growth, health, and fruit yield possible.

Colt’s solution is optimized for small-scale home farming, using a modified CNC machine to care for a fruit or vegetable patch. The drill bit is replaced with a liquid spout, UV light, and camera, while the cutting area is replaced with a plant bed that includes sensors to track moisture, nutrient levels, and weight.
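
Neither concept ships with published code, but the kind of sensor-plus-forecast watering decision both designs rely on might be sketched roughly as follows; the sensor fields, thresholds, and forecast value are hypothetical stand-ins, not part of the Build Out designs.

```python
# Minimal sketch of a sensor-driven watering decision (hypothetical
# sensor readings, thresholds, and forecast value -- not code from
# the Build Out episode).

from dataclasses import dataclass

@dataclass
class PlantBedReading:
    moisture_pct: float      # soil moisture from the bed sensor
    nutrient_index: float    # relative nutrient level
    weight_kg: float         # bed weight, a rough proxy for water content

def should_water(reading: PlantBedReading, rain_probability: float) -> bool:
    """Water only if the soil is dry and rain is unlikely in the forecast."""
    soil_is_dry = reading.moisture_pct < 30.0
    rain_unlikely = rain_probability < 0.4
    return soil_is_dry and rain_unlikely

# Example: dry bed, 10% chance of rain -> the gantry should water.
reading = PlantBedReading(moisture_pct=22.5, nutrient_index=0.8, weight_kg=41.2)
print(should_water(reading, rain_probability=0.1))  # True
```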

It’s ten times more powerful than the current U.S. effort.


Earlier this month, Chinese artificial intelligence (A.I.) researchers at the Beijing Academy of Artificial Intelligence (BAAI) unveiled Wu Dao 2.0, the world’s biggest natural language processing (NLP) model. And it’s a big deal.

NLP is a branch of A.I. research that aims to give computers the ability to understand text and spoken words and respond to them in much the same way human beings can.

Since the DeepSpeed optimization library was introduced last year, it has rolled out numerous novel optimizations for training large AI models—improving scale, speed, cost, and usability. As large models have quickly evolved over the last year, so too has DeepSpeed. Whether enabling researchers to create the 17-billion-parameter Microsoft Turing Natural Language Generation (Turing-NLG) with state-of-the-art accuracy, achieving the fastest BERT training record, or supporting 10x larger model training using a single GPU, DeepSpeed continues to tackle challenges in AI at Scale with the latest advancements for large-scale model training. Now, the novel memory optimization technology ZeRO (Zero Redundancy Optimizer), included in DeepSpeed, is undergoing a further transformation of its own. The improved ZeRO-Infinity offers the system capability to go beyond the GPU memory wall and train models with tens of trillions of parameters, an order of magnitude bigger than state-of-the-art systems can support. It also offers a promising path toward training 100-trillion-parameter models.

ZeRO-Infinity at a glance: ZeRO-Infinity is a novel deep learning (DL) training technology for scaling model training, from a single GPU to massive supercomputers with thousands of GPUs. It powers unprecedented model sizes by leveraging the full memory capacity of a system, concurrently exploiting all heterogeneous memory: GPU, CPU, and Non-Volatile Memory Express (NVMe). Learn more about the highlights of ZeRO-Infinity in our paper, “ZeRO-Infinity: Breaking the GPU Memory Wall for Extreme Scale Deep Learning.”
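
For a concrete picture of what this looks like in practice, here is a minimal sketch of enabling ZeRO Stage 3 with NVMe offload (the ZeRO-Infinity configuration) through DeepSpeed; the model, batch size, learning rate, and NVMe path are placeholders, and exact option names can vary between DeepSpeed releases, so treat it as a sketch rather than a reference configuration.

```python
# Rough sketch of enabling ZeRO Stage 3 with NVMe offload (ZeRO-Infinity)
# via DeepSpeed. The model, batch size, learning rate, and nvme_path are
# placeholders; consult the DeepSpeed docs for your installed version.
import deepspeed
import torch

model = torch.nn.Linear(4096, 4096)  # stand-in for a real large model

ds_config = {
    "train_batch_size": 16,
    "optimizer": {"type": "Adam", "params": {"lr": 1e-4}},
    "zero_optimization": {
        "stage": 3,  # partition parameters, gradients, and optimizer states
        "offload_param":     {"device": "nvme", "nvme_path": "/local_nvme"},
        "offload_optimizer": {"device": "nvme", "nvme_path": "/local_nvme"},
    },
}

engine, optimizer, _, _ = deepspeed.initialize(
    model=model,
    model_parameters=model.parameters(),
    config=ds_config,
)
```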

When you put these three factors together—the bounty of technological advances, the compressed restructuring timetable due to covid-19, and an economy finally running at full capacity—the ingredients are in place for a productivity boom. This will not only boost living standards directly, but also free up resources for a more ambitious policy agenda.


AI and other digital technologies have been surprisingly slow to improve economic growth. But that could be about to change.

Researchers at Oxford University have developed an AI-enabled system that can comprehensively identify people in videos by conducting detective-like, multi-domain investigations into who they might be, drawing on context and on a variety of publicly available secondary sources, including matching audio with visual material from the internet.

Though the research centers on the identification of public figures, such as people appearing in television programs and films, the principle of inferring identity from context is theoretically applicable to anyone whose face, voice, or name appears in online sources.

Indeed, the paper’s own definition of fame is not limited to show-business workers, with the researchers declaring, ‘We denote people with many images of themselves online as famous’.
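
The paper’s method is not reproduced here, but its core matching step, comparing an embedding extracted from a video against embeddings built from publicly available images of candidate identities, can be sketched roughly as follows; the encoder, gallery, and similarity threshold are hypothetical stand-ins.

```python
# Hypothetical sketch of the matching step described above: compare an
# embedding extracted from a video against a gallery of embeddings built
# from public images of candidate identities, and pick the closest match.
from typing import Dict, Optional

import numpy as np

def cosine_similarity(a: np.ndarray, b: np.ndarray) -> float:
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

def identify(query: np.ndarray,
             gallery: Dict[str, np.ndarray],
             threshold: float = 0.6) -> Optional[str]:
    """Return the best-matching candidate name, or None if nothing is close enough."""
    best_name, best_score = None, threshold
    for name, candidate in gallery.items():
        score = cosine_similarity(query, candidate)
        if score > best_score:
            best_name, best_score = name, score
    return best_name

# Toy example: random 128-d vectors stand in for a real face/voice encoder.
rng = np.random.default_rng(0)
gallery = {"Person A": rng.normal(size=128), "Person B": rng.normal(size=128)}
query = gallery["Person A"] + 0.05 * rng.normal(size=128)  # noisy view of Person A
print(identify(query, gallery))  # -> "Person A"
```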

WASHINGTON—The Biden administration launched an initiative Thursday aiming to make more government data available to artificial intelligence researchers, part of a broader push to keep the U.S. on the cutting edge of the crucial new technology.

The National Artificial Intelligence Research Resource Task Force, a group of 12 members from academia, government, and industry led by officials from the White House Office of Science and Technology Policy and the National Science Foundation, will draft a strategy for creating an AI research resource that could, in part, give researchers secure access to stores of anonymous data about Americans, from demographics to health and driving habits.

The task force will also look at making computing power available to analyze the data, with the goal of allowing researchers across the country to access it.

(Bloomberg) — On a Wednesday afternoon in May, an Uber driver in San Francisco was about to run out of charge on his Nissan Leaf. Normally this would mean finding a place to plug in and wait for a half hour, at least. But this Leaf was different.

Instead of plugging in, the driver pulled into a swapping station near Mission Bay, where a set of robot arms lifted the car off the ground, unloaded the depleted batteries, and replaced them with a fully charged set. Twelve minutes later, the Leaf pulled away with 32 kilowatt-hours of energy, enough to drive about 130 miles, for a cost of $13.

A swap like this is a rare event in the U.S. The Leaf’s replaceable battery is made by Ample, one of the few companies offering a service that is more popular in Asian markets. In March, Ample announced that it had deployed five stations around the Bay Area. Nearly 100 Uber drivers are using them, the company says, making an average of 1.3 swaps per day. Ample’s operation is tiny compared to the 100,000 public EV chargers in the U.S.—not to mention the 150,000 gas stations running more than a million nozzles. Yet Ample’s founders, Khaled Hassounah and John de Souza, are convinced that it’s only a matter of time before the U.S. discovers that swapping is a necessary part of the transition to electric vehicles.
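
Taking only the figures quoted above (32 kWh delivered, roughly 130 miles of range, $13 per swap), a quick arithmetic check puts the economics in per-mile terms:

```python
# Quick arithmetic check using only the figures quoted above
# (32 kWh delivered, ~130 miles of range, $13 per swap).
energy_kwh = 32.0
range_miles = 130.0
swap_cost_usd = 13.0

miles_per_kwh = range_miles / energy_kwh      # ~4.1 miles/kWh
cost_per_kwh = swap_cost_usd / energy_kwh     # ~$0.41/kWh
cost_per_mile = swap_cost_usd / range_miles   # ~$0.10/mile

print(f"{miles_per_kwh:.1f} miles/kWh, ${cost_per_kwh:.2f}/kWh, ${cost_per_mile:.2f}/mile")
```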

The COVID-19 pandemic, automation, and 6G could end the metropolitan era of building towering skyscrapers for companies. Companies could operate as a network from home to home, without anyone going into an office. That shift would help bring down urban heat islands and make our cities more efficient in transportation and communication, moving data even faster.

Tom Marzetta is the director of NYU Wireless, New York University’s research center for cutting-edge wireless technologies. Prior to joining NYU Wireless, Marzetta was at Nokia Bell Labs, where he developed massive MIMO. Massive MIMO (short for “multiple-input multiple-output”) allows engineers to pack dozens of small antennas into a single array. The high number of antennas means more signals can be sent and received at once, dramatically boosting a single cell tower’s efficiency.
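
To make the “more signals at once” idea concrete, the sketch below uses a standard textbook-style approximation, not a figure from Marzetta or NYU Wireless, for how sum spectral efficiency grows with antenna count when a base station serves several users via zero-forcing precoding; the SNR and user count are assumed values.

```python
# Illustrative, textbook-style approximation of how sum spectral efficiency
# grows with the number of base-station antennas M when K single-antenna
# users are served with zero-forcing precoding and equal power allocation.
# The SNR and user count are assumed values, not measured figures.
import math

def zf_sum_spectral_efficiency(m_antennas: int, k_users: int, snr_db: float) -> float:
    """Approximate sum rate in bits/s/Hz under equal-power zero-forcing."""
    snr = 10 ** (snr_db / 10)
    per_user = math.log2(1 + snr * (m_antennas - k_users) / k_users)
    return k_users * per_user

for m in (16, 64, 128, 256):
    print(m, round(zf_sum_spectral_efficiency(m, k_users=8, snr_db=10), 1))
```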

Massive MIMO is becoming an integral part of 5G, as is an independent development out of NYU Wireless from the center’s founding director, Ted Rappaport: millimeter waves. And now the professors and students at NYU Wireless are already looking ahead to 6G and beyond.

Marzetta spoke with IEEE Spectrum about the work happening at NYU Wireless, as well as what we all might expect from 6G when it arrives in the next decade. The conversation below has been edited for clarity and length.