
Much of the recent progress in AI has come from building ever-larger neural networks. A new chip powerful enough to handle “brain-scale” models could turbo-charge this approach.

Chip startup Cerebras leaped into the limelight in 2019 when it came out of stealth to reveal a 1.2-trillion-transistor chip. Called the Wafer Scale Engine, the dinner-plate-sized chip was the world's largest. Earlier this year Cerebras unveiled the Wafer Scale Engine 2 (WSE-2), which more than doubled the transistor count to 2.6 trillion.

Now the company has outlined a series of innovations that mean its latest chip can train a neural network with up to 120 trillion parameters. For reference, OpenAI’s revolutionary GPT-3 language model contains 175 billion parameters. The largest neural network to date, which was trained by Google, had 1.6 trillion.
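To put those parameter counts in perspective, here is a rough back-of-the-envelope sizing of the memory needed just to store each model's weights. The 2-bytes-per-parameter (fp16) figure is an assumption for illustration, not a number from Cerebras or the model vendors:

```python
# Rough weight-storage sizing for the models mentioned above.
# Assumption: 2 bytes per parameter (fp16 weights, no optimizer state).
BYTES_PER_PARAM = 2

models = {
    "GPT-3": 175e9,                     # 175 billion parameters
    "Google's largest to date": 1.6e12,  # 1.6 trillion parameters
    "Cerebras target": 120e12,           # 120 trillion parameters
}

for name, params in models.items():
    terabytes = params * BYTES_PER_PARAM / 1e12
    print(f"{name}: {params:.2e} parameters, ~{terabytes:,.2f} TB of weights")
```

Even under this generous assumption, a 120-trillion-parameter model's weights alone would run to hundreds of terabytes, which is why training at that scale requires hardware innovations beyond a conventional GPU cluster.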

As a recent New York Times article highlighted, self-driving cars are taking longer to come to market than many experts initially predicted. Automated vehicles where riders can sit back, relax, and be delivered to their destinations without having to watch the road are continuously relegated to the “not-too-distant future.”

There's debate not only over when this driverless future will arrive but also over how we'll get there, that is, which technologies are the most efficient, safe, and scalable for taking us from human-driven to computer-driven (Tesla is the main outlier in this debate). The big players are lidar, cameras, ultrasonic sensors, and radar. Last week, one lidar maker showcased new technology that it believes will tip the scales.

California-based Luminar has built a lidar, called Iris, that not only has a longer range than existing systems but is also more compact; gone are the days of a big, bulky setup that all but takes over the car. Perhaps most importantly, the company aims to manufacture and sell Iris at a price point well below the industry standard.

Circa 2009


The researchers expect to have a working prototype of the product in four years. “We are just at the beginning of this project,” Wang said. “During the first two years, our primary focus will be on the sensor systems. Integrating enzyme logic onto electrodes that can read biomarker inputs from the body will be one of our first major challenges.”

“Achieving the goal of the program is estimated to take nearly a decade,” Chrisey said.

Developing an effective interface between complex physiological processes and wearable devices could have a broader impact, Wang said. If the researchers are successful, they could pave the way for “autonomous, individual, on-demand medical care, which is the goal of the new field of personalized medicine,” he added.

Machines and robots undoubtedly make life easier. They carry out jobs with precision and speed, and, unlike humans, they do not require breaks and never tire.

As a result, companies are looking to use them more and more to improve productivity and take over dirty, dangerous, and dull tasks.

However, there are still many tasks that require dexterity, adaptability, and flexibility.

Ultrasound imaging techniques have proved to be highly valuable tools for diagnosing a variety of health conditions, including peripheral artery disease (PAD). PAD, one of the most common diseases among the elderly, entails the blocking or narrowing of peripheral blood vessels, which limits the supply of blood to specific areas of the body.

Ultrasound imaging methods are among the most popular means of diagnosing PAD, due to their many advantageous characteristics. In fact, unlike other imaging methods, such as computed tomography angiography, ultrasound imaging is non-invasive, low-cost, and radiation-free.

Most existing ultrasound imaging techniques are designed to capture two-dimensional images in real time. While this can be helpful in some cases, their inability to collect three-dimensional information reduces the reliability of the data they gather, increasing sensitivity to variations in how individual physicians use a given technique.

At Google I/O today Google Cloud announced Vertex AI, a new managed machine learning platform that is meant to make it easier for developers to deploy and maintain their AI models. It’s a bit of an odd announcement at I/O, which tends to focus on mobile and web developers and doesn’t traditionally feature a lot of Google Cloud news, but the fact that Google decided to announce Vertex today goes to show how important it thinks this new service is for a wide range of developers.

The launch of Vertex is the result of quite a bit of introspection by the Google Cloud team. “Machine learning in the enterprise is in crisis, in my view,” Craig Wiley, the director of product management for Google Cloud’s AI Platform, told me. “As someone who has worked in that space for a number of years, if you look at the Harvard Business Review or analyst reviews, or what have you — every single one of them comes out saying that the vast majority of companies are either investing or are interested in investing in machine learning and are not getting value from it. That has to change. It has to change.”

Can this be true?


An unmanned aircraft was brought down by a powerful electromagnetic pulse in what could be the first reported test of an advanced new weapon in China.

A paper published in the Chinese journal Electronic Information Warfare Technology did not give details of the timing and location of the experiment, which are classified, but it may be the country's first openly reported field test of an electromagnetic pulse (EMP) weapon.

China is racing to catch up in the field after the US demonstrated a prototype EMP weapon that brought down 50 drones with one shot in 2019.