
Dr. Deepan Balakrishnan, the first author, said, “Our work shows the proof of concept for single-shot 3D imaging with TEMs. We are developing a generalized method using physics-based machine learning models that learn material priors and provide 3D relief for any 2D projection.”

The team also envisions further generalizing the formulation of pop-out metrology beyond TEMs to any coherent imaging system for optically thick samples (e.g., X-rays, electrons, or visible light photons).

Prof Loh added, “Like human vision, inferring 3D information from a 2D image requires context. Pop-out is similar, but the context comes from the materials we focus on and our understanding of how photons and electrons interact with them.”

With the rise of AI, we’re abstracting complexity by embracing technologies that resonate with human intuition. Take ChatGPT, for instance. We can simply articulate our goals in plain English, and it generates the code to provision that infrastructure accordingly.
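To make that concrete, here is a minimal sketch of the kind of loop such tools run: a plain-English goal goes to a chat model, and provisioning code (Terraform, in this example) comes back. The model name, prompt, and resource details below are illustrative assumptions, not any specific product’s workflow.

```python
# Minimal sketch: turning a plain-English infrastructure request into
# provisioning code with the OpenAI Python client. The model name, prompt,
# and bucket details are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # reads OPENAI_API_KEY from the environment

goal = (
    "Provision an AWS S3 bucket named 'demo-logs' with versioning enabled, "
    "plus an IAM role that is allowed to read from it."
)

response = client.chat.completions.create(
    model="gpt-4o",  # assumed model; any chat-capable model would do
    messages=[
        {"role": "system", "content": "Respond with Terraform (HCL) code only."},
        {"role": "user", "content": goal},
    ],
)

# The reply is Terraform source text; a human still reviews it before
# running `terraform plan` and `terraform apply`.
print(response.choices[0].message.content)
```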

Another approach is using visualization. For example, with Brainboard, you can draw your cloud infrastructure, and the necessary deployment and management code is automatically generated.

These examples illustrate the next generation of software and the mindset behind it. The shift is happening now, and the coming wave of tools will be adapted and optimized for humans.

The Kia EV3 — the new all-electric compact SUV revealed Thursday — illustrates a growing appetite among global automakers to bring generative AI into their vehicles.

The automaker said the Kia EV3 will feature a new voice assistant built on ChatGPT, the text-generating AI chatbot developed by OpenAI. The Kia EV3 and its AI assistant will first come to market in Korea in July 2024, followed by Europe in the second half of the year. Kia expects to expand sales of the vehicle into other regions after the European launch. It will eventually come to the United States, although the automaker did not provide a date.

This isn’t, however, a pure OpenAI affair: Kia had a hand in developing the voice assistant too.

New technology is reshaping the toy industry, making manufacturing more efficient and the play experience more immersive.

Modern smart toys, designed to provide a more immersive experience, often feature artificial intelligence (AI), Bluetooth connectivity, and sensors. Examples include educational tablets that adapt to a child’s learning pace and robotic animals that respond to voice commands.

Current AI training methods burn colossal amounts of energy, while the human brain sips just 20 W. Swiss startup FinalSpark is now selling access to cyborg biocomputers that run up to four living human brain organoids wired into silicon chips.

The human brain communicates within itself and with the rest of the body mainly through electrical signals; sights, sounds and sensations are all converted into electrical pulses before our brains can perceive them. This makes brain tissue highly compatible with silicon chips, at least for as long as you can keep it alive.

For FinalSpark’s Neuroplatform, brain organoids comprising about 10,000 living neurons are grown from stem cells. These little balls, about 0.5 mm (0.02 in) in diameter, are kept in incubators at around body temperature, supplied with water and nutrients, and protected from bacterial or viral contamination. They are wired into an electrical circuit through a series of tiny electrodes.