
This also means faster robotics and self-driving cars.

Foxconn, the world's largest producer of iPhones, is partnering with chip giant NVIDIA to develop artificial intelligence factories that will power a range of applications, including self-driving cars, generative AI tools, and robotic systems, according to a press release.

Dubbed AI factories, they are data centers that will power a wide range of applications, including the digitalization of manufacturing and inspection workflows, the development of AI-powered electric vehicle and robotics platforms, and language-based generative AI services.

The team estimates that their hardware can outperform the best electronic processors by a factor of 100 in terms of energy efficiency and compute density.

A team of scientists from Oxford University and their partners from Germany and the UK have developed a new kind of AI hardware that uses light to process three-dimensional (3D) data. Based on integrated photonic-electronic chips, the hardware can perform complex calculations in parallel using different wavelengths and radio frequencies of light. The team claims their hardware can boost the data processing speed and efficiency for AI tasks by several orders of magnitude.
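As a rough conceptual model (not the team's actual hardware), the parallelism works like this: each wavelength of light carries an independent input signal, a photonic element attenuates each channel by a weight, and a detector sums the light, so an entire matrix-vector product happens in one optical pass. A minimal sketch of that idea in Python, with illustrative made-up numbers:

```python
import numpy as np

# Hypothetical illustration: each "wavelength" channel carries one input value;
# photonic crossbar elements attenuate each channel by a weight, and a detector
# sums the light, yielding one full multiply-accumulate per output in a single pass.
weights = np.array([[0.2, 0.5, 0.3],
                    [0.7, 0.1, 0.9]])   # transmission of each crossbar element
inputs = np.array([1.0, 0.5, 2.0])      # optical power on each wavelength channel

# In hardware all the products occur simultaneously as light propagates;
# here the matrix product stands in for that parallel physical process.
outputs = weights @ inputs
print(outputs)  # [1.05 2.55]
```

The point of the analogy is that the "computation" is the light's propagation itself, which is why photonic processors can trade so favorably on energy per operation.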


AI computing and processing power

The research, published today in the journal Nature Photonics, addresses the challenge of meeting modern AI applications' increasing demand for computing power. Conventional computer chips, which rely on electronics, are struggling to keep up with the pace of AI innovation, which requires processing power to double every 3.5 months. The team says that using light instead of electronics offers a new way of computing that can overcome this bottleneck.
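To put the 3.5-month figure in perspective, that doubling rate compounds to over a thousand-fold growth in demanded compute across just three years. A quick back-of-the-envelope check (taking the article's 3.5-month figure at face value):

```python
# Back-of-the-envelope: compute demand doubling every 3.5 months
months = 36                       # a three-year horizon
doublings = months / 3.5          # number of doublings in that span
growth = 2 ** doublings           # total growth factor
print(round(doublings, 1), round(growth))  # → 10.3 doublings, ~1248x growth
```

No electronic chip roadmap improves anywhere near that quickly, which is the bottleneck the photonic approach is meant to sidestep.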

Are you worried about the future of AI? In this video, we’ll look at a sci-fi scenario where a superintelligent AI has taken over the planet in 2075 and what that might mean for our future.

Ultimately, we need to be prepared for the future. That means being aware of superintelligent AI and how this future might unfold. So check out this video and leave your comments below.

https://www.grayscott.com
Twitter: https://twitter.com/grayscott
Facebook: https://www.facebook.com/futuristgrayscott/

Watch my other videos:
The Simulated Future: https://www.youtube.com/watch?v=pX9FY…
Digital Twin: https://www.youtube.com/watch?v=RjJzC…
Conscious Machines: https://www.youtube.com/watch?v=qtq1G…
Transhumanism: https://www.youtube.com/watch?v=D8lE–…
Dream Recording: https://www.youtube.com/watch?v=33VoQ…
Quantified Self: https://www.youtube.com/watch?v=pMHDo…
The future is a portal inward: https://www.youtube.com/watch?v=GpfwI…

Gray Scott is a futurist, philosopher, and artist. Gray is frequently interviewed by the Discovery Channel, History Channel, Forbes, CBS News, Vanity Fair, VICE MOTHERBOARD, Fast Company, The Washington Post, and SingularityHub.

A large team of computer scientists and engineers at IBM Research has developed a dedicated computer chip that can run AI-based image recognition apps 22 times as fast as chips currently on the market.

In their paper published in the journal Science, the group describes the ideas that went into developing the chip, how it works, and how well it performed when tested. Subramanian Iyer and Vwani Roychowdhury, both at the University of California, Los Angeles, have published a Perspective piece in the same journal issue, giving an in-depth analysis of the team's work.

As AI-powered applications become mainstream tools used by professionals and amateurs alike, scientists continue working to make them better. One way to do that, Iyer and Roychowdhury note, is to move toward an “edge” computing system in which data is stored physically closer to the AI applications that use it.

Despite being soft, squishy, and easily damaged, our muscles can perform incredible feats—adapt to heavy loads, sense the outside world, and rebuild after injury. A main reason for these superpowers is alignment—that is, how muscle cells orient to form stretchy fibers. Unfortunately, these precise cell arrangements are also why artificial muscles are difficult to recreate in the lab.

Now, a new study suggests that the solution to growing better lab-grown muscles may be magnets. Led by Dr. Ritu Raman at the Massachusetts Institute of Technology (MIT), scientists developed a magnetic hydrogel “sandwich” that controls muscle cell orientation in a lab dish. By changing the position of the magnets, the team coaxed the muscle cells into aligned fibers that contracted in synchrony as if they were inside a body.

The whole endeavor sounds rather Frankenstein. But lab-grown tissues could one day be grafted into people with heavily damaged muscles—either from inherited diseases or traumatic injuries—and restore their ability to navigate the world freely. Synthetic muscles could also coat robots, providing them with human-like senses, flexible motor control, and the ability to heal after inevitable scratches and scrapes.

Tesla has shared a video of a hands-free drive demonstration of its Full Self-Driving suite in Austin. The FSD suite is not available to customers in a hands-free configuration, but Tesla disabled the requirement for a new video it shared on X, formerly known as Twitter.

Tesla shared the video to demonstrate the capabilities of Software Version 11.4.7, which is the current version of the FSD Beta program.

The automaker describes in the tweet it put up how the Full Self-Driving suite improves through data-driven techniques that refine its capabilities by analyzing other drivers’ behavior and normal navigation habits.

That spider you squished? It could have been used for science!

At least, that’s what Faye Yap and Daniel Preston think. Yap is a mechanical engineering PhD student in Preston’s lab at Rice University, where she co-authored a paper on reanimating spider corpses to create grippers, or tiny machines used to pick up and put down delicate objects. Yap and Preston dubbed this use of biotic materials for robotic parts “necrobotics” – and think this technique could one day become a cheap, green addition to the field.