
The future of artificial intelligence is the question on all of our minds right now. AI has the potential to replace us in every conceivable industry, raising fears of a dystopia. Humanity is suddenly gripped by a massive anxiety, yet this may also be our greatest opportunity.

Will this be the end of meaning?

Or is this humanity’s greatest gift in the fulfillment of a larger purpose?

What will be the fate of human value?

At the 2023 IEEE International Solid-State Circuits Conference (ISSCC) in San Francisco this week, Irvine, Calif.–based Syntiant detailed the NDP200, an ultralow-power chip designed to run neural networks that monitor video and wake other systems when it spots something important. That may be its core purpose, but the NDP200 can also mow down the spawn of hell, if properly trained.

The exponentially expanding scale of deep learning models is a major force advancing the state of the art, and also a source of growing worry over the energy consumption, speed, and therefore feasibility of massive-scale deep learning. Researchers from Cornell recently examined Transformer topologies, in particular how their performance improves dramatically when they are scaled up to billions or even trillions of parameters, driving an exponential rise in deep learning compute. These large-scale Transformers are a popular but expensive solution for many tasks, because digital hardware’s energy efficiency has not kept pace with the rising FLOP requirements of cutting-edge deep learning models. They also perform increasingly impressively in other domains, such as computer vision, graphs, and multi-modal settings.
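To make those FLOP numbers concrete, here is a rough back-of-the-envelope sketch in Python. It relies on the common approximation that a Transformer forward pass costs about 2 FLOPs per parameter per processed token; the model sizes and token count below are illustrative assumptions, not figures from the Cornell work.

```python
# Back-of-the-envelope estimate (illustrative, not from the article):
# a Transformer forward pass costs roughly 2 FLOPs per parameter per token.

def forward_flops(n_params: float, n_tokens: float) -> float:
    """Approximate forward-pass cost: ~2 * parameters * tokens."""
    return 2.0 * n_params * n_tokens

# Hypothetical model sizes, to show how compute grows with scale.
for n_params in (1e9, 175e9, 1e12):                    # 1B, 175B, 1T parameters
    flops = forward_flops(n_params, n_tokens=1e3)      # a single 1,000-token prompt
    print(f"{n_params:.0e} params -> ~{flops:.2e} FLOPs per 1k-token forward pass")
```

The point of the sketch is simply that inference cost grows linearly with parameter count, so trillion-parameter models demand hardware whose energy per operation keeps falling.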

They also exhibit transfer-learning abilities, which let them generalize quickly to new tasks, sometimes in a zero-shot setting with no additional training required. The cost of these models and their general machine-learning capabilities are major driving forces behind the creation of hardware accelerators for efficient, fast inference. Deep learning hardware has so far been developed mostly in digital electronics, including GPUs, mobile accelerator chips, FPGAs, and large-scale AI-dedicated accelerator systems. Optical neural networks have been proposed as one route to better efficiency and latency than neural-network implementations on digital computers, and there is significant interest in analog computing more broadly.

Even though these analog systems are susceptible to noise and error, neural-network operations can often be carried out optically at much lower cost; the dominant cost is typically the electrical overhead of loading the weights and data, which is amortized across large linear operations. Accelerating large-scale models like Transformers is therefore particularly promising: in theory, the scaling is asymptotically more efficient in energy per MAC than in digital systems, and the researchers show that Transformers increasingly benefit from it. They sampled operations from a real Transformer for language modeling, ran them on a real spatial-light-modulator-based experimental system, and used the results to build a calibrated simulation of a full Transformer running optically, demonstrating that Transformers can run on such systems despite their noise and error characteristics.
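The article does not include code, but a minimal sketch, under a simple assumed noise model, helps make the idea concrete: treat the optical linear operation as an exact matrix-vector product corrupted by quantization when loading weights and inputs, plus additive read-out noise, then compare it with the exact digital result. The layer size, bit depth, and noise level below are illustrative placeholders, not values from the experiment.

```python
import numpy as np

def optical_matvec(weights: np.ndarray, x: np.ndarray,
                   rel_noise: float = 0.01, bits: int = 8) -> np.ndarray:
    """Toy model of an analog optical matrix-vector product:
    quantize weights/inputs (the electrical loading step) and add
    Gaussian read-out noise to the result."""
    levels = 2 ** bits - 1
    scale_w = np.max(np.abs(weights)) or 1.0
    scale_x = np.max(np.abs(x)) or 1.0
    w_q = np.round(weights / scale_w * levels) / levels * scale_w
    x_q = np.round(x / scale_x * levels) / levels * scale_x

    y = w_q @ x_q                                   # the "cheap" optical linear operation
    noise = rel_noise * np.std(y) * np.random.randn(*y.shape)
    return y + noise                                # detector / shot-noise stand-in

# Compare against an exact digital matmul for one hypothetical Transformer layer.
rng = np.random.default_rng(0)
W = rng.standard_normal((512, 512)) / np.sqrt(512)  # illustrative layer size
x = rng.standard_normal(512)
err = np.linalg.norm(optical_matvec(W, x) - W @ x) / np.linalg.norm(W @ x)
print(f"relative error of noisy analog matvec: {err:.3%}")
```

In a calibrated simulation like the one the researchers describe, a noise model of this kind would be fitted to measurements from the real spatial-light-modulator hardware rather than to assumed constants.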

Physics progresses by breaking our intuitions, but we’re now at a point where further progress may require us to do away with the most intuitive and seemingly fundamental concepts of all—space and time.

The BMW Group on Monday launched a pilot fleet of hydrogen vehicles, with the German automotive giant’s CEO referring to hydrogen as “the missing piece in the jigsaw when it comes to emission-free mobility.”

The BMW iX5 Hydrogen, which uses fuel cells sourced from Toyota and has a top speed of more than 112 miles per hour, is being assembled at a facility in Munich.


Described by the International Energy Agency as a “versatile energy carrier,” hydrogen has a variety of applications and can be deployed in sectors such as industry and transport.

BMW is one of several automotive firms continuing to look into the potential of hydrogen. Others include Toyota and Hyundai, while smaller businesses such as Riversimple are also working on hydrogen-powered cars.

How are you a conscious being? Join us and find out!

In this video, Unveiled takes a closer look at human consciousness! It’s a topic that has intrigued and bemused scientists and philosophers for years… but are we FINALLY close to reaching an answer? What is consciousness? Where is consciousness? And does it exist apart from our bodies?

This is Unveiled, giving you incredible answers to extraordinary questions!