
In a new patent from AMD, researchers describe techniques for performing machine learning operations using one or more dedicated ML accelerator chiplets. The resulting device, called an Accelerated Processing Device (APD), can be used in both gaming and data-center GPUs via different implementations. The method involves configuring part of each chiplet's memory as a cache while leaving the rest as directly accessible memory.

A sub-portion of the cache region is then used by the machine learning accelerators on the same chiplet to perform machine learning operations. The patent is very open-ended with respect to its uses, indicating possible application in CPUs, GPUs, or other caching devices, but the primary target appears to be GPUs with several thousand SIMD units.

One implementation of the APD is configured to accept commands from the command processor via both graphics and compute pipelines, showing the ability to both render graphics and run the compute-intensive workloads required by convolutional networks. The APD contains several SIMD units capable of performing vector operations alongside limited scalar and SP tasks, similar to existing AMD GPUs.
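As a rough illustration of the memory split the patent describes, the sketch below models a chiplet's local memory being partitioned into a cache portion and a directly accessible portion, with a sub-portion of the cache reserved for the on-chiplet ML accelerators. The class and field names are hypothetical placeholders, not anything from AMD's implementation.

```python
# Hypothetical sketch of the memory-partitioning idea described in the patent.
# None of these names come from AMD; they are illustrative placeholders.

from dataclasses import dataclass


@dataclass
class ChipletMemoryConfig:
    """Describes how a chiplet's local memory is split between a cache and
    directly addressable memory, with a slice of the cache reserved for the
    on-chiplet ML accelerators."""
    total_bytes: int
    cache_fraction: float       # portion configured as cache
    ml_cache_fraction: float    # sub-portion of the cache given to ML units

    def partition(self) -> dict:
        cache_bytes = int(self.total_bytes * self.cache_fraction)
        direct_bytes = self.total_bytes - cache_bytes
        ml_scratch_bytes = int(cache_bytes * self.ml_cache_fraction)
        return {
            "cache_bytes": cache_bytes,
            "directly_accessible_bytes": direct_bytes,
            "ml_accelerator_scratch_bytes": ml_scratch_bytes,
        }


# Example: a 64 MiB chiplet memory, half cache, half directly accessible,
# with a quarter of the cache handed to the ML accelerators as scratch space.
config = ChipletMemoryConfig(total_bytes=64 * 2**20,
                             cache_fraction=0.5,
                             ml_cache_fraction=0.25)
print(config.partition())
```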

https://youtube.com/watch?v=dQfV2_sROBw&feature=share

On June 25, 2021, NASA published a detailed description of upcoming missions for the Ingenuity Mars Helicopter, including a second software update prompted by an HD-imaging issue. Ingenuity's team determined that capturing color images may have been inducing the imaging-pipeline glitch that caused the instability during the Flight 6 anomaly, so the helicopter needs a second software update before its upcoming ninth flight. Ingenuity's first bug, the watchdog-timer issue, was also resolved by a software update. The new update is intended to restore the ability to take 13-megapixel photos on Mars without causing flight anomalies. Last week the helicopter completed its eighth flight, traveling about 160 meters south, while Perseverance moved to a new location, Séítah. The black-and-white images come directly from Ingenuity's onboard camera. The helicopter flew for 77.4 seconds, reached a maximum horizontal speed of 4 meters per second, and held an altitude of 10 meters. Ingenuity continues to do impressive work operating autonomously on Mars.

Credit: nasa.gov, NASA/JPL-Caltech, NASA/JPL-Caltech/ASU

Link to Ingenuity’s 9th flight preparation with 2nd software update: https://mars.nasa.gov/technology/helicopter/status/308/fligh…ext-steps/

#mars #ingenuity #helicopter

Researchers and entrepreneurs are starting to ponder how AI could create versions of people after their deaths—not only as static replicas but as evolving digital entities that may steer companies or influence world events.


Experts are exploring ways artificial intelligence might confer a kind of digital immortality, preserving the personalities of the departed in virtual form and then allowing them to evolve.

This video was made possible by NordPass. Sign up with this link and get 70% off your premium subscription + 1 month for free! https://nordpass.com/futurology.

Visit Our Parent Company EarthOne For Sustainable Living Made Simple ➤
https://earthone.io/

The story of humanity is one of progress: from our origins, with slow, disjointed progress, to the agricultural revolution with its linear progress, and on to the industrial revolution with exponential, almost unfathomable progress.

This accelerating rate of progress is due to the compounding effect of technology, in which each advance enables countless more: 3D printing, autonomous vehicles, blockchain, batteries, remote surgery, virtual and augmented reality, robotics – the list goes on and on. These technologies in turn will lead to sweeping changes in society, from energy generation and monetary systems to space colonization, automation and much more!

This trajectory of progress is now leading us into a time period "characterized by a fusion of technologies that is blurring the lines between the physical, digital and biological spheres", called by many the technological revolution or the 4th industrial revolution — in which everything will change, from the underlying structure and fundamental institutions of society to how we live our day-to-day lives.


This is only the Beginning.


Quantum physicist Mario Krenn remembers sitting in a café in Vienna in early 2016, poring over computer printouts, trying to make sense of what MELVIN had found. MELVIN was a machine-learning algorithm Krenn had built, a kind of artificial intelligence. Its job was to mix and match the building blocks of standard quantum experiments and find solutions to new problems. And it did find many interesting ones. But there was one that made no sense.

“The first thing I thought was, ‘My program has a bug, because the solution cannot exist,’” Krenn says. MELVIN had seemingly solved the problem of creating highly complex entangled states involving multiple photons (entangled states being those that once made Albert Einstein invoke the specter of “spooky action at a distance”). Krenn and his colleagues had not explicitly provided MELVIN the rules needed to generate such complex states, yet it had found a way. Eventually, he realized that the algorithm had rediscovered a type of experimental arrangement that had been devised in the early 1990s. But those experiments had been much simpler. MELVIN had cracked a far more complex puzzle.

“When we understood what was going on, we were immediately able to generalize [the solution],” says Krenn, who is now at the University of Toronto. Since then, other teams have started performing the experiments identified by MELVIN, allowing them to test the conceptual underpinnings of quantum mechanics in new ways. Meanwhile Krenn, Anton Zeilinger of the University of Vienna and their colleagues have refined their machine-learning algorithms. Their latest effort, an AI called THESEUS, has upped the ante: it is orders of magnitude faster than MELVIN, and humans can readily parse its output. While it would take Krenn and his colleagues days or even weeks to understand MELVIN’s meanderings, they can almost immediately figure out what THESEUS is saying.
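For readers curious what this kind of automated search looks like in miniature, here is a toy sketch in the same spirit: enumerate short sequences of experimental "building blocks" and keep the ones that produce a target entangled state. This is not Krenn's code; it uses simple qubit gates rather than real optical elements, purely to illustrate the search idea.

```python
# Toy illustration of a MELVIN-style search: assemble sequences of building
# blocks and keep those that produce a target entangled state.
import itertools
import numpy as np

I2 = np.eye(2)
H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

# The "building blocks" the search is allowed to combine.
BLOCKS = {
    "H0": np.kron(H, I2),   # Hadamard on the first qubit
    "H1": np.kron(I2, H),   # Hadamard on the second qubit
    "CNOT": CNOT,           # entangling gate
    "ID": np.eye(4),        # do nothing
}

# Target: the Bell state (|00> + |11>) / sqrt(2), a maximally entangled state.
bell = np.zeros(4)
bell[0] = bell[3] = 1 / np.sqrt(2)
start = np.zeros(4)
start[0] = 1.0              # initial state |00>

def fidelity(sequence):
    """Apply the blocks in order to |00> and compare with the Bell state."""
    state = start
    for name in sequence:
        state = BLOCKS[name] @ state
    return abs(np.dot(bell, state)) ** 2

# Exhaustively try short sequences and report those that hit the target.
for length in range(1, 4):
    for seq in itertools.product(BLOCKS, repeat=length):
        if fidelity(seq) > 0.999:
            print("found:", seq)
```

Running this prints sequences such as ("H0", "CNOT"), the textbook recipe for preparing a Bell state; MELVIN performs a vastly larger search of this kind over real optical components.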

Below is my Answer.

“There is a big confluence between AI and social media. It is a two-way relationship: AI not only affects social media, social media also plays a great role in the development of AI.

AI is developed through data, large volumes of data (big data), and one of the easiest ways to generate and source data at this scale is from the content and interactions on social media.

Most social media platforms operate at scale, so for tasks such as monitoring or censoring what is posted, the administrators of these platforms have to rely on automation and AI for management and policing.

AI algorithms such as sentiment analysis and recommendation engines (used by Facebook and YouTube to recommend posts based on the AI's understanding of what you will like) are an integral part of any social platform's architecture.

AI is integral to how and when adverts are delivered to you on social media. AI controls the engagement levels on your posts and ensures that people who are most likely to be interested in the topics or communities you belong to are recommended to you as connections; this is because engagement is a key goal for every social media platform.

So as you can see, AI plays a critical role in social media. Beyond this, it is also important to mention that not all of AI's effects on social media are positive. For example, recommendation engines ensure a never-ending supply of recommended content that can keep you engrossed in social media, consuming your time unproductively.”
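To make the recommendation-engine point above concrete, here is a minimal, hypothetical sketch of content-based recommendation: posts are ranked by cosine similarity between a user's interaction profile and each post's topic vector. The post names, topics and numbers are invented for illustration and do not reflect any platform's actual system.

```python
# Minimal illustration of a content-based recommendation engine.
import numpy as np

TOPICS = ["sports", "politics", "ai", "music"]

# Hypothetical topic vectors for candidate posts (weights over TOPICS).
posts = {
    "post_a": np.array([0.7, 0.1, 0.1, 0.1]),
    "post_b": np.array([0.0, 0.2, 0.7, 0.1]),
    "post_c": np.array([0.1, 0.1, 0.2, 0.6]),
}

# A user profile built from past likes and comments, over the same topics.
user_profile = np.array([0.1, 0.0, 0.8, 0.1])

def cosine(a, b):
    """Cosine similarity between two topic vectors."""
    return float(np.dot(a, b) / (np.linalg.norm(a) * np.linalg.norm(b)))

# Rank posts by similarity to the user's interests; the top one is recommended.
ranked = sorted(posts.items(),
                key=lambda kv: cosine(user_profile, kv[1]),
                reverse=True)
for name, vec in ranked:
    print(name, round(cosine(user_profile, vec), 3))
```

Real platforms layer collaborative filtering, engagement prediction and ad-targeting models on top of this basic idea, but the ranking-by-predicted-interest loop is the same.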

After the program was first revealed in 2019, then-Assistant Secretary of the Air Force for Acquisition, Technology and Logistics Will Roper said he wanted to see operational demonstrations within two years. The latest test flight of the Skyborg-equipped Avenger shows the service has clearly hit that benchmark.

In 2020, the General Atomics Avenger was used in experiments with another autonomy system, developed under the Defense Advanced Research Projects Agency's (DARPA) Collaborative Operations in Denied Environment (CODE) program, which sought to develop drones capable of demonstrating “collaborative autonomy,” or the ability to work cooperatively.