Astronomers picked up signals of potential extraterrestrial origin that they had previously missed, in an area they thought was devoid of potential ET activity. If confirmed, it could be the first hint that humans are not alone in the universe.
Mysterious Signals Detected
Experts led by University of Toronto student Peter Ma used an artificial intelligence (AI) algorithm to examine 820 stars in an area they didn't suspect would have any potential activity. They were surprised by the finding, especially since they had missed the tentative signals earlier because of heavy interference, Daily Mail reported.
Why would someone falling into a stellar-mass black hole be spaghettified, but someone crossing the event horizon of a supermassive black hole would not feel much discomfort?
As it turns out, there is a relatively simple equation that describes the tidal acceleration that a body of length d would feel, based on its distance from a given object with mass M: a = 2GMd/R³, where a is the tidal acceleration, G is the gravitational constant, and R is the body's distance from the center of the object (with mass M).
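To see why the answer depends so strongly on the black hole's mass, here is a small Python sketch (not from the article) that plugs rough numbers into that formula at the event horizon of two black holes; the 10-solar-mass and 4-million-solar-mass figures and the 2 m body length are illustrative assumptions.

```python
G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8        # speed of light, m/s
M_SUN = 1.989e30   # one solar mass, kg

def schwarzschild_radius(mass_kg):
    """Event-horizon radius R_s = 2GM/c^2 of a non-rotating black hole."""
    return 2 * G * mass_kg / C**2

def tidal_acceleration(mass_kg, distance_m, body_length_m=2.0):
    """Head-to-toe tidal acceleration a = 2GMd/R^3 from the formula above."""
    return 2 * G * mass_kg * body_length_m / distance_m**3

for label, solar_masses in [("stellar-mass (10 M_sun)", 10),
                            ("supermassive (4e6 M_sun)", 4e6)]:
    mass = solar_masses * M_SUN
    r_s = schwarzschild_radius(mass)
    a = tidal_acceleration(mass, r_s)
    print(f"{label}: horizon at {r_s:.2e} m, tidal acceleration there = {a:.2e} m/s^2")
```

Because the horizon radius grows linearly with M, the tidal acceleration at the horizon falls off as 1/M²: roughly 10⁸ m/s² head-to-toe for the stellar-mass case, but only about a millimetre per second squared for the supermassive one.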
This is NOT about ChatGPT; instead, it's the AI technique used to beat Go, chess, Dota, etc. In other words, it isn't just generating the next best word based on reading billions of sentences, but planning out actions to beat real game opponents (and winning). And it's free.
Reinforcement learning is an area of machine learning concerned with taking the right actions to maximize reward in a particular situation. In this full tutorial course, you will get a solid foundation in the core topics of reinforcement learning.
The course covers Q-learning, SARSA, double Q-learning, deep Q-learning, and policy gradient methods. These algorithms are applied in a number of environments from the OpenAI Gym, including Space Invaders, Breakout, and others. The deep learning portion uses TensorFlow and PyTorch.
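The course works up to deep networks on those Atari environments; as a much smaller, hedged illustration of the core update rule, here is a tabular Q-learning sketch on Gym's tiny Taxi-v3 environment (the environment choice and hyperparameters are illustrative, not taken from the course).

```python
# Minimal tabular Q-learning on Taxi-v3. Assumes gym >= 0.26, where reset()
# returns (obs, info) and step() returns (obs, reward, terminated, truncated, info).
import numpy as np
import gym

env = gym.make("Taxi-v3")
q_table = np.zeros((env.observation_space.n, env.action_space.n))
alpha, gamma, epsilon = 0.1, 0.99, 0.1   # learning rate, discount, exploration rate

for episode in range(5000):
    state, _ = env.reset()
    done = False
    while not done:
        # Epsilon-greedy action selection
        if np.random.rand() < epsilon:
            action = env.action_space.sample()
        else:
            action = int(np.argmax(q_table[state]))

        next_state, reward, terminated, truncated, _ = env.step(action)
        done = terminated or truncated

        # Q-learning update: nudge Q(s, a) toward reward + gamma * max_a' Q(s', a')
        best_next = np.max(q_table[next_state])
        q_table[state, action] += alpha * (reward + gamma * best_next - q_table[state, action])
        state = next_state
```

Deep Q-learning replaces the table with a neural network that maps a raw observation (such as Atari pixels) to one Q-value per action, but the update it performs is the same idea.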
Here is a list of some of the most popular quantum algorithms highlighting the significant impact quantum can have on the classical world:
Shor’s Algorithm
Our data security systems are largely based on the assumption that factoring integers with a thousand or more digits is practically impossible. That held until Peter Shor showed in 1994 that quantum mechanics allows factorisation to be performed in polynomial time, rather than the super-polynomial time required by the best known classical algorithms.
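As a rough, purely classical illustration (not from the article) of where the quantum speedup enters: Shor reduced factoring N to finding the multiplicative order of a random a modulo N, and only that order-finding step runs on the quantum computer. The brute-force loop below stands in for it and is exactly the part that is exponentially slow without one.

```python
# Classical sketch of the number-theoretic skeleton of Shor's algorithm.
import math
import random

def find_order(a, n):
    """Smallest r > 0 with a^r = 1 (mod n); the quantum subroutine replaces this."""
    r, x = 1, a % n
    while x != 1:
        x = (x * a) % n
        r += 1
    return r

def shor_factor(n):
    """Return a nontrivial factor of an odd composite n that is not a prime power."""
    while True:
        a = random.randrange(2, n)
        g = math.gcd(a, n)
        if g > 1:
            return g                  # lucky guess already shares a factor
        r = find_order(a, n)
        if r % 2:
            continue                  # need an even order
        y = pow(a, r // 2, n)
        if y == n - 1:
            continue                  # trivial square root of 1, try another a
        return math.gcd(y - 1, n)     # guaranteed to be a nontrivial factor

print(shor_factor(15))   # 3 or 5
print(shor_factor(21))   # 3 or 7
```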
Quantum computing has entered a bit of an awkward period. There have been clear demonstrations that we can successfully run quantum algorithms, but the qubit counts and error rates of existing hardware mean that we can’t solve any commercially useful problems at the moment. So, while many companies are interested in quantum computing and have developed software for existing hardware (and have paid for access to that hardware), the efforts have been focused on preparation. They want the expertise and capability needed to develop useful software once the computers are ready to run it.
For the moment, that leaves them waiting for hardware companies to produce sufficiently robust machines—machines that don’t currently have a clear delivery date. It could be years; it could be decades. Beyond learning how to develop quantum computing software, there’s nothing obvious to do with the hardware in the meantime.
But a company called QuEra may have found a less obvious way to put such hardware to work. The technology it is developing could ultimately provide a route to quantum computing. But until then, the same hardware can be used to solve a class of mathematical problems, and any improvements to that hardware will benefit both types of computation. And in a new paper, the company's researchers have expanded the types of computations that can be run on their machine.
Google has trained an artificial intelligence, named SingSong, that can generate a musical backing track to accompany people’s recorded singing.
To develop it, Jesse Engel and his colleagues at Google Research used an algorithm to separate the instrumental and vocal parts from 46,000 hours of music and then fine-tuned an existing AI model – also created by Google Research, but for generating speech and piano music – on those pairs of recordings.
Imagine you’re a young engineer whose boss drops by one morning with a sheaf of complicated fluid dynamics equations. “We need you to design a system to solve these equations for the latest fighter jet,” bossman intones, and although you groan as you recall the hell of your fluid dynamics courses, you realize that it should be easy enough to whip up a program to do the job. But then you remember that it’s like 1950, and that digital computers — at least ones that can fit in an airplane — haven’t been invented yet, and that you’re going to have to do this the hard way.
The scenario is obviously contrived, but this peek inside the Bendix MG-1 Central Air Data Computer reveals the engineer’s nightmare fuel that was needed to accomplish some pretty complex computations in a severely resource-constrained environment. As [Ken Shirriff] explains, this particular device was used aboard USAF fighter aircraft in the mid-50s, when the complexities of supersonic flight were beginning to outpace the instrumentation needed to safely fly in that regime. Thanks to the way air behaves near the speed of sound, a simple pitot tube system for measuring airspeed was no longer enough; analog computers like the MG-1 were designed to deal with these changes and integrate them into a host of other measurements critical to the pilot.
To be fair, [Ken] doesn’t do a teardown here, at least in the traditional sense. We completely understand that — this machine is literally stuffed full of a mind-boggling number of gears, cams, levers, differentials, shafts, and pneumatics. Taking it apart with the intention of getting it back together again would be a nightmare. But we do get some really beautiful shots of the innards, which reveal a lot about how it worked. Of particular interest are the torque-amplifying servo mechanism used in the pressure transducers, and the warped-plate cams used to finely adjust some of the functions the machine computes.
A key algorithm that quietly empowers and simplifies our electronics is the Fourier transform, which turns the graph of a signal varying in time into a graph that describes it in terms of its frequencies.
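As a small, hedged NumPy sketch (not from the article) of what that means in practice: build a one-second signal out of 50 Hz and 120 Hz sine waves, take its fast Fourier transform, and the two strongest frequency bins come back at exactly those frequencies.

```python
import numpy as np

fs = 1000                                   # sampling rate, Hz
t = np.arange(0, 1, 1 / fs)                 # one second of time samples
signal = np.sin(2 * np.pi * 50 * t) + 0.5 * np.sin(2 * np.pi * 120 * t)

spectrum = np.fft.rfft(signal)              # time-domain samples -> frequency bins
freqs = np.fft.rfftfreq(len(signal), 1 / fs)

# The two bins with the largest magnitude are the components the signal was built from.
top_two = freqs[np.argsort(np.abs(spectrum))[-2:]]
print(sorted(top_two))                      # -> [50.0, 120.0]
```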
Packaging signals that represent sounds or images in terms of their frequencies allows us to analyze and adjust sound and image files, Richard Stern, professor of electrical and computer engineering at Carnegie Mellon University, tells Popular Mechanics. This mathematical operation also makes it possible for us to store data efficiently.
The invention of color TV is a great example of this, Stern explains. In the 1950s, television was just black and white. Engineers at RCA developed color television, and used Fourier transforms to simplify the data transmission so that the industry could introduce color without tripling the demands on the channels by adding data for red, green, and blue light. Viewers with black-and-white TVs could continue to see the same images as they saw before, while viewers with color TVs could now see the images in color.
Future computers You WON’T See Coming… (analog computing)
An emerging technology called analogue AI accelerators has the potential to change the AI sector. These accelerators execute computations using analogue circuits rather than digital ones, which gives them advantages in speed, energy efficiency, and in handling specific kinds of AI algorithms. The video examines the technology's potential, its present constraints, and the future use of analogue computing in AI. Join us as we explore the realm of analogue AI accelerators and see how they're influencing the future of computing.