
10 Unsettling Artificial Intelligence Scenarios

https://www.patreon.com/johnmichaelgodier.

Music:

Cylinder Five by Chris Zabriskie is licensed under a Creative Commons Attribution license (https://creativecommons.org/licenses/by/4.0/)
Source: http://chriszabriskie.com/cylinders/
Artist: http://chriszabriskie.com/

Cylinder Eight by Chris Zabriskie is licensed under a Creative Commons Attribution license (https://creativecommons.org/licenses/by/4.0/)
Source: http://chriszabriskie.com/cylinders/
Artist: http://chriszabriskie.com/

20 Emerging Technologies That Will Change Our World

Technology has already changed our world. I mean, who knew that we’d be able to flick a switch to illuminate the darkness rather than lighting a candle? It’s wild. But the technology we have today and will have in the future is absolutely insane. From 3D-printed houses to robots that help us do our jobs, here are 20 emerging technologies that will change our world.

► For copyright matters please contact us: [email protected]

My Mind was Blown. AI Music is INSANE! — Google’s NEW MusicLM AI

I wonder if musicians should be worried.


Google Research introduces MusicLM, a model that can generate high-fidelity music from text descriptions. See how MusicLM casts the process of conditional music generation as a hierarchical sequence-to-sequence modeling task, and how it outperforms previous systems in audio quality and adherence to the text description. Learn more about MusicCaps, a dataset composed of 5.5k music-text pairs, and see how MusicLM can be conditioned on both text and a melody. Check out this video to see the power of MusicLM: Generating Music From Text! #GoogleResearch #MusicLM #MusicGeneration.

▼ Link(s) From Today’s Video:

✩ MusicLM: https://google-research.github.io/seanet/musiclm/examples/

► MattVidPro Website: https://MattVidPro.com.

Memories Become Chaotic before They Are Forgotten

A model for information storage in the brain reveals how memories decay with age.

Theoretical constructs called attractor networks provide a model for memory in the brain. A new study of such networks traces the route by which memories are stored and ultimately forgotten [1]. The mathematical model and simulations show that, as they age, memories recorded in patterns of neural activity become chaotic—impossible to predict—before disintegrating into random noise. Whether this behavior occurs in real brains remains to be seen, but the researchers propose looking for it by monitoring how neural activity changes over time in memory-retrieval tasks.

Memories in both artificial and biological neural networks are stored and retrieved as patterns in the way signals are passed among many nodes (neurons) in a network. In an artificial neural network, each node’s output value at any time is determined by the inputs it receives from the other nodes to which it’s connected. Analogously, the likelihood of a biological neuron “firing” (sending out an electrical pulse), as well as the frequency of firing, depends on its inputs. In another analogy with neurons, the links between nodes, which represent synapses, have “weights” that can amplify or reduce the signals they transmit. The weight of a given link is determined by the degree of synchronization of the two nodes that it connects and may be altered as new memories are stored.
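The storage-and-retrieval mechanism described above can be illustrated with a minimal Hopfield-style attractor network. This is a generic sketch, not the model used in the study: weights are set by a Hebbian rule (each weight reflects how correlated the two nodes it connects are across the stored patterns), and each node updates its value from its weighted inputs, pulling a noisy cue toward the nearest stored memory. All names and parameters here are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def store(patterns):
    """Hebbian learning: the weight of a link reflects how often the
    two nodes it connects are active together across stored memories."""
    n = patterns.shape[1]
    w = patterns.T @ patterns / n
    np.fill_diagonal(w, 0.0)  # no self-connections
    return w

def recall(w, state, sweeps=10):
    """Asynchronous updates: each node's value is set by the weighted
    inputs it receives, driving the state toward a stored pattern."""
    state = state.copy()
    n = state.size
    for _ in range(sweeps):
        for i in rng.permutation(n):
            h = w[i] @ state
            state[i] = 1.0 if h >= 0 else -1.0
    return state

# Store three random +/-1 activity patterns in a 64-node network.
patterns = rng.choice([-1.0, 1.0], size=(3, 64))
w = store(patterns)

# Corrupt 8 of the 64 nodes of the first memory, then retrieve it.
cue = patterns[0].copy()
flip = rng.choice(64, size=8, replace=False)
cue[flip] *= -1
out = recall(w, cue)
print(np.array_equal(out, patterns[0]))
```

At this low memory load (3 patterns in 64 nodes) the corrupted cue almost always falls back into the stored attractor; overloading the network with many more patterns is one simple way such memories degrade.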
