
After 12 years of work, a huge team of researchers from the UK, US, and Germany have completed the largest and most complex brain map to date, describing every neural connection in the brain of a larval fruit fly.

Though nowhere near the size and complexity of a human brain, it still covers a respectable 548,000 connections between a total of 3,016 neurons.

The mapping identifies the different types of neurons and their pathways, including interactions between the two sides of the brain, and between the brain and ventral nerve cord. This brings scientists closer to understanding how the movements of signals from neuron to neuron lead to behavior and learning.
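To make the idea of a connectome map concrete, here is a minimal sketch of how such a wiring diagram can be represented as a weighted directed graph using the networkx library; the neuron IDs, cell types, and synapse counts below are invented for illustration and are not taken from the fruit fly dataset.

```python
# Minimal sketch: a connectome as a weighted directed graph (illustrative values only).
import networkx as nx

G = nx.DiGraph()

# Each node is a neuron with a cell type and a brain side (hypothetical examples).
G.add_node("KC_001", cell_type="Kenyon cell", side="left")
G.add_node("MBON_07", cell_type="mushroom body output neuron", side="left")
G.add_node("DN_114", cell_type="descending neuron", side="right")

# Each edge is a connection, weighted by the number of synapses (made-up counts).
G.add_edge("KC_001", "MBON_07", synapses=12)
G.add_edge("MBON_07", "DN_114", synapses=5)   # crosses hemispheres
G.add_edge("DN_114", "KC_001", synapses=2)    # feedback loop

# Simple queries one might run on a real connectome:
print("Strongest inputs to MBON_07:",
      sorted(G.in_edges("MBON_07", data="synapses"), key=lambda e: -e[2]))
print("Edges crossing hemispheres:",
      [(u, v) for u, v in G.edges
       if G.nodes[u]["side"] != G.nodes[v]["side"]])
```

Queries like these (strongest inputs, cross-hemisphere links, pathways to the ventral nerve cord) are the kind of analysis a complete wiring diagram makes possible.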

South Korea’s giant leap into space started with a small step on the internet.

With treaties banning certain tech transfers, South Korea’s rocket scientists turned to a search service to find an engine they could mimic as the country embarked on an ambitious plan to build an indigenous space program. The nation launched its first home-grown rocket called Nuri in October 2021.

First evidence of magnetic fields in the Universe’s galactic web.

The cosmic web is the name astronomers give to the large-scale structure of our Universe. It refers to the clusters, filaments, dark matter, and voids that form the backbone of this ever-expanding Universe. We can observe it with optical telescopes by mapping the locations of galaxies.

In new research published in Science Advances, scientists claim to have observed, for the first time, shockwaves moving through the galaxy clusters and filaments that make up the galactic or cosmic web, a phenomenon that has long been a mystery.

Algorithms can turn photo snapshots into 3D video or an immersive space, a technique termed “Neural Radiance Fields” (NeRF). Google now wants to use it to turn Google Maps into a gigantic 3D space. Three videos below demonstrate the method: 1) a simple demonstration, 2) Google’s immersive maps, and 3) using this principle to make dark, grainy photographs clear and immersive.

This technique is different from “time of flight” cameras, which build a 3D snapshot from the time light takes to travel to and from objects. Combined with that technology, and with a constellation of microsatellites the size of cell phones, a new version of “Google Earth” with live, continual imaging of the whole planet could eventually be envisioned.

2) https://www.youtube.com/watch?v=EUP5Fry24ao.

3)
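For readers curious how a radiance field actually produces a picture, here is a minimal sketch of the core NeRF idea: a learned function maps each 3D point (and viewing direction) to a color and a density, and a pixel is formed by volume-rendering samples along the camera ray. The trained network is replaced here by a toy hand-written field that ignores view direction, so this illustrates only the rendering math, not Google’s pipeline.

```python
# Toy NeRF-style volume rendering along one camera ray (NumPy only).
# A real NeRF replaces `radiance_field` with a trained MLP queried at position and view direction.
import numpy as np

def radiance_field(points):
    """Stand-in for the learned field: returns (rgb, density) per 3D point."""
    # A fuzzy red sphere of radius 1 centered at the origin (purely illustrative).
    dist = np.linalg.norm(points, axis=-1)
    density = np.where(dist < 1.0, 5.0, 0.0)              # sigma: opacity per unit length
    rgb = np.tile(np.array([1.0, 0.2, 0.2]), (len(points), 1))
    return rgb, density

def render_ray(origin, direction, near=0.0, far=4.0, n_samples=64):
    t = np.linspace(near, far, n_samples)                  # sample depths along the ray
    points = origin + t[:, None] * direction               # 3D sample positions
    rgb, sigma = radiance_field(points)

    delta = np.diff(t, append=far)                         # spacing between samples
    alpha = 1.0 - np.exp(-sigma * delta)                   # per-sample opacity
    trans = np.cumprod(np.concatenate([[1.0], 1.0 - alpha[:-1]]))  # transmittance T_i
    weights = trans * alpha
    return (weights[:, None] * rgb).sum(axis=0)            # composited pixel color

pixel = render_ray(origin=np.array([0.0, 0.0, -3.0]),
                   direction=np.array([0.0, 0.0, 1.0]))
print("rendered pixel RGB:", pixel)
```

By contrast, a time-of-flight camera measures depth directly, as distance = (speed of light × round-trip time) / 2 at each pixel, rather than inferring geometry from ordinary photographs.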



GenAug, developed by Meta AI and the University of Washington, uses pre-trained text-to-image generative artificial intelligence models to enable imitation-based learning in real-world robots. Stanford artificial intelligence researchers have proposed a method called ATCON that drastically improves the quality of attention maps and classification performance on unseen data. Google’s new SingSong AI can generate instrumental music that complements your singing.
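As a rough, hedged illustration of the augmentation idea behind GenAug (not its actual code or configuration), the sketch below uses an off-the-shelf text-to-image inpainting pipeline to repaint the background of a robot demonstration image while the task-relevant region stays fixed; the file names, prompts, and model checkpoint are placeholders.

```python
# Hedged sketch: background augmentation of a robot demo image with a pre-trained
# text-to-image inpainting model (Hugging Face diffusers). Paths, prompts, and the
# model ID are illustrative placeholders, not GenAug's actual configuration.
import torch
from PIL import Image
from diffusers import StableDiffusionInpaintPipeline

pipe = StableDiffusionInpaintPipeline.from_pretrained(
    "runwayml/stable-diffusion-inpainting",    # any inpainting checkpoint would do
    torch_dtype=torch.float16,
).to("cuda")

demo_frame = Image.open("demo_frame.png").convert("RGB")          # original demo image
background_mask = Image.open("background_mask.png").convert("L")  # white = repaint

# Generate several visually distinct scenes for the same demonstrated action.
prompts = [
    "a cluttered kitchen countertop, photorealistic",
    "a wooden workbench in a garage, photorealistic",
]
for i, prompt in enumerate(prompts):
    augmented = pipe(prompt=prompt, image=demo_frame,
                     mask_image=background_mask).images[0]
    augmented.save(f"augmented_{i}.png")  # reused with the original action labels
```

The augmented frames keep the demonstrated action intact while varying the scenery, which is the general mechanism by which generative augmentation broadens a small set of robot demonstrations.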


My 11th ambient music video release for YouTube: an unofficial soundtrack to the sci-fi movie ‘2010: The Year We Make Contact’ (starring Roy Scheider and Helen Mirren). The movie was based on the Arthur C. Clarke novel, the sequel to 2001: A Space Odyssey. I went a lot more in depth with the visuals on this one, recreating shots from the original movie but with an extra dash of VFX that wouldn’t have been easy to pull off on a PC in 1986.

In upcoming video releases I will be doing a deep dive into the ambient multiverse, exploring various styles from Space Ambient to Dark Ambient to Cyberpunk to Sleep music to White Noise. My focus on this channel is to create relaxing cinematic ambient background music for chilling, focus, work and meditation, with the occasional eerie dark ambient track. The theme for my video backdrops is a rich fusion of derelict imagery, planets and moons.

Music & Animation by Duncan Brown.
Planet Maps by Robert Stein III (Pinterest)

I made this for you to enjoy. Like, share, subscribe.

Hey folks, I’m excited to share a new essay with y’all on my proposed route towards nanoscale human brain connectomics. I suggest that synchrotron ‘expansion x-ray microscopy’ has the potential to enable anatomical imaging of the entire human brain with sub-100 nm voxel size and high contrast in around 1 year for a price of roughly $10M. I plan to continue improving this essay over time as I acquire more detailed information and perform more calculations.
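To convey the scale involved, here is a back-of-envelope calculation under round-number assumptions (a ~1,200 cm³ brain volume, a 100 nm voxel pitch, and a one-year scan); these inputs are illustrative and the essay’s own figures may differ.

```python
# Back-of-envelope: data scale for whole-human-brain imaging at 100 nm voxels.
# All inputs are rough, assumed round numbers, not figures from the essay.
brain_volume_m3 = 1.2e-3          # ~1,200 cm^3 human brain
voxel_edge_m = 100e-9             # 100 nm voxel pitch (pre-expansion equivalent)
seconds_per_year = 3.15e7

voxels = brain_volume_m3 / voxel_edge_m**3
throughput = voxels / seconds_per_year    # sustained rate needed for a one-year scan

print(f"voxels:            {voxels:.2e}")                # ~1.2e18 voxels
print(f"needed throughput: {throughput:.2e} voxels/s")   # ~3.8e10 voxels/s
```

Even at one byte per voxel, that is on the order of an exabyte of raw image data, which gives a sense of why synchrotron-class throughput enters the picture.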

For a brief history of this concept: I started exploring this idea during undergrad (working with a laboratory-scale x-ray microscope), but the work was cut short by the pandemic. Now, I’m working on a PhD in biomedical engineering centered on gene therapy and synthetic biology, but I have retained a strong interest in connectomics. I recently began communicating with some excellent collaborators who might be able to help move this technology forward. Hoping for some exciting progress!


By Logan Thrasher Collins.

PDF version

The German company will launch its operating system by the middle of this decade.

German luxury and commercial vehicle brand Mercedes-Benz has announced a software partnership with Google to offer “super-computer-like” navigation and other services in every car, Reuters reports.


Mercedes’ plans for the future

Mercedes’ partnership with Google follows the route that conventional carmakers such as Ford, Renault, Nissan, and General Motors have taken to add Google’s suite of services to their cars. The partnership allows users to tap into Google Maps, Google Assistant, and other services, and to use traffic information to determine the best routes to their destinations.

What if, instead of using X-rays or ultrasound, we could use touch to image the insides of human bodies and electronic devices? In a study published in the journal Cell Reports Physical Science (“A smart bionic finger for subsurface tactile-tomography”), researchers present a bionic finger that can create 3D maps of the internal shapes and textures of complex objects by touching their exterior surface.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” says senior author Jianyi Luo, a professor at Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it.”

“Our bionic finger goes beyond previous artificial sensors that were only capable of recognizing and discriminating between external shapes, surface textures, and hardness,” says co-author Zhiming Chen, a lecturer at Wuyi University.
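As a toy illustration of the general tomography idea only (not the paper’s actual method or hardware), the sketch below probes a grid of surface points, records a noisy stiffness reading at each, and thresholds the readings into a rough map of where hard material lies beneath a soft layer.

```python
# Toy illustration of tactile "tomography": infer a hidden hard inclusion from
# a grid of surface stiffness readings. Purely conceptual, not the paper's method.
import numpy as np

rng = np.random.default_rng(0)

# Ground truth: a hard bar buried under a soft layer (1 = hard material below).
hidden = np.zeros((10, 10))
hidden[4:6, 2:8] = 1.0

# Simulated probing: stiffness reads higher above the hard inclusion, plus sensor noise.
soft_stiffness, hard_stiffness = 1.0, 4.0
readings = soft_stiffness + (hard_stiffness - soft_stiffness) * hidden
readings += rng.normal(scale=0.2, size=readings.shape)

# Crude "reconstruction": threshold the stiffness map to locate the inclusion.
estimate = (readings > (soft_stiffness + hard_stiffness) / 2).astype(int)

print("estimated subsurface map (1 = hard material):")
print(estimate)
```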