
New AI tool can aid scientists in hunting for life on Mars

The development represents “an important advance in extraterrestrial research, in which biology has often lagged behind chemistry and geology.”

A new study reveals a way to enhance the search for life on Mars by teaching artificial intelligence to detect sites that could contain “biosignatures.”

The researchers trained a deep learning framework to map biosignatures in a three-square-kilometer area of Chile’s Atacama Desert.
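The mapping approach can be pictured as sliding a trained classifier over tiles of terrain imagery to produce a probability map. The sketch below is illustrative only: the `score_fn` placeholder stands in for the study's actual trained model, and the patch size and brightness-based stand-in score are invented for the example.

```python
import numpy as np

def biosignature_probability_map(image, patch=64, score_fn=None):
    """Slide a patch window over a terrain image and score each tile.

    `score_fn` is a stand-in for a trained deep-learning model
    (hypothetical here); it maps a patch to a probability that the
    patch contains a biosignature.
    """
    if score_fn is None:
        # Placeholder score: mean brightness, scaled to [0, 1].
        score_fn = lambda p: float(p.mean()) / 255.0
    h, w = image.shape[:2]
    rows, cols = h // patch, w // patch
    probs = np.zeros((rows, cols))
    for i in range(rows):
        for j in range(cols):
            tile = image[i * patch:(i + 1) * patch,
                         j * patch:(j + 1) * patch]
            probs[i, j] = score_fn(tile)
    return probs

# Fake 256x256 grayscale terrain image yields a 4x4 probability map.
terrain = np.random.randint(0, 256, (256, 256), dtype=np.uint8)
pmap = biosignature_probability_map(terrain)
print(pmap.shape)  # → (4, 4)
```

In the real study a model trained on ground-truth survey data would replace the placeholder, and high-probability tiles would guide where a rover samples first.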


NASA/JPL-Caltech.

According to NASA, a biosignature is any “characteristic, element, molecule, substance, or feature that can be used as evidence for past or present life.” But before such a tool can be deployed on Mars or other worlds, it needs to be tested on Earth first.

Google marks major milestone towards its 1,000-language AI model

Its base model performs better than state-of-the-art models available today.

Search giant Google has completed the “critical first step” toward building an artificial intelligence (AI) model that will support the world’s one thousand most-spoken languages. In a blog post, the company released details about its Universal Speech Model (USM).

Google’s announcement is part of the build-up to its annual I/O event where it plans to unveil a slew of products powered by AI.


400tmax/iStock.

Engineers develop robots to house-hunt and scout real estate in space

The robots carry miniaturized sensors, which they deploy as they traverse a cave or other subsurface environment.

Life on Mars is closer than you think. And researchers at the University of Arizona College of Engineering are already scouting real estate and house hunting. Their helpers? A flock of robots that can explore the subsurface environments on other worlds.

“Lava tubes and caves would make perfect habitats for astronauts because you don’t have to build a structure; you are shielded from harmful cosmic radiation, so all you need to do is make it pretty and cozy,” said Wolfgang Fink, an associate professor of electrical and computer engineering at UArizona.

Fink and team have published a paper in Advances in Space Research that details a “communication network that would link rovers, lake landers, and even submersible vehicles through a so-called mesh topology network, allowing the machines to work together as a team, independently from human input,” according to a press release.
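In a mesh topology, any machine can relay messages for any other, so a lander and a submersible can communicate even without a direct link. The toy below sketches that multi-hop idea with a breadth-first search over a link graph; the device names and link layout are invented for illustration and do not come from the paper.

```python
from collections import deque

# Hypothetical link map: node names and connections are illustrative only.
links = {
    "rover_1": {"rover_2", "lander"},
    "rover_2": {"rover_1", "submersible"},
    "lander": {"rover_1"},
    "submersible": {"rover_2"},
}

def relay_path(links, src, dst):
    """Find a multi-hop route through the mesh via BFS over the link graph."""
    queue, seen = deque([[src]]), {src}
    while queue:
        path = queue.popleft()
        if path[-1] == dst:
            return path
        for nxt in links[path[-1]] - seen:
            seen.add(nxt)
            queue.append(path + [nxt])
    return None  # destination unreachable

print(relay_path(links, "lander", "submersible"))
# → ['lander', 'rover_1', 'rover_2', 'submersible']
```

The resilience the researchers describe comes from this redundancy: if one relay drops out, the search simply finds another route, with no human in the loop.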

OpenAI’s Codex tool claims to help developers write code faster and better

It can’t fix the code when it doesn’t work, though.

The conversational chatbot from OpenAI, ChatGPT, has attracted the attention of users worldwide. However, the lesser-known tool called Codex from OpenAI has quickly become a top favorite among developers. Codex currently powers the Copilot feature on GitHub.

How does OpenAI’s Codex work?


Guillaume/iStock.

For those who are relatively new to the world of programming, GitHub is an open-source community where developers share the code for the software they have written for others to use. Microsoft acquired GitHub over four years ago. Working closely with OpenAI, Microsoft has gained access to ChatGPT and Codex, OpenAI’s ChatGPT-like solution for code.
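Tools like Codex typically turn a natural-language prompt, often a comment or docstring, into a code completion. The example below illustrates that prompt-to-completion pattern; the completion was written by hand for this sketch, not generated by Codex itself.

```python
# Prompt a Codex-style assistant might receive (the comment line below),
# followed by the kind of completion such a tool typically produces.

# Return the n-th Fibonacci number iteratively.
def fibonacci(n: int) -> int:
    a, b = 0, 1
    for _ in range(n):
        a, b = b, a + b
    return a

print(fibonacci(10))  # → 55
```

In an editor, GitHub Copilot surfaces suggestions like this inline as you type; the developer still has to review them, which is why the tool can't fix code it gets wrong.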

Biocomputing With Mini-Brains as Processors Could Be More Powerful Than Silicon-Based AI

So why not sidestep this conundrum and use neural tissue directly as a biocomputer?

This month, a team from Johns Hopkins University laid out a daring blueprint for a new field of computing: organoid intelligence (OI). Don’t worry—they’re not talking about using living human brain tissue hooked up to wires in jars. Rather, as in the name, the focus is on a surrogate: brain organoids, better known as “mini-brains.” These pea-sized nuggets roughly resemble the early fetal human brain in their gene expression, wide variety of brain cells, and organization. Their neural circuits spark with spontaneous activity, ripple with brain waves, and can even detect light and control muscle movement.

In essence, brain organoids are highly-developed processors that duplicate the brain to a limited degree. Theoretically, different types of mini-brains could be hooked up to digital sensors and output devices—not unlike brain-machine interfaces, but as a circuit outside the body. In the long term, they may connect to each other in a super biocomputer trained using biofeedback and machine learning methods to enable “intelligence in a dish.”

Mind-Boggling Neuromorphic Brain Chips (Part 1)

I can’t help myself. I keep thinking about the 1961 musical Stop the World—I Want to Get Off. After opening in Manchester, England, the show transferred to the West End, London, where it ran for 485 performances.

It’s not that the plot of this extravaganza has anything to do with what we are talking about here. It’s just that the sentiment embodied by the show’s title reflects the way I’m currently feeling about artificial intelligence (AI) and machine learning (ML).

On the one hand, the current state of play with AI and ML is tremendously exciting. On the other hand, I’m starting to think that I’ve enjoyed all the excitement I can stand.

Augmented Reality with X-Ray Vision

X-AR uses wireless signals and computer vision to enable users to perceive things that are invisible to the human eye (i.e., to deliver non-line-of-sight perception). It combines new antenna designs, wireless signal processing algorithms, and AI-based fusion of different sensors.

This design introduces three main innovations:

1) AR-conformal wide-band antenna that tightly matches the shape of the AR headset visor and provides the headset with Radio Frequency (RF) sensing capabilities. The antenna is flexible, lightweight, and fits on existing headsets without obstructing any of their cameras or the user’s field of view.
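One generic way to fuse an RF-based position estimate with a camera-based one is inverse-variance weighting, where the less noisy sensor gets more influence. The sketch below shows that standard fusion rule only as an illustration of the kind of sensor combination involved; X-AR's actual AI-based fusion method is not described here, and the numbers are made up.

```python
import numpy as np

def fuse_estimates(rf_pos, rf_var, cam_pos, cam_var):
    """Inverse-variance weighted fusion of two noisy position estimates.

    A textbook fusion rule, used here only to illustrate combining
    RF and camera readings; not X-AR's published method.
    """
    w_rf, w_cam = 1.0 / rf_var, 1.0 / cam_var
    fused = (w_rf * np.asarray(rf_pos) + w_cam * np.asarray(cam_pos))
    return fused / (w_rf + w_cam)

# RF fix is less noisy (variance 0.5) than the camera fix (variance 1.5),
# so the fused estimate lands closer to the RF reading.
pos = fuse_estimates([2.0, 1.0], 0.5, [2.4, 1.2], 1.5)
print(pos)
```

The same weighting generalizes to more sensors, which is the basic appeal of fusing RF sensing with the headset's existing cameras.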

AGI Soon? 1 AI Using 2 Modalities Solves Visual IQ Test w/ 1,600,000,000 Parameters | Kosmos-1

A new multimodal artificial intelligence model from Microsoft called Kosmos-1 is able to process both text and visual data to the point of passing a visual IQ test with 26 percent accuracy, and researchers say this is a step towards AGI. Stable Diffusion AI can now read brain waves to reconstruct images that people are thinking about. Stanford has created a world record brain computer interface device with the help of AI to allow patients to type 62 words per minute with their thoughts.

AI News Timestamps:
0:00 Microsoft Kosmos-1 AI & AGI
3:34 AI Neuroscience Tech Reads Brain Waves
5:43 AI & BCI Breaks Record

#technology #tech #ai