AN ARTIFICIAL intelligence text-to-image model has forecasted a disturbing end to mankind’s existence.

The popular Craiyon AI image generator, formerly known as DALL-E Mini, produced barren landscapes and scorched plains when prompted to depict the end of humanity.

The AI has been trained to create its masterpieces using unfiltered data from the internet.

ROBOTS could one day overthrow humans in an ‘apocalyptic’ takeover, a tech expert has predicted.

Aidan Meller, the creator of the Ai-Da robot, believes that within three years artificial intelligence (AI) could overtake humanity, per The Daily Star.

He also backs Elon Musk’s belief that advances in AI could impact mankind more than nuclear war.

An international team of scientists announced on Wednesday that they have discovered two new “super-Earth” planets just 100 light-years away. Both of them are significantly larger than our own planet — and one of them may even be suitable for life.

Super-Earths are a class of exoplanet more massive than our own planet but lighter than the ice giants, according to NASA. They are made of some combination of gas and rock and can reach up to 10 times Earth’s mass.

The findings, discovered with NASA’s Transiting Exoplanet Survey Satellite and the University of Liège’s Search for Habitable Planets Eclipsing Ultra-Cool Stars (SPECULOOS), will be published in the journal Astronomy and Astrophysics.

THE JAMES Webb Space Telescope has captured new images of a distant planet in a first for the world’s top space observatory.

The James Webb Space Telescope launched on Christmas Day in 2021. Credit: Alamy

The star pasted on the images represents the planet’s host star. Credit: NASA

Photographing distant planets is extremely difficult because light from their host star will pollute the images.

Being able to decode brainwaves could help patients who have lost the ability to speak communicate again, and could ultimately provide novel ways for humans to interact with computers. Now Meta researchers have shown they can tell what words someone is hearing using recordings from non-invasive brain scans.

Our ability to probe human brain activity has improved significantly in recent decades as scientists have developed a variety of brain-computer interface (BCI) technologies that can provide a window into our thoughts and intentions.

The most impressive results have come from invasive recording devices, which implant electrodes directly into the brain’s gray matter, combined with AI that can learn to interpret brain signals. In recent years, this has made it possible to decode complete sentences from someone’s neural activity with 97 percent accuracy, and translate attempted handwriting movements directly into text at speeds comparable to texting.