
Anticipation of mind uploading in this movie.


In a post-nuclear-war society, blue-skinned, silver-eyed human-like robots have become a common sight as the surviving population suffers from a decreasing birth rate and has grown dependent on their assistance. A fanatical organization tries to prevent the robots from becoming too human, fearing that they will take over. Meanwhile, a scientist experiments with creating human replicas that have genuine emotions and memories…

Enjoy 😉

A Google researcher was put on leave after he apparently came to believe his AI project had become sentient. Dr Mike Pound discusses.

https://www.facebook.com/computerphile
https://twitter.com/computer_phile

This video was filmed and edited by Sean Riley.

Computer Science at the University of Nottingham: https://bit.ly/nottscomputer.

Computerphile is a sister project to Brady Haran’s Numberphile. More at http://www.bradyharan.com

WASHINGTON — Artificial intelligence and related digital tools can help warn of natural disasters, combat global warming and fast-track humanitarian aid, according to retired Army Lt. Gen. H.R. McMaster, a onetime Trump administration national security adviser.

It can also help preempt fights, highlight incoming attacks and expose weaknesses the world over, he said May 17 at the Nexus 22 symposium.

The U.S. must “identify aggression early to deter it,” McMaster told attendees of the daylong event focused on autonomy, AI and the defense policy that underpins it. “This applies to our inability to deter conflict in Ukraine, but also the need to deter conflict in other areas, like Taiwan. And, of course, we have to be able to respond to it quickly and to maintain situational understanding, identify patterns of adversary and enemy activity, and perhaps more importantly, to anticipate pattern breaks.”

Watch the launch from New Zealand of CAPSTONE, a new pathfinder CubeSat that will explore a unique orbit around the Moon!

The Cislunar Autonomous Positioning System Technology Operations and Navigation Experiment, or CAPSTONE, will be the first spacecraft to fly a near rectilinear halo orbit (NRHO) around the Moon, where the gravitational pulls of Earth and the Moon interact to allow a nearly stable orbit. CAPSTONE’s test of this orbit will pave the way for Gateway, the future lunar outpost for NASA’s Artemis program.

CAPSTONE is targeted to launch at 5:55 a.m. EDT (9:55 UTC) Tuesday, June 28 on Rocket Lab’s Electron rocket from the company’s Launch Complex 1 in New Zealand.

A new phishing attack is using Facebook Messenger chatbots to impersonate the company’s support team and steal credentials used to manage Facebook pages.

Chatbots are programs that impersonate live support people and are commonly used to provide answers to simple questions or triage customer support cases before they are handed off to a live employee.

In a new campaign discovered by TrustWave, threat actors use chatbots to steal credentials for managers of Facebook pages, commonly used by companies to provide support or promote their services.

A new GPU-based machine learning algorithm developed by researchers at the Indian Institute of Science (IISc) can help scientists better understand and predict connectivity between different regions of the brain.

The algorithm, called Regularized, Accelerated, Linear Fascicle Evaluation, or ReAl-LiFE, can rapidly analyze the enormous amounts of data generated from diffusion magnetic resonance imaging (dMRI) scans of the human brain. Using ReAl-LiFE, the team was able to evaluate dMRI data over 150 times faster than existing state-of-the-art algorithms.

“Tasks that previously took hours to days can be completed within seconds to minutes,” says Devarajan Sridharan, Associate Professor at the Centre for Neuroscience (CNS), IISc, and corresponding author of the study published in the journal Nature Computational Science.
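At its core, LiFE-style fascicle evaluation assigns each candidate fiber tract a non-negative weight according to how well it explains the measured diffusion signal, which can be posed as a regularized non-negative least-squares problem. Below is a toy, pure-Python sketch of that idea; the matrix, penalty, step size, and scale are illustrative inventions, not the paper’s actual formulation (real dMRI problems involve millions of variables and, in ReAl-LiFE, GPU acceleration):

```python
# Toy sketch: non-negative least squares with an L1 penalty, solved by
# projected gradient descent. Weak "fascicles" get pruned to zero weight.
# All numbers here are illustrative, not from the ReAl-LiFE paper.

def prune_fascicles(A, y, lam=0.1, lr=0.01, steps=2000):
    """Minimize ||A x - y||^2 + lam * sum(x)  subject to  x >= 0.

    A: each column is one candidate fascicle's predicted signal.
    y: the measured diffusion signal.
    Returns the non-negative weight of each fascicle.
    """
    m, n = len(A), len(A[0])
    x = [0.0] * n
    for _ in range(steps):
        # Residual r = A x - y
        r = [sum(A[i][j] * x[j] for j in range(n)) - y[i] for i in range(m)]
        for j in range(n):
            grad = 2 * sum(A[i][j] * r[i] for i in range(m)) + lam
            x[j] = max(0.0, x[j] - lr * grad)  # project onto x >= 0
    return x

# Two candidate fascicles: only the first explains the measured signal,
# so its weight should dominate and the second should be pruned toward 0.
A = [[1.0, 0.2],
     [1.0, 0.2],
     [0.0, 0.1]]
y = [1.0, 1.0, 0.0]
weights = prune_fascicles(A, y)
```

The L1 penalty (`lam`) is what drives spurious fascicles exactly to zero, which is how this family of methods produces sparse, interpretable connectomes.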

The differences? The new Mayflower—logically dubbed the Mayflower 400—is a 50-foot-long trimaran (that’s a boat that has one main hull with a smaller hull attached on either side), can go up to 10 knots or 18.5 kilometers an hour, is powered by electric motors that run on solar energy (with diesel as a backup if needed), and required a crew of… zero.

That’s because the ship was navigated by an on-board AI. Like a self-driving car, the ship was tricked out with multiple cameras (6 of them) and sensors (45 of them) to feed the AI information about its surroundings and help it make wise navigation decisions, such as re-routing around spots with bad weather. There’s also onboard radar and GPS, as well as altitude and water-depth detectors.

The ship and its voyage were a collaboration between IBM and a marine research non-profit called ProMare. Engineers trained the Mayflower 400’s “AI Captain” on petabytes of data; according to an IBM overview about the ship, its decisions are based on if/then rules and machine learning models for pattern recognition, but it also goes beyond those standard approaches. The algorithm “learns from the outcomes of its decisions, makes predictions about the future, manages risks, and refines its knowledge through experience.” It’s also able to integrate far more inputs in real time than a human is capable of.
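The hybrid described above, hard if/then safety rules layered over a learned risk estimate, can be sketched as follows. To be clear, this is a hypothetical illustration of the pattern, not IBM’s actual AI Captain: the sensor names, thresholds, and the stand-in scoring function are all invented for the example.

```python
# Hypothetical sketch of rule-based safety checks combined with a learned
# risk score for route decisions. Names and thresholds are illustrative.

def risk_score(wave_height_m, wind_knots):
    # Stand-in for a trained model: here, just a weighted sum of conditions.
    return 0.1 * wave_height_m + 0.02 * wind_knots

def choose_action(sensors):
    # Hard if/then rules take priority over the learned model.
    if sensors["obstacle_ahead"]:
        return "evade"
    if sensors["water_depth_m"] < 5.0:
        return "reroute"
    # Otherwise defer to the learned risk estimate, e.g. for bad weather.
    if risk_score(sensors["wave_height_m"], sensors["wind_knots"]) > 0.5:
        return "reroute"
    return "hold_course"

# Rough seas but no immediate hazard: the learned layer triggers a reroute.
decision = choose_action({"obstacle_ahead": False, "water_depth_m": 40.0,
                          "wave_height_m": 6.0, "wind_knots": 10.0})
print(decision)
```

Keeping the hard rules outside the learned model is a common design choice in safety-critical autonomy: the model can be wrong, but the depth and collision checks always fire first.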