
More details about it will be revealed at CES 2024.

LG is going to start selling a compact bipedal robot that can roll around your house freely. The AI-powered robot, which will debut at CES 2024 in Las Vegas, has a wide range of capabilities, from notifying you that you left the AC on while you’re away to watching your pet while you’re at work. Like stationary smart home assistants such as Amazon’s Alexa or the Apple HomePod, LG’s robot can also tell you the weather and remind you to take your medications on time.

The robot is powered by Qualcomm’s Robotics RB5 Platform, a combination of hardware and software that runs the bot’s AI features, including recognizing faces and voices, reading the emotions of those around it, and holding conversations. LG says the bot will be able to greet you at your door, analyze your mood, and play music either to lift your spirits or to lull you to sleep. It can even “emote” by shifting its posture, thanks to articulated leg joints; a cute touch, though one that may have little practical use beyond making the robot more approachable.

Industrial mishaps are not rare, but caution is paramount in how they are reported. Tesla CEO Elon Musk has lashed out at the media for sensationalizing an old injury caused by a robot at the company’s Giga Texas factory in Austin, Texas. He claimed the media was trying to link the incident to his futuristic Optimus robots, which he says will usher in a new era of abundance.

The incident, which happened two years ago, involved an engineer who was writing software for robots that cut car parts from freshly cast aluminum. Unbeknownst to him, one of the robots was still active while the other two were disabled for maintenance. The active robot then struck, pinning him down and clawing at his back and arm. The attack left a trail of blood on the factory floor, as well as an open wound on the engineer’s left hand.

Scientists pave the way for new culinary frontiers.


This E-tongue can identify four tastes – saltiness, sourness, astringency, and sweetness – from just a tiny sample of food, using deep learning to interpret its sensor readings. It even performs well across different kinds of wine.

The E-tongue is a versatile tool with potential applications across the food, beverage, cosmetics, and pharmaceutical industries, the researchers explained in a press release from the Daegu Gyeongbuk Institute of Science & Technology (DGIST).

“The novel technology developed in this study is an electronic tongue system that integrates sensors and deep learning and measures complex flavors, and it is a sensor-deep-learning technology that can quantitatively evaluate taste, which was difficult in the past,” said Professor Kyung-In Jang from the DGIST Department of Robotics and Mechanical and Electronic Engineering.
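The press release doesn’t give model details, so the following is only a toy stand-in for the core idea: mapping a multichannel electrochemical reading to one of the four taste labels. DGIST’s system uses deep learning; here a nearest-centroid classifier over made-up sensor profiles keeps the sketch self-contained. The channel count and profiles are illustrative assumptions, not the paper’s values.

```python
import numpy as np

# Toy illustration of the e-tongue idea (not DGIST's model): each sample
# yields a multichannel sensor reading, and a learned model maps it to a
# taste label. A nearest-centroid classifier stands in for the paper's
# deep-learning model.

TASTES = ["salty", "sour", "astringent", "sweet"]
rng = np.random.default_rng(7)

# Assumed: 16 sensor channels; each taste has a characteristic response
# profile, and real samples are noisy versions of it.
profiles = rng.normal(size=(4, 16))

def classify(reading: np.ndarray) -> str:
    """Return the taste whose reference profile is closest to the reading."""
    dists = np.linalg.norm(profiles - reading, axis=1)
    return TASTES[int(dists.argmin())]

# Simulate a noisy "sour" sample and classify it.
sample = profiles[1] + 0.1 * rng.normal(size=16)
print(classify(sample))   # prints: sour (noise is small vs. profile spacing)
```

The real system would learn its mapping from labeled tastings rather than from fixed reference profiles, but the input/output shape is the same: one vector of sensor values in, one taste judgment out.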

NEW YORK (AP) — Nearly three years after rioters stormed the U.S. Capitol, the false election conspiracy theories that drove the violent attack remain prevalent on social media and cable news: suitcases filled with ballots, late-night ballot dumps, dead people voting.

Experts warn it will likely be worse in the coming presidential election contest. The safeguards that attempted to counter the bogus claims the last time are eroding, while the tools and systems that create and spread them are only getting stronger.

Previously, researchers have used implants surgically placed in the brain or bulky, expensive machines to translate brain activity into text. The new approach, presented at this week’s NeurIPS conference by researchers from the University of Technology Sydney, is impressive for its use of a non-invasive EEG cap and the potential to generalize beyond one or two people.

The team built an AI model called DeWave that’s trained on brain activity and language and linked it up to a large language model—the technology behind ChatGPT—to help convert brain activity into words. In a preprint posted on arXiv, the model beat previous top marks for EEG thought-to-text translation with an accuracy of roughly 40 percent. Chin-Teng Lin, corresponding author on the paper, told MSN they’ve more recently upped the accuracy to 60 percent. The results are still being peer-reviewed.
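The article doesn’t include implementation details, but one reported ingredient of DeWave is that it first quantizes EEG signals into a sequence of discrete codes (a learned “codex”) that a language model can then decode into words. The sketch below shows only that quantization step; the channel count, window size, codebook size, and the random codebook itself are illustrative assumptions, not the authors’ values.

```python
import numpy as np

# Conceptual sketch of a DeWave-style front end (not the authors' code):
# 1) segment raw EEG into windows, 2) vector-quantize each window against a
# learned codebook ("discrete codex"), 3) hand the resulting code sequence
# to a language model that decodes it into text (step 3 omitted here).

rng = np.random.default_rng(0)

N_CHANNELS = 8        # EEG electrodes (assumed)
WINDOW = 64           # samples per window (assumed)
CODEBOOK_SIZE = 256   # number of discrete codes (assumed)

# Stand-in for a learned codebook: one vector per discrete code.
codebook = rng.normal(size=(CODEBOOK_SIZE, N_CHANNELS * WINDOW))

def encode_eeg(eeg: np.ndarray) -> np.ndarray:
    """Map (windows, channels, samples) EEG to a sequence of discrete code
    ids by nearest-neighbour lookup in the codebook (vector quantization)."""
    flat = eeg.reshape(eeg.shape[0], -1)                       # (W, C*S)
    dists = np.linalg.norm(flat[:, None, :] - codebook[None], axis=-1)
    return dists.argmin(axis=1)                                # (W,)

# Fake a 10-window EEG recording and encode it into code ids.
eeg = rng.normal(size=(10, N_CHANNELS, WINDOW))
codes = encode_eeg(eeg)
print(codes.shape)
```

In the full pipeline the codebook is learned jointly with the decoder, and the code sequence plays the role that word tokens play for an ordinary language model.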

Though there’s a long way to go in terms of reliability, it shows progress in non-invasive methods of reading and translating thoughts into language. The team believes their work could give voice to those who can no longer communicate due to injury or disease or be used to direct machines, like walking robots or robotic arms, with thoughts alone.

Skyline Robotics is disrupting the century-old practice of window washing with new technology that the startup hopes will redefine a risky industry.

Its window-washing robot, Ozmo, is now operational in Tel Aviv and New York, and has worked on major Manhattan buildings such as 10 Hudson Yards, 383 Madison, 825 3rd Avenue, and 7 World Trade Center in partnership with Platinum, the city’s largest commercial window cleaner, and real estate giant The Durst Organization.

The machine is suspended from the side of a high-rise. A robotic arm with a brush attached to its end cleans the glass, guided by a LiDAR camera that uses laser pulses to map 3D environments. The camera maps the building’s exterior and identifies the boundaries of each window.
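The story doesn’t describe Ozmo’s software, but the mapping step it mentions, turning LiDAR points into window boundaries, can be illustrated with basic geometry: fit a plane to the facade points, then project a window’s points into that plane to get its 2D extent. Everything below (point counts, dimensions, the least-squares plane fit) is an assumed toy setup, not Skyline’s pipeline.

```python
import numpy as np

# Illustrative sketch (not Skyline's software): from a lidar point cloud of
# a vertical facade, estimate the facade plane by least squares, then
# project a window's points onto that plane to get the 2D work area a
# cleaning arm would sweep.

def fit_plane(points):
    """Least-squares plane through points: returns (centroid, unit normal).
    The normal is the singular vector with the smallest singular value."""
    centroid = points.mean(axis=0)
    _, _, vt = np.linalg.svd(points - centroid)
    return centroid, vt[-1]

def window_extent(points, centroid, normal):
    """Project points into the plane's 2D frame and return min/max corners.
    Assumes a vertical facade, so the normal is not parallel to +z."""
    u = np.cross(normal, [0.0, 0.0, 1.0])   # horizontal in-plane axis
    u /= np.linalg.norm(u)
    v = np.cross(normal, u)                 # second in-plane axis
    uv = (points - centroid) @ np.stack([u, v], axis=1)   # (N, 2)
    return uv.min(axis=0), uv.max(axis=0)

# Toy data: a flat facade in the x-z plane with a 1.0 m x 1.5 m window patch.
rng = np.random.default_rng(1)
pts = rng.uniform([0.0, 0.0, 0.0], [1.0, 0.001, 1.5], size=(500, 3))
c, n = fit_plane(pts)
lo, hi = window_extent(pts, c, n)
print(hi - lo)   # approx. the window's width and height in metres
```

A real system would also have to segment which points belong to which window and reject reflections from the glass itself, which is where the heavier perception work lives.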

When the theoretical physicist Leonard Susskind encountered a head-scratching paradox about black holes, he turned to an unexpected place: computer science. In nature, most self-contained systems eventually reach thermodynamic equilibrium… but not black holes. The interior volume of a black hole appears to forever expand without limit. But why? Susskind had a suspicion that a concept called computational complexity, which underpins everything from cryptography to quantum computing to the blockchain and AI, might provide an explanation.

He and his colleagues believe that the complexity of quantum entanglement continues to evolve inside a black hole long past the point of what’s called “heat death.” Now Susskind and his collaborator, Adam Brown, have used this insight to propose a new law of physics: the second law of quantum complexity, a quantum analogue of the second law of thermodynamics.
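For readers who want the quantitative shape of the claim, the proposed law can be stated schematically (these formulas summarize Brown and Susskind’s published work; they are not quoted from the video): for a system of K qubits, complexity is expected to grow linearly for an exponentially long time before saturating, and the “complexity = volume” conjecture ties that growth to the black hole’s expanding interior volume V(t).

```latex
% Schematic: linear complexity growth for a K-qubit system,
% saturating only after an exponentially long time,
\frac{d\mathcal{C}}{dt} \sim K,
\qquad
\mathcal{C}_{\max} \sim e^{K},
\qquad
t_{\text{saturation}} \sim e^{K},
% and the "complexity = volume" conjecture relating complexity to the
% interior volume V(t), with G Newton's constant and \ell a length scale:
\mathcal{C}(t) \;\propto\; \frac{V(t)}{G\,\ell}.
```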

Also appearing in the video: Xie Chen of Caltech, Adam Bouland of Stanford, and Umesh Vazirani of UC Berkeley.

00:00 Intro to a second law of quantum complexity.

Artificial Intelligence is our best bet to understand the nature of our mind, and how it can exist in this universe.

Joscha Bach, Ph.D., is an AI researcher who has worked on and published about cognitive architectures, mental representation, emotion, social modeling, and multi-agent systems. He earned his Ph.D. in cognitive science from the University of Osnabrück, Germany. He is especially interested in the philosophy of AI and in using computational models and conceptual tools to understand our minds and what makes us human.

Joscha has taught computer science, AI, and cognitive science at the Humboldt University of Berlin, the Institute of Cognitive Science at Osnabrück, and the MIT Media Lab, and authored the book “Principles of Synthetic Intelligence” (Oxford University Press).
This talk was given at a TEDx event using the TED conference format but independently organized by a local community.