
Four trends that changed AI in 2023

This story originally appeared in The Algorithm, our weekly newsletter on AI.

This has been one of the craziest years in AI in a long time: endless product launches, boardroom coups, intense policy debates about AI doom, and a race to find the next big thing. But we’ve also seen concrete tools and policies aimed at getting the AI sector to behave more responsibly and hold powerful players accountable. That gives me a lot of hope for the future of AI.

Google’s Gemini: Challenging OpenAI ChatGPT And Changing The Game

This week, Google rocked the technology world with the unveiling of Gemini — an artificial intelligence system representing its most significant leap in AI capabilities to date. Hailed as a potential game-changer across industries, Gemini combines multiple data types — text, images, audio, video, and code — to unlock new possibilities in machine learning.

With three distinct versions tailored to different needs, Gemini points to a future powered by AI that can match and even outperform human intelligence. Its multimodal design builds on, and goes well beyond, predecessors such as GPT-3.5 and GPT-4 in its ability to understand our complex world dynamically.

As Google sets its sights on real-world deployment, Gemini prompts critical ethical questions around responsibility and safety. If leveraged conscientiously, its potential applications span from mundane productivity tasks to world-changing scientific breakthroughs.

Is GM’s Mary Barra in Love with Tesla?


Portable, Non-invasive, Mindreading AI turns Thoughts into Text

In a world-first, researchers from the GrapheneX-UTS Human-centric Artificial Intelligence Centre at the University of Technology Sydney (UTS) have developed a portable, non-invasive system that can decode silent thoughts and turn them into text.

The technology could aid communication for people who are unable to speak due to illness or injury, including stroke or paralysis. It could also enable seamless communication between humans and machines, such as the operation of a bionic arm or robot.

The study has been selected as the spotlight paper at the NeurIPS conference, an annual meeting that showcases world-leading research on artificial intelligence and machine learning, held in New Orleans on 12 December 2023.

Revolutionary Robotic Blended-Wing Aircraft Set to Transform Cargo Transport

Summary: A novel aircraft design pioneered by startup Natilus could dramatically alter the cargo transportation industry, offering larger capacities, reduced emissions, and futuristic remote control options.

In the field of aviation technology, a groundbreaking blended-wing robotic aircraft presents a future where efficient and sustainable cargo planes are the norm. The company pioneering this effort, Natilus, has built a model that harmonizes ecological concerns with the need for faster and cost-effective transportation.

The unconventional plane differs from traditional airliners with its distinct diamond-shaped body, which creates a more spacious cargo hold. This design enables up to 60 percent more cargo to be carried compared with current models. Natilus also claims the aircraft can cut carbon emissions by half, a crucial development for an industry under increasing pressure to become more environmentally friendly.

Improving a robot’s self-awareness by giving it proprioception

A pair of roboticists at the Munich Institute of Robotics and Machine Intelligence (MIRMI), Technical University of Munich, in Germany, has found that it is possible to give robots some degree of proprioception using machine-learning techniques. In their study reported in the journal Science Robotics, Fernando Díaz Ledezma and Sami Haddadin developed a new machine-learning approach to allow a robot to learn the specifics of its body.

Giving robots the ability to move around in the real world typically involves fitting them with technology such as cameras and sensors — data from these devices is then processed and used to direct the legs and/or feet to carry out appropriate actions. This is vastly different from the way animals, including humans, get the job done.

With animals, the brain is aware of its body state — it knows where the hands and legs are, how they work, and how they can be used to move around or interact with the environment. Such knowledge is known as proprioception. In this new effort, the researchers conferred similar abilities on robots using machine-learning techniques.
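The core idea — a robot inferring the specifics of its own body from its joint readings rather than from a hand-coded model — can be illustrated with a toy sketch. The following is not the authors' actual method, just a minimal, hypothetical example: a simulated two-link planar arm with unknown link lengths "learns" those lengths purely from proprioceptive data (joint angles paired with observed hand positions), exploiting the fact that its forward kinematics is linear in the right features.

```python
import numpy as np

# Hypothetical illustration (not the method from the Science Robotics paper):
# a 2-link planar arm whose link lengths are unknown to the learner.
rng = np.random.default_rng(0)
L1_TRUE, L2_TRUE = 0.8, 0.5  # ground-truth link lengths (hidden from the fit)

# Proprioceptive data: 500 random joint-angle configurations,
# each paired with the resulting horizontal hand position.
q = rng.uniform(-np.pi, np.pi, size=(500, 2))
x = L1_TRUE * np.cos(q[:, 0]) + L2_TRUE * np.cos(q[:, 0] + q[:, 1])

# Forward kinematics x = l1*cos(q1) + l2*cos(q1+q2) is linear in these
# features, so ordinary least squares recovers the body parameters.
features = np.column_stack([np.cos(q[:, 0]), np.cos(q[:, 0] + q[:, 1])])
l1_est, l2_est = np.linalg.lstsq(features, x, rcond=None)[0]

print(l1_est, l2_est)  # close to the true lengths 0.8 and 0.5
```

Real robots have far more complex, nonlinear body models, which is why the researchers turned to machine learning rather than closed-form fits — but the goal is the same: a self-model estimated from the robot's own sensor stream.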