In an in-depth interview, Pichai predicts: “This will be one of the biggest things we all grapple with for the next decade.”
Google released the first phase of its next-generation AI model, Gemini, today.
In today’s column, I take a deep dive into what is meant by the oft-mentioned terms Artificial Intelligence (AI) and Artificial General Intelligence (AGI).
I walk you through AI and AGI and what they mean, including a recently posted proposal by Google DeepMind on levels of autonomy. Good stuff.
Uber’s robots will be strategically positioned in busy areas with restaurants and crowds, aiming to bring a festive atmosphere to these demanding spaces.
Uber’s holiday season activities also include a festive fleet, Christmas kit, and Gift hub.
Researchers are utilizing a reinforcement learning method known as ‘curiosity-driven’ training for tasks like calling the elevator, opening doors, and sorting boxes.
Swiss-Mile’s ANYmal robot is being trained to perform practical tasks by manipulating its wheeled legs and arms.
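The article doesn’t spell out the training method, but “curiosity-driven” reinforcement learning typically rewards the agent for transitions its own forward model predicts poorly, pushing it toward novel behaviour. A minimal sketch of that intrinsic-reward idea (the linear forward model and all names here are hypothetical, not Swiss-Mile’s actual setup):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical linear forward model: predicts the next state
# from the concatenated (state, action) vector.
W = rng.normal(scale=0.1, size=(6, 4))  # 4-dim state, 2-dim action

def predict_next_state(state, action):
    return np.concatenate([state, action]) @ W

def curiosity_reward(state, action, next_state):
    # Intrinsic reward = squared prediction error of the forward model.
    # Poorly predicted (novel) transitions score higher, steering the
    # agent toward unexplored parts of the environment.
    error = np.asarray(next_state) - predict_next_state(state, action)
    return float(error @ error)

state = rng.normal(size=4)
action = rng.normal(size=2)
next_state = rng.normal(size=4)
print(curiosity_reward(state, action, next_state))  # nonnegative novelty score
```

In practice the forward model is a neural network trained alongside the policy, so transitions the agent has mastered stop being rewarding; the toy version above just illustrates the reward shape.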
The robot offers a modular, multi-functional tool that may help execute a spectrum of tasks on the lunar surface efficiently and economically.
The Japanese startup was one of 14 firms selected by DARPA for its LunA-10 study.
The alliance aims to open-source the development of artificial intelligence and take on the bad boys of AI, Microsoft, OpenAI, and Google.
Major names in the technology industry, such as IBM, Meta, and many others who seemed to have been left out of the race to develop artificial intelligence (AI) models, have now teamed up to form the AI Alliance.
The collaborative effort also includes government and research organizations and a few startups that will work together to “support open innovation and open science in AI”, a press release from IBM about the alliance said.
Ever since OpenAI released ChatGPT last year, technology companies have been caught in a frenzy to release their own AI models that can deliver text, visual, and even audio content with the help of machine learning. Although these models are still far from attaining artificial general intelligence (AGI), a handful of names have taken the lead in this arena.
A team of AI researchers at Google’s DeepMind project has developed a type of AI system that is able to demonstrate social learning capabilities. In their paper published in the journal Nature Communications, the group describes how they developed an AI application that showed it was capable of learning new skills in a virtual world by copying the actions of an implanted “expert.”
Most AI systems, such as ChatGPT, gain their knowledge through exposure to huge amounts of data, such as from repositories on the Internet. But such an approach, those in the industry have noted, is not very efficient. Therefore many in the field continue to look for other ways to teach AI systems to learn.
One of the most popular approaches used by researchers is to attempt to mimic the process by which humans learn. Like traditional AI apps, humans learn by exposure to known elements in an environment and by following the examples of others who know what they are doing. But unlike AI apps, humans pick things up without the need for huge numbers of examples. A child can learn to play the game of Jacks, for example, after watching others play for just a few minutes—an example of cultural transmission. In this new effort, the research team has attempted to replicate this process using AI constrained to a virtual world.
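The paper’s actual method isn’t reproduced here, but the core idea of learning by copying an expert can be sketched as simple imitation: record the expert’s state-action pairs, then act by matching the closest remembered state. A toy illustration (the class and all names are hypothetical, far simpler than DeepMind’s system):

```python
import numpy as np

class ImitationAgent:
    """Toy social learner: memorises an expert's demonstrations and acts
    by copying the action taken in the most similar observed state."""

    def __init__(self):
        self.states = []   # expert states (vectors)
        self.actions = []  # corresponding expert actions

    def observe_expert(self, state, action):
        self.states.append(np.asarray(state, dtype=float))
        self.actions.append(action)

    def act(self, state):
        # Nearest-neighbour lookup over remembered demonstrations.
        state = np.asarray(state, dtype=float)
        dists = [np.linalg.norm(state - s) for s in self.states]
        return self.actions[int(np.argmin(dists))]

# After only a few demonstrations, the agent reproduces expert behaviour
# in similar states: no reward signal or huge dataset required.
agent = ImitationAgent()
agent.observe_expert([0.0, 0.0], "left")
agent.observe_expert([1.0, 1.0], "right")
print(agent.act([0.1, -0.1]))  # copies the expert: "left"
```

The contrast with data-hungry training is the point: like the child watching a game of Jacks, the agent generalises from a handful of observed examples rather than millions of them.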
The feeling that we belong to something much larger and deeper than ourselves has long been a common human experience. Palaeontologist and Jesuit priest Teilhard de Chardin wrote about “a noosphere” of cognitive realisation evolving towards an “Omega point” of divine planetary spiritualisation. But it is hard to envisage that ever occurring. It is easier to envisage that we belong in an evolving intelligent power that has entered a momentous posthuman dimension through artificial intelligence.
Some futurists believe we are on the way to realising a posthuman world in which we will live on as cyborgs, or in some new embodiment of intelligent power that will absorb and supersede human intelligence. It is no longer fanciful to foresee a future in which we will have everyday interactions with androids that are powered by artificial general intelligence. They will look, move, and seem to think and respond like a human person, be skilled in simulating emotional responses realistically, and greatly out-perform us in mental activities and manual tasks. It may be we will regard them only as tools or mechanical assistants. But from their expression of human-like behaviours we may become attached to them, even to the extent of according them rights. Their design will have to ensure they don’t carry any threat, but will we be able to trust fully that this will remain the case given their technical superiority? And how far can we trust that the military, malicious groups, and rogue states won’t develop androids trained to kill people and destroy property? We know only too well about our human propensity for violent conflict.
It would be ironic if, to gain more power and control over the world, we used our human intelligence to create AI systems and devices which, for all the benefits they bring, end up managing our lives to our detriment, or even controlling us. And irony, as Greek dramatists were well aware, is often a component of fate.
SUMMARY: A soft robot with octopus-inspired sensory and motion capabilities represents significant progress in robotics, offering nimbleness and adaptability in uncertain environments.
Robotic engineers have made a leap forward with the development of a soft robot that closely resembles the dynamic movements and sensory prowess of an octopus. This groundbreaking innovation from an international collaboration involving Beihang University, Tsinghua University, and the National University of Singapore has the potential to redefine how robots interact with the world around them.
The blueprint for this highly adaptable robot draws upon the intelligent, soft-bodied mechanics of an octopus, enabling smooth, precise movement across a variety of surfaces and environments. The sensorized soft arm, aptly named the electronics-integrated soft octopus arm mimic (E-SOAM), embodies advances in soft robotics, incorporating elastic materials and sophisticated liquid-metal circuits that remain resilient under extreme deformation.