
Sony Corp. is planning to roll out next spring a dog-shaped pet robot similar to its discontinued Aibo, with updated components that could allow it to control home appliances, people familiar with the matter said.
Sony is preparing for a media event in November to show off the product, the people said. It is unclear whether the new product will use the Aibo name and how much it will cost.
Argo AI LLC, a driverless-car developer controlled by Ford Motor Co., has purchased a 17-year-old company that makes laser systems needed to operate cars without human intervention, an important step for a conventional Detroit auto maker looking to boost its role in shaping the industry’s transformation.
Argo AI said Friday it is buying New Jersey-based Princeton Lightwave Inc. for an undisclosed price, a move that provides Ford with more immediate access to so-called lidar systems that use lasers to create a 3D view of the…
LONDON: A humanoid robot took the stage at the Future Investment Initiative yesterday and had an amusing exchange with the host to the delight of hundreds of delegates.
Smartphones were held aloft as Sophia, a robot designed by Hong Kong company Hanson Robotics, gave a presentation that demonstrated her capacity for human expression.
Sophia made global headlines when she was granted Saudi citizenship, making the kingdom the first country in the world to offer its citizenship to a robot.
Since emerging as a species, we have seen the world only through human eyes. Over the last few decades, we have added satellite imagery to that terrestrial viewpoint. Now, with recent advances in Artificial Intelligence (AI), we are not only able to see more from space but also to see the world in new ways.
One example is “Penny”, a new AI platform that can predict the median income of an area on Earth from space. It may even help us make cities smarter than is humanly possible. We are already using machines to make sense of the world as it is; the possibility before us is that machines help us create the world as it should be, and make us question the thinking behind its design.
Penny is a free tool built using high-resolution imagery from DigitalGlobe, income data from the US census, neural network expertise from Carnegie Mellon and intuitive visualizations from Stamen Design. It’s a virtual cityscape (for New York City and St. Louis, so far), where AI has been trained to recognize, with uncanny accuracy, patterns of neighbourhood wealth (trees, parking lots, brownstones and freeways) by correlating census data with satellite imagery.
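To make that training idea concrete, here is a minimal sketch, assuming a PyTorch-style convolutional network that regresses a median-income value from an image tile. The architecture, the names (IncomeRegressor, tiles, incomes) and the random stand-in data are illustrative assumptions, not Penny's actual code or DigitalGlobe's imagery pipeline.

# Minimal sketch (not Penny's actual code) of the idea behind Penny:
# train a convolutional network to regress census median income from
# satellite image tiles. Random tensors stand in for the real data.
import torch
import torch.nn as nn

class IncomeRegressor(nn.Module):
    """Small CNN that maps an RGB satellite tile to a single income estimate."""
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(3, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.AdaptiveAvgPool2d(1),
        )
        self.head = nn.Linear(32, 1)  # predicted median income (a scalar)

    def forward(self, x):
        x = self.features(x).flatten(1)
        return self.head(x).squeeze(1)

# Stand-in batch: 8 tiles of 64x64 pixels and their census median incomes.
tiles = torch.rand(8, 3, 64, 64)
incomes = torch.rand(8) * 100_000

model = IncomeRegressor()
optimizer = torch.optim.Adam(model.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()

for step in range(5):  # a few illustrative optimisation steps
    optimizer.zero_grad()
    loss = loss_fn(model(tiles), incomes)
    loss.backward()
    optimizer.step()
    print(f"step {step}: MSE loss = {loss.item():.1f}")

In the real system the tiles would come from DigitalGlobe imagery and the targets from US census tables; the rest of the sketch is plumbing.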
A group of astronomers from the universities of Groningen, Naples and Bonn has developed a method that finds gravitational lenses in enormous piles of observations. The method is based on the same kind of artificial-intelligence algorithm that Google, Facebook and Tesla have been using in recent years. The researchers published their method, together with 56 new gravitational lens candidates, in the November issue of Monthly Notices of the Royal Astronomical Society.
When a galaxy is hidden behind another galaxy, we can sometimes see the hidden one around the foreground system. This phenomenon is called a gravitational lens; it follows from Einstein's general theory of relativity, which predicts that mass can bend light. Astronomers search for gravitational lenses because they help in the study of dark matter.
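For readers who want the quantitative picture, the angular scale of the effect is usually summarised by the Einstein radius of the lens. The expression below is the standard textbook result for a point-mass lens, quoted as general background rather than taken from the paper discussed here:

\theta_E = \sqrt{\frac{4GM}{c^{2}}\,\frac{D_{LS}}{D_L D_S}}

where M is the lens mass, G the gravitational constant, c the speed of light, and D_L, D_S and D_LS the distances to the lens, to the source, and between lens and source.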
The hunt for gravitational lenses is painstaking. Astronomers have to sort through thousands of images, assisted by enthusiastic volunteers around the world. Until now, the search more or less kept pace with the availability of new images. But thanks to new observations with dedicated telescopes that image large sections of the sky, millions of images are being added, and humans cannot keep up with that pace.
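As a rough illustration of how such a search can be automated, here is a minimal sketch, assuming a PyTorch-style convolutional classifier (the family of algorithm referred to above) that scores survey cutouts and flags the high-scoring ones for human follow-up. The architecture, the 0.8 threshold and the random stand-in cutouts are assumptions for illustration, not the authors' published network.

# Minimal sketch of automating the lens hunt with a convolutional
# classifier: score each survey cutout and keep high-scoring candidates
# for visual inspection. Random tensors stand in for real cutouts; the
# architecture and the 0.8 threshold are illustrative assumptions.
import torch
import torch.nn as nn

class LensClassifier(nn.Module):
    """Tiny CNN that outputs the probability a cutout contains a lens."""
    def __init__(self):
        super().__init__()
        self.net = nn.Sequential(
            nn.Conv2d(1, 8, kernel_size=5, padding=2), nn.ReLU(),
            nn.MaxPool2d(2),
            nn.Conv2d(8, 16, kernel_size=5, padding=2), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),
            nn.Flatten(),
            nn.Linear(16, 1),
        )

    def forward(self, x):
        return torch.sigmoid(self.net(x)).squeeze(1)

model = LensClassifier()
model.eval()

# Stand-in survey batch: 1,000 single-band cutouts of 101x101 pixels.
cutouts = torch.rand(1000, 1, 101, 101)

with torch.no_grad():
    scores = model(cutouts)

# Keep only the highest-scoring cutouts for follow-up by astronomers.
candidates = (scores > 0.8).nonzero(as_tuple=True)[0]
print(f"{len(candidates)} of {len(cutouts)} cutouts flagged as lens candidates")

The point of such a design is the division of labour: the network does the bulk sorting at machine speed, while astronomers and volunteers inspect only the short list of candidates it produces.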