NASA’s Hubble Space Telescope was the first telescope designed to be serviced in orbit. Join Hubble astronauts live as they discuss servicing from the innovative Robotics Operations Center. Plus a robot demo!
Posted in robotics/AI, space
You never know how far your #SpaceApps solution will go! Gema knows that firsthand. Hear about her project Deep Asteroid, a 2016 finalist, and how she used NASA data and the open-source tool TensorFlow.
When NASA issued a worldwide challenge to help them better track the asteroids and comets that surround Earth, Gema Parreño answered the call. She used #TensorFlow, Google’s machine learning tool, to create a program called Deep Asteroid, which helps identify and track Near Earth Objects.
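The post doesn't describe Deep Asteroid's actual TensorFlow model, but the general idea of framing NEO monitoring as supervised classification can be sketched with a toy example. Everything below is invented for illustration: the synthetic catalogue, the 5-lunar-distance hazard threshold, and the hand-rolled logistic regression standing in for a real TensorFlow network.

```python
import math
import random

# Illustration only: this is NOT Deep Asteroid's model. A synthetic
# catalogue labels objects passing within 5 lunar distances as
# "hazardous" (1), and a one-feature logistic regression learns to
# predict that label from miss distance.
random.seed(42)

catalogue = []
for _ in range(2000):
    miss = random.uniform(0.1, 20.0)  # miss distance, lunar distances
    catalogue.append((miss, 1 if miss < 5.0 else 0))

# Logistic regression trained by batch gradient descent.
# Miss distance is scaled to [0, 1] to keep the updates stable.
w, b = 0.0, 0.0
lr = 0.5
for _ in range(500):
    gw = gb = 0.0
    for miss, y in catalogue:
        x = miss / 20.0
        p = 1.0 / (1.0 + math.exp(-(w * x + b)))
        gw += (p - y) * x
        gb += (p - y)
    w -= lr * gw / len(catalogue)
    b -= lr * gb / len(catalogue)

def hazard_probability(miss: float) -> float:
    """Predicted probability that an object at this miss distance
    (in lunar distances) falls in the hazardous class."""
    x = miss / 20.0
    return 1.0 / (1.0 + math.exp(-(w * x + b)))
```

A real system like Deep Asteroid would use many more orbital features and a deep network, but the shape is the same: labelled observations in, a trained classifier out.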
Special thanks to the Royal Observatory of Madrid. Learn more about them here: https://www.esmadrid.com/en/tourist-information/real-observa…gle.com%2F
Watch the next video in the series here: https://www.youtube.com/watch?v=kBxb-bIJPtw
“Our world is changing so fast… this year we have sessions on artificial intelligence, genetics and what the future holds for our planet. There is a new term now: cli-fi. We have a beautiful session on cli-fi, on what would happen if bees disappear.
“I feel at this moment in our country it is very, very important to give impetus to empirical thinking,” the author of ‘Paro: Dreams of Passion’ said.
Nobel Laureate Venki Ramakrishnan will speak on the ‘Importance of Science’, cosmologist Priyamvada Natarajan on ‘Mapping the Heavens’ and professor of AI Toby Walsh on ‘How the Future is Now’ among others.
Interest in artificial neural networks has skyrocketed over the years as companies like Google and Facebook have invested heavily in machines that can think like humans. Today, an AI can recognize objects in photos or help generate realistic computer speech, but Nvidia has successfully built a neural network that can create an entire virtual world with the help of a game engine. The researchers speculate this “hybrid” approach could one day make AI-generated games a reality.
The system built by Nvidia engineers uses many of the same parts as other AI experiments, but they’re arranged in a slightly different way. The goal of the project was to create a simple driving simulator without using any humans to design the environment.
Like all neural networks, the system needed training data. Luckily, work on self-driving cars has ensured there is plenty of footage of vehicles driving around city streets. The team used a segmentation network to recognize different object categories: trees, cars, sky, buildings, and so on. Nvidia fed this segmented data into its model, which used a generative adversarial network to improve the accuracy of the final output. Essentially, one network creates rendered scenes, and a second network passes or fails them; over time, the generator is tuned to produce only believable output.
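The adversarial loop described above can be sketched in miniature. This is not Nvidia's system: in a real GAN both networks are learned, whereas here the "discriminator" is a fixed hand-written scoring function and the "generator" has a single parameter, so only the shape of the training loop carries over.

```python
# Conceptual sketch of a generator/discriminator loop. Real data is
# assumed (for illustration) to sit at 5.0; the generator is tuned to
# maximise the score the discriminator assigns to its output.

def discriminator(x: float) -> float:
    """Score how believable a sample looks (higher = more real).
    Stand-in for a learned critic, not a trained network."""
    return -((x - 5.0) ** 2)

def train_generator(steps: int = 100, lr: float = 0.1) -> float:
    """Tune a one-parameter 'generator' by gradient ascent on the
    discriminator's score; d/dm of -(m - 5)^2 is -2(m - 5)."""
    m = 0.0  # the generator's single parameter
    for _ in range(steps):
        m += lr * (-2.0 * (m - 5.0))
    return m

mean = train_generator()
print(round(mean, 3))  # converges toward 5.0, the "real" data
```

In the full system, the discriminator is itself a network trained on the segmented driving footage, so "believable" is learned from data rather than hard-coded as it is here.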
An ESA-led team subjected Intel’s new Myriad 2 artificial intelligence chip to one of the most energetic radiation beams available on Earth. This test of its suitability to fly in space took place at CERN, the European Organization for Nuclear Research. The AI chip is related in turn to an ESA-fostered family of integrated circuits.
In May, Google made quite the splash when it unveiled Duplex, its eerily humanlike voice assistant capable of making restaurant reservations and salon appointments. It seemed to mark a new milestone in speech generation and natural-language understanding, and it pulled back the curtain on what the future of human-AI interaction might look like.
So far, robots have primarily been developed to fulfill utilitarian purposes, assisting humans or serving as tools to facilitate the completion of particular tasks. As robots become more human-like, however, this could pose significant challenges, particularly for robots built to engage with humans socially.
Humans have used sex dolls as inanimate objects for sexual pleasure throughout history. Animated sex robots, social robots created to meet humans’ needs for sex and affection, offer more. Due to recent developments in robotics and AI, sex robots are now becoming more advanced and human-like. Purchasers can have them customised both in appearance and in how they speak and behave to simulate intimacy, warmth and emotion.
Currently, sex robots are inanimate things, able to simulate but not engage in mutual intimacies. In the future, however, technological advances might allow researchers to manufacture sentient, self-aware sex robots with feelings, or sexbots. The implications of the availability of sexbots as customisable perfect partners for intimate relationships with humans are potentially vast.