Cornell scientists create ‘living’ machines that eat, grow, and evolve

The field of robotics is going through a renaissance thanks to advances in machine learning and sensor technology. Each generation of robot is engineered with greater mechanical complexity and smarter operating software than the last. But what if, instead of painstakingly designing and engineering a robot, you could just tear open a packet of primordial soup, toss it in the microwave on high for two minutes, and then grow your own ‘lifelike’ robot?

If you’re a Cornell research team, you’d grow a bunch and make them race.


This Tic-Tac-Sized Computer Can Turn Almost Anything Into a Smart Device

The idea is to give craftspeople the tools they need to incorporate digital services into the items they’re already making. Poupyrev made it clear that he doesn’t want to fundamentally change tried-and-tested items like a jacket into a computer first and an article of clothing second. He wants to imbue everyday items with digital functionality.

In its final form, Poupyrev envisions clothing, furniture, and accessories that are all connected to the cloud, each providing its own specialized functionality. Users will interact with screens using their sleeves and pause their music by tapping their glasses. Step trackers will live in our shoes, translators will live in our ears, and medicinal nano-robots could be injected into our bloodstreams. The very notion of a computer will radically change as little computers get placed into everything.

“This could allow makers to imagine and create a new world where things are connected and we don’t need keyboards, screens, or mice to interact with computers,” he said. “I’ve been working on this for 20 years and as it’s taken shape I’m realizing that we’re not building an interface. We’re building a new kind of computer, an invisible computer.”

Read more

Fit to drive? The car will judge

When you’re sleepy, stressed or have had a few drinks, you’re not in the best position to drive – or even make that decision. But automated cars could soon make that call for you.

However, we are not there yet and we have to take it step by step, says Dr Anna Anund from the Swedish National Road and Transport Research Institute (VTI). She and her team are developing sensor-based systems as part of the ADAS&ME project to move towards level three, in which the driver can rest and would only be expected to drive when the car requests it.

Read more
