
For the past two centuries, humans have relied on fossil fuels for concentrated energy: hundreds of millions of years of photosynthesis packed into a convenient, energy-dense substance. But that supply is finite, and fossil fuel consumption has a tremendous negative impact on Earth’s climate.

“The biggest challenge many people don’t realize is that even nature has no solution for the amount of energy we use,” said University of Chicago chemist Wenbin Lin. Not even photosynthesis is that good, he said: “We will have to do better than nature, and that’s scary.”

One possible option scientists are exploring is “artificial photosynthesis”: reworking a plant’s system to make our own kinds of fuels. However, the chemical equipment in a single leaf is incredibly complex, and not so easy to turn to our own purposes.

The researchers revealed that deep convolutional neural networks were insensitive to configural object properties.

Deep convolutional neural networks (DCNNs) do not view things in the same way that humans do (through configural shape perception), which might be harmful in real-world AI applications. This is according to Professor James Elder, co-author of a York University study recently published in the journal iScience.

The study, conducted by Elder, who holds the York Research Chair in Human and Computer Vision and is Co-Director of York’s Centre for AI & Society, and Nicholas Baker, an assistant psychology professor at Loyola College in Chicago and a former VISTA postdoctoral fellow at York, finds that deep learning models fail to capture the configural nature of human shape perception.
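To make the distinction concrete, here is a rough, hypothetical probe in the spirit of the finding (not the procedure Baker and Elder used): give a pretrained ImageNet classifier an intact photo and a tile-scrambled copy of it. A system relying on configural, whole-object shape should be badly disrupted by scrambling; a network leaning on local texture and part features often barely changes its answer. The model choice and the file name animal.jpg are placeholders.

```python
# Hypothetical probe of configural sensitivity (not the study's actual method):
# compare a pretrained CNN's top-1 label on an intact image vs. a version whose
# tiles have been shuffled, destroying global shape but keeping local features.
import torch
from PIL import Image
from torchvision import models

weights = models.ResNet50_Weights.IMAGENET1K_V2
model = models.resnet50(weights=weights).eval()
preprocess = weights.transforms()
labels = weights.meta["categories"]

def scramble(img: Image.Image, grid: int = 4) -> Image.Image:
    """Cut the image into grid x grid tiles and paste them back in random order."""
    img = img.resize((224, 224))
    tile = 224 // grid
    tiles = [img.crop((x * tile, y * tile, (x + 1) * tile, (y + 1) * tile))
             for y in range(grid) for x in range(grid)]
    order = torch.randperm(len(tiles)).tolist()
    out = Image.new("RGB", (224, 224))
    for i, j in enumerate(order):
        out.paste(tiles[j], ((i % grid) * tile, (i // grid) * tile))
    return out

@torch.no_grad()
def top1(img: Image.Image) -> str:
    """Return the classifier's most probable ImageNet label for the image."""
    logits = model(preprocess(img).unsqueeze(0))
    return labels[int(logits.argmax())]

img = Image.open("animal.jpg").convert("RGB")  # placeholder input image
print("intact:   ", top1(img))
print("scrambled:", top1(scramble(img)))
```

If the scrambled image still draws a confident, related label, the network is behaving more like a local-feature matcher than a configural perceiver, which is the pattern the study reports.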

But it gets weirder.

The light from the table sitting just one meter away from you is also taking some time to reach you. Since it’s half as far away as the chair, you are seeing it as it was about 3.3 nanoseconds ago. That’s half as far back in the past as the chair. OK, fine, but they both appear to you in the now. What you perceive as the “now” is really layer after layer of light reaching your eye from many different moments in the past. Your “now” is an overlapping mosaic of “thens.” What you imagine to be the real world existing simultaneously with you is really a patchwork of moments from different pasts. You never live in the world as it is. You only experience it as it was, a tapestry of past vintages.
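For a quick sanity check on those numbers, here is a minimal sketch (standard physics, not taken from the article) that computes the light-travel delay for a table one meter away and a chair two meters away:

```python
# Standard physics, not from the article: light-travel delay = distance / c.
C = 299_792_458.0  # speed of light in metres per second

def delay_ns(distance_m: float) -> float:
    """Light travel time over distance_m, in nanoseconds."""
    return distance_m / C * 1e9

for name, d in (("table", 1.0), ("chair", 2.0)):
    print(f"{name} at {d:.0f} m -> seen as it was {delay_ns(d):.2f} ns ago")
```

This prints roughly 3.3 nanoseconds for the table and 6.7 nanoseconds for the chair, matching the “half as far back in the past” relationship above.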

The gentle system uses a soft micro-robotic finger that allows for safe interaction with insects and other microscopic objects.

Entomophiles out there, ever wanted to cuddle a bug? Brush through the tiny wings of a dragonfly? Tickle insects? Researchers in Japan have created what you’ve always wanted: a soft micro-robotic finger that allows humans to directly interact with insects at previously inaccessible scales.

To be fair, researchers did have some access to insect environments before. For example, microbots could interact with the environment at much smaller scales, and microsensors have been used to measure the forces insects exert during flight or walking. However, most of these studies focused only on measuring insect behavior rather than on direct insect-microsensor interaction.

Now, researchers from Ritsumeikan University in Japan have developed a soft micro-robotic finger that can enable direct interaction with the microworld. Led by Professor Satoshi Konishi, the study was published in Scientific Reports.