
The researchers revealed that deep convolutional neural networks were insensitive to configural object properties.

Deep convolutional neural networks (DCNNs) do not see objects the way humans do, through configural shape perception, and that difference could be dangerous in real-world AI applications, according to Professor James Elder, co-author of a York University study recently published in the journal iScience.

The study, conducted by Elder, who holds the York Research Chair in Human and Computer Vision and is Co-Director of York’s Centre for AI & Society, and Nicholas Baker, an assistant psychology professor at Loyola College in Chicago and a former VISTA postdoctoral fellow at York, finds that deep learning models fail to capture the configural nature of human shape perception.
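One way to get an intuition for this finding is a simple probe, not the authors' actual protocol: take a pretrained ImageNet classifier, scramble the spatial arrangement of an image's parts, and check whether its prediction changes. Human configural perception is badly disrupted by scrambling; the study's point is that DCNNs often are not. The model choice, grid size, and input filename below are assumptions for illustration only.

```python
# Hedged sketch: does a pretrained CNN care about part arrangement?
# Scramble an image's patches and compare the top-1 prediction.
import torch
import torchvision.models as models
import torchvision.transforms as T
from PIL import Image

def shuffle_patches(img, grid=4):
    """Cut a CxHxW tensor into grid x grid patches and randomly permute them."""
    c, h, w = img.shape
    ph, pw = h // grid, w // grid
    patches = [img[:, i * ph:(i + 1) * ph, j * pw:(j + 1) * pw]
               for i in range(grid) for j in range(grid)]
    perm = torch.randperm(len(patches))
    out = img.clone()
    for k in range(len(patches)):
        i, j = divmod(k, grid)
        out[:, i * ph:(i + 1) * ph, j * pw:(j + 1) * pw] = patches[perm[k]]
    return out

model = models.resnet50(weights=models.ResNet50_Weights.IMAGENET1K_V2).eval()
preprocess = T.Compose([
    T.Resize(256), T.CenterCrop(224), T.ToTensor(),
    T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225]),
])

img = preprocess(Image.open("object.jpg").convert("RGB"))  # hypothetical input image
with torch.no_grad():
    intact = model(img.unsqueeze(0)).argmax(dim=1).item()
    scrambled = model(shuffle_patches(img).unsqueeze(0)).argmax(dim=1).item()
print(f"intact class: {intact}, scrambled class: {scrambled}, same: {intact == scrambled}")
```

If the prediction survives scrambling, the network is leaning on local texture and parts rather than on how the parts are configured.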

But it gets weirder.

The light from the table sitting just one meter away from you is also taking some time to reach you. Since it’s half as far away as the chair, you are seeing it as it was about 3.3 nanoseconds ago. That’s half as far back in the past as the chair. OK, fine, but they both appear to you in the now. What you perceive as the “now” is really layer after layer of light reaching your eye from many different moments in the past. Your “now” is an overlapping mosaic of “thens.” What you imagine to be the real world existing simultaneously with you is really a patchwork of moments from different pasts. You never live in the world as it is. You only experience it as it was, a tapestry of past vintages.
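The arithmetic behind those delays is just t = d / c, which a couple of lines make concrete (distances taken from the passage: the table at one meter, the chair at two):

```python
# Light-travel delay t = d / c for the distances in the passage.
C = 299_792_458.0  # speed of light, m/s

def delay_ns(distance_m: float) -> float:
    """Nanoseconds for light to cross distance_m metres."""
    return distance_m / C * 1e9

print(f"table (1 m): {delay_ns(1.0):.2f} ns")  # ~3.34 ns
print(f"chair (2 m): {delay_ns(2.0):.2f} ns")  # ~6.67 ns, twice as far in the past
```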

The gentle system uses a soft micro-finger that allows for safe interaction with insects and other microscopic objects.

Entomophiles out there, ever wanted to cuddle a bug? Brush through the tiny wings of a dragonfly? Tickle insects? Researchers in Japan have created what you’ve always wanted: a soft micro-robotic finger that allows humans to directly interact with insects at previously inaccessible scales.

We did have some access to insect environments before. For example, microbots could interact with the environment at much smaller scales, and microsensors have been used to measure the forces insects exert during flight or walking. However, most of these studies focused on measuring insect behavior rather than on direct insect-microsensor interaction.

Now, researchers from Ritsumeikan University in Japan have developed a soft micro-robotic finger that can enable direct interaction with the microworld. Led by Professor Satoshi Konishi, the study was published in Scientific Reports.
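The article doesn't spell out the sensing mechanism, but soft tactile fingers of this kind commonly report contact force through a strain gauge plus a calibration. Purely as a hedged sketch, with every constant invented for illustration (gauge factor, finger stiffness, the safety threshold), the conversion might look like this:

```python
# Hedged sketch only: a strain gauge on a soft finger whose resistance
# change maps to contact force via a linear calibration. All numbers
# below are assumed, not taken from the paper.
GAUGE_FACTOR = 2.0        # typical metal-foil strain gauge (assumed)
R0_OHMS = 120.0           # unstrained gauge resistance (assumed)
NEWTONS_PER_STRAIN = 5.0  # finger stiffness calibration (assumed)
SAFE_LIMIT_N = 0.010      # keep contact around the 10 mN scale (assumed)

def contact_force_newtons(r_measured_ohms: float) -> float:
    """Estimate contact force from a measured gauge resistance."""
    strain = (r_measured_ohms - R0_OHMS) / (R0_OHMS * GAUGE_FACTOR)
    return strain * NEWTONS_PER_STRAIN

for r in (120.0, 120.2, 121.0):  # simulated readings as the finger presses
    f = contact_force_newtons(r)
    flag = "  <- over the safe limit" if f > SAFE_LIMIT_N else ""
    print(f"R = {r:6.1f} ohm -> {f * 1000:5.2f} mN{flag}")
```

Monitoring force at the millinewton scale is what makes "safe" interaction with something as fragile as an insect plausible at all.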

The MK30 has custom-designed propellers that will reduce its perceived noise by another 25 percent.

Amazon unveiled its next-generation delivery drone, the MK30, on Thursday. It promises increased range, expanded temperature tolerance, and the ability to fly in light rain. The MK30 is due to enter service in 2024, the company wrote in a blog post.



The company kickstarted its drone delivery idea with the 2013 announcement of Prime Air. Back then, drones delivering packages of up to five pounds to houses in less than half an hour seemed too good to be true. Amazon’s promises were not science fiction, though. The company’s current fleet of delivery drones flies 400 feet above the ground at speeds of up to 50 mph, carrying packages of up to five pounds within a range of nine miles.
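Those figures hang together: at 50 mph, even a full nine-mile leg takes roughly eleven minutes, comfortably inside the promised half hour.

```python
# Sanity-checking the numbers above: one-way flight time at max range.
RANGE_MILES = 9.0
SPEED_MPH = 50.0

one_way_minutes = RANGE_MILES / SPEED_MPH * 60
print(f"One-way flight at maximum range: {one_way_minutes:.1f} minutes")  # 10.8
```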