
A new type of ferroelectric polymer that is exceptionally good at converting electrical energy into mechanical strain holds promise as a high-performance motion controller or “actuator” with great potential for applications in medical devices, advanced robotics, and precision positioning systems, according to a team of international researchers led by Penn State.

Mechanical strain, the way a material changes shape when force is applied, is an important property for an actuator, which is any material that changes or deforms when an external force is applied. Traditionally, actuator materials have been rigid, but soft actuators such as ferroelectric polymers display higher flexibility and environmental adaptability.

The research demonstrated the potential of ferroelectric polymer nanocomposites to overcome the limitations of traditional piezoelectric polymer composites, offering a promising avenue for the development of soft actuators with enhanced strain performance and mechanical energy density. Soft actuators are especially of interest to robotics researchers due to their strength, power and flexibility.

Facebook’s AI guru and machine learning pioneer Yann LeCun is coming for the artificial intelligence chatbot craze — and it may put him at odds with his own employer.

As Fortune reports, Meta’s chief AI scientist Yann LeCun admitted during a talk in Paris this week that he’s not exactly a fan of the current spate of chatbots and the large language models (LLMs) they’re built on.

“A lot of people are imagining all kinds of catastrophe scenarios because of AI, and it’s because they have in mind these auto-regressive LLMs that kind of spew nonsense sometimes,” he told the Meta Innovation Press Day crowd. “They say it’s not safe. They are right. It’s not. But it’s also not the future.”

Meet Flippy. Starting in 2021, this tireless fry-station specialist toiled in 10 Chicago-area locations of White Castle, America’s first fast-food hamburger chain. Working behind a protective shield to reduce burn risk, Flippy could automatically fill and empty frying baskets as well as identify foods for frying and place them in the correct basket. While Flippy safely cooked French fries, White Castle employees could focus on serving customers and performing other restaurant tasks. That’s because Flippy is an AI-powered robot.

According to the International Federation of Robotics, more than half a million industrial robots are installed around the world, most in manufacturing. Now, a shortage of qualified workers is pushing more companies to explore using robots in a wide range of roles, from filling online orders in warehouses to making room service deliveries in hotels.


The restaurant industry is using AI to improve the human side of hospitality.

Flexible displays that can change color, convey information and even send veiled messages via infrared radiation are now possible, thanks to new research from the University of Illinois Urbana-Champaign. Engineers inspired by the morphing skins of animals like chameleons and octopuses have developed capillary-controlled robotic flapping fins to create switchable optical and infrared light multipixel displays that are 1,000 times more energy efficient than light-emitting devices.

The new study led by mechanical science and engineering professor Sameh Tawfick demonstrates how bendable fins and fluids can simultaneously switch between straight or bent and hot and cold by controlling the volume and temperature of tiny fluid-filled pixels. Varying the volume of fluids within the pixels can change the directions in which the flaps flip—similar to old-fashioned flip clocks—and varying the temperature allows the pixels to communicate via infrared energy. The study findings are published in the journal Science Advances.

Tawfick’s interest in the interaction of elastic and capillary forces—or elasto-capillarity—began when he was a graduate student, spanned the basic science of hair wetting and led to his research in soft robotic displays at Illinois.

Summary: A ‘smart hand exoskeleton’, a custom-made robotic glove, can aid stroke patients in relearning dexterity-based skills like playing music. The glove, equipped with integrated tactile sensors, soft actuators, and artificial intelligence, can mimic natural hand movements and provide tactile sensations.

By applying machine learning, the glove can distinguish between correct and incorrect piano play, potentially offering a novel tool for personalized rehabilitation. Although the current design focuses on music, the technology holds promise for a broader range of rehabilitation tasks.

A research team describes their new method, NeRF AI, and tests it on Lady Gaga and Miley Cyrus music videos, reconstructing the artists’ immersive environments.

Our eyes allow us to see the world, and that ability depends on the interplay between light and the eye itself.

Vision, or sight, is the process by which light enters the eye and gets focused by the lens onto the retina, where specialized cells called photoreceptors convert the light into electrical signals. These signals are then transmitted through the optic nerve to the brain, which interprets them as visual images, allowing us to perceive the world around us.