
The human fingertip is an exquisitely sensitive instrument for perceiving objects in our environment via the sense of touch. A team of Chinese scientists has mimicked the underlying perceptual mechanism to create a bionic finger with an integrated tactile feedback system capable of poking at complex objects to map out details below the surface layer, according to a recent paper published in the journal Cell Reports Physical Science.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” said co-author Jianyi Luo of Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it. This tactile technology opens up a non-optical way for the nondestructive testing of the human body and flexible electronics.”

According to the authors, previously developed artificial tactile sensors could only recognize and discriminate among external shapes, surface textures, and hardness; they could not sense subsurface information about the materials they touched. That normally requires imaging technologies such as CT scans, PET scans, ultrasonic tomography (which scans the exterior of a material to reconstruct an image of its internal structure), or MRI, all of which have drawbacks. Similarly, optical profilometry is often used to measure a surface’s profile and finish, but it only works on transparent materials.

Musk’s company is far from the only group working on brain-computer interfaces, or systems to facilitate direct communication between human brains and external computers. Other researchers have been looking into using BCIs to restore lost senses and control prosthetic limbs, among other applications. While these technologies are still in their infancy, they’ve been around long enough for researchers to increasingly get a sense of how neural implants interact with our minds. As Anna Wexler, an assistant professor of philosophy in the Department of Medical Ethics and Health Policy at the University of Pennsylvania, put it: “Of course it causes changes. The question is what kinds of changes does it cause, and how much do those changes matter?”

Time to replace X-ray machines with bionic fingers.

A team of researchers at Wuyi University (WYU) in China has developed a bionic finger that can create 3D maps of the interior of an object just by poking it gently and repeatedly, according to a press release.

Imagine your son or daughter has an electronic toy train that they love to play with, but then due to some problem, the train stops working, and your kid starts crying…


(Image credit: Li et al.)

What if, instead of using X-rays or ultrasound, we could use touch to image the insides of human bodies and electronic devices? In a study publishing in the journal Cell Reports Physical Science (“A smart bionic finger for subsurface tactile-tomography”), researchers present a bionic finger that can create 3D maps of the internal shapes and textures of complex objects by touching their exterior surface.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” says senior author Jianyi Luo, a professor at Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it.”

“Our bionic finger goes beyond previous artificial sensors that were only capable of recognizing and discriminating between external shapes, surface textures, and hardness,” says co-author Zhiming Chen, a lecturer at Wuyi University.
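The scanning idea behind this kind of subsurface tactile mapping is easy to picture in code. The sketch below is a hypothetical Python toy, not the authors’ implementation: a simulated finger rasters over a soft phantom, each gentle “poke” returns an effective stiffness reading, and thresholding the grid of readings yields a crude map of the hard features buried beneath the surface. Every name and number here (the phantom, `poke`, `scan`, the stiffness values) is an illustrative assumption.

```python
# Toy illustration of the scanning idea described above: a simulated "finger"
# pokes a soft phantom on a grid and records how stiff each point feels.
# Hard features hidden under the surface push the stiffness reading up, so the
# collected readings form a crude subsurface map. This is a hypothetical
# sketch, not the method from the Cell Reports Physical Science paper.

import numpy as np

# Phantom: a 40 x 40 mm block of soft material with a hard "bone" strip
# and a hard disc buried below the surface.
GRID_MM = 40
SOFT_STIFFNESS = 1.0      # arbitrary units
HARD_STIFFNESS = 8.0

def buried_feature(x_mm: float, y_mm: float) -> bool:
    """Return True if a hard subsurface feature lies under (x, y)."""
    in_strip = 18 <= x_mm <= 22                              # vertical strip ("bone")
    in_disc = (x_mm - 30) ** 2 + (y_mm - 10) ** 2 <= 4 ** 2  # buried disc
    return in_strip or in_disc

def poke(x_mm: float, y_mm: float, noise_std: float = 0.2) -> float:
    """Simulate one gentle poke: effective stiffness felt at the surface."""
    base = HARD_STIFFNESS if buried_feature(x_mm, y_mm) else SOFT_STIFFNESS
    return base + np.random.normal(0.0, noise_std)

def scan(step_mm: float = 1.0) -> np.ndarray:
    """Raster-scan the surface, poking once per grid point."""
    coords = np.arange(0, GRID_MM, step_mm)
    return np.array([[poke(x, y) for x in coords] for y in coords])

if __name__ == "__main__":
    stiffness_map = scan(step_mm=1.0)
    # Threshold the readings to separate hard subsurface regions from soft ones.
    mask = stiffness_map > (SOFT_STIFFNESS + HARD_STIFFNESS) / 2
    print(f"Scanned {stiffness_map.size} points; "
          f"{mask.sum()} flagged as hidden hard features.")
```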

It turns out that it is now pretty standard to have no external stitches for spaying, and in fact, if I had to do it over again, I would have picked single-port keyhole surgery, which would have been even easier to heal from. (That would have required switching vets.) So for only $400, Kaia not only had advanced surgery with no outside stitches, but she was also made into a cyborg with an implanted microchip. Pretty impressive!

This is just one example of us rushing headlong into a science fiction-type future. The biggest such example is ChatGPT, which feels way more intelligent than previous chatbots. It used to be that Ray Kurzweil would say we would have AGI by 2029 while everyone else predicted dates such as 2070 or never. Now many people pick 2029, and I could definitely see the tech behind ChatGPT being part of the recipe for AGI. For me, the first sign that AGI was coming was Content-Aware Fill being added to Photoshop. That feature allows you to erase a person from a beach scene in one quick step. Very impressive!

One more example of tech advancing is that a few years ago my right eye’s retina partially detached. My doctor did surgery with cryotherapy in his office, and inserted a sulfur hexafluoride bubble to stabilize everything. He followed up the next day with laser therapy, again in his office. No hospital needed. Eye fixed!

Video is of our killer guard dog Kaia on patrol. Watch out world!

The loss of pollinators, such as bees, is a huge challenge for global biodiversity and affects humanity by causing problems in food production. At Tampere University, researchers have now developed the first passively flying robot equipped with artificial muscle. Could this artificial fairy be utilized in pollination?

The development of stimuli-responsive polymers has brought about a wealth of material-related opportunities for next-generation small-scale, wirelessly controlled soft-bodied robots. For some time now, engineers have known how to use these materials to make small robots that can walk, swim and jump. So far, no one has been able to make them fly.

Researchers in the Light Robots group at Tampere University are now studying how to make smart materials fly. Hao Zeng, Academy Research Fellow and the group leader, and Jianfeng Yang, a doctoral researcher, have come up with a new design for their project, FAIRY – Flying Aero-robots based on Light Responsive Materials Assembly. They have developed a polymer-assembly robot that flies by wind and is controlled by light.
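To make the light-control idea concrete, here is a hypothetical toy model, not taken from the FAIRY project: light intensity sets how far the bristles open, the opening sets the drag area, and the drag area sets how fast the flyer settles in still air. The linear actuation law, the mass, and the drag numbers below are all illustrative assumptions.

```python
# Hypothetical toy model of the light-control idea described above: a dandelion-
# seed-like flyer whose light-responsive "muscle" changes how far its bristles
# open, which changes the drag area and hence how fast it settles in still air.
# The numbers and the linear actuation law are illustrative assumptions, not
# values from the FAIRY project.

import math

M_KG = 1.2e-6        # assumed flyer mass (about a milligram)
G = 9.81             # gravitational acceleration, m/s^2
RHO_AIR = 1.2        # air density, kg/m^3
C_DRAG = 1.3         # assumed drag coefficient of the open bristle disc
R_OPEN_M = 7e-3      # assumed bristle radius when fully open, m

def opening_fraction(light_w_m2: float, threshold: float = 100.0,
                     saturation: float = 500.0) -> float:
    """Assumed actuation law: bristles close linearly once light exceeds a threshold."""
    if light_w_m2 <= threshold:
        return 1.0                     # fully open in dim light
    if light_w_m2 >= saturation:
        return 0.3                     # mostly closed under strong light
    span = saturation - threshold
    return 1.0 - 0.7 * (light_w_m2 - threshold) / span

def settling_speed(light_w_m2: float) -> float:
    """Terminal velocity in still air for the current bristle opening."""
    radius = R_OPEN_M * opening_fraction(light_w_m2)
    area = math.pi * radius ** 2
    return math.sqrt(2 * M_KG * G / (RHO_AIR * C_DRAG * area))

if __name__ == "__main__":
    for light in (50, 200, 600):       # illuminance stand-ins, W/m^2
        print(f"light {light:4d} W/m^2 -> settling speed {settling_speed(light):.2f} m/s")
```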

Scientists have managed to do something truly groundbreaking. According to a new paper published in Advanced Science, researchers have created programmable cyborg cells that could help revolutionize medicine and environmental cleanup efforts. The new research, which was carried out by researchers at the University of California, Davis, shows that it is possible to create semi-living cyborg cells that retain the capabilities of living cells, but are unable to divide and grow.

Synthetic biology has made major strides towards the holy grail of fully programmable bio-micromachines capable of sensing and responding to defined stimuli regardless of their environmental context. A common type of bio-micromachine is created by genetically modifying living cells.[1] Living cells possess the unique advantage of being highly adaptable and versatile.[2] To date, living cells have been successfully repurposed for a wide variety of applications, including living therapeutics,[3] bioremediation,[4] and drug and gene delivery.[5, 6] However, the resulting synthetic living cells are challenging to control due to their continuous adaptation and evolving cellular context. Application of these autonomously replicating organisms often requires tailored biocontainment strategies,[7-9] which can raise logistical hurdles and safety concerns.

In contrast, nonliving synthetic cells, notably artificial cells,[10, 11] can be created using synthetic materials, such as polymers or phospholipids. Meticulous engineering of materials enables defined partitioning of bioactive agents, and the resulting biomimetic systems possess advantages including predictable functions, tolerance to certain environmental stressors, and ease of engineering.[12, 13] Nonliving cell-mimetic systems have been employed to deliver anticancer drugs,[14] promote antitumor immune responses,[15] communicate with other cells,[16, 17] mimic immune cells,[18, 19] and perform photosynthesis.

Fifteen-year-old Tennessee student Sergio Peralta now has a robotic hand, thanks to his classmates. It was one of the teachers in the school’s engineering program who suggested that the students could help him by developing one, and a group of them designed and built a robotic hand for their new classmate.

“In the first days of school, I honestly felt like hiding my hand,” he said to CBS News.


(Image credit: Kool99/iStock)