This is a clip from Technocalyps, a documentary in three parts about the exponential growth of technology and transhumanism, directed by Frank Theys and featuring thinkers such as Hans Moravec. The documentary first came out in 1998, and an updated version was released in 2006. This is how the filmmakers themselves describe what the movie is about:

“The accelerating advances in genetics, brain research, artificial intelligence, bionics and nanotechnology seem to converge to one goal: to overcome human limits and create higher forms of intelligent life and to create transhuman life.”

You can see the whole documentary here: https://www.youtube.com/watch?v=fKvyXBPXSbk. Or, if you’re more righteous than I am, you can order the DVD at technocalyps.com.

Research in animal models has demonstrated that stem-cell-derived heart tissues show promise for treating cardiac disease. But before such therapies are viable and safe for use in humans, scientists must first understand precisely, at the cellular and molecular levels, which factors are necessary for implanted stem-cell-derived heart cells to grow properly and integrate in three dimensions with the surrounding tissue.

New findings from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) make it possible for the first time to monitor the functional development and maturation of cardiomyocytes—the cells responsible for regulating the heartbeat through synchronized contractions—at the single-cell level using tissue-embedded nanoelectronics. The devices—which are flexible, stretchable, and can seamlessly integrate with living cells to create “cyborgs”—are reported in a Science Advances paper.

“These mesh-like nanoelectronics, designed to stretch and move with growing tissue, can continuously capture long-term activity within individual stem-cell derived cardiomyocytes of interest,” says Jia Liu, co-senior author on the paper, who is an assistant professor of bioengineering at SEAS, where he leads a lab dedicated to bioelectronics.
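To make the idea of continuously monitoring individual cardiomyocytes more concrete, here is a minimal Python sketch of extracting a beat rate from a long single-cell voltage recording. It is not the authors' analysis pipeline; the synthetic signal, sampling rate, threshold, and refractory period are all assumptions for illustration.

```python
import numpy as np

fs = 1000                      # sampling rate in Hz (assumed)
t = np.arange(0, 30, 1 / fs)   # 30 seconds of recording

# Synthetic action-potential-like spikes at ~1.2 Hz plus noise (illustrative only)
beat_times = np.arange(0.5, 30, 1 / 1.2)
signal = np.zeros_like(t)
for bt in beat_times:
    signal += np.exp(-((t - bt) ** 2) / (2 * 0.01 ** 2))  # narrow Gaussian "spike"
signal += 0.05 * np.random.randn(t.size)

# Simple threshold-crossing beat detector with a refractory period
threshold = 0.5
refractory = int(0.3 * fs)     # ignore re-crossings within 300 ms
above = signal > threshold
crossings = np.where(above[1:] & ~above[:-1])[0] + 1

beats, last = [], -refractory
for idx in crossings:
    if idx - last >= refractory:
        beats.append(idx)
        last = idx

intervals = np.diff(np.array(beats)) / fs   # inter-beat intervals in seconds
print(f"Detected {len(beats)} beats")
print(f"Mean rate: {60 / intervals.mean():.1f} beats per minute")
```

Tracking how such beat statistics drift over days or weeks is one way a long-term, tissue-embedded recording could reveal the maturation of individual cells.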

It’s a revolutionary step forward for soft robotics.

A team of scientists from the University of Edinburgh has engineered a smart electronic skin that could pave the way for soft, flexible robotic devices with a sense of touch, according to a press release published by the university last week.

The technology could enable breakthroughs in soft robotics, opening up a range of applications such as surgical tools, prosthetics, and devices for exploring hazardous environments.


Image credit: University of Edinburgh.

Researchers say their stretchable e-skin gives robots, for the first time, a level of physical self-awareness similar to that of people and animals.
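As a rough illustration of what a touch-sensitive skin makes possible, here is a minimal Python sketch that estimates where a soft surface is being pressed from a small grid of pressure readings, using a pressure-weighted centroid. This is not the Edinburgh group's method; the 4x4 sensor grid, spacing, and readings are made-up assumptions.

```python
import numpy as np

# Pressure readings (arbitrary units) from an assumed 4x4 array of stretchable sensors
pressure = np.array([
    [0.0, 0.1, 0.0, 0.0],
    [0.1, 0.8, 0.9, 0.1],
    [0.0, 0.7, 1.0, 0.2],
    [0.0, 0.1, 0.2, 0.0],
])

taxel_pitch_mm = 5.0  # assumed spacing between sensing elements

rows, cols = np.indices(pressure.shape)
total = pressure.sum()

# Pressure-weighted centroid gives an estimate of the contact location
y_mm = (rows * pressure).sum() / total * taxel_pitch_mm
x_mm = (cols * pressure).sum() / total * taxel_pitch_mm
load_proxy = total  # summed pressure as a crude proxy for how hard the skin is pressed

print(f"Estimated contact at x={x_mm:.1f} mm, y={y_mm:.1f} mm, total load {load_proxy:.2f}")
```

Feeding this kind of contact estimate back into a controller is, in spirit, how a soft robot could begin to "know" where its own body is being touched.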

The human fingertip is an exquisitely sensitive instrument for perceiving objects in our environment via the sense of touch. A team of Chinese scientists has mimicked the underlying perceptual mechanism to create a bionic finger with an integrated tactile feedback system capable of poking at complex objects to map out details below the surface layer, according to a recent paper published in the journal Cell Reports Physical Science.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” said co-author Jianyi Luo of Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it. This tactile technology opens up a non-optical way for the nondestructive testing of the human body and flexible electronics.”

According to the authors, previously developed artificial tactile sensors could only recognize and discriminate between external shapes, surface textures, and hardness; they could not sense subsurface information about those materials. That usually requires imaging technologies such as CT scans, PET scans, ultrasonic tomography (which scans the exterior of a material to reconstruct an image of its internal structure), or MRIs, all of which have drawbacks. Similarly, optical profilometry is often used to measure a surface’s profile and finish, but it only works on transparent materials.

Musk’s company is far from the only group working on brain-computer interfaces, or systems to facilitate direct communication between human brains and external computers. Other researchers have been looking into using BCIs to restore lost senses and control prosthetic limbs, among other applications. While these technologies are still in their infancy, they’ve been around long enough for researchers to increasingly get a sense of how neural implants interact with our minds. As Anna Wexler, an assistant professor of philosophy in the Department of Medical Ethics and Health Policy at the University of Pennsylvania, put it: “Of course it causes changes. The question is what kinds of changes does it cause, and how much do those changes matter?”

Time to replace X-ray machines with bionic fingers.

A team of researchers at Wuyi University (WYU) in China has created a bionic finger that can create 3D maps of the interior of an object just by poking it gently and repeatedly, according to a press release.

Imagine your son or daughter has an electronic toy train that they love to play with, but then due to some problem, the train stops working, and your kid starts crying…


Image credit: Li et al.

This unique device can scan both living and non-living objects for internal anomalies simply by applying pressure to their surfaces. In the future, it could be used for non-destructive scanning and testing of the human body and of various electronic devices.

What if, instead of using X-rays or ultrasound, we could use touch to image the insides of human bodies and electronic devices? In a study published in the journal Cell Reports Physical Science (“A smart bionic finger for subsurface tactile-tomography”), researchers present a bionic finger that can create 3D maps of the internal shapes and textures of complex objects by touching their exterior surface.

“We were inspired by human fingers, which have the most sensitive tactile perception that we know of,” says senior author Jianyi Luo, a professor at Wuyi University. “For example, when we touch our own bodies with our fingers, we can sense not only the texture of our skin, but also the outline of the bone beneath it.”

“Our bionic finger goes beyond previous artificial sensors that were only capable of recognizing and discriminating between external shapes, surface textures, and hardness,” says co-author Zhiming Chen, a lecturer at Wuyi University.
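The core idea behind tactile tomography (press at many points, feel how the resistance changes with depth at each one, and assemble the results into a map) can be sketched in a few lines of Python. This is only a toy model under assumed parameters, not the published device: the "object" below is a soft layer with a hard ridge buried beneath part of it.

```python
import numpy as np

GRID = 20                 # scan a 20 x 20 grid of surface points (assumed)
MAX_DEPTH_MM = 3.0        # maximum indentation per poke (assumed)

def phantom_hard_depth(x, y):
    """Depth (mm) at which a hard inclusion begins: a buried diagonal ridge."""
    return 1.0 if abs(x - y) < 3 else MAX_DEPTH_MM

def poke(x, y, step_mm=0.1):
    """Simulate one poke: return the depth at which resistance rises sharply."""
    hard_at = phantom_hard_depth(x, y)
    depth = 0.0
    while depth < MAX_DEPTH_MM:
        stiffness = 1.0 if depth < hard_at else 10.0   # soft layer vs. hard inclusion
        if stiffness > 5.0:
            return depth        # the finger "feels" the subsurface structure here
        depth += step_mm
    return MAX_DEPTH_MM          # nothing hard within reach

# Raster-scan the surface and build a 2D map of how deep the hard structure lies
depth_map = np.array([[poke(x, y) for x in range(GRID)] for y in range(GRID)])

# Crude ASCII rendering: '#' marks points where something hard lies near the surface
for row in depth_map:
    print("".join("#" if d < MAX_DEPTH_MM else "." for d in row))
```

Replacing the simulated poke() with readings from a real pressure sensor pressed to a controlled depth is, conceptually, what turns a single tactile element into a subsurface scanner.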

It turns out that it is now pretty standard to have no external stitches for spaying, and in fact, if I had to do it over again, I would have picked single-port keyhole surgery, which would have been even easier to heal from. (That would have required switching vets.) So for only $400, Kaia not only had advanced surgery with no external stitches, she was also made into a cyborg with an implanted microchip. Pretty impressive!

This is just one example of us rushing headlong into a science-fiction future. The biggest such example is ChatGPT, which feels far more intelligent than previous chatbots. Ray Kurzweil used to say that we would have AGI in 2029 while everyone else predicted dates such as 2070 or never; now many people pick 2029, and I could definitely see the tech behind ChatGPT being part of the recipe for AGI. For me, the first sign that AGI was coming was Content-Aware Fill being added to Photoshop, a feature that lets you erase a person from a beach scene in one quick step. Very impressive!

One more example of tech advancing is that a few years ago my right eye’s retina partially detached. My doctor did surgery with cryotherapy in his office, and inserted a sulfur hexafluoride bubble to stabilize everything. He followed up the next day with laser therapy, again in his office. No hospital needed. Eye fixed!

Video is of our killer guard dog Kaia on patrol. Watch out world!

The loss of pollinators, such as bees, is a huge challenge for global biodiversity and affects humanity by causing problems in food production. At Tampere University, researchers have now developed the first passively flying robot equipped with artificial muscle. Could this artificial fairy be utilized in pollination?

The development of stimuli-responsive polymers has brought about a wealth of material-related opportunities for next-generation small-scale, wirelessly controlled soft-bodied robots. For some time now, engineers have known how to use these materials to make small robots that can walk, swim and jump. So far, no one has been able to make them fly.

Researchers of the Light Robots group at Tampere University are now researching how to make smart material fly. Hao Zeng, Academy Research Fellow and the group leader, and Jianfeng Yang, a doctoral researcher, have come up with a new design for their project called FAIRY – Flying Aero-robots based on Light Responsive Materials Assembly. They have developed a polymer-assembly robot that flies by wind and is controlled by light.
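For a sense of how light control could translate into flight control, here is a rough back-of-the-envelope Python sketch based on the standard drag balance m*g = 0.5*rho*Cd*A*v^2. Every number below (mass, drag coefficient, canopy areas) is an assumption for illustration, not a figure from the FAIRY project; the point is only that shrinking the effective bristle area raises the settling speed, which is one plausible handle for steering a wind-borne robot.

```python
import math

rho_air = 1.2          # air density, kg/m^3
g = 9.81               # gravitational acceleration, m/s^2
mass = 1.2e-6          # assumed robot mass: about 1.2 mg
Cd = 1.0               # assumed drag coefficient for a porous bristle canopy

def settling_speed(area_m2):
    """Terminal (settling) velocity for a given effective canopy area."""
    return math.sqrt(2 * mass * g / (rho_air * Cd * area_m2))

area_open = 3.0e-5     # assumed effective area with bristles spread open, m^2
area_closed = 1.0e-5   # assumed effective area after light closes the bristles, m^2

print(f"Bristles open:   {settling_speed(area_open) * 100:.1f} cm/s")
print(f"Bristles closed: {settling_speed(area_closed) * 100:.1f} cm/s")
```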