
Could physics help people with epilepsy? That’s the question tackled by Louis Nemzer, a physicist at Nova Southeastern University, in the September 2019 issue of Physics World magazine, which is out now in print and digital formats.

He thinks that machine learning combined with real-time monitoring of the brain could give people with epilepsy live information about their risk of an imminent seizure, and he is even developing a smartphone app to help them in daily life.
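The article does not detail Nemzer's model, but the basic idea, summarizing a window of brain-signal data into features and mapping them to a live risk score, can be sketched in a few lines. Everything below (the feature set, weights, and sample data) is an illustrative assumption, not his method.

```python
import numpy as np

def eeg_features(window: np.ndarray) -> np.ndarray:
    """Summarize one window of EEG samples (1-D array of voltages)."""
    variance = np.var(window)                      # overall signal power
    line_length = np.sum(np.abs(np.diff(window)))  # a common seizure-detection feature
    return np.array([variance, line_length])

def seizure_risk(window: np.ndarray, weights: np.ndarray, bias: float) -> float:
    """Logistic model mapping the features to a 0-1 risk score."""
    z = weights @ eeg_features(window) + bias
    return 1.0 / (1.0 + np.exp(-z))

# Example: random stand-in for one second of 256 Hz EEG data.
rng = np.random.default_rng(0)
window = rng.normal(size=256)
risk = seizure_risk(window, weights=np.array([0.5, 0.01]), bias=-3.0)
print(f"Estimated seizure risk: {risk:.2f}")
```

In a real system the weights would be learned from labeled recordings, and the score would be streamed to the smartphone app the article mentions.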

Elsewhere in the issue, Peter Martin and Tom Scott from the University of Bristol describe how they have used drones to map radiation levels at the Chernobyl plant (also available on this website from 2 September), while Kate Brown from the Massachusetts Institute of Technology examines the health impact of Chernobyl fallout.

In its most hubristic and unquestioning form, bolstered by unapologetic and brash advanced-capitalist logics, transhumanism poses myriad threats: from automation unemployment to the end of democracy, to the risk that humans will branch into different species, making questions of inequality infinitely more urgent. And even if immortality arrives, it will be accompanied by crimes, wars, and accidents, as Cantona states.


Technology is on the brink of making it possible to live forever—but should we?

There are good technical reasons why the prototypes use the ancient game of Zenet as the interface. Although I do not rule out alternative approaches built on the same underlying designs and principles, Zenet has a unique claim: it is the only method recorded in ancient Egypt by which the dead and the living could communicate. Given my background in neural-net and hybrid AI for game software, especially active divination systems, Zenet is the most elegant way to bridge the worlds of the living and the dead, since the rules and objectives vary slightly between the two, and a smooth transition between these perspectives can occur in real time.

One objection to cryonics is that the dead take energy and resources from the living. Making the Zenet boxes solar-powered means they will not draw on the resources of the living, and it also provides a more authentic experience of sunrise and solar changes, which are important in the solar theology of Ra and in Zenet; the game concerns the movement of the solar b(ark). On a pragmatic note, the range of awareness can extend only to events and moves within the Zenet game. Even this task is far from trivial using silicon technology, and I do not envisage anything like “full resurrection” or retention of current memories being feasible for some time.
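As a purely illustrative sketch of the two-perspective design described above, switching rule sets over shared board state in real time might look like the following. The objectives here are invented placeholders; the passage says only that rules and objectives "vary slightly" between the living and dead players.

```python
from dataclasses import dataclass, field

@dataclass
class ZenetBoard:
    pieces: dict = field(default_factory=dict)  # square index -> owner ("living" or "dead")
    perspective: str = "living"                 # whose rule set is currently active

    def objective(self) -> str:
        # Placeholder objectives, not reconstructed Zenet rules.
        if self.perspective == "living":
            return "bear your pieces safely off the board"
        return "guide the solar b(ark) through the hours of night"

    def switch_perspective(self) -> None:
        """Swap rule sets in real time without resetting the shared board state."""
        self.perspective = "dead" if self.perspective == "living" else "living"

board = ZenetBoard()
print(board.perspective, "->", board.objective())
board.switch_perspective()
print(board.perspective, "->", board.objective())
```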

On Wednesday, Tesla CEO Elon Musk and Alibaba cofounder Jack Ma took the stage at the World AI Conference in Shanghai to debate artificial intelligence and its implications for humanity. As expected, Ma took a far more optimistic stance than Musk. Ma encouraged people to have faith in humanity, our creativity, and the future. “I don’t think artificial intelligence is a threat,” he said, to which Musk replied, “I don’t know, man, that’s like, famous last words.” An edited transcript of the discussion follows.

Elon Musk: What are we supposed to say? Just things about AI perhaps? Yeah. Okay. Let’s see.

Jack Ma: The AI, right? Okay, great.

Professor Jae Eun Jang’s team in the Department of Information and Communication Engineering has developed electronic-skin technology that can detect “prick” and “hot” pain sensations as humans do. The result is expected to find applications in humanoid robots and prosthetic hands.

Scientists are continuously researching ways to imitate the tactile, olfactory, and taste senses, which are expected to be the next mimetic technologies for various applications. Currently, most tactile-sensor research focuses on physical mimetic technologies that measure the pressure a robot uses to grab an object, while psychosensory research on mimicking human tactile responses, such as those evoked by soft, smooth, or rough surfaces, still has a long way to go.

Professor Jae Eun Jang’s team has developed a tactile sensor that can feel pain and temperature like humans through a joint project with Professor Cheil Moon’s team in the Department of Brain and Cognitive Science, Professor Ji-woong Choi’s team in the Department of Information and Communication Engineering, and Professor Hongsoo Choi’s team in the Department of Robotics Engineering. Its key strengths are a simplified sensor structure and the ability to measure pressure and temperature at the same time. Furthermore, it can be applied to various tactile systems regardless of the sensor’s measurement principle.
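The team's actual signal processing is not reproduced here, but the fusion idea, one sensing element reporting both pressure and temperature, with simple assumed thresholds deciding between "prick" and "hot" pain, can be sketched as follows. The thresholds and decision rule are invented for the example.

```python
def pain_response(pressure_kpa: float, contact_area_mm2: float,
                  temperature_c: float) -> str:
    """Classify a stimulus the way the article describes: 'prick' pain for
    sharp, concentrated pressure; 'hot' pain for high temperature."""
    pressure_density = pressure_kpa / max(contact_area_mm2, 1e-6)
    if temperature_c > 45.0:          # nociceptive heat threshold (assumed)
        return "hot pain"
    if pressure_density > 50.0:       # sharp, concentrated contact (assumed)
        return "prick pain"
    return "no pain"

print(pain_response(pressure_kpa=120.0, contact_area_mm2=0.5, temperature_c=25.0))  # prick pain
print(pain_response(pressure_kpa=10.0, contact_area_mm2=10.0, temperature_c=55.0))  # hot pain
```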

Soft robots are a class of robotic systems made of compliant materials and capable of safely adapting to complex environments. They have seen rapid growth recently and come in a variety of designs spanning multiple length scales, from meters to submicrometers.

In particular, small soft robots at millimeter scale are of practical interest as they can be designed as a combination of miniature actuators simply driven by pneumatic pressure. They are also well suited for navigation in confined areas and manipulation of small objects.

However, scaling soft pneumatic robots down to the millimeter scale shrinks their features by more than an order of magnitude. The design complexity of such robots demands great delicacy when they are fabricated with traditional processes such as molding and soft lithography. Although emerging 3D-printing technologies such as digital light processing (DLP) offer high theoretical resolutions, handling microscale voids and channels without clogging remains challenging. Indeed, successful examples of 3D-printed miniature soft pneumatic robots are rare.

From afar, Olli resembles many of the “future is now!” electric autonomous shuttles that have popped up in recent years.

The tall rectangular pod, with its wide-set headlights and expansive windows nestled within a rounded frame, gives the shuttle a friendly countenance that screams, ever so gently, “come along, take a ride.”

But Olli is different in almost every way, from how it’s produced to its origin story. And now, its maker, Local Motors, has given Olli an upgrade in hopes of accelerating the adoption of its autonomous shuttles.

The Moon’s subsurface is the key to its long-term development and sustainability, says NASA scientist.


A view of the Apollo 11 lunar module “Eagle” as it returned from the surface of the Moon to dock with the command module “Columbia”. A smooth mare area is visible on the Moon below, and a half-illuminated Earth hangs over the horizon. Command module pilot Michael Collins took this picture.

By David Hambling

A robot pilot is learning to fly. It has passed its pilot’s test and flown its first plane, but it has also had its first mishap.

Unlike a traditional autopilot, the ROBOpilot Unmanned Aircraft Conversion System literally takes the controls, pressing on foot pedals and handling the yoke using robotic arms. It reads the dials and meters with a computer vision system.
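The article does not expose ROBOpilot's software, but the perceive-and-act loop it describes, reading the dials with computer vision and then moving the yoke and pedals, can be sketched as below. Every class, method, and gain here is a hypothetical stand-in, not the actual ROBOpilot API.

```python
from dataclasses import dataclass

@dataclass
class InstrumentReadings:
    airspeed_kts: float
    altitude_ft: float
    heading_deg: float

class VisionSystem:
    def read_panel(self) -> InstrumentReadings:
        # The real system runs computer vision on camera views of the cockpit
        # dials; here we return fixed placeholder values.
        return InstrumentReadings(airspeed_kts=95.0, altitude_ft=3500.0,
                                  heading_deg=270.0)

class Actuators:
    def set_yoke(self, pitch: float, roll: float) -> None:
        print(f"yoke -> pitch {pitch:+.2f}, roll {roll:+.2f}")

    def set_pedals(self, rudder: float) -> None:
        print(f"pedals -> rudder {rudder:+.2f}")

def control_step(vision: VisionSystem, arms: Actuators,
                 target_altitude_ft: float) -> None:
    """One loop iteration: sense the dials, then nudge the controls."""
    state = vision.read_panel()
    altitude_error = target_altitude_ft - state.altitude_ft
    pitch_command = max(-1.0, min(1.0, altitude_error / 1000.0))  # crude proportional control
    arms.set_yoke(pitch=pitch_command, roll=0.0)
    arms.set_pedals(rudder=0.0)

control_step(VisionSystem(), Actuators(), target_altitude_ft=4000.0)
```

The key difference from a conventional autopilot, as the article notes, is that the actuation step drives physical arms and pedals rather than the aircraft's control bus.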