
Being able to decode brainwaves could help patients who have lost the ability to speak to communicate again, and could ultimately provide novel ways for humans to interact with computers. Now Meta researchers have shown they can tell what words someone is hearing using recordings from non-invasive brain scans.

Our ability to probe human brain activity has improved significantly in recent decades as scientists have developed a variety of brain-computer interface (BCI) technologies that can provide a window into our thoughts and intentions.

The most impressive results have come from invasive recording devices, which implant electrodes directly into the brain’s gray matter, combined with AI that can learn to interpret brain signals. In recent years, this has made it possible to decode complete sentences from someone’s neural activity with 97 percent accuracy, and translate attempted handwriting movements directly into text at speeds comparable to texting.

The artificial intelligence revolution has only just begun, but there have already been numerous unsettling developments.

AI programs can be used to act on humans' worst instincts or to achieve their most wicked goals, such as creating weapons, and can unsettle their creators with an apparent lack of morality.

Artificial intelligence is a catch-all phrase for a computer program designed to simulate, mimic or copy human thinking processes.

A team of researchers at University College London, working with a colleague from Nylers Ltd. and another from XPCI Technology Ltd., has developed a new way to X-ray luggage to detect small amounts of explosives. In their paper published in the journal Nature Communications, the group describes modifying a traditional X-ray device and applying a deep-learning application to better detect explosive materials in luggage.

Prior research has shown that when X-rays strike materials, the beams bend slightly in ways that vary with the type of material. The researchers sought to take advantage of these characteristic bends to build a more precise X-ray machine.

The researchers first added a small modification to an existing X-ray machine: a box containing masks, which are sheets of metal with tiny holes in them. The masks serve to split the X-ray beam into multiple smaller beams. The researchers then used the device to scan a variety of objects containing embedded explosives and fed the results to a deep-learning AI application. The idea was to teach the machine what the tiny bends produced by such materials looked like. Once the machine was trained, they used it to scan other objects with embedded explosives to see if it could identify them. The researchers found their machine to be 100% accurate under laboratory conditions.
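The paper itself is not quoted here, but the core idea—training a classifier to recognize material-specific refraction signatures—can be sketched in a few lines. Everything below is illustrative: the synthetic "bend profile" features and the simple logistic-regression classifier stand in for the real multi-beam X-ray data and deep network.

```python
import numpy as np

# Hypothetical sketch: classifying materials from X-ray refraction ("bend")
# signatures. Feature vectors stand in for the multi-beam refraction profiles
# produced by the masked X-ray setup; all data here is simulated.

rng = np.random.default_rng(0)

def simulate_signatures(n, explosive):
    # Assumption: explosive materials shift the mean refraction profile.
    base = 1.0 if explosive else 0.0
    return rng.normal(base, 0.5, size=(n, 8))

X = np.vstack([simulate_signatures(200, False), simulate_signatures(200, True)])
y = np.concatenate([np.zeros(200), np.ones(200)])

# Logistic-regression classifier trained with plain gradient descent,
# standing in for the deep-learning application described in the article.
w, b = np.zeros(X.shape[1]), 0.0
for _ in range(500):
    p = 1 / (1 + np.exp(-(X @ w + b)))
    w -= 0.1 * (X.T @ (p - y) / len(y))
    b -= 0.1 * np.mean(p - y)

preds = (1 / (1 + np.exp(-(X @ w + b))) > 0.5).astype(float)
accuracy = np.mean(preds == y)
print(f"training accuracy: {accuracy:.2f}")
```

On cleanly separated synthetic clusters like these, even this toy classifier approaches perfect accuracy, which is why the lab-condition result in the article is plausible yet says little about cluttered real-world luggage.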

In the Providentia++ project, researchers at the Technical University of Munich (TUM) have worked with industry partners to develop a technology to complement the vehicle perspective based on onboard sensor input with a bird’s-eye view of traffic conditions. This improves road safety, including for autonomous driving.

The expectations for autonomous driving are clear: “Cars have to travel safely not only at low speeds, but also in fast-moving traffic,” says Jörg Schrepfer, the head of Driving Advanced Research Germany at Valeo. For example, when objects fall off a truck, the “egocentric” perspective of a car will often be unable to detect the hazardous debris in time. “In these cases, it will be difficult to execute smooth evasive action,” says Schrepfer.

Researchers in the Providentia++ project have developed a system to transmit an additional view of the traffic situation into vehicles. "Using sensors on overhead sign bridges and masts, we have created a reliable view of the traffic situation on our test route that functions around the clock," says Prof. Alois Knoll, project lead at TUM. "With this system, we can now complement the vehicle's view with an external perspective—a bird's-eye view—and incorporate the behavior of other road users into decisions."
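Complementing the ego view with roadside detections ultimately comes down to a coordinate transform: a hazard reported in the infrastructure's map frame must be expressed in the vehicle's local frame before it can feed into driving decisions. The sketch below is not Providentia++ code, just a minimal 2-D version of that step with made-up poses.

```python
import numpy as np

# Illustrative sketch: fusing a roadside sensor's bird's-eye detection into a
# vehicle's local frame. The detection is reported in global map coordinates
# and transformed using the vehicle's (assumed known) pose.

def to_vehicle_frame(detection_xy, vehicle_xy, vehicle_heading_rad):
    """Rotate/translate a global 2-D point into the ego-vehicle frame."""
    c, s = np.cos(-vehicle_heading_rad), np.sin(-vehicle_heading_rad)
    rotation = np.array([[c, -s], [s, c]])
    return rotation @ (np.asarray(detection_xy) - np.asarray(vehicle_xy))

# A mast-mounted sensor reports debris at x=150 m; the vehicle is at x=100 m,
# heading along the +x axis (heading angle 0 rad).
debris_global = (150.0, 0.0)
vehicle_pose = (100.0, 0.0)
heading = 0.0

local = to_vehicle_frame(debris_global, vehicle_pose, heading)
print(local)  # debris appears 50 m straight ahead of the vehicle
```

In a real deployment the hard parts are elsewhere—calibrating the roadside sensors, synchronizing clocks, and estimating the vehicle pose accurately—but every such system contains a transform of this shape.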

The University of Portsmouth has joined leading AI researchers at DeepMind to help engineer faster-acting enzymes for recycling some of the world's most polluting single-use plastics.

The University’s Centre for Enzyme Innovation (CEI) has used DeepMind’s ground-breaking AI system to make strides in their research on circular recycling.

To an untrained observer, the electrical storm that takes place over the brain’s neural network seems a chaotic flurry of activity. But as neuroscientists understand it, the millions of neurons are actually engaged in a sort of tightly choreographed dance, a tango of excitatory and inhibitory neurons. How is this precise balance that makes normal function possible achieved during development? And how does it go wrong in diseases like epilepsy when brain activity goes out of control?

Focusing on the cerebral cortex, the part of the brain controlling thought, sensory awareness, and motor function, a group of Harvard Stem Cell Institute (HSCI) researchers in the Department of Stem Cell and Regenerative Biology (SCRB), led by Assistant Professor Paola Arlotta, has discovered that excitatory neurons control the positioning of inhibitory neurons in a process that is critically important for generating balanced circuitry and proper cortical response.

Professor Takao Hensch, a collaborator on the study in the Harvard Center for Brain Science, Department of Molecular & Cellular Biology (MCB), had previously shown that the maturation of this circuit balance triggers critical periods of brain development. Certain inhibitory cells appear particularly vulnerable to genetic or environmental factors in early life, contributing to mental illness, such as schizophrenia or autism spectrum disorders.

A review paper by scientists at Zhejiang University summarized the development of continuum robots in terms of design, actuation, modeling, and control. The review, published on Jul. 26 in the journal Cyborg and Bionic Systems, provided an overview of both classic and advanced continuum-robot technologies, along with open problems that urgently need to be addressed.

"Some small-scale robots with new actuation methods are being widely investigated in the field of interventional surgical treatment or endoscopy; however, characterizing their mechanical properties remains a difficult problem," explained study author Haojian Lu, a professor at Zhejiang University.

To realize the miniaturization of continuum robots, many cutting-edge materials have been developed for actuation, each with unique advantages. Continuum robots embedded with micromagnets or made of ferromagnetic composite materials can be steered accurately under an external, controllable magnetic field. Magnetically soft continuum robots, in turn, can achieve diameters down to the micron scale, enabling targeted therapy in the bronchi or in cerebral vessels.

Studies suggest that by combining historical accident data with road maps, satellite imagery, and GPS traces to train a machine learning model that produces high-resolution crash maps, we may be getting ever closer to safer roads. Technology has changed a lot over the years: GPS systems have eliminated the need to memorize routes, sensors and cameras warn us of objects close to our vehicles, and autonomous electric vehicles are on the way. However, the precautions we take on the road have largely remained the same. In most places, we still rely on traffic signs, mutual trust, and the hope that we'll reach our destination safely.

With a view to finding solutions to the uncertainty underlying road accidents, researchers at the MIT Computer Science and Artificial Intelligence Laboratory have been working with the Qatari Center for Artificial Intelligence to develop a deep learning model that can predict high-resolution maps of accident risks. The model calculates the number of accidents predicted for a specific future time frame using past accident data, road maps, simulations and GPS traces. Thus, high-risk zones and future crashes can be identified using the map.

According to reports by homelandsecuritynewswire.com, maps of this type have so far been produced at much lower resolutions, resulting in a loss of vital information. Earlier attempts relied mostly on historical crash data, whereas the research team compiled a much broader base of critical information, identifying high-risk areas by analyzing GPS signals that provide data on traffic density, speed, and direction, along with satellite imagery that reveals road structure. They observed, for example, that highways are more hazardous than nearby residential roads, and that intersections and highway exits are more dangerous still.
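The underlying idea—rasterizing several data sources onto a fine grid and scoring each cell—can be sketched simply. This is a hedged toy version with random data and hand-picked weights; the MIT/QCRI work uses a learned deep network over real crash records, GPS traces, and imagery, not the fixed weighting shown here.

```python
import numpy as np

# Toy sketch of a grid-based crash-risk map. Each layer is a 100x100 raster
# over the mapped area; all values below are simulated, not real data.
GRID = 100
rng = np.random.default_rng(1)

crash_counts = rng.poisson(0.05, size=(GRID, GRID)).astype(float)  # history
traffic_density = rng.random((GRID, GRID))        # e.g. from GPS trace counts
highway_mask = np.zeros((GRID, GRID))             # road-structure layer
highway_mask[45:55, :] = 1.0                      # a horizontal highway band

# Hand-weighted combination; a learned model would replace these weights.
risk = 0.5 * crash_counts + 0.3 * traffic_density + 0.2 * highway_mask
risk /= risk.max()                                # normalize to [0, 1]

top_cell = np.unravel_index(np.argmax(risk), risk.shape)
print("highest-risk cell:", top_cell)
```

Even this crude score concentrates high-risk cells along the highway band, mirroring the article's observation that highways and their exits dominate the risk map.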

The California-based Matternet has been testing its Model M2 drone over the past four years in the US as part of the FAA’s Unmanned Aircraft System (UAS) program. Matternet says getting the green light from the FAA could help streamline the process of “implementing new networks and getting approvals.”

Matternet partnered with UPS in 2019 to deliver medical supplies in North Carolina, and later started delivering prescriptions in Florida. Matternet also expanded its footprint to Switzerland, where it teamed up with the Swiss Post to deliver lab samples and blood tests. The program was briefly suspended in 2019 after its drones suffered two crashes in the country, but Matternet has since announced that it’s taking over the Swiss Post’s drone delivery program starting in 2023.

In a statement, the FAA says Matternet's Model M2 drone "meets all federal regulations for safe, reliable and controllable operations and provides a level of safety equivalent to existing airworthiness standards applicable to other categories of aircraft." The four-rotor drone has been approved to carry four-pound payloads and fly at an altitude of 400 feet or lower, with a maximum speed of 45 mph.