
A new class of self-forming membrane to separate carbon dioxide from a mixture of gases has been developed by Newcastle University researchers.

Operating like a coffee filter, it lets harmless gases such as nitrogen escape into the atmosphere while retaining the carbon dioxide, which can then be processed.

The team believes the system could be applied to carbon dioxide separation processes, either to protect the environment or in reaction engineering.

Alzheimer’s disease is the sixth leading cause of death in the United States, affecting one in 10 people over the age of 65. Scientists are engineering nanodevices to disrupt processes in the brain that lead to the disease.

People who are affected by Alzheimer’s disease have a specific type of plaque, made of self-assembled peptides called β-amyloid (Aβ), that builds up in the brain over time. This buildup is thought to contribute to loss of neural connectivity. Researchers are studying ways to prevent the peptides from forming these dangerous plaques in order to halt development of Alzheimer’s disease in the brain.

In a multidisciplinary study, scientists at the U.S. Department of Energy’s (DOE) Argonne National Laboratory, along with collaborators from the Korean Institute of Science and Technology (KIST) and the Korea Advanced Institute of Science and Technology (KAIST), have developed an approach to prevent plaque formation by engineering a nano-sized device that captures the dangerous peptides before they can self-assemble.

SEAKR Engineering, Inc. has been awarded the prime contractor role on the Defense Advanced Research Projects Agency (DARPA) Pit Boss contract, further expanding its contractual work supporting the Blackjack program. The award for Phase I Option II is part of a three-phase effort seeking on-orbit demonstration of full processing capability in a multi-satellite constellation. SEAKR was first awarded a DARPA Pit Boss contract in October 2019.

DARPA’s Blackjack program focuses on integrating commercial satellite technologies into a constellation of military satellites. As sole prime, SEAKR will continue developing its Pit Boss solution, a next-generation on-board processor, to support the Blackjack program’s mission.

SEAKR said the solution will leverage off-the-shelf electronics adapted through design implementation to function reliably in space. The company said this award validates its program success in seeking on-orbit demonstration of state-of-the-art processing capability incorporating autonomous operations, Artificial Intelligence (AI), machine learning techniques, and bridged terrestrial and on-orbit technologies.

Lithium-sulfur batteries have been hailed as the next big step in battery technology, promising significantly longer use for everything from cellphones to electric vehicles on a single charge, while being more environmentally sustainable to produce than current lithium-ion batteries. However, these batteries don’t last as long as their lithium-ion counterparts, degrading over time.

A group of researchers in the Cockrell School of Engineering at The University of Texas at Austin has found a way to stabilize one of the most challenging parts of lithium-sulfur batteries, bringing the technology closer to becoming commercially viable. The team’s findings, published today in Joule, show that creating an artificial tellurium-containing layer in situ inside the battery, on top of the lithium metal, can make it last four times longer.

“Sulfur is abundant and environmentally benign with no supply chain issues in the U.S.,” said Arumugam Manthiram, a professor of mechanical engineering and director of the Texas Materials Institute. “But there are engineering challenges. We’ve reduced a problem to extend the cycle life of these batteries.”

Today’s virtual reality systems can create immersive visual experiences, but seldom do they enable users to feel anything—particularly walls, appliances and furniture. A new device developed at Carnegie Mellon University, however, uses multiple strings attached to the hand and fingers to simulate the feel of obstacles and heavy objects.

By locking the strings when the user’s hand is near a virtual wall, for instance, the device simulates the sense of touching the wall. Similarly, the string mechanism enables people to feel the contours of a virtual sculpture, sense resistance when they push on a piece of furniture or even give a high five to a virtual character.
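The locking behavior described above can be sketched as a simple state machine: a string spools freely until the tracked hand reaches a virtual surface, then a brake holds the string at its current length until the hand pulls back. This is only a toy illustration of the idea, not CMU's actual control code; the class name and threshold logic are assumptions.

```python
class StringBrake:
    """Toy model of one spring-loaded string with a lockable ratchet (illustrative only)."""

    def __init__(self, wall_distance: float):
        self.wall_distance = wall_distance  # distance from shoulder anchor to virtual wall
        self.locked_length = None           # None -> string spools freely

    def update(self, hand_distance: float) -> float:
        """Return the maximum string length allowed this frame."""
        if self.locked_length is None and hand_distance >= self.wall_distance:
            self.locked_length = self.wall_distance  # hand reached the wall: lock
        if self.locked_length is not None and hand_distance < self.wall_distance:
            self.locked_length = None                # hand pulled back: release
        return self.locked_length if self.locked_length is not None else hand_distance

brake = StringBrake(wall_distance=0.5)
print(brake.update(0.3))   # 0.3  (free movement)
print(brake.update(0.55))  # 0.5  (clamped at the virtual wall)
print(brake.update(0.4))   # 0.4  (released again)
```

A real device would run this per string and per finger, which is how the mechanism can convey contours rather than just flat walls.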

Cathy Fang, who will graduate from CMU next month with a joint degree in mechanical engineering and human-computer interaction, said the shoulder-mounted device takes advantage of spring-loaded strings to reduce weight, consume less battery power and keep costs low.

Built in about 24 hours, this robot is undergoing in-hospital testing for coronavirus disinfection.


UV disinfection is one of the few areas where autonomous robots can be immediately and uniquely helpful during the COVID pandemic. Unfortunately, there aren’t enough of these robots to fulfill demand right now, and although companies are working hard to build them, it takes a substantial amount of time to develop the hardware, software, operational knowledge, and integration experience required to make a robotic disinfection system work in a hospital.
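Why a robot helps here comes down to dose: germicidal effect depends on delivered UV dose, which is irradiance multiplied by exposure time, and irradiance falls off steeply with distance. A rough back-of-the-envelope sketch, assuming a point-source inverse-square model and illustrative numbers that are not from the article:

```python
def exposure_time_s(dose_mj_per_cm2: float,
                    irradiance_mw_per_cm2_at_1m: float,
                    distance_m: float) -> float:
    """Seconds of exposure needed to deliver a target UV-C dose.

    dose (mJ/cm^2) = irradiance (mW/cm^2) * time (s).
    Assumes point-source inverse-square falloff; real lamps and room
    geometry are more complicated, so treat this as order-of-magnitude only.
    """
    irradiance = irradiance_mw_per_cm2_at_1m / distance_m ** 2
    return dose_mj_per_cm2 / irradiance

# Illustrative: a lamp giving 0.1 mW/cm^2 at 1 m, targeting a 10 mJ/cm^2 dose
print(exposure_time_s(10, 0.1, 1.0))  # 100.0 s at 1 m
print(exposure_time_s(10, 0.1, 2.0))  # 400.0 s at 2 m (4x longer)
```

The quadratic penalty for distance is one reason a mobile robot, which can position the lamp close to each surface, beats a fixed lamp in a corner of the room.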

Conor McGinn, an assistant professor of mechanical engineering at Trinity College in Dublin and co-leader of the Robotics and Innovation Lab (RAIL), has pulled together a small team of hardware and software engineers who’ve managed to get a UV disinfection robot into hospital testing in a matter of weeks. They made it happen so quickly by building on previous research, collaborating with hospitals directly, and leveraging an existing development platform: the TurtleBot 2.

Over the last few years, RAIL has been researching mobile social robots for elder care applications, and during their pilot testing, they came to understand how big a problem infection can be in environments like nursing homes. This was well before COVID-19, but infection was (and still is) one of the leading causes of hospitalization for nursing home residents. Most facilities simply wipe down surfaces with disinfectant from time to time, but they contain many surfaces (like fabrics) that aren’t as easy to clean, and with people coming and going all the time, anyone with a compromised immune system is constantly at risk.

Researchers have developed a number of potassium ion (K+) probes to detect fluctuating K+ concentrations during a variety of biological processes. However, such probes are not sensitive enough to detect physiological fluctuations in living animals, and the short-wavelength excitations used so far make it difficult to monitor deep tissues. In a new report, Jianan Liu and a team of researchers in neuroscience, chemistry, and molecular engineering in China describe a highly sensitive and selective nanosensor for near-infrared (NIR) K+ imaging in living cells and animals. The team constructed the nanosensor by encapsulating upconversion nanoparticles (UCNPs) and a commercial potassium ion indicator in the hollow cavity of mesoporous silica nanoparticles, then coating them with a K+-selective filter membrane. The membrane adsorbed K+ from the medium and filtered away interfering cations. In its mechanism of action, the UCNPs converted NIR to ultraviolet (UV) light to excite the potassium ion indicator and detect fluctuating potassium ion concentrations in cultured cells and in animal models of disease, including mice and zebrafish larvae. The results are published in Science Advances.

Potassium (K+), the most abundant intracellular cation, is crucial in a variety of biological processes including neural transmission, heartbeat, muscle contraction and kidney function. Variations in the intracellular or extracellular K+ concentration (referred to herein as [K+]) can signal abnormal physiological function, including heart dysfunction, cancer, and diabetes. As a result, researchers are keen to develop effective strategies to monitor the dynamics of [K+] fluctuations, specifically with direct optical imaging.

Most existing probes are not sensitive enough to detect K+ under physiological conditions and cannot differentiate fluctuations in [K+] from those of the accompanying sodium ion ([Na+]) during transmembrane transport in Na+/K+ pumps. While fluorescence lifetime imaging can distinguish K+ and Na+ in aqueous solution, the method requires specialized instruments. Most K+ sensors are also activated with short-wavelength light, ultraviolet (UV) or visible, leading to significant scattering and limited penetration depth when examining living tissues. In contrast, the proposed near-infrared (NIR) imaging technique offers unique advantages for deep-tissue imaging as a plausible alternative.
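Fluorescent ion indicators of the kind encapsulated in such a nanosensor typically follow a simple one-site binding response, which also shows why selectivity matters: any Na+ that reached the indicator would add a spurious contribution to the signal. A minimal sketch of the generic binding model (the Kd and fluorescence values below are illustrative assumptions, not calibration data from the study):

```python
def indicator_signal(k_conc_mM: float, kd_mM: float = 5.0,
                     f_min: float = 1.0, f_max: float = 10.0) -> float:
    """One-site binding model for a fluorescent K+ indicator.

    F = F_min + (F_max - F_min) * [K+] / (Kd + [K+])
    Kd and the fluorescence range here are illustrative, not from the paper.
    """
    bound_fraction = k_conc_mM / (kd_mM + k_conc_mM)
    return f_min + (f_max - f_min) * bound_fraction

for c in (0.0, 5.0, 50.0):
    print(c, round(indicator_signal(c), 2))  # signal rises steeply near Kd, then saturates
```

Because the response saturates well above Kd, an indicator is most useful when its Kd sits near the physiological [K+] range being monitored.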

The era of telecommunications systems designed solely by humans is coming to an end. From here on, artificial intelligence will play a pivotal role in the design and operation of these systems. The reason is simple: rapidly escalating complexity.

Each new generation of communications system strives to improve coverage areas, bit rates, number of users, and power consumption. But at the same time, the engineering challenges grow more difficult. To keep innovating, engineers have to navigate an increasingly tangled web of technological trade-offs made during previous generations.

In telecommunications, a major source of complexity comes from what we’ll call impairments. Impairments include anything that deteriorates or otherwise interferes with a communications system’s ability to deliver information from point A to point B. Radio hardware itself, for example, impairs signals when it sends or receives them by adding noise. The paths, or channels, that signals travel over to reach their destinations also impair signals. This is true for a wired channel, where a nearby electrical line can cause nasty interference. It’s equally true for wireless channels, where, for example, signals bouncing off and around buildings in an urban area create a noisy, distortive environment.
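The simplest mathematical model of such impairments is additive noise: the received signal is the transmitted signal plus a random disturbance whose power is set by the signal-to-noise ratio (SNR). A hedged toy sketch of an additive white Gaussian noise (AWGN) channel, a standard textbook model rather than anything specific to the article:

```python
import math
import random

def awgn(signal, snr_db: float, seed: int = 0):
    """Add white Gaussian noise to a signal at a target SNR (in dB).

    Toy channel model: y = x + n, where n is zero-mean Gaussian noise
    whose power is chosen so that signal power / noise power = 10^(SNR/10).
    """
    rng = random.Random(seed)
    sig_power = sum(x * x for x in signal) / len(signal)
    noise_power = sig_power / (10 ** (snr_db / 10))
    sigma = math.sqrt(noise_power)
    return [x + rng.gauss(0, sigma) for x in signal]

clean = [1.0, -1.0, 1.0, 1.0, -1.0]  # BPSK-style symbols
noisy = awgn(clean, snr_db=10)
# At 10 dB SNR the symbol signs usually survive; near 0 dB, errors become common.
```

Real impairments (hardware nonlinearity, multipath fading, interference) are far messier than AWGN, which is exactly why learned, AI-driven designs are attractive: they can adapt to distortions that resist clean closed-form models.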

Your phone’s GPS, the Wi-Fi in your house and communications on aircraft are all powered by radio-frequency, or RF, waves, which carry information from a transmitter at one point to a sensor at another. The sensors interpret this information in different ways. For example, a GPS sensor uses the angle at which it receives an RF wave to determine its own relative location. The more precisely it can measure the angle, the more accurately it can determine location.
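One standard way a receiver measures that angle is from the phase difference between two antennas a known distance apart: a wavefront arriving at angle θ travels an extra d·sin θ to reach the second antenna, giving Δφ = 2πd·sin θ/λ. A sketch of the textbook two-element relation (generic, not the paper's method; the GPS L1 wavelength below is an illustrative value):

```python
import math

def angle_of_arrival(delta_phi_rad: float, spacing_m: float,
                     wavelength_m: float) -> float:
    """Angle of arrival (radians) from the phase difference between two antennas.

    Two-element relation: delta_phi = 2*pi*d*sin(theta)/lambda.
    No ambiguity resolution here, so it assumes |delta_phi| <= 2*pi*d/lambda.
    """
    return math.asin(delta_phi_rad * wavelength_m / (2 * math.pi * spacing_m))

wavelength = 0.19  # ~GPS L1 (1.575 GHz) wavelength in meters, illustrative
# Half-wavelength spacing and a quarter-cycle phase lag:
theta = angle_of_arrival(math.pi / 2, wavelength / 2, wavelength)
print(round(math.degrees(theta), 1))  # 30.0 degrees
```

The precision of θ is limited by how precisely Δφ can be measured against noise, which is the quantity the quantum techniques described below are meant to improve.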

In a new paper published in Physical Review Letters, University of Arizona engineering and optical sciences researchers, in collaboration with engineers from General Dynamics Mission Systems, demonstrate how a combination of two techniques—radio frequency photonics sensing and quantum metrology—can give sensor networks a previously unheard-of level of precision. The work involves transferring information from electrons to photons, then using quantum metrology techniques to increase the photons’ sensing capabilities.

“This quantum sensing paradigm could create opportunities to improve GPS systems, astronomy laboratories and biomedical imaging capabilities,” said Zheshen Zhang, assistant professor of materials science and engineering and principal investigator of the university’s Quantum Information and Materials Group. “It could be used to improve the performance of any application that requires a network of sensors.”

Only 10 years ago, scientists working on what they hoped would open a new frontier of neuromorphic computing could only dream of a device built from miniature tools called memristors that would function like real brain synapses.

But now a team at the University of Massachusetts Amherst has discovered, while on their way to better understanding protein nanowires, how to use these biological, electricity-conducting filaments to make a neuromorphic memristor, or “memory transistor,” device. It runs extremely efficiently on very low power, as brains do, to carry signals between neurons. Details are in Nature Communications.

As first author Tianda Fu, a Ph.D. candidate in electrical and computer engineering, explains, one of the biggest hurdles to neuromorphic computing, and one that made it seem unreachable, is that most conventional computers operate at over 1 volt, while the brain sends signals called action potentials between neurons at around 80 millivolts—many times lower. Today, a decade after those early experiments, memristor voltages have been brought into a range similar to that of conventional computers, but getting below it seemed improbable, he adds.
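As a quick sanity check on the scale of the gap described above, using only the two figures quoted in the paragraph:

```python
# Numbers taken from the article text
conventional_logic_v = 1.0    # conventional computers operate at over 1 volt
action_potential_v = 0.080    # neurons signal at around 80 millivolts

ratio = conventional_logic_v / action_potential_v
print(ratio)  # 12.5 -> brain signals are more than an order of magnitude smaller
```

So even a memristor matching conventional logic levels still operates at more than twelve times the voltage of a biological action potential, which is the gap the protein-nanowire device is meant to close.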