
Both quantum computing and machine learning have been touted as the next big computing revolution for quite a while now.

However, experts have pointed out that these techniques aren’t general-purpose tools – they promise a great leap in computing power only for very specialized algorithms, and it is rarer still for both to apply to the same problem.

One example of where they might work together is modeling one of the thorniest problems in physics: How does General Relativity relate to the Standard Model?

In a paper published on February 23, 2022 in Nature Machine Intelligence, a team of scientists at the Max Planck Institute for Intelligent Systems (MPI-IS) introduces a robust soft haptic sensor named “Insight” that uses computer vision and a deep neural network to accurately estimate where objects come into contact with the sensor and how large the applied forces are. The research project is a significant step toward robots being able to feel their environment as accurately as humans and animals do. Like its natural counterpart, the fingertip sensor is very sensitive, robust, and high-resolution.

The thumb-shaped sensor is made of a soft shell built around a lightweight, stiff skeleton. This skeleton holds up the structure much like bones stabilize soft finger tissue. The shell is made from an elastomer mixed with dark but reflective aluminum flakes, resulting in an opaque grayish color that prevents any external light from finding its way in. Hidden inside this finger-sized cap is a tiny 160-degree fish-eye camera, which records colorful images illuminated by a ring of LEDs.

When any object touches the sensor’s shell, the appearance of the color pattern inside the sensor changes. The camera records images many times per second and feeds this data to a deep neural network. The algorithm detects even the smallest change in light in each pixel. Within a fraction of a second, the trained machine-learning model can map out exactly where the finger is contacting an object, determine how strong the forces are, and indicate the force direction. The model infers what scientists call a force map: it provides a force vector for every point on the three-dimensional fingertip.
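As a concrete illustration of the pipeline described above, here is a minimal sketch of how such an image-to-force-map regression could be set up in PyTorch. It is an assumption-laden toy, not the network from the paper: the input resolution, layer sizes, and the number of fingertip surface points (N_POINTS) are all invented for the example.

    # Illustrative sketch only: a small convolutional network that regresses a
    # per-point 3D force map from one camera frame. Architecture and sizes are
    # assumptions, not the model described in the paper.
    import torch
    import torch.nn as nn

    N_POINTS = 1024  # assumed number of sample points on the fingertip surface

    class ForceMapNet(nn.Module):
        def __init__(self, n_points: int = N_POINTS):
            super().__init__()
            self.n_points = n_points
            # Convolutional encoder: compresses the RGB camera frame to features.
            self.encoder = nn.Sequential(
                nn.Conv2d(3, 16, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(16, 32, 5, stride=2, padding=2), nn.ReLU(),
                nn.Conv2d(32, 64, 3, stride=2, padding=1), nn.ReLU(),
                nn.AdaptiveAvgPool2d(4), nn.Flatten(),
            )
            # Regression head: one 3D force vector (Fx, Fy, Fz) per surface point.
            self.head = nn.Sequential(
                nn.Linear(64 * 4 * 4, 512), nn.ReLU(),
                nn.Linear(512, n_points * 3),
            )

        def forward(self, frame: torch.Tensor) -> torch.Tensor:
            # frame: (batch, 3, H, W) image -> (batch, n_points, 3) force map
            return self.head(self.encoder(frame)).view(-1, self.n_points, 3)

    model = ForceMapNet()
    frame = torch.rand(1, 3, 128, 128)  # stand-in for one fisheye camera frame
    force_map = model(frame)            # shape (1, 1024, 3)

Trained on camera frames paired with ground-truth contact forces, a model of this general shape can localize contact and estimate force magnitude and direction at once, which is the “force map” idea described above.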

A new study challenges the conventional approach to designing soft robotics and a class of materials called metamaterials by utilizing the power of computer algorithms. Researchers from the University of Illinois Urbana-Champaign and Technical University of Denmark can now build multimaterial structures without dependence on human intuition or trial-and-error to produce highly efficient actuators and energy absorbers that mimic designs found in nature.

The study, led by Illinois civil and environmental engineering professor Shelly Zhang, uses optimization theory and an algorithm-based design process called topology optimization. Also known as digital synthesis, the method builds composite structures that can precisely achieve complex prescribed mechanical responses.

The study results are published in the Proceedings of the National Academy of Sciences.
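To give a flavor of what designing for a “prescribed mechanical response” means, here is a deliberately tiny optimization sketch in the same spirit – not the authors’ digital-synthesis method: a 1-D chain of cells, each assigned one of two materials, is tuned so that its simulated force-displacement behavior matches a target. The model, the cubic penalty, and the target values are all assumptions for illustration.

    # Toy sketch of optimization-driven material design (illustrative only):
    # pick a two-material layout so the simulated response matches a target.
    import numpy as np
    from scipy.optimize import minimize

    N = 20                       # design cells along a 1-D bar
    K_SOFT, K_STIFF = 1.0, 10.0  # stiffness of the two candidate materials
    LOADS = np.array([1.0, 2.0, 3.0])
    TARGET_U = np.array([2.5, 5.0, 7.5])  # prescribed end displacements

    def end_displacement(x, load):
        # Cells act as springs in series; a cubic (SIMP-style) penalty nudges
        # each design variable x_i toward a crisp soft (0) / stiff (1) choice.
        k = K_SOFT + x**3 * (K_STIFF - K_SOFT)
        return load * np.sum(1.0 / k)

    def mismatch(x):
        # Squared error between simulated and prescribed responses.
        u = np.array([end_displacement(x, f) for f in LOADS])
        return np.sum((u - TARGET_U) ** 2)

    x0 = np.full(N, 0.5)  # start from a uniform material mixture
    res = minimize(mismatch, x0, bounds=[(0.0, 1.0)] * N, method="L-BFGS-B")
    print("residual:", res.fun)
    print("stiff cells:", int(np.round(res.x).sum()), "of", N)

The real work operates on far richer physics and geometry, but the loop has the same shape: parameterize the material layout, simulate the response, measure the mismatch against the prescribed behavior, and let the optimizer, rather than human intuition, do the designing.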

Engineers sometimes turn to nature for inspiration. Cold Spring Harbor Laboratory Associate Professor Saket Navlakha and research scientist Jonathan Suen have found that adjustment algorithms—the same feedback control process by which the Internet optimizes data traffic—are used by several natural systems to sense and stabilize behavior, including ant colonies, cells, and neurons.

Internet engineers route data around the world in small packets, which are analogous to ants. As Navlakha explains, the goal of the work was to bring together ideas from biology and Internet design and relate them to the way ants forage.

The same algorithm used by internet engineers is used by ants when they forage for food. At first, the colony may send out a single ant. When the ant returns, it provides information about how much food it got and how long it took to get it. The colony would then send out two ants. If they return with food, the colony may send out three, then four, five, and so on. But if ten ants are sent out and most do not return, the colony does not merely decrease the number it sends to nine. Instead, it cuts the number sharply, to a fraction (say, half) of what it sent before: only five ants. In other words, the number of ants ramps up slowly when the signals are positive but is cut dramatically when the information is negative. Navlakha and Suen note that the system works even if individual ants get lost, and that it parallels a particular type of “additive-increase/multiplicative-decrease” algorithm used on the internet.
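The rule is simple enough to simulate in a few lines. Here is a minimal sketch of additive-increase/multiplicative-decrease with ants standing in for packets; the foraging “capacity” and the success model are invented for illustration, not taken from the study.

    # Minimal AIMD sketch: ants as packets. The success model (trips succeed
    # reliably only up to some foraging capacity) is an assumed toy, chosen to
    # produce the sawtooth behavior AIMD is known for.
    import random

    random.seed(1)
    CAPACITY = 20   # assumed rate above which return trips become unreliable
    ants_out = 1

    for step in range(25):
        p_return = min(1.0, CAPACITY / ants_out)
        returned = sum(random.random() < p_return for _ in range(ants_out))
        print(f"round {step:2d}: sent {ants_out:3d}, returned {returned:3d}")
        if returned >= 0.8 * ants_out:
            ants_out += 1                     # additive increase: one more ant
        else:
            ants_out = max(1, ants_out // 2)  # multiplicative decrease: halve

Run it and the count climbs by one per round, overshoots the capacity, gets cut in half, and climbs again – the same sawtooth a TCP connection’s congestion window traces as it probes for available bandwidth.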

In The Structure of Scientific Revolutions, the philosopher of science Thomas Kuhn observed that scientists spend long periods taking small steps. They pose and solve puzzles while collectively interpreting all data within a fixed worldview or theoretical framework, which Kuhn called a paradigm. Sooner or later, though, facts crop up that clash with the reigning paradigm. Crisis ensues. The scientists wring their hands, reexamine their assumptions and eventually make a revolutionary shift to a new paradigm, a radically different and truer understanding of nature. Then incremental progress resumes.

For several years, the particle physicists who study nature’s fundamental building blocks have been in a textbook Kuhnian crisis.

The crisis became undeniable in 2016, when, despite a major upgrade, the Large Hadron Collider in Geneva still hadn’t conjured up any of the new elementary particles that theorists had been expecting for decades. The swarm of additional particles would have solved a major puzzle about an already known one, the famed Higgs boson. The hierarchy problem, as the puzzle is called, asks why the Higgs boson is so lightweight — a hundred million billion times less massive than the highest energy scales that exist in nature. The Higgs mass seems unnaturally dialed down relative to these higher energies, as if huge numbers in the underlying equation that determines its value all miraculously cancel out.
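A back-of-the-envelope check of that “hundred million billion” figure, taking the measured Higgs mass of about 125 GeV and the Planck scale (commonly treated as the highest energy scale in nature) of about $1.2 \times 10^{19}$ GeV:

    \frac{M_{\text{Planck}}}{m_H} \approx \frac{1.2 \times 10^{19}\,\text{GeV}}{1.25 \times 10^{2}\,\text{GeV}} \approx 10^{17}

and $10^{17} = 10^{2} \times 10^{6} \times 10^{9}$: a hundred million billion.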

Dr. Wolfberg is the Chief Medical Officer at Current Health (https://currenthealth.com/), a Best Buy Health company (https://healthcare.bestbuy.com/) that is part of the American multinational consumer electronics retailer.

Current Health enables the delivery of healthcare services in the home, helping healthcare organizations provide high-quality, patient-centric care at a lower cost. The company integrates patient-reported data with data from biosensors – including its own continuous monitoring wearable devices – to provide healthcare organizations with actionable, real-time insights into the patient’s condition. Leveraging clinical algorithms that can be tailored to the individual patient, Current Health identifies when a patient needs clinical attention, allowing organizations to manage patient care remotely or coordinate in-home care via integrated service partners. The Current Health platform brings together telehealth capabilities, patient engagement tools, and in-home connectivity in a single solution for managing all care in the home. Dr. Wolfberg also leads implementation and account management at the organization.

Previously, Dr. Wolfberg worked in medical affairs at Ovia Health (a leading maternity and family benefits solution for employers and health plans, since acquired by LabCorp), athenahealth (network-enabled services, mobile apps, and data-driven insights for hospitals and medical organizations), and Ariosa Diagnostics.

Dr. Wolfberg trained in OB/GYN and maternal-fetal medicine at The Johns Hopkins University School of Medicine, and has an MPH from Johns Hopkins School of Hygiene and Public Health.

Researchers from the University of Oxford’s Big Data Institute have taken a major step towards mapping the entirety of genetic relationships among humans: a single genealogy that traces the ancestry of all of us. The study has been published today in Science.

Imagine a field of wheat that extends to the horizon, being grown for flour that will be made into bread to feed cities’ worth of people. Imagine that all authority for tilling, planting, fertilizing, monitoring and harvesting this field has been delegated to artificial intelligence: algorithms that control drip-irrigation systems, self-driving tractors and combine harvesters, clever enough to respond to the weather and the exact needs of the crop. Then imagine a hacker messes things up.

A new risk analysis, published today in the journal Nature Machine Intelligence, warns that the future use of artificial intelligence in agriculture comes with substantial potential risks for farms, farmers and food security that are poorly understood and under-appreciated.

“The idea of intelligent machines running farms is not science fiction. Large companies are already pioneering the next generation of autonomous ag-bots and decision support systems that will replace humans in the field,” said Dr. Asaf Tzachor of the University of Cambridge’s Centre for the Study of Existential Risk (CSER), first author of the paper.