Combat Environment Simulation Is Crucial for Future Conflicts

Imagine knowing the future, being able to predict what is going to happen next. The concept sounds like a dream, but in reality it is already taking shape. Modeling and simulation, data analytics, AI and machine learning, distributed systems, and social dynamics and human behavior simulation are fast becoming the go-to tools, and they could offer significant advantages in the battlespace of tomorrow.

According to army-technology.com, London-based technology provider Improbable has been working closely with the UK Ministry of Defence (MoD) since 2018 to explore the utility of synthetic environments (SEs) for tactical training and for operational and strategic planning. At the core of this work is Skyral, a platform supporting an ecosystem of industry and academia that enables the fast construction of new SEs for almost any scenario using digital entities, algorithms, AI, and historic and real-time data.

Finally, an answer to the question: AI — what is it good for?

Got a protein? This AI will tell you what it looks like.


AlphaFold was recognized by the journal Science as 2021’s Breakthrough of the Year, beating out candidates like Covid-19 antiviral pills and the application of CRISPR gene editing in the human body. One expert even wondered if AlphaFold would become the first AI to win a Nobel Prize.

The breakthroughs have kept coming.

Last week, DeepMind announced that researchers from around the world have used AlphaFold to predict the structures of some 200 million proteins from 1 million species, covering just about every protein known to human beings. All of that data is being made freely available on a database set up by DeepMind and its partner, the European Molecular Biology Laboratory’s European Bioinformatics Institute.
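As a rough illustration of what "freely available on a database" means in practice, here is a minimal sketch that downloads one predicted structure from the AlphaFold Protein Structure Database. The file-name pattern, the model version suffix, and the example UniProt accession are assumptions made for illustration and may not match the database's current layout.

```python
# Minimal sketch: fetch one AlphaFold-predicted structure from the public
# AlphaFold Protein Structure Database (alphafold.ebi.ac.uk).
# The file-name pattern and model version below are assumptions and may change.
import urllib.request


def fetch_alphafold_pdb(uniprot_id: str, version: str = "v4") -> str:
    """Download the predicted PDB file for a UniProt accession as text."""
    url = (
        "https://alphafold.ebi.ac.uk/files/"
        f"AF-{uniprot_id}-F1-model_{version}.pdb"
    )
    with urllib.request.urlopen(url) as resp:
        return resp.read().decode("utf-8")


if __name__ == "__main__":
    # P69905 (human hemoglobin subunit alpha) is used here only as an example accession.
    pdb_text = fetch_alphafold_pdb("P69905")
    print(pdb_text.splitlines()[0])  # print the first line of the PDB file
```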

Quantum Computers Can Look Beyond Zeros and Ones, Research Reveals

Researchers at the University of Innsbruck, Austria, have realized a quantum computer that breaks out of the binary paradigm and unlocks additional computational resources hidden in almost all of today’s quantum devices. The new approach yields more computational power with fewer quantum particles.

Digital computers work with zeros and ones, also called binary information, and that approach has been so successful that computers now power everything from coffee makers to self-driving cars; it’s hard to imagine life without them. Quantum computers have likewise been designed with binary information processing in mind, even though their underlying quantum particles can naturally hold more than two states. Restricting them to binary systems prevents these devices from living up to their true potential.

The research team succeeded in developing a quantum computer that can perform arbitrary calculations with so-called quantum digits (qudits), thereby unlocking more computational power with fewer quantum particles. Unlike the classical approach, using more states per particle does not compromise the reliability of the computer, so the device can exploit the full potential of its atoms.
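To make concrete why particles with more than two states yield more computational room, here is a toy counting sketch comparing the state-space dimension of qubits with that of multi-level quantum digits. It is generic arithmetic, not a model of the Innsbruck trapped-ion device; the particle counts and level numbers are arbitrary example values.

```python
# Toy comparison: state-space dimension of n qubits (d = 2) versus n qudits (d > 2).
# Generic counting only, not a model of any specific quantum computer.

def state_space_dim(n_particles: int, levels: int) -> int:
    """A register of n particles with d levels each spans d**n basis states."""
    return levels ** n_particles


if __name__ == "__main__":
    n = 10
    for d in (2, 3, 5, 7):
        print(f"{n} particles with {d} levels each: {state_space_dim(n, d):,} basis states")
    # Reaching the dimension of 10 qubits (1,024 states) needs only 7 qutrits,
    # since 3**7 = 2,187 >= 2**10 = 1,024: more power from fewer particles.
```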

A ‘nano-robot’ built entirely from DNA to explore cell processes

Constructing a tiny robot from DNA and using it to study cell processes invisible to the naked eye… You would be forgiven for thinking it is science fiction, but it is in fact the subject of serious research by scientists from Inserm, CNRS and Université de Montpellier at the Structural Biology Center in Montpellier. This highly innovative “nano-robot” should enable closer study of the mechanical forces applied at microscopic levels, which are crucial for many biological and pathological processes. It is described in a new study published in Nature Communications.

Our cells are subject to mechanical forces exerted on a microscopic scale, triggering biological signals essential to many processes involved in the normal functioning of our body or in the development of diseases.

For example, the feeling of touch is partly conditional on the application of mechanical forces on specific cell receptors (a discovery rewarded this year by the Nobel Prize in Physiology or Medicine). In addition to touch, these receptors that are sensitive to mechanical forces (known as mechanoreceptors) enable the regulation of other key biological processes such as blood vessel constriction, pain perception, breathing and even the detection of sound waves in the ear.

Neural networks and ‘ghost’ electrons accurately reconstruct behavior of quantum systems

Physicists are (temporarily) augmenting reality to crack the code of quantum systems.

Predicting the properties of a molecule or material requires calculating the collective behavior of its electrons. Such predictions could one day help researchers develop new pharmaceuticals or design materials with sought-after properties such as superconductivity. The problem is that electrons can become “quantum mechanically” entangled with one another, meaning they can no longer be treated individually. The entangled web of connections becomes absurdly tricky for even the most powerful computers to unravel directly for any system with more than a handful of particles.
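To see why that web of entanglement overwhelms direct computation, here is a back-of-the-envelope sketch: a brute-force description of n entangled two-level particles needs 2^n complex amplitudes. The 16-bytes-per-amplitude figure below is an assumption used only to turn that count into a memory estimate.

```python
# Back-of-the-envelope: memory needed to store the full quantum state of n
# entangled two-level particles, assuming 16 bytes per complex amplitude.

BYTES_PER_AMPLITUDE = 16  # one double-precision complex number


def full_state_bytes(n_particles: int) -> int:
    """A brute-force state vector over n two-level particles has 2**n amplitudes."""
    return (2 ** n_particles) * BYTES_PER_AMPLITUDE


if __name__ == "__main__":
    for n in (10, 30, 50, 80):
        print(f"{n:>2} particles: {full_state_bytes(n):.3e} bytes")
    # Around 50 particles the state vector already runs to petabytes, which is
    # why approximate, variational approaches are used instead of storing it.
```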

Now, researchers at the Flatiron Institute’s Center for Computational Quantum Physics (CCQ) in New York City and at the École Polytechnique Fédérale de Lausanne (EPFL) in Switzerland have sidestepped the problem. They created a way to simulate entanglement by adding to their computations extra “ghost” electrons that interact with the system’s actual electrons.

Is Artificial Sentience Here? with Blake Lemoine

Recently, Blake Lemoine, a computer scientist and machine learning bias researcher at Google, released an interview with Google’s LaMDA, a conversational AI technology. Based on his time testing LaMDA, Blake proposes that it is a superintelligence and sentient. He details what led him to this conclusion and why he believes we passed the singularity last year.

Blake’s links:
https://twitter.com/cajundiscordian.
https://cajundiscordian.medium.com/is-lamda-sentient-an-interview-ea64d916d917

News links:
https://www.vox.com/23167703/google-artificial-intelligence-…l-sentient.
https://blog.google/technology/ai/lamda/

Youtube Membership: https://www.youtube.com/channel/UCz3qvETKooktNgCvvheuQDw/join.
Podcast: https://anchor.fm/john-michael-godier/subscribe.
Apple: https://apple.co/3CS7rjT

More JMG

Supermind by John Michael Godier — https://amzn.to/3uvDaW0 (affiliate link)

Allowing social robots to learn relations between users’ routines and their mood

Social robots, robots that can interact with humans and assist them in their daily lives, are gradually being introduced in numerous real-world settings. These robots could be particularly valuable for helping older adults to complete everyday tasks more autonomously, thus potentially enhancing their independence and well-being.

Researchers at the University of Bari have been investigating the potential of using social robots for ambient assisted living applications for numerous years. Their most recent paper, published in UMAP ’22 Adjunct: Adjunct Proceedings of the 30th ACM Conference on User Modeling, Adaptation and Personalization, specifically explores the value of allowing social robots that assist seniors to learn the relationships between a user’s routines and his/her mood.

“Social robots should support older adults with everyday tasks and, at the same time, they should contribute to emotional wellness by considering affective factors in everyday situations,” Berardina De Carolis, Stefano Ferilli and Nicola Macciarulo wrote in their paper. “The main goal of this research is to investigate whether it is possible to learn relations between the user’s affective state and routines, made up of activities, with the aid of a social robot, Pepper in this case.”
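Purely as an illustrative sketch, and not the learning method used in the paper, the toy code below shows one simple way a robot could tally how often daily activities co-occur with a reported mood and estimate the association between them. The activity names, mood labels, and class design are invented for the example.

```python
# Illustrative sketch only: a toy tally of how often each daily activity
# co-occurs with each reported mood. This is NOT the method from the UMAP '22
# paper; activities and mood labels are invented examples.
from collections import defaultdict


class RoutineMoodModel:
    def __init__(self):
        # counts[activity][mood] = number of days both were observed together
        self.counts = defaultdict(lambda: defaultdict(int))

    def observe_day(self, activities, mood):
        """Record one day's activities together with the user's reported mood."""
        for activity in activities:
            self.counts[activity][mood] += 1

    def mood_probability(self, activity, mood):
        """Relative frequency of a mood on days that included the activity."""
        total = sum(self.counts[activity].values())
        return self.counts[activity][mood] / total if total else 0.0


if __name__ == "__main__":
    model = RoutineMoodModel()
    model.observe_day(["morning walk", "video call with family"], "positive")
    model.observe_day(["skipped lunch", "watched TV all day"], "negative")
    model.observe_day(["morning walk", "gardening"], "positive")
    print(model.mood_probability("morning walk", "positive"))  # 1.0 on this toy data
```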

A crab-inspired artificial vision system for both terrestrial and aquatic environments

To efficiently navigate real-world environments, robots typically analyze images collected by imaging devices that are integrated within their body. To enhance the performance of robots, engineers have thus been trying to develop different types of highly performing cameras, sensors and artificial vision systems.

Many artificial vision systems developed so far draw inspiration from the eyes of humans, animals, insects and fish. These systems have different features and characteristics, depending on the environments in which they are designed to operate.

Most existing sensors and cameras are designed to work either on the ground (i.e., in terrestrial environments) or underwater (i.e., in aquatic environments). Bio-inspired artificial vision systems that can operate in both terrestrial and aquatic environments, on the other hand, remain scarce.
