
## JOURNAL OF THE AMERICAN CHEMICAL SOCIETY • JUN 4, 2021.

# *A lovely single-step bio-inspired process with some interesting, complex benefits, particularly for humans on Mars.*

*by Holly Ober, University of California, Riverside*

A team led by UC Riverside engineers has developed a catalyst to remove a dangerous chemical from water on Earth that could also make Martian soil safer for agriculture and help produce oxygen for human Mars explorers.

Perchlorate, a negative ion consisting of one chlorine atom bonded to four oxygen atoms, occurs naturally in some soils on Earth, and is especially abundant in Martian soil. As a powerful oxidizer, perchlorate is also manufactured and used in solid rocket fuel, fireworks, munitions, airbag initiators for vehicles, matches and signal flares. It is a byproduct in some disinfectants and herbicides.
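The article doesn't spell out the chemistry, but the oxygen angle follows from how oxygen-rich the ion is. As a rough back-of-envelope illustration (assuming idealized complete decomposition of ClO₄⁻ into chloride and O₂, which is not necessarily the exact pathway the UC Riverside catalyst follows), here is the potential oxygen yield per kilogram of perchlorate:

```python
# Back-of-envelope: oxygen locked up in perchlorate (ClO4-).
# Assumes complete decomposition to chloride and oxygen: ClO4-  ->  Cl-  +  2 O2
# (an idealized reaction, not the specific pathway reported in the paper).

M_Cl = 35.45   # g/mol, chlorine
M_O  = 16.00   # g/mol, oxygen

M_perchlorate = M_Cl + 4 * M_O        # ~99.45 g/mol for the ClO4- ion
M_O2_released = 2 * (2 * M_O)         # two O2 molecules per ion, 64 g/mol

o2_per_kg = M_O2_released / M_perchlorate  # kg of O2 per kg of perchlorate

print(f"~{o2_per_kg:.2f} kg of O2 per kg of perchlorate if fully decomposed")
# -> roughly 0.64 kg of O2 per kg of perchlorate
```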

New EPFL research has found that almost half of local Twitter trending topics in Turkey are fake, a scale of manipulation previously unheard of. It also proves for the first time that many trends are created solely by bots due to a vulnerability in Twitter’s Trends algorithm.

Social media has become ubiquitous in our modern, daily lives. It has changed the way that people interact, connecting us in previously unimaginable ways. Yet, where once our social media networks probably consisted of a small circle of friends, most of us are now part of much larger communities that can influence what we read, do, and even think.

One influencing mechanism, for example, is “Twitter Trends.” The platform uses an algorithm to determine hashtag-driven topics that become popular at a given point in time, alerting users to the top words, phrases, subjects and popular hashtags globally and locally.
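Twitter has never published the exact Trends algorithm, so the sketch below is only a toy illustration of the general idea the article gestures at: a hashtag “trends” when its volume in a recent window spikes well above its historical baseline. The function name, thresholds and data layout are all hypothetical.

```python
from collections import Counter

def trending_hashtags(recent_tweets, baseline_tweets, min_count=50, spike_ratio=5.0):
    """Toy trend detector: flag hashtags whose volume in a recent window
    spikes well above their historical baseline. Illustrative only, not
    Twitter's actual Trends algorithm."""
    recent = Counter(tag for tweet in recent_tweets for tag in tweet["hashtags"])
    baseline = Counter(tag for tweet in baseline_tweets for tag in tweet["hashtags"])

    trends = []
    for tag, count in recent.items():
        base = max(baseline.get(tag, 0), 1)          # avoid division by zero
        if count >= min_count and count / base >= spike_ratio:
            trends.append((tag, count / base))
    return sorted(trends, key=lambda t: t[1], reverse=True)

# Hypothetical usage: each tweet is a dict with a "hashtags" list.
recent = [{"hashtags": ["#fake_trend"]}] * 120
baseline = [{"hashtags": ["#fake_trend"]}] * 3
print(trending_hashtags(recent, baseline))   # [('#fake_trend', 40.0)]
```

A production system would layer spam filtering, personalization and many other signals on top of a simple volume spike like this.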

The researchers started with a sample taken from the temporal lobe of a human cerebral cortex, measuring just 1 mm³. This was stained for visual clarity, coated in resin to preserve it, and then cut into about 5,300 slices, each about 30 nanometers (nm) thick. These were then imaged using a scanning electron microscope at a resolution down to 4 nm. That created 225 million two-dimensional images, which were then stitched back together into a single 3D volume.

Machine learning algorithms scanned the sample to identify the different cells and structures within. After a few passes by different automated systems, human eyes “proofread” some of the cells to ensure the algorithms were correctly identifying them.

The end result, which Google calls the H01 dataset, is one of the most comprehensive maps of the human brain ever compiled. It contains 50,000 cells and 130 million synapses, as well as smaller segments of the cells such as axons, dendrites, myelin and cilia. But perhaps the most stunning statistic is that the whole thing takes up 1.4 petabytes of data – that’s more than a million gigabytes.
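Those figures are easy to sanity-check. Assuming roughly one byte per voxel at the stated 4 nm pixel size and 30 nm section thickness (a simplification; the released dataset is stored and compressed differently), a cubic millimetre of tissue lands in the same petabyte ballpark:

```python
# Sanity check on the H01 dataset size from the figures quoted above.
# Assumes ~1 byte per voxel at the stated resolution; the real dataset
# is stored differently, so this is only an order-of-magnitude estimate.

nm_per_mm = 1_000_000
voxel_nm3 = 4 * 4 * 30                  # 4 nm x 4 nm pixels, 30 nm sections
sample_nm3 = nm_per_mm ** 3             # 1 mm^3 of cortex in cubic nanometres

voxels = sample_nm3 / voxel_nm3         # ~2.1e15 voxels
raw_petabytes = voxels / 1e15           # ~2 PB at one byte per voxel

print(f"{voxels:.2e} voxels  ->  ~{raw_petabytes:.1f} PB raw")
# Same order of magnitude as the 1.4 PB quoted for the released dataset.
```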

It’s either some obscure fluid effect or black magic.


Just when I think I’ve seen every possible iteration of climbing robot, someone comes up with a new way of getting robots to stick to things. The latest technique comes from the Bioinspired Robotics and Design Lab at UCSD, where they’ve managed to get a robot to stick to smooth surfaces using a vibrating motor attached to a flexible disk. How the heck does it work?

The Beijing Academy of Artificial Intelligence (BAAI) researchers announced this week a natural language processing model called WuDao 2.0 that, per the South China Morning Post, is more advanced than similar models developed by OpenAI and Google.

The report said WuDao 2.0 uses 1.75 trillion parameters to “simulate conversational speech, write poems, understand pictures and even generate recipes.” The models developed by OpenAI and Google are meant to do similar things, but they use fewer parameters to do so, which the report suggests makes WuDao 2.0 likely better at those tasks.

Biobots could help us with new organs! 😃


Computer scientists and biologists have teamed up to create creatures heretofore unseen on Earth: living robots. Made from the cells of frogs and designed by artificial intelligence, they’re called xenobots, and they may soon revolutionize everything from how we fight pollution to organ transplants.


When OpenAI’s GPT-3 model made its debut in May of 2020, its performance was widely considered to be the state of the art. Capable of generating text indiscernible from human-crafted prose, GPT-3 set a new standard in deep learning. But oh what a difference a year makes. Researchers from the Beijing Academy of Artificial Intelligence announced on Tuesday the release of their own generative deep learning model, Wu Dao, a mammoth AI seemingly capable of doing everything GPT-3 can do, and more.

First off, Wu Dao is flat-out enormous. It has 1.75 trillion parameters (essentially, the coefficients the model learns during training), a full ten times more than the 175 billion GPT-3 was trained with and 150 billion parameters more than Google’s Switch Transformer.
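The comparisons are simple arithmetic; the Switch Transformer figure of roughly 1.6 trillion parameters is implied by the stated 150 billion gap rather than quoted directly:

```python
# Quick check of the parameter comparisons quoted above.
wu_dao = 1_750_000_000_000      # Wu Dao 2.0: 1.75 trillion parameters
gpt3   =   175_000_000_000      # GPT-3: 175 billion parameters
switch = 1_600_000_000_000      # ~1.6 trillion, implied by the stated 150B gap

print(wu_dao // gpt3)                      # 10  -> "a full ten times"
print((wu_dao - switch) // 1_000_000_000)  # 150 -> "150 billion more"
```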

In order to train a model with this many parameters, and to do so quickly – Wu Dao 2.0 arrived just three months after version 1.0’s release in March – the BAAI researchers first developed an open-source learning system akin to Google’s Mixture of Experts, dubbed FastMoE. This system, which runs on PyTorch, enabled the model to be trained both on clusters of supercomputers and on conventional GPUs. That gives FastMoE more flexibility than Google’s system: because it doesn’t require proprietary hardware like Google’s TPUs, it can run on off-the-shelf hardware as well as on supercomputing clusters.
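The article doesn't show FastMoE's API, so the snippet below is only a minimal sketch of the mixture-of-experts idea it implements, written in plain PyTorch: a small gating network routes each token to one of several expert feed-forward networks, so total parameters grow with the number of experts while the compute per token stays roughly constant. The class name, sizes and routing scheme are illustrative, not FastMoE's actual interface.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class ToyMoELayer(nn.Module):
    """Minimal top-1 mixture-of-experts layer (illustrative only, not
    FastMoE's real API). Each token is routed to a single expert MLP
    chosen by a learned gating network."""

    def __init__(self, d_model=512, d_hidden=2048, num_experts=8):
        super().__init__()
        self.gate = nn.Linear(d_model, num_experts)   # router
        self.experts = nn.ModuleList([
            nn.Sequential(
                nn.Linear(d_model, d_hidden),
                nn.GELU(),
                nn.Linear(d_hidden, d_model),
            )
            for _ in range(num_experts)
        ])

    def forward(self, x):                             # x: (tokens, d_model)
        scores = F.softmax(self.gate(x), dim=-1)
        top_score, top_idx = scores.max(dim=-1)       # pick one expert per token
        out = torch.zeros_like(x)
        for i, expert in enumerate(self.experts):
            mask = top_idx == i
            if mask.any():
                # scale by the gate score so routing stays differentiable
                out[mask] = expert(x[mask]) * top_score[mask].unsqueeze(-1)
        return out

# Parameters scale with num_experts, but each token only pays for one expert.
layer = ToyMoELayer()
y = layer(torch.randn(16, 512))
```

Systems like FastMoE add load balancing and spread the experts across many devices, which is broadly how models in the trillion-parameter range become trainable on clusters of ordinary GPUs.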

In simple terms, comparing previous autonomy standards with Exyn’s is like the difference between self-navigating a single, defined road and navigating uncharted terrain in unknown, unmapped territory. Unlike a car, however, a drone must be able to manoeuvre in three dimensions and pack all its intelligence and sensors into a fraction of the space, under severe weight restrictions.

“People have been talking about Level 4 Autonomy in driverless cars for some time, but having that same degree of intelligence condensed onboard a self-sufficient UAV is an entirely different engineering challenge in and of itself,” said Jason Derenick, CTO at Exyn Technologies. “Achieving Level 5 is the holy grail of autonomous systems – this is when the drone can demonstrate 100% control in an unbounded environment, without any input from a human operator whatsoever. While I don’t believe we will witness this in my lifetime, I do believe we will push the limits of what’s possible with advanced Level 4. We are already working on attaining Level 4B autonomy with swarms, or collaborative multi-robot systems.”

“There’s things that we want to do to make it faster, make it higher resolution, make it more accurate,” said Elm, in an interview with Forbes. “But the other thing we were kind of contemplating is basically the ability to have multiple robots collaborate with each other so you can scale the problem – both in terms of scale and scope. So you can have multiple identical robots on a mission, so you can actually now cover a larger area, but also have specialised robots that might be different. So, heterogeneous swarms so they can actually now have specialised tasks and collaborate with each other on a mission.”