A new class of chemical instrumentation seeks to alleviate the tedium and complexity of organic syntheses.
Category: robotics/AI
Billionaire company founder Elon Musk tweeted a pair of photos this week apparently showing progress on one of the Starship prototypes SpaceX is currently developing.
SpaceX has said it plans to use the rockets to shuttle passengers and cargo across the planet or beyond it.
“Droid Junkyard, Tatooine,” Musk wrote in the first tweet, making a joking “Star Wars” reference.
NASA is on a mission to go back to the moon by 2024 and use it as a “backyard” of experimentation, according to Lucien Junkin, chief engineer of the space exploration vehicle at NASA.
ABC toured NASA’s Johnson Space Center’s robotics design area: https://abcn.ws/2AleGSQ
The wave function represents the quantum state of an atom or molecule, encoding the positions and spins of its nuclei and electrons. For decades, researchers have struggled to determine the exact wave function of even an ordinary chemical system, where the nuclear positions are held fixed and only the electrons move. Pinning down the wave function has proven intractable in practice, even with the Schrödinger equation in hand.
Previous research in this field applied quantum Monte Carlo (QMC) methods with a Slater-Jastrow Ansatz, which takes a linear combination of Slater determinants and multiplies it by a Jastrow factor to capture short-range electron correlations.
Now, a group of DeepMind researchers has taken QMC further with the Fermionic Neural Network, or Fermi Net, a neural-network Ansatz with greater flexibility and higher accuracy. Fermi Net takes the electron configuration of a molecule or chemical system as input and outputs an estimate of its wave function, which can then be used to compute the energy states of the system.
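To make the QMC idea concrete, here is a minimal variational Monte Carlo sketch, not Fermi Net itself: a hand-picked trial wave function for the hydrogen atom (psi = exp(-alpha*r), an assumption chosen for illustration) is sampled with the Metropolis algorithm, and the average local energy estimates the system's energy. Fermi Net replaces the hand-picked trial function with a neural network trained to minimize this energy.

```python
import numpy as np

rng = np.random.default_rng(0)

def local_energy(r, alpha):
    # For the trial wave function psi(r) = exp(-alpha * |r|) of hydrogen,
    # the local energy in atomic units works out to:
    #   E_L = -alpha^2 / 2 + (alpha - 1) / |r|
    dist = np.linalg.norm(r, axis=1)
    return -0.5 * alpha**2 + (alpha - 1.0) / dist

def vmc_energy(alpha, n_walkers=2000, n_steps=500, step=0.5):
    # Metropolis sampling of |psi|^2, then average the local energy.
    r = rng.normal(size=(n_walkers, 3))
    for _ in range(n_steps):
        r_new = r + step * rng.normal(size=r.shape)
        # Acceptance ratio |psi(r_new) / psi(r)|^2
        ratio = np.exp(-2.0 * alpha * (np.linalg.norm(r_new, axis=1)
                                       - np.linalg.norm(r, axis=1)))
        accept = rng.random(n_walkers) < ratio
        r[accept] = r_new[accept]
    return local_energy(r, alpha).mean()

# At alpha = 1 the trial function is exact and the energy is -0.5 Hartree;
# any other alpha gives a higher (worse) energy by the variational principle.
print(vmc_energy(1.0))   # -0.5
print(vmc_energy(0.8))   # above -0.5
```

The variational principle guarantees the estimated energy is never below the true ground-state energy, so "lower is better" gives a clean training signal for a neural Ansatz.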
To evaluate the performance of robotics algorithms and controllers, researchers typically use software simulations or real physical robots. While these may appear to be two distinct evaluation strategies, there is a spectrum of intermediate possibilities that combine elements of both.
In a recent study, researchers at Texas A&M University and the University of South Carolina have set out to examine evaluation and execution scenarios that lie at an intersection between simulations and real implementations. Their investigation, outlined in a paper pre-published on arXiv, specifically focuses on instances in which real robots perceive the world via their sensors, where the environment they sense could be seen as a mere illusion.
“We consider problems in which robots conspire to present a view of the world that differs from reality,” Dylan Shell and Jason O’Kane, the researchers who carried out the study, wrote in their paper. “The inquiry is motivated by the problem of validating robot behavior physically despite there being a discrepancy between the robots we have at hand and those we wish to study, or the environment for testing that is available versus that which is desired, or other potential mismatches in this vein.”
Researchers at Hefei University of Technology in China and various universities in Japan have recently developed a unique emotion sensing system that can recognize people’s emotions based on their body gestures. They presented this new AI-powered system, called EmoSense, in a paper pre-published on arXiv.
“In our daily life, we can clearly realize that body gestures contain rich mood expressions for emotion recognition,” Yantong Wang, one of the researchers who carried out the study, told TechXplore. “Meanwhile, we can also find out that human body gestures affect wireless signals via shadowing and multi-path effects when we use antennas to detect behavior. Such signal effects usually form unique patterns or fingerprints in the temporal-frequency domain for different gestures.”
Wang and his colleagues observed that human body gestures can affect wireless signals, producing characteristic patterns that could be used for emotion recognition. This inspired them to develop a system that can identify these patterns, recognizing people’s emotions based on their physical movements.
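The fingerprint-matching idea behind this kind of sensing can be sketched in a few lines. The following toy example is an illustration only, not EmoSense's actual pipeline: each simulated gesture modulates a signal at a characteristic frequency (an assumption for demonstration), the "fingerprint" is the normalized frequency spectrum, and recognition is a nearest-neighbour match against a small labelled library.

```python
import numpy as np

rng = np.random.default_rng(1)

def gesture_signal(freq, n=256):
    # Toy stand-in for a wireless channel measurement: each gesture is
    # assumed to modulate the signal at one characteristic frequency,
    # plus noise. Real systems measure far richer multi-path effects.
    t = np.arange(n)
    return np.sin(2 * np.pi * freq * t / n) + 0.3 * rng.normal(size=n)

def fingerprint(signal):
    # Temporal-frequency "fingerprint": normalized FFT magnitude spectrum.
    spec = np.abs(np.fft.rfft(signal))
    return spec / np.linalg.norm(spec)

# Build a small labelled library of fingerprints (gesture names and their
# modulation frequencies are made up for this sketch).
gestures = {"arms_crossed": 5, "hands_up": 12, "slouch": 20}
library = {name: fingerprint(gesture_signal(f)) for name, f in gestures.items()}

def classify(signal):
    # Nearest-neighbour match: highest spectral correlation wins.
    fp = fingerprint(signal)
    return max(library, key=lambda name: fp @ library[name])

print(classify(gesture_signal(12)))  # matches "hands_up"
```

The same match-against-stored-fingerprints pattern underlies many wireless sensing systems; the engineering effort goes into making the fingerprints robust to noise, people, and room layout.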
Far from being a mystical “ghost in the machine”, consciousness evolved as a practical mental tool and we could engineer it in a robot using these simple guidelines.
Machine learning predicts 43 previously unknown, superhard forms of carbon. Could one offer a cheaper alternative to diamond?
Downloading your brain may seem like science fiction, but some neuroscientists think it’s not only possible, but that we’ve already started down a path to one day make it a reality. So, how close are we to downloading a human brain?
How Close Are We to Fusion Energy? — https://youtu.be/ZW_YCWLyv6A
We’ve Put a Worm’s Mind in a Lego Robot’s Body
https://www.smithsonianmag.com/smart-news/weve-put-worms-min…399/?no-is
“A wheeled Lego robot may not look like a worm, but it ‘thinks’ like one after programmers gave it the neuron connections in a C. elegans roundworm”
Crumb of Mouse Brain Reconstructed in Full Detail
Self-driving vehicles will lead to a rise in car sex, according to a new study.
People will be more likely to eat, sleep and engage in on-the-road hanky-panky when robot cars become the new normal, according to research published in the most recent issue of the journal Annals of Tourism Research.
“People will be sleeping in their vehicles, which has implications for roadside hotels. And people may be eating in vehicles that function as restaurant pods,” Scott Cohen, who led the study, told Fast Company magazine.