We’re excited to reveal the latest in Spot’s expanded product line. Join us live on Tuesday, February 2nd @ 11 am EST, to hear how these products will extend Spot’s value for autonomous inspection and data collection.
The robotic arm was invented by David Gathu and Moses Kinyua and is powered by brain signals.
The signals are converted into an electric current by a “NeuroNode” biopotential headset receiver; that current is then fed into the robot’s circuitry, which drives the arm’s movement.
The arm is built from several materials, including recycled wood, and moves both vertically and horizontally.
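A minimal sketch of that signal-to-motion idea, assuming a hypothetical read_biopotential() source and motor driver (the NeuroNode interface and the inventors’ actual circuitry are not reproduced here):

```python
# Illustrative biopotential-to-motor pipeline (a sketch, not the inventors' design).
# read_biopotential() and MotorDriver are hypothetical stand-ins.
import random
import time

def read_biopotential() -> float:
    """Pretend to sample a headset signal, in microvolts."""
    return random.gauss(0.0, 20.0)

class MotorDriver:
    """Hypothetical motor interface: nonzero duty moves the arm."""
    def set_duty(self, channel: str, duty: float) -> None:
        print(f"{channel} motor duty set to {duty:+.2f}")

THRESHOLD_UV = 30.0  # only a strong, deliberate signal should move the arm

def control_loop(driver: MotorDriver, cycles: int = 10) -> None:
    for _ in range(cycles):
        sample = read_biopotential()
        if abs(sample) > THRESHOLD_UV:
            # Map signal strength to a bounded motor command.
            driver.set_duty("vertical", max(-1.0, min(1.0, sample / 100.0)))
        else:
            driver.set_duty("vertical", 0.0)  # relax below the threshold
        time.sleep(0.05)

if __name__ == "__main__":
    control_loop(MotorDriver())
```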
Micro-sized robots could bring a new wave of innovation in the medical field by allowing doctors to access specific regions inside the human body without the need for highly invasive procedures. Among other things, these tiny robots could be used to carry drugs, genes or other substances to specific sites inside the body, opening up new possibilities for treating different medical conditions.
Researchers at ETH Zurich and Helmholtz Institute Erlangen–Nürnberg for Renewable Energy have recently developed micro and nano-sized robots inspired by biological micro-swimmers (e.g., bacteria or spermatozoa). These small robots, presented in a paper published in Nature Machine Intelligence, are capable of upstream motility, which essentially means that they can autonomously move in the opposite direction to that in which a fluid (e.g., blood) flows. This makes them particularly promising for intervening inside the human body.
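In simplified terms (an illustrative first-order picture, not the model developed in the paper), a micro-swimmer only makes net headway against the flow when its self-propulsion speed exceeds the local flow speed:

```latex
% Illustrative condition for upstream motility (not taken from the paper):
% the swimmer's net velocity along the flow axis.
\[
  v_{\mathrm{net}} = v_{\mathrm{swim}} - v_{\mathrm{flow}}, \qquad
  v_{\mathrm{net}} > 0 \iff v_{\mathrm{swim}} > v_{\mathrm{flow}}
\]
```

Because flow is slowest near vessel walls (the no-slip condition), upstream swimming is easiest close to a boundary, which is where biological micro-swimmers such as sperm cells are typically observed doing it.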
“We believe that the ideas discussed in our multidisciplinary study can transform many aspects of medicine by enabling tasks such as targeted and precise delivery of drugs or genes, as well as facilitating non-invasive surgeries,” Daniel Ahmed, lead author of the recent paper, told TechXplore.
Providing workers with a universal basic income worth about one-fifth of median wages did not reduce the amount of effort they put into their work, according to an experiment conducted by Spanish economists, a sign that the policy could help mitigate inequalities and the impact of automation, and a result that debunks a common criticism of the proposal.
The researchers also found that the threat of being replaced by robots did not affect workers’ productivity, nor did a tax on firms for replacing a worker with a robot or automated process, though the tax did create a disincentive for managers to automate.
Our goal is audacious — some might even say naive. The aim is to evaluate every gene and drug perturbation in every possible type of cancer in laboratory experiments, and to make the data accessible to researchers and machine-learning experts worldwide. To put some ballpark numbers on this ambition, we think it will be necessary to perturb 20,000 genes and assess the activity of 10,000 drugs and drug candidates in 20,000 cancer models, and measure changes in viability, morphology, gene expression and more. Technologies from CRISPR genome editing to informatics now make this possible, given enough resources and researchers to take on the task.
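To get a feel for the scale those ballpark figures imply, here is a rough back-of-envelope count (illustrative arithmetic only, not numbers quoted above):

```python
# Back-of-envelope scale of the proposed screening effort (illustrative only).
genes = 20_000    # gene perturbations to evaluate per cancer model
drugs = 10_000    # drugs and drug candidates to assess per cancer model
models = 20_000   # cancer models

gene_experiments = genes * models   # 4.0e+08 perturbation experiments
drug_experiments = drugs * models   # 2.0e+08 drug-response experiments

print(f"gene-perturbation experiments: {gene_experiments:.1e}")
print(f"drug-response experiments:     {drug_experiments:.1e}")
print(f"combined:                      {gene_experiments + drug_experiments:.1e}")
```

Even before multiplying by the number of readouts (viability, morphology, gene expression and more), the combined count is on the order of hundreds of millions of experiments.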
It is time to move beyond tumour sequencing data to identify vulnerabilities in cancers.
Exploring new approaches to improve the capabilities and accuracy of robots, a team of researchers in Singapore has turned to an unexpected source: plants.
Robots have been dispatched to move cars, lift heavy inventory in warehouses and assist in construction projects.
But what if you need to delicately lift a tiny object measuring just 1/50th of an inch (about half a millimeter)?
In recent years, computer scientists worldwide have developed a wide range of deep neural network-based models that can predict people’s emotions from their facial expressions. Most of the models developed so far, however, detect only primary emotional states such as anger, happiness and sadness, rather than more subtle aspects of human emotion.
Past psychology research, on the other hand, has delineated numerous dimensions of emotion, for instance, introducing measures such as valence (i.e., how positive an emotional display is) and arousal (i.e., how calm or excited someone is while expressing an emotion). While estimating valence and arousal simply by looking at people’s faces is easy for most humans, it can be challenging for machines.
Researchers at Samsung AI and Imperial College London have recently developed a deep-neural-network-based system that can estimate emotional valence and arousal with high levels of accuracy simply by analyzing images of human faces taken in everyday settings. This model, presented in a paper published in Nature Machine Intelligence, can make predictions fairly quickly, which means that it could be used to detect subtle qualities of emotion in real time (e.g., from snapshots of CCTV cameras).
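As a generic illustration of what continuous valence-arousal estimation looks like in code (a sketch only, not the architecture from the paper), a model can attach a two-output regression head to an image backbone and constrain both outputs to the conventional [-1, 1] range:

```python
# Generic valence/arousal regression sketch in PyTorch (illustrative only;
# the ResNet-18 backbone is an arbitrary choice, not the authors' model).
import torch
import torch.nn as nn
from torchvision import models

class ValenceArousalNet(nn.Module):
    def __init__(self):
        super().__init__()
        backbone = models.resnet18(weights=None)
        backbone.fc = nn.Identity()      # drop the 1000-class ImageNet head
        self.backbone = backbone
        self.head = nn.Linear(512, 2)    # two continuous outputs: valence, arousal

    def forward(self, faces: torch.Tensor) -> torch.Tensor:
        features = self.backbone(faces)
        # tanh keeps both predictions inside the standard [-1, 1] range
        return torch.tanh(self.head(features))

model = ValenceArousalNet().eval()
with torch.no_grad():
    batch = torch.randn(4, 3, 224, 224)   # four dummy face crops
    valence_arousal = model(batch)         # shape: (4, 2)
print(valence_arousal)
```

In practice such a head is trained with a regression loss (for example mean squared error or a concordance-correlation objective) against human-annotated valence and arousal labels.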