MIT’s Computer Science and Artificial Intelligence Lab (CSAIL) is developing a robot that sorts items for recycling. The team’s robot arm has soft grippers, and the robot can pick objects from a conveyor belt and identify, by touch, what they are made of.
The robot’s main feature is its tactile sensors. The sensorized gripper is fully electrically driven and can tell the difference between paper, metal, and plastic.
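The idea of touch-based material identification can be sketched as a simple classifier over tactile features. The features, centroid values, and the nearest-centroid rule below are illustrative assumptions, not CSAIL’s actual pipeline:

```python
# Hypothetical sketch (not CSAIL's actual method): classify a grasped object's
# material from two tactile features -- measured stiffness and surface-texture
# variance -- using a nearest-centroid rule.
import math

# Illustrative centroids (stiffness in kPa, texture variance); made-up values.
CENTROIDS = {
    "paper":   (5.0, 0.8),
    "plastic": (40.0, 0.2),
    "metal":   (200.0, 0.05),
}

def classify(stiffness_kpa: float, texture_var: float) -> str:
    """Return the material whose feature centroid is nearest."""
    def dist(c):
        return math.hypot(stiffness_kpa - c[0], texture_var - c[1])
    return min(CENTROIDS, key=lambda m: dist(CENTROIDS[m]))

print(classify(6.0, 0.7))    # prints "paper"
print(classify(180.0, 0.1))  # prints "metal"
```

A real system would learn such decision boundaries from labeled grasp data rather than hand-set centroids, but the structure of the problem is the same: map tactile readings to a material label.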
Why this matters: “Although environmental and sustainability concerns have made it crucial to scale up recycling operations, object sorting remains a critical bottleneck for recycling scalability,” they wrote in the paper describing the work.
Announces the publication of a new open-access quarterly report: AI for Drug Discovery, Biomarker Development and Advanced R&D Landscape Overview 2019/Q1. In addition to analysis of 350 investors, 50 corporations, and 150 companies operating in the field, the report covers the main events that took place in the industry from January to March 2019. It also features a list of 30 leading R&D centers conducting important research in the segment.
Link to the Report: https://www.ai-pharma.dka.global/quarter-1-2019
#AI #artificialintelligence #drugdiscovery
Presents its list of the top 30 FemTech influencers, whose efforts in the FemTech Healthcare, FemTech Preventive Medicine, and FemTech Longevity sectors have helped grow the industry to its current state of maturity.
Jill Angelo (genneve); Elina Berglund (Natural Cycles); Starling Bank; Tania Boler (Elvie); Ghela Boskovich; Judith Campisi; Adia; Femtech Collective; Dame Products; EMBR; Robin Starbuck Farmanfarmaian; Cora Lifestyle; Angie Lee; Janet Lieberman; Nuala Murphy (Moment.Health); Elena Mustatea (Bold Health); Anastasia Georgievskaya (Haut.AI); Maven Clinic; THINX; Nicole Shanahan (Clearaccessip, Inc.); Tammy Sun; Ida Tin
Link to the Report: https://www.aginganalytics.com/femtech-healthcare-q1-2019
A new study from the Army Research Lab may help AI-infused weapons and tools better understand their human operators.
In World War II, the Allies had a big problem. Germany’s new bombers moved too quickly for the anti-aircraft methods of the previous war, in which soldiers used range tables and hand calculations to line up their guns. Mathematician Norbert Wiener had a theory: the only way to defeat the German aircraft was to merge the gun and its human operators — not physically but perceptually, through instruments. As Wiener explained, that meant “either a human interpretation of the machine, or a machine interpretation of the operator, or both.” This was the only way to get the gun to fire a round on target — not where the plane was but where it was going to be. This theoretical merger of human and machine gave rise to the field of cybernetics, a term derived from the Greek kybernan, to steer or govern.
Already deployed in over 50 stores around Japan, the VaakEye system constantly monitors security camera footage, detects suspicious activities and alerts staff, who can instantly review the footage and act on it. And the company is getting ready to launch Amazon-style auto-checkout as well.
Researchers at the Auckland Bioengineering Institute and Technische Universität Dresden have recently designed a new type of inflatable robot for space navigation. These robots, presented in a paper published in the SPIE Digital Library, were created using dielectric elastomer transducers (DETs), which are essentially electrical capacitors made from soft rubbery materials.
“Current space technology is limited by its mass and volume. It takes thousands of dollars to launch even a single kilogram into orbit,” Joseph Ashby, one of the researchers who carried out the study, told TechXplore. “Our research aims to replace or augment current technology with lighter smart-material replacements combined with inflatable structures.”
DETs deform when a voltage is applied to them, due to the Maxwell stress generated by the electric field. Integrated with inflatable structures, they could therefore aid the development of soft, low-mass robots that pack efficiently and are easy to deploy.
The Internet comprises a decentralized global system that serves humanity’s collective effort to generate, process, and store data, most of which is handled by the rapidly expanding cloud. A stable, secure, real-time system may allow for interfacing the cloud with the human brain. One promising strategy for enabling such a system, denoted here as a “human brain/cloud interface” (“B/CI”), would be based on technologies referred to here as “neuralnanorobotics.” Future neuralnanorobotics technologies are anticipated to facilitate accurate diagnoses and eventual cures for the ∼400 conditions that affect the human brain. Neuralnanorobotics may also enable a B/CI with controlled connectivity between neural activity and external data storage and processing, via the direct monitoring of the brain’s ∼86 × 10⁹ neurons and ∼2 × 10¹⁴ synapses.

Subsequent to navigating the human vasculature, three species of neuralnanorobots (endoneurobots, gliabots, and synaptobots) could traverse the blood–brain barrier (BBB), enter the brain parenchyma, ingress into individual human brain cells, and autoposition themselves at the axon initial segments of neurons (endoneurobots), within glial cells (gliabots), and in intimate proximity to synapses (synaptobots). They would then wirelessly transmit up to ∼6 × 10¹⁶ bits per second of synaptically processed and encoded human–brain electrical information via auxiliary nanorobotic fiber optics (30 cm) with the capacity to handle up to 10¹⁸ bits/sec and provide rapid data transfer to a cloud-based supercomputer for real-time brain-state monitoring and data extraction. A neuralnanorobotically enabled human B/CI might serve as a personalized conduit, allowing persons to obtain direct, instantaneous access to virtually any facet of cumulative human knowledge. Other anticipated applications include myriad opportunities to improve education, intelligence, entertainment, traveling, and other interactive experiences.
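The quoted figures can be sanity-checked with simple arithmetic, dividing the projected aggregate rate across the monitored synapses and comparing against the stated fiber-optic capacity. This is a back-of-envelope check of the numbers above, nothing more:

```python
# Back-of-envelope check of the figures quoted above: ~2e14 monitored synapses,
# ~6e16 bits/s of aggregate synaptic data, and ~1e18 bits/s of fiber capacity.
SYNAPSES = 2e14            # ~2 x 10^14 synapses
AGGREGATE_BPS = 6e16       # ~6 x 10^16 bits/s transmitted
FIBER_CAPACITY_BPS = 1e18  # ~10^18 bits/s fiber-optic capacity

per_synapse = AGGREGATE_BPS / SYNAPSES        # implied bits/s per synapse
headroom = FIBER_CAPACITY_BPS / AGGREGATE_BPS # capacity margin of the fiber

print(per_synapse)  # prints 300.0 (bits/s per synapse)
print(headroom)     # ~16.7x capacity margin
```

The implied 300 bits/s per synapse and the roughly 17× capacity margin show the figures are at least internally consistent, whatever one makes of their feasibility.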
A specialized application might be the capacity to engage in fully immersive experiential/sensory experiences, including what is referred to here as “transparent shadowing” (TS). Through TS, individuals might experience episodic segments of the lives of other willing participants (locally or remotely) to, hopefully, encourage and inspire improved understanding and tolerance among all members of the human family.
“We’ll have nanobots that… connect our neocortex to a synthetic neocortex in the cloud… Our thinking will be a… biological and non-biological hybrid.”
— Ray Kurzweil, TED 2014