Summary: Human cortical networks have evolved a novel type of circuit that relies on abundant connections between inhibitory interneurons.
Source: Max Planck Institute.
The analysis of the human brain is a central goal of neuroscience. However, for methodological reasons, research has largely focused on model organisms, in particular the mouse.
AZoRobotics speaks with Dr. Erik Engeberg from Florida Atlantic University about his research into a wearable soft robotic armband. This could be a life-changing device for prosthetic hand users, who have long desired advances in dexterity.
Typing on a keyboard, pressing buttons on a remote control, or braiding a child’s hair has remained elusive for prosthetic hand users. How does the loss of tactile sensations impact limb-absent people’s lives?
Losing the sensation of touch has a profound impact on people’s lives. Things that may seem simple and a routine part of everyday life, such as stroking the fur of a pet or the skin of a loved one, are a meaningful and fundamental way of connecting with those around us. For example, a patient with a bilateral amputation previously expressed concern that he might hurt his granddaughter by accidentally squeezing her hand too tightly, because he has lost tactile sensation.
Scientists in China say they have been able to run an artificial intelligence model as sophisticated as a human brain on their most powerful supercomputer, a report from the South China Morning Post reveals.
According to the report, this puts China’s Newest Generation Sunway supercomputer on the same level as the U.S. Department of Energy’s Frontier, which was named the world’s most powerful supercomputer earlier this month.
As a point of reference, Frontier is the first machine to have demonstrated it can perform more than one quintillion calculations per second.
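For scale, "one quintillion calculations per second" is 10^18 operations per second, the threshold that defines exascale computing. A quick sanity check of the unit conversion:

```python
# "Quintillion" (short scale) = 10**18; sustaining that many floating-point
# operations per second is the definition of the exascale threshold (1 exaFLOPS).
ops_per_second = 10**18
exaflops = ops_per_second / 1e18  # → 1.0 exaFLOPS
```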
Microsoft-owned GitHub is launching its Copilot AI tool today, which suggests lines of code to developers inside their code editor. GitHub originally teamed up with OpenAI last year to launch a preview of Copilot, and it is now generally available to all developers. Priced at US$10 per month or US$100 a year, GitHub Copilot can suggest the next line of code as developers type in an integrated development environment (IDE) such as Visual Studio Code, Neovim, or the JetBrains IDEs. Copilot can suggest complete methods and complex algorithms alongside boilerplate code and assistance with unit testing. More than 1.2 million developers signed up to use the GitHub Copilot preview over the past 12 months, and it will remain free for verified students and maintainers of popular open-source projects. In files where it is enabled, GitHub says, nearly 40 percent of the code is now being written by Copilot.
“Over the past year, we’ve continued to iterate and test workflows to help drive the ‘magic’ of Copilot,” Ryan J. Salva, VP of product at GitHub, told TechCrunch via email. “We not only used the preview to learn how people use GitHub Copilot but also to scale the service safely.”
“We specifically designed GitHub Copilot as an editor extension to make sure nothing gets in the way of what you’re doing,” GitHub CEO Thomas Dohmke says in a blog post. “GitHub Copilot distills the collective knowledge of the world’s developers into an editor extension that suggests code in real-time, to help you stay focused on what matters most: building great software.”
Unfortunately my internet link went down in the second Q&A session at the end and the recording cut off. A shame, as loads of great information came out about FPGA/ASIC implementations, AI for VR/AR, C/C++ and a whole load of other riveting and most interesting techie stuff. But thankfully the main part of the talk was recorded.
TALK OVERVIEW
This talk is about the realization of the ideas behind the Fractal Brain theory and the unifying theory of life and intelligence discussed in the last Zoom talk, in the form of useful technology. The Startup at the End of Time will be the vehicle for the development and commercialization of a new generation of artificial intelligence (AI) and machine learning (ML) algorithms.
We will show in detail how the theoretical fractal brain/genome ideas lead to a whole new way of doing AI and ML that overcomes most of the central limitations of, and problems associated with, existing approaches. A compelling feature of this approach is that it is based on how neurons and brains actually work, unlike existing artificial neural networks, which, despite making sensational headlines, are impeded by severe limitations and are based on an out-of-date understanding of neurons from about 70 years ago. We hope to convince you that this new approach really is the path to true AI.
In the last Zoom talk, we discussed a grand unification of scientific ideas relating to life and brain/mind science through the application of the mathematical idea of symmetry. In turn, the same symmetry approach leads to a unification of a mass of ideas in computer and information science. There has been talk in recent years of a ‘master algorithm’ of machine learning and AI. We’ll explain that it goes far deeper than that, and show how the most important fundamental algorithms in use in the world today, relating to data compression, databases, search engines and existing AI/ML, can be unified into a single algorithm. Furthermore, and importantly, this algorithm is completely fractal, or scale invariant: the same algorithm that performs all these functionalities can run on a microcontroller unit (MCU), a mobile phone, a laptop or a workstation, right up to a supercomputer.
The applications of this new technology are endless. We will discuss the road map by which the theoretical ideas I’ve been discussing in the Zoom, academic and public talks over the past few years, and which I’ve written about in the Fractal Brain Theory book, will become practical technology, and how the Java/C/C++ code running on my workstation and mobile phones will become products and services.
In his keynote at Amazon re:MARS, Alexa AI senior vice president and head scientist Rohit Prasad argued that the emerging paradigm of ambient intelligence offers…
Rohit Prasad on the pathway to generalizable intelligence and what excites him most about his re:MARS keynote.
From there, they ran flight tests using a specially designed motion-tracking system. Each electroluminescent actuator served as an active marker that could be tracked using iPhone cameras. The cameras detect each light color, and a computer program the team developed tracks the position and attitude of the robots to within 2 millimeters of the measurements from state-of-the-art infrared motion-capture systems.
“We are very proud of how good the tracking result is, compared to the state-of-the-art. We were using cheap hardware, compared to the tens of thousands of dollars these large motion-tracking systems cost, and the tracking results were very close,” Kevin Chen says.
In the future, they plan to enhance the motion-tracking system so it can track robots in real time. The team is working to incorporate control signals so the robots can turn their lights on and off during flight and communicate more like real fireflies. They are also studying how electroluminescence could improve some properties of these soft artificial muscles, Kevin Chen says.
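The color-marker tracking described above can be illustrated with a minimal sketch: assuming each electroluminescent actuator shows up as a bright blob in one color channel of a camera frame, its pixel position can be estimated as an intensity-weighted centroid. The `marker_centroid` helper and the synthetic frame below are hypothetical simplifications, not the team's actual multi-camera pipeline.

```python
import numpy as np

def marker_centroid(frame, channel, threshold=128):
    """Estimate a colored marker's pixel position as the intensity-weighted
    centroid of bright pixels in one color channel of an RGB frame."""
    chan = frame[:, :, channel].astype(float)
    mask = chan > threshold
    if not mask.any():
        return None  # no marker visible in this channel
    ys, xs = np.nonzero(mask)
    weights = chan[mask]
    return (np.average(xs, weights=weights), np.average(ys, weights=weights))

# Synthetic 100x100 frame with a green marker blob centered at (x=40, y=60)
frame = np.zeros((100, 100, 3), dtype=np.uint8)
frame[58:63, 38:43, 1] = 255  # rows 58-62, columns 38-42, green channel
cx, cy = marker_centroid(frame, channel=1)
```

A real system would do this per color per camera and triangulate across views to recover 3D position and attitude; the centroid step is the core of turning each glowing actuator into a trackable marker.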
Microplastics are found nearly everywhere on Earth and can be harmful to animals if they’re ingested. But it’s hard to remove such tiny particles from the environment, especially once they settle into nooks and crannies at the bottom of waterways. Now, researchers in ACS’ Nano Letters have created a light-activated fish robot that “swims” around quickly, picking up and removing microplastics from the environment.
Because microplastics can fall into cracks and crevices, they’ve been hard to remove from aquatic environments. One solution that’s been proposed is using small, flexible and self-propelled robots to reach these pollutants and clean them up. But the traditional materials used for soft robots are hydrogels and elastomers, and they can be damaged easily in aquatic environments. Another material called mother-of-pearl, also known as nacre, is strong and flexible, and is found on the inside surface of clam shells. Nacre layers have a microscopic gradient, going from one side with lots of calcium carbonate mineral-polymer composites to the other side with mostly a silk protein filler. Inspired by this natural substance, Xinxing Zhang and colleagues wanted to try a similar type of gradient structure to create a durable and bendable material for soft robots.
The researchers linked β-cyclodextrin molecules to sulfonated graphene, creating composite nanosheets. Solutions of the nanosheets were then incorporated at different concentrations into polyurethane latex mixtures. A layer-by-layer assembly method created an ordered concentration gradient of the nanocomposites through the material, from which the team formed a tiny fish robot 15 mm (about half an inch) long. Rapidly turning a near-infrared laser on and off at the fish’s tail caused the tail to flap, propelling the robot forward. The robot could move at 2.67 body lengths per second, faster than previously reported soft swimming robots and about the same speed as active phytoplankton moving in water. The researchers showed that the swimming robot could repeatedly adsorb nearby polystyrene microplastics and transport them elsewhere. The material could also heal itself after being cut while maintaining its ability to adsorb microplastics.
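As a quick check of the reported speed, 2.67 body lengths per second for a 15 mm robot works out to roughly 40 mm/s:

```python
# Figures reported for the fish robot summarized above
body_length_mm = 15.0
speed_body_lengths_per_s = 2.67

speed_mm_per_s = body_length_mm * speed_body_lengths_per_s  # ≈ 40 mm/s
```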
By combining two distinct approaches into an integrated workflow, Singapore University of Technology and Design (SUTD) researchers have developed a novel automated process for designing and fabricating customized soft robots. Their method, published in Advanced Materials Technologies, can be applied to other kinds of soft robots—allowing their mechanical properties to be tailored in an accessible manner.
Though robots are often depicted as stiff, metallic structures, an emerging class of pliable machines known as soft robots is rapidly gaining traction. Inspired by the flexible forms of living organisms, soft robots have wide applications in sensing, movement, object grasping and manipulation, among others. Yet, such robots are still mostly fabricated through manual casting techniques—limiting the complexity and geometries that can be achieved.
“Most fabrication approaches are predominantly manual due to a lack of standard tools,” said SUTD Assistant Professor Pablo Valdivia y Alvarado, who led the study. “But 3D printing or additive manufacturing is slowly coming into play as it facilitates repeatability and allows more complex designs—improving quality and performance.”