
AI Is Helping Us Combat The Economic Problem Of Human Trafficking

When we think of human trafficking, we often picture the despondent faces of women and children living in slums around the world. But what if human trafficking is much closer to home than we think? In 2019, Markie Dell stood on the TEDx stage to recount her experience as a domestic human trafficking victim. As an awkward teenager, she was groomed by a girl she befriended at a birthday party. She was subsequently kidnapped, drugged, sexually violated, and intimidated at gunpoint into dancing in strip clubs for an entire year.

She didn’t realize she was a human trafficking victim until a police officer handed her a book called “Pimpology.” Only then did she understand that she was being trafficked.

According to the Polaris Project, most human trafficking victims are trafficked by their romantic partners, spouses, or family members, including parents. In the U.S. in 2018, there were 10,949 cases of human trafficking identifying 23,078 survivors. Even so, these cases are often drastically underreported.

Life 3.0 Audiobook: Age of Artificial Intelligence

For anyone who wants to learn about the future of artificial intelligence and the rise of super-intelligent machines, here is billionaire rocket-scientist businessman Elon Musk’s favorite book of the year, written by MIT physics professor Max Tegmark of the Future of Life Institute. Life 3.0 is presented here in a free audiobook format on YouTube, a roughly seven-hour video exploring how advances in technology could transform society, including the scenario Tegmark calls a “Libertarian Utopia.”

Teaching the iCub robot to express basic human emotions

As robots make their way into a variety of environments and start interacting with humans on a regular basis, they should be able to communicate with users as effectively as possible. Over the past decade or so, researchers worldwide have thus been developing machine learning-based models and other computational techniques that could enhance human-robot communications.

One way to improve how robots communicate with human users is by training them to express emotions, such as sadness, happiness, fear and anger. The ability to express emotions would ultimately allow robots to convey messages more effectively, in ways that are aligned with a given situation.

Researchers at the University of Hamburg in Germany have recently developed a machine learning-based method to teach robots how to convey what have previously been defined as the seven universal emotions, namely anger, disgust, fear, happiness, sadness, surprise and a neutral state. In their paper, pre-published on arXiv, they applied and tested their technique on the humanoid robot iCub.
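As a loose illustration of the classification side of such a system (not the authors’ actual model, whose architecture isn’t described here), a network trained on these seven categories typically ends in a softmax head that converts per-emotion scores into probabilities:

```python
import numpy as np

# The seven universal emotion categories named in the paper
EMOTIONS = ["anger", "disgust", "fear", "happiness",
            "sadness", "surprise", "neutral"]

def emotion_probabilities(logits):
    """Softmax over raw per-emotion scores, as a typical
    classification head would produce them."""
    exp = np.exp(logits - np.max(logits))  # shift for numerical stability
    return exp / exp.sum()

def predicted_emotion(logits):
    """Return the emotion label with the highest score."""
    return EMOTIONS[int(np.argmax(logits))]

# hypothetical scores from a model for one facial expression
scores = np.array([0.1, 0.0, 0.2, 2.5, 0.1, 0.3, 0.8])
print(predicted_emotion(scores))  # → happiness
```

The score values here are invented for illustration; in practice they would come from a model trained on labeled expression data.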

New scavenger technology allows robots to ‘eat’ metal for energy

When electronics need their own power sources, there are two basic options: batteries and harvesters. Batteries store energy internally, but are therefore heavy and have a limited supply. Harvesters, such as solar panels, collect energy from their environments. This gets around some of the downsides of batteries but introduces new ones: harvesters can only operate in certain conditions and can’t turn that energy into useful power very quickly.

New research from the University of Pennsylvania’s School of Engineering and Applied Science is bridging the gap between these two fundamental technologies for the first time in the form of a “metal-air scavenger” that gets the best of both worlds.

This metal-air scavenger works like a battery, in that it provides power by repeatedly breaking and forming a series of chemical bonds. But it also works like a harvester, in that power is supplied by energy in its environment: specifically, the chemical bonds in the metal and air surrounding the metal-air scavenger.

Researchers design intelligent microsystem for faster, more sustainable industrial chemistry

The synthesis of plastic precursors, such as polymers, involves specialized catalysts. However, the traditional batch-based method of finding and screening the right ones for a given result consumes liters of solvent, generates large quantities of chemical waste, and is an expensive, time-consuming process involving multiple trials.

Ryan Hartman, professor of chemical and biomolecular engineering at the NYU Tandon School of Engineering, and his laboratory developed a lab-based “intelligent microsystem” employing machine learning for modeling that shows promise for eliminating this costly process and minimizing environmental harm.

In their research, “Combining automated microfluidic experimentation with machine learning for efficient polymerization design,” published in Nature Machine Intelligence, the collaborators, including doctoral student Benjamin Rizkin, employed a custom-designed, rapidly prototyped microreactor in conjunction with automation and in situ infrared thermography to study exothermic (heat-generating) polymerization—reactions that are notoriously difficult to control when limited experimental kinetic data are available. By pairing efficient microfluidic technology with machine learning algorithms to obtain high-fidelity datasets based on minimal iterations, they were able to reduce chemical waste by two orders of magnitude and cut catalytic discovery time from weeks to hours.
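The closed-loop idea behind such systems, fitting a model to the experiments run so far and using it to pick the next condition to test, can be sketched in a few lines. Everything below is a hypothetical toy: the quadratic surrogate, the temperature variable, and the yield data are illustrative assumptions, not the authors’ system.

```python
import numpy as np

def fit_quadratic(x, y):
    """Least-squares fit of y ≈ a*x^2 + b*x + c as a stand-in surrogate model."""
    return np.polyfit(x, y, 2)

def next_candidate(coeffs, grid):
    """Pick the grid point where the surrogate predicts the highest yield."""
    preds = np.polyval(coeffs, grid)
    return float(grid[int(np.argmax(preds))])

# toy observations: reaction yield peaks somewhere between 40 and 80 °C
temps = np.array([20.0, 40.0, 80.0, 100.0])
yields = np.array([0.2, 0.6, 0.5, 0.1])

coeffs = fit_quadratic(temps, yields)
grid = np.linspace(20.0, 100.0, 81)          # candidate temperatures, 1 °C apart
print(next_candidate(coeffs, grid))          # temperature to try next
```

The experiment at the suggested temperature would then be run (in the paper’s case, automatically in the microreactor), the new data point appended, and the loop repeated, which is how a handful of iterations can replace a large batch screen.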

Downloading the Human Brain to a Computer: Elon Musk’s Neuralink

https://www.youtube.com/watch?v=r-vbh3t7WVI

Your Neuralink device would be implanted safely and seamlessly by a robot surgeon using traditional neurosurgical methods. As the published Neuralink paper puts it, “We have also built a neurosurgical robot capable of inserting six threads (192 electrodes) per minute. Each thread can be individually inserted into the brain with micron precision for the avoidance of surface vasculature and targeting specific brain regions.”

The Neuralink team has already started experimenting with various versions of the devices on both rats and monkeys. The results have been impressive. During the presentation in San Francisco, Musk and his team described one instance of a monkey being able to control a computer with its brain alone. Unfortunately (or fortunately), there have not been any tests on humans yet, but the team hopes to obtain FDA approval and begin human trials as early as this year.

During his presentation, Elon Musk reiterated the idea that Neuralink will be an important part of our future, eventually allowing us to reach symbiosis with artificial intelligence. “With a high bandwidth brain-machine interface, we can go along for the ride and effectively have the option of merging with AI”, said Musk. And as for the risks of reaching this end goal? Musk reassured his audience that the device will be safe.

Tesla is adding remarkable detail in its Full Self-Driving Visualizations

Most of the pieces of Elon Musk’s Master Plan, Part Deux are already in place. Tesla’s mass-market cars, the Model 3 and Model Y, have already been released. The Solar Roof is finally seeing a ramp. And the release of a feature-complete version of the company’s Full Self-Driving suite seems to be drawing closer.

For Tesla’s Full Self-Driving suite to be feature-complete, the electric car maker would need to master inner-city driving. FSD already handles highway driving with Navigate on Autopilot and automatic lane changes. But when it comes to inner-city streets, Full Self-Driving still has a ways to go. Fortunately, if Tesla’s v10.2 2020.12.5 release is any indication, the company’s neural networks are recognizing more and more aspects of city driving.

Automatic diagnosis of the 12-lead ECG using a deep neural network

The role of automatic electrocardiogram (ECG) analysis in clinical practice is limited by the accuracy of existing models. Deep Neural Networks (DNNs) are models composed of stacked transformations that learn tasks by example. This technology has recently achieved striking success in a variety of tasks, and there are great expectations for how it might improve clinical practice. Here we present a DNN model trained on a dataset with more than 2 million labeled exams analyzed by the Telehealth Network of Minas Gerais and collected under the scope of the CODE (Clinical Outcomes in Digital Electrocardiology) study. The DNN outperforms cardiology resident medical doctors in recognizing 6 types of abnormalities in 12-lead ECG recordings, with F1 scores above 80% and specificity over 99%. These results indicate that ECG analysis based on DNNs, previously studied in a single-lead setup, generalizes well to 12-lead exams, bringing the technology closer to standard clinical practice.
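The headline metrics here, F1 above 80% and specificity over 99%, are computed per abnormality class from the usual confusion-matrix counts. A minimal sketch (the labels below are made up for illustration, not drawn from the CODE dataset):

```python
def f1_and_specificity(y_true, y_pred):
    """Per-class F1 and specificity for binary labels (1 = abnormality present)."""
    tp = sum(t == 1 and p == 1 for t, p in zip(y_true, y_pred))
    fp = sum(t == 0 and p == 1 for t, p in zip(y_true, y_pred))
    fn = sum(t == 1 and p == 0 for t, p in zip(y_true, y_pred))
    tn = sum(t == 0 and p == 0 for t, p in zip(y_true, y_pred))
    precision = tp / (tp + fp) if tp + fp else 0.0
    recall = tp / (tp + fn) if tp + fn else 0.0
    f1 = 2 * precision * recall / (precision + recall) if precision + recall else 0.0
    specificity = tn / (tn + fp) if tn + fp else 0.0
    return f1, specificity

# toy example: 3 exams with the abnormality, 5 without
f1, spec = f1_and_specificity([1, 1, 1, 0, 0, 0, 0, 0],
                              [1, 1, 0, 0, 0, 0, 0, 1])
print(round(f1, 3), spec)  # → 0.667 0.8
```

Specificity measures how rarely normal exams are flagged, which is why a value over 99% matters clinically: at screening scale, even a small false-positive rate would swamp cardiologists with healthy patients.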
