Archive for the ‘information science’ category: Page 187

Aug 5, 2020

Deepfakes are the most worrying AI crime, researchers warn

Posted in categories: information science, robotics/AI, terrorism

Deepfakes are the most concerning use of AI for crime and terrorism, according to a new report from University College London.

The research team first identified 20 different ways AI could be used by criminals over the next 15 years. They then asked 31 AI experts to rank them by risk, based on their potential for harm, the money they could make, their ease of use, and how hard they are to stop.

Deepfakes — AI-generated videos of real people doing and saying fictional things — earned the top spot for two major reasons. Firstly, they’re hard to identify and prevent. Automated detection methods remain unreliable, and deepfakes are also getting better at fooling human eyes. A recent Facebook competition to detect them with algorithms led researchers to admit it’s “very much an unsolved problem.”

Aug 5, 2020

Self-organising swarms of firefighting drones: Harnessing the power of collective intelligence in decentralised multi-robot systems

Posted in categories: drones, information science, particle physics, robotics/AI

Swarm intelligence (SI) is concerned with the collective behaviour that emerges from decentralised self-organising systems, whilst swarm robotics (SR) is an approach to the self-coordination of large numbers of simple robots which emerged as the application of SI to multi-robot systems. Given the increasing severity and frequency of wildfires and the hazardous nature of fighting their propagation, the use of disposable, inexpensive robots in place of humans is of special interest. This paper demonstrates the feasibility and potential of employing SR to fight fires autonomously, with a focus on the self-coordination mechanisms from which the desired firefighting behaviour emerges. To this end, an efficient physics-based model of fire propagation and a self-organisation algorithm for swarms of firefighting drones are developed and coupled, with the collaborative behaviour based on a particle swarm algorithm adapted to individuals operating within physical dynamic environments that change severely and frequently. Numerical experiments demonstrate that the proposed self-organising system is effective, scalable and fault-tolerant, making it a promising approach to the suppression of wildfires – one of the most pressing challenges of our time.
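The collaborative behaviour above builds on a particle swarm algorithm. As a rough sketch of the underlying idea — a canonical PSO update, not the paper's adapted algorithm — each agent blends its momentum with attraction toward its own best-known position and the swarm's:

```python
import random

def pso_step(pos, vel, personal_best, global_best, w=0.7, c1=1.5, c2=1.5):
    """One canonical particle-swarm update: inertia plus random-weighted
    attraction toward the particle's own best position and the swarm's best."""
    new_pos, new_vel = [], []
    for x, v, pb, gb in zip(pos, vel, personal_best, global_best):
        r1, r2 = random.random(), random.random()
        nv = w * v + c1 * r1 * (pb - x) + c2 * r2 * (gb - x)
        new_vel.append(nv)
        new_pos.append(x + nv)
    return new_pos, new_vel

random.seed(0)
# A drone at the origin drawn toward a best-known position (e.g. a fire front) at (1, 1)
pos, vel = pso_step([0.0, 0.0], [0.0, 0.0], [1.0, 1.0], [1.0, 1.0])
```

In the paper's setting this update would be adapted for physical agents — for instance by capping velocities and adding collision avoidance — which the sketch omits.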

Aug 5, 2020

4 Automatic Outlier Detection Algorithms in Python

Posted in categories: information science, robotics/AI

The presence of outliers in a classification or regression dataset can result in a poor fit and lower predictive modeling performance.

Identifying and removing outliers is challenging with simple statistical methods for most machine learning datasets given the large number of input variables. Instead, automatic outlier detection methods can be used in the modeling pipeline and compared, just like other data preparation transforms that may be applied to the dataset.

In this tutorial, you will discover how to use automatic outlier detection and removal to improve machine learning predictive modeling performance.
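As a taste of what such a pipeline looks like — using scikit-learn's IsolationForest as one representative automatic detector; the excerpt does not say which four algorithms the tutorial covers:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.RandomState(42)
X = rng.normal(0, 1, size=(200, 3))   # synthetic training data
X[:5] += 8                            # plant five obvious outliers

# fit_predict returns -1 for outliers and 1 for inliers
labels = IsolationForest(contamination=0.05, random_state=42).fit_predict(X)
X_clean = X[labels == 1]              # drop flagged rows before model fitting
```

The same fit/predict/filter pattern works for the other common detectors (e.g. LocalOutlierFactor or OneClassSVM), which is what makes them easy to swap and compare in a pipeline.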

Aug 4, 2020

Calculating the benefits of exascale and quantum computers

Posted in categories: information science, quantum physics, supercomputing

A quintillion calculations a second. That’s one with 18 zeros after it. It’s the speed at which an exascale supercomputer will process information. The Department of Energy (DOE) is preparing for the first exascale computer to be deployed in 2021. Two more will follow soon after. Yet quantum computers may be able to complete more complex calculations even faster than these up-and-coming exascale computers. But these technologies complement each other much more than they compete.

It’s going to be a while before quantum computers are ready to tackle major scientific research questions. While quantum researchers and scientists in other areas are collaborating to design quantum computers to be as effective as possible once they’re ready, that’s still a long way off. Scientists are figuring out how to build qubits for quantum computers, the very foundation of the technology. They’re establishing the most fundamental quantum algorithms they need to do simple calculations. The hardware and algorithms need to be far enough along for coders to develop operating systems and software to do scientific research. Currently, we’re at the same point that scientists in the 1950s were with computers that ran on vacuum tubes. Most of us regularly carry computers in our pockets now, but it took decades to get to this level of accessibility.

In contrast, exascale computers will be ready next year. When they launch, they’ll already be five times faster than our current fastest supercomputer — Summit, at Oak Ridge National Laboratory’s Leadership Computing Facility, a DOE Office of Science user facility. Right away, they’ll be able to tackle major challenges in modeling Earth systems, analyzing genes, tracking barriers to fusion, and more. These powerful machines will allow scientists to include more variables in their equations and improve models’ accuracy. As long as we can find new ways to improve conventional computers, we’ll do it.
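The “five times faster” figure is simple arithmetic on the quoted speeds, assuming Summit’s roughly 200-petaflop peak (a commonly cited figure, not stated in the text):

```python
exa_ops = 1e18        # exascale: a quintillion calculations per second
summit_ops = 2e17     # Summit's peak, roughly 200 petaflops (assumed)

speedup = exa_ops / summit_ops
print(speedup)  # → 5.0
```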

Aug 3, 2020

New Integrated 3D-Circuit Architecture With Spiraling Memory for More Efficient AI

Posted in categories: climatology, information science, robotics/AI, sustainability

Researchers from the Institute of Industrial Science at The University of Tokyo designed and built specialized computer hardware consisting of stacks of memory modules arranged in a 3D-spiral for artificial intelligence (AI) applications. This research may open the way for the next generation of energy-efficient AI devices.

Machine learning is a type of AI that allows computers to be trained by example data to make predictions for new instances. For example, a smart speaker algorithm like Alexa can learn to understand your voice commands, so it can understand you even when you ask for something for the first time. However, AI tends to require a great deal of electrical energy to train, which raises concerns about adding to climate change.

Now, scientists from the Institute of Industrial Science at The University of Tokyo have developed a novel design for stacking resistive random-access memory modules with oxide-semiconductor (IGZO) access transistors in a three-dimensional spiral. Having on-chip nonvolatile memory placed close to the processors makes the machine learning training process much faster and more energy-efficient, because electrical signals have a much shorter distance to travel compared with conventional computer hardware. Stacking multiple layers of circuits is a natural step, since training the algorithm often requires many operations to be run in parallel.

Aug 3, 2020

Artificial intelligence and algorithms bring drone inspection breakthrough

Posted in categories: drones, information science, robotics/AI

A drone has successfully inspected a 19.4-meter-high oil tank onboard a Floating Production, Storage and Offloading vessel. The video shot by the drone was interpreted in real-time by an algorithm to detect cracks in the structure.

Scout Drone Inspection and class society DNV GL have been working together to develop an autonomous drone system to overcome the common challenges of tank inspections. For the customer, costs can run into hundreds of thousands of dollars as the tank is taken out of service for days to ventilate and construct scaffolding. The tanks are also tough work environments, with surveyors often having to climb or raft into hard to reach corners. Using a drone in combination with an algorithm to gather and analyse video footage can significantly reduce survey times and staging costs, while at the same time improving surveyor safety.

Aug 3, 2020

The Quantum Gate Hack – Applying Ideas From Gaming Hacks to Quantum Computing

Posted in categories: cybercrime/malcode, information science, quantum physics

PNNL quantum algorithm theorist and developer Nathan Wiebe is applying ideas from data science and gaming hacks to quantum computing.

Everyone working on quantum computers knows the devices are error prone. The basic unit of quantum programming – the quantum gate – fails about once every hundred operations. And that error rate is too high.
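A failure rate of one in a hundred compounds quickly: under a simple independent-error model (an illustration, not a claim from the article), the chance an n-gate circuit runs error-free is 0.99^n, which collapses for circuits of any useful depth:

```python
def circuit_success(n_gates, p_gate=0.99):
    """Probability an n-gate circuit sees no gate error,
    assuming independent failures at 1 per 100 operations."""
    return p_gate ** n_gates

for n in (10, 100, 1000):
    print(n, circuit_success(n))  # falls from ~0.90 to ~0.37 to ~0.00004
```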

While hardware developers and programming analysts are fretting over failure rates, PNNL’s Nathan Wiebe is forging ahead writing code that he is confident will run on quantum computers when they are ready. In his joint appointment role as a professor of physics at the University of Washington, Wiebe is training the next generation of quantum computing theorists and programmers.

Aug 3, 2020

DeepMind releases Acme, a distributed framework for reinforcement learning algorithm development

Posted in categories: information science, robotics/AI, transportation

DeepMind this week released Acme, a framework intended to simplify the development of reinforcement learning algorithms by enabling AI-driven agents to run at various scales of execution. According to the engineers and researchers behind Acme, who coauthored a technical paper on the work, it can be used to create agents with greater parallelization than in previous approaches.

Reinforcement learning involves agents that interact with an environment to generate their own training data, and it’s led to breakthroughs in fields from video games and robotics to self-driving robo-taxis. Recent advances are partly attributable to increases in the amount of training data used, which has motivated the design of systems where agents interact with instances of an environment to quickly accumulate experience. This scaling from single-process prototypes of algorithms to distributed systems often requires a reimplementation of the agents in question, DeepMind asserts, which is where the Acme framework comes in.
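The actor/learner split behind frameworks like Acme can be sketched generically: actors generate experience into a shared replay buffer that a learner samples from. Everything below — the names, the toy environment, the constant policy — is an illustrative stand-in, not Acme's actual API:

```python
import random
from collections import deque

class ReplayBuffer:
    """Fixed-capacity store of transitions shared between actors and a learner."""
    def __init__(self, capacity=10_000):
        self.buf = deque(maxlen=capacity)
    def add(self, transition):
        self.buf.append(transition)
    def sample(self, batch_size):
        return random.sample(list(self.buf), batch_size)

def run_actor(env_step, policy, buffer, n_steps):
    """One actor loop: interact with the environment, store experience."""
    obs = 0.0
    for _ in range(n_steps):
        action = policy(obs)
        next_obs, reward = env_step(obs, action)
        buffer.add((obs, action, reward, next_obs))
        obs = next_obs

# Toy stand-ins: a 1-D environment and a constant policy
env_step = lambda obs, action: (obs + action, -abs(obs))
policy = lambda obs: 1.0

buffer = ReplayBuffer()
run_actor(env_step, policy, buffer, n_steps=100)  # many such actors can run in parallel
batch = buffer.sample(32)                         # the learner trains on sampled batches
```

Scaling up then means launching more actor loops against the same buffer, which is the reimplementation burden the framework is meant to remove.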

Aug 1, 2020

Surprisingly Recent Galaxy Discovered Using Machine Learning – May Be the Last Generation Galaxy in the Long Cosmic History

Posted in categories: cosmology, information science, robotics/AI

Breaking the lowest oxygen abundance record.

By combining big data captured by the Subaru Telescope with the power of machine learning, researchers have discovered a galaxy with an extremely low oxygen abundance of 1.6% of the solar value, breaking the previous record for the lowest oxygen abundance. The measured abundance suggests that most of the stars in this galaxy formed very recently.
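Astronomers usually quote oxygen abundance as 12 + log10(O/H). Taking a solar value of about 8.69 — an assumed reference value, not stated in the excerpt — 1.6% of solar works out to roughly 6.9:

```python
import math

solar_12_log_oh = 8.69            # solar 12 + log10(O/H) (assumed reference value)
fraction_of_solar = 0.016         # 1.6% of the solar oxygen abundance

galaxy_12_log_oh = solar_12_log_oh + math.log10(fraction_of_solar)
print(round(galaxy_12_log_oh, 2))  # → 6.89
```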

Aug 1, 2020

Quantum machines learn ‘quantum data’

Posted in categories: information science, quantum physics, robotics/AI, supercomputing

Skoltech scientists have shown that quantum enhanced machine learning can be used on quantum (as opposed to classical) data, overcoming a significant slowdown common to these applications and opening a “fertile ground to develop computational insights into quantum systems.” The paper was published in the journal Physical Review A.

Quantum computers utilize quantum mechanical effects to store and manipulate information. While quantum effects are often described as counterintuitive, they are expected to enable quantum enhanced calculations to dramatically outperform the best supercomputers. In 2019, the world saw a prototype of this when Google demonstrated what it called quantum computational supremacy.

Quantum algorithms have been developed to enhance a range of different computational tasks; more recently, this has grown to include quantum enhanced machine learning. Quantum machine learning was partly pioneered by Skoltech’s Laboratory for Quantum Information Processing, led by Jacob Biamonte, a coauthor of this paper. “Machine learning techniques have become powerful tools for finding patterns in data. Quantum systems produce atypical patterns that classical systems are thought not to produce efficiently, so it is not surprising that quantum computers might outperform classical computers on machine learning tasks,” he says.