
Microsoft has purchased startup company Semantic Machines in an effort to make artificial intelligence bots sound more human. The Berkeley, California-based business focuses on contextual understanding of conversation.

Previously, the firm has worked with Apple on speech recognition technology for Siri. Semantic Machines is led by professor Dan Klein of UC Berkeley and professor Percy Liang of Stanford University, in addition to Apple’s former chief speech scientist Larry Gillick.

Microsoft has been working on speech recognition and natural language processing for nearly two decades now. As Cortana has gained a more prominent role in recent years, Redmond is aiming to improve the accuracy and fluency of its assistant.

Read more

Machine-learning technology is growing ever more accessible. Let’s not have a 9/11-style ‘failure of imagination’ about it.

There is a general tendency among counterterrorism analysts to understate rather than hyperbolize terrorists’ technological adaptations. In 2011 and 2012, most believed that the “Arab Spring” revolutions would marginalize jihadist movements. But within four years, jihadists had attracted a record number of foreign fighters to the Syrian battlefield, in part by using the same social media mobilization techniques that protesters had employed to challenge dictators like Zine El Abidine Ben Ali, Hosni Mubarak, and Muammar Qaddafi.

Militant groups later combined easy accessibility to operatives via social media with new advances in encryption to create the “virtual planner” model of terrorism. This model allows online operatives to provide the same offerings that were once the domain of physical networks, including recruitment, coordinating the target and timing of attacks, and even providing technical assistance on topics like bomb-making.

Read more

“Within five years, I have no doubt there will be robots in every Army formation.”

From the spears hurled by Romans to the missiles launched by fighter pilots, the weapons humans use to kill each other have always been subject to improvement. Militaries seek to make each one ever-more lethal and, in doing so, better protect the soldier who wields it. But in the next evolution of combat, the U.S. Army is heading down a path that may lead humans off the battlefield entirely.

Over the next few years, the Pentagon is poised to spend almost $1 billion for a range of robots designed to complement combat troops. Beyond scouting and explosives disposal, these new machines will sniff out hazardous chemicals or other agents, perform complex reconnaissance and even carry a soldier’s gear.


Read more

Aurora Flight Sciences’ Autonomous Aerial Cargo Utility System (AACUS) took another step forward as an AACUS-enabled UH-1H helicopter autonomously delivered 520 lb (236 kg) of water, gasoline, MREs, communications gear, and a cooler capable of carrying urgent supplies such as blood to US Marines in the field.

Last week’s demonstration at the Marine Corps Air Ground Combat Center Twentynine Palms in California was the first ever autonomous point-to-point cargo resupply mission to Marines and was carried out as part of an Integrated Training Exercise. The completion of what has been billed as the system’s first closed-loop mission involved the modified helicopter carrying out a full cargo resupply operation that included takeoff and landing with minimal human intervention.

Developed as part of a US$98-million project by the US Office of Naval Research (ONR), AACUS is an autonomous flight system that can be retrofitted to existing helicopters to make them pilot optional. The purpose of AACUS is to provide the US armed forces with logistical support in the field with a minimum of hazard to human crews.

Read more

We propose a method that can generate soft segments, i.e. layers that represent the semantically meaningful regions as well as the soft transitions between them, automatically by fusing high-level and low-level image features in a single graph structure. The semantic soft segments, visualized by assigning each segment a solid color, can be used as masks for targeted image editing tasks, or selected layers can be used for compositing after layer color estimation.

Abstract

Accurate representation of soft transitions between image regions is essential for high-quality image editing and compositing. Current techniques for generating such representations depend heavily on interaction by a skilled visual artist, as creating such accurate object selections is a tedious task. In this work, we introduce semantic soft segments, a set of layers that correspond to semantically meaningful regions in an image with accurate soft transitions between different objects. We approach this problem from a spectral segmentation angle and propose a graph structure that embeds texture and color features from the image as well as higher-level semantic information generated by a neural network. The soft segments are generated fully automatically via eigendecomposition of a carefully constructed Laplacian matrix. We demonstrate that otherwise complex image editing tasks can be done with little effort using semantic soft segments.
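To make the spectral-segmentation idea concrete, here is a minimal, hypothetical sketch of the core mechanism: build pairwise affinities between pixels, form the graph Laplacian L = D − W, and take the eigenvectors with the smallest eigenvalues as soft segment indicators. This toy uses a 1-D array of intensities and a Gaussian color affinity only; the paper's actual Laplacian also fuses neural-network semantic features and nonlocal texture cues, which are omitted here. The function name and parameters are illustrative, not from the paper.

```python
import numpy as np

def soft_segments(features, sigma=0.2, n_segments=2):
    """Toy spectral soft segmentation on a 1-D feature array.

    This is an illustrative sketch, not the paper's method: affinities
    come from a Gaussian kernel on raw intensities only.
    """
    # Pairwise affinities from feature similarity (Gaussian kernel).
    diff = features[:, None] - features[None, :]
    W = np.exp(-(diff ** 2) / (2 * sigma ** 2))
    # Unnormalized graph Laplacian L = D - W.
    D = np.diag(W.sum(axis=1))
    L = D - W
    # Eigenvectors of L with the smallest eigenvalues vary smoothly
    # within strongly connected regions, so they act as soft
    # segment indicators.
    vals, vecs = np.linalg.eigh(L)
    return vecs[:, :n_segments]

# Toy "image": two clusters of pixel intensities.
pixels = np.array([0.10, 0.15, 0.12, 0.90, 0.95, 0.88])
layers = soft_segments(pixels)
```

In this sketch the first eigenvector (eigenvalue ≈ 0) is constant, and the second (the Fiedler vector) takes opposite signs on the two intensity clusters, giving a soft split between them; the full method refines such eigenvectors into alpha-matte-like layers.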

Read more

Insect-sized flying robots could help with time-consuming tasks like surveying crop growth on large farms or sniffing out gas leaks. These robots soar by fluttering tiny wings because they are too small to use propellers, like those seen on their larger drone cousins. Small size is advantageous: These robots are cheap to make and can easily slip into tight places that are inaccessible to big drones.

But current flying robo-insects are still tethered to the ground. The electronics they need to power and control their wings are too heavy for these miniature robots to carry.

Now, engineers at the University of Washington have for the first time cut the cord and added a brain, allowing their RoboFly to take its first independent flaps. This might be one small flap for a robot, but it’s one giant leap for robot-kind. The team will present its findings May 23 at the International Conference on Robotics and Automation in Brisbane, Australia.

Read more

There’s always a lot of talk about how AI will steal all our jobs and how machines will bring about the collapse of employment as we know it. It’s certainly hard to blame people for worrying with all the negative press around the issue.

But the reality is that AI is completely dependent on humans, and it appears as if it will stay that way for the foreseeable future. In fact, as AI grows as an industry and machine learning becomes more widely used, this will actually create a whole host of new jobs for people.

Let’s take a look at some of the roles humans currently play in the AI industry and the kind of jobs that will continue to be important in the future.

Read more