
Researchers in the lab of UC Santa Barbara professor Yasamin Mostofi have enabled, for the first time, determining whether the person behind a wall is the same individual who appears in given video footage, using only a pair of WiFi transceivers outside.

This novel video-WiFi cross-modal gait-based person identification system, which they refer to as XModal-ID (pronounced Cross-Modal-ID), could have a variety of applications, from surveillance and security to smart homes. For instance, consider a scenario in which law enforcement has video footage of a robbery. They suspect that the robber is hiding inside a house. Can a pair of WiFi transceivers outside the house determine if the person inside the house is the same as the one in the robbery video? Questions such as this have motivated this new technology.

“Our proposed approach makes it possible to determine if the person behind the wall is the same as the one in video footage, using only a pair of off-the-shelf WiFi transceivers outside,” said Mostofi. “This approach utilizes only received power measurements of a WiFi link. It does not need any prior WiFi or video training data of the person to be identified. It also does not need any knowledge of the operation area.”
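The core idea in Mostofi's description, matching a gait signature derived from video against one derived from WiFi received-power readings, can be illustrated with a toy similarity check. Everything below is invented for illustration (the signal values, the threshold, and the use of a simple correlation score); XModal-ID's actual pipeline is far more involved.

```python
# Hypothetical sketch: decide whether a gait pattern estimated from video
# matches one estimated from WiFi received-power measurements.
# Signals and threshold are invented for illustration only.

def normalized_correlation(a, b):
    """Pearson-style similarity between two equal-length signals."""
    n = len(a)
    ma, mb = sum(a) / n, sum(b) / n
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    va = sum((x - ma) ** 2 for x in a) ** 0.5
    vb = sum((y - mb) ** 2 for y in b) ** 0.5
    return cov / (va * vb)

video_gait = [0.1, 0.9, 0.2, 0.8, 0.1, 0.9]  # periodic step pattern from video
wifi_gait  = [0.2, 1.0, 0.3, 0.9, 0.2, 1.0]  # same cadence seen in WiFi power

# Correlation near 1 suggests the same walking rhythm in both modalities.
same_person = normalized_correlation(video_gait, wifi_gait) > 0.8
print(same_person)  # → True
```

A real system would compare richer spectro-temporal features rather than raw sequences, but the principle of scoring cross-modal similarity against a decision threshold is the same.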

Would you consent to a surveillance system that watches without video and listens without sound?

If your knee-jerk reaction is "no!" followed by "huh?", I'm with you. In a new paper in Applied Physics Letters, a Chinese team is wading into the complicated balance between privacy and safety with computers that can echolocate. By training AI to sift through signals from arrays of acoustic sensors, the system gradually learns to parse your movements (standing, sitting, falling) using only ultrasonic sound.
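The classification step described above can be sketched with a deliberately minimal nearest-centroid model. The feature values and posture profiles here are invented for illustration; the paper's system learns from real sensor-array data with a trained neural network, not hand-set centroids.

```python
# Hypothetical sketch: classifying coarse body positions from ultrasonic
# echo features. All numbers are invented for illustration.

from math import dist

# Toy "learned" centroids: (mean echo delay in ms, echo spread) per posture.
CENTROIDS = {
    "standing": (5.0, 0.5),
    "sitting": (7.5, 1.0),
    "fallen": (9.0, 2.5),
}

def classify(echo_features):
    """Return the posture whose centroid is nearest to the feature vector."""
    return min(CENTROIDS, key=lambda k: dist(CENTROIDS[k], echo_features))

print(classify((5.2, 0.6)))  # → standing
```

The privacy argument rests on the feature vector: it encodes geometry of reflected sound, not imagery or speech, so the classifier can report "fallen" without ever knowing who fell.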

According to study author Dr. Xinhua Guo of the Wuhan University of Technology, the system may be more palatable to privacy advocates than security cameras. Because it relies on ultrasonic waves, the kind bats use to navigate dark spaces, it doesn't capture video or audio. It'll track your body position, but not you per se.

One firm looking to capitalize is Beijing-based startup i-Space, which is preparing for its third launch in coming weeks. Like most of the $500 million-valued firm’s employees, CEO Jingqi Cai came from China’s state space industry. She sees no limit to what the Chinese commercial space industry can achieve. “I don’t know any country in the world which can do things as fast as in China,” says Cai.

Still, the ability of China’s commercial space firms to compete is curtailed by strict International Traffic in Arms Regulations (ITAR) rules, which prohibit satellites containing American components from being launched by China. In response, China is offering holistic “turnkey” solutions: building U.S. component-free satellites for clients, handling the launch and offering ground station support. Although China’s share of the commercial market remains small, at around 5–10% according to analysts, it is growing with the launch of communications and surveillance satellites for nations such as Brazil, Venezuela, Laos, Nigeria and Algeria. In addition, European satellite manufacturers have begun designing devices labeled as “ITAR-free” for this reason.

The likelihood is that China’s space exploration and commercial programs will advance in tandem. For i-Space chief engineer Yi Wei, launching satellites is simple compared to his previous job designing escape pods for China’s state-run human space program. “In comparison, I feel no pressure here at all,” he says.

You open your browser to look at the Web. Do you know who is looking back at you?

Over a recent week of Web surfing, I peered under the hood of Google Chrome and found it brought along a few thousand friends. Shopping, news and even government sites quietly tagged my browser to let ad and data companies ride shotgun while I clicked around the Web.

This was made possible by the Web’s biggest snoop of all: Google. Seen from the inside, its Chrome browser looks a lot like surveillance software.

Before an A.I. system can learn, someone has to label the data supplied to it. Humans, for example, must pinpoint the polyps. The work is vital to the creation of artificial intelligence like self-driving cars, surveillance systems and automated health care.


Artificial intelligence is being taught by thousands of office workers around the world. It is not exactly futuristic work.

At iMerit offices in Kolkata, India, employees label images that are used to teach artificial intelligence systems. Credit: Rebecca Conway for The New York Times.

Nevertheless, to date, most of the wealth generated by advances in A.I. and robotics has been acquired by the executives of technology companies. It’s time for the benefits of the A.I. revolution to be broadly distributed through an expanded social safety net.

Unfortunately, members of Congress are taking the opposite path and have proposed cuts to a range of social programs. Several hundred thousand people arrived in Washington on Saturday to protest these cuts. During the demonstration, masked agitators threw rocks at the autonomous drones deployed for crowd control; in response, drones dispensed pepper spray on the protesters below, causing a stampede. More than 20 people were injured and treated at local hospitals; one protester died of his injuries on Monday. The police detained 35 people at the scene; 25 more arrests have been made since then, after authorities used facial recognition technology to identify protesters from surveillance video.

Punishing the poor who were harmed by economic disruptions has been a mistake repeated throughout American history. During the Industrial Revolution, machines displaced many artisans and agricultural workers. To deter these unemployed workers from seeking public relief, local governments set up poorhouses that required residents to perform hard labor. And between 1990 and 2020, the federal government, along with some state governments, repeatedly cut social program spending even as middle-class jobs disappeared as a result of outsourcing and automation. Workers who didn’t have the skills to thrive in the knowledge economy were consigned to the underclass of service workers.

Security researchers from Google’s Project Zero team recently uncovered pre-installed apps on Android devices that allowed remote attackers to carry out remote code execution, disabled Google Play Protect, or collected information on users’ web activities.

At the Black Hat cybersecurity conference in Las Vegas, Maddie Stone, a security researcher on Project Zero who previously served as Senior Reverse Engineer & Tech Lead on the Android Security team, revealed that her team had discovered three instances of Android malware pre-installed on budget Android phones in the recent past.

One such pre-installed app was capable of turning off Google Play Protect, the default mobile security app in Android devices, thereby leaving devices vulnerable to all forms of cyber attacks or remote surveillance. The Project Zero team also found an app pre-installed on Android phones that gathered logs of users’ web activities.