The U.S. Navy has invented a new device to prevent people from speaking, one that people with siblings will recognize instantly. The handheld acoustic hailing and disruption device records a person's speech and spits it back out again, disrupting their concentration and discouraging them from speaking further. Although it's an interesting, and very familiar, concept, the tech is unlikely ever to see use on the battlefield.

The handheld acoustic hailing and disruption (AHAD) device was developed by engineers at Naval Surface Warfare Center, Crane Division, a Navy research and development facility in Indiana that develops handheld and crew-served weapons for the service. The patent, New Scientist reports, was issued in 2019.


The system can get very sneaky by repeating anything a speaker says milliseconds after it’s said.
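The disruptive effect described here is known as delayed auditory feedback: hearing your own voice played back a fraction of a second late makes fluent speech very difficult. A minimal sketch of the delay mechanism, using a fixed-length ring buffer (purely illustrative; this is not the Navy's actual design):

```python
from collections import deque

def delayed_playback(samples, delay_samples):
    """Echo each input sample back after a fixed delay, the core
    idea behind delayed auditory feedback. `samples` is a list of
    audio sample values; `delay_samples` is the delay length.
    (Illustrative sketch only.)"""
    # Ring buffer pre-filled with silence for the initial delay period.
    buffer = deque([0.0] * delay_samples, maxlen=delay_samples)
    out = []
    for s in samples:
        out.append(buffer[0])  # emit the sample captured `delay_samples` ago
        buffer.append(s)       # store the current sample; oldest is evicted
    return out
```

At a 16 kHz sample rate, for example, the roughly 200 ms delay typical of such devices would correspond to `delay_samples = 3200`.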

The ATLAS collaboration is breathing new life into its LHC Run 2 dataset, recorded from 2015 to 2018. Physicists will be reprocessing the entire dataset – nearly 18 PB of collision data – using an updated version of the ATLAS offline analysis software (Athena). Not only will this improve ATLAS physics measurements and searches, it will also position the collaboration well for the upcoming challenges of Run 3 and beyond.

Athena converts raw signals recorded by the ATLAS experiment into more simplified datasets for physicists to study. Its new-and-improved version has been in development for several years and includes multi-threading capabilities, more complex physics-analysis functions and improved memory consumption.
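The reprocessing pattern described above, running the same per-event reconstruction over a huge dataset with multi-threading, can be sketched in miniature. The real Athena framework is a large C++/Python codebase; the `reconstruct` function and the event format below are purely hypothetical stand-ins used to illustrate parallel per-event processing:

```python
from concurrent.futures import ThreadPoolExecutor

def reconstruct(raw_event):
    """Hypothetical stand-in for per-event reconstruction: turn a
    list of raw detector signal values into a simplified summary.
    Real ATLAS reconstruction is vastly more involved."""
    return {"n_hits": len(raw_event), "total_signal": sum(raw_event)}

def process_dataset(raw_events, workers=4):
    # Process events concurrently, mirroring (in spirit only) the
    # multi-threaded design of the updated offline software.
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(reconstruct, raw_events))
```

`pool.map` preserves input order, so the simplified dataset lines up event-for-event with the raw one, which is what lets older and newer processings be compared directly.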

“Our aim was to significantly reduce the amount of memory needed to run the software, widen the types of physics analyses it could do and – most critically – allow current and future ATLAS datasets to be analysed together,” says Zach Marshall, ATLAS Computing Coordinator. “These improvements are a key part of our preparations for future high-intensity operations of the LHC – in particular the High-Luminosity LHC (HL-LHC) run beginning around 2028, which will see ATLAS’s computing resources in extremely high demand.”

Facebook is pouring a lot of time and money into augmented reality, including building its own AR glasses with Ray-Ban. Right now, these gadgets can only record and share imagery, but what does the company think such devices will be used for in the future?

A new research project led by Facebook’s AI team suggests the scope of the company’s ambitions. It imagines AI systems that constantly analyze people’s lives using first-person video, recording what they see, do, and hear in order to help them with everyday tasks. Facebook’s researchers have outlined a series of skills they want these systems to develop, including “episodic memory” (answering questions like “where did I leave my keys?”) and “audio-visual diarization” (remembering who said what, and when).

A University of Toronto astronomer’s research suggests the solar system is surrounded by a magnetic tunnel that can be seen in radio waves.

Jennifer West, a research associate at the Dunlap Institute for Astronomy & Astrophysics, is making a scientific case that two bright structures seen on opposite sides of the sky – previously considered to be separate – are actually connected and are made of rope-like filaments. The connection forms what looks like a tunnel around our solar system.

The results of West’s research have been published in The Astrophysical Journal.