
Cells will ramp up gene expression in response to physical forces alone, a new study finds. Gene activation, the first step of protein production, starts less than one millisecond after a cell is stretched—hundreds of times faster than chemical signals can travel, the researchers report.

The scientists tested forces that are biologically relevant—equivalent to those exerted on cells by breathing, exercising or vocalizing. They report their findings in the journal Science Advances.

“We found that force can activate genes without intermediates, without enzymes or signaling molecules in the cytoplasm,” said University of Illinois mechanical science and engineering professor Ning Wang, who led the research. “We also discovered why some genes can be activated by force and some cannot.”

Given the rapid development of virtual reality technology, we may very well be moving toward a time when we’re able to manage the brain’s memories.


Could we develop such a capability? That may depend heavily upon a handful of ambitious attempts at brain-computer interfacing. But science is moving in baby steps with other tactics in both laboratory animals and humans.

Thus far, there have been some notable achievements in rodent experiments, though those results haven’t translated well to humans. We don’t have a beam that can go into your mind and give you 60 years’ worth of new experiences. Nevertheless, the emerging picture is that the physical basis of memory is understandable to the point that we should be able to intervene — both in producing and eliminating specific memories.

2007…


Imagine a weapon that creates sound that only you can hear. Science fiction? No, this is one area that has a very solid basis in reality. The Air Force has experimented with microwaves that create sounds in people’s heads (which they’ve called a possible psychological warfare tool), and American Technologies can “beam” sounds to specific targets with their patented HyperSound (and yes, I’ve heard/seen them demonstrate the speakers, and they are shockingly effective).

Now the Defense Advanced Research Projects Agency is jumping on the bandwagon with their new “Sonic Projector” program:

The goal of the Sonic Projector program is to provide Special Forces with a method of surreptitious audio communication at distances over 1 km. Sonic Projector technology is based on the non-linear interaction of sound in air translating an ultrasonic signal into audible sound. The Sonic Projector will be designed to be a man-deployable system, using high power acoustic transducer technology and signal processing algorithms which result in no, or unintelligible, sound everywhere but at the intended target. The Sonic Projector system could be used to conceal communications for special operations forces and hostage rescue missions, and to disrupt enemy activities.
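
The program description names the underlying physics: the nonlinear interaction of ultrasound in air, often called a parametric acoustic array, self-demodulates a modulated ultrasonic beam into audible sound along its narrow path. Below is a minimal Python sketch of that principle, assuming a simple amplitude-modulation scheme and Berktay’s far-field approximation (audible pressure roughly proportional to the second time derivative of the squared envelope); the carrier frequency, modulation depth, and signal choices are illustrative and not taken from DARPA’s program.

```python
import numpy as np

# Illustrative parameters (not from the DARPA program description):
# a 40 kHz ultrasonic carrier amplitude-modulated by a 1 kHz audio tone.
fs = 400_000                     # sample rate, Hz
t = np.arange(0, 0.01, 1 / fs)   # 10 ms of signal

f_carrier = 40_000               # ultrasonic carrier, Hz (inaudible)
f_audio = 1_000                  # tone we want the target to hear, Hz

audio = np.sin(2 * np.pi * f_audio * t)                  # desired audible signal
envelope = 1 + 0.5 * audio                               # AM envelope, modulation depth 0.5
emitted = envelope * np.sin(2 * np.pi * f_carrier * t)   # what the transducer radiates

# Berktay's far-field approximation: the air's nonlinearity "self-demodulates"
# the beam, so the audible pressure along it is roughly proportional to the
# second time derivative of the squared envelope.
demodulated = np.gradient(np.gradient(envelope ** 2, t), t)

# The demodulated waveform is dominated by the 1 kHz component, so a listener
# inside the narrow ultrasonic beam hears the tone, while listeners outside
# the beam hear little or nothing -- the basis of the point-to-point claim.
similarity = abs(np.corrcoef(demodulated, audio)[0, 1])
print(f"|correlation| between demodulated beam and the 1 kHz tone: {similarity:.2f}")
```

In this naive version the squaring of the envelope adds harmonic distortion, which is why the printed correlation stays below 1; practical parametric-array systems typically pre-process the envelope (for example, square-rooting and filtering it) to compensate.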

Australian government scientists have begun the first stages of testing for a potential vaccine against the SARS CoV-2 coronavirus, which causes the disease COVID-19. Australia’s national science agency CSIRO said Thursday that testing at a biosecurity facility was expected to take three months. The testing is being undertaken in cooperation with the Coalition for Epidemic Preparedness Innovations (CEPI), a global group that aims to help speedily develop vaccines against emerging infectious diseases.


The agency will test two vaccine candidates over the next three months as part of a global race to halt the coronavirus pandemic.

Daily life during a pandemic means social distancing and finding new ways to remotely connect with friends, family and co-workers. And as we communicate online and by text, artificial intelligence could play a role in keeping our conversations on track, according to new Cornell University research.

Humans having difficult conversations said they trusted artificially intelligent systems—the “smart” reply suggestions in texts—more than the people they were talking to, according to a new study, “AI as a Moral Crumple Zone: The Effects of AI-Mediated Communication on Attribution and Trust,” published online in the journal Computers in Human Behavior.

“We find that when things go wrong, people take the responsibility that would otherwise have been designated to their human partner and designate some of that to the system,” said Jess Hohenstein, a doctoral student in the field of information science and the paper’s first author. “This introduces a potential to take AI and use it as a mediator in our conversations.”