
Our goal is audacious — some might even say naive. The aim is to evaluate every gene and drug perturbation in every possible type of cancer in laboratory experiments, and to make the data accessible to researchers and machine-learning experts worldwide. To put some ballpark numbers on this ambition, we think it will be necessary to perturb 20,000 genes and assess the activity of 10,000 drugs and drug candidates in 20,000 cancer models, and to measure changes in viability, morphology, gene expression and more. Technologies from CRISPR genome editing to informatics now make this possible, given enough resources and researchers to take on the task.
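The combinatorial scale behind those ballpark figures is easy to sanity-check with a few lines of Python. The numbers below are the article's stated targets, not actual experiment counts:

```python
# Ballpark targets from the text (aspirational figures, not measured counts).
genes = 20_000    # gene perturbations (e.g. CRISPR knockouts)
drugs = 10_000    # drugs and drug candidates
models = 20_000   # cancer models

perturbations = genes + drugs         # 30,000 distinct perturbations
pairs = perturbations * models        # every perturbation in every model

print(f"{perturbations:,} perturbations x {models:,} models "
      f"= {pairs:,} perturbation-model pairs")
# -> 30,000 perturbations x 20,000 models = 600,000,000 perturbation-model pairs
```

Six hundred million pairings, before even counting the multiple readouts (viability, morphology, gene expression) per experiment, is what makes the goal "audacious".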


It is time to move beyond tumour sequencing data to identify vulnerabilities in cancers.

I think we may need to be more careful about brain implants in the future. 😃


Cutting down on the number of invasive surgeries associated with implants is one thing, but the wireless implant also stands to improve the quality of animal research. Without wireless controls or charging, animals had to be wired up to power sources or other electronics with invasive, restrictive tethers. Doing away with those allows the animals to behave as they normally would.

In this particular test, KAIST scientists used the implant to block cocaine-associated behaviors in rats that had just been injected with the drug. But they suspect the underlying tech could be used in all sorts of implants and medical devices.

“We believe that the same basic technology can be applied to various types of implants, including deep brain stimulators, and cardiac and gastric pacemakers,” Jeong said in the release, “to reduce the burden on patients for long-term use within the body.”

In this episode of Lifespan News:

Chemotherapy With Light
AI Identifies Senescent Cells and Tests New Drugs
Alpha-Ketoglutarate Delays Age-Related Fertility Decline
A Genetic Pathway for Preventing Hearing Loss
Investigating the Link Between COVID-19 and Telomeres

I don’t think that star is the same after that one night stand.


When black holes swallow massive amounts of matter from the space around them, they're not exactly subtle about it. They belch out tremendous flares of X-rays, generated by material heating to intense temperatures as it's sucked towards the black hole, flares so bright we can detect them from Earth.

This is normal black hole behaviour. What isn’t normal is for those X-ray flares to spew forth with clockwork regularity, a puzzling behaviour reported in 2019 from a supermassive black hole at the centre of a galaxy 250 million light-years away. Every nine hours, boom — X-ray flare.

After careful study, astronomer Andrew King of the University of Leicester in the UK identified a potential cause — a dead star that’s endured its brush with a black hole, trapped on a nine-hour, elliptical orbit around it. Every close pass, or periastron, the black hole slurps up more of the star’s material.
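Given an orbital period, Kepler's third law pins down the size of such an orbit once a black-hole mass is assumed. The mass used below is a placeholder for illustration, not a figure from the article; as a sanity check, the same formula reproduces Earth's one-year, one-AU orbit around the Sun:

```python
import math

G = 6.674e-11      # gravitational constant, m^3 kg^-1 s^-2
M_SUN = 1.989e30   # solar mass, kg

def semi_major_axis(period_s, mass_kg):
    """Kepler's third law: a = (G * M * T^2 / (4 * pi^2))^(1/3)."""
    return (G * mass_kg * period_s**2 / (4 * math.pi**2)) ** (1 / 3)

# Sanity check: a one-year orbit around one solar mass is ~1 AU (1.496e11 m).
year = 365.25 * 24 * 3600
print(semi_major_axis(year, M_SUN) / 1.496e11)   # ~1.0

# Hypothetical nine-hour orbit around an assumed 4e5-solar-mass black hole
# (the mass is an assumption for illustration only).
a = semi_major_axis(9 * 3600, 4e5 * M_SUN)
print(f"semi-major axis: {a:.2e} m")
```

The point of the exercise is scale: a nine-hour period around a supermassive black hole implies an orbit tight enough that each periastron pass skims material off the surviving star.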

Australian scientists have discovered a new way to analyze microscopic cells, tissues and other transparent specimens, through the improvement of an almost 100-year-old imaging technique.

La Trobe University researchers have led a four-year collaboration to make "the invisible visible" by using custom-designed nanomaterials to enhance the sensitivity of an imaging technique commonly used by scientists to study biological specimens.

The discovery, detailed in Nature Photonics, will benefit a broad range of researchers and has the potential to advance research into the understanding and detection of disease.

In recent years, countless computer scientists worldwide have been developing deep neural network-based models that can predict people’s emotions based on their facial expressions. Most of the models developed so far, however, merely detect primary emotional states such as anger, happiness and sadness, rather than more subtle aspects of human emotion.

Past psychology research, on the other hand, has delineated numerous dimensions of emotion, for instance, introducing measures such as valence (i.e., how positive an emotional display is) and arousal (i.e., how calm or excited someone is while expressing an emotion). While estimating valence and arousal simply by looking at people’s faces is easy for most humans, it can be challenging for machines.

Researchers at Samsung AI and Imperial College London have recently developed a deep-neural-network-based system that can estimate emotional valence and arousal with high levels of accuracy simply by analyzing images of human faces taken in everyday settings. This model, presented in a paper published in Nature Machine Intelligence, can make predictions fairly quickly, which means that it could be used to detect subtle qualities of emotion in real time (e.g., from snapshots of CCTV cameras).
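The article does not describe the model's internals, but the general recipe — a face-feature extractor feeding a two-output regression head whose outputs are squashed into a fixed valence/arousal range — can be sketched with NumPy. Every shape and weight here is illustrative, not taken from the paper:

```python
import numpy as np

rng = np.random.default_rng(0)

IMG_SHAPE = (64, 64, 3)   # dummy face-crop size (illustrative)
FEAT_DIM = 512            # feature dimension (illustrative)

# Stand-in for a deep feature extractor: a single random projection.
W_feat = rng.standard_normal((FEAT_DIM, int(np.prod(IMG_SHAPE)))) * 0.01
# Regression head: two outputs squashed by tanh into [-1, 1]
# (valence: negative..positive; arousal: calm..excited).
W_head = rng.standard_normal((2, FEAT_DIM)) * 0.05

def predict_valence_arousal(face_image):
    features = W_feat @ face_image.reshape(-1)
    valence, arousal = np.tanh(W_head @ features)
    return float(valence), float(arousal)

face = rng.random(IMG_SHAPE)   # random pixels standing in for a face crop
v, a = predict_valence_arousal(face)
print(f"valence={v:+.2f}, arousal={a:+.2f}")   # both guaranteed in [-1, 1]
```

Because the head is just a forward pass over extracted features, inference per frame is cheap, which is what makes real-time use (e.g., on CCTV snapshots) plausible.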

On Tuesday, SpaceX plans to launch the latest prototype of its Starship spacecraft — a system that could one day carry humans to Mars.


The first time SpaceX attempted such an ambitious Starship flight, the 16-story vehicle blew up. Seven weeks later, Elon Musk’s company is trying again.