We now have everything we need to build a physics engine with infinite precision.

In this part, we’ve seen how to use the Python SymPy package to derive the low-level expressions needed to create a perfect physics engine for our 2-D worlds of circles and walls. We found the expressions for the time when two objects will just touch (if they ever do), and, when they do touch, the expressions for their new velocities.
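As a minimal sketch of the kind of derivation described above (the variable names and setup here are illustrative, not the article's actual code), SymPy can solve symbolically for the time at which two moving circles first touch:

```python
import sympy as sp

t = sp.symbols('t')
# Symbolic positions, velocities, and radii of two circles
x1, y1, vx1, vy1, r1 = sp.symbols('x1 y1 vx1 vy1 r1')
x2, y2, vx2, vy2, r2 = sp.symbols('x2 y2 vx2 vy2 r2')

# Relative displacement between the circle centers at time t
dx = (x1 + vx1*t) - (x2 + vx2*t)
dy = (y1 + vy1*t) - (y2 + vy2*t)

# The circles just touch when the center distance equals r1 + r2.
# Squaring both sides gives a quadratic in t, which SymPy solves exactly.
touch_eq = sp.Eq(dx**2 + dy**2, (r1 + r2)**2)
solutions = sp.solve(touch_eq, t)

# Concrete check: two unit circles approaching head-on along the x-axis
subs = {x1: 0, y1: 0, vx1: 1, vy1: 0, r1: 1,
        x2: 4, y2: 0, vx2: -1, vy2: 0, r2: 1}
times = [s.subs(subs) for s in solutions]
print(min(times))  # earliest touch time: 1
```

Because the solution is exact (symbolic), substituting rational inputs keeps the arithmetic at infinite precision, which is the property the engine relies on.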

Two teams of researchers studying a galaxy through NASA’s James Webb Space Telescope have made multiple discoveries, including spotting the most distant active supermassive black hole ever found.

The teams were studying a galaxy known as GN-z11, an “exceptionally luminous” system that was formed when our 13.8 billion-year-old universe was only about 430 million years old, making it one of the youngest ever observed, NASA said in a news release. Scientists have been trying to find out what makes the distant galaxy so bright, and in doing so discovered the far-off black hole and a gas clump that could indicate rare stars.

The black hole was found by researchers from the Cavendish Laboratory and the Kavli Institute of Cosmology at the University of Cambridge in the United Kingdom using the telescope’s near-infrared camera. They determined the structure was a supermassive black hole, the largest type of black hole. It’s the most distant black hole of this size ever seen.

A medieval astronomical instrument discovered entirely by accident has turned out to be a powerful record of cross-cultural scientific collaboration.

The brass astrolabe dates back to 11th century Spain – but was subsequently engraved with annotations and amendments over the centuries, in multiple languages, as changing owners adapted and updated it for their own use.

The object is, therefore, not just a rare artifact but almost unique: a palimpsest that records the changing ideas and needs of its users as the world around them changed.

The human brain has billions of neurons. Working together, they enable higher-order brain functions such as cognition and complex behaviors. To study these higher-order brain functions, it is important to understand how neural activity is coordinated across various brain regions.

Although techniques such as functional imaging (fMRI) are able to provide insights into brain activity, they can show only so much information for a given time and area. Two-photon microscopy through cranial windows is a powerful tool for producing high-resolution images of neuronal activity, but conventional cranial windows are small, making it difficult to study distant brain regions at the same time.

Now, a team of researchers led by the Exploratory Research Center on Life and Living Systems (ExCELLS) and the National Institute for Physiological Sciences (NIPS) has introduced a new method for in vivo brain imaging, enabling large-scale and long-term observation of neuronal structures and activities in awake mice.

In a recent review published in the Journal of Human Genetics, a group of authors explored the potential of deep learning (DL), particularly convolutional neural networks (CNNs), in enhancing predictive modeling for omics data analysis, addressing challenges and future research directions.

Study: Advances in AI and machine learning for predictive medicine.

Scientists have developed a new robot that can ‘mimic’ the two-handed movements of care workers as they dress an individual.

Until now, assistive dressing robots, designed to help an elderly person or a person with a disability get dressed, have been created in the laboratory as one-armed machines, but research has shown that this can be uncomfortable for the person in care, or impractical.

To tackle this problem, Dr. Jihong Zhu, a robotics researcher at the University of York’s Institute for Safe Autonomy, proposed a two-armed assistive dressing scheme. This approach has not been attempted in previous research, but was inspired by caregivers, who have demonstrated that specific actions are required to reduce discomfort and distress for the individual in their care.

The potential for personalized cancer treatment is fueling the need to identify T cell responses against neoantigens and other cancer-specific epitopes, on which the success of immunotherapy depends. Continuous advances in epitope discovery and prediction technology are leading to more precise identification of antigen-specific T cells, which play a central role in monitoring immune responses to infection and to cancer immunotherapies. Hence, understanding how major histocompatibility complex (MHC) molecules and peptides interact within the immune system is fundamental to developing treatments for diseases like cancer and to creating innovative vaccines.

Fundamentally, the in vivo interaction between processed antigens loaded on MHC molecules and T cell receptors is an important line of communication that alerts the adaptive immune response to foreign antigens or cancerous cells. MHC class I and II molecules loaded with foreign or cancerous peptide fragments are therefore of great interest for activating the adaptive immune response. In vivo, peptide exchange reactions are not required for the presentation of antigens by MHC molecules, because they bind degraded antigens during assembly in the endoplasmic reticulum (ER). In vitro, however, peptide exchange reactions play an important role in the assembly of MHC molecules. It thus becomes essential to consider allelic variation and peptide binding when utilizing MHC molecules for T cell detection ex vivo. It has been shown that immunogenic peptides tend to interact strongly with their restricting MHC molecule. Thus, having the capability to assess the binding affinity of an in vitro interaction between a peptide and MHC class I is highly valued.