
We still don’t have a clear picture of the Sun’s physics — but the Solar Ring could change that.


To address this, a team of astronomers has proposed the Solar Ring: a fleet of three spacecraft, all orbiting the Sun, separated from one another by 120 degrees and fitted with identical instruments. Their overlapping fields of view would make it impossible for us to miss anything happening on the surface.
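To get a rough feel for why 120-degree spacing gives that coverage, here is a minimal sketch; the assumption that each spacecraft usefully observes roughly the hemisphere facing it (and the specific numbers) is illustrative, not taken from the proposal.

import numpy as np

# Three spacecraft spaced 120 degrees apart in heliocentric longitude.
# Illustrative assumption: each craft usefully observes the hemisphere facing it,
# i.e. solar longitudes within +/- 90 degrees of its own position.
spacecraft = np.array([0.0, 120.0, 240.0])
half_view = 90.0

longitudes = np.arange(0.0, 360.0, 1.0)
# Angular separation on a circle, wrapped into [0, 180] degrees.
sep = np.abs((longitudes[:, None] - spacecraft[None, :] + 180.0) % 360.0 - 180.0)
coverage = (sep <= half_view).sum(axis=1)

print("every longitude seen by at least one craft:", bool(coverage.min() >= 1))  # True
print("fraction of longitudes seen by two craft:", (coverage >= 2).mean())       # ~0.5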

Among the many kinds of observations that the astronomers behind the Solar Ring hope to perform, one relies on helioseismology. By carefully mapping the velocity of gas on the surface of the Sun, they can measure its vibrations and pulsations. These “sunquakes” give astronomers rich information about what is happening in the deeper layers, much as earthquakes tell us about the core and mantle of the Earth.
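As a rough illustration of the idea (not the mission's actual analysis pipeline), one can take a time series of surface Doppler velocity and compute its power spectrum; the oscillation modes show up as peaks. The toy signal below is invented for the sketch.

import numpy as np

# Toy Doppler-velocity series: a single 5-minute (~3.3 mHz) oscillation plus noise.
# Real helioseismology fits thousands of such modes to infer the interior structure.
dt = 60.0                                   # one sample per minute, in seconds
t = np.arange(0.0, 8 * 3600.0, dt)          # eight hours of observations
velocity = 0.5 * np.sin(2 * np.pi * 3.3e-3 * t) + 0.2 * np.random.randn(t.size)

power = np.abs(np.fft.rfft(velocity)) ** 2
freqs = np.fft.rfftfreq(t.size, d=dt)       # Hz

peak_mhz = freqs[np.argmax(power[1:]) + 1] * 1e3
print(f"strongest oscillation near {peak_mhz:.2f} mHz")  # ~3.3 mHz, the familiar 5-minute oscillation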

The Solar Ring will also be able to catch the beginnings of a solar flare or an eruption no matter where it happens on the Sun, providing earlier warning for space weather. These plasma storms can disrupt satellites and even affect electrical systems on the Earth’s surface, so the more warning, the better.

Microscope examinations confirmed that these neurons were active in the mice with chronic pain. When the researchers used chemicals to silence the neuronal activity in this cortex, the mice’s appetites improved.

Similarly, when the researchers used chemicals to activate these neurons in mice that weren’t in pain, the animals ate less, even if they had been deprived of food before the experiment.

This is the first time the brain mechanisms behind pain-related appetite loss have been traced, the researchers wrote.

The event is likened to the Chicxulub collision on Earth.

A study has found that the megatsunami that swept Mars around 3.4 billion years ago was caused by an asteroid strike on one of its oceans. The event is compared to the Chicxulub impact, which is believed to have wiped out the dinosaurs on Earth about 66 million years ago.

Researchers, led by Alexis Rodriguez of the Planetary Science Institute in Arizona, have also suggested that NASA’s Viking 1 Lander, which was deployed on a mission to find evidence of life on Mars in 1976, could have landed near the crater left by the impact that triggered this megatsunami.




I’m going to tell you about the craziest proposal for an astrophysics mission that has a good chance of actually happening: a train of spacecraft sailing on the Sun’s light to a magical point out there in space where the Sun’s own gravity turns it into a gigantic lens. What could such a solar-system-sized telescope do? Pretty much anything. But it could definitely map the surfaces of alien worlds.
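For a sense of the distances involved, here is a quick back-of-the-envelope estimate of where that focal region begins, using the standard general-relativistic deflection for light grazing the Sun's limb (constants rounded).

# Where does the solar gravitational lens come to a focus?
G = 6.674e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8          # speed of light, m/s
M_sun = 1.989e30     # solar mass, kg
R_sun = 6.957e8      # solar radius, m
AU = 1.496e11        # astronomical unit, m

# Deflection angle for a ray grazing the limb: theta = 4GM / (c^2 R)
theta = 4 * G * M_sun / (c**2 * R_sun)

# A grazing ray crosses the optical axis at roughly F = R / theta
focal_distance = R_sun / theta
print(f"deflection at the limb: {theta * 206265:.2f} arcsec")               # ~1.75 arcsec
print(f"focal line begins near {focal_distance / AU:.0f} AU from the Sun")  # ~550 AU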

Analyzing the organization of neural circuits will play a crucial role in better understanding the process of thinking. This is where maps come into play. Maps of the nervous system contain information about the identity of individual cells, such as their type, subcellular components, and connectivity.

But how do we obtain these maps?

Volumetric nanometer-resolution imaging of brain tissue is a technique that provides the raw data needed to build these maps. But inferring all the relevant information is a laborious and challenging task because of the multiple scales of brain structures (e.g., nm for a synapse vs. mm for an axon). It requires hours of manual ground truth labeling by expert annotators.
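As a toy illustration of what such a map might contain (the field names and values below are invented for the example, not a real connectomics schema), one could record each reconstructed cell's identity and its synaptic partners.

# Toy "map" of a reconstructed circuit: cell identity, type, and connectivity.
# Real connectomics datasets use far richer schemas, tied back to the underlying
# nanometer-scale segmentation volumes; this only sketches the shape of the data.
neurons = {
    101: {"type": "pyramidal",   "soma_um": (120.4, 88.1, 30.0), "outgoing_synapses": {202: 5, 303: 1}},
    202: {"type": "interneuron", "soma_um": (131.9, 92.7, 28.5), "outgoing_synapses": {101: 3}},
    303: {"type": "pyramidal",   "soma_um": (140.2, 75.3, 35.8), "outgoing_synapses": {}},
}

# A simple query against the map: which cells does neuron 101 contact, and how strongly?
for target, n_syn in neurons[101]["outgoing_synapses"].items():
    print(f"101 -> {target} ({neurons[target]['type']}), {n_syn} synapses")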

Plenty of potential solutions have been put forward to deal with the problem, but they all face the same hurdle at the first step: how to track the debris they’re attempting to eliminate. Enter a new idea from researchers in Iran: using a novel type of radar to detect and track space debris before it becomes a danger.

The novel type of radar is called inverse synthetic aperture radar, or ISAR. As one might expect from the name, it’s the inverse of synthetic aperture radar (SAR). SAR has become much more prominent lately, particularly on satellites collecting data about the Earth, such as terrain data that might be useful for geospatial mapping.

SAR uses the motion of its platform (e.g., a satellite) to create a larger, “synthetic” aperture: the distance the platform travels while it keeps a target in view effectively becomes its aperture size. That might sound confusing, but think of it as taking multiple images of an object from slightly different angles and then reconstructing a single three-dimensional image from those combined views.
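To put rough numbers on that, here is a simplified strip-map calculation; the wavelength, antenna size, and range are illustrative values, not drawn from the article.

# Rough strip-map SAR arithmetic: the synthetic aperture is the stretch of orbit
# over which a target stays inside the real antenna's beam.
wavelength = 0.031      # m, roughly X-band (~9.6 GHz); illustrative
antenna_length = 5.0    # m, along-track size of the real antenna; illustrative
slant_range = 600e3     # m, distance to the target; illustrative

beamwidth = wavelength / antenna_length          # rad, real-aperture beamwidth
synthetic_aperture = beamwidth * slant_range     # along-track footprint length, m
azimuth_resolution = antenna_length / 2          # classic strip-map result, range-independent

print(f"synthetic aperture length: {synthetic_aperture / 1e3:.1f} km")  # ~3.7 km
print(f"azimuth resolution: {azimuth_resolution:.1f} m")                # 2.5 m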

Image and video editing are two of the most popular applications for computer users. With the advent of Machine Learning (ML) and Deep Learning (DL), image and video editing have been progressively studied through several neural network architectures. Until very recently, most DL models for image and video editing were supervised and, more specifically, required the training data to contain pairs of input and output data to be used for learning the details of the desired transformation. Lately, end-to-end learning frameworks have been proposed, which require as input only a single image to learn the mapping to the desired edited output.

Video matting is a specific task within video editing. The term “matting” dates back to the 19th century, when glass plates of matte paint were set in front of a camera during filming to create the illusion of an environment that was not present at the filming location. Nowadays, the composition of multiple digital images follows a similar procedure: a compositing formula blends the intensity of the foreground and background of each image, expressed as a linear combination of the two components.
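Concretely, that linear combination is alpha blending: each pixel of the composite is a matte-weighted mix of foreground and background. A minimal sketch (the array sizes and values are illustrative):

import numpy as np

# Alpha compositing: I = alpha * F + (1 - alpha) * B, applied per pixel.
# F is the foreground, B the background, and alpha in [0, 1] is the matte.
h, w = 4, 4
foreground = np.full((h, w, 3), 0.9)   # bright foreground layer
background = np.full((h, w, 3), 0.1)   # dark background layer
alpha = np.zeros((h, w, 1))
alpha[1:3, 1:3] = 1.0                  # opaque square in the middle of the matte

composite = alpha * foreground + (1.0 - alpha) * background
print(composite[..., 0])               # 0.9 where the matte is opaque, 0.1 elsewhere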

Although really powerful, this process has some limitations. It requires an unambiguous factorization of the image into foreground and background layers, which are then assumed to be treatable independently. In settings like video matting, where the input is a sequence of temporally and spatially dependent frames, this layer decomposition becomes a complex task.

Animal studies have shown that pregnancy is associated with unique changes in the mammalian brain and behaviour, although pregnancy-associated changes in the human brain are less well studied. Here the authors show that pregnancy is associated with changes in resting-state brain activity and brain anatomy, which are most pronounced in the default mode network.

Researchers have discovered the heaviest-known bound isotope of sodium and characterized other neutron-rich isotopes, offering important benchmarks for refining nuclear models.

The neutron dripline marks a boundary of nuclear existence, indicating, for each element, the isotope with the maximum number of neutrons that can remain bound. Adding a neutron to a dripline isotope causes the nucleus to become unbound and release one or more of its neutrons. Mapping the dripline is a major goal of modern nuclear physics, as this boundary is a testing ground for nuclear models and has implications for our understanding of neutron stars and of the synthesis of elements in stellar explosions. Now, studies by two groups extend our knowledge of the properties of nuclei close to the dripline [1, 2]. Working at the Radioactive Isotope Beam Factory (RIBF) in Japan, Deuk Soon Ahn of RIKEN and colleagues have discovered sodium-39 (39Na), which likely marks the dripline location for the heaviest element charted to date (Fig. 1) [1].
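In terms of binding energies, the boundary has a compact statement: a nucleus is bound against neutron emission only while its one-neutron separation energy stays positive,

$$ S_n(Z, N) = B(Z, N) - B(Z, N-1) > 0, $$

where $B(Z, N)$ is the binding energy of the nucleus with $Z$ protons and $N$ neutrons. The dripline is crossed where $S_n$ (or, for nuclei that shed neutrons in pairs, the two-neutron separation energy $S_{2n}$) drops below zero.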