Abstract: Here we show that precise Gaia EDR3 proper motions have provided robust estimates of the 3D velocities, angular momenta, and total energies of 40 Milky Way dwarfs. The results are statistically robust and independent of the Milky Way mass profile. The dwarfs do not behave like long-lived satellites of the Milky Way because of their excessively large velocities, angular momenta, and total energies. Comparing them to other MW halo populations, we find that many are at first passage, $\le$2 Gyr ago, i.e., more recently than the passage of Sagittarius, $\sim$4–5 Gyr ago. We suggest that this agrees with the stellar populations of all dwarfs, for which a small fraction of young stars cannot be excluded. We also find that dwarf radial velocities contribute too little to their kinetic energy when compared to satellite systems whose motions are regulated only by gravity; some other mechanism, such as ram pressure, must be at work. The latter may have preferentially reduced the radial velocities of dwarf progenitors as they entered the halo, until they lost their gas, and could also explain why most dwarfs lie near their pericenters. We also discover a novel large-scale structure, perpendicular to the Milky Way disk, comprising 20% of the dwarfs orbiting or counter-orbiting with the Sagittarius dwarf.

From: Francois Hammer.
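As a rough illustration of the kind of computation involved (a minimal sketch, not the authors' pipeline), the observables Gaia provides for each dwarf (sky position, distance, proper motions, and line-of-sight velocity) can be converted to a Galactocentric position and velocity, from which specific angular momentum and kinetic energy follow directly. The input values below are placeholders for a hypothetical dwarf, and the total energy would additionally require an adopted Milky Way potential:

```python
# Minimal sketch (not the paper's actual pipeline) of how Gaia observables
# map onto the quantities discussed above. All input values are placeholders.
import numpy as np
import astropy.units as u
from astropy.coordinates import SkyCoord, Galactocentric

# Hypothetical observables for one dwarf: sky position, distance,
# proper motions, and heliocentric line-of-sight velocity.
dwarf = SkyCoord(ra=260.0 * u.deg, dec=57.9 * u.deg, distance=76.0 * u.kpc,
                 pm_ra_cosdec=0.05 * u.mas / u.yr,
                 pm_dec=-0.10 * u.mas / u.yr,
                 radial_velocity=-290.0 * u.km / u.s)

# Transform to a Galactocentric frame to obtain 3D position and velocity.
g = dwarf.transform_to(Galactocentric())
pos = np.array([g.x.to_value(u.kpc), g.y.to_value(u.kpc), g.z.to_value(u.kpc)])
vel = np.array([g.v_x.to_value(u.km / u.s), g.v_y.to_value(u.km / u.s),
                g.v_z.to_value(u.km / u.s)])

L = np.cross(pos, vel)   # specific angular momentum [kpc km/s]
ke = 0.5 * vel @ vel     # specific kinetic energy [km^2/s^2]
# The total energy also needs a mass model: E = ke + Phi(pos), where Phi is
# the gravitational potential of the adopted Milky Way profile.
print(f"|L| = {np.linalg.norm(L):.0f} kpc km/s, KE = {ke:.0f} km^2/s^2")
```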

“The Earth was very water-rich compared to other rocky planets in the Solar System, with oceans covering more than 70% of its surface, and scientists had long puzzled over the exact source of it all,” said Professor Phil Bland, director of the Space Science and Technology Centre at Curtin University.

“An existing theory is that water was carried to Earth in the final stages of its formation on C-type asteroids. However, previous testing of the isotopic ‘fingerprint’ of these asteroids found that, on average, they didn’t match the water found on Earth, meaning there was at least one other unaccounted-for source.”

“Our research suggests the solar wind created water on the surface of tiny dust grains and this isotopically lighter water likely provided the remainder of the Earth’s water.”
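To make the isotopic argument concrete, a back-of-envelope two-endmember mixing calculation shows how a modest share of isotopically light water can pull a heavier asteroidal reservoir down to the terrestrial deuterium-to-hydrogen (D/H) ratio. The endmember values below are illustrative assumptions, not figures from the study; only the VSMOW value is a measured standard:

```python
# Back-of-envelope two-endmember mixing (illustrative numbers, not from the
# study): what fraction f of Earth's water would need to come from
# isotopically light solar-wind-derived water?
D_H_EARTH = 1.558e-4      # VSMOW D/H, the terrestrial ocean-water standard
D_H_ASTEROID = 1.70e-4    # assumed C-type asteroid endmember (heavy side)
D_H_SOLARWIND = 2.0e-5    # assumed solar-wind-derived water (isotopically light)

# Mass balance: f * D_H_SOLARWIND + (1 - f) * D_H_ASTEROID = D_H_EARTH
f = (D_H_ASTEROID - D_H_EARTH) / (D_H_ASTEROID - D_H_SOLARWIND)
print(f"solar-wind-water fraction needed: {f:.0%}")  # ~9% with these inputs
```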

First theorized in 1973 by physicist Philip W. Anderson, quantum spin liquids are exotic phases of matter with topological order. They feature long-range quantum entanglement that can potentially be exploited to realize robust quantum computation. But the problem with this exotic state of matter has been its very existence: no one had ever seen it; at least, that had been the case for almost five decades.

“Instead, it’s all about magnets that never freeze and the way electrons in them spin.”

No such fully realised metaverse yet exists, but that has not stopped US tech companies from falling over themselves in recent months to announce their own forays into the space. The flurry of interest has shown few signs of abating, and Asia is not immune to the trend.

Investors and companies are scrambling to carve out a piece of an internet revolution that promises to forever change how people interact online – but some question whether Big Tech should be allowed to dominate its development.

On Friday, Alibaba Cloud announced in a social media post that its DAMO Academy has successfully developed a 3D stacked In-Memory Computing (IMC) chip.

Alibaba Cloud claims this is a breakthrough that can help overcome the von Neumann bottleneck, the throughput limit imposed by conventional computer architectures, in which processing and memory units are separate and data must shuttle between them. The company says the chip meets the demands of artificial intelligence (AI) and other workloads for high-bandwidth, high-capacity memory and extreme computing power. In the specific AI scenario Alibaba tested, the chip's performance improved by more than 10 times.

As AI applications proliferate, the shortcomings of the existing computer system architecture are becoming apparent. Chief among them is the energy cost of moving data: under the traditional architecture, transferring data from the memory unit to the computing unit consumes roughly 200 times as much power as the computation itself, so only a small fraction of the total energy and time actually goes into computing.
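A toy accounting model makes that imbalance concrete. The sketch below takes the roughly 200x transfer-versus-compute energy ratio quoted above and an assumed, purely hypothetical 20-fold reduction in off-memory transfers for an in-memory design; it is an illustration of the arithmetic, not a model of Alibaba's chip:

```python
# Toy energy model (illustrative ratios only, built from the ~200x figure
# quoted above): compare energy spent on data movement vs. computation for a
# conventional pipeline and an idealized in-memory-computing (IMC) pipeline.
E_COMPUTE_PER_OP = 1.0     # normalized energy of one arithmetic operation
E_TRANSFER_PER_OP = 200.0  # memory-to-compute transfer, ~200x per the article

def total_energy(num_ops: int, transfer_fraction: float) -> float:
    """Total energy if `transfer_fraction` of ops require an off-memory transfer."""
    return num_ops * (E_COMPUTE_PER_OP + transfer_fraction * E_TRANSFER_PER_OP)

ops = 1_000_000
von_neumann = total_energy(ops, transfer_fraction=1.0)  # every op fetches data
imc = total_energy(ops, transfer_fraction=0.05)         # assumed 20x fewer transfers
print(f"energy ratio (von Neumann / IMC): {von_neumann / imc:.1f}x")  # ~18x here
```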