Curiosity continues to make progress along the margin of upper Gediz Vallis ridge, investigating the broken bedrock in our workspace and acquiring images of the ridge deposit as the rover drives south.

Today’s 2-sol plan focused on DRT (Dust Removal Tool) brushing, contact science, and a drive on the first sol, followed by untargeted remote sensing on the second sol. The team had to make some decisions at the start of planning about whether to drive on the first or second sol of this plan, and how that would affect the upcoming weekend activities.

As it turned out, the team was able to fit all of the desired contact science and remote sensing activities, in addition to the drive, on the first sol, which means we’ll be able to downlink more information about our end-of-drive location to better inform planning for the weekend. Weekend plans provide opportunities for a lot of great contact science, so it will be really helpful to have that additional data down for planning.

A team of AI researchers at Microsoft Research Asia has developed an AI application that converts a still image of a person and an audio track into an animation that accurately portrays the individual speaking or singing the audio track with appropriate facial expressions.

The team has published a paper describing how they created the app on the arXiv preprint server; video samples are available on the research project page.

The research team sought to make still images appear to talk and sing along to any provided audio track, while also displaying believable facial expressions. They clearly succeeded with the development of VASA-1, an AI system that turns static images, whether captured by a camera, drawn, or painted, into what they describe as “exquisitely synchronized” animations.

“This study has given us an historical picture of how the emerging modern reef responded to huge environmental stress,” said Dr. Jody Webster.

What events caused the Great Barrier Reef to become what it is today, specifically over the course of the last six to eight thousand years, or just after the last Ice Age? This is what a recent study published in Quaternary Science Reviews hopes to address, as an international team of researchers conducted an in-depth analysis of various aspects of the Great Barrier Reef to identify the environmental factors that shaped its present condition. This study holds the potential to help scientists better understand how reefs evolve over time and the environment’s role in their evolution.

For the study, the researchers drilled almost two dozen coral samples and analyzed them using a variety of methods, including computed tomography, scanning electron microscopy, and X-ray diffraction, to determine yearly growth patterns within the coral samples. In the end, they found that environmental factors, including increased water temperatures, ocean turbulence, and rising sea levels, led to increased nutrients, which contributed to the growth of the Great Barrier Reef; this finding is consistent with previous studies.

SpaceX has shown off a futuristic-looking new extravehicular activity (EVA) spacesuit designed to allow space tourists to venture outside of the company’s Crew Dragon spacecraft in orbit — and it’s decked out in the latest cutting-edge tech.

The suit will make its first appearance during this summer’s Polaris Dawn mission, which will see a crew of four space tourists stepping out of the capsule to go for a spacewalk.

The suit is astonishingly slim compared to the bulky suits we’ve become accustomed to that allow astronauts to venture outside spacecraft like the International Space Station.

The rapid progress of humanoid robot development is nothing short of astounding. Less than 12 months after introducing its 6th-gen general-purpose humanoid, Canada’s Sanctuary AI has pulled back the curtains on the next iteration of Phoenix.

Sanctuary has been working on a general-purpose humanoid robot for a few years now, with much of the development focus on building and training the upper torso to perform an array of tasks so we don’t have to – including putting labels on boxes, bagging groceries, moving packages, scanning products, and soldering. However, it seems that most of the “Robots Doing Stuff” series of videos actually show the bots being teleoperated, which is how they’re taught to perform tasks.

A couple of months later, Sanctuary introduced a bipedal version called Phoenix, together with an AI control system called Carbon designed to give the humanoid “human-like intelligence and enable it to do a wide range of work tasks.” Some 11 months later, the seventh generation is ready for its time in the spotlight.