A new artificial intelligence algorithm called ‘Zoobot’ helped to identify 40,000 ring galaxies. What else is the astronomical AI capable of?
Lockheed Martin has been busy this year. In April of 2022, the Defense Advanced Research Projects Agency (DARPA) and its U.S. Air Force partner announced that they had completed a free flight test of the Lockheed Martin version of the Hypersonic Air-breathing Weapon Concept (HAWC).
Then just last month, the U.S. Department of Defense (DoD) awarded the company a contract to construct the nation’s first megawatt-scale long-duration energy storage system. Under the direction of the U.S. Army Engineer Research and Development Center’s (ERDC) Construction Engineering Research Laboratory (CERL), the new system, called “GridStar Flow,” will be set up at Fort Carson, Colorado.
In the same time frame, General Motors and the firm announced plans to produce a series of electric moon rovers for future commercial space missions. The companies said they aim to test the GM-developed batteries in space later this year. They also set the ambitious goal of testing a prototype vehicle on the moon by 2025.
The image was taken by the James Webb Space Telescope and is one of the clearest ever produced of the planet.
Director learns hierarchical behaviors by planning in the latent space of a learned world model. The world model that Director builds from pixels makes planning in a compact latent space practical: it first maps images to model states and then predicts future model states given future actions. From the predicted trajectories of model states, Director optimizes two policies: every fixed number of steps, the manager selects a new goal, and the worker learns to reach those goals using primitive actions. Choosing goals directly in the high-dimensional continuous representation space of the world model would, however, pose a difficult control problem for the manager. Instead, a goal autoencoder is learned to compress model states into smaller discrete codes. Once the manager has chosen a discrete code, the goal autoencoder decodes it back into a model state and passes it to the worker as a goal.
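To make the division of labor concrete, here is a minimal Python sketch of that structure, assuming a toy 64×64 observation, random stand-in policies, and untrained encoder weights; the class names, dimensions, and goal-selection interval are illustrative placeholders rather than the published Director implementation.

```python
# Structural sketch of Director-style hierarchical control in a learned latent
# space. Everything here is a placeholder: the encoders are random projections
# and the "policies" act randomly, so only the wiring mirrors the description.
import numpy as np

rng = np.random.default_rng(0)

LATENT_DIM = 32   # size of a world-model state (assumed)
CODE_DIM = 8      # size of a discrete goal code (assumed)
K = 16            # the manager picks a new goal every K steps (assumed)


class WorldModel:
    """Maps image observations to compact model states (encoder stub)."""
    def __init__(self):
        self.proj = rng.normal(size=(64 * 64, LATENT_DIM)) / 64.0

    def encode(self, image):
        return np.tanh(image.reshape(-1) @ self.proj)


class GoalAutoencoder:
    """Compresses model states into small discrete codes and decodes them back."""
    def __init__(self):
        self.enc = rng.normal(size=(LATENT_DIM, CODE_DIM))
        self.dec = rng.normal(size=(CODE_DIM, LATENT_DIM))

    def encode(self, state):
        code = np.zeros(CODE_DIM)          # one-hot discrete code
        code[np.argmax(state @ self.enc)] = 1.0
        return code

    def decode(self, code):
        return np.tanh(code @ self.dec)


def manager_policy(state):
    """Chooses a discrete goal code from the current model state (random stub)."""
    code = np.zeros(CODE_DIM)
    code[rng.integers(CODE_DIM)] = 1.0
    return code


def worker_policy(state, goal, n_actions=4):
    """Chooses a primitive action given the state and decoded goal (random stub)."""
    return rng.integers(n_actions)


world_model = WorldModel()
goal_ae = GoalAutoencoder()
state = world_model.encode(rng.normal(size=(64, 64)))  # fake first observation

for t in range(64):
    if t % K == 0:                       # manager acts on a slower timescale
        code = manager_policy(state)
        goal = goal_ae.decode(code)      # decoded code becomes the worker's goal
    action = worker_policy(state, goal)
    # The real agent would predict the next model state from the action; here we
    # simply re-encode a fresh fake observation to keep the loop self-contained.
    state = world_model.encode(rng.normal(size=(64, 64)))
```

Because the manager only chooses among small discrete codes every K steps, its decision problem is far smaller than picking goals directly in the continuous model-state space, which is the point of the goal autoencoder.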
Advances in deep reinforcement learning have accelerated the study of decision-making in artificial agents. Unlike generative ML models such as GPT-3 and Imagen, artificial agents can actively affect their environment, for example by moving a robot arm based on camera inputs or clicking a button in a web browser. Although such agents could increasingly assist people, current methods are held back by the need for detailed feedback in the form of frequently provided rewards in order to learn effective strategies. For instance, even powerful systems such as AlphaGo, despite access to massive computing resources, are limited to a certain number of moves before receiving their next reward.
In contrast, complex activities such as preparing a meal require decision-making at every level, from planning the menu and navigating to the store to buy ingredients, to correctly executing the fine motor skills needed at each step along the way based on high-dimensional sensory inputs. Hierarchical reinforcement learning (HRL) aims to let artificial agents complete such tasks more autonomously from sparse rewards by automatically breaking complex tasks down into achievable subgoals. HRL research has been challenging, however, because there is no universal solution, and existing approaches rely on manually specified goal spaces or subtasks.
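As a hedged illustration of why such subgoals help when rewards are sparse, the toy Python snippet below gives the worker a dense reward for moving its latent state toward the manager's goal, while the task reward arrives only at the very end; the function names, the cosine-similarity reward, and the random vectors are assumptions, not any published formulation.

```python
# Toy contrast between a dense intrinsic (subgoal) reward and a sparse task reward.
import numpy as np

def worker_reward(state, goal):
    """Dense intrinsic reward: similarity between the current latent state and the goal."""
    denom = np.linalg.norm(state) * np.linalg.norm(goal) + 1e-8
    return float(state @ goal / denom)

def task_reward(task_done, task_succeeded):
    """Sparse extrinsic reward: granted only when the whole task finishes successfully."""
    return 1.0 if (task_done and task_succeeded) else 0.0

# The worker gets feedback at every step; the task itself pays off only once.
rng = np.random.default_rng(1)
goal = rng.normal(size=8)
for step in range(5):
    state = rng.normal(size=8)
    print(step, round(worker_reward(state, goal), 3), task_reward(step == 4, True))
```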
The Webb Telescope marks a new era for astronomy and science. Scientists have no idea what they might discover with Webb. But with five observations taken in just one week of operation, they have already found several cosmic Easter eggs that defy expectations — including a few complete and utter unknowns.
Let’s take a tour through six of the most revelatory and intriguing Easter eggs hidden in the first five James Webb Space Telescope observations.
How many galaxies can you see in this image? Hey, you’re already looking at an exploding star — what more can you want? How about some galaxies hiding in the chaos?
The James Webb Space Telescope has made headlines this week with its ability to look deeper into the universe than ever before, but it will also be used to look at some targets closer to home. As well as distant galaxies and far-off exoplanets, Webb will also be used to investigate objects right here in our solar system — and one of the first research projects it will be used for will study Jupiter and its rings and moons.
Now, NASA and its partners, the European Space Agency and the Canadian Space Agency, have demonstrated how capable Webb is of studying Jupiter by releasing the first images it has taken of targets in our solar system. The images show the iconic stripes of Jupiter as seen in the infrared, and also pick out several of Jupiter's moons, including Europa, which is clearly visible.
“Combined with the deep field images released the other day, these images of Jupiter demonstrate the full grasp of what Webb can observe, from the faintest, most distant observable galaxies to planets in our own cosmic backyard that you can see with the naked eye from your actual backyard,” said one of the researchers who worked on the images, Bryan Holler of the Space Telescope Science Institute, in a statement.
“My team has spotted something unexpected: It’s a piece of a thermal blanket that they think may have come from my descent stage, the rocket-powered jet pack that set me down on landing day back in 2021,” Perseverance team members wrote on Twitter at the time.
An interesting piece of debris that could have come from the rover.
Now, the crafty rover has spotted yet another interesting piece of debris, this time in the shape of spaghetti. The image was taken on Tuesday, and it’s got researchers confused.
Vertical farming saves water, land, and energy — and it could be how we grow food on Mars.
The levitating rover concept is a wild new idea to explore airless worlds.
Researchers at the Massachusetts Institute of Technology (MIT) want to engineer a new kind of hovering spacecraft that can operate without air.
The entirely theoretical cloud of icy space debris marks the frontiers of our solar system.
The Oort cloud represents the very edges of our solar system. This thinly dispersed collection of icy material starts roughly 200 times farther from the sun than Pluto and stretches halfway to the sun's nearest stellar neighbor, Alpha Centauri. We know so little about it that its very existence is theoretical — the material that makes up this cloud has never been glimpsed by even our most powerful telescopes, except when some of it breaks free.
“For the foreseeable future, the bodies in the Oort cloud are too far away to be directly imaged,” says a spokesperson from NASA. “They are small, faint, and moving slowly.”
Aside from theoretical models, most of what we know about this mysterious region comes from the visitors that occasionally swing our way on orbits of 200 years or more: long-period comets. “[The comets] have very important information about the origin of the solar system,” says Jorge Correa Otto, a planetary scientist at Argentina's National Scientific and Technical Research Council (CONICET).