
Despite being battered by Hurricane Maria and facing a decrease in funding from the U.S. government, the Arecibo radio telescope in Puerto Rico is still going strong, and is now up and running again following a series of repairs. With the near-Earth asteroid 3200 Phaethon having just flown by our planet, Arecibo has sent back what are reported to be the highest-resolution images of the asteroid to date, which help reveal some important details about the object.

According to a press release from NASA, the radar images were taken by the Arecibo Observatory Planetary Radar last Saturday, December 16, and generated the day after, as asteroid 3200 Phaethon had its close encounter with Earth. At the time of its closest approach, the object was only 1.1 million miles away from Earth, or less than five times the distance separating our planet from the moon. The images have an estimated resolution of about 250 feet per pixel, making them the best-quality photos of the asteroid currently available, Phys.org added.
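The distance comparison above is easy to verify. A quick check, using the commonly cited average Earth-moon distance of roughly 238,900 miles:

```python
# Sanity check of the distance comparison in the text: at closest approach,
# 3200 Phaethon was about 1.1 million miles away, versus an average
# Earth-moon distance of roughly 238,900 miles.
closest_approach_miles = 1.1e6
earth_moon_miles = 238_900  # average lunar distance

ratio = closest_approach_miles / earth_moon_miles
print(f"{ratio:.2f} lunar distances")  # about 4.60, i.e. less than five
```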

Based on Arecibo’s radar images, scientists believe that 3200 Phaethon is substantially larger than once estimated, with a diameter of approximately 3.6 miles, about 0.6 miles larger than previous studies had suggested. That also makes Phaethon the second-largest near-Earth object classified as a “potentially hazardous asteroid,” a comparatively large asteroid that orbits much closer to Earth than most others do. The images also suggest Phaethon has a spheroidal shape, with a number of peculiar physical features that scientists are still trying to understand in full. These features include a concave area believed to be several hundred meters wide, and a dark, crater-like area located near one of its poles.

Read more

Although solar panels might appear bright and shiny, in desert environments, where they are most frequently installed, layers of dust and other particles can quickly coat their surface. These coatings can affect the panels’ ability to absorb sunlight and drastically reduce the conversion of the sun’s rays into energy, making it necessary to periodically wash the panels with water. But often, in areas like Nevada, water resources are scarce.

Consequently, NEXUS scientists have turned their attention toward developing technologies for waterless cleaning. NASA has already been using such techniques to clean panels on its lunar and Mars missions, but the methods it developed prove too expensive for widespread public application. NEXUS scientist Biswajit Das of UNLV and his team aim to develop a water-free cleaning technology that will be cost-effective for large-scale photovoltaic generation, looking to nanotechnology, rather than water, to clean the panels. “Our mission is to develop a waterless, or at least a less-water cleaning technique to address the effect of dust on solar panels,” Das says. “Once developed, this method will significantly reduce water use for the future PV generation.”

Read more

A new technique for recording image information onto a surface allows a single space to contain multiple holographic snapshots, each revealed depending on how you look at it.

With this new research, the ability to cram numerous holograms onto the same material without loss of resolution could open the way to some fascinating new applications.

Holograms have been around for over half a century, serving as art, entertainment, and foils to counterfeiting. It’s been a hard and fast rule that no matter which way you view a hologram, the same object appears in three dimensions. Until now.

Read more

Even as autonomous robots get better at doing things on their own, there will still be plenty of circumstances where humans might need to step in and take control. New software developed by Brown University computer scientists enables users to control robots remotely using virtual reality, immersing users in a robot’s surroundings even when they are physically miles away.

The software connects a robot’s arms and grippers as well as its onboard cameras and sensors to off-the-shelf virtual reality hardware via the internet. Using handheld controllers, users can control the position of the robot’s arms to perform intricate manipulation tasks just by moving their own arms. Users can step into the robot’s metal skin and get a first-person view of the environment, or can walk around the robot to survey the scene in the third person—whichever is easier for accomplishing the task at hand. The data transferred between the robot and the virtual reality unit is compact enough to be sent over the internet with minimal lag, making it possible for users to guide robots from great distances.
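As a rough illustration of why the data stream can stay compact, consider what a single hand-controller update needs to carry: a position, an orientation, and a grip value. The sketch below is an assumption for illustration only, not Brown’s actual protocol; the field layout and message format are invented here.

```python
# Hedged sketch (NOT the Brown team's actual wire format): a single VR
# controller update packed into a fixed-size binary message. Even at a
# high update rate, messages this small add up to only a few kB/s.
import struct

# xyz position + wxyz orientation quaternion + grip value:
# 8 little-endian 32-bit floats = 32 bytes per update
POSE_FORMAT = "<8f"

def pack_pose(position, quaternion, grip):
    """Serialize one controller update into a compact 32-byte payload."""
    return struct.pack(POSE_FORMAT, *position, *quaternion, grip)

def unpack_pose(payload):
    """Recover (position, quaternion, grip) from a 32-byte payload."""
    values = struct.unpack(POSE_FORMAT, payload)
    return values[0:3], values[3:7], values[7]

msg = pack_pose((0.1, 0.2, 0.3), (1.0, 0.0, 0.0, 0.0), 0.5)
print(len(msg), "bytes per update")  # 32 bytes
```

At, say, 90 updates per second, that is under 3 kB/s per controller, which is consistent with the article’s point that the traffic is small enough to send over the internet with minimal lag.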

“We think this could be useful in any situation where we need some deft manipulation to be done, but where people shouldn’t be,” said David Whitney, a graduate student at Brown who co-led the development of the system. “Three examples we were thinking of specifically were in defusing bombs, working inside a damaged nuclear facility or operating the robotic arm on the International Space Station.”

Read more

Dec. 14 (UPI) — NASA scientists have found a planetary system with as many planets as our own.

“Scientists have found for the first time eight planets in a distant planetary system,” Paul Hertz, astrophysics division director at NASA Headquarters, said during a teleconference on Thursday that was live-streamed on NASA TV.

Astronomers were already aware of seven of the eight planets orbiting the star Kepler-90. The discovery of the new planet, Kepler-90i, was made possible by machine learning.
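For context on what such algorithms look for: a transiting planet causes small, periodic dips in a star's measured brightness. The toy detector below is a minimal illustrative sketch only; the actual discovery used a neural network trained by Google researchers, not simple thresholding, and all the numbers here are made up.

```python
# Minimal illustration (NOT the Google/NASA pipeline): a transiting planet
# produces small periodic dips in a star's light curve. Here we build a
# synthetic light curve and flag the dips with a naive threshold.
import random

random.seed(42)
n_samples = 1000
period = 200          # hypothetical number of samples between transits
transit_depth = 0.01  # hypothetical 1% brightness drop during transit

# Synthetic light curve: flux near 1.0 with noise, dipping every `period`
# samples for 5 samples at a time.
flux = []
for t in range(n_samples):
    noise = random.gauss(0, 0.001)
    dip = transit_depth if t % period < 5 else 0.0
    flux.append(1.0 + noise - dip)

# Naive detector: flag samples well below the baseline brightness.
threshold = 1.0 - transit_depth / 2
transits = [t for t, f in enumerate(flux) if f < threshold]
print(f"flagged {len(transits)} in-transit samples")
```

Real Kepler light curves are far noisier and the dips far shallower, which is precisely why a trained model can find signals, like Kepler-90i, that simple rules and human inspection missed.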

Read more

NASA will host a media teleconference at 1 p.m. EST Thursday, Dec. 14, to announce the latest discovery made by its planet-hunting Kepler space telescope. The discovery was made by researchers using machine learning techniques from Google. Machine learning is an approach to artificial intelligence that demonstrates new ways of analyzing Kepler data.

The briefing participants are:

Read more