
Most companies working on autonomous vehicles consider lidar sensors mandatory for vehicles to safely navigate alone and distinguish objects such as pedestrians and cyclists. But the best existing sensors are bulky, extremely expensive, and in short supply as demand surges (see “Self-Driving Cars’ Spinning Laser Problem”). Alphabet and Uber have both said they were forced to invent their own, better-performing sensors from scratch to make self-driving vehicles viable. Luminar hopes to serve automakers that would rather not go to that effort.

Russell doesn’t have a college degree—he dropped out of Stanford in return for a $100,000 check under a program started by venture capitalist Peter Thiel to encourage entrepreneurship. But Russell says a (short) lifetime of tinkering and building with electronics helped him design a new lidar sensor that sees farther and in more detail than those on the market.


Artificial intelligence picks up racial and gender biases when learning language from text, researchers say. Without any supervision, a machine-learning algorithm learned to associate female names more strongly with family words than with career words, and to treat black-sounding names as more unpleasant than white-sounding ones.

For a study published today in Science, researchers tested the bias of a common AI model, and then matched the results against a well-known psychological test that measures bias in humans. The team replicated in the algorithm all the psychological biases they tested, according to study co-author Aylin Caliskan, a post-doc at Princeton University. Because machine learning algorithms are so common, influencing everything from translation to scanning names on resumes, this research shows that the biases are pervasive, too.
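The bias test the study adapted works by comparing similarities between word vectors. The sketch below illustrates the idea with invented, three-dimensional toy vectors (the study itself used large pre-trained embeddings, not this data or code); a word's "association" is its mean similarity to one attribute set minus its mean similarity to another:

```python
# WEAT-style association sketch on made-up 3-d word vectors.
# Real tests (Caliskan et al., Science 2017) use full embeddings
# such as GloVe or word2vec; everything below is illustrative only.
import math

def cosine(u, v):
    """Cosine similarity between two vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def association(w, attr_a, attr_b):
    """Mean similarity of w to attribute set A minus attribute set B."""
    return (sum(cosine(w, a) for a in attr_a) / len(attr_a)
            - sum(cosine(w, b) for b in attr_b) / len(attr_b))

# Toy vectors: "family" words cluster near one axis, "career" near another.
family = [[0.9, 0.1, 0.0], [0.8, 0.2, 0.1]]
career = [[0.1, 0.9, 0.0], [0.2, 0.8, 0.1]]
amy  = [0.7, 0.3, 0.1]   # hypothetical female-name vector
john = [0.3, 0.7, 0.1]   # hypothetical male-name vector

print(association(amy, family, career) > 0)   # amy leans toward family words
print(association(john, family, career) < 0)  # john leans toward career words
```

A positive score means the name sits closer to the "family" words in the vector space; in real embeddings trained on web text, that gap is what the researchers measured.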

“Language is a bridge to ideas, and a lot of algorithms are built on language in the real world,” says Megan Garcia, director of New America’s California branch, who has written about this so-called algorithmic bias. “So unless an [algorithm] is making a decision based only on numbers, this finding is going to be important.”


Fascinating stuff!


NASA has new evidence that the most likely places to find life beyond Earth are Jupiter’s moon Europa and Saturn’s moon Enceladus. In terms of potential habitability, Enceladus in particular has almost all of the key ingredients for life as we know it, researchers said.

New observations of these active ocean worlds in our solar system, captured by two NASA missions, were presented in two separate studies at an announcement today at NASA headquarters in Washington.

Using a mass spectrometer, the Cassini spacecraft detected an abundance of hydrogen molecules in water plumes rising from the “tiger stripe” fractures in Enceladus’ icy surface. Saturn’s sixth-largest moon is an ice-encased world with an ocean beneath. The researchers believe that the hydrogen originated from a hydrothermal reaction between the moon’s ocean and its rocky core. If that is the case, the crucial chemical methane could be forming in the ocean as well.
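The reaction that makes the hydrogen finding so suggestive is methanogenesis, in which hydrogen combines with dissolved carbon dioxide to yield methane and water; on Earth, some microbes live off exactly this chemistry:

```latex
\mathrm{CO_2 + 4\,H_2 \longrightarrow CH_4 + 2\,H_2O}
```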

A proton beam can kill cancer cells, but the accelerators currently used for treatment are all of the circular kind. Linear accelerators (“linacs”) allow finer control of the beam; the energy can be varied rapidly to match a patient’s breathing, for example. But linacs take up a lot of space. Now researchers at CERN propose a design that could fit in a room of 10 m × 20 m, potentially making linacs practical for patient therapy.


New Horizons has made observations it was never tasked with making. It is the sort of serendipity that may lead to new and better astronomical missions to observe the distant cosmos from the outer fringes of the solar system, where the sky is darkest and astronomers would have the clearest, dust-free view of our own galaxy and of the optical light from the universe as a whole.


NASA’s New Horizons spacecraft has unexpectedly made observations of optical light from beyond our Milky Way galaxy. Astronomers are ecstatic and hoping for more.


While many scientists have shied away from explicitly political actions in recent decades, the community throughout history has spoken publicly on a wide variety of social, technological and ideological issues.

That has included everything from opposing fascism, nuclear proliferation and the Vietnam War to sitting on government panels that advise elected leaders on stem-cell research involving human embryos.


In U.S. history, scientists have been vocal about fascism, nuclear proliferation, the Vietnam War, stem cells and more.


After training a network of telescopes stretching from Hawaii to Antarctica to Spain on the heart of our galaxy for five nights running, astronomers said Wednesday they may have snapped the first-ever picture of a black hole.

It will take months to develop the image, but if scientists succeed the results may help peel back mysteries about what the universe is made of and how it came into being.

“Instead of building a telescope so big that it would probably collapse under its own weight, we combined eight observatories like the pieces of a giant mirror,” said Michael Bremer, an astronomer at the International Research Institute for Radio Astronomy (IRAM) and a project manager for the Event Horizon Telescope.
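Why the far-flung observatories matter comes down to a textbook relation: a telescope's angular resolution scales as observing wavelength divided by aperture diameter, so linking dishes across the planet gives an effective aperture the size of Earth. A back-of-envelope sketch (the figures are round approximations, not project numbers):

```python
# Diffraction-limited resolution ~ wavelength / aperture diameter.
# The Event Horizon Telescope observes at roughly 1.3 mm; linking
# observatories across the globe gives an Earth-sized baseline.
import math

WAVELENGTH_M = 1.3e-3        # observing wavelength, ~1.3 mm
EARTH_DIAMETER_M = 1.27e7    # longest possible Earth-bound baseline
RAD_TO_MICROARCSEC = math.degrees(1) * 3600 * 1e6

theta = WAVELENGTH_M / EARTH_DIAMETER_M      # resolution in radians
print(round(theta * RAD_TO_MICROARCSEC, 1))  # ~21 microarcseconds
```

Roughly 20 microarcseconds is fine enough to resolve the shadow of the black hole at the galactic center, which no single dish could come close to doing.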


I’ve been reading about Gcam, the Google X project that was first sparked by the need for a tiny camera to fit inside Google Glass before evolving to power the world-beating camera of the Google Pixel. Gcam embodies an atypical approach to photography, seeking software solutions for what have traditionally been hardware problems. Others have tried this before, but their efforts always seemed like inchoate gimmicks, so the unprecedented thing about Gcam is that it actually works. The most exciting thing, though, is what it portends.

I think we’ll one day be able to capture images without any photographic equipment at all.

Now I know this sounds preposterous, but I don’t think it’s any more so than the internet or human flight might have once seemed. Let’s consider what happens when we tap the shutter button on our cameraphones: light information is collected and focused by a lens onto a digital sensor, which converts the photons it receives into data that the phone can understand, and the phone then converts that into an image on its display. So we’re really just feeding information into a computer.
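That last step, turning photon counts into an image, can be caricatured in a few lines. This is a deliberately simplified sketch with made-up numbers: real sensors involve color filters, demosaicing, noise, and tone mapping, none of which is modeled here:

```python
# Toy model of the sensor step: photon counts in, displayable pixels out.
# FULL_WELL is a hypothetical saturation limit; GAMMA is the standard
# display gamma encoding. Real camera pipelines are far more involved.

FULL_WELL = 1000  # max photons a pixel can record before clipping
GAMMA = 1 / 2.2   # gamma encoding exponent for display

def photons_to_pixel(count):
    """Convert a raw photon count to an 8-bit display value."""
    clipped = min(count, FULL_WELL)   # sensor saturation
    linear = clipped / FULL_WELL      # normalize to [0, 1]
    encoded = linear ** GAMMA         # gamma-encode for display
    return round(encoded * 255)

sensor = [[0, 250, 500], [750, 1000, 1500]]  # made-up 2x3 photon counts
image = [[photons_to_pixel(c) for c in row] for row in sensor]
print(image)  # → [[0, 136, 186], [224, 255, 255]]
```

Once you see the camera as a function from incoming light to numbers, it becomes plausible that better software could do more of the work that optics and silicon do today.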
