
Check out this time-lapse showing the retraction of half of Platform C in High Bay 3 of the Vehicle Assembly Building today at NASA’s Kennedy Space Center. On March 17, NASA’s Space Launch System and Orion spacecraft will roll out to Launch Pad 39B for the wet dress rehearsal for Artemis I.

When Google unveiled its first autonomous cars in 2010, the spinning cylinder mounted on the roofs really stood out. It was the vehicle’s light detection and ranging (LiDAR) system, which worked like light-based radar. Together with cameras and radar, LiDAR mapped the environment to help these cars avoid obstacles and drive safely.

Since then, inexpensive chip-based cameras and radar have moved into the mainstream for collision avoidance and autonomous highway driving. Yet LiDAR navigation systems remain unwieldy mechanical devices that cost thousands of dollars.

That may be about to change, thanks to a new type of high-resolution LiDAR chip developed by Ming Wu, professor of electrical engineering and computer sciences and co-director of the Berkeley Sensor and Actuator Center at the University of California, Berkeley. The new design appears Wednesday, March 9, in the journal Nature.

The molecule is a precursor to organic molecules, which can be associated with life.


Astronomers spotted the largest molecule yet found in a planet-forming disk, which they say will tell us more about the origin of life.

The molecule, dimethyl ether, is a precursor to larger organic molecules that in some cases may be indicative of life. The team also may have found methyl formate, which is deemed a “building block” for constructing even larger organic molecules.

Samsung said on Monday that hackers breached its internal company data, gaining access to some source code for Galaxy-branded devices such as smartphones.

The statement from the South Korean electronics giant comes after the hacking group Lapsus$ claimed over the weekend, via its Telegram channel, that it had stolen 190 gigabytes of confidential Samsung source code.

Samsung’s statement did not name any specific hackers or specify precisely what data was stolen.

I’ve been trying to review and summarize Eliezer Yudkowsky’s recent dialogues on AI safety. Previously in this sequence: Yudkowsky Contra Ngo On Agents. Now we’re up to Yudkowsky contra Cotra on biological anchors, but before we get there we need to figure out what Cotra’s talking about and what’s going on.

The Open Philanthropy Project (“Open Phil”) is a big effective altruist foundation interested in funding AI safety. It’s got $20 billion, probably the majority of money in the field, so its decisions matter a lot and it’s very invested in getting things right. In 2020, it asked senior researcher Ajeya Cotra to produce a report on when human-level AI would arrive. Open Phil calls the resulting document “informal” — but it’s 169 pages long and likely to affect millions of dollars in funding, which some might describe as making it kind of formal. The report finds a 10% chance of “transformative AI” by 2031, a 50% chance by 2052, and an almost 80% chance by 2100.

Eliezer rejects their methodology and expects AI earlier (he doesn’t offer many numbers, but here he gives Bryan Caplan 50–50 odds on 2030, albeit not totally seriously). He made the case in his own very long essay, Biology-Inspired AGI Timelines: The Trick That Never Works, sparking a bunch of arguments and counterarguments and even more long essays.

Google has patented technology that will let users control its smartwatches and earbuds by simply touching their skin.

The patent, titled “Skin interface for Wearables: Sensor fusion to improve signal quality,” was spotted by the folks over at LetsGoDigital. It describes technology that lets users operate wearable devices with skin gestures.

Patent documents show that users can swipe or tap the skin near the wearables in order to control them. The gesture creates a mechanical wave that is picked up by the sensors in the wearables. The “Sensor Fusion” tech then combines this movement data collected from various sensors into an input command for the wearable.
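The patent itself doesn’t publish an algorithm, but the pipeline it describes — a skin tap or swipe produces a mechanical wave, multiple sensors pick it up, and the fused signal is mapped to a command — can be sketched in a few lines. The following is an illustrative toy only: the sensor names, the averaging step, the threshold, and the tap-vs-swipe rule are all assumptions, not Google’s patented method.

```python
# Toy sketch of multi-sensor gesture fusion (assumptions, not the patent's
# actual algorithm): average two correlated channels, then classify the
# gesture by how long the fused signal stays above a noise threshold.

def fuse(accel: list[float], mic: list[float]) -> list[float]:
    """Combine per-sample magnitudes from two sensors into one signal.

    Averaging correlated channels preserves the gesture's mechanical
    wave while suppressing noise that appears in only one sensor.
    """
    return [(a + m) / 2 for a, m in zip(accel, mic)]

def classify(fused: list[float], threshold: float = 0.5) -> str:
    """Map the fused signal to an input command by active duration.

    A tap excites the sensors briefly; a swipe keeps the signal above
    the threshold for many consecutive samples.
    """
    active = sum(1 for x in fused if x > threshold)
    if active == 0:
        return "none"
    return "tap" if active <= 3 else "swipe"

# A short burst reads as a tap; a sustained wave reads as a swipe.
tap = classify(fuse([0.9, 0.8, 0.1, 0.0], [0.7, 0.9, 0.2, 0.1]))
swipe = classify(fuse([0.8] * 10, [0.7] * 10))
```

A real implementation would of course run on streaming sensor data with calibration and per-user thresholds; the point here is only the shape of the idea — several noisy channels fused into one decision.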