
As any driver knows, accidents can happen in the blink of an eye—so when it comes to the camera system in autonomous vehicles, processing time is critical. The time that it takes for the system to snap an image and deliver the data to the microprocessor for image processing could mean the difference between avoiding an obstacle or getting into a major accident.

In-sensor processing, in which important features are extracted from raw data by the image sensor itself instead of a separate microprocessor, can speed up that pipeline. To date, demonstrations of in-sensor processing have been limited to emerging research materials which are, at least for now, difficult to incorporate into commercial systems.

Now, researchers from the Harvard John A. Paulson School of Engineering and Applied Sciences (SEAS) have developed the first in-sensor processor that could be integrated into commercial silicon imaging sensor chips, known as complementary metal-oxide-semiconductor (CMOS) image sensors, which are used in nearly all commercial devices that need to capture visual information, including smartphones.
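The payoff of extracting features at the sensor is easiest to see in terms of how much data has to leave the chip. The sketch below is purely illustrative and is not a model of the Harvard device: it applies a simple Laplacian edge kernel to a synthetic frame and keeps only the strong responses, so far fewer values would need to be handed to the microprocessor than if the full raw frame were shipped out.

```python
# Illustrative only: compare shipping a raw frame off-sensor with shipping
# just the edge features an in-sensor processor might extract. Frame size,
# kernel, and threshold are made up for the example.
import numpy as np

def convolve2d_valid(frame: np.ndarray, kernel: np.ndarray) -> np.ndarray:
    """Plain 2D convolution with 'valid' padding, standing in for on-sensor compute."""
    h, w = frame.shape
    kh, kw = kernel.shape
    out = np.zeros((h - kh + 1, w - kw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(frame[i:i + kh, j:j + kw] * kernel)
    return out

# Synthetic scene: one bright object on a dark background.
frame = np.zeros((120, 160))
frame[40:80, 60:100] = 255.0

# Simple Laplacian edge kernel; responses are large only at object boundaries.
laplacian = np.array([[0, -1, 0],
                      [-1, 4, -1],
                      [0, -1, 0]], dtype=float)

edges = convolve2d_valid(frame, laplacian)
events = np.argwhere(np.abs(edges) > 100)  # (row, col) of strong edge responses

print("values sent off-sensor, raw frame:  ", frame.size)    # 19,200 pixels
print("values sent off-sensor, edge events:", events.size)   # a few hundred values
```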

The National Quantum Information Science Research Centers (NQISRCs) integrate state-of-the-art DOE facilities, preeminent talent at national laboratories and U.S. universities, and the enterprising ingenuity of U.S. technology companies.

As a result, the centers are pushing the frontier of what’s possible in quantum computers, sensors, devices, materials and much more.

In a new medical breakthrough, scientists have successfully grown a synthetic mouse embryo without sperm or a womb. They used stem cells from mice to recreate the first stage of life and developed an embryo with a brain, a beating heart, and the foundations of other organs.

The natural process of life was mimicked in the lab without eggs or sperm, using only the body's master cells, which can develop into almost any cell type in the body. The embryo developed to the equivalent of 8½ days after fertilization and contained the same structures as a natural one.

The study, published in the journal Nature, states that the result demonstrates the ability of embryonic stem cells and two types of extra-embryonic stem cells to self-organize and reconstitute mammalian development. The researchers induced the expression of a particular set of genes, established a unique environment for their interactions, and got the stem cells to 'talk' to each other.

Watch live as our mega Moon rocket launches an uncrewed Orion spacecraft on a six-week mission around the Moon and back to Earth. During #Artemis I, Orion will lift off aboard the Space Launch System (SLS) rocket, and travel 280,000 miles (450,000 km) from Earth and 40,000 miles (64,000 km) beyond the far side of the Moon, carrying science and technology payloads to expand our understanding of lunar science, technology developments, and deep space radiation.

Liftoff from Launch Pad 39B at NASA’s Kennedy Space Center in Florida is currently targeted for 8:33 a.m. EDT (12:33 UTC) Monday, Aug. 29, at the start of a two-hour launch window.

Through Artemis missions, NASA will land the first woman and the first person of color on the Moon, paving the way for a long-term lunar presence and serving as a steppingstone to send astronauts to Mars. We are going.

More: www.nasa.gov/artemis

The Space Launch System is an American super heavy-lift expendable launch vehicle under development by NASA since 2011. As of April 2022, the first launch is scheduled for no earlier than August 2022, pending the success of a wet dress rehearsal test.


Meta CEO Mark Zuckerberg outlined the company’s approach to neural interface technology — tech which lets you control technology with your mind — in an interview on podcast The Joe Rogan Experience.

Zuckerberg said Meta is researching neural interface tech as part of its push into the metaverse.

He said the company is primarily focused on tech that can receive signals from the brain but does not send any information back to it.


Is it possible to make a brand new baby universe in the laboratory? How would we do it? And if we made one, what would we do with it?


A nice introduction to the topic is this paper: https://arxiv.org/abs/1512.01819 which also contains the 10kg bound that I mention.

A black hole is matter and/or light crammed into such a tiny volume that nothing can escape. But, shortly after the big bang, the observable universe was that small. How did it escape?!
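To put the "tiny volume" claim in numbers, the relevant scale is the Schwarzschild radius r_s = 2GM/c^2: compress a mass M inside that radius and nothing can escape. A minimal Python sketch (the constants are standard; the figure used for the universe's mass is a rough order-of-magnitude estimate, not a value from the video):

```python
# Minimal sketch: Schwarzschild radius r_s = 2GM/c^2 for a few reference masses.
G = 6.674e-11   # gravitational constant, m^3 kg^-1 s^-2
c = 2.998e8     # speed of light, m/s

def schwarzschild_radius(mass_kg: float) -> float:
    """Radius (in meters) below which a given mass cannot avoid forming a black hole."""
    return 2 * G * mass_kg / c**2

for label, mass_kg in [
    ("Earth", 5.97e24),                                      # ~9 mm
    ("Sun", 1.99e30),                                        # ~3 km
    ("ordinary matter in the observable universe", 1.5e53),  # rough estimate
]:
    print(f"{label}: r_s ~ {schwarzschild_radius(mass_kg):.2e} m")
```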


Foresight Existential Hope Group.
Program & apply to join: https://foresight.org/existential-hope/

In the Existential Hope podcast (https://www.existentialhope.com), we invite scientists to speak about long-termism. Each month, we drop an episode in which we interview a visionary scientist to discuss the science and technology that can accelerate humanity towards desirable outcomes.

Xhope Special with Foresight Fellow Morgan Levine.

Morgan Levine is a ladder-rank Assistant Professor in the Department of Pathology at the Yale School of Medicine and a member of both the Yale Combined Program in Computational Biology and Bioinformatics and the Yale Center for Research on Aging. Her work relies on an interdisciplinary approach, integrating theories and methods from statistical genetics, computational biology, and mathematical demography to develop biomarkers of aging for humans and animal models using high-dimensional omics data. As PI or co-investigator on multiple NIH-, foundation-, and university-funded projects, she has extensive experience using systems-level and machine learning approaches to track epigenetic, transcriptomic, and proteomic changes with aging.
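As a rough illustration of the kind of machine-learning approach described above, and not Levine's actual pipeline, an epigenetic "clock" is commonly built by regressing age on high-dimensional methylation data with a sparse penalized model. A minimal sketch on synthetic data:

```python
# Minimal sketch of an epigenetic-clock-style age predictor on synthetic data.
# This illustrates the general approach only; the data and parameters are made up.
import numpy as np
from sklearn.linear_model import ElasticNetCV
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

n_samples, n_cpgs = 300, 2000                         # samples x methylation sites (CpGs)
X = rng.uniform(0.0, 1.0, size=(n_samples, n_cpgs))   # fake methylation beta values

# Pretend a small subset of sites drifts with age, plus measurement noise.
true_weights = np.zeros(n_cpgs)
true_weights[:50] = rng.normal(0, 10, size=50)
age = 40 + X @ true_weights + rng.normal(0, 3, size=n_samples)

X_train, X_test, y_train, y_test = train_test_split(X, age, random_state=0)

# Elastic net selects a sparse set of informative sites out of thousands.
clock = ElasticNetCV(cv=5, random_state=0).fit(X_train, y_train)
pred = clock.predict(X_test)

print("selected sites:", int(np.sum(clock.coef_ != 0)))
print("mean absolute error (years):", float(np.mean(np.abs(pred - y_test))))
```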