
Micro-sized cameras have great potential to spot problems in the human body and enable sensing for super-small robots, but past approaches captured fuzzy, distorted images with limited fields of view.

Now, researchers at Princeton University and the University of Washington have overcome these obstacles with an ultracompact camera the size of a coarse grain of salt. The new system can produce crisp, full-color images on par with a conventional compound camera lens 500,000 times larger in volume, the researchers reported in a paper published Nov. 29 in Nature Communications.

Enabled by a joint design of the camera’s hardware and computational processing, the system could enable minimally invasive endoscopy with medical robots to diagnose and treat diseases, and improve imaging for other robots with size and weight constraints. Arrays of thousands of such cameras could be used for full-scene sensing, turning surfaces into cameras.

Today’s quantum computers are complicated to build, difficult to scale up, and require temperatures colder than interstellar space to operate. These challenges have led researchers to explore the possibility of building quantum computers that work using photons—particles of light. Photons can easily carry information from one place to another, and photonic quantum computers can operate at room temperature, so this approach is promising. However, although people have successfully created individual quantum “logic gates” for photons, it’s challenging to construct large numbers of gates and connect them in a reliable fashion to perform complex calculations.

Now, Stanford University researchers have proposed a simpler design for photonic quantum computers using readily available components, according to a paper published Nov. 29 in Optica. Their proposed design uses a laser to manipulate a single atom that, in turn, can modify the state of the photons via a phenomenon called "quantum teleportation." The atom can be reset and reused for many quantum gates, eliminating the need to build multiple distinct physical gates and vastly reducing the complexity of building a quantum computer.

“Normally, if you wanted to build this type of quantum computer, you’d have to take potentially thousands of quantum emitters, make them all perfectly indistinguishable, and then integrate them into a giant photonic circuit,” said Ben Bartlett, a Ph.D. candidate in applied physics and lead author of the paper. “Whereas with this design, we only need a handful of relatively simple components, and the size of the machine doesn’t increase with the size of the quantum program you want to run.”

To persist, life must reproduce. Over billions of years, organisms have evolved many ways of replicating, from budding plants to sexual animals to invading viruses.

Now scientists have discovered an entirely new form of biological reproduction—and applied their discovery to create the first-ever, self-replicating living robots.

The same team that built the first living robots ("Xenobots," assembled from frog cells and reported in 2020) has discovered that these computer-designed and hand-assembled organisms can swim out into their tiny dish, find single cells, gather hundreds of them together, and assemble "baby" Xenobots inside their Pac-Man-shaped "mouths" that, a few days later, become new Xenobots that look and move just like themselves.

Circa 2019


Brazilian design studio Furf has designed furniture using a leather-like material made from leaves, developed by organic tannery Nova Kaeru.

Vegan, plastic-free leather alternatives are a booming industry at the moment. One of the most notable examples is Piñatex, made from leftover leaves after the pineapple harvest, but there are also leather-like materials made of anything from tree bark to fruit leftovers to mycelium (for an extensive list of options, click here).

Nova Kaeru's material, called beLEAF, is a leather-like material made from the elephant ear plant, a plant with very large leaves. The material has characteristics similar to animal leather, but the main difference, aside from being vegan, is that the CO2 emissions of its manufacturing process are offset by the carbon absorbed through planting and leaf growth.

Circa 2019


Cellular enhancement in banana leaves

Banana Leaf Technology started in 2010 when Tenith Adithyaa, then 11 years old, saw farmers in Southern India dump heaps of banana leaves as trash due to the lack of a preservation technology. The spark came when he asked himself, "Can these leaves be enhanced biologically?" By trial and error, he succeeded in preserving the leaves for about a year without using any chemicals. For four years, he refined his technology of cellular enhancement. He received his first international award for this technology in 2014, at the global invention fair in Texas.

Lytro’s Immerge light-field camera is meant for professional high-end VR productions. It may be a beast of a rig, but it’s capable of capturing some of the best looking volumetric video that I’ve had my eyes on yet. The company has revealed a major update to the camera, the Immerge 2.0, which, through a few smart tweaks, makes for much more efficient production and higher quality output.

Light-field specialist Lytro, which picked up a $60 million Series D investment earlier this year, is making impressive strides in its light-field capture and playback technology. The company is approaching light-field from both live-action and synthetic ends; last month Lytro announced Volume Tracer, a software which generates light-fields from pre-rendered CG content, enabling ultra-high fidelity VR imagery that retains immersive 6DOF viewing.

Immerge 2.0

On the live-action end, the company has been building a high-end light-field camera which they call Immerge. Designed for high-end productions, the camera is actually a huge array of individual lenses which all work in unison to capture light-fields of the real world.

I'm more optimistic about our future, but this information still needs to be taken into account, as the hard data and its interpretation are troubling in the extreme, ESPECIALLY for transhumanists like me and like many of you…



SpaceX’s newest drone ship is on its way out into the Atlantic Ocean for a Starlink mission that will break the company’s record for annual launch cadence.

Somewhat confusingly known as Starlink Shell 4 Launch 3 or Starlink 4–3, the batch of 53 laser-linked V1.5 satellites is scheduled to fly before Starlink 4–2 for unknown reasons, while Starlink 2–3 is likewise scheduled to fly before Starlink 2–2 on the West Coast. Regardless of the seemingly unstable launch order, perhaps related to the recent introduction of Starlink's new V1.5 satellite design, drone ship A Shortfall of Gravitas' (ASOG) November 27th departure from Port Canaveral confirms that SpaceX is more or less on track to launch Starlink 4–3 no earlier than (NET) 6:20 pm EST (23:20 UTC) on Wednesday, December 1st.

In a bit of a return to stride after launching 20 times in the first six months of 2021 but only three times in the entire third quarter, Starlink 4–3 is currently the first of four or even five SpaceX launches scheduled in the last month of the year. If successful, Starlink 4–3 will also set SpaceX up to cross a milestone unprecedented in the history of satellite launches.

Applications are now open for the role of ESA-sponsored research medical doctor at Concordia research station in Antarctica for the 2023 winter-over season. Do you have a medical degree, an interest in space exploration, and the fortitude to spend almost a year in isolation in the world's largest desert? Apply today for this unique post.

The blank backdrop

Located on the mountain plateau called Dome C in Antarctica, the French-Italian base is one of only three on the continent inhabited all year long.