
A joint research team from the Hong Kong University of Science and Technology (HKUST) and the University of Tokyo has discovered an unusual topological aspect of sodium chloride, commonly known as table salt. The finding will not only help explain the mechanism behind salt’s dissolution and formation, but may also pave the way for the future design of nanoscale conducting quantum wires.

There is a whole variety of advanced materials in our daily life, and many gadgets and technologies are created by assembling different materials. A cellphone, for example, combines many different substances: glass for the screen, aluminum alloy for the frame, and metals such as gold, silver, and copper for its internal wiring. But nature has its own ingenious way of ‘cooking’ different properties into one wonder material, known as a ‘topological material’.

Topology, as a mathematical concept, studies which aspects of an object remain unchanged under smooth deformation. For instance, we can squeeze, stretch, or twist a T-shirt, but the number of its openings will stay at four so long as we do not tear it. The discovery of topological phases of matter, highlighted by the 2016 Nobel Prize in Physics, suggests that certain quantum materials are inherently a combination of electrical insulators and conductors. This can force a conducting boundary to appear even when the bulk of the material is insulating. Such materials are classified as neither metals nor insulators, but as a natural assembly of the two.
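One way to make the idea concrete, using a standard textbook example rather than anything from the salt study: in a two-dimensional topological insulator, the Hall conductance is fixed by an integer invariant (the Chern number), so it cannot change under smooth deformations, and any boundary where that integer changes must host conducting states.

```latex
% Quantized Hall conductance (standard result, not from the article):
% C is the Chern number, an integer topological invariant of the bulk bands.
% Because C can only jump, not drift, a boundary between regions with
% different C is forced to carry conducting edge states.
\sigma_{xy} = C \,\frac{e^{2}}{h}, \qquad C \in \mathbb{Z}
```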

If you download music online, you can get accompanying information embedded into the digital file that might tell you the name of the song, its genre, the featured artists on a given track, the composer, and the producer. Similarly, if you download a digital photo, you can obtain information that may include the time, date, and location at which the picture was taken. That led Mustafa Doga Dogan to wonder whether engineers could do something similar for physical objects. “That way,” he mused, “we could inform ourselves faster and more reliably while walking around in a store or museum or library.”
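In the photo case, that information is typically EXIF metadata stored inside the image file itself. A minimal Python sketch of reading it with the Pillow library (the file name is hypothetical):

```python
# Read the metadata embedded in a digital photo using Pillow.
from PIL import Image
from PIL.ExifTags import TAGS

img = Image.open("photo.jpg")            # hypothetical file
exif = img.getexif()                     # EXIF block embedded in the file
for tag_id, value in exif.items():
    name = TAGS.get(tag_id, tag_id)      # map numeric tag IDs to readable names
    print(f"{name}: {value}")            # e.g. DateTime, Make, Model, GPSInfo
```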

The idea, at first, was a bit abstract for Dogan, a 4th-year Ph.D. student in the MIT Department of Electrical Engineering and Computer Science. But his thinking solidified in the latter part of 2020 when he heard about a new smartphone model with a camera that utilizes the infrared (IR) range of the electromagnetic spectrum that the naked eye can’t perceive. IR light, moreover, has a unique ability to see through certain materials that are opaque to visible light. It occurred to Dogan that this feature, in particular, could be useful.

The concept he has since come up with, while working with colleagues at MIT’s Computer Science and Artificial Intelligence Laboratory (CSAIL) and a research scientist at Facebook, is called InfraredTags. In place of the standard barcodes affixed to products, which can be removed, detached, or become otherwise unreadable over time, these tags are unobtrusive (because they are invisible) and far more durable, since they are embedded within the interior of objects fabricated on standard 3D printers.
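As a rough illustration of the reading side only, here is a sketch that assumes the embedded tag is a standard 2D code showing up as a faint pattern in an infrared camera frame; the file name and the choice of OpenCV’s QR decoder are assumptions for the sketch, not the authors’ actual pipeline.

```python
# Sketch: decode a 2D code captured by an IR camera (illustrative only).
import cv2

ir_frame = cv2.imread("ir_capture.png", cv2.IMREAD_GRAYSCALE)  # hypothetical IR image
ir_frame = cv2.equalizeHist(ir_frame)   # boost the faint contrast seen through the shell
data, points, _ = cv2.QRCodeDetector().detectAndDecode(ir_frame)
if data:
    print("Decoded tag:", data)
else:
    print("No tag found in this frame")
```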

Researchers at the Department of Bio and Brain Engineering of KAIST, one of Korea’s leading science and technology universities, have developed a new artificial intelligence-powered light field camera that can read 3D facial expressions.

The camera uses infrared light to read facial expressions. Professors Ki-Hun Jeong and Doheon Lee led the research team that developed this artificial intelligence-enabled technology.

The newly developed light-field camera places micro-lens arrays in front of the image sensor, allowing it to capture both the spatial and the directional information of light in a single shot while remaining small enough to fit into a smartphone.
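To sketch what capturing spatial and directional information in a single shot means: each micro-lens covers a small patch of sensor pixels, and selecting the same pixel offset under every lens reconstructs one directional (sub-aperture) view. The snippet below is a generic illustration with assumed dimensions, not the KAIST team’s processing pipeline.

```python
# Extract a sub-aperture view from a raw lenslet (light-field) image.
import numpy as np

LENS_PITCH = 14                            # assumed pixels per micro-lens (hypothetical)
raw = np.random.rand(14 * 200, 14 * 300)   # stand-in for a raw lenslet capture

def subaperture_view(raw, u, v, pitch=LENS_PITCH):
    """Return the view formed by pixel offset (u, v) under every micro-lens."""
    return raw[u::pitch, v::pitch]

center_view = subaperture_view(raw, LENS_PITCH // 2, LENS_PITCH // 2)
print(center_view.shape)                   # one spatial sample per micro-lens: (200, 300)
```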

Metalenz, a startup that exited stealth a year ago, previously announced that it was commercializing waveguides composed of silicon nanostructures as an alternative to traditional optics for use in mobile devices.

The company recently began a partnership with STMicroelectronics to move its technology into mass production and expects to be shipping imaging packages sometime in the second quarter of this year, according to CEO Robert Devlin.

Google’s AR headset, internally codenamed Project Iris, is expected to be released in 2024. The device uses “outward-facing cameras to blend computer graphics with a video feed of the real world, creating a more immersive, mixed reality experience than existing AR glasses.” The hardware is “powered by a custom Google processor, like its newest Google Pixel smartphone, and runs on Android, though recent job listings indicate that a unique OS is in the works.”

Google Glass, the company’s prior foray into AR, didn’t gain widespread consumer interest or adoption. The Verge says that work on the new project has recently begun to pick up speed. As of now, there isn’t a “clearly defined go-to-market strategy.” According to The Verge, Google is keeping the project secret, requiring “special keycard access” and “non-disclosure agreements.”

Facebook said it would hire around 10,000 workers around the world to build the metaverse and related products. A search on LinkedIn’s job board for “metaverse” shows thousands of listings. People looking for fast-growing opportunities may want to consider pivoting into virtual and augmented reality and other metaverse-related roles.

Content warning: this story contains descriptions of abusive language and violence.

The smartphone app Replika lets users create chatbots, powered by machine learning, that can carry on almost-coherent text conversations. Technically, the chatbots can serve as something approximating a friend or mentor, but the app’s breakout success has resulted from letting users create on-demand romantic and sexual partners — a vaguely dystopian feature that’s inspired an endless series of provocative headlines.

Replika has also picked up a significant following on Reddit, where members post interactions with chatbots created on the app. A grisly trend has emerged there: users who create AI partners, act abusively toward them, and post the toxic interactions online.

What if plants could tell us when pests are attacking them, when they’re too dry, or when they need more fertilizer? One startup is genetically engineering farm plants so they can communicate in fluorescent colors. The result: a farmer’s phone, drone, or even satellite imagery can reveal what is happening across hundreds of acres of fields …

That leads to better food, fewer crop failures, and more revenue for farmers.
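A purely illustrative sketch of what the detection side could look like, assuming imagery with a channel that captures the engineered fluorescence signal; the band index and threshold are hypothetical and not InnerPlant’s actual method.

```python
# Flag field regions where the fluorescence channel exceeds a threshold.
import numpy as np

drone_image = np.random.rand(1024, 1024, 4)   # stand-in for a multispectral capture
FLUORESCENCE_BAND = 3                         # assumed band index (hypothetical)
THRESHOLD = 0.8                               # assumed "plant is signalling" level

signal = drone_image[:, :, FLUORESCENCE_BAND]
stressed = signal > THRESHOLD                 # boolean map of signalling plants
print(f"{stressed.mean():.1%} of the field is signalling stress")
```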

In this TechFirst with John Koetsier, we meet Shely Aronov, CEO and founder of InnerPlant, and chat about what plants say and how farmers can understand their messages.
