
Researchers have shown that 3D laser printing can be used to fabricate a high-quality, complex polymer optical device directly on the end of an optical fiber. This type of micro-optical device—which has details smaller than the diameter of a human hair—could provide an extremely compact and inexpensive way to tailor light beams for a variety of applications.

“Communication technologies, the internet and many other applications are based on optical fibers,” said research team leader Shlomi Lightman from Soreq Nuclear Research Center in Israel. “When light comes out of the fiber, large bulky optical elements are typically used to route it to the next location. Our approach minimizes both the size and cost for this process by integrating the routing process into the fiber itself.”

In the journal Optics Letters, Lightman and colleagues describe how they fabricated the tiny multi-component beam shaper directly onto a fiber. The device turns normal laser light into a twisted Bessel beam that carries orbital angular momentum and doesn’t expand in space the way typical laser beams do.
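For readers unfamiliar with the physics, here is a minimal numerical sketch, not taken from the paper, of the transverse field of a higher-order Bessel beam: the radial profile is a Bessel function J_ell, and the helical phase factor exp(i·ell·phi) is what carries the orbital angular momentum. The topological charge ell and radial wavenumber k_r below are illustrative values, not the device’s actual parameters.

```python
import numpy as np
from scipy.special import jv  # Bessel function of the first kind

# Transverse plane, in micrometers (illustrative grid)
x = np.linspace(-50, 50, 512)
X, Y = np.meshgrid(x, x)
r = np.hypot(X, Y)
phi = np.arctan2(Y, X)

ell = 2    # topological charge: orbital angular momentum per photon, in units of hbar
k_r = 0.5  # radial wavenumber in rad/um (assumed value for illustration)

# Higher-order Bessel beam: J_ell radial profile times a helical ("twisted") phase
field = jv(ell, k_r * r) * np.exp(1j * ell * phi)
intensity = np.abs(field) ** 2  # ring-shaped, with a dark core on axis for ell != 0
```

An ideal Bessel profile keeps the same cross-section as it propagates, which is why such beams are described as non-diffracting over a finite range.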

Microcontrollers, miniature computers that can run simple commands, are the basis for billions of connected devices, from internet-of-things (IoT) devices to sensors in automobiles. But cheap, low-power microcontrollers have extremely limited memory and no operating system, making it challenging to train artificial intelligence models on “edge devices” that work independently from central computing resources.

Training a machine-learning model on an intelligent edge device allows it to adapt to new data and make better predictions. For instance, training a model on a smart keyboard could enable the keyboard to continually learn from the user’s writing. However, the training process requires so much memory that it is typically done using powerful computers at a data center, before the model is deployed on a device. This is more costly and raises privacy issues, since user data must be sent to a central server.

To address this problem, researchers at MIT and the MIT-IBM Watson AI Lab have developed a new technique that enables on-device training using less than a quarter of a megabyte of memory. Other training solutions designed for connected devices can use more than 500 megabytes of memory, greatly exceeding the 256-kilobyte capacity of most microcontrollers (there are 1,024 kilobytes in one megabyte).
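As a rough illustration of where that memory goes, and why updating only a small subset of a model shrinks the footprint, consider the generic sketch below. It is not the MIT team’s algorithm; the layer sizes, int8 weights, and float32 optimizer state are assumptions chosen only to make the accounting concrete.

```python
import numpy as np

# Toy model: a few layers, weights stored as int8 to mimic a quantized
# microcontroller deployment (illustrative, not the paper's exact method).
layers = {
    "conv1": np.zeros(3 * 3 * 16 * 32, dtype=np.int8),
    "conv2": np.zeros(3 * 3 * 32 * 64, dtype=np.int8),
    "head":  np.zeros(64 * 10, dtype=np.int8),
}

# On-device training can budget memory by updating only a small subset of
# tensors; frozen tensors need no gradients or optimizer state at all.
trainable = {"head"}

def training_memory_bytes(layers, trainable):
    total = 0
    for name, w in layers.items():
        total += w.nbytes            # weights are always resident
        if name in trainable:
            total += 2 * w.size * 4  # float32 gradient + momentum buffers
    return total

print(training_memory_bytes(layers, trainable))  # ~28 KB, well under a 256 KB budget
```

Freezing a tensor removes its gradient and optimizer buffers from the budget entirely, which is why selective updates are so much cheaper than full fine-tuning.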


The open-source Linux operating system is an essential component of cloud and enterprise application delivery. In fact, every major cloud service, even Microsoft’s, offers Linux-based compute resources, and Linux is often the default choice for embedded and internet-of-things (IoT) devices. Among the major Linux distribution vendors today are IBM’s Red Hat business unit, German vendor SUSE and Canonical, which develops the Ubuntu Linux distribution.

The market for Linux is forecast to grow to $22.15 billion by 2029, according to Fortune Business Insights, up from $6.27 billion in 2022.
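Those two figures imply a compound annual growth rate of roughly 20 percent, which a few lines of Python can verify:

```python
# Implied compound annual growth rate from the cited market figures ($ billions)
start, end, years = 6.27, 22.15, 2029 - 2022
cagr = (end / start) ** (1 / years) - 1
print(f"Implied CAGR: {cagr:.1%}")  # roughly 19.8% per year
```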

Known for its electric vehicles, Tesla Inc. (NASDAQ: TSLA) also has a solar power division. Customers who bought solar roofs in Florida may be thanking the company after the damage left by Hurricane Ian.

What Happened: Hurricane Ian made landfall in Florida and caused severe damage to the region. Benzinga previously reported the impact could be $258 billion in replacement costs in one region and another $149 billion in the Tampa Bay area.

The impact could be hundreds of millions of dollars for insurance companies as well.

The streets in this meticulously planned neighborhood were designed to flood so houses don’t. Native landscaping along roads helps control storm water. Power and internet lines are buried to avoid wind damage. This is all in addition to being built to Florida’s robust building codes.

Some residents, like Grande, installed more solar panels on their roofs and added battery systems as an extra layer of protection from power outages. Many drive electric vehicles, taking full advantage of solar energy in the Sunshine State.

Climate resiliency was built into the fabric of the town with stronger storms in mind.

Digitalization generated 4 percent of total greenhouse gas emissions in 2020.

More than half of the digital data firms generate is collected, processed, and stored for single-use purposes. Often, it is never reused. This could be your multiple near-identical images held on Google Photos or iCloud, a business’s outdated spreadsheets that will never be opened again, or data from internet-of-things sensors that serves no ongoing purpose.

This “dark data” is anchored to the real world by the energy it requires. Even data that is stored and never used again takes up space on servers — typically huge banks of computers in warehouses. Those computers and those warehouses all use lots of electricity.



This is a significant energy cost that is hidden in most organizations. Maintaining an effective organizational memory is a challenge, but at what cost to the environment?
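The scale of that hidden cost is straightforward to estimate in back-of-envelope form. Every number in the sketch below is an assumption chosen for illustration; real storage energy intensity and grid carbon intensity vary widely by data center and region.

```python
# Hypothetical dark-data footprint for one firm; all parameters are assumptions.
stored_tb = 100.0        # terabytes of data kept but never reused
kwh_per_tb_year = 10.0   # assumed energy to keep 1 TB online for a year (servers + cooling)
kg_co2_per_kwh = 0.4     # assumed grid carbon intensity

annual_kwh = stored_tb * kwh_per_tb_year
annual_co2_kg = annual_kwh * kg_co2_per_kwh
print(f"{annual_kwh:.0f} kWh/year, about {annual_co2_kg:.0f} kg of CO2/year")
```

Even with modest assumptions like these, data that no one will ever open again keeps drawing power year after year.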

Artificial Intelligence (AI), a term first coined at a Dartmouth workshop in 1956, has seen several boom and bust cycles over the last 66 years. Is the current boom different?

The most exciting advance in the field since 2017 has been the development of “Large Language Models,” giant neural networks trained on massive databases of text on the web. Still highly experimental, Large Language Models haven’t yet been deployed at scale in any consumer product — smart/voice assistants like Alexa, Siri, Cortana, or the Google Assistant are still based on earlier, more scripted approaches.

Large Language Models do far better at routine language-processing tasks than their predecessors. Although not always reliable, they can give a strong impression of really understanding us and holding up their end of an open-ended dialog. Unlike previous forms of AI, which could only perform specific jobs involving rote perception, classification, or judgment, Large Language Models seem capable of much more, including possibly passing the Turing Test. Named after computing pioneer Alan Turing, the test posits that when an AI in a chat cannot be reliably distinguished from a human, it has achieved general intelligence.

But can Large Language Models really understand anything, or are they just mimicking the superficial “form” of language? What can we say about our progress toward creating real intelligence in a machine? What do “intelligence” and “understanding” even mean? Blaise Agüera y Arcas, a Fellow at Google Research, and Melanie Mitchell, the Davis Professor of Complexity at the Santa Fe Institute, take on these thorny questions in a wide-ranging presentation and discussion.

It offers services on all seven continents.

SpaceX has crossed the milestone of producing a million Starlink terminals, the company’s CEO Elon Musk confirmed on Twitter earlier today. It is a significant boost for the satellite internet business of the space company, which began accepting preorders only 19 months ago.



Satellite internet is a new way of connecting the world, one that can guarantee network coverage even in the remotest regions. Fiber- or cellular-based internet requires infrastructure to be built out to the last mile to ensure service; services like Starlink instead rely on a constellation of satellites in low Earth orbit that deliver internet directly from the sky.
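Altitude is why low Earth orbit matters here. A back-of-envelope comparison of the best-case round-trip light travel time, ignoring routing and slant-path geometry, shows why a LEO constellation at roughly 550 km (Starlink’s approximate shell altitude) can offer latency far closer to terrestrial broadband than a traditional geostationary satellite at 35,786 km:

```python
C_KM_PER_S = 299_792.458  # speed of light in vacuum

def min_rtt_ms(altitude_km: float) -> float:
    # Best-case round trip straight up to the satellite and back;
    # real latency adds processing, routing, and slant-path distance.
    return 2 * altitude_km / C_KM_PER_S * 1000

print(f"LEO (~550 km):    {min_rtt_ms(550):6.1f} ms")     # a few milliseconds
print(f"GEO (~35,786 km): {min_rtt_ms(35_786):6.1f} ms")  # roughly a quarter second
```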