
Wires and cables are not the only things that can get entangled: Plants, fungi, and bacteria can all exhibit filamentous or branching growth patterns that eventually become entangled too. Previous work with nonliving materials demonstrated that entanglement can produce unique and desirable material properties, but achieving entanglement requires meticulously engineered material structure and geometry. It has been unclear if the same rules apply to organisms, which, unlike nonliving systems, develop through a process of progressive growth. Through a blend of experiments and simulations, we show that growth easily produces entanglement.

Specifically, we find that treelike growth leads to branch arrangements that cannot be disassembled without breaking or deforming branches. Further, entanglement via growth is possible for a wide range of geometries. In fact, it appears to be largely insensitive to the geometry of branched trees but instead depends sensitively on how long the organism can keep growing. In other words, growing branched trees can entangle with almost any geometry if they keep growing for a long-enough time.

Entanglement via growth appears to be largely distinct from, and easier to achieve than, entanglement of nonliving materials. These observations may in part account for the broad prevalence of entanglement in biological systems, as well as inform recent experiments that observed the rapid evolution of entanglement, though much still remains to be discovered.

Since ChatGPT debuted in the fall of 2022, much of the interest in generative AI has centered on large language models (LLMs): the giant, compute-intensive models that power the chatbots and image generators seemingly everyone is using and talking about nowadays.

While there’s no doubt that LLMs produce impressive, human-like responses to most prompts, the reality is that most general-purpose LLMs lack deep domain knowledge in areas like health, nutrition, or cooking. Not that this has stopped people from using them anyway, sometimes with bad or even laughable results, as when we ask for a personalized nutrition plan or a recipe.

LLMs’ shortcomings in producing credible, trustworthy results in those specific domains have led to growing interest in what the AI community calls small language models (SLMs). What are SLMs? Essentially, they are smaller, simpler language models that require less computational power and fewer lines of code, and they are often specialized in their focus.

Fusion energy has long been hailed as the holy grail of clean power because of its potential to deliver a nearly limitless supply. But reality has trailed that promise for decades, with billions of dollars in research yielding few breakthroughs. Now there’s optimism that is about to change, partly because of new startups funded by the likes of Sam Altman, Jeff Bezos, and Bill Gates.

Yahoo Finance went inside the country’s largest magnetic fusion facility for an exclusive look at the challenges of bringing this technology to commercial use, for the latest episode of NEXT.

“The race is on to actually see who can develop this and who can get it to the masses the fastest,” said David Callaway, former editor-in-chief of USA Today and founder of Callaway Climate Insights, a news and information service focused on the business of climate change.

Talk about the call coming from inside the house!

In an interview with The Financial Times, Google DeepMind CEO Demis Hassabis likened the frothiness surrounding the AI gold rush to the crypto industry’s high-dollar funding race, saying that the many billions being funneled into AI companies and projects bring a “bunch of hype and maybe some grifting and some other things that you see in other hyped-up areas, crypto or whatever.”

“Some of that has now spilled over into AI,” the CEO added, “which I think is a bit unfortunate.”