
A new study in the Journal of Perinatology finds that when it comes to diagnosing early-onset infant sepsis, how blood samples are collected matters more than how many cultures are taken.

A team led by Yale Pediatrics’ Noa Fleiss, MD, finds that obtaining more than one blood culture doesn’t significantly…

Clinicians diagnose neonatal sepsis by taking a blood culture. New research evaluates the factors that influence the diagnostic utility of blood samples drawn from infants.

Despite having been in public beta mode for nearly a year, Google’s search AI is still spitting out confusing and often incorrect answers.

As the Washington Post found when assessing Google’s Search Generative Experience, or SGE for short, the AI-powered update to the tech giant’s classic search bar is still giving incorrect or misleading answers nearly a year after it was introduced last May.

While SGE no longer tells users that they can melt eggs or that slavery was good, it does still hallucinate, which is AI terminology for confidently making stuff up. A search for a made-up Chinese restaurant called “Danny’s Dan Dan Noodles” in San Francisco, for example, spat out references to “long lines and crazy wait times” and even gave phony citations about 4,000-person lines and a two-year waitlist.

A new theoretical framework for plastic neural networks predicts dynamical regimes where synapses rather than neurons primarily drive the network’s behavior, leading to an alternative candidate mechanism for working memory in the brain.

The brain is an immense network of neurons, whose dynamics underlie its complex information processing capabilities. A neuronal network is often classed as a complex system, as it is composed of many constituents, neurons, that interact in a nonlinear fashion (Fig. 1). Yet, there is a striking difference between a neural network and the more traditional complex systems in physics, such as spin glasses: the strength of the interactions between neurons can change over time. This so-called synaptic plasticity is believed to play a pivotal role in learning. Now David Clark and Larry Abbott of Columbia University have derived a formalism that puts neurons and the connections that transmit their signals (synapses) on equal footing [1]. By studying the interacting dynamics of the two objects, the researchers take a step toward answering the question: Are neurons or synapses in control?

Clark and Abbott are the latest in a long line of researchers to use theoretical tools to study neuronal networks with and without plasticity [2, 3]. Past studies—without plasticity—have yielded important insights into the general principles governing the dynamics of these systems and their functions, such as classification capabilities [4], memory capacities [5, 6], and network trainability [7, 8]. These works studied how temporally fixed synaptic connectivity in a network shapes the collective activity of neurons. Adding plasticity to the system complicates the problem because then the activity of neurons can dynamically shape the synaptic connectivity [9, 10].
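To make the coupled dynamics concrete, here is a minimal toy sketch of a rate network in which the neuronal states and the synaptic weights evolve together, the weights following a simple Hebbian rule with decay. This is a generic textbook-style model, not Clark and Abbott's formalism, and every parameter value below is an illustrative assumption.

```python
import numpy as np

rng = np.random.default_rng(0)

N = 200        # number of neurons
g = 1.5        # coupling gain (g > 1 gives rich, chaotic activity in the static model)
eta = 0.05     # Hebbian learning rate -- illustrative value
decay = 0.1    # weight decay, keeps the synapses bounded -- illustrative value
dt = 0.1
steps = 2000

# Random initial connectivity with the classic 1/sqrt(N) scaling
J = rng.normal(0.0, g / np.sqrt(N), size=(N, N))
x = rng.normal(0.0, 1.0, size=N)        # neuronal state variables

for _ in range(steps):
    r = np.tanh(x)                      # firing rates: the nonlinear interaction
    # Neuronal dynamics, driven by the current synaptic weights
    x += dt * (-x + J @ r)
    # Synaptic dynamics: a Hebbian update driven by the current activity,
    # so the weights are dynamical variables on the same footing as the neurons
    J += dt * (-decay * J + (eta / N) * np.outer(r, r))
```

Setting eta = 0 freezes J and recovers the fixed-connectivity random networks of the earlier studies; with eta > 0, the activity reshapes the very connectivity that generates it, which is the feedback loop that makes the plastic problem so much harder.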

Wires and cables are not the only things that can get entangled: Plants, fungi, and bacteria can all exhibit filamentous or branching growth patterns that eventually become entangled too. Previous work with nonliving materials demonstrated that entanglement can produce unique and desirable material properties, but achieving entanglement requires meticulously engineered material structure and geometry. It has been unclear if the same rules apply to organisms, which, unlike nonliving systems, develop through a process of progressive growth. Through a blend of experiments and simulations, we show that growth easily produces entanglement.

Specifically, we find that treelike growth leads to branch arrangements that cannot be disassembled without breaking or deforming branches. Further, entanglement via growth is possible for a wide range of geometries. In fact, it appears to be largely insensitive to the geometry of branched trees but instead depends sensitively on how long the organism can keep growing. In other words, growing branched trees can entangle with almost any geometry if they keep growing for a long-enough time.
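As a cartoon of the simulation side of this story (not the authors' actual model), the sketch below grows two random branched trees in two dimensions and counts crossings between their segments as a crude stand-in for interpenetration. The branching probability, angular noise, and crossing proxy are all illustrative assumptions; the point is only that the contact count climbs with growth time rather than with any particular geometry.

```python
import numpy as np

rng = np.random.default_rng(1)

def grow_tree(root, steps, branch_prob=0.1, seg_len=0.05):
    """Grow a random branched tree; return its line segments as (start, end) pairs."""
    tips = [(np.asarray(root, dtype=float), rng.uniform(0.0, 2.0 * np.pi))]
    segments = []
    for _ in range(steps):
        new_tips = []
        for pos, ang in tips:
            ang += rng.normal(0.0, 0.4)                 # tips wander slightly
            new = pos + seg_len * np.array([np.cos(ang), np.sin(ang)])
            segments.append((pos, new))
            new_tips.append((new, ang))
            if rng.random() < branch_prob:              # occasionally fork a new branch
                new_tips.append((new.copy(), ang + rng.choice([-1.0, 1.0]) * 0.8))
        tips = new_tips
    return segments

def crossings(segs_a, segs_b):
    """Count segment crossings between two trees (ignores collinear edge cases)."""
    def ccw(a, b, c):
        return (c[1] - a[1]) * (b[0] - a[0]) > (b[1] - a[1]) * (c[0] - a[0])
    count = 0
    for p0, p1 in segs_a:
        for q0, q1 in segs_b:
            if ccw(p0, q0, q1) != ccw(p1, q0, q1) and ccw(p0, p1, q0) != ccw(p0, p1, q1):
                count += 1
    return count

for steps in (10, 20, 40):
    tree_a = grow_tree((0.0, 0.0), steps)
    tree_b = grow_tree((0.3, 0.0), steps)
    print(f"{steps} growth steps -> {crossings(tree_a, tree_b)} inter-tree crossings")
```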

Entanglement via growth appears to be largely distinct from, and easier to achieve than, entanglement of nonliving materials. These observations may in part account for the broad prevalence of entanglement in biological systems, as well as inform recent experiments that observed the rapid evolution of entanglement, though much still remains to be discovered.

Since ChatGPT debuted in the fall of 2022, much of the interest in generative AI has centered around large language models. Large language models, or LLMs, are the giant, compute-intensive models powering the chatbots and image generators that seemingly everyone is using and talking about nowadays.

While there’s no doubt that LLMs produce impressive, human-like responses to most prompts, the reality is that most general-purpose LLMs struggle with deep domain knowledge in areas like health, nutrition, or cooking. Not that this has stopped folks from using them, with occasionally bad or even laughable results when we ask for a personalized nutrition plan or a recipe.

LLMs’ shortcomings in producing credible, trusted results in those specific domains have led to growing interest in what the AI community is calling small language models (SLMs). What are SLMs? Essentially, they are smaller, simpler language models: they have far fewer parameters, require less computational power, and are often specialized in their focus.
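For a sense of what working with an SLM looks like in practice, here is a minimal sketch using the Hugging Face transformers library to run a compact open model locally. The model named below is only an example; in a real application you would swap in whichever small, domain-tuned model fits your use case.

```python
# Minimal sketch: running a small open language model locally with Hugging Face
# transformers. Requires `pip install transformers torch`.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="microsoft/phi-2",  # example of a compact (~2.7B-parameter) model, not a recommendation
)

prompt = "In one sentence, why might a small language model suit a nutrition app?"
result = generator(prompt, max_new_tokens=60)
print(result[0]["generated_text"])
```

Because a model this size can run on a single consumer GPU, or even a CPU, it is also far cheaper to fine-tune on domain data than any general-purpose LLM.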

Fusion energy has long been hailed as the holy grail because of its potential for limitless amounts of clean energy. But that promise has trailed reality for decades, with billions of dollars in research leading to few breakthroughs. Now there’s optimism that is about to change, partly because of new startups funded by the likes of Sam Altman, Jeff Bezos, and Bill Gates.

For the latest episode of NEXT, Yahoo Finance went inside the country’s largest magnetic fusion facility for an exclusive look at the challenges of bringing this technology to commercial use.

“The race is on to actually see who can develop this and who can get it to the masses the fastest,” said David Callaway, former editor-in-chief of USA Today and founder of Callaway Climate Insights, a news and information service focused on the business of climate change.