
As searches for the leading dark matter candidates—weakly interacting massive particles, axions, and primordial black holes—continue to deliver null results, the door opens on the exploration of more exotic alternatives. Guanming Liang and Robert Caldwell of Dartmouth College in New Hampshire have now proposed a dark matter candidate that is analogous to a superconducting state [1]. Their proposal involves interacting fermions that could exist in a condensate similar to that formed by Cooper pairs in the Bardeen-Cooper-Schrieffer (BCS) theory of superconductivity.

The novel fermions considered by Liang and Caldwell emerge in the Nambu–Jona-Lasinio model, which can be regarded as a low-energy approximation of quantum chromodynamics, the theory that describes the strong interaction. The duo considers a scenario where, in the early Universe, the fermions behave like radiation, reaching thermal equilibrium with standard photons. As the Universe expands and the temperature drops below a certain threshold, however, the fermions undergo a phase transition that leads them to pair up and form a massive condensate.
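For reference, the textbook single-flavor Nambu–Jona-Lasinio Lagrangian (a standard form, not necessarily the authors' exact model) is

\mathcal{L} = \bar{\psi}\, i\gamma^{\mu}\partial_{\mu}\psi + G\left[ (\bar{\psi}\psi)^{2} + (\bar{\psi}\, i\gamma_{5}\psi)^{2} \right],

where the four-fermion coupling G plays the role of the attractive interaction in BCS theory. Below a critical temperature the condensate \langle\bar{\psi}\psi\rangle becomes nonzero and the fermions acquire a dynamically generated mass m = -2G\langle\bar{\psi}\psi\rangle, the analog of the superconducting gap.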

The proposed scenario has several appealing features, say Liang and Caldwell. The fermions’ behavior would be consistent with that of the cold dark matter considered by the current standard model of cosmology. Further, the scenario implies a slight imbalance between fermions with different chiralities (left- and right-handed). Such an imbalance might be related to the yet-to-be-explained matter–antimatter asymmetry seen in the Universe. What’s more, the model predicts that the fermions obey a time-dependent equation of state that would produce unique, potentially observable signatures in the cosmic microwave background (CMB) radiation. The researchers suggest that next-generation CMB measurements—by the Simons Observatory and by so-called stage 4 CMB telescopes—might reach sufficient precision to vet their idea.
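In equation-of-state terms (our shorthand, not necessarily the paper's notation), the transition is from

w \equiv \frac{p}{\rho} \approx \tfrac{1}{3} \quad (\text{early: relativistic, radiation-like}) \qquad \longrightarrow \qquad w \to 0 \quad (\text{late: condensed, cold-matter-like}),

and it is the time dependence of w through this transition that would leave an imprint on the CMB.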

Google DeepMind’s AlphaEvolve AI system breaks a 56-year-old mathematical record, discovering a matrix multiplication algorithm more efficient than any found since Strassen’s 1969 breakthrough.
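For context (textbook material, not DeepMind's new result): Strassen's 1969 scheme multiplies two 2×2 matrices with seven scalar products instead of the naive eight, and applied recursively to 4×4 blocks it needs 7 × 7 = 49 multiplications; AlphaEvolve reportedly found a 48-multiplication algorithm for 4×4 complex-valued matrices. A sanity-check implementation of Strassen's seven products:

```python
# Textbook Strassen (1969): multiply two 2x2 matrices with 7 products
# instead of the naive 8. Applied recursively, 4x4 costs 7 * 7 = 49
# scalar multiplications -- the long-standing baseline mentioned above.
def strassen_2x2(A, B):
    (a, b), (c, d) = A
    (e, f), (g, h) = B
    p1 = a * (f - h)           # M3 in Strassen's original numbering
    p2 = (a + b) * h           # M5
    p3 = (c + d) * e           # M2
    p4 = d * (g - e)           # M4
    p5 = (a + d) * (e + h)     # M1
    p6 = (b - d) * (g + h)     # M7
    p7 = (a - c) * (e + f)     # -M6
    return [[p5 + p4 - p2 + p6, p1 + p2],
            [p3 + p4,           p1 + p5 - p3 - p7]]

# Sanity check against the naive product:
assert strassen_2x2([[1, 2], [3, 4]], [[5, 6], [7, 8]]) == [[19, 22], [43, 50]]
```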

AI breakthrough: Absolute Zero Reasoner deep dive. A self-improving AI that learns with no external training data! (A sketch of the self-play loop follows the chapter list below.)

Sources:
https://arxiv.org/abs/2505.03335
https://github.com/LeapLabTHU/Absolut…


0:00 Absolute Zero intro.
0:50 Traditional methods of training AI models.
4:00 Absolute Zero algorithm.
5:01 How Absolute Zero Reasoner works.
7:19 Types of training tasks.
9:00 How good is Absolute Zero.
10:47 Tavus.
12:11 Adding Absolute Zero to existing models.
13:01 Interesting findings.
15:43 Uh oh…
16:50 Ablation study.
18:15 More interesting findings.
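Based on the paper linked above (arXiv 2505.03335), the core idea is a single model that proposes its own tasks, attempts to solve them, and receives verifiable rewards from a code executor rather than from human-curated data. A structural sketch, with every name and reward value a hypothetical stand-in rather than the released implementation:

```python
# A sketch of an "Absolute Zero"-style self-play loop: one model plays
# both proposer and solver, and an executor provides verifiable rewards.
# All classes and numbers below are toy stand-ins, not the released code.
import random

def run(program, test_input):
    """Executor: run a proposed program and return its output.
    (Here a bare exec; a real system would use an isolated sandbox.)"""
    env = {}
    exec(program, env)
    return env["solve"](test_input)

class ToyModel:
    """Stand-in for the single LLM playing both roles."""
    def propose_task(self, recent):
        # Real system: the LLM writes a novel program and test input.
        n = random.randint(1, 5)
        return f"def solve(x):\n    return x * {n}", random.randint(1, 9)

    def solve(self, program, test_input):
        # Real system: the LLM reasons about the output; here, an
        # imperfect guesser so both reward branches get exercised.
        guess = run(program, test_input)
        return guess if random.random() < 0.7 else guess + 1

    def update(self, solver_reward, proposer_reward):
        pass  # real system: reinforcement-learning update on both roles

def self_play_step(model, buffer):
    program, test_input = model.propose_task(buffer[-3:])
    target = run(program, test_input)        # executor defines ground truth
    solved = model.solve(program, test_input) == target
    # Solver is rewarded for correctness; the proposer for tasks that are
    # neither trivial nor impossible ("learnability" in the paper).
    model.update(solver_reward=float(solved), proposer_reward=0.5)
    buffer.append((program, test_input, target))

model, buffer = ToyModel(), []
for _ in range(20):
    self_play_step(model, buffer)
print(f"{len(buffer)} self-generated, executor-verified tasks")
```

The key design choice is that the executor, not a human labeler, defines ground truth, which is what lets the loop run with "no data."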



This leads us to perhaps the hardest change of all: seeing a longer life as an opportunity and overcoming deeply ingrained ageist assumptions. Currently, we underestimate the capacity of older people and the promise of our own later years.

David Bowie, a man who knew a thing or two about transitions, described ageing as “an extraordinary process whereby you become the person you always should have been”. If we can make life not just longer, but healthier, productive and engaged for longer, what’s not to like?

For most of human history, only a minority of the young and middle-aged became old. As a result, we underinvest in our later years and fail to provide the support that a long, healthy, productive and engaged life requires. Given how many of us can expect to reach 80, have a shot at 90, and might even make it to 100, that is a problem that demands change.

Insight that involves representational change can boost long-term memory. In this fMRI study, the authors show that insight triggers stronger conceptual shifts in solution-relevant brain regions and enhanced network integration, improving memory retention.

Large language models (LLMs) are remarkably versatile. They can summarize documents, generate code or even brainstorm new ideas. And now we’ve expanded these capabilities to target fundamental and highly complex problems in mathematics and modern computing.

Today, we’re announcing AlphaEvolve, an evolutionary coding agent powered by large language models for general-purpose algorithm discovery and optimization. AlphaEvolve pairs the creative problem-solving capabilities of our Gemini models with automated evaluators that verify answers, and uses an evolutionary framework to improve upon the most promising ideas.
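The announcement describes three ingredients: an LLM that proposes code, automated evaluators that verify it, and an evolutionary loop that keeps the most promising candidates. A minimal sketch of that loop under those assumptions (every function here is a hypothetical stand-in; AlphaEvolve's actual internals are not public in this post):

```python
# A toy generate-evaluate-evolve loop in the spirit of the announcement.
# The "LLM" is a random mutator; a real system would call a Gemini model.
import random

def evaluator(program: str) -> float:
    """Automated evaluator: run the candidate and score it on a
    machine-checkable metric (here: correctness first, then brevity)."""
    try:
        env = {}
        exec(program, env)
        if any(env["f"](x) != 2 * x for x in range(5)):
            return float("-inf")     # wrong candidates are rejected outright
        return -len(program)         # among correct ones, shorter is better
    except Exception:
        return float("-inf")

def llm_propose(parent: str) -> str:
    """Stand-in for an LLM call that rewrites a promising program."""
    edits = [("2 * x", "x + x"), ("x + x", "x * 2"), ("    ", " ")]
    old, new = random.choice(edits)
    return parent.replace(old, new)

# Evolutionary framework: mutate the best candidate, keep improvements.
population = ["def f(x):\n    return 2 * x"]
for _ in range(50):
    parent = max(population, key=evaluator)
    child = llm_propose(parent)
    if evaluator(child) >= evaluator(parent):
        population.append(child)

print(max(population, key=evaluator))
```

The automated evaluator is what makes such a loop trustworthy: candidates survive only if a machine-checkable metric verifies them, so the model's creative proposals are filtered through hard verification.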

AlphaEvolve enhanced the efficiency of Google’s data centers, chip design and AI training processes — including training the large language models underlying AlphaEvolve itself. It has also helped design faster matrix multiplication algorithms and find new solutions to open mathematical problems, showing incredible promise for application across many areas.