
Expanding its GPT lineup, OpenAI has released its long-anticipated o1 model, along with a smaller, cheaper o1-mini version. Previously codenamed Strawberry, the new models, the company says, can “reason through complex tasks and solve harder problems than previous models in science, coding, and math.”

Although still a preview, OpenAI says this is the first release in the series to appear in ChatGPT and via its API, with more to come.

The company says these models have been trained to “spend more time thinking through problems before they respond, much like a person would. Through training, they learn to refine their thinking process, try different strategies, and recognize their mistakes.”

A fluffy cluster of stars spilling across the sky may have a secret hidden in its heart: a swarm of over 100 stellar-mass black holes.

The star cluster in question is called Palomar 5, a globular cluster located around 80,000 light-years away, whose tidal streams of stars stretch across some 30,000 light-years of sky.

Such globular clusters are often considered ‘fossils’ of the early Universe. They’re very dense and spherical, typically containing roughly 100,000 to 1 million very old stars; some, like NGC 6397, are nearly as old as the Universe itself.

Using a 3D printer that works with molten glass, researchers forged LEGO-like glass bricks with a strength comparable to concrete. The bricks could have a role in circular construction in which materials are used over and over again.

“Glass as a structural material kind of breaks people’s brains a little bit,” says Michael Stern, a former MIT graduate student and researcher in both MIT’s Media Lab and Lincoln Laboratory. “We’re showing this is an opportunity to push the limits of what’s been done in architecture.”

Stern is also the founder of Evenline, an MIT spinoff that developed a special 3D printer capable of additive manufacturing with molten glass as its feedstock.


In today’s fast-paced world, speed is celebrated. Instant messaging outpaces thoughtful letters, and rapid-fire tweets replace reflective essays. We’ve become conditioned to believe that faster is better. But what if the next great leap in artificial intelligence challenges that notion? What if slowing down is the key to making AI think more like us—and in doing so, accelerating progress?

OpenAI’s new o1 model, built on the transformative concept of the hidden Chain of Thought, offers an interesting glimpse into this future. Unlike traditional AI systems that rush to deliver answers by scanning data at breakneck speeds, o1 takes a more human-like approach. It generates internal chains of reasoning, mimicking the kind of reflective thought humans use when tackling complex problems. This evolution not only marks a shift in how AI operates but also brings us closer to understanding how our own brains work.
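The contrast described above can be made concrete with a toy sketch. This is a simplified analogy in plain Python, not OpenAI's actual implementation: one solver maps a question straight to an answer, while the other records explicit intermediate reasoning steps and derives the answer from them, the way a hidden chain of thought accumulates before the final response.

```python
# Toy analogy for "direct answer" vs. "chain-of-thought" problem solving.
# Example problem: total cost of `items` units at `price` each, plus `shipping`.

def solve_directly(items: int, price: int, shipping: int) -> int:
    """One-shot answer: no visible intermediate reasoning."""
    return items * price + shipping

def solve_with_chain_of_thought(items: int, price: int, shipping: int):
    """Same problem, but each intermediate step is made explicit first."""
    steps = []
    subtotal = items * price
    steps.append(f"Step 1: {items} items x {price} each = {subtotal}")
    total = subtotal + shipping
    steps.append(f"Step 2: {subtotal} + {shipping} shipping = {total}")
    # The final answer falls out of the recorded steps.
    return steps, total

if __name__ == "__main__":
    steps, answer = solve_with_chain_of_thought(3, 4, 5)
    for step in steps:
        print(step)
    print("Answer:", answer)
```

The point of the sketch is only that the second solver spends extra work producing intermediate state before answering, which is the trade the article describes: slower, but with a traceable line of reasoning behind the result.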

This concept of AI thinking more like humans is not just a technical accomplishment—it taps into fascinating ideas about how we experience reality. In his book The User Illusion, Tor Nørretranders reveals a startling truth about our consciousness: only a tiny fraction of the sensory input we receive reaches conscious awareness. He argues that our brains process vast amounts of information—up to a million times more than we are consciously aware of. Our minds act as functional filters, allowing only the most relevant information to “bubble up” into our conscious experience.