
The recent James Webb Space Telescope (JWST) guide camera test image looks really similar to Hubble’s deep fields, which are my favorites. I decided to take a long exposure of the same target to see what my telescope can capture and compare it to JWST’s image. I found one really faint galaxy 26–32 million light-years away, and a cute planetary nebula called Abell 39. Pause and see if you can find it in my image.

- Scope: Celestron RASA 8.
- Mount: iOptron CEM40.
- Camera: ZWO ASI183MM Pro.
- Guide scope: ZWO Mini 120mm.
- Guide camera: ZWO ASI224MC.
- Filter: Astronomik MaxFR 12nm Ha filter.

NASA article: https://www.nasa.gov/image-feature/countdown-to-the-webb-telescopes-first-images.

More of my astrophotography work on Instagram.

Multiple angles of Booster 7 experiencing an unexpected ignition during Raptor engine testing.

Video and Pictures from the NSF Robots. Edited by Jack (@theJackBeyer).

All content copyright to NSF. Not to be used elsewhere without explicit permission from NSF.

Click “Join” to support the channel and get access to early fast-turnaround clips, exclusive Discord access with the NSF team, and more.

Head to https://www.squarespace.com/marcushouse to save 10% on your first purchase of a website or domain using code MARCUSHOUSE.

Quite the inspirational week this one, with the complete set of JWST First Images and loads of Starship and Starbase news. Last week I mentioned that it was fire time for Starbase, and… WOW… I was not wrong there. SpaceX’s Starship Booster 7 has gone in for repair after an explosion. Falcon 9 launched for both Starlink and, finally, CRS-25. We also had the very first launch of Vega C, Rocket Lab firing off another Electron, and more. So enough of this intro. Let’s crack on with it!

Everyday Astronaut — Elon Musk Explains SpaceX’s Raptor Engine!

End Screen Music — Isle of Rain by Savfk.

A new language model similar in scale to GPT-3 is being made freely available and could help to democratise access to AI.

BLOOM (which stands for BigScience Large Open-science Open-access Multilingual Language Model) was developed by 1,000 volunteer researchers from over 70 countries and 250 institutions, supported by ethicists, philosophers, and legal experts, in a collaboration called BigScience. The project, coordinated by the New York-based startup Hugging Face, was funded by the French government.

The new AI took more than a year of planning and training, including a final run of 117 days (11 March – 6 July) on the Jean Zay supercomputer, one of Europe’s most powerful, located just south of Paris, France.
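Because the checkpoints are released openly through the Hugging Face Hub, trying BLOOM out can take only a few lines of Python. The sketch below is illustrative rather than taken from the project’s documentation: the small “bigscience/bloom-560m” variant and the generation settings are assumptions chosen so the example runs on modest hardware, since the full 176-billion-parameter model needs far more memory.

```python
# Minimal sketch: load an openly released BLOOM checkpoint with the
# Hugging Face "transformers" library. The model ID and generation
# settings are assumptions for illustration, not an official example.
from transformers import pipeline

generator = pipeline("text-generation", model="bigscience/bloom-560m")

prompt = "Open access to large language models matters because"
result = generator(prompt, max_new_tokens=40, do_sample=True, top_p=0.9)
print(result[0]["generated_text"])
```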

An international team of physicists has developed a new technique that allows researchers to study the interactions between neutrons inside an atom. In their paper published in the journal Nature, the group describes their laser spectroscopy measurement technique and how it can be used.

It has been nearly 100 years since scientists discovered that inside of every atom are protons and neutrons, which give atoms nearly all of their mass, as well as electrons. And despite much study of subatomic particles, scientists still do not know what sorts of interactions go on inside of an atom. In this new effort, the researchers modified laser spectroscopy measurement techniques to study such interactions.

In this new work, the researchers began by looking at elements with a magic number, those that have highly stable configurations of protons and neutrons, and wound up using indium-131, which has a magic number of neutrons and also a proton hole, meaning the nuclide has one fewer proton than a traditional magic number element. Indium-131 is, unfortunately, also notoriously unstable, which means it only exists for a short time before breaking down; it tends to last for just 0.28 seconds.
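To get a feel for how fleeting that is, simple exponential decay gives the surviving fraction as N(t)/N0 = 2^(-t / t_half). Treating the quoted 0.28 seconds as indium-131’s half-life (an assumption for this sketch), only about 8% of a sample would remain after one second:

```python
# Back-of-the-envelope decay calculation; the 0.28 s figure comes from the
# text above and is treated here as a half-life, the sample times are
# arbitrary values chosen for illustration.
HALF_LIFE_S = 0.28

def fraction_remaining(t_seconds: float, half_life: float = HALF_LIFE_S) -> float:
    """Surviving fraction after t_seconds of simple exponential decay."""
    return 2.0 ** (-t_seconds / half_life)

for t in (0.28, 0.5, 1.0, 2.0):
    print(f"after {t:.2f} s: {fraction_remaining(t) * 100:5.1f}% remaining")
```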

Should the Fed make a 1-percentage-point hike at the July meeting, it would be the largest move since Paul Volcker was Fed chairman in the 1980s.


Lasers normally use mirrors to create laser light, but a new kind uses clumps of moving particles. The result is a laser that is more programmable and could generate extra-sharp visual displays.

The question of how the chemical composition of a protein—the amino acid sequence—determines its 3D structure has been one of the biggest challenges in biophysics for more than half a century. This knowledge about the so-called “folding” of proteins is in great demand, as it contributes significantly to the understanding of various diseases and their treatment, among other things. For these reasons, Google’s DeepMind research team has developed AlphaFold, an artificial intelligence that predicts 3D structures.

A team consisting of researchers from Johannes Gutenberg University Mainz (JGU) and the University of California, Los Angeles, has now taken a closer look at these structures and examined them with respect to knots. We know knots primarily from shoelaces and cables, but they also occur on the nanoscale in our cells. Knotted proteins can not only be used to assess the quality of structure predictions but also raise important questions about folding mechanisms and the evolution of proteins.