
Circa 2000


A 1940 paper by Gamow and Mario Schoenberg was the first in the subject we now call particle astrophysics. The two authors presciently speculated that neutrinos could play a role in the cooling of massive collapsing stars. They named the neutrino reaction the Urca process, after a well-known Rio de Janeiro casino. This name might seem a strange choice, but not to Gamow, a legendary prankster who once submitted a paper to Nature in which he suggested that the Coriolis force might account for his observation that cows chewed clockwise in the Northern Hemisphere and counterclockwise in the Southern Hemisphere.

In the 1940s Gamow began to attack, with his colleague Ralph Alpher, the problem of the origin of the chemical elements. Their first paper on the subject appeared in a 1948 issue of the Physical Review. At the last minute Gamow, liking the sound of ‘alpha, beta, gamma’, added his old friend Hans Bethe as middle author in absentia (Bethe went along with the joke, but the editors did not). Gamow and Alpher, with Robert Herman, then pursued the idea of an extremely hot neutron-dominated environment. They envisioned the neutrons decaying into protons, electrons and anti-neutrinos and, when the universe had cooled sufficiently, the neutrons and protons assembling heavier nuclei. They even estimated the photon background that would be necessary to account for nuclear abundances, suggesting a residual five-degree background radiation.

We now realize that their scheme was incorrect. The Universe began with roughly equal numbers of protons and neutrons. Collisions with electrons, positrons, neutrinos and anti-neutrinos are more important than neutron decay, and the absence of stable nuclei with mass numbers of five and eight creates a barrier to further fabrication in the early Universe. Nevertheless, Alpher, Gamow and Herman’s work was the first serious attempt to discuss the observable consequences of a big bang, and the basic framework was correct. Ironically, the term ‘Big Bang’ was coined by Fred Hoyle, an advocate of a steady-state model of the universe, to make fun of Gamow’s efforts.

One of the U.S. Defense Department’s two prototype robot warships just fired its first missile.

The military on Friday hailed the test-launch of an SM-6 missile from the Unmanned Surface Vessel Ranger, sailing off the California coast, as “game-changing.”

It’s one thing for an unmanned vessel to launch a missile, however. It’s quite another for the same vessel to find and fix targets autonomously.

COPENHAGEN, Sept 8 (Reuters) — The world’s largest plant that sucks carbon dioxide directly from the air and deposits it underground is due to start operating on Wednesday, the company behind the nascent green technology said.

Swiss start-up Climeworks AG, which specialises in capturing carbon dioxide directly from the air, has partnered with Icelandic carbon storage firm Carbfix to develop a plant that sucks out up to 4,000 tons of CO2 per year.

That’s the equivalent of the annual emissions from about 790 cars. Last year, global CO2 emissions totalled 31.5 billion tonnes, according to the International Energy Agency.
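To put those two numbers side by side, here is a quick back-of-the-envelope calculation in Python using only the figures quoted above (the plant’s 4,000-ton capture capacity and the IEA’s 31.5-billion-tonne global total); the variable names are purely illustrative, and tons and tonnes are treated as comparable for this rough estimate.

```python
# Back-of-the-envelope comparison using only the figures quoted in the article.
plant_capture_tonnes = 4_000          # annual capture of the Climeworks/Carbfix plant
global_emissions_tonnes = 31.5e9      # global CO2 emissions last year, per the IEA

fraction_of_global = plant_capture_tonnes / global_emissions_tonnes
plants_for_one_year = global_emissions_tonnes / plant_capture_tonnes

print(f"Share of one year's global emissions captured: {fraction_of_global:.8%}")
print(f"Plants of this size needed to match one year's emissions: {plants_for_one_year:,.0f}")
```

On those figures, a single plant captures roughly one eight-millionth of a year’s global emissions, and about 7.9 million plants of this capacity would be needed to match one year’s output, which underlines why the technology is described as nascent.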

Entergy has restored power to more than half a million of its customers, Louisiana’s largest utility said Tuesday morning.

But there are still roughly 370,000 customers without power across the state, with about 50,000 of them in New Orleans. Entergy expects 90% of its customers in the city to have power back Wednesday.

Some neighborhoods such as Venetian Isles will likely take longer due to more damage in those areas. Details of power restoration timelines for specific neighborhoods in New Orleans can be found here.

Circa 2006


MIT researchers are putting a tiny gas-turbine engine inside a silicon chip about the size of a quarter. The resulting device could run 10 times longer than a battery of the same weight can, powering laptops, cell phones, radios and other electronic devices.

It could also dramatically lighten the load for people who can’t connect to a power grid, including soldiers who now must carry many pounds of batteries for a three-day mission — all at a reasonable price.

The researchers say that in the long term, mass production could bring the per-unit cost of power close to that of power from today’s large gas-turbine power plants.

What we need now is an expansion of public and private investment that does justice to the opportunity at hand. Such investments may have a longer time horizon, but their eventual impact is without parallel. I believe that net-energy gain is within reach in the next decade; commercialization, based on early prototypes, will follow in very short order.

But such timelines are heavily dependent on funding and the availability of resources. Considerable investment is being allocated to alternative energy sources — wind, solar, etc. — but fusion must have a place in the global energy equation. This is especially true as we approach the critical breakthrough moment.

If laser-driven nuclear fusion is perfected and commercialized, it has the potential to become the energy source of choice, displacing the many existing, less ideal energy sources. This is because fusion, if done correctly, offers energy that is in equal parts clean, safe and affordable. I am convinced that fusion power plants will eventually replace most conventional power plants and related large-scale energy infrastructure that are still so dominant today. There will be no need for coal or gas.

A method for employing sinusoidal oscillations of electrical bombardment on the surface of one Kerr type singularity in close proximity to a second Kerr type singularity in such a method to take advantage of the Lense-Thirring effect, to simulate the effect of two point masses on nearly radial orbits in a 2+1 dimensional anti-de Sitter space resulting in creation of circular timelike geodesics conforming to the van Stockum under the Van Den Broeck modification of the Alcubierre geometry (Van Den Broeck 1999) permitting topology change from one spacelike boundary to the other in accordance with Geroch’s theorem (Geroch 1967) which results in a method for the formation of Gödel-type geodesically complete spacetime envelopes complete with closed timelike curves.

Circa 2020


IBM knows how to adapt to an ever-changing enterprise tech landscape. The venerable company shed its PC business more than 15 years ago – selling it to Lenovo – understanding that those systems were quickly becoming commodity devices and that the market was going to stumble. A decade later IBM sold its x86-based server business to Lenovo for $2.3 billion, and in the intervening years it has put a keen focus on hybrid clouds and artificial intelligence, buying Red Hat for $34 billion and continuing to invest in its Watson portfolio.

However, that hasn’t meant throwing out product lines simply because they’ve been around for a while. IBM has continued to upgrade its mainframe systems to keep up with modern workloads, and the bet has paid off. In the last quarter of 2019 the company saw mainframe revenue – driven by the System z15 mainframe launched in September 2019 – jump 63 percent, followed the next quarter by a 59 percent increase.

Tape storage is a similar story. The company rolled out its first tape storage device in 1,952 the 726 Tape Unit, which had a capacity of 2MB. Five decades later, the company is still innovating its tape storage technology and this week said that, as part of a 15-year partnership with Fujifilm, has set a record with a prototype system of 317 gigabits-per-square-inch (GB/in2) in areal density, 27 times more than the areal density of current top-performance tape drives. The record, reached with the help of a new tape material called Strontium Ferrite (SrFe), is an indication that magnetic tape fits nicely in a data storage world of flash, SSDs and NVMe and a rising demand for cloud-based storage.