
Russian scientists have proposed a concept for a thorium hybrid reactor that obtains additional neutrons from high-temperature plasma held in a long magnetic trap. The project was developed in close collaboration between Tomsk Polytechnic University, the All-Russian Scientific Research Institute of Technical Physics (VNIITF), and the Budker Institute of Nuclear Physics of SB RAS. The proposed thorium hybrid reactor is distinguished from today’s nuclear reactors by moderate power, relatively compact size, high operational safety, and a low level of radioactive waste.

“At the initial stage, we obtain relatively cold plasma using special plasma guns. We sustain this plasma by deuterium gas injection. Neutral beams with a particle energy of 100 keV, injected into this plasma, generate high-energy deuterium and tritium ions and maintain the required temperature. Colliding with each other, deuterium and tritium ions combine into a helium nucleus, and high-energy neutrons are released. These neutrons can freely pass through the walls of the vacuum chamber, where the plasma is held by a magnetic field, and enter the area with nuclear fuel. After slowing down, they support the fission of heavy nuclei, which serves as the main source of energy released in the hybrid reactor,” says Professor Andrei Arzhannikov, a chief researcher at the Budker Institute of Nuclear Physics of SB RAS.
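For reference, the fusion step described in the quote is the standard deuterium–tritium reaction, whose energy split between the helium nucleus and the neutron is well established:

\[
\mathrm{D} + \mathrm{T} \;\rightarrow\; {}^{4}\mathrm{He}\ (3.5\ \mathrm{MeV}) \;+\; n\ (14.1\ \mathrm{MeV})
\]

It is the 14.1 MeV neutron that escapes the magnetic trap and, once slowed, drives fission in the surrounding fuel blanket.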

The main advantage of a hybrid nuclear fusion reactor is the simultaneous use of the fission of heavy nuclei and the fusion of light ones. This minimizes the disadvantages of applying either reaction on its own.

There’s a good reason why we need to take a closer look at the sun. When the solar atmosphere releases its magnetic energy, it results in explosive phenomena like solar flares that hurl ultra-energized particles through the solar system in all directions, including ours. This […] can wreak havoc on things like GPS and electrical grids. Learning more about solar activity could give us more notice of when hazardous space weather is due to hit.


You can see structures on the surface as small as 18.5 miles in size.

After decades of not happening, fusion power finally appears to be maybe possibly happening.


The joke has been around almost as long as the dream: Nuclear fusion energy is 30 years away…and always will be. But now, more than 80 years after Australian physicist Mark Oliphant first observed deuterium atoms fusing and releasing dollops of energy, it may finally be time to update the punch line.

Over the past several years, more than two dozen research groups—impressively staffed and well-funded startups, university programs, and corporate projects—have achieved eye-opening advances in controlled nuclear fusion. They’re building fusion reactors based on radically different designs that challenge the two mainstream approaches, which use either a huge, doughnut-shaped magnetic vessel called a tokamak or enormously powerful lasers.

What’s more, some of these groups are predicting significant fusion milestones within the next five years, including reaching the breakeven point at which the energy produced surpasses the energy used to spark the reaction. That’s shockingly soon, considering that the mainstream projects pursuing the conventional tokamak and laser-based approaches have been laboring for decades and spent billions of dollars without achieving breakeven.
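In the usual shorthand, that milestone is written as the fusion gain factor Q, the ratio of fusion power out to heating power in:

\[
Q \;=\; \frac{P_{\mathrm{fusion}}}{P_{\mathrm{heating}}}, \qquad Q = 1 \ \text{at scientific breakeven}
\]

A commercially useful plant would need Q well above one to cover conversion losses and the rest of the plant’s own power draw.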

Essentially, the Higgs boson could enable a replicator and even a teleportation device.


Can you think of any? Here’s what I mean. When we set about justifying basic research in fundamental science, we tend to offer multiple rationales. One (the easy and most obviously legitimate one) is that we’re simply curious about how the world works, and discovery is its own reward. But often we trot out another one: the claim that applied research and real technological advances very often spring from basic research with no specific technological goal. Faraday wasn’t thinking of electronic gizmos when he helped pioneer modern electromagnetism, and the inventors of quantum mechanics weren’t thinking of semiconductors and lasers. They just wanted to figure out how nature works, and the applications came later.


So what about contemporary particle physics, and the Higgs boson in particular? We’re spending a lot of money to look for it, and I’m perfectly comfortable justifying that expense by the purely intellectual reward associated with understanding the missing piece of the Standard Model of particle physics. But inevitably we also mention that, even if we don’t know what it will be right now, it’s likely (or some go so far as to say “inevitable”) that someday we’ll invent some marvelous bit of technology that makes crucial use of what we learned from studying the Higgs. So — anyone have any guesses as to what that might be? You are permitted to think broadly here. We’re obviously not expecting something within a few years after we find the little bugger. So imagine that we have discovered it, and if you like you can imagine we have the technology to create Higgses with a lot less overhead than a kilometers-across particle accelerator.

“Think what we can do if we teach a quantum computer to do statistical mechanics,” posed Michael McGuigan, a computational scientist with the Computational Science Initiative at the U.S. Department of Energy’s Brookhaven National Laboratory.

At the time, McGuigan was reflecting on Ludwig Boltzmann and how the renowned physicist had to vigorously defend his theories of statistical mechanics. Boltzmann, who proffered his ideas about how atomic properties determine the physical properties of matter in the late 19th century, had one extraordinarily huge hurdle: atoms were not even proven to exist at the time. Fatigue and discouragement stemming from his peers not accepting his views on atoms and physics forever haunted Boltzmann.

Today, Boltzmann’s factor, which calculates the probability that a system of particles can be found in a specific energy state relative to zero energy, is widely used in physics. For example, Boltzmann’s factor is used to perform calculations on the world’s largest supercomputers to study the behavior of atoms, molecules, and the quark “soup” discovered using facilities such as the Relativistic Heavy Ion Collider located at Brookhaven Lab and the Large Hadron Collider at CERN.
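For readers who want the formula, the Boltzmann factor weights each energy state E_i at temperature T, and the normalizing sum Z turns those weights into probabilities:

\[
p_i \;=\; \frac{e^{-E_i/k_B T}}{Z}, \qquad Z \;=\; \sum_j e^{-E_j/k_B T}
\]

Setting the ground-state energy to zero gives the probability of a state relative to zero energy, which is the form described above.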

To further shrink electronic devices and to lower energy consumption, the semiconductor industry is interested in using 2-D materials, but manufacturers need a quick and accurate method for detecting defects in these materials to determine if the material is suitable for device manufacture. Now a team of researchers has developed a technique to quickly and sensitively characterize defects in 2-D materials.

Two-dimensional materials are atomically thin, the most well-known being graphene, a single-atom-thick layer of carbon atoms.

“People have struggled to make these 2-D materials without defects,” said Mauricio Terrones, Verne M. Willaman Professor of Physics, Penn State. “That’s the ultimate goal. We want to have a 2-D material on a four-inch wafer with at least an acceptable number of defects, but you want to evaluate it in a quick way.”


Scientists recently identified the oldest material on Earth: stardust that’s 7 billion years old, tucked away in a massive, rocky meteorite that struck our planet half a century ago.

🏺Stardust

Stars have life cycles. They’re born when bits of dust and gas floating through space find each other and collapse in on each other and heat up. They burn for millions to billions of years, and then they die. When they die, they pitch the particles that formed in their winds out into space, and those bits of stardust eventually form new stars, along with new planets and moons and meteorites. And in a meteorite that fell fifty years ago in Australia, scientists have now discovered stardust that formed 5 to 7 billion years ago — the oldest solid material ever found on Earth.

Circa 2002


This paper proposes a new concept for generating controlled, high-flux pulses of neutrinos. Laser-induced generation of relativistic protons, followed by pion production and decay, provides the neutrino source. By conservative estimate, the source will yield nanosecond-range pulses of muon neutrinos, with fluxes of ~10^19 ν_μ s^−1 sr^−1 and energies of ~20 MeV or higher. Concept feasibility depends upon further progress in high-intensity lasers; the process assumes a driving laser with a pulse energy of ~8 kJ, providing an irradiance of ~9 × 10^22 W cm^−2. Possible applications of the source include the study of the KARMEN time anomaly and of neutrino oscillations.
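As a rough consistency check on the quoted figures, a nanosecond-scale pulse at that flux corresponds to roughly

\[
10^{19}\ \nu_\mu\,\mathrm{s^{-1}\,sr^{-1}} \times 10^{-9}\ \mathrm{s} \;\approx\; 10^{10}\ \nu_\mu\ \text{per steradian per pulse.}
\]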


As others have pointed out, voxel-based games have been around for a long time; a recent example is the whimsical “3D Dot Game Heroes” for PS3, in which they use the low-res nature of the voxel world as a fun design element.

Voxel-based approaches have huge advantages (“infinite” detail, background details that are deformable at the pixel level, simpler simulation of particle-based phenomena like flowing water, etc.) but they’ll only win once computing power reaches an important crossover point. That point is where rendering an organic world a voxel at a time looks better than rendering zillions of polygons to approximate an organic world. Furthermore, much of the effort that’s gone into visually simulating real-world phenomena (read the last 30 years of Siggraph conference proceedings) will mostly have to be reapplied to voxel rendering. Simply put: lighting, caustics, organic elements like human faces and hair, etc. will have to be “figured out all over again” for the new era of voxel engines. It will therefore likely take a while for voxel approaches to produce results that look as good, even once the crossover point of level of detail is reached.

I don’t mean to take anything away from the hard and impressive coding work this team has done, but if they had more academic background, they’d know that much of what they’ve “pioneered” has been studied in tremendous detail for two decades. Hanan Samet’s treatise on the subject tells you absolutely everything you need to know, and more: (http://www.amazon.com/Foundations-Multidimensional-Structure…sr=8-1) and even goes into detail about the application of these spatial data structures to other areas like machine learning. Ultimately, Samet’s book is all about the “curse of dimensionality” and how (and how much) data structures can help address it.
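For readers unfamiliar with the spatial data structures Samet covers, here is a minimal, illustrative sparse-octree sketch in Python; the class and method names are my own, not from any particular engine or from Samet’s book.

# Minimal sparse-octree sketch: stores occupied voxels in a cubic region,
# subdividing only where geometry exists. Names are illustrative.

class OctreeNode:
    def __init__(self, origin, size):
        self.origin = origin          # (x, y, z) corner of this cube
        self.size = size              # edge length of this cube
        self.children = None          # 8 sub-cubes, created lazily
        self.occupied = False         # leaf voxel flag

    def insert(self, point, min_size=1.0):
        """Mark the voxel containing `point` as occupied."""
        if self.size <= min_size:
            self.occupied = True
            return
        if self.children is None:
            half = self.size / 2.0
            ox, oy, oz = self.origin
            self.children = [
                OctreeNode((ox + dx * half, oy + dy * half, oz + dz * half), half)
                for dx in (0, 1) for dy in (0, 1) for dz in (0, 1)
            ]
        self._child_for(point).insert(point, min_size)

    def _child_for(self, point):
        half = self.size / 2.0
        ox, oy, oz = self.origin
        index = (int(point[0] >= ox + half) * 4
                 + int(point[1] >= oy + half) * 2
                 + int(point[2] >= oz + half))
        return self.children[index]

    def contains(self, point, min_size=1.0):
        """Return True if the voxel containing `point` is occupied."""
        if self.size <= min_size:
            return self.occupied
        if self.children is None:
            return False
        return self._child_for(point).contains(point, min_size)


# Example: a 16-unit cube with a couple of occupied voxels.
root = OctreeNode((0.0, 0.0, 0.0), 16.0)
root.insert((3.2, 7.9, 1.1))
root.insert((12.5, 12.5, 12.5))
print(root.contains((3.2, 7.9, 1.1)))   # True
print(root.contains((8.0, 8.0, 8.0)))   # False

The point of the structure is exactly the trade-off discussed above: empty space costs almost nothing to represent, and detail is paid for only where geometry actually exists.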

Essentially, beyond this lies a Higgs boson reactor: a universe of power in a jar.


Scientists have longed to create the perfect energy source. Ideally, that source would eventually replace greenhouse gas-spewing fossil fuels, power cars, boats, and planes, and send spacecraft to remote parts of the universe. So far, nuclear fusion energy has seemed like the most likely option to help us reach those goals.