
AI trained on the Bible spits out bleak religious prophecies

Code Unto Caesar

Durendal’s algorithm wrote scripture about three topics: “the plague,” “Caesar,” and “the end of days.” So it’s not surprising that things took a grim turn. The full text is riddled with glitches characteristic of AI-written text, like excerpts where over half of the nouns are “Lord.” But some passages are more coherent and read like bizarre doomsday prophecies.

For example, from the plague section: “O LORD of hosts, the God of Israel; When they saw the angel of the Lord above all the brethren which were in the wilderness, and the soldiers of the prophets shall be ashamed of men.”

NASA Uses Powerful Supercomputers and AI to Map Earth’s Trees, Discovers Billions of Trees in West African Drylands

Scientists from NASA’s Goddard Space Flight Center in Greenbelt, Maryland, and international collaborators demonstrated a new method for mapping the location and size of trees growing outside of forests, discovering billions of trees in arid and semi-arid regions and laying the groundwork for more accurate global measurement of carbon storage on land.

Using powerful supercomputers and machine learning algorithms, the team mapped the crown diameter – the width of a tree when viewed from above – of more than 1.8 billion trees across an area of more than 500,000 square miles, or 1,300,000 square kilometers. The team mapped how tree crown diameter, coverage, and density varied depending on rainfall and land use.
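The quantities the study relates to rainfall and land use can be derived from detected crowns with simple geometry. Below is a minimal illustrative sketch, not the team's actual pipeline: the function name, the toy tile, and the sample diameters are assumptions for demonstration. It shows how per-tile canopy cover and tree density follow from a list of measured crown diameters.

```python
import math

def canopy_stats(crown_diameters_m, tile_area_m2):
    """Return (canopy cover fraction, trees per hectare) for one tile.

    crown_diameters_m: crown widths (in meters) of trees detected in the tile.
    tile_area_m2: ground area of the tile in square meters.
    """
    # Treat each crown as a circle; sum their areas.
    crown_area = sum(math.pi * (d / 2) ** 2 for d in crown_diameters_m)
    cover = crown_area / tile_area_m2            # fraction of ground covered
    density_per_ha = len(crown_diameters_m) / (tile_area_m2 / 10_000)
    return cover, density_per_ha

# Hypothetical tile: three trees detected in one hectare (10,000 m^2).
cover, density = canopy_stats([3.0, 4.5, 2.0], tile_area_m2=10_000)
print(f"cover: {cover:.4f}, density: {density:.1f} trees/ha")
```

Aggregating such per-tile statistics across rainfall and land-use zones is how a crown-level map becomes the coverage and density comparisons described above.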

The World Needs Nuclear Power, And We Shouldn’t Be Afraid Of It

Although many different approaches have been proposed to address this problem, it’s clear that any sustainable, long-term solution will include one important component: a transition to energy sources that don’t result in additional carbon dioxide emissions. While most of the ideas put forth — such as the hypothetical Green New Deal — focus on renewable energy sources like solar and wind power, there’s another option that we should seriously reconsider: nuclear fission power.


As we embrace green solutions, nuclear should absolutely be part of the equation.

CCNY team in quantum algorithm breakthrough

Researchers led by City College of New York physicist Pouyan Ghaemi report the development of a quantum algorithm with the potential to study a class of many-electron quantum systems using quantum computers. Their paper, entitled “Creating and Manipulating a Laughlin-Type ν=1/3 Fractional Quantum Hall State on a Quantum Computer with Linear Depth Circuits,” appears in the December issue of PRX Quantum, a journal of the American Physical Society.

“Quantum physics is the fundamental theory of nature which leads to formation of molecules and the resulting matter around us,” said Ghaemi, assistant professor in CCNY’s Division of Science. “It is already known that when we have a macroscopic number of quantum particles, such as electrons in the metal, which interact with each other, novel phenomena such as superconductivity emerge.”

However, until now, according to Ghaemi, tools to study systems with large numbers of interacting quantum particles and their novel properties have been extremely limited.

New study outlines steps higher education should take to prepare a new quantum workforce

A new study outlines ways colleges and universities can update their curricula to prepare the workforce for a new wave of quantum technology jobs. Three researchers, including Rochester Institute of Technology Associate Professor Ben Zwickl, suggested steps that need to be taken in a new paper in Physical Review Physics Education Research after interviewing managers at more than 20 quantum technology companies across the U.S.

The study’s authors from University of Colorado Boulder and RIT set out to better understand the types of entry-level positions that exist in these companies and the educational pathways that might lead into those jobs. They found that while the companies still seek employees with traditional STEM degrees, they want the candidates to have a grasp of fundamental concepts in quantum information science and technology.

“For a lot of those roles, there’s this idea of being ‘quantum aware’ that’s highly desirable,” said Zwickl, a member of RIT’s Future Photon Initiative and Center for Advancing STEM Teaching, Learning and Evaluation. “The companies told us that many positions don’t need to have deep expertise, but students could really benefit from a one- or two-semester introductory sequence that teaches the foundational concepts, some of the hardware implementations, how the algorithms work, what a qubit is, and things like that. Then a graduate can bring in all the strength of a traditional STEM degree but can speak the language that the company is talking about.”

Google Brain Paper Demystifies Learned Optimizers

Learned optimizers are algorithms that can be trained to solve optimization problems. Although learned optimizers can outperform baseline optimizers in restricted settings, the ML research community understands remarkably little about their inner workings or why they work as well as they do. In a paper currently under review for ICLR 2021, a Google Brain research team attempts to shed some light on the matter.

The researchers explain that optimization algorithms can be considered the basis of modern machine learning. A popular research area in recent years has focused on learning optimization algorithms by directly parameterizing and training an optimizer on a distribution of tasks.

Research on learned optimizers aims to replace the baseline “hand-designed” optimizers with a parametric optimizer trained on a set of tasks, which can then be applied more generally. In contrast to baseline optimizers that use simple update rules derived from theoretical principles, learned optimizers use flexible, high-dimensional, nonlinear parameterizations.
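The contrast above can be made concrete with a toy experiment. The sketch below is a minimal illustration of the general idea, not the Google Brain team's method: the update rule, its two trainable parameters (a log learning rate and a momentum coefficient), the random-quadratic task distribution, and the random-search meta-training are all simplifying assumptions chosen to keep the example self-contained.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task():
    """Sample a quadratic loss f(x) = 0.5 * x^T A x from a toy task distribution."""
    d = 5
    q, _ = np.linalg.qr(rng.normal(size=(d, d)))       # random orthogonal basis
    A = q @ np.diag(rng.uniform(0.5, 5.0, d)) @ q.T    # positive-definite A
    x0 = rng.normal(size=d)                            # random starting point
    return A, x0

def run_optimizer(theta, A, x0, steps=20):
    """Apply the parametric update rule; theta = (log learning rate, momentum)."""
    lr, mom = np.exp(theta[0]), theta[1]
    x, v = x0.copy(), np.zeros_like(x0)
    for _ in range(steps):
        g = A @ x              # gradient of 0.5 * x^T A x
        v = mom * v + g        # momentum accumulator
        x = x - lr * v         # the "learned" update rule
    return 0.5 * x @ A @ x     # final loss: the meta-objective

def meta_loss(theta, tasks):
    """Average final loss of the optimizer across the task distribution."""
    return np.mean([run_optimizer(theta, A, x0) for A, x0 in tasks])

tasks = [make_task() for _ in range(16)]

# Meta-train the optimizer's parameters by simple random search.
best_theta, best = np.array([-3.0, 0.0]), np.inf
for _ in range(200):
    cand = best_theta + rng.normal(scale=0.3, size=2)
    loss = meta_loss(cand, tasks)
    if loss < best:
        best_theta, best = cand, loss

# Baseline: a hand-designed optimizer, plain SGD with a fixed small step.
baseline = meta_loss(np.array([np.log(0.01), 0.0]), tasks)
print(f"baseline meta-loss: {baseline:.4f}, learned: {best:.4f}")
```

The point of the sketch is the structure, not the numbers: the optimizer itself is the object being trained, its parameters are fit to a distribution of tasks, and it is then judged against a hand-designed rule. Real learned optimizers replace the two-parameter rule with a high-dimensional nonlinear parameterization, which is exactly what makes their inner workings hard to interpret.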

Researchers make most precise measurements of deuterium fusing with a proton to form helium-3

A large team of researchers affiliated with a host of institutions in Italy, the U.K. and Hungary has carried out the most precise measurements yet of deuterium fusing with a proton to form helium-3. In their paper published in the journal Nature, the group describes their effort and how they believe it will contribute to better understanding the events that transpired during the first few minutes after the Big Bang.

Astrophysics theory suggests that the creation of deuterium was one of the first things that happened after the Big Bang. Therefore, it plays an important role in Big Bang nucleosynthesis—the reactions that happened afterward that led to the production of several of the light elements. Theorists have developed equations that show the likely series of events that occurred, but to date, it has been difficult to prove them correct without physical evidence. In this new effort, the researchers working at the Laboratory for Underground Nuclear Astrophysics in Italy have carried out experiments to simulate those first few minutes, hoping to confirm the theories.

The work was conducted deep under the thick rock cover of the Gran Sasso mountain to shield the experiment from cosmic rays. It involved firing a beam of protons at a deuterium target (deuterium being a form of hydrogen with one proton and one neutron) and then measuring the rate of fusion. But because the rate of fusion is so low, the bombardment had to be repeated many times; the team carried out their work nearly every weekend for three years.