
We may build incredible AI. But can we contain our cruelty? Oxford professor Nick Bostrom explains.

Up next, Is AI a species-level threat to humanity? With Elon Musk, Michio Kaku, Steven Pinker & more ► https://youtu.be/91TRVubKcEM

Nick Bostrom, a professor at Oxford University and director of the Future of Humanity Institute, discusses the development of machine superintelligence and its potential impact on humanity. Bostrom believes that in this century, we will create the first general intelligence that will be smarter than humans. He sees this as the most important thing humanity will ever do, but it also comes with an enormous responsibility.

Bostrom notes that there are existential risks associated with the transition to the machine intelligence era, such as the possibility of an unaligned superintelligence overriding human civilization with its own value structures. In addition, there is the question of how to ensure that conscious digital minds are treated well. However, if we succeed on both fronts, we could have vastly better tools for dealing with everything from disease to poverty.

Ultimately, Bostrom believes that the development of machine superintelligence is crucial for a truly great future.

0:00 Smarter than humans.

The last few weeks have been abuzz with news and fears (well, largely fears) about the impact ChatGPT and other generative AI technologies might have on the workplace. Goldman Sachs predicted 300 million jobs would be lost, while the likes of Steve Wozniak and Elon Musk called for AI development to be paused (although pointedly not the development of autonomous driving).

Indeed, OpenAI chief Sam Altman recently declared that he was “a little bit scared”, a sentiment shared by OpenAI’s chief scientist Ilya Sutskever, who said that “at some point it will be quite easy, if one wanted, to cause a great deal of harm”.


As fears mount about the jobs supposedly at risk from generative AI technologies like ChatGPT, are those fears likely to prevent people from taking steps to adapt?

An exploration of the Vulnerable World Hypothesis as a solution to the Fermi Paradox.

An exploration of the possibility of finding fossils of alien origin right here on the surface of the Earth.


In 1942, the Manhattan Project was established by the United States as a top-secret research and development (R&D) program to produce the first nuclear weapons. It involved thousands of scientists, engineers, and other personnel working on everything from nuclear reactors and uranium enrichment to the design and construction of the bomb itself. The goal: to develop an atomic bomb before Germany did.

The Manhattan Project set a precedent for large-scale government-funded R&D programs. It also marked the beginning of the nuclear age and ushered in a new era of technological and military competition between the world’s superpowers.

Today we’re entering the age of Artificial Intelligence (AI), an era arguably as important as, if not more important than, the nuclear age. While the last few months might have been the first you’ve heard about it, many in the field would argue we’ve been headed in this direction for at least the last decade, if not longer. For those new to the topic: welcome to the future; you’re late.

Several asteroids are set to dash past Earth in the coming days, according to a list released by NASA’s Jet Propulsion Laboratory. The close encounters are almost certain to be harmless, and they come just days after the White House announced new plans to defend the planet against threats from space.

Two asteroids, one bus-sized and the other the size of a house, will make relatively close approaches to Earth on Wednesday, according to NASA’s Asteroid Watch Dashboard.

Three more, all approximately airplane-sized, are also set to whizz past Earth on Thursday, the agency said.


Three plane-sized asteroids—and one the size of a house—are set to pass close by Earth on Wednesday and Thursday, NASA said.

Machine learning has been leveraged to accelerate analysis in nuclear processing facilities and investigations in the field.

Surprise nuclear attacks or threats could soon become far harder to pull off. Researchers at the Department of Energy’s Pacific Northwest National Laboratory (PNNL) in the U.S. have developed new techniques that leverage machine learning to accelerate the detection and understanding of nuclear weapons production.

One enticing application of these techniques in national security is to use data analytics and machine learning to monitor the materials used to produce nuclear weapons.
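
As a rough illustration of what that kind of monitoring can look like, here is a minimal sketch, assuming an unsupervised anomaly detector trained on routine process measurements; it is not PNNL’s actual pipeline, and the feature names, numbers, and thresholds below are invented for the example.

```python
# Hypothetical sketch: flag off-normal process readings with an unsupervised
# anomaly detector. Not PNNL's pipeline; features and data are invented.
import numpy as np
from sklearn.ensemble import IsolationForest

rng = np.random.default_rng(seed=0)

# Simulated "normal" operating data: flow rate, temperature, and a
# gamma-spectroscopy peak ratio from a hypothetical processing facility.
normal = rng.normal(loc=[10.0, 300.0, 0.72], scale=[0.5, 5.0, 0.02], size=(1000, 3))

# A few simulated off-normal readings (an unexpected shift in the peak ratio).
suspect = rng.normal(loc=[10.0, 300.0, 0.85], scale=[0.5, 5.0, 0.02], size=(5, 3))

# Train only on the baseline; no labeled examples of illicit activity are needed.
detector = IsolationForest(contamination=0.01, random_state=0).fit(normal)

# predict() returns +1 for inliers and -1 for anomalies.
flags = detector.predict(np.vstack([normal[:5], suspect]))
print(flags)  # the last five readings should mostly come back as -1
```

An isolation forest is only one possible choice here; the appeal of this family of methods for safeguards-style monitoring is that it needs a baseline of normal operations rather than labeled examples of wrongdoing.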

FallenKingdomReads’ list of five must-read science fiction books for fans of The Matrix.

This dystopian novel, Philip K. Dick’s Do Androids Dream of Electric Sheep?, is the basis for the classic film Blade Runner. It explores a world devastated by nuclear war in which bounty hunter Rick Deckard must retire six rogue androids.

In this novel, Richard K. Morgan’s Altered Carbon, human consciousness is digitized and can be transferred between bodies, leading to an interstellar conspiracy that draws the protagonist, Takeshi Kovacs, into a thrilling adventure.

An exploration of nanotechnology and how, even in a highly advanced form, it might produce no technosignature or SETI-detectable signal. If all alien civilizations eventually convert to a nanotechnological existence, this could solve the Fermi Paradox.

My Patreon Page:

https://www.patreon.com/johnmichaelgodier.

My Event Horizon Channel:

https://www.youtube.com/eventhorizonshow.


This is a galactic-sized problem. Scientists revealed Tuesday that galaxy PBC J2333.9–2343 has been reclassified after the discovery that its supermassive black hole is now pointing directly toward our solar system, the Royal Astronomical Society reports.


A newly discovered asteroid called 2023 DW has generated quite a buzz over the past week due to an estimated 1-in-670 chance of impact on Valentine’s Day 2046. But despite a NASA advisory and the resulting scary headlines, there’s no need to put an asteroid doomsday on your day planner for that date.

The risk assessment doesn’t have as much to do with the probabilistic roll of the cosmic dice as it does with the uncertainty that comes from a limited set of astronomical observations. If the case of 2023 DW plays out the way every previous asteroid scare has over the course of nearly 20 years, further observations will reduce the risk to zero.
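
To make that point concrete, here is a toy one-dimensional Monte Carlo sketch; real orbit determination works in many more dimensions with far more care, and the miss distance and uncertainties below are invented. The estimated impact probability is simply the fraction of sampled trajectories, drawn from the current positional uncertainty, that pass within an Earth radius, and it collapses once follow-up observations shrink the uncertainty enough that it no longer overlaps Earth.

```python
# Toy 1-D Monte Carlo: how the estimated impact probability depends on
# observational uncertainty. Purely illustrative; the numbers are invented.
import numpy as np

rng = np.random.default_rng(seed=1)

EARTH_RADIUS_KM = 6371.0          # count a sample as an "impact" inside this distance
TRUE_MISS_DISTANCE_KM = 50_000.0  # hypothetical actual miss distance

def impact_probability(sigma_km: float, n_samples: int = 1_000_000) -> float:
    """Fraction of sampled trajectories that hit Earth, given positional uncertainty sigma_km."""
    samples = rng.normal(loc=TRUE_MISS_DISTANCE_KM, scale=sigma_km, size=n_samples)
    return float(np.mean(np.abs(samples) < EARTH_RADIUS_KM))

# As follow-up observations shrink the uncertainty, the estimated risk drops away.
for sigma in (100_000.0, 20_000.0, 5_000.0):
    print(f"sigma = {sigma:>9,.0f} km -> impact probability ~ {impact_probability(sigma):.4%}")
```

In real cases the estimate can even tick upward for a while as the uncertainty shrinks around a near-miss trajectory, which is why early headlines tend to look scarier than the eventual outcome.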

Nevertheless, the hubbub over a space rock that could be as wide as 165 feet (50 meters) highlights a couple of trends to watch for: We’re likely to get more of these asteroid alerts in the years to come, and NASA is likely to devote more attention to heading off potentially dangerous near-Earth objects, or NEOs.