
Have you ever wondered how North Korea can afford its nuclear program and the luxury goods for its leadership when its economy is effectively cut off from the world? Well… let me tell you a little secret.

If you want to support the channel, check out my Patreon: https://www.patreon.com/ExplainedWithDom

Selected sources and further reading:
https://www.wilsoncenter.org/event/north-koreas-criminal-act…-challenge.
https://press.armywarcollege.edu/cgi/viewcontent.cgi?article…monographs.
https://www.airuniversity.af.edu/JIPA/Display/Article/328526…ase-study/
https://sgp.fas.org/crs/row/RL33885.pdf
https://www.rusi.org/events/open-to-all/organised-crime-north-korea
https://moneyweek.com/19827/north-koreas-criminal-economy

A video worth watching. An amazingly detailed deep dive into Sam Altman’s interviews and a high-level look at AI LLMs.


Missed by much of the media, Sam Altman (and co.) revealed at least 16 surprising things over his World Tour: from AIs designing AIs to 'unstoppable open source', the 'customisation' leak (with a new 16k ChatGPT and a 'steerable' GPT-4), AI and religion, and possible regrets over having 'pushed the button'.

I’ll bring in all of this and eleven other insights, together with a new and highly relevant paper just released this week on ‘dual-use’. Whether you are interested in ‘solving climate change by telling AIs to do it’, ‘staring extinction in the face’ or just a deepfake Altman, this video touches on it all, ending with comments from Brockman in Seoul.

I watched over ten hours of interviews to bring you this footage from Jordan, India, Abu Dhabi, UK, South Korea, Germany, Poland, Israel and more.

Altman in Abu Dhabi, HUB71, 'change its architecture': https://youtu.be/RZd870NCukg


The skies above where I reside near New York City were noticeably apocalyptic last week. But to some in Silicon Valley, the fact that we wimpy East Coasters were dealing with a sepia hue and a scent profile that mixed cigar bar, campfire and old-school happy hour was nothing to worry about. After all, it is AI, not climate change, that appears to be top of mind to this cohort, who believe future superintelligence is either going to kill us all, save us all, or almost kill us all if we don’t save ourselves first.

Whether they predict the “existential risks” of runaway AGI that could lead to human “extinction” or foretell an AI-powered utopia, this group seems to have equally strong, fixed opinions (for now, anyway — perhaps they are “loosely held”) that easily tip into biblical prophet territory.

Unlike many of his peers in the artificial intelligence community, Andrew Ng isn’t convinced about the dangers of AI.

In a video posted to Twitter this week, Ng, a Stanford University professor and founder of several Silicon Valley AI startups, expressed doubt about the doomsday predictions of other executives and experts in the field.

Science Fiction author Robert J. Sawyer talks about Oppenheimer and about his Alternate History book: The Oppenheimer Alternative.

Where to find 'The Oppenheimer Alternative':
Robert J. Sawyer's website: https://sfwriter.com

* Trinity moment — AI vs. Nuclear.
* ‘Now I am become death, the destroyer of worlds’
* The Jewish connection to the Manhattan Project and the Nazi nuclear program.
* Nuking Japan.
* Oppenheimer's personality.
* Nuclear power as a double-edged sword. The existential risk of a nuclear holocaust.
* Thermonuclear — the rivalry with Edward Teller.
* Alternate History — the end of the world by 2030.
* Military-driven science vs. science driven by scientists.
* Nuclear energy in space.
* The Orion project — nuclear pulse propulsion.
* Controversy of Wernher von Braun.
* Role of science fiction.

Channel links:
Quora blog: https://spacefaringcivilization.quora.com/
Amazon Author page: http://amazon.com/author/ronfriedman
My Website: https://ronsfriedman.wordpress.com/

How to support the channel:
Get $5 in NDAX (Canadian crypto exchange): https://refer.ndax.io/vm1j
Buy the Escape Velocity short-story collection:
Support with an Ethereum or Polygon donation: sciandscifi.nft

The sex of human and other mammalian babies is decided by a male-determining gene on the Y chromosome. But the human Y chromosome is degenerating and may disappear in a few million years, leading to our extinction unless we evolve a new sex-determining gene.

The good news is two branches of rodents have already lost their Y chromosome and have lived to tell the tale.

A recent paper in Proceedings of the National Academy of Sciences shows how the spiny rat has evolved a new male-determining gene.

A raft of industry experts have given their views on the likely impact of artificial intelligence on humanity in the future. The responses are unsurprisingly mixed.

The Guardian has published an interesting article on the potential socioeconomic and political impact of the ever-increasing rollout of artificial intelligence (AI) on society. When various experts in the field were asked about the subject, the responses were, not surprisingly, a mixed bag of doom, gloom, and hope.


“I don’t think the worry is of AI turning evil or AI having some kind of malevolent desire,” Jessica Newman, director of University of California Berkeley’s Artificial Intelligence Security Initiative, told the Guardian. “The danger is from something much more simple, which is that people may program AI to do harmful things, or we end up causing harm by integrating inherently inaccurate AI systems into more and more domains of society,” she added.