
Maximizing Benefits Of The Life Sciences & Health Tech For All Americans — Dr. Andrew Hebbeler, Ph.D., Principal Assistant Director for Health and Life Sciences, Office of Science and Technology Policy, The White House.


Dr. Andrew Hebbeler, Ph.D., is Principal Assistant Director for Health and Life Sciences, Office of Science and Technology Policy at The White House (https://www.whitehouse.gov/ostp/ostps-teams/health-and-life-sciences/), and has extensive foreign affairs, national security, global health, and science and technology (S&T) policy experience.

Most recently, Dr. Hebbeler was Senior Director and Lead Scientist for Global Biological Policy and Programs at the non-profit Nuclear Threat Initiative and previous to that served in leadership positions at the State Department’s offices of Science and Technology Cooperation (OES/STC), the Science and Technology Adviser to the Secretary of State (E/STAS), and Cooperative Threat Reduction (ISN/CTR).

“If we get a similar hit rate in detecting texture in tumors, the potential for early diagnosis is huge,” says one of the scientists.

Researchers at University College London have adapted a new X-ray method, paired with a deep-learning artificial intelligence (AI) algorithm originally developed to detect explosives in luggage, to spot potentially fatal early-stage tumors in humans, according to a report published by MIT Technology Review on Friday.
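The approach described amounts to reusing a texture-detection model across domains, akin to transfer learning. As a rough illustration only, here is a minimal Python sketch, assuming PyTorch and torchvision, of fine-tuning a pretrained convolutional network for a new binary texture label; the ResNet-18 backbone, ImageNet weights, and training step are illustrative placeholders, not the researchers’ actual pipeline.

```python
# Hypothetical sketch (not the UCL team's code): reuse texture features
# learned for one X-ray screening task on another by fine-tuning only a
# new classification head.
import torch
import torch.nn as nn
from torchvision import models

# Stand-in backbone: a generic ResNet-18 with ImageNet weights. In the
# scenario described, the backbone would come from the baggage-scan task.
model = models.resnet18(weights=models.ResNet18_Weights.DEFAULT)

# Freeze the learned texture features...
for param in model.parameters():
    param.requires_grad = False

# ...and replace the final layer with a new binary head: tumor vs. healthy.
model.fc = nn.Linear(model.fc.in_features, 2)

optimizer = torch.optim.Adam(model.fc.parameters(), lr=1e-3)
criterion = nn.CrossEntropyLoss()

def train_step(images: torch.Tensor, labels: torch.Tensor) -> float:
    """One fine-tuning step on labeled scan patches (batch, 3, 224, 224)."""
    optimizer.zero_grad()
    loss = criterion(model(images), labels)
    loss.backward()
    optimizer.step()
    return loss.item()
```

Freezing the backbone keeps the previously learned texture features intact while only the small new head learns the tumor-versus-healthy distinction, which is one common way such a model could be repurposed with limited medical data.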

In the last three or four years, he said, “we’ve gone through three different versions of our dismounted gear. So we’re able to quickly pivot to the next technology and not necessarily go down long-term production of the same solution when the technology is iterating and the threat is iterating.”

The Army is reinvigorating its networks, sensors, electronic warfare (EW) arsenal and related tools following decades of counterterrorism operations, a period when troops faced forces with less-advanced gear and U.S. communications were less at risk.

The U.S. is now preparing for potential fights against China and Russia, two world powers that spend significantly on military science and technology. The targeting of networks and other battlefield systems seen in the Russia-Ukraine war is only adding to the sense of urgency.

Dr. Asha M. George, DrPH (https://biodefensecommission.org/teams/asha-m-george-drph/) is Executive Director of the Bipartisan Commission on Biodefense, which was established in 2014 to assess gaps in and provide recommendations to improve U.S. biodefense. The Commission determines where the United States is falling short in addressing biological attacks and emerging and reemerging infectious diseases.

Dr. George is a public health security professional whose research and programmatic emphasis has been practical, academic, and political. She served in the U.S. House of Representatives as a senior professional staffer and subcommittee staff director at the House Committee on Homeland Security in the 110th and 111th Congresses. She has worked for a variety of organizations, including government contractors, foundations, and non-profits. As a contractor, she supported and worked with all Federal Departments, especially the Department of Homeland Security and the Department of Health and Human Services.

Dr. George also served on active duty in the U.S. Army as a military intelligence officer and paratrooper, and she is a decorated Desert Storm veteran.

Dr. George holds a Bachelor of Arts in Natural Sciences from Johns Hopkins University, a Master of Science in Public Health from the University of North Carolina at Chapel Hill (in Parasitology and Laboratory Practice), and a Doctorate in Public Health (with a focus on Public Health Policy and Security Preparedness) from the University of Hawaii at Manoa. She is also a graduate of the Harvard University National Preparedness Leadership Initiative.

Robert Long is a research fellow at the Future of Humanity Institute. His work sits at the intersection of the philosophy of AI safety and AI consciousness. We talk about the recent LaMDA controversy, Ilya Sutskever’s “slightly conscious” tweet, the metaphysics and philosophy of consciousness, artificial sentience, and how a future filled with digital minds could get really weird.

Audio & transcript: https://theinsideview.ai/roblong
Michaël: https://twitter.com/MichaelTrazzi
Robert: https://twitter.com/rgblong

Robert’s blog: https://experiencemachines.substack.com


Making The Future Of Medicine Possible By Rethinking How Medicines Are Made — Olivia Zetter, Head of Government Affairs & AI Strategy, Resilience.


Olivia Zetter is Head of Government Affairs and AI Strategy at National Resilience, Inc. (https://resilience.com/), a first-of-its-kind manufacturing and technology company dedicated to broadening access to complex medicines and protecting bio-pharmaceutical supply chains against disruption.

Founded in 2020, National Resilience, Inc. is building a sustainable network of high-tech, end-to-end manufacturing solutions to ensure the medicines of today and tomorrow can be made quickly, safely, and at scale.

Russia’s central bank on Thursday proposed banning the use and mining of cryptocurrencies on Russian territory, citing threats to financial stability, citizens’ wellbeing and its monetary policy sovereignty.

The move is the latest in a global cryptocurrency crackdown as governments from Asia to the United States worry that privately operated and highly volatile digital currencies could undermine their control of financial and monetary systems.

Russia has argued for years against cryptocurrencies, saying they could be used in money laundering or to finance terrorism. It eventually gave them legal status in 2020 but banned their use as a means of payment.

Americans have become accustomed to images of Hellfire missiles raining down from Predator and Reaper drones to hit terrorist targets in Pakistan or Yemen. But that was yesterday’s drone war.

A revolution in unmanned aerial vehicles is unfolding, and the U.S. has lost its monopoly on the technology.

Some experts believe the spread of the semi-autonomous weapons will change ground warfare as profoundly as the machine gun did.

A new video released by the nonprofit Future of Life Institute (FLI) highlights the risks posed by autonomous weapons, or “killer robots,” and the steps we can take to prevent them from being used. It even has Elon Musk scared.

Its original Slaughterbots video, released in 2017, was a short Black Mirror-style narrative showing how small quadcopters equipped with artificial intelligence and explosive warheads could become weapons of mass destruction. Initially developed for the military, the Slaughterbots end up being used by terrorists and criminals. As Professor Stuart Russell points out at the end of the video, all the technologies depicted already existed, but had not been put together.

Now the technologies have been put together, and lethal autonomous drones able to locate and attack targets without human supervision may already have been used in Libya.

Experts in the AI and Big Data sphere consider October 2021 to be a dark month. Their pessimism isn’t fueled by rapidly shortening days or chilly weather in much of the country—but rather by the grim news from Facebook on the effectiveness of AI in content moderation.

This is unexpected. The social media behemoth has long touted tech tools such as machine learning and Big Data as answers to its moderation woes. As CEO Mark Zuckerberg explained to CBS News, “The long-term promise of AI is that in addition to identifying risks more quickly and accurately than would have already happened, it may also identify risks that nobody would have flagged at all—including terrorists planning attacks using private channels, people bullying someone too afraid to report it themselves, and other issues both local and global.”
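For a sense of what ML-based moderation means in practice, here is a deliberately toy Python sketch, assuming scikit-learn, of the kind of text classifier that underlies such systems; the example posts, labels, and threshold are invented for illustration and bear no relation to Facebook’s actual models.

```python
# Hypothetical toy sketch (not Facebook's system): a text classifier that
# scores posts and flags high-risk ones for human review.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.linear_model import LogisticRegression
from sklearn.pipeline import make_pipeline

# Invented training data: posts labeled 1 (violating) or 0 (benign).
posts = [
    "buy followers cheap click here",
    "had a great time at the park today",
    "you are worthless and everyone hates you",
    "check out my new recipe for banana bread",
]
labels = [1, 0, 1, 0]

# Bag-of-words (TF-IDF) features feeding a linear classifier.
classifier = make_pipeline(TfidfVectorizer(), LogisticRegression())
classifier.fit(posts, labels)

# Score new content; posts above a chosen threshold get flagged for review.
score = classifier.predict_proba(["nobody likes you, just quit"])[0, 1]
print(f"score={score:.2f}, flagged={score > 0.5}")
```

Real systems operate at vastly larger scale with deep models, but the core loop is the same: score content, flag what crosses a threshold, and route it to review, which is exactly where the reported shortfalls in effectiveness bite.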