
Human Enhancement & Personal Performance Hacking with Matt Ward of “The Disruptors”

We explore human enhancement and personal performance hacking with Matt Ward (@mattwardio), host of The Disruptors podcast, startup investor, adviser, and business innovation consultant. Matt and I thought it would be fun to do two episodes, one here on MIND & MACHINE and the other on The Disruptors, where we explore what we’ve learned, the ideas we’ve formed and our takeaways across all these different fields that we cover.

So with this episode here on MIND & MACHINE, we focus on human enhancement — technologies that are extending lifespan and enhancing human capability. Then we get into what Matt and I are currently doing to maximize our own performance capabilities — our ability to think more clearly and to live more energetic, vibrant lives… all heavily informed by the amazing guests across the different fields we explore.

In the other part of this discussion, on The Disruptors, we look at another set of subjects, from space to AI to augmented and virtual reality, so I encourage you to check that out as well: https://youtu.be/PtpwgTr4GSU

For the other part of the Conversation on The Disruptors:
https://is.gd/mv1Vez.

__________

MIND & MACHINE features interviews by August Bradley with bold thinkers and leaders in transformational technologies.

Crew-3 Mission

On Thursday, November 11 at 6:32 p.m. EST, SpaceX’s Dragon autonomously docked with the International Space Station (ISS). Falcon 9 launched the spacecraft to orbit from historic Launch Complex 39A (LC-39A) at NASA’s Kennedy Space Center in Florida on Wednesday, November 10 at 9:03 p.m. EST.

After an approximate six-month stay, Dragon and the Crew-3 astronauts will depart the orbiting laboratory no earlier than late April 2022 for return to Earth and splashdown off the coast of Florida.

Neuromorphic Computing, AI Chips Emulating the Brain with Kelsey Scharnhorst on MIND & MACHINE

We explore Artificial Intelligence (AI) through Neuromorphic Computing with computer chips that emulate the biological neurons and synapses in the brain. Neuro-biological chip architectures enable machines to solve very different kinds of problems than traditional computers, the kinds of problems we previously thought only humans could tackle.

Our guest today is Kelsey Scharnhorst. Kelsey is an Artificial Neural Network Researcher at UCLA. Her research lab (Gimzewski Lab under James Gimzewski) is focused on creating neuromorphic computer chips and further developing their capabilities.

We’ll talk with Kelsey about how neuromorphic computing is different, how neuro-biological computer architectures work, and how they will be used in the future.
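To make the neuron-and-synapse analogy concrete before the interview, here is a minimal sketch of a leaky integrate-and-fire neuron, one of the simplest spiking-neuron models that neuromorphic hardware typically implements directly in circuitry. The parameter values below are illustrative assumptions, not figures from the Gimzewski Lab's chips.

```python
# Minimal leaky integrate-and-fire (LIF) neuron, the kind of unit
# neuromorphic chips implement in hardware rather than software.
# All parameter values are illustrative, not from any real chip.

def simulate_lif(input_current, dt=1e-3, tau=20e-3,
                 v_rest=0.0, v_thresh=1.0, v_reset=0.0):
    """Integrate an input-current trace and return spike times (in steps)."""
    v = v_rest
    spikes = []
    for step, i_in in enumerate(input_current):
        # Membrane potential leaks toward rest and integrates the input current.
        v += (dt / tau) * (v_rest - v) + dt * i_in
        if v >= v_thresh:          # threshold crossed: the neuron fires
            spikes.append(step)
            v = v_reset            # reset the membrane potential after the spike
    return spikes

# A constant drive strong enough to make the neuron spike periodically.
print(simulate_lif([60.0] * 100))
```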

Podcast version at: https://is.gd/MM_on_iTunes.

Gimzewski Lab (UCLA Neuromorphic Lab): http://gim.chem.ucla.edu.
Kelsey on LinkedIn: https://www.linkedin.com/in/kelseyscharnhorst.
__________

AI Aliens

Get a free month of Curiosity Stream: http://curiositystream.com/isaacarthur.
We often consider interacting with Aliens and Robots in the future, but what about Alien Robots? Today we’ll ask what artificial intelligence created by aliens might look like, under what sort of circumstances we’d encounter them, and whether they may be the only aliens we ever encounter.

Visit our Website: http://www.isaacarthur.net.
Support us on Patreon: https://www.patreon.com/IsaacArthur.
SFIA Merchandise available: https://www.signil.com/sfia/

Social Media:
Facebook Group: https://www.facebook.com/groups/1583992725237264/
Reddit: https://www.reddit.com/r/IsaacArthur/
Twitter: https://twitter.com/Isaac_A_Arthur (follow us and RT our future content).
SFIA Discord Server: https://discord.gg/53GAShE

Listen to or download the audio of this episode from SoundCloud:
Audio-only version: https://soundcloud.com/isaac-arthur-148927746/ai-aliens
Narration-only version: https://soundcloud.com/isaac-arthur-148927746/ai-aliens-narration-only

Credits:
Alien Civilizations: AI Aliens.
Episode 212a, Season 5 E46a.

Written by:

Are Robots Replacing Real Animals? — The Soft Robotics Revolution

With advancements in the field of robotics, scientists have created smart soft robots that can mimic animals in movement and behaviour. This may be the future of robotics, since a robot modeled on almost any animal, whether a dog, cat or fish, can be built to perform complex movements and actions to help researchers and people in need of a social companion or pet.

Soft robotics is the specific sub-field of robotics dealing with constructing robots from highly compliant materials, similar to those found in living organisms.
Soft robotics draws heavily from the way in which living organisms move and adapt to their surroundings. In contrast to robots built from rigid materials, soft robots allow for increased flexibility and adaptability in accomplishing tasks, as well as improved safety when working around humans. These characteristics make them strong candidates for use in the fields of medicine and manufacturing.

If you enjoyed this video, please consider rating it and subscribing to our channel for more frequent uploads. Thank you!

TIMESTAMPS:
00:00 How Robots have gotten more real.
00:48 Current robotic companions.
02:45 What are these new “Soft Robots”?
04:34 What these new Robots can accomplish.
06:55 Last Words.

#robots #robotics #animals

NVIDIA to Build Earth-2 Supercomputer to See Our Future

NVIDIA plans to build the world’s most powerful AI supercomputer dedicated to predicting climate change, named Earth-2.


The earth is warming. The past seven years are on track to be the seven warmest on record. The emissions of greenhouse gases from human activities are responsible for approximately 1.1°C of average warming since the period 1850–1900.

What we’re experiencing is very different from the global average. We experience extreme weather — historic droughts, unprecedented heatwaves, intense hurricanes, violent storms and catastrophic floods. Climate disasters are the new norm.

We need to confront climate change now. Yet, we won’t feel the impact of our efforts for decades. It’s hard to mobilize action for something so far in the future. But we must know our future today — see it and feel it — so we can act with urgency.

Women in tech are fighting A.I. bias —but where are the men?

Battling bias. If I’ve been a little MIA this week, it was because I spent Monday and Tuesday in Boston for Fortune’s inaugural Brainstorm A.I. gathering. It was a fun and wonky couple of days diving into artificial intelligence and machine learning, technologies that—for good or ill—seem increasingly likely to shape not just the future of business, but the world at large.

There are a lot of good and hopeful things to be said about A.I. and M.L., but there’s also a very real risk that the technologies will perpetuate biases that already exist, and even introduce new ones. That was the subject of one of the most engrossing discussions of the event by a panel that was—as pointed out by moderator, guest co-chair, and deputy CEO of Smart Eye Rana el Kaliouby—comprised entirely of women.

One of the scariest parts of bias in A.I. is how wide and varied the potential effects can be. Sony Group’s head of A.I. ethics office Alice Xiang gave the example of a self-driving car that has been trained too narrowly in what it recognizes as a human, and therefore as a reason to jam on the brakes. “You need to think about being able to detect pedestrians—and ensure that you can detect all sorts of pedestrians and not just people that are represented dominantly in your training or test set,” said Xiang.
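Xiang’s pedestrian example points to a basic sanity check any team can run: measure detection performance per subgroup of the evaluation set instead of reporting a single aggregate number. The sketch below is a generic illustration of that idea, not a method described by the panelists; the group labels and records are hypothetical.

```python
# Hedged sketch: per-subgroup detection recall as a basic bias check.
# The records and group labels are hypothetical; a real audit would use
# a properly annotated evaluation set.
from collections import defaultdict

def recall_by_group(records):
    """records: iterable of (group, ground_truth_positive, detected)."""
    hits = defaultdict(int)
    totals = defaultdict(int)
    for group, is_pedestrian, detected in records:
        if is_pedestrian:                  # only count actual pedestrians
            totals[group] += 1
            hits[group] += int(detected)
    return {g: hits[g] / totals[g] for g in totals}

# Toy evaluation records: (subgroup, pedestrian present, model detected it).
eval_records = [
    ("group_a", True, True), ("group_a", True, True), ("group_a", True, False),
    ("group_b", True, True), ("group_b", True, False), ("group_b", True, False),
]
# A large gap between groups is a red flag worth investigating.
print(recall_by_group(eval_records))
```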

A New Method Will Teach Robots How to Stop Annoying You

Non-verbal social cues are key.

Robots are increasingly common in everyday life, but their communication skills still lag far behind. One key attribute that could really help robot-human interaction is the ability to read and respond to human emotional cues.

In that case, they would be able to intervene when they are really needed and not disturb people the rest of the time. Now, researchers at Franklin & Marshall College have been working on allowing socially assistive robots to process social cues given by humans and respond to them accordingly.
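The article does not detail the researchers’ approach, but the underlying idea, deciding whether to intervene from non-verbal cues, can be sketched as a simple decision policy. Everything in the example below (the cue names, thresholds, and the should_intervene helper) is a hypothetical illustration, not the Franklin & Marshall system.

```python
# Hypothetical sketch of a "should the robot intervene?" policy driven by
# non-verbal cues. Cue names and thresholds are invented for illustration.
from dataclasses import dataclass

@dataclass
class SocialCues:
    gaze_at_robot: float   # 0..1, how much the person is looking at the robot
    frustration: float     # 0..1, estimated from facial expression and posture
    engaged_in_task: bool  # currently busy with something else

def should_intervene(cues: SocialCues) -> bool:
    """Approach only when the person seems to want help."""
    if cues.engaged_in_task and cues.frustration < 0.5:
        return False       # busy and calm: do not disturb
    # Seeking eye contact or visibly frustrated: offer assistance.
    return cues.gaze_at_robot > 0.6 or cues.frustration > 0.7

# Busy and calm: stay away. Idle and making eye contact: approach.
print(should_intervene(SocialCues(0.8, 0.2, engaged_in_task=True)))   # False
print(should_intervene(SocialCues(0.8, 0.2, engaged_in_task=False)))  # True
```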