
The Global Work Crisis: Automation, the Case Against Jobs, and What to Do About It

In the end, we look back at our careers and reflect on what we’ve achieved. It may have been the hundreds of human interactions we’ve had; the thousands of emails read and replied to; the millions of minutes of physical labor—all to keep the global economy ticking along.

According to Gallup’s World Poll, only 15 percent of people worldwide are actually engaged with their jobs. The current state of “work” is not working for most people. In fact, it seems we as a species are trapped in a global work crisis, one that condemns people to squander their time just to get by in their day-to-day lives.

Technologies like artificial intelligence and automation may help relieve the work burdens of millions of people—but to benefit from their impact, we need to start changing our social structures and the way we think about work now.

An Algorithm Has Been Developed to Obstruct AI Facial Recognition, and It’s Free to Use

Are you worried about AI collecting your facial data from all the pictures you have ever posted or shared? Researchers have now developed a method for hindering facial recognition.

It is a commonly accepted fact nowadays that the images we post or share online can be, and often are, used by third parties for one purpose or another. It may not be something we truly agree with, but it is a reality most of us have accepted as an undesirable consequence of using freely available social media apps and websites.

To counter this, a team of researchers from the University of Chicago has developed an algorithm named “Fawkes,” a nod to Guy Fawkes, that works in the background to subtly alter your images in ways that are mostly unnoticeable to the human eye. The reason this works is that companies such as Clearview, which collect large amounts of facial data, use artificial intelligence to find and connect one photograph of a person’s face to another photograph from elsewhere. The connection is made by matching similarities between the two photos, but recognition does not depend only on obvious characteristics such as facial symmetry or moles; it also relies on “invisible relationships between the pixels that make up a computer-generated picture of that face.”
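To make the idea of image “cloaking” concrete, here is a toy Python sketch. It is not the Fawkes code; the linear feature extractor, the pixel budget eps, and every name in it are illustrative assumptions. It only shows the general principle: nudge an image so its feature embedding drifts toward a different identity while each pixel changes by less than a small, hard-to-notice amount.

```python
import numpy as np

# Toy stand-in for a face-recognition feature extractor. Real systems use deep
# networks; a random linear map keeps this sketch self-contained and runnable.
rng = np.random.default_rng(0)
FEATURIZER = rng.standard_normal((128, 64 * 64))  # 64x64 grayscale -> 128-dim embedding

def embed(image):
    """Map a 64x64 image with values in [0, 1] to a feature vector."""
    return FEATURIZER @ image.ravel()

def cloak(image, decoy, eps=0.03, steps=50, lr=1e-6):
    """Nudge `image` so its embedding moves toward `decoy`'s embedding,
    while keeping every pixel within +/- eps of the original."""
    cloaked = image.copy()
    target = embed(decoy)
    for _ in range(steps):
        diff = embed(cloaked) - target
        # Gradient of ||embed(x) - target||^2 with respect to the pixels
        # (exact here because the featurizer is linear).
        grad = 2.0 * (FEATURIZER.T @ diff)
        cloaked = cloaked - lr * grad.reshape(image.shape)     # step toward the decoy identity
        cloaked = np.clip(cloaked, image - eps, image + eps)   # keep the change imperceptible
        cloaked = np.clip(cloaked, 0.0, 1.0)                   # stay a valid image
    return cloaked

# Usage: "cloak" a photo so a naive matcher is pulled toward the decoy's identity.
photo = rng.random((64, 64))
decoy = rng.random((64, 64))
protected = cloak(photo, decoy)
print("max pixel change:", np.abs(protected - photo).max())   # bounded by eps
```

The real tool works against deep feature extractors rather than a toy linear map, but the underlying idea of a small, bounded perturbation that misleads the matcher is the same.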

What If Your Teacher Were AI?



Watch our Discussed episode where we dive deeper into the topic with Dr. Joanna Bryson: https://bit.ly/what-if-your-teacher-were-ai

Have you ever had a teacher that was just, well, kind of boring? Maybe even a little, robotic? We’ve all been there. Let’s face it, it’s hard to learn when you’re being put to sleep. But what if we found a completely new way of learning? Could a robot be a better teacher than a human? How might education change with AI?

Can you translate this episode into another language? Add subtitles and we will link your YouTube channel in the description: https://www.youtube.com/timedtext_video?v=zm-ZJrsiRvU


Here’s why Apple believes it’s an AI leader—and why it says critics have it all wrong


Apple AI chief and ex-Googler John Giannandrea dives into the details with Ars.


Machine learning (ML) and artificial intelligence (AI) now permeate nearly every feature on the iPhone, but Apple hasn’t been touting these technologies like some of its competitors have. I wanted to understand more about Apple’s approach, so I spent an hour talking with two Apple executives about the company’s strategy—and the privacy implications of all the new features based on AI and ML.

Omniviolence Is Coming and the World Isn’t Ready

The terrorist or psychopath of the future, however, will have not just the Internet or drones—called “slaughterbots” in this video from the Future of Life Institute—but also synthetic biology, nanotechnology, and advanced AI systems at their disposal. These tools make wreaking havoc across international borders trivial, which raises the question: Will emerging technologies make the state system obsolete? It’s hard to see why not. What justifies the existence of the state, English philosopher Thomas Hobbes argued, is a “social contract.” People give up certain freedoms in exchange for state-provided security, whereby the state acts as a neutral “referee” that can intervene when people get into disputes, punish people who steal and murder, and enforce contracts signed by parties with competing interests.

The trouble is that if anyone anywhere can attack anyone anywhere else, then states will become—and are becoming—unable to satisfy their primary duty as referee.


In The Future of Violence, Benjamin Wittes and Gabriella Blum discuss a disturbing hypothetical scenario. A lone actor in Nigeria, “home to a great deal of spamming and online fraud activity,” tricks women and teenage girls into downloading malware that enables him to monitor and record their activity, for the purposes of blackmail. The real story involved a California man whom the FBI eventually caught and sent to prison for six years, but had he been elsewhere in the world, he might have gotten away with it. Many countries, as Wittes and Blum note, “have neither the will nor the means to monitor cybercrime, prosecute offenders, or extradite suspects to the United States.”

Technology is, in other words, enabling criminals to target anyone anywhere and, due to democratization, increasingly at scale. Emerging bio-, nano-, and cyber-technologies are becoming more and more accessible. The political scientist Daniel Deudney has a word for what can result: “omniviolence.” The ratio of killers to killed, or “K/K ratio,” is falling.

For example, computer scientist Stuart Russell has vividly described how a small group of malicious agents might engage in omniviolence: “A very, very small quadcopter, one inch in diameter can carry a one- or two-gram shaped charge,” he says. “You can order them from a drone manufacturer in China. You can program the code to say: ‘Here are thousands of photographs of the kinds of things I want to target.’ A one-gram shaped charge can punch a hole in nine millimeters of steel, so presumably you can also punch a hole in someone’s head. You can fit about three million of those in a semi-tractor-trailer. You can drive up I-95 with three trucks and have 10 million weapons attacking New York City. They don’t have to be very effective, only 5 or 10% of them have to find the target.”

Manufacturers will be producing millions of these drones, available for purchase just as with guns now, Russell points out, “except millions of guns don’t matter unless you have a million soldiers. You need only three guys to write the program and launch.” In this scenario, the K/K ratio could be perhaps 3/1,000,000, assuming 10-percent accuracy and only a single one-gram shaped charge per drone.
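As a rough sanity check on that figure, the arithmetic can be reproduced from the numbers quoted above (three operators, roughly nine to ten million drones, and the 10-percent hit rate at the upper end of Russell’s estimate). The short Python sketch below is only a back-of-the-envelope illustration; every constant in it comes from the quoted scenario, and each drone is assumed to harm at most one person.

```python
# Back-of-the-envelope K/K ("killers to killed") ratio for the scenario above.
operators = 3                    # "You need only three guys to write the program and launch."
drones_per_truck = 3_000_000     # "about three million of those in a semi-tractor-trailer"
trucks = 3
hit_rate = 0.10                  # upper end of "only 5 or 10% of them have to find the target"

drones = drones_per_truck * trucks       # ~9-10 million weapons
casualties = int(drones * hit_rate)      # on the order of a million people
kk_ratio = operators / casualties

print(f"{drones:,} drones, {casualties:,} casualties, K/K ≈ {kk_ratio:.1e}")
# -> 9,000,000 drones, 900,000 casualties, K/K ≈ 3.3e-06 (roughly 3 per million)
```

The quoted 3/1,000,000 figure simply rounds the truck math up to ten million drones; either way, the ratio lands on the order of a few attackers per million victims.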
