
Aug 5, 2018

Employees at Google, Amazon and Microsoft Have Threatened to Walk Off the Job Over the Use of AI

Posted in categories: biotech/medical, ethics, information science, military, robotics/AI

There is. Our engagement with AI will transform us. Technology always does, even while we are busy using it to reinvent our world. The introduction of the machine gun by Richard Gatling during America’s Civil War, and its massive role in World War I, obliterated our ideas of military gallantry and chivalry and emblazoned in our minds Wilfred Owen’s imagery of young men who “die as Cattle.” The computer revolution that began after World War II ushered in a way of understanding and talking about the mind in terms of hardware, wiring and rewiring that still dominates neurology.

How will AI change us? How has it changed us already? For example, what does reliance on navigational aids like Waze do to our sense of adventure? What happens to our ability to make everyday practical judgments when so many of these judgments—in areas as diverse as creditworthiness, human resources, sentencing, and police force allocation—are outsourced to algorithms? If our ability to make good moral judgments depends on actually making them—on developing, through practice and habit, what Aristotle called “practical wisdom”—what happens when we lose the habit? What becomes of our capacity for patience when more and more of our trivial interests and requests are predicted and immediately met by artificially intelligent assistants like Siri and Alexa? Does a child who interacts imperiously with these assistants take that habit of imperious interaction to other aspects of her life?

It’s hard to know exactly how AI will alter us. Our concerns about the fairness and safety of the technology are more concrete and easier to grasp. But the abstract, philosophical question of how AI will impact what it means to be human is more fundamental and cannot be overlooked. The engineers are right to worry. But the stakes are higher than they think.

