Amnon H. Eden – Lifeboat News: The Blog (https://lifeboat.com/blog) – Safeguarding Humanity

AI in the enterprise: Prepare to be disappointed – oversold but underappreciated, it can help… just not too much
https://lifeboat.com/blog/2020/09/ai-in-the-enterprise-prepare-to-be-disappointed-oversold-but-under-appreciated-it-can-help-just-not-too-much – Tue, 08 Sep 2020

Artificial Intelligence research is making big strides. But in practice?

There are several buckets you can use to categorize AI, one of which is the BS bucket. Within it, you’ll find simple statistical algorithms that people have been using forever. But there’s another bucket of things that genuinely weren’t possible a decade ago.

“The vast majority of businesses are still in the early phases of collecting and using data. Most companies looking for data scientists are looking for people to collect, manage, and calculate basic statistics over normal business processes.”


Today we launch our Register Debates in which we spar over hot topics and YOU decide which side is right – by reader vote.

Artificial Intelligence Will Do What We Ask. That’s a Problem
https://lifeboat.com/blog/2020/01/artificial-intelligence-will-do-what-we-ask-thats-a-problem – Fri, 31 Jan 2020

YouTube’s “next video” is a profit-maximizing recommendation system: an AI that selects ever more “engaging” videos to keep viewers watching. And that’s the problem.

“Computer scientists and users began noticing that YouTube’s algorithm seemed to achieve its goal by recommending increasingly extreme and conspiratorial content. One researcher reported that after she viewed footage of Donald Trump campaign rallies, YouTube next offered her videos featuring “white supremacist rants, Holocaust denials and other disturbing content.” The algorithm’s upping-the-ante approach went beyond politics, she said: “Videos about vegetarianism led to videos about veganism. Videos about jogging led to videos about running ultramarathons.” As a result, research suggests, YouTube’s algorithm has been helping to polarize and radicalize people and spread misinformation, just to keep us watching.”
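The escalation described in the quote above falls out of the objective itself: if the system always picks whichever candidate scores highest on predicted engagement, and more extreme content tends to score higher, the queue drifts toward extremes. Here is a minimal, hypothetical sketch of that greedy loop; the video names and engagement scores are invented for illustration, not YouTube’s actual data or algorithm.

```python
# Hypothetical sketch of an engagement-maximizing "next video" picker.
# Titles and scores are invented; a real system would learn predicted
# watch time from user history rather than use a fixed table.

def recommend_next(candidates, predicted_engagement):
    """Greedily pick the candidate with the highest predicted engagement."""
    return max(candidates, key=lambda video: predicted_engagement[video])

predicted_engagement = {
    "jogging_tips": 0.40,
    "marathon_training": 0.55,
    "ultramarathon_doc": 0.70,  # more extreme content often scores higher
}

# Simulate a watch session: repeatedly recommend, watch, and remove.
queue = []
candidates = set(predicted_engagement)
while candidates:
    pick = recommend_next(candidates, predicted_engagement)
    queue.append(pick)
    candidates.remove(pick)

print(queue)  # ordered from most to least "engaging"
```

Because the objective never mentions accuracy, balance, or user well-being, nothing in the loop pushes back against the drift toward the most extreme item first; that mismatch between the stated goal and our actual desires is exactly the alignment problem the next article discusses.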


By teaching machines to understand our true desires, one scientist hopes to avoid the potentially disastrous consequences of having them do what we command.
