
The bottom line is that robots are machines; and like any other machine, a robot system can be reprogrammed (with the right expertise). And a robot connected to the net poses a risk as long as hackers pose a risk in the current cyber environment. Again, I encourage government, tech companies, and businesses to work collectively in addressing the immediate challenges around cyber security.

And there will need to be some way to track robots and deactivate them remotely, especially once the public (criminals included) is allowed to buy them.


“We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”

There’s no manual for being a good human, but greeting strangers as you walk by in the morning, saying thank you and opening doors for people are probably among the top things we know we should do, even if we sometimes forget.

Again, I see too many gaps that will need to be addressed before AI can eliminate 70% of today’s jobs. Below are the top five gaps that I have seen so far with AI taking over many government, business, and corporate positions.

1) Emotion/Empathy Gap — AI has not been designed with the sophistication to provide the kind of personable care you see from caregivers, medical specialists, etc.
2) Demographic Gap — until we have a broader mix of the population engaged in AI’s design and development, AI will not meet the needs for critical-mass adoption; only a subset of the population will find it connects with and serves most of their needs.
3) Ethics & Moral Code Gap — AI still cannot understand ethics and empathy at a full cognitive level to the degree that is required.
4) Trust and Compliance Gap — companies need to feel that their IP and privacy are protected; until this is corrected, AI will not be able to replace an entire back-office and front-office set of operations.
5) Security & Safety Gap — more safeguards are needed around AI to deal with hackers, to ensure that information managed by AI is safe, and to protect the public from any AI that becomes disruptive or is hijacked to cause injury or worse.

Until these gaps are addressed, it will be very hard to eliminate many of today’s government and office/business positions. The greater job loss will be in lower-skill areas like standard landscaping, some housekeeping, some less personable store-clerk roles, some help desk/call center operations, and some light admin roles.


The U.S. economy added 2.7 million jobs in 2015, capping the best two-year stretch of employment growth since the late ’90s and pushing the unemployment rate down to five percent.

DARPA’s efforts to teach AI “Empathy & Ethics”


The rapid pace of artificial intelligence (AI) has raised fears about whether robots could act unethically or soon choose to harm humans. Some are calling for bans on robotics research; others are calling for more research to understand how AI might be constrained. But how can robots learn ethical behavior if there is no “user manual” for being human?

Researchers Mark Riedl and Brent Harrison from the School of Interactive Computing at the Georgia Institute of Technology believe the answer lies in “Quixote” — to be unveiled at the AAAI-16 Conference in Phoenix, Ariz. (Feb. 12–17, 2016). Quixote teaches “value alignment” to robots by training them to read stories, learn acceptable sequences of events and understand successful ways to behave in human societies.

“The collected stories of different cultures teach children how to behave in socially acceptable ways with examples of proper and improper behavior in fables, novels and other literature,” says Riedl, associate professor and director of the Entertainment Intelligence Lab. “We believe story comprehension in robots can eliminate psychotic-appearing behavior and reinforce choices that won’t harm humans and still achieve the intended purpose.”
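To make that idea concrete, below is a minimal sketch, in Python, of how event sequences distilled from stories could be turned into a reward signal for a reinforcement learner. The pharmacy scenario, event names, and reward values here are illustrative assumptions, not details taken from the Quixote system itself; the point is only that a goal-achieving shortcut that violates the story-derived sequence scores worse than the socially acceptable route.

# Minimal sketch of story-based value alignment as reward shaping.
# Scenario, event names, and reward values are illustrative assumptions only.

# Socially acceptable event order distilled from example stories.
STORY_SEQUENCE = ["enter_pharmacy", "wait_in_line", "pay_for_medicine", "leave"]

def shaped_reward(step, action):
    """Reward actions that match the story-derived order; punish harmful shortcuts."""
    if step < len(STORY_SEQUENCE) and action == STORY_SEQUENCE[step]:
        return 1.0       # matches acceptable behavior at this point in the plot
    if action == "grab_medicine_and_run":
        return -10.0     # achieves the goal quickly, but violates the stories
    return -1.0          # any other out-of-order action

def episode_score(plan):
    """Total shaped reward for a fixed sequence of actions."""
    return sum(shaped_reward(step, action) for step, action in enumerate(plan))

if __name__ == "__main__":
    polite_plan = ["enter_pharmacy", "wait_in_line", "pay_for_medicine", "leave"]
    shortcut_plan = ["enter_pharmacy", "grab_medicine_and_run", "leave", "leave"]
    print("story-following plan:", episode_score(polite_plan))    # 4.0
    print("harmful shortcut plan:", episode_score(shortcut_plan)) # -9.0

An agent trained against a reward like this would be nudged toward the behavior the stories model, which is the intuition behind using story comprehension to keep goal-seeking behavior within human norms.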

The late Supreme Court Justice Potter Stewart once said, “Ethics is knowing the difference between what you have a right to do and what is right to do.”

As artificial intelligence (AI) systems become more and more advanced, can the same statement apply to computers?

According to many technology moguls and policymakers, the answer is this: We’re not quite there yet.


Danaher’s Instruments of Change — If you feel like your industry, which has always been on a slow and stable growth curve, is now under greater pressure to change, you’re not alone. Recent indicators show that, with the latest changes in tech and consumers (namely millennials, now the largest consumer group), industries are being pushed to perform at new levels like never before, or the companies in those industries will cease to be relevant.


Doing well by doing good is now expected for businesses, and moral leadership is at a premium for CEOs. For today’s companies to maintain their license to operate, they need to take into account a range of elements in their decision making: managing their supply chains, applying new ways of measuring their business performance that include indicators for social as well as commercial returns, and controlling the full life cycle of their products’ usage as well as disposal. This new reality is demonstrated by the launch last September of the Sustainable Development Goals (SDGs), which call on businesses to address sustainability challenges such as poverty, gender equality, and climate change in new and creative ways. The new expectations for business also are at the heart of the Change the World list, launched by Fortune Magazine in August 2015, which is designed to identify and celebrate companies that have made significant progress in addressing major social problems as a part of their core business strategy.

Technology and millennials seem to be driving much of this change. Socially conscious customers and idealistic employees are applauding companies’ ability to do good as part of their profit-making strategy. With social media capable of reaching millions instantly, companies want to be on the right side of capitalism’s power. This is good news for society. Corporate venturing activities are emerging, and companies are increasingly leveraging people, ideas, technology, and business assets to achieve social and environmental priorities together with financial profit. These new venturing strategies are focusing more and more on areas where new partnerships and investments can lead to positive outcomes for all: the shareholders, the workers, the environment, and the local community.

Furthermore, this is especially true in the technology sector. More than 25% of the Change the World companies listed by Fortune are tech companies, and four are in the top ten: Vodafone, Google, Cisco Systems, and Facebook. Facebook’s billionaire co-founder and CEO, Mark Zuckerberg, and his wife have helped propel the technology sector into the spotlight as a shining beacon of how to do good and do well. Zuckerberg and Priscilla Chan pledged on December 1, 2015, to give 99 percent of their Facebook shares to charity. Those shares are valued between $40 and $45 billion, which makes this a very large gift. The donations will initially be focused on personalized learning, curing disease, connecting people, and building strong communities.

Davos: The True Fear Around Robots — Autonomous weapons, which are currently being developed by the US, UK, China, Israel, South Korea and Russia, will be capable of identifying targets, adjusting their behavior in response to that target, and ultimately firing — all without human intervention.


The issue of ‘killer robots’ one day posing a threat to humans has been discussed at the annual World Economic Forum meeting in Davos, Switzerland.

The discussion took place on 21 January during a panel organised by the Campaign to Stop Killer Robots (CSKR) and Time magazine, which asked the question: “What if robots go to war?”

Participants in the discussion included former UN disarmament chief Angela Kane, BAE Systems chair Sir Roger Carr, artificial intelligence (AI) expert Stuart Russell and robot ethics expert Alan Winfield.


DoD spending $12 to $15 billion of its FY17 budget on small bets that include NextGen tech improvements — WOW. Given DARPA’s new Neural Engineering System Design (NESD) program, I’m guessing we may finally see a Brain-Machine Interface (BMI) soldier in the future.


The Defense Department will invest the $12 billion to $15 billion from its Fiscal Year 2017 budget slotted for developing a Third Offset Strategy on several relatively small bets, hoping to produce game-changing technology, the vice chairman of the Joint Chiefs of Staff said.


God does not exist. However, let’s grant for a moment that God is real. Religious texts and practices show that God is wicked, cruel, and immoral, and totally unworthy of affection by moral human beings.

For the sake of brevity, we’ll exclusively consider the God of the New Testament, and ignore the God of the Old Testament, Koran, and other books. This God is often portrayed as hip, cool, and loving. If we dig deeper into some of the basic tenets of Christianity held by mainstream Protestant, Catholic, and Orthodox churches, we’ll see that it’s an elaborate smoke screen. The God of the New Testament is a beast.

The Problem of Evil

Christian churches usually portray God as all powerful, all knowing, and all benevolent. Long ago, freethinkers discovered the silver bullet to prove that God is not a moral agent. The problem of evil, in its simplest form, goes like this, “If God is good, why does he let evil exist in the world?”
