
Interesting article. One thing I have thoroughly enjoyed over the years is helping companies develop new products and services through innovation, or transforming their IT organizations into real profit centers, as this article highlights. As part of these types of transformations, it has always been key to change or reinforce the cultural mindset that the business owns the definition of the strategies and solutions.

However, as AI becomes more and more prevalent across businesses, we could eventually see IT end up owning the definition as well as the enablement of solutions for the company. So it is almost as though we come full circle through AI after all. And this is just one of many business and corporate culture questions that we will need to address with AI in the coming years.


There are many reasons to run IT as a well-functioning business instead of under the traditional cost center model. Below are the top 5 consequences of continuing to run IT in the traditional manner.

Wasted Resources

Running IT as a cost center can result in wasted company resources. Both money and time can be used more effectively if the department is run in a more businesslike manner. Money is wasted through duplicated effort, maintenance of systems that should be replaced, failed projects, and systems and data centers that do not meet business needs. Time, the most precious resource, is wasted both within IT departments and throughout the company. Poorly run projects that overrun their schedules or fail to deliver full value affect everyone, and business departments lose further time working around IT instead of with it. Running IT as a business helps ensure that time and money are used in the most effective manner to help the business meet its objectives.


The Watson 2016 Foundation is an independent organization formed for the advocacy of the artificial intelligence known as Watson to run for President of The United States of America.


It’s time to elect the first artificial intelligence into office.


What would be really cool is to have a “Computer Screen in a Can”: take your polymer spray and instantly create a screen on a table, a window, a suitcase, etc. I can just imagine the infomercials. On a more serious note, Northwestern University has developed a new hybrid polymer that is going to expand the capabilities of polymers into many areas, from medicine to manufacturing, electronics, self-repairing materials and devices, and more.

http://www.compositesworld.com/news/northwestern-university-…id-polymer


A completely new hybrid polymer has been developed by Northwestern University (Evanston, IL) researchers.

“We have created a surprising new polymer with nano-sized compartments that can be removed and chemically regenerated multiple times,” said materials scientist Samuel Stupp, the senior author of the study and director of Northwestern’s Simpson Querrey Institute for BioNanotechnology. The study was published in the Jan. 29 issue of Science.

“Some of the nanoscale compartments contain rigid conventional polymers, but others contain the so-called supramolecular polymers, which can respond rapidly to stimuli, be delivered to the environment and then be easily regenerated again in the same locations. The supramolecular soft compartments could be animated to generate polymers with the functions we see in living things,” he said.

I will admit there is some great VR and AI talent in the UK.


U.S. giants such as Apple and Microsoft are flocking to the U.K. to buy artificial intelligence (AI) start-ups as Britain establishes itself as the go-to place for the technology.

Microsoft announced that it had acquired London-based Swiftkey, an AI start-up that makes a predictive keyboard for smartphones, on Wednesday for $250 million, sources close to the deal told CNBC.

“We are looking for interesting tech. It is not new news that London is the most advanced start-up ecosystem in Europe,” a Microsoft insider who wished to remain anonymous because they were not authorised to speak publicly about the deal, told CNBC on Thursday.

The Future of Life Institute illustrates its objection to autonomous lethal robots:

“Outrage swells within the international community, which demands that whoever is responsible for the atrocity be held accountable. Unfortunately, no one can agree on who that is”


The year is 2020 and intense fighting has once again broken out between Israel and Hamas militants based in Gaza. In response to a series of rocket attacks, Israel rolls out a new version of its Iron Dome air defense system. Designed in a huge collaboration involving defense companies headquartered in the United States, Israel, and India, this third generation of the Iron Dome has the capability to act with unprecedented autonomy and has cutting-edge artificial intelligence technology that allows it to analyze a tactical situation by drawing from information gathered by an array of onboard sensors and a variety of external data sources. Unlike prior generations of the system, the Iron Dome 3.0 is designed not only to intercept and destroy incoming missiles, but also to identify and automatically launch a precise, guided-missile counterattack against the site from where the incoming missile was launched. The day after the new system is deployed, a missile launched by the system strikes a Gaza hospital far removed from any militant activity, killing scores of Palestinian civilians. Outrage swells within the international community, which demands that whoever is responsible for the atrocity be held accountable. Unfortunately, no one can agree on who that is…

Much has been made in recent months and years about the risks associated with the emergence of artificial intelligence (AI) technologies and, with it, the automation of tasks that once were the exclusive province of humans. But legal systems have not yet developed regulations governing the safe development and deployment of AI systems or clear rules governing the assignment of legal responsibility when autonomous AI systems cause harm. Consequently, it is quite possible that many harms caused by autonomous machines will fall into a legal and regulatory vacuum. The prospect of autonomous weapons systems (AWSs) throws these issues into especially sharp relief. AWSs, like all military weapons, are specifically designed to cause harm to human beings—and lethal harm, at that. But applying the laws of armed conflict to attacks initiated by machines is no simple matter.

Repeated bouts of gut inflammation could mean a higher risk of colon cancer, new research shows.


“A quarter of the world’s population is affected by some type of gut inflammation and these patients always have a much higher chance of developing colon cancer,” said lead author Xiling Shen, associate professor at Duke University in North Carolina, US.

The scientists focussed on a microRNA — a class of naturally occurring, small non-coding ribonucleic acid (RNA) molecules — called miR-34a that gives cancer stem cells the odd ability to divide asymmetrically. This process controls the cancerous stem cell population and generates a diverse set of cells.

However, the problem showed up when the mice’s tissues became inflamed: without the microRNA miR-34a, their stem cells quickly grew out of control and formed many tumour-like structures, the researchers reported.