Insecticides mimic melatonin, creating higher risk for diabetes

Synthetic chemicals commonly found in insecticides and garden products bind to the receptors that govern our biological clocks, University at Buffalo researchers have found. The research suggests that exposure to these insecticides adversely affects melatonin receptor signaling, creating a higher risk for metabolic diseases such as diabetes.

Published online on Dec. 27 in Chemical Research in Toxicology, the research combined a big data approach, using computer modeling on millions of chemicals, with standard wet-laboratory experiments. It was funded by a grant from the National Institute of Environmental Health Sciences, part of the National Institutes of Health.

Disruptions in human circadian rhythms are known to put people at higher risk for diabetes and other metabolic diseases, but the mechanism involved is not well understood.

China, already dominant in supercomputers, shoots for an exascale prototype in 2017

Back in June, China debuted the world’s fastest supercomputer, the Sunway TaihuLight (pictured), with a Linpack benchmark result of 93 petaflop/s. That machine contains 40,960 locally developed ShenWei processors, each with 260 cores and roughly comparable to Intel’s Knights Landing Xeon Phi CPU. China also developed a 136GB/sec memory controller and custom interconnect that delivers 16GB/sec of peak bandwidth between nodes.

Now China is working on a prototype exascale (1,000-petaflop) system that it aims to complete by the end of this year, according to state media. An exascale computer is capable of a quintillion calculations per second, and could deliver vast dividends in deep learning and big data across disciplines as varied as nuclear test research, code breaking, and weather forecasting.
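For scale, here is a quick back-of-the-envelope comparison using the figures quoted above (a minimal sketch; the derived numbers are simple arithmetic, not from the article):

```python
# Back-of-the-envelope comparison of TaihuLight with an exascale target.
# Input figures come from the article; everything else is arithmetic.

taihulight_pflops = 93       # Linpack result, petaflop/s
exa_pflops = 1_000           # 1 exaflop/s = 1,000 petaflop/s = 10**18 FLOP/s

nodes = 40_960               # ShenWei processors in TaihuLight
cores_per_node = 260

total_cores = nodes * cores_per_node
print(f"TaihuLight cores: {total_cores:,}")                      # 10,649,600
print(f"Exascale / TaihuLight: {exa_pflops / taihulight_pflops:.1f}x")  # ~10.8x
```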

“A complete computing system of the exascale supercomputer and its applications can only be expected in 2020, and will be 200 times more powerful than the country’s first petaflop computer Tianhe-1, recognized as the world’s fastest in 2010,” said Zhang Ting, an application engineer at the National Supercomputer Center in Tianjin, to Xinhua news agency (via AFP).

EyeLock to exhibit iris authentication technology at Intersec Dubai 2017

For all my friends in Dubai or travelling to Dubai: I wish I could go.


EyeLock LLC will be exhibiting its suite of iris authentication technology at Intersec Dubai 2017, on January 22–24 at the Dubai International Convention and Exhibition Centre.

Featuring EyeLock’s proprietary software, security, algorithms and optics, the iris authentication technology delivers secure, reliable and user-friendly capabilities, according to the company.

EyeLock’s technology analyzes more than 240 unique iris characteristics to deliver dual-eye authentication, an unmatched security architecture and anti-spoofing technology.
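EyeLock’s algorithms are proprietary, but a common approach in iris recognition generally (Daugman-style iris codes) encodes each iris as a binary feature vector and compares two codes by their fractional Hamming distance. A minimal illustrative sketch of that general idea, not EyeLock’s actual method:

```python
import numpy as np

# Illustrative only: EyeLock's algorithms are proprietary. Classic iris
# recognition encodes each iris as a binary "iris code" and compares two
# codes by the fraction of bits that differ (Hamming distance).

rng = np.random.default_rng(0)

def hamming_distance(code_a: np.ndarray, code_b: np.ndarray) -> float:
    """Fraction of bits that differ between two binary iris codes."""
    return np.count_nonzero(code_a != code_b) / code_a.size

enrolled = rng.integers(0, 2, size=2048)   # stored template (2,048 bits)
probe = enrolled.copy()
noise = rng.random(2048) < 0.05            # ~5% of bits flipped by capture noise
probe[noise] ^= 1

THRESHOLD = 0.32  # a typical decision threshold in the iris literature
d = hamming_distance(enrolled, probe)
print(f"distance={d:.3f} -> {'match' if d < THRESHOLD else 'no match'}")
```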

End to Illness: Machine Learning Is Revolutionizing How We Prevent Disease

The TeraStructure algorithm can analyze genome sets much larger than current systems can efficiently handle, including those as big as 100,000 or 1 million genomes. Finding an efficient way to analyze genome databases would allow for personalized healthcare that takes into account any genetic mutations that could exist in a person’s DNA.

10 Powerful Examples Of Artificial Intelligence In Use Today

Not sure where the author got his messaging on AI and QC (namely, that AI becomes more fluid and human-like thanks to QC), but it sounds a lot like my own words. However, there is one missing piece to the AI story, even with QC making AI more human-like, and that is when you have Synbio involved in the mix. In fact, I cannot wait to see what my friend Alex Zhavoronkov and his team do with QC in their anti-aging work. I expect to see many great things from QC, AI, and Synbio together.

Nonetheless, I am glad to see others also seeing the capability that many of us do see.


Applications of Artificial Intelligence In Use Today

Beyond our quantum-computing conundrum, today’s so-called A.I. systems are merely advanced machine learning software with extensive behavioral algorithms that adapt themselves to our likes and dislikes. While extremely useful, these machines aren’t getting smarter in the existential sense, but they are improving their skills and usefulness based on large datasets. These are some of the most popular examples of artificial intelligence in use today.
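As a minimal sketch of what “adapting to likes and dislikes from a large dataset” looks like in practice (an illustration of generic machine learning, not any particular product’s implementation), here is a tiny preference model:

```python
import numpy as np
from sklearn.linear_model import LogisticRegression

# Toy illustration of "adapting to likes and dislikes": each row is an item's
# features (e.g., two genre scores), each label is whether the user liked it.
# Generic machine learning, not any specific product's system.

X = np.array([[0.9, 0.1], [0.8, 0.3], [0.2, 0.9], [0.1, 0.8], [0.7, 0.2]])
y = np.array([1, 1, 0, 0, 1])   # 1 = liked, 0 = disliked

model = LogisticRegression().fit(X, y)

new_item = np.array([[0.85, 0.15]])
print(f"P(like) = {model.predict_proba(new_item)[0, 1]:.2f}")
```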

#1 — Siri

Everyone is familiar with Apple’s personal assistant, Siri. She’s the friendly voice-activated computer that we interact with on a daily basis. She helps us find information, gives us directions, adds events to our calendars, helps us send messages, and so on. Siri is a pseudo-intelligent digital personal assistant. She uses machine-learning technology to get smarter and better able to predict and understand our natural-language questions and requests.
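Siri’s internals are, of course, Apple’s own. As a toy sketch of the general idea of mapping a natural-language request to an intent, simple keyword scoring can stand in for the far more sophisticated statistical models a real assistant uses:

```python
# Toy sketch of intent recognition in a voice assistant. Real assistants use
# large statistical language models; keyword overlap stands in for them here.

INTENTS = {
    "get_directions": {"directions", "navigate", "route", "way"},
    "add_event":      {"calendar", "schedule", "remind", "meeting"},
    "send_message":   {"text", "message", "send", "tell"},
}

def classify(utterance: str) -> str:
    words = set(utterance.lower().split())
    # Pick the intent whose keyword set overlaps the utterance the most.
    scores = {intent: len(words & kw) for intent, kw in INTENTS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

print(classify("Add a meeting to my calendar for Friday"))  # add_event
print(classify("Show me the fastest route downtown"))       # get_directions
```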

Mixed Reality will be most important tech of 2017

Quantum will be the most important technology in 2017, as it will touch everything as well as change everything. Until we see better integration of AR into enterprise apps, platforms, and published services, AR, like VR, will remain a niche-market gadget.

I do know companies like Microsoft, SAP, and Oracle have been looking at ways to leverage AR in their enterprise platforms and services such as ERP and CRM as well as Big Data Analytics; however, to see the volume of sales needed to give VR or AR staying power on a large scale, the vendors will need to make it a pragmatic, useful device on multiple fronts. And yes, it is great that we’re using VR and AR in healthcare, defense, engineering, and entertainment (including gaming); we just need to make it an everyday consumer device that people cannot live without.


2016 has been a remarkable year that’s brought continued growth and awareness to the worlds of Augmented, Virtual and Mixed Reality. Set to become a $165 billion industry by 2020, there’s still a common question that lingers among many newcomers trying to understand this fast-moving digital phenomenon we are just beginning to watch evolve: What’s the difference between them, and how will it impact the digital world as I currently know it?

Before we jump into the mind-blowing future Mixed Reality is set to usher in over the course of 2017, let’s first discuss the distinctions between Virtual and Augmented Reality. Their technologies are very similar but have some fundamental differences.

Running an experiment in the IBM Quantum Experience

IBM Research is making quantum computing available to the public for the first time, providing access to a quantum computing platform from any desktop or mobile device via the cloud. Users of the platform called the IBM Quantum Experience can create algorithms and run experiments on an IBM quantum processor, learn about quantum computing through tutorials and simulations, and get inspired by the possibilities of a quantum computer.

To learn more about IBM’s quantum computing research and get access to the IBM Quantum Experience please visit: http://ibm.com/quantumcomputing
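The platform itself is driven through a web-based circuit composer. As an illustration of the kind of experiment it supports, here is a minimal two-qubit Bell-state circuit written with Qiskit, IBM’s Python SDK for the same platform (a sketch run on a local statevector simulation rather than the cloud backend):

```python
from qiskit import QuantumCircuit
from qiskit.quantum_info import Statevector

# A two-qubit Bell-state experiment, the "hello world" of quantum computing.
# Sketch only: the Quantum Experience itself is driven through a web-based
# composer; Qiskit is the Python SDK for the same hardware.

qc = QuantumCircuit(2)
qc.h(0)        # Hadamard puts qubit 0 into an equal superposition
qc.cx(0, 1)    # CNOT entangles qubit 1 with qubit 0

state = Statevector.from_instruction(qc)
print(state.probabilities_dict())   # {'00': 0.5, '11': 0.5}
```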

Cryptographers Rally to NIST Call for Quantum Computer Algorithms

Has anyone besides the NSA, NIST, DARPA, IARPA, etc. realized and thought about what type of cyber warfare will exist in a QC world? The skillsets alone will be so far advanced beyond those of the techies we have seen in most companies today, as well as in most government agencies. Granted, we’re simplifying things with the platform; however, skillsets will still need to be more advanced than what we have seen from the standard techie.


Members of the cryptography community have expressed interest in the National Institute of Standards and Technology’s (NIST) recent call for algorithms less susceptible to attacks from a computer that does not yet exist.

NIST announced a call for proposals for post-quantum cryptography standardization on Dec. 20. One or more of the proposed algorithms will ultimately replace some of NIST’s cryptographic standards that are most vulnerable to quantum computers. According to Dustin Moody, a mathematician at NIST, 40 people have joined the agency’s online cryptography forum since the call was announced two weeks ago. The forum had about 200 members before the call went out. Moody said that many people were anticipating the announcement, as cryptography enthusiasts tend to run in the same circles.

“Most people who are interested in the field already knew about it,” Moody said. “The call wasn’t a surprise.”

Researchers Build FIRST Reprogrammable Quantum Computer!

Nice advancement this week in QC.


Researchers may have finally created the first fully reprogrammable quantum computer in the world. This changes the entire spectrum of the technology, as quantum computers until now could only run one type of equation.

This marks the beginning of reprogrammable quantum computers. Several teams and companies, including IBM, are still in the race toward quantum computing, and so far their machines can each run only one type of equation. This seems ironic, as quantum computers can theoretically perform more operations than there are atoms in the universe. But that limitation may now be ending.

According to Futurism, a team from the University of Maryland may have developed the first fully programmable quantum computer.

Computing at Light Speed: The World’s First Photonic Neural Network Has Arrived

In Brief

  • Princeton University researchers have developed the world’s first integrated silicon photonic neuromorphic chip, which contains 49 circular nodes etched into semiconductive silicon.
  • The chip could complete a math equation 1,960 times more quickly than a typical central processing unit, a speed that would make it ideal for use in future neural networks.

As developments are made in neural computing, we can continue to push artificial intelligence further. A fairly recent technology, neural networks have been taking over the world of data processing, giving machines advanced capabilities such as object recognition, face recognition, natural language processing, and machine translation.

These sound like simple things, but they were way out of reach for processors until scientists began to find ways to make machines behave more like human brains in how they learned and handled data. To do this, scientists have been focusing on building neuromorphic chips, circuits that operate in a similar fashion to neurons.
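To make “circuits that operate in a similar fashion to neurons” concrete, here is the standard mathematical neuron model such chips implement in hardware, a weighted sum of inputs passed through a nonlinearity (an illustrative sketch in ordinary software, not the Princeton chip’s photonic implementation):

```python
import numpy as np

# The basic neuron model that neuromorphic hardware implements physically:
# output = nonlinearity(weighted sum of inputs + bias). The Princeton chip
# realizes this with light in silicon; here it is in plain software.

def neuron(inputs: np.ndarray, weights: np.ndarray, bias: float) -> float:
    activation = np.dot(weights, inputs) + bias
    return 1.0 / (1.0 + np.exp(-activation))   # sigmoid nonlinearity

x = np.array([0.5, 0.9, 0.1])     # incoming signals
w = np.array([0.4, 0.6, -0.2])    # learned connection strengths
print(f"neuron output: {neuron(x, w, bias=0.1):.3f}")   # ~0.694
```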