Archive for the ‘information science’ category

Jan 20, 2017

Insecticides mimic melatonin, creating higher risk for diabetes

Posted by in categories: biotech/medical, computing, health, information science

Synthetic chemicals commonly found in insecticides and garden products bind to the receptors that govern our biological clocks, University at Buffalo researchers have found. The research suggests that exposure to these insecticides adversely affects melatonin receptor signaling, creating a higher risk for metabolic diseases such as diabetes.

Published online on Dec. 27 in Chemical Research in Toxicology, the research combined a big data approach, using computer modeling on millions of chemicals, with standard wet-laboratory experiments. It was funded by a grant from the National Institute of Environmental Health Sciences, part of the National Institutes of Health.

Disruptions in human circadian rhythms are known to put people at higher risk for diabetes and other metabolic diseases, but the mechanisms involved are not well understood.
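The big-data half of the study screened millions of chemicals computationally before wet-lab confirmation. A minimal sketch of that kind of screening loop follows; carbaryl and carbofuran are real carbamate insecticides, but the feature sets, scoring function, and threshold here are invented for illustration and are not the study's actual model.

```python
# Hypothetical sketch of a computational screening step: score a library of
# chemicals against a receptor model and keep the strongest predicted binders.
# The scoring here is a toy feature-overlap count, not a real docking model.

def binding_score(chemical, receptor):
    # Stand-in for a real docking/QSAR score: count shared features between
    # the chemical and the receptor's binding site.
    return len(chemical["features"] & receptor["site_features"])

def screen(library, receptor, threshold):
    """Return names of chemicals whose predicted score meets the threshold."""
    return [c["name"] for c in library if binding_score(c, receptor) >= threshold]

melatonin_receptor = {"site_features": {"indole_like", "amide", "methoxy"}}
library = [
    {"name": "carbaryl",   "features": {"amide", "aryl", "methoxy"}},
    {"name": "carbofuran", "features": {"amide", "methoxy", "furan"}},
    {"name": "control_X",  "features": {"sulfonate"}},
]

hits = screen(library, melatonin_receptor, threshold=2)
print(hits)  # the two carbamates share enough features to flag; the control does not
```

In a real pipeline the scoring function would be a docking or machine-learned affinity model, and the flagged hits would then go to wet-lab receptor assays, as in the study.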

Continue reading “Insecticides mimic melatonin, creating higher risk for diabetes” »

Jan 20, 2017

China, already dominant in supercomputers, shoots for an exascale prototype in 2017

Posted by in categories: information science, nuclear energy, robotics/AI, supercomputing

Back in June, China debuted the world’s fastest supercomputer, the Sunway TaihuLight (pictured), with a Linpack benchmark result of 93 petaflop/s. That machine contains 40,960 locally developed ShenWei processors, each with 260 cores and roughly comparable to Intel’s Knights Landing Xeon Phi CPU. China also developed a 136GB/sec memory controller and a custom interconnect that delivers 16GB/sec of peak bandwidth between nodes.

Now China is working on a prototype exascale (1,000-petaflop) system that it aims to complete by the end of this year, according to state media. An exascale computer is capable of a quintillion calculations per second and could deliver vast dividends in deep learning and big data across disciplines as varied as nuclear test research, code breaking, and weather forecasting.

“A complete computing system of the exascale supercomputer and its applications can only be expected in 2020, and will be 200 times more powerful than the country’s first petaflop computer Tianhe-1, recognized as the world’s fastest in 2010,” said Zhang Ting, an application engineer at Tianjin’s National Super Computer Center, to Xinhua news agency (via AFP).
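A quick unit check on the figures above: a quintillion operations per second is 10^18 flop/s, i.e. 1,000 petaflop/s, which puts an exascale machine a little over an order of magnitude beyond TaihuLight's 93-petaflop Linpack result.

```python
# Unit check for the figures in this post: exascale = a quintillion (10**18)
# floating-point operations per second = 1,000 petaflop/s.
PETA = 10**15
EXA = 10**18

taihulight_pflops = 93           # Sunway TaihuLight Linpack result, petaflop/s
exascale_pflops = EXA // PETA    # 1,000 petaflop/s

print(exascale_pflops)                       # 1000
print(exascale_pflops / taihulight_pflops)   # roughly 10.75x today's leader
```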

Continue reading “China, already dominant in supercomputers, shoots for an exascale prototype in 2017” »

Jan 19, 2017

EyeLock to exhibit iris authentication technology at Intersec Dubai 2017

Posted by in categories: information science, security

For all my friends in Dubai or travelling to Dubai: wish I could go.

EyeLock LLC will be exhibiting its suite of iris authentication technology at Intersec Dubai 2017, on January 22–24 at the Dubai International Convention and Exhibition Centre in Dubai.

Featuring EyeLock’s proprietary software, security, algorithms and optics, the iris authentication technology delivers secure, reliable and user-friendly capabilities, according to the company.

EyeLock’s technology analyzes more than 240 unique iris characteristics to deliver dual-eye authentication, an unmatched security architecture and anti-spoofing technology.
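EyeLock's matching algorithm is proprietary, but iris recognition systems generally encode an iris as a fixed-length binary template and compare templates by normalized Hamming distance (the classic Daugman approach). The sketch below assumes that style of matcher; the 240-bit codes and the threshold are illustrative, not EyeLock's.

```python
# Illustrative iris matching via normalized Hamming distance between binary
# iris codes (Daugman-style). EyeLock's actual algorithm is proprietary; the
# example codes and threshold below are made up for demonstration.

def hamming_distance(code_a, code_b):
    """Fraction of differing bits between two equal-length iris codes."""
    assert len(code_a) == len(code_b)
    diffs = sum(a != b for a, b in zip(code_a, code_b))
    return diffs / len(code_a)

THRESHOLD = 0.32  # a typical decision boundary in the iris literature

enrolled   = "11010011" * 30                    # 240-bit enrolled template
probe_same = "11010011" * 29 + "11010111"       # same eye, one bit of noise
probe_diff = "10101010" * 30                    # a different eye

print(hamming_distance(enrolled, probe_same))   # small -> accept
print(hamming_distance(enrolled, probe_diff))   # large -> reject
```

Distances from the same eye cluster near zero, while codes from different eyes land near 0.5, which is what makes a single threshold workable; anti-spoofing and dual-eye checks are layered on top in commercial systems.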

Continue reading “EyeLock to exhibit iris authentication technology at Intersec Dubai 2017” »

Jan 16, 2017

End to Illness: Machine Learning Is Revolutionizing How We Prevent Disease

Posted by in categories: biotech/medical, genetics, information science, robotics/AI

The TeraStructure algorithm can analyze genome sets much larger than current systems can efficiently handle, including those as big as 100,000 or 1 million genomes. Finding an efficient way to analyze genome databases would allow for personalized healthcare that takes into account any genetic mutations that could exist in a person’s DNA.

Read more

Jan 10, 2017

10 Powerful Examples Of Artificial Intelligence In Use Today

Posted by in categories: information science, life extension, quantum physics, robotics/AI

Not sure where the author got his messaging on AI and QC (namely, that AI will become more fluid and human-like thanks to QC), but it sounds a lot like my words. However, there is one missing piece to the AI story, even with QC making AI more human-like, and that is having Synbio involved in the mix. In fact, I cannot wait to see what my friend Alex Zhavoronkov and his team do with QC in his anti-aging work. I expect to see many great things from QC, AI, and Synbio together.

Nonetheless, I am glad to see others also seeing the capability that many of us do see.

Applications of Artificial Intelligence In Use Today

Continue reading “10 Powerful Examples Of Artificial Intelligence In Use Today” »

Jan 8, 2017

Mixed Reality will be most important tech of 2017

Posted by in categories: augmented reality, engineering, information science, quantum physics, virtual reality

Quantum will be the most important technology in 2017, as it will touch everything as well as change everything. Until we see better integration of AR into enterprise apps, platforms, and published services, AR, like VR, will remain a niche-market gadget.

I do know that companies like Microsoft, SAP, and Oracle have been looking at ways to leverage AR in their enterprise platforms and services, such as ERP and CRM as well as big data analytics. However, to see the volume of sales needed to give VR or AR staying power on a large scale, vendors will need to make these pragmatic, useful devices on multiple fronts. And yes, it is great that we’re using VR and AR in healthcare, defense, engineering, and entertainment (including gaming); we just need to make them everyday consumer devices that people cannot live without.

2016 has been a remarkable year that brought continued growth and awareness to the worlds of augmented, virtual, and mixed reality. With the field set to become a $165 billion industry by 2020, a common question still lingers among the many newcomers trying to understand this fast-moving digital phenomenon we are just beginning to watch evolve: what’s the difference between them, and how will they impact the digital world as I currently know it?

Continue reading “Mixed Reality will be most important tech of 2017” »

Jan 8, 2017

Running an experiment in the IBM Quantum Experience

Posted by in categories: computing, information science, quantum physics

IBM Research is making quantum computing available to the public for the first time, providing access to a quantum computing platform from any desktop or mobile device via the cloud. Users of the platform called the IBM Quantum Experience can create algorithms and run experiments on an IBM quantum processor, learn about quantum computing through tutorials and simulations, and get inspired by the possibilities of a quantum computer.
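Users of the platform compose circuits from quantum gates, and the effect of a small circuit can be previewed with a state-vector simulation on paper or in a few lines of code. The sketch below is a pedagogical simulation, not IBM's actual interface: a Hadamard gate followed by a CNOT, the canonical two-gate circuit that produces an entangled Bell state.

```python
# Minimal state-vector simulation of the kind of circuit a Quantum Experience
# user might compose: Hadamard on qubit 0, then CNOT, yielding a Bell state.
# This is an illustrative sketch, not IBM's API.
import math

# Two-qubit state as amplitudes for |00>, |01>, |10>, |11>
state = [1.0, 0.0, 0.0, 0.0]

def hadamard_q0(s):
    """Apply a Hadamard gate to the first (left) qubit."""
    h = 1 / math.sqrt(2)
    return [h * (s[0] + s[2]), h * (s[1] + s[3]),
            h * (s[0] - s[2]), h * (s[1] - s[3])]

def cnot(s):
    """CNOT with qubit 0 as control: swaps the |10> and |11> amplitudes."""
    return [s[0], s[1], s[3], s[2]]

state = cnot(hadamard_q0(state))
probs = [round(abs(a) ** 2, 3) for a in state]
print(probs)  # [0.5, 0.0, 0.0, 0.5]: measurement gives 00 or 11, each half the time
```

On the real platform the same circuit is built in the graphical composer and run on actual quantum hardware rather than simulated classically.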

To learn more about IBM’s quantum computing research and get access to the IBM Quantum Experience please visit:

Continue reading “Running an experiment in the IBM Quantum Experience” »

Jan 5, 2017

Cryptographers Rally to NIST Call for Quantum Computer Algorithms

Posted by in categories: cybercrime/malcode, encryption, government, information science, military, privacy, quantum physics

Has anyone besides the NSA, NIST, DARPA, IARPA, etc. realized and thought about what type of cyber warfare will exist in a QC world? The skillsets alone will be so far advanced beyond those of the techies we have seen in most companies today, as well as in most government agencies. Granted, we’re simplifying things with the platform; however, skillsets will still need to be more advanced than what we have seen from the standard techie.

Members of the cryptography community have expressed interest in the National Institute of Standards and Technology’s (NIST) recent call for an algorithm less susceptible to hacks from a computer that does not exist yet.

NIST announced a call for proposals for post-quantum cryptography standardization on Dec. 20. One or more of the proposed algorithms will ultimately replace some of NIST’s cryptographic standards that are most vulnerable to quantum computers. According to Dustin Moody, a mathematician at NIST, 40 people have joined the agency’s online cryptography forum since the call was announced two weeks ago. The forum had about 200 members before the call went out. Moody said that many people were anticipating the announcement, as cryptography enthusiasts tend to run in the same circles.
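Hash-based signatures are one of the candidate families in post-quantum cryptography, and their simplest member, the Lamport one-time signature, fits in a few lines. The sketch below is illustrative only; actual NIST submissions use far more elaborate schemes, and a Lamport key must never sign more than one message.

```python
# Sketch of a Lamport one-time signature, the simplest member of the
# hash-based family among post-quantum approaches. Illustrative only.
import hashlib
import secrets

def H(data):
    return hashlib.sha256(data).digest()

def keygen():
    # Two secret 32-byte values per message-digest bit; the public key
    # is their hashes.
    sk = [[secrets.token_bytes(32) for _ in range(2)] for _ in range(256)]
    pk = [[H(pair[0]), H(pair[1])] for pair in sk]
    return sk, pk

def sign(message, sk):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return [sk[i][b] for i, b in enumerate(bits)]  # reveal one secret per bit

def verify(message, sig, pk):
    digest = H(message)
    bits = [(digest[i // 8] >> (7 - i % 8)) & 1 for i in range(256)]
    return all(H(sig[i]) == pk[i][b] for i, b in enumerate(bits))

sk, pk = keygen()
sig = sign(b"post-quantum hello", sk)
print(verify(b"post-quantum hello", sig, pk))  # True
print(verify(b"tampered message", sig, pk))    # False
```

Security rests only on the one-wayness of the hash function, which is why this family resists quantum attacks that break RSA and elliptic-curve schemes.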

Continue reading “Cryptographers Rally to NIST Call for Quantum Computer Algorithms” »

Jan 5, 2017

Researchers Build FIRST Reprogrammable Quantum Computer!

Posted by in categories: computing, information science, particle physics, quantum physics

Nice advancement this week in QC.

Researchers may have finally created the first fully reprogrammable quantum computer in the world. This changes the entire spectrum of the technology, as quantum computers so far could each run only one type of algorithm.

This marks the beginning of reprogrammable quantum computers. Several teams and companies, IBM among them, are still in the race toward quantum computing, whose machines until now could each run only one type of algorithm. That seems ironic, as they can theoretically perform more operations than there are atoms in the universe. But this stops now.

Continue reading “Researchers Build FIRST Reprogrammable Quantum Computer!” »

Jan 2, 2017

Computing at Light Speed: The World’s First Photonic Neural Network Has Arrived

Posted by in categories: information science, mathematics, robotics/AI

In Brief

  • Princeton University researchers have developed the world’s first integrated silicon photonic neuromorphic chip, which contains 49 circular nodes etched into semiconductive silicon.
  • The chip could solve a mathematical equation 1,960 times more quickly than a typical central processing unit, a speed that would make it ideal for use in future neural networks.

As developments are made in neural computing, we can continue to push artificial intelligence further. A fairly recent technology, neural networks have been taking over the world of data processing, giving machines advanced capabilities such as object recognition, face recognition, natural language processing, and machine translation.

These sound like simple things, but they were far out of reach for processors until scientists began to find ways to make machines behave more like human brains in how they learn and handle data. To do this, scientists have been focusing on building neuromorphic chips: circuits that operate in a fashion similar to neurons.
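A network of coupled neuron-like nodes, of the kind the Princeton chip implements in silicon photonics, can be emulated in software by numerically integrating the node dynamics. The toy sketch below assumes simple leaky-tanh node dynamics and made-up weights; the chip's actual node physics and scale (49 photonic nodes) differ.

```python
# Toy software emulation of a small network of coupled neuron-like nodes,
# integrated with Euler steps until it settles. The leaky-tanh model and the
# weights here are illustrative, not the Princeton chip's actual dynamics.
import math

N = 4
# Fixed, weak couplings so the network relaxes to a stable state.
W = [[0.1 if i != j else 0.0 for j in range(N)] for i in range(N)]
bias = [0.5, -0.2, 0.3, 0.1]

def step(x, dt=0.05):
    """One Euler step of dx_i/dt = -x_i + tanh(sum_j W_ij x_j + b_i)."""
    return [
        x[i] + dt * (-x[i] + math.tanh(sum(W[i][j] * x[j] for j in range(N)) + bias[i]))
        for i in range(N)
    ]

x = [0.0] * N
for _ in range(2000):
    x_next = step(x)
    if max(abs(a - b) for a, b in zip(x, x_next)) < 1e-12:
        break  # converged to a fixed point
    x = x_next

print([round(v, 3) for v in x])  # the settled node states encode the answer
```

Solving an equation this way means letting the physical (or simulated) network relax to its fixed point; a photonic implementation performs every "step" at the speed of light propagation, which is where the reported speedup over a CPU comes from.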

Continue reading “Computing at Light Speed: The World’s First Photonic Neural Network Has Arrived” »
