
If you’ve ever tried to learn how to spin a pencil in your hand, you’ll know it takes some concerted effort—but it’s even harder for a robot. Now, though, researchers have finally built a ‘bot that can learn to do it.

The reason that tasks like spinning a stick are hard is that a lot happens in a very short time. As the stick moves, the forces exerted by the hand can easily send it flying out of control if they’re not perfectly co-ordinated. Sensing where the stick is and varying the hand’s motion is an awful lot for even the smartest algorithms to handle based on a list of rules.

Read more

QC meets Blockchaining; nice.


CoinFac Limited, a technology company, has recently introduced the next generation quantum computing technology into cryptocurrency mining, allowing current Bitcoin and Altcoin miners to enjoy a 4,000 times speed increase.

Quantum computing is widely seen as the next generation of supercomputing, capable of processing dense digital information and generating multi-sequential algorithmic solutions 100,000 times faster than conventional computers. With each quantum computing server carrying an exorbitant price tag of $5 million to $10 million, this combination of advanced servers and a new wave of currency systems could be one of the most disruptive events in the cryptocurrency ecosystem.

“We envisioned cryptocurrency to be the game changer in most developed country’s economy within the next 5 years. Reliance of quantum computing technology expedite the whole process, and we will be recognized as the industry leader in bringing about this tidal change. We aren’t the only institution fathom to leverage on this technology. Other Silicon big boys are already in advance talks of a possible tie up”, said Mike Howzer, CEO of CoinFac Limited. “Through the use of quantum computing, usual bitcoin mining processes are expedited by a blazing speed of 4,000 times. We bring lucrative mining back into Bitcoin industry, all over again”.

Researchers at the University of Liverpool have developed a set of algorithms that will help teach computers to process and understand human languages.

Whilst mastering language is easy for humans, it is something that computers have not yet been able to achieve. Humans understand words in a variety of ways: for example, by looking a word up in a dictionary, or by associating it with other words in the same sentence in a meaningful way.

The algorithms will enable a computer to act in much the same way as a human would when encountering an unknown word. When the computer comes across a word it doesn’t recognise or understand, it looks the word up in a dictionary (such as WordNet) and tries to guess what other words should appear alongside this unknown word in the text.
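The two-step strategy described above (dictionary lookup first, then guessing from context) can be sketched in toy form. The mini-dictionary and tiny corpus below are invented for illustration; a real system would query WordNet and learn co-occurrence statistics from a large text collection.

```python
from collections import Counter

# Toy stand-in for a WordNet-style dictionary: word -> gloss.
DICTIONARY = {
    "bank": "an institution for receiving and lending money",
    "river": "a large natural stream of water",
}

# Toy corpus used to learn which words tend to co-occur.
CORPUS = [
    "the river bank was muddy",
    "she deposited money at the bank",
    "the bank approved the loan",
]

def define_or_guess(word, context):
    """Look the word up; if unknown, guess likely companion words
    by counting what co-occurs with the context words in the corpus."""
    if word in DICTIONARY:
        return ("definition", DICTIONARY[word])
    counts = Counter()
    for sentence in CORPUS:
        tokens = sentence.split()
        if any(c in tokens for c in context):
            counts.update(t for t in tokens if t not in context)
    return ("guess", [w for w, _ in counts.most_common(3)])

print(define_or_guess("bank", []))        # known word: returns the definition
print(define_or_guess("loan", ["bank"]))  # unknown word: guesses from co-occurrence
```

The interesting branch is the second one: with no dictionary entry, the system falls back on distributional evidence, exactly the human-like behaviour the excerpt describes.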

Read more

Hmmm; the jury is still out for me on this one, because I haven’t seen anything showing me that IBM is a real player in this space.


IBM is bringing quantum computing to a device near you by delivering its IBM Quantum Experience through the IBM Cloud. The platform is part of IBM’s Research Frontiers Institute and could be a data scientist’s newest tool and a data junkie’s dream come true.

The platform is available on any desktop or mobile device. The tech allows users to “run algorithms and experiments on IBM’s quantum processor, work with the individual quantum bits (qubits), and explore tutorials and simulations around what might be possible with quantum computing,” the press release noted.

The processor itself, which is housed at the T.J. Watson Research Center in New York, is made up of five superconducting qubits.
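For readers new to qubits: a single qubit’s state can be simulated classically as a pair of complex amplitudes. The plain-Python sketch below (no IBM tooling involved; it is only a toy illustration, not how the IBM platform works internally) applies a Hadamard gate to the |0⟩ state and shows the resulting 50/50 measurement probabilities.

```python
import math

# A qubit state is a pair of complex amplitudes (alpha, beta)
# with |alpha|^2 + |beta|^2 = 1.
def hadamard(state):
    """Apply the Hadamard gate, which maps |0> to an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    """Probabilities of measuring 0 or 1."""
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

zero = (1 + 0j, 0 + 0j)      # the |0> basis state
superposed = hadamard(zero)
print(probabilities(superposed))  # both outcomes equally likely: (0.5, 0.5)
```

Applying the gate a second time returns the qubit to |0⟩, which is the kind of reversible behaviour users can explore interactively on the five-qubit processor through the Quantum Experience.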

Cambridge University spin-out Optalysys has been awarded a $350k grant for a 13-month project from the US Defense Advanced Research Projects Agency (DARPA). The project will see the company advance their research in developing and applying their optical co-processing technology to solving complex mathematical equations. These equations are relevant to large-scale scientific and engineering simulations such as weather prediction and aerodynamics.

The Optalysys technology is extremely energy efficient, using light rather than electricity to perform intensive mathematical calculations. The company aims to provide existing computer systems with massively boosted processing capabilities, eventually reaching exaFLOP rates (a billion billion calculations per second). The technology operates at a fraction of the energy cost of conventional high-performance computers (HPCs) and has the potential to operate orders of magnitude faster.

In April 2015 Optalysys announced that they had successfully built a scalable, lens-less optical processing prototype that can perform mathematical functions. Codenamed Project GALELEO, the device demonstrates that second order derivatives and correlation pattern matching can be performed optically in a scalable design.
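To make “correlation pattern matching” concrete: an optical correlator computes, in parallel with light, essentially the same operation as the naive digital cross-correlation below. This is a deliberately simplified 1D sketch, not Optalysys’s implementation; high scores mark positions where the pattern matches the signal.

```python
def cross_correlate(signal, pattern):
    """Slide the pattern across the signal; high scores mark likely matches.
    An optical correlator evaluates all positions simultaneously with light."""
    n, m = len(signal), len(pattern)
    return [sum(signal[i + j] * pattern[j] for j in range(m))
            for i in range(n - m + 1)]

signal = [0, 1, 3, 1, 0, 0, 1, 3, 1, 0]
pattern = [1, 3, 1]
scores = cross_correlate(signal, pattern)
print(scores.index(max(scores)))  # first best match starts at index 1
```

Done electronically, the cost grows with signal length times pattern length; done optically, every shift is evaluated at once, which is where the claimed energy and speed advantages come from.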

Read more

Nice; however, I also see 3D printing, along with machine learning, becoming part of cosmetic procedures and surgeries.


With an ever-increasing volume of electronic data being collected by the healthcare system, researchers are exploring the use of machine learning—a subfield of artificial intelligence—to improve medical care and patient outcomes. An overview of machine learning and some of the ways it could contribute to advancements in plastic surgery are presented in a special topic article in the May issue of Plastic and Reconstructive Surgery®, the official medical journal of the American Society of Plastic Surgeons (ASPS).

“Machine learning has the potential to become a powerful tool in plastic surgery, allowing surgeons to harness complex clinical data to help guide key clinical decision-making,” write Dr. Jonathan Kanevsky of McGill University, Montreal, and colleagues. They highlight some key areas in which machine learning and “Big Data” could contribute to progress in plastic and reconstructive surgery.

Machine Learning Shows Promise in Plastic Surgery Research and Practice

Machine learning analyzes historical data to develop algorithms capable of knowledge acquisition. Dr. Kanevsky and coauthors write, “Machine learning has already been applied, with great success, to process large amounts of complex data in medicine and surgery.” Projects with healthcare applications include the IBM Watson Health cognitive computing system and the American College of Surgeons’ National Surgical Quality Improvement Program.
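As a concrete (and deliberately simplified) illustration of “analyzing historical data to acquire knowledge”: the toy nearest-neighbour classifier below predicts an outcome for a new case from the most similar previously recorded case. The feature values and labels are invented; real clinical models involve far richer data and rigorous validation.

```python
def nearest_neighbor(history, new_case):
    """Predict the label of new_case from the most similar past case.
    history: list of (features, label); features are numeric tuples."""
    def distance(a, b):
        return sum((x - y) ** 2 for x, y in zip(a, b))
    _, label = min(history, key=lambda rec: distance(rec[0], new_case))
    return label

# Invented historical records: (age, wound_size_cm) -> healing outcome.
history = [
    ((25, 2.0), "good"),
    ((34, 3.5), "good"),
    ((61, 6.0), "poor"),
    ((70, 5.5), "poor"),
]
print(nearest_neighbor(history, (30, 3.0)))  # nearest past case is "good"
```

The point of the sketch is the workflow, not the algorithm: past cases in, a data-driven prediction out, which is the pattern the authors envision for guiding clinical decision-making.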

Given the pace at which quantum computing is developing, NIST is rushing to create quantum-proof cryptographic algorithms to prevent QC hacking. As I have stated before, I believe we’re now less than 7 years away from QC being in many mainstream devices, infrastructure, etc. And with China and its partnership with Australia, the race is now on and hotter than ever.


The National Institute of Standards and Technology has begun to look into quantum cybersecurity, according to a new report that details ways scientists could protect against these futuristic computers.

April 29, 2016.

Ransomware has taken off in 2016, already eclipsing the attack numbers cited in a recently published threat report from Symantec.

A post-quantum cryptography discussion in Tacoma, WA on May 5th will cover hacking by QC attackers and leveraging cryptographic algorithms to offset the attacks; it may be of interest to sit in and even join the debates. I will try to attend if I can, because it would be interesting to see the arguments raised and the responses.


The University of Washington Tacoma Institute of Technology will present a discussion about the esoteric field of post-quantum cryptography at the Northwest Cybersecurity Symposium on May 5.

“I’ve been researching post-quantum cryptography for years, finding ways to protect against a threat that doesn’t yet exist,” said Anderson Nascimento, assistant professor of computer science at the institute, in a release.

Post-quantum cryptography refers to encryption that would be secure against an attack by a quantum computer — a kind of supercomputer using quantum mechanics, which, so far, exists only in theory.
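To see why quantum computers worry cryptographers: widely used public-key schemes such as RSA rest on the difficulty of factoring large integers, and Shor’s algorithm on a quantum computer would factor them efficiently. The toy trial-division sketch below (numbers purely illustrative) cracks a small semiprime instantly, but the same approach is hopeless for the 2048-bit moduli used in practice; post-quantum schemes are instead built on problems believed hard even for quantum machines.

```python
def trial_division_factor(n):
    """Classical factoring by trial division: fine for tiny numbers,
    hopeless for the 600+ digit moduli real RSA uses. Shor's algorithm
    on a quantum computer would make even those feasible."""
    f = 2
    while f * f <= n:
        if n % f == 0:
            return f, n // f
        f += 1
    return n, 1  # n is prime

# A toy "RSA modulus": the semiprime 3233 = 53 * 61.
p, q = trial_division_factor(3233)
print(p, q)  # recovering the two primes breaks the toy key
```

Classically, the work grows roughly with the square root of the modulus, so doubling the key length multiplies the effort enormously; Shor’s algorithm removes that exponential barrier, which is the threat post-quantum cryptography is designed to survive.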