I am glad that D. Whyte recognizes “If quantum computers are developed faster than anticipated, certification would mandate insecure modules, given the time to approve and implement new quantum resistant algorithms. Worse, it is conceivable that data encrypted by a certified module is more vulnerable than data encrypted by a non-certified module that has the option of using a quantum-safe encryption algorithm.”
That recognition matters because many of us who are researching and developing in this space have seen the pace of development accelerate this year; what looked like a 10-year horizon now looks like less than seven years.
Dr. William Whyte, Chief Scientist for Security Innovation, a cybersecurity provider and leader in the 2015 Gartner Magic Quadrant for Security Awareness Training, will be presenting at the Fourth International Cryptographic Module Conference in Ottawa, Ontario.
A team of Australian physicists has developed a new research assistant, in the form of an artificial intelligence (AI) algorithm, to carry out experiments in quantum mechanics; the algorithm quickly took control of the experiment, learned the task and even innovated. In a statement, co-lead researcher Paul Wigley from the Australian National University (ANU) Research School of Physics and Engineering said he had not expected the machine to be able to conduct the experiment itself from scratch within an hour.
He added that a simple computer program would have taken longer than the age of the universe to work through all the possible combinations.
The scientists set out to reproduce the experiment that won the 2001 Nobel Prize in Physics, in which an extremely cold gas trapped in a laser beam forms a Bose-Einstein condensate.
Given that Los Alamos National Laboratory has been advancing, and continues to advance, cybersecurity work on the quantum internet, as well as working in partnership with other labs and universities, why isn’t Mason collaborating with Los Alamos on developing an improved hacker-proof net? Going it alone doesn’t look like the most effective or cost-efficient approach.
Imagine burglars have targeted your home, but before they break in, you’ve already moved and are safe from harm.
Now apply that premise to protecting a computer network from attack. Hackers try to bring down a network, but critical tasks are a step ahead of them, thanks to complex algorithms. The dreaded “network down” or denial of service message never flashes on your screen.
That’s the basic idea behind new research by George Mason University researchers, who recently landed some $4 million in grants from the Defense Advanced Research Projects Agency (DARPA). George Mason’s researchers are leading an effort that includes Columbia University, Penn State University and BAE Systems.
Theoretical chemists at Princeton University have pioneered a strategy for modeling quantum friction, or how a particle’s environment drags on it, a vexing problem in quantum mechanics since the birth of the field. The study was published in the Journal of Physical Chemistry Letters (“Wigner–Lindblad Equations for Quantum Friction”). “It was truly a most challenging research project in terms of technical details and the need to draw upon new ideas,” said Denys Bondar, a research scholar in the Rabitz lab and corresponding author on the work.
Quantum friction may operate at the smallest scale, but its consequences can be observed in everyday life. For example, when fluorescent molecules are excited by light, it’s because of quantum friction that the atoms are returned to rest, releasing photons that we see as fluorescence. Realistically modeling this phenomenon has stumped scientists for almost a century and recently has gained even more attention due to its relevance to quantum computing.
An algorithm developed by Google is designed to encode thought, which could lead to computers with ‘common sense’ within a decade, says a leading AI scientist.
If you’ve ever seen a “recommended item” on eBay or Amazon that was just what you were looking for (or maybe didn’t know you were looking for), it’s likely the suggestion was powered by a recommendation engine. In a recent interview, Co-founder of machine learning startup Delvv, Inc., Raefer Gabriel, said these applications for recommendation engines and collaborative filtering algorithms are just the beginning of a powerful and broad-reaching technology.
Raefer Gabriel, Delvv, Inc.
Gabriel noted that content discovery on services like Netflix, Pandora, and Spotify is most familiar to people because of the way these services seem to “speak” to one’s preferences in movies, games, and music. Their relatively narrow focus on entertainment is a common thread that has made them successful as constrained domains. The challenge lies in developing recommendation engines for unbounded domains, like the internet, where there is more or less unlimited information.
“Some of the more unbounded domains, like web content, have struggled a little bit more to make good use of the technology that’s out there. Because there is so much unbounded information, it is hard to represent well, and to match well with other kinds of things people are considering,” Gabriel said. “Most of the collaborative filtering algorithms are built around some kind of matrix factorization technique and they definitely tend to work better if you bound the domain.”
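As a rough illustration of the matrix factorization technique Gabriel mentions, the following Python sketch learns low-rank user and item factors by stochastic gradient descent on the observed entries of a small rating matrix; the ratings, factor count, and hyperparameters are made up for illustration and are not drawn from the interview.

    import numpy as np

    # Toy user-item rating matrix (0 = unobserved). Rows are users, columns are items.
    R = np.array([
        [5, 3, 0, 1],
        [4, 0, 0, 1],
        [1, 1, 0, 5],
        [0, 0, 5, 4],
    ], dtype=float)

    n_users, n_items = R.shape
    k = 2                                            # number of latent factors
    rng = np.random.default_rng(0)
    U = rng.normal(scale=0.1, size=(n_users, k))     # user factors
    V = rng.normal(scale=0.1, size=(n_items, k))     # item factors

    lr, reg = 0.01, 0.02
    observed = np.argwhere(R > 0)

    # Stochastic gradient descent over observed entries only.
    for _ in range(2000):
        for u, i in observed:
            err = R[u, i] - U[u] @ V[i]
            u_old = U[u].copy()
            U[u] += lr * (err * V[i] - reg * U[u])
            V[i] += lr * (err * u_old - reg * V[i])

    # The reconstructed matrix scores every user-item pair; unobserved entries
    # become candidate recommendations, ranked by predicted score.
    predictions = U @ V.T
    print(np.round(predictions, 2))

Bounding the domain, as Gabriel notes, is what keeps a factorization like this tractable: the narrower the catalogue, the denser and more informative the observed entries become.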
Of all the recommendation engines and collaborative filters on the web, Gabriel cites Amazon’s as the most ambitious. The eCommerce giant utilizes a number of strategies to make item-to-item recommendations, suggest complementary purchases, capture user preferences, and more. The key to developing those recommendations lies less in any single algorithm than in the value of the data Amazon is able to feed into it initially; reaching a critical mass of data on user preferences makes it much easier to create recommendations for new users.
“In order to handle those fresh users coming into the system, you need to have some way of modeling what their interest may be based on that first click that you’re able to extract out of them,” Gabriel said. “I think that intersection point between data warehousing and machine learning problems is actually a pretty critical intersection point, because machine learning doesn’t do much without data. So, you definitely need good systems to collect the data, good systems to manage the flow of data, and then good systems to apply models that you’ve built.”
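One minimal way to picture the cold-start handling Gabriel describes, assuming item factors have already been learned offline (for example by a factorization like the sketch above), is to let a brand-new user’s first click stand in as a provisional profile. The item indices and factor values below are invented for illustration.

    import numpy as np

    # Hypothetical item factor matrix learned offline; rows are items,
    # columns are latent factors.
    V = np.array([
        [0.9, 0.1],
        [0.8, 0.2],
        [0.1, 0.9],
        [0.2, 0.8],
    ])

    def cold_start_scores(first_click_item: int) -> np.ndarray:
        """Treat the first clicked item's factor vector as a provisional
        user profile and score every item against it."""
        profile = V[first_click_item]
        scores = V @ profile
        scores[first_click_item] = -np.inf   # don't re-recommend the same item
        return scores

    # A brand-new user clicks item 0; the remaining items are ranked by similarity.
    print(np.argsort(-cold_start_scores(0)))

The sketch also hints at why Gabriel calls the intersection of data warehousing and machine learning critical: the factors have to be collected, stored, and served before any of this scoring can happen.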
Beyond consumer-oriented uses, Gabriel has seen recommendation engines and collaborative filter systems used in a narrow scope for medical applications and in manufacturing. In healthcare for example, he cited recommendations based on treatment preferences, doctor specialties, and other relevant decision-based suggestions; however, anything you can transform into a “model of relationships between items and item preferences” can map directly onto some form of recommendation engine or collaborative filter.
One of the most important elements that has driven the development of recommendation engines and collaborative filtering algorithms is the Netflix Prize, Gabriel said. The competition, which offered a $1 million prize to anyone who could design an algorithm to improve upon Netflix’s proprietary recommendation engine, allowed entrants to use pieces of the company’s own user data to develop a better algorithm. The competition spurred a great deal of interest in the potential applications of collaborative filtering and recommendation engines, he said.
In addition, relatively easy access to an abundance of cheap memory is another driving force behind the development of recommendation engines. An eCommerce company like Amazon, with millions of items, needs plenty of memory to store millions of different pieces of item and correlation data while also storing user data in potentially large blocks.
“You have to think about a lot of matrix data in memory. And it’s a matrix, because you’re looking at relationships between items and other items and, obviously, the problems that get interesting are ones where you have lots and lots of different items,” Gabriel said. “All of the fitting and the data storage does need quite a bit of memory to work with. Cheap and plentiful memory has been very helpful in the development of these things at the commercial scale.”
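A back-of-the-envelope calculation makes the memory point concrete. The catalogue size and neighbour count below are assumptions chosen purely for illustration, not figures from the interview.

    # Rough memory estimate for an item-item relationship matrix.
    n_items = 1_000_000          # assumed catalogue size
    bytes_per_float = 4          # float32 scores

    dense_bytes = n_items * n_items * bytes_per_float
    print(f"dense matrix: {dense_bytes / 1e12:.0f} TB")                 # ~4 TB

    # In practice the matrix is extremely sparse: keep, say, the top 100
    # neighbours per item (index + score, roughly 8 bytes each).
    neighbours = 100
    sparse_bytes = n_items * neighbours * 8
    print(f"top-{neighbours} neighbours: {sparse_bytes / 1e9:.1f} GB")  # ~0.8 GB

Even with aggressive sparsification, holding and updating that kind of structure at commercial scale is only practical because memory has become cheap and plentiful, which is exactly the point Gabriel makes.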
Looking forward, Gabriel sees recommendation engines and collaborative filtering systems evolving more toward predictive analytics and getting a handle on the unbounded domain of the internet. While those efforts may ultimately be driven by the Google Now platform, he foresees a time when recommendation-driven data will merge with search data to provide search results before you even search for them.
“I think there will be a lot more going on at that intersection between the search and recommendation space over the next couple years. It’s sort of inevitable,” Gabriel said. “You can look ahead to what someone is going to be searching for next, and you can certainly help refine and tune into the right information with less effort.”
While “mind-reading” search engines may still seem a bit like science fiction at present, the capabilities are evolving at a rapid pace, with predictive analytics leading the way.
If you’ve ever tried to learn how to spin a pencil in your hand, you’ll know it takes some concerted effort—but it’s even harder for a robot. Now, though, researchers have finally built a ‘bot that can learn to do it.
The reason that tasks like spinning a stick are hard is that a lot happens in a very short time. As the stick moves, the forces exerted by the hand can easily send it flying out of control if they’re not perfectly co-ordinated. Sensing where the stick is and varying the hand’s motion is an awful lot for even the smartest algorithms to handle based on a list of rules.
CoinFac Limited, a technology company, has recently introduced the next generation quantum computing technology into cryptocurrency mining, allowing current Bitcoin and Altcoin miners to enjoy a 4,000 times speed increase.
Quantum computing is being perceived as the next generation of supercomputing, capable of processing dense digital information and generating multi-sequential algorithmic solutions 100,000 times faster than conventional computers. With each quantum computing server carrying an exorbitant price tag of $5 million to $10 million, this combination of advanced servers and a new wave of currency systems is being billed as one of the most disruptive developments in the cryptocurrency ecosystem.
“We envisioned cryptocurrency to be the game changer in most developed country’s economy within the next 5 years. Reliance of quantum computing technology expedite the whole process, and we will be recognized as the industry leader in bringing about this tidal change. We aren’t the only institution fathom to leverage on this technology. Other Silicon big boys are already in advance talks of a possible tie up”, said Mike Howzer, CEO of CoinFac Limited. “Through the use of quantum computing, usual bitcoin mining processes are expedited by a blazing speed of 4,000 times. We bring lucrative mining back into Bitcoin industry, all over again”.
Researchers at the University of Liverpool have developed a set of algorithms that will help teach computers to process and understand human languages.
Whilst mastering natural language is easy for humans, it is something that computers have not yet been able to achieve. Humans understand language in a variety of ways: for example, by looking a word up in a dictionary, or by associating it with words in the same sentence in a meaningful way.
The algorithms will enable a computer to act in much the same way as a human would when it encounters an unknown word. When the computer comes across a word it doesn’t recognise or understand, the algorithms allow it to look the word up in a dictionary (such as WordNet) and to guess what other words should appear alongside this unknown word in the text.
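A minimal Python sketch of that general strategy (an illustrative reconstruction, not the Liverpool team’s actual algorithm) might first try a WordNet lookup via NLTK and, failing that, fall back to the words that co-occur with the unknown word in the surrounding text.

    import nltk
    from nltk.corpus import wordnet
    from collections import Counter

    nltk.download("wordnet", quiet=True)

    def describe_unknown_word(word: str, context: list[str]) -> dict:
        senses = wordnet.synsets(word)
        if senses:
            # Dictionary lookup succeeded: use the gloss of the first sense.
            return {"source": "wordnet", "gloss": senses[0].definition()}
        # Otherwise guess from context: the words seen most often near it.
        neighbours = Counter(w for w in context if w != word)
        return {"source": "context", "related": [w for w, _ in neighbours.most_common(3)]}

    sentence = "the glorp perched on the branch and sang".split()
    print(describe_unknown_word("glorp", sentence))    # falls back to context words
    print(describe_unknown_word("branch", sentence))   # found in WordNet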