It seems that the D-Wave computer does work, and the claim is that the hardware is 3,600 times faster than other supercomputers. It is the nearest thing we have to quantum computing, and two tests have led to the announcement that it ran far faster than simulated annealing, a classical imitation of quantum computation carried out on a standard computer chip.
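For context, simulated annealing — the classical baseline D-Wave was benchmarked against — is easy to sketch. Here is a minimal Python version minimizing the energy of a toy Ising spin chain; the cooling schedule, parameters, and problem are all illustrative choices of mine, not D-Wave's actual benchmark setup.

```python
import math
import random

def simulated_annealing(energy, state, steps=20000, t0=2.0, t1=0.01, seed=0):
    """Minimize `energy` over a list of +/-1 spins by single spin flips."""
    rng = random.Random(seed)
    best = list(state)
    best_e = cur_e = energy(state)
    for k in range(steps):
        t = t0 * (t1 / t0) ** (k / steps)   # geometric cooling schedule
        i = rng.randrange(len(state))
        state[i] = -state[i]                # propose flipping one spin
        e = energy(state)
        if e <= cur_e or rng.random() < math.exp((cur_e - e) / t):
            cur_e = e                       # accept (Metropolis criterion)
            if e < best_e:
                best, best_e = list(state), e
        else:
            state[i] = -state[i]            # reject: undo the flip
    return best, best_e

# Toy ferromagnetic Ising chain: energy is lowest when all spins align.
def ising_energy(spins):
    return -sum(spins[i] * spins[i + 1] for i in range(len(spins) - 1))

rng = random.Random(1)
spins = [rng.choice([-1, 1]) for _ in range(12)]
best, best_e = simulated_annealing(ising_energy, spins)
```

For 12 spins the ground-state energy is -11 (all spins aligned), which the anneal finds quickly; a quantum annealer attacks the same kind of energy-minimization problem in hardware.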
Eventually it will be in everything tech. This version from IBM is not for the masses. However, don’t worry; it’s coming.
Users will eventually be able to contribute to and review results in an upcoming community hosted on the IBM Quantum Experience. So kudos to IBM for properly managing expectations.
The researchers at IBM have created a quantum processor, made up of five superconducting quantum bits (qubits).
The company said anyone can run experiments on the computing platform by accessing its website connected to the IBM Cloud. Arvind Krishna, senior vice president and director, IBM Research, noted that quantum computers would be very different from even today’s top supercomputers in looks, structure, and capabilities. A universal quantum computer, once built, has the potential to solve problems that are not solvable with today’s classical computers, IBM said. It can also allow for analysis of much larger quantities of data than can be done by today’s supercomputers.
Hmmm; the jury is still out for me on this one.
IBM Quantum Computing Scientist Jay Gambetta uses a tablet to interact with the IBM Quantum Experience, the world’s first quantum computing platform delivered via the IBM Cloud at IBM’s T. J. Watson Research Center in Yorktown, NY.
On Wednesday, May 4, for the first time ever, IBM is making quantum computing available via the cloud to anyone interested in hands-on access to an IBM quantum processor, making it easier for researchers and the scientific community to accelerate innovations, and help discover new applications for this technology. This is the beginning of the quantum age of computing and the latest advance from IBM towards building a universal quantum computer. A universal quantum computer, once built, will represent one of the greatest milestones in the history of information technology and has the potential to solve certain problems we couldn’t solve, and will never be able to solve, with today’s classical computers. (Jon Simon/Feature Photo Service for IBM)
IBM scientists have built a quantum processor that users can access through a first-of-a-kind quantum computing platform delivered via the IBM Cloud onto any desktop or mobile device. IBM believes quantum computing is the future of computing and has the potential to solve certain problems that are impossible to solve on today’s supercomputers.
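To give a flavor of what running a circuit on a gate-based quantum processor like IBM's means, here is a hand-rolled two-qubit statevector toy in plain Python that prepares a Bell state (Hadamard then CNOT). This is purely illustrative — it is not the IBM Quantum Experience interface, just the underlying math on a classical machine.

```python
import math

# Two qubits as four complex amplitudes, basis order |00>, |01>, |10>, |11>.
state = [1, 0, 0, 0]   # start in |00>

def apply_h(state, qubit):
    """Apply a Hadamard gate to one qubit (qubit 0 is the left bit)."""
    s = 1 / math.sqrt(2)
    out = [0j] * 4
    for i, amp in enumerate(state):
        bit = (i >> (1 - qubit)) & 1
        j = i ^ (1 << (1 - qubit))          # index with that bit flipped
        out[i] += s * amp * (1 if bit == 0 else -1)
        out[j] += s * amp
    return out

def apply_cnot(state):
    """CNOT with qubit 0 as control: flips qubit 1 when qubit 0 is 1."""
    out = list(state)
    out[2], out[3] = state[3], state[2]     # swap |10> and |11>
    return out

bell = apply_cnot(apply_h(state, 0))
probs = [abs(a) ** 2 for a in bell]         # measurement probabilities
```

Measuring this state gives |00> or |11> with probability 1/2 each and never |01> or |10> — the entanglement that a real qubit device provides natively, and that costs exponentially growing memory to simulate classically as qubit counts rise.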
I am totally jealous right now!
Australia opened a new quantum computing lab at the University of New South Wales (UNSW).
This follows the government’s $26-million investment in the Centre of Excellence for Quantum Computation & Communication Technology (CQC2T) as part of the National Innovation and Science Agenda. The government’s investment is supported by $10 million each from Telstra and the Commonwealth Bank of Australia (CBA).
Today’s digital computers have finite processing power, whereas a commercial quantum computer is expected to deliver a significant speed-up, even over a supercomputer.
Interesting insight on aluminum nitride being used to create qubits.
http:///articles/could-aluminum-nitride-be-engineered-to-pro…
Newswise — Quantum computers have the potential to break common cryptography techniques, search huge datasets and simulate quantum systems in a fraction of the time it would take today’s computers. But before this can happen, engineers need to be able to harness the properties of quantum bits or qubits.
Currently, one of the leading methods for creating qubits in materials involves exploiting the structural atomic defects in diamond. But several researchers at the University of Chicago and Argonne National Laboratory believe that if an analogue defect could be engineered into a less expensive material, the cost of manufacturing quantum technologies could be significantly reduced. Using supercomputers at the National Energy Research Scientific Computing Center (NERSC), which is located at the Lawrence Berkeley National Laboratory (Berkeley Lab), these researchers have identified a possible candidate in aluminum nitride. Their findings were published in Nature Scientific Reports.
I do love Nvidia!
During the past nine months, an Nvidia engineering team built a self-driving car with one camera, one Drive-PX embedded computer and only 72 hours of training data. Nvidia published an academic preprint of the results of the DAVE2 project entitled End to End Learning for Self-Driving Cars on arXiv.org hosted by the Cornell Research Library.
The Nvidia project called DAVE2 is named after a 10-year-old Defense Advanced Research Projects Agency (DARPA) project known as DARPA Autonomous Vehicle (DAVE). Although neural networks and autonomous vehicles seem like just-invented technology, researchers such as Google’s Geoffrey Hinton, Facebook’s Yann LeCun and the University of Montreal’s Yoshua Bengio have collaboratively researched this branch of artificial intelligence for more than two decades. And the DARPA DAVE project’s application of neural-network-based autonomous vehicles was preceded by the ALVINN project developed at Carnegie Mellon in 1989. What has changed is that GPUs have made building on their research economically feasible.
Neural networks and image recognition applications such as self-driving cars have exploded recently for two reasons. First, Graphical Processing Units (GPU) used to render graphics in mobile phones became powerful and inexpensive. GPUs densely packed onto board-level supercomputers are very good at solving massively parallel neural network problems and are inexpensive enough for every AI researcher and software developer to buy. Second, large, labeled image datasets have become available to train massively parallel neural networks implemented on GPUs to see and perceive the world of objects captured by cameras.
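The reason GPUs map so well onto neural networks is that each layer is just a batch of independent dot products, and independent arithmetic is exactly what thousands of GPU cores do at once. A minimal sketch of one fully connected layer in plain Python (the weights here are made up for illustration):

```python
def dense(x, w, b):
    """One fully connected layer. Every output element is an independent
    dot product, which is why a GPU can compute the whole layer at once."""
    return [sum(wi * xi for wi, xi in zip(row, x)) + bi
            for row, bi in zip(w, b)]

def relu(v):
    """Standard rectified-linear activation."""
    return [max(0.0, a) for a in v]

# Toy two-layer network with illustrative weights.
x = [0.5, -1.0, 2.0]
h = relu(dense(x, [[1, 0, 1], [0, 1, 0]], [0.0, 0.5]))
y = dense(h, [[1, -1]], [0.0])
```

A production network like DAVE2's runs millions of these multiply-adds per camera frame; the structure is the same, only the scale differs, and that scale is what GPUs made affordable.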
A post-quantum cryptography discussion in Tacoma, WA on May 5th will cover attacks by quantum-computing hackers and the cryptographic algorithms that could offset them; it may be of interest to sit in on and even join the debates. I will try to attend if I can, because it would be interesting to see the arguments raised and the responses.
The University of Washington Tacoma Institute of Technology will present a discussion about the esoteric field of post-quantum cryptography at the Northwest Cybersecurity Symposium on May 5.
“I’ve been researching post-quantum cryptography for years, finding ways to protect against a threat that doesn’t yet exist,” said Anderson Nascimento, assistant professor of computer science at the institute, in a release.
Post-quantum cryptography refers to encryption that would be secure against an attack by a quantum computer — a kind of supercomputer using quantum mechanics, which, so far, exists only in theory.
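To see why quantum computers threaten today's encryption, consider textbook RSA with deliberately tiny primes. Its security rests entirely on factoring the public modulus being hard; Shor's algorithm on a quantum computer would factor it in polynomial time, which is exactly what post-quantum schemes are designed to avoid depending on. (This toy uses the classic 61/53 textbook parameters; real keys use primes hundreds of digits long.)

```python
# Toy RSA key generation (textbook parameters, illustrative only).
p, q = 61, 53                 # secret primes
n = p * q                     # public modulus (3233)
phi = (p - 1) * (q - 1)
e = 17                        # public exponent
d = pow(e, -1, phi)           # private exponent (modular inverse, Python 3.8+)

msg = 65
c = pow(msg, e, n)            # encrypt
plain = pow(c, d, n)          # decrypt

# The attack: recover a secret prime by factoring n. Trial division works
# here only because n is tiny; Shor's algorithm would make it fast even
# for real key sizes, breaking RSA outright.
recovered = next(k for k in range(2, n) if n % k == 0)
```

Post-quantum proposals (lattice-based, hash-based, code-based schemes) replace the factoring/discrete-log assumption with problems for which no efficient quantum algorithm is known.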
RPI’s new material takes semiconducting transistors to new levels.
Two-dimensional phosphane, a material known as phosphorene, has potential application as a material for semiconducting transistors in ever faster and more powerful computers. But there’s a hitch. Many of the useful properties of this material, like its ability to conduct electrons, are anisotropic, meaning they vary depending on the orientation of the crystal. Now, a team including researchers at Rensselaer Polytechnic Institute (RPI) has developed a new method to quickly and accurately determine that orientation using the interactions between light and electrons within phosphorene and other atoms-thick crystals of black phosphorus. Phosphorene—a single layer of phosphorous atoms—was isolated for the first time in 2014, allowing physicists to begin exploring its properties experimentally and theoretically. Vincent Meunier, head of the Rensselaer Department of Physics, Applied Physics, and Astronomy and a leader of the team that developed the new method, published his first paper on the material—confirming the structure of phosphorene—in that same year.
“This is a really interesting material because, depending on which direction you do things, you have completely different properties,” said Meunier, a member of the Rensselaer Center for Materials, Devices, and Integrated Systems (cMDIS). “But because it’s such a new material, it’s essential that we begin to understand and predict its intrinsic properties.”
Meunier and researchers at Rensselaer contributed to the theoretical modeling and prediction of the properties of phosphorene, drawing on the Rensselaer supercomputer, the Center for Computational Innovations (CCI), to perform calculations. Through the Rensselaer cMDIS, Meunier and his team are able to develop the potential of new materials such as phosphorene to serve in future generations of computers and other devices. Meunier’s research exemplifies the work being done at The New Polytechnic, addressing difficult and complex global challenges, the need for interdisciplinary and true collaboration, and the use of the latest tools and technologies, many of which are developed at Rensselaer.
Supercomputers facing problems?
In the world of High Performance Computing (HPC), supercomputers represent the peak of capability, with performance measured in petaFLOPS (10^15 operations per second). They play a key role in climate research, drug research, oil and gas exploration, cryptanalysis, and nuclear weapons development. But after decades of steady improvement, changes are coming as old technologies start to run into fundamental problems.
When you’re talking about supercomputers, a good place to start is the TOP500 list. Published twice a year, it ranks the world’s fastest machines based on their performance on the Linpack benchmark, which solves a dense system of linear equations using double precision (64 bit) arithmetic.
Looking down the list, you soon run into some numbers that boggle the mind. The Tianhe-2 (Milky Way-2), a system deployed at the National Supercomputer Center in Guangzhou, China, is the number one system as of November 2015, a position it’s held since 2013. Running Linpack, it clocks in at 33.86 × 10^15 floating point operations per second (33.86 PFLOPS).
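To put a number like 33.86 PFLOPS in perspective, Linpack's cost is well characterized: solving a dense n × n system by LU factorization takes roughly (2/3)·n³ floating point operations. A quick back-of-the-envelope in Python (the 10-million-equation system below is a hypothetical of mine, not an actual benchmark run):

```python
def linpack_flops(n):
    """Approximate operation count for solving a dense n x n linear
    system by LU factorization: (2/3) * n^3 flops (lower-order terms
    ignored)."""
    return (2.0 / 3.0) * n ** 3

tianhe2_rate = 33.86e15        # Tianhe-2's measured Linpack rate, in FLOPS
n = 10_000_000                 # hypothetical 10-million-equation system
seconds = linpack_flops(n) / tianhe2_rate
hours = seconds / 3600
```

Even at nearly 34 quadrillion operations per second, that hypothetical system would keep the machine busy for around five and a half hours, which is why Linpack runs at these scales are serious undertakings.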
Now, I have been hearing that folks are planning to experiment with blockchains on the new Nvidia DGX-1. I do know Nvidia’s CEO mentioned that the DGX-1 could be used in conjunction with blockchain technology as an interim step toward quantum computing to help secure information. We’ll see.
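The core idea behind a blockchain is simple enough to sketch: each record carries the hash of its predecessor, so tampering with any earlier entry breaks every link after it. A minimal, purely illustrative hash chain in Python — this is my own toy, not anything NVIDIA ships with the DGX-1:

```python
import hashlib
import json

def make_block(data, prev_hash):
    """Create a record linked to its predecessor by a SHA-256 hash."""
    block = {"data": data, "prev": prev_hash}
    digest = hashlib.sha256(
        json.dumps(block, sort_keys=True).encode()).hexdigest()
    block["hash"] = digest
    return block

def verify(chain):
    """Recompute every hash; tampering anywhere breaks the chain."""
    for i, block in enumerate(chain):
        body = {"data": block["data"], "prev": block["prev"]}
        h = hashlib.sha256(
            json.dumps(body, sort_keys=True).encode()).hexdigest()
        if h != block["hash"]:
            return False
        if i > 0 and block["prev"] != chain[i - 1]["hash"]:
            return False
    return True

# Build a three-block chain (contents are illustrative).
chain = [make_block("genesis", "0" * 64)]
chain.append(make_block("model weights v1", chain[-1]["hash"]))
chain.append(make_block("model weights v2", chain[-1]["hash"]))
```

Changing any block's data after the fact makes `verify` fail, which is the tamper-evidence property people have in mind when they talk about using blockchains to secure information.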
Sterling Heights, MI (PRWEB) April 24, 2016.
Rave Computer, an Elite Solution Provider in the NVIDIA Partner Network program, today announced that it has been selected to offer the new NVIDIA® DGX-1™ deep learning system, the world’s first deep learning supercomputer designed to meet the unlimited computing demands of artificial intelligence.
Rave will support NVIDIA’s marketing and sales efforts by qualifying, educating, and managing potential customers’ requirements and orders. Rave will also actively be involved in driving market awareness and demand development.