
I have already voiced my concerns about this technology falling into the hands of criminals and terrorists. If we can have it, so can others. Only when QC and a quantum net are in place will we be truly protected with bots.


Cybersecurity could soon be another place where bots become invaluable for experts. DARPA recently organized the Cyber Grand Challenge, where computer algorithms showed how easy it is to clean up vulnerabilities in code written by humans. (DARPA)

The Cyber Grand Challenge took place under DARPA's patronage, and it is encouraging to see how seriously the U.S. Department of Defense takes cybersecurity.

The event pitted computers against each other to uncover which could best fulfill the tasks of human cybersecurity researchers: discovering a bug in a software program and fixing it.
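To make that task concrete, here is a minimal sketch, in Python, of the simplest form of automated bug discovery: random fuzzing. It bears no resemblance to the actual Cyber Grand Challenge systems; the buggy_parse routine and the fuzzing loop are purely hypothetical illustrations.

```python
# Toy illustration of automated bug finding: random fuzzing discovers
# an input that crashes a deliberately buggy parser.
import random
import string

def buggy_parse(s):
    """Hypothetical vulnerable routine: fails on unbalanced brackets."""
    depth = 0
    for ch in s:
        if ch == '[':
            depth += 1
        elif ch == ']':
            depth -= 1
            assert depth >= 0, "underflow: ']' without a matching '['"
    return depth

def fuzz(target, trials=10_000):
    """Throw random inputs at the target until one triggers a crash."""
    for _ in range(trials):
        s = ''.join(random.choices('[]' + string.ascii_lowercase, k=8))
        try:
            target(s)
        except AssertionError:
            return s  # a crashing input, i.e. a discovered bug
    return None

print(fuzz(buggy_parse))  # e.g. 'ab]fx[qq' trips the assertion
```

A real competitor would go further: not just finding the crashing input, but synthesizing and deploying a patch automatically.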

Read more

Quantum computing remains mysterious and elusive to many, but USC Viterbi School of Engineering researchers may have taken us one step closer to bringing such super-powered devices to practical reality. The USC Viterbi School of Engineering and Information Sciences Institute is home to the USC-Lockheed Martin Quantum Computing Center (QCC), a super-cooled, magnetically shielded facility specially built to house the first commercially available quantum optimization processors — devices so advanced that only two are currently in use outside the Canadian company D-Wave Systems Inc., where they were built: the first went to USC and Lockheed Martin, and the second to NASA and Google.

Quantum computers encode data in quantum bits, or “qubits,” which can represent the digits one and zero at the same time — as opposed to traditional bits, which encode distinctly either a one or a zero. This property, called superposition, along with the ability of quantum states to “interfere” (cancel or reinforce each other like waves in a pond) and “tunnel” through energy barriers, is what may one day allow quantum processors to perform optimization calculations much faster than is possible with traditional processors. Optimization problems can take many forms, and quantum processors have been theorized to be useful for a variety of machine learning and big data problems, such as stock portfolio optimization, image recognition and classification, and anomaly detection. Yet precisely because of the exotic way in which quantum computers process information, they are highly sensitive to errors of many kinds.
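For readers who want to see superposition and interference in action, the sketch below simulates a single qubit's state vector with NumPy. This is a plain classical simulation for illustration only; it is a gate-model toy, not how D-Wave's annealing-based optimization processors actually work.

```python
# Minimal state-vector simulation of one qubit (classical illustration).
import numpy as np

zero = np.array([1, 0], dtype=complex)   # basis state |0>

# Hadamard gate: puts a qubit into an equal superposition of 0 and 1
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)

psi = H @ zero                           # (|0> + |1>) / sqrt(2)
print(np.abs(psi) ** 2)                  # [0.5 0.5]: equal measurement odds

# Interference: a second Hadamard makes the two "paths" cancel and
# reinforce like waves, returning the qubit deterministically to |0>.
psi2 = H @ psi
print(np.round(np.abs(psi2) ** 2, 3))    # [1. 0.]
```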

Read more

Quantum computers promise speedy solutions to some difficult problems, but building large-scale, general-purpose quantum devices is a problem fraught with technical challenges.

To date, many research groups have created small but functional quantum computers. By combining a handful of atoms, electrons or superconducting junctions, researchers now regularly demonstrate quantum effects and run simple algorithms — small programs dedicated to solving particular problems.

But these laboratory devices are often hard-wired to run one program or limited to fixed patterns of interactions between the quantum constituents. Making a quantum computer that can run arbitrary algorithms requires the right kind of physical system and a suite of programming tools. Atomic ions, confined by fields from nearby electrodes, are among the most promising platforms for meeting these needs.
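As a loose illustration of the "programming tools" half of that requirement, the sketch below composes a tiny two-qubit circuit from a generic gate set, simulated classically with NumPy. The point is that a programmable device can chain arbitrary gates, whereas a hard-wired one is stuck with one fixed interaction pattern; the gates here are standard textbook matrices, not any particular trapped-ion system's native operations.

```python
# Composing a small circuit from a gate set: Hadamard then CNOT
# produces an entangled Bell state (classical NumPy simulation).
import numpy as np

H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
I2 = np.eye(2, dtype=complex)
CNOT = np.array([[1, 0, 0, 0],     # control = qubit 0, target = qubit 1
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

state = np.zeros(4, dtype=complex)
state[0] = 1.0                            # start in |00>
state = CNOT @ np.kron(H, I2) @ state     # the "program": H on q0, then CNOT
print(np.round(np.abs(state) ** 2, 3))    # [0.5 0. 0. 0.5]: Bell state
```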

Read more

Interesting, and true in many situations; this will only expand as we progress in AI, QC, and the Singularity.


The use of algorithms to filter and present information online is increasingly shaping our everyday experience of the real world, a study published by Information, Communication & Society argues.

Associate Professor Michele Willson of Curtin University, Perth, Australia, looked at particular examples of computer algorithms and the questions they raise about personal agency, changing world views and our complex relationship with technologies.

Algorithms are central to how information and communication are located, retrieved and presented online, for example in Twitter follow recommendations, Facebook newsfeeds and suggested Google map directions. However, they are not objective instructions but assume certain parameters and values, and are in constant flux, with changes made by both humans and machines.
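A toy example makes that point about parameters and values concrete. The Python sketch below is purely hypothetical; no platform publishes its ranking this way. A feed ranker scores posts by a weighted sum, and changing the weights, which is a human editorial choice, changes what users see.

```python
# Hypothetical feed ranker: the weights encode value judgments.
def rank_feed(posts, w_recency=0.7, w_engagement=0.2, w_affinity=0.1):
    """Order posts by a weighted score; the weights are editorial choices."""
    def score(p):
        return (w_recency * p["recency"]
                + w_engagement * p["engagement"]
                + w_affinity * p["affinity"])
    return sorted(posts, key=score, reverse=True)

posts = [
    {"id": "a", "recency": 0.9, "engagement": 0.2, "affinity": 0.1},
    {"id": "b", "recency": 0.3, "engagement": 0.9, "affinity": 0.8},
]
print([p["id"] for p in rank_feed(posts)])            # ['a', 'b']
print([p["id"] for p in rank_feed(posts,
      w_recency=0.1, w_engagement=0.8)])              # ['b', 'a']
```

Two runs, two different "realities" presented to the user, with nothing changed but a pair of numbers.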

Read more

Big Data and 3D printing.


3D printing remains one of those technological areas that holds a great amount of fascination. What began as a type of niche market has expanded rapidly in the past few years to encompass nearly every industry out there, from the medical field to manufacturing.

The outlook for 3D printing's future is positive, with Gartner predicting that spending on 3D printers will exceed $13 billion in 2018. While 3D printing has always held a lot of promise, one of the factors truly taking the concept to the next level is big data.

In much the same way that big data has benefited businesses of all types and sizes, it has proven to play a pivotal role in the growth of 3D printing. As more organizations get a firm grasp on how best to use both big data analytics and 3D printing capabilities, the two areas will form a more established and interdependent relationship.

Read more

Electronic computer technology has moved from valves to transistors to progressively more complex integrated circuits and processor designs, with each change bringing higher levels of performance. Now the advent of quantum computers promises a huge step increase in processor performance to solve certain types of problems.

Quantum computers are much faster than the world’s fastest supercomputers for some applications. In 1994 Peter Shor, an applied mathematician at Bell Laboratories, gave the encryption world a shock when he demonstrated an algorithm showing that quantum computers could threaten conventional prime-number-based encryption methods.
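The essence of Shor's threat can be sketched without any quantum hardware. Once the period r of f(x) = a^x mod N is known, factors of N usually follow from simple gcd arithmetic; the quantum speedup lies entirely in finding r. The toy Python below brute-forces the period for a small N, a step a quantum computer could perform exponentially faster for the large numbers used in real encryption.

```python
# Classical skeleton of Shor's algorithm: reduce factoring to
# period finding, then brute-force the period (the quantum part).
from math import gcd

def find_period(a, N):
    """Smallest r > 0 with a^r = 1 (mod N); brute force stands in
    for the quantum subroutine."""
    x, r = a % N, 1
    while x != 1:
        x = (x * a) % N
        r += 1
    return r

def shor_classical(N, a):
    if gcd(a, N) != 1:
        return gcd(a, N)          # lucky guess: a shares a factor with N
    r = find_period(a, N)
    if r % 2 == 1:
        return None               # odd period: retry with another a
    y = pow(a, r // 2, N)
    if y == N - 1:
        return None               # trivial square root: retry
    return gcd(y - 1, N)          # a nontrivial factor of N

print(shor_classical(15, 7))      # 3 (and 15 // 3 == 5)
```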

If an adversary conducts successful espionage raids on encrypted information stored in present-technology computer installations, possibly through a compromised or issue-motivated individual who transfers it to portable media, that information could become vulnerable to decryption by the rival's quantum computers.

Read more

Interesting…


However, new research carried out at the University of Waterloo and the University of Lethbridge, in Canada, argues that there is a much longer measurable minimum unit of time.

If true, the existence of such a minimum time changes the basic equations of quantum mechanics.

This means our understanding of how the universe operates on a very small scale may need to be reconsidered.

Read more

Big Data and Obama’s Brain Initiative — As we harness massive volumes of information amid the current tech explosion around it, we will see an accelerating need and urgency for more advanced AI, QC, and new brain-mind interface intelligence to assist people working with both superintelligent AI and these massive volumes of information.


Engineers are experimenting with chip design to boost computer performance. In one chip layout developed at Columbia, analog and digital circuits are combined in a novel architecture to solve differential equations with extreme speed and energy efficiency. Image: Simha Sethumadhavan, Mingoo Seok and Yannis Tsividis/Columbia Engineering.

In the big data era, the modern computer is showing signs of age. The sheer volume of observations now streaming from land, sea, air and space has outpaced the ability of most computers to process it. As the United States races to develop an “exascale” machine up to the task, a group of engineers and scientists at Columbia have teamed up to pursue solutions of their own.

The Data Science Institute’s newest working group, Frontiers in Computing Systems, will try to address some of the bottlenecks facing scientists working with massive data sets at Columbia and beyond. From astronomy and neuroscience to civil engineering and genomics, major obstacles stand in the way of processing, analyzing and storing all this data.

Read more