Archive for the ‘computing’ category: Page 514

Jun 30, 2019

Google Is 2 Billion Lines of Code—And It’s All in One Place

Posted by in category: computing

Circa 2015


By comparison, Microsoft Windows—one of the most complex software tools ever built for a single computer—is about 50 million lines.

Jun 30, 2019

Quantum Computing Vs. Blockchain: Impact on Cryptography

Posted by in categories: bitcoin, computing, encryption, quantum physics

Quantum computers will not kill blockchain, but they might trigger fundamental changes in underlying cryptography.
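
To make the threat model concrete, here is a toy, hedged illustration of the usual rules of thumb (Grover's algorithm roughly halves the effective bit-security of hashing, while Shor's algorithm would break elliptic-curve signatures outright); the numbers are textbook estimates, not figures from the article:

```python
# Illustrative only: rough effective-security estimates for common blockchain
# primitives under large-scale quantum attack (textbook rules of thumb).
primitives = {
    "SHA-256 hashing (mining, addresses)": {"classical_bits": 256, "quantum_bits": 128},    # Grover: quadratic speedup
    "ECDSA secp256k1 (transaction signatures)": {"classical_bits": 128, "quantum_bits": 0},  # Shor: broken
}

for name, sec in primitives.items():
    verdict = "still adequate" if sec["quantum_bits"] >= 112 else "needs a post-quantum replacement"
    print(f"{name}: ~{sec['classical_bits']}-bit classical, "
          f"~{sec['quantum_bits']}-bit quantum -> {verdict}")
```

This is the sense in which blockchains themselves survive but their signature schemes would need to change.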

Jun 29, 2019

Startup Catalog has jammed all 16GB of Wikipedia’s text onto DNA strands

Posted by in categories: biotech/medical, computing

Biological molecules will last a lot longer than the latest computer storage technology, Catalog believes.
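
Catalog's actual encoding chemistry is not described here, but the general idea of writing bits into bases can be sketched with the common toy mapping of two bits per nucleotide (00→A, 01→C, 10→G, 11→T); everything below is an illustration, not Catalog's scheme:

```python
# Toy bits-to-bases encoding: 4 nucleotides per byte (not Catalog's real method).
BITS_TO_BASE = {0b00: "A", 0b01: "C", 0b10: "G", 0b11: "T"}
BASE_TO_BITS = {base: bits for bits, base in BITS_TO_BASE.items()}

def encode(data: bytes) -> str:
    """Map each byte to four nucleotides, most significant bit pair first."""
    return "".join(BITS_TO_BASE[(byte >> shift) & 0b11]
                   for byte in data for shift in (6, 4, 2, 0))

def decode(strand: str) -> bytes:
    """Invert encode(): read the strand back four bases (one byte) at a time."""
    out = bytearray()
    for i in range(0, len(strand), 4):
        byte = 0
        for base in strand[i:i + 4]:
            byte = (byte << 2) | BASE_TO_BITS[base]
        out.append(byte)
    return bytes(out)

strand = encode(b"Wikipedia")
print(strand)                       # "CCCTCGGC..." style string, 4 bases per byte
assert decode(strand) == b"Wikipedia"
```

At this density (2 bits per base) the 16 GB of Wikipedia text works out to roughly 64 billion bases, which is why real schemes also add indexing and error-correction overhead.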

Jun 29, 2019

How quantum brain biology can rescue conscious free will

Posted by in categories: biological, computing, information science, neuroscience, quantum physics

Conscious “free will” is problematic because (1) brain mechanisms causing consciousness are unknown, (2) measurable brain activity correlating with conscious perception apparently occurs too late for real-time conscious response, consciousness thus being considered “epiphenomenal illusion,” and (3) determinism, i.e., our actions and the world around us seem algorithmic and inevitable. The Penrose–Hameroff theory of “orchestrated objective reduction (Orch OR)” identifies discrete conscious moments with quantum computations in microtubules inside brain neurons, e.g., 40/s in concert with gamma synchrony EEG. Microtubules organize neuronal interiors and regulate synapses. In Orch OR, microtubule quantum computations occur in integration phases in dendrites and cell bodies of integrate-and-fire brain neurons connected and synchronized by gap junctions, allowing entanglement of microtubules among many neurons. Quantum computations in entangled microtubules terminate by Penrose “objective reduction (OR),” a proposal for quantum state reduction and conscious moments linked to fundamental spacetime geometry. Each OR reduction selects microtubule states which can trigger axonal firings and control behavior. The quantum computations are “orchestrated” by synaptic inputs and memory (thus “Orch OR”). If correct, Orch OR can account for conscious causal agency, resolving problem (1). Regarding problem (2), Orch OR can cause temporal non-locality, sending quantum information backward in classical time, enabling conscious control of behavior. Three lines of evidence for brain backward time effects are presented. Regarding problem (3), Penrose OR (and Orch OR) invokes non-computable influences from information embedded in spacetime geometry, potentially avoiding algorithmic determinism. In summary, Orch OR can account for real-time conscious causal agency, avoiding the need for consciousness to be seen as epiphenomenal illusion. Orch OR can rescue conscious free will.

Keywords: microtubules, free will, consciousness, Penrose-Hameroff Orch OR, volition, quantum computing, gap junctions, gamma synchrony.

We have the sense of conscious control of our voluntary behaviors, of free will, of our mental processes exerting causal actions in the physical world. But such control is difficult to scientifically explain for three reasons:

Jun 29, 2019

Neuromorphic Hardware: Trying to Put Brain Into Chips

Posted by in categories: computing, neuroscience

Hi all.


Up until now, chip-makers have been piggybacking on the renowned Moore’s Law for delivering successive generations of chips that have more compute capability and are less power hungry. Now, these advancements are slowly coming to a halt. Researchers around the world are proposing alternative architectures to continue producing systems that are faster and more energy efficient. This article discusses those alternatives and why one of them might have an edge over the others in keeping the chip design industry from stalling.

Moore’s law, the savior of chip-makers worldwide, was coined by Dr. Gordon Moore, co-founder of Intel Corp, in 1965. The law states that the number of transistors on a chip doubles roughly every two years. But why the savior of chip-makers? This law was so powerful during the semiconductor boom that “people would auto-buy the next latest and greatest computer chip, with full confidence that it would be better than what they’ve got,” said former Intel engineer Robert P. Colwell. Back in the day, writing a program with poor performance was not a big issue, as the programmer knew that Moore’s law would ultimately bail them out.
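
As a quick back-of-the-envelope illustration of what doubling every two years compounds to, here is a small sketch (the roughly 2,300-transistor Intel 4004 from 1971 is used only as a familiar starting point; the projection is illustrative, not data from this article):

```python
# Compound doubling: N(year) = N0 * 2 ** ((year - start_year) / doubling_period)
def projected_transistors(n0: float, start_year: int, year: int,
                          doubling_period: float = 2.0) -> float:
    """Project transistor count assuming one doubling every `doubling_period` years."""
    return n0 * 2 ** ((year - start_year) / doubling_period)

# Starting from the ~2,300-transistor Intel 4004 (1971):
for year in (1971, 1981, 1991, 2001, 2011, 2021):
    print(year, f"{projected_transistors(2_300, 1971, year):,.0f}")
```

Fifty years of uninterrupted doubling takes the count from thousands to tens of billions, which is roughly the trajectory commercial chips followed and why the slowdown described above matters.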

Continue reading “Neuromorphic Hardware: Trying to Put Brain Into Chips” »

Jun 28, 2019

Neuroimaging Of Brain Shows Who Spoke To A Person And What Was Said

Posted by in categories: computing, information science, neuroscience

Flashback to 2 years ago…


Scientists from Maastricht University have developed a method to look into the brain of a person and read out who has spoken to him or her and what was said. With the help of neuroimaging and data mining techniques the researchers mapped the brain activity associated with the recognition of speech sounds and voices.

In their Science article “‘Who’ is Saying ‘What’? Brain-Based Decoding of Human Voice and Speech,” the four authors demonstrate that speech sounds and voices can be identified by means of a unique ‘neural fingerprint’ in the listener’s brain. In the future this new knowledge could be used to improve computer systems for automatic speech and speaker recognition.

Seven study subjects listened to three different speech sounds (the vowels /a/, /i/ and /u/), spoken by three different people, while their brain activity was mapped using neuroimaging techniques (fMRI). With the help of data mining methods the researchers developed an algorithm to translate this brain activity into unique patterns that determine the identity of a speech sound or a voice. The acoustic characteristics of the speakers’ vocal cord vibrations were found to be reflected in these neural activity patterns.
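
As a rough illustration of the decoding idea, and not the authors’ actual pipeline, the sketch below trains a linear classifier on simulated “voxel” vectors carrying a faint class-specific pattern; every number, name, and the use of scikit-learn here is invented for demonstration:

```python
# Illustrative pattern-decoding sketch: simulated voxel responses, linear SVM.
import numpy as np
from sklearn.model_selection import cross_val_score
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler
from sklearn.svm import LinearSVC

rng = np.random.default_rng(0)
n_trials_per_class, n_voxels = 30, 200
vowels = ["a", "i", "u"]

# Simulate a faint class-specific activation pattern buried in noise.
X, y = [], []
for label in range(len(vowels)):
    fingerprint = rng.normal(0, 1, n_voxels)        # hypothetical "neural fingerprint"
    trials = 0.3 * fingerprint + rng.normal(0, 1, (n_trials_per_class, n_voxels))
    X.append(trials)
    y.extend([label] * n_trials_per_class)
X, y = np.vstack(X), np.array(y)

# Decode vowel identity from the voxel patterns with 5-fold cross-validation.
decoder = make_pipeline(StandardScaler(), LinearSVC(max_iter=5000))
scores = cross_val_score(decoder, X, y, cv=5)
print(f"decoding accuracy: {scores.mean():.2f} (chance = {1 / len(vowels):.2f})")
```

The study’s claim is only that such listener-specific activity patterns exist and are separable; the classifier above just shows what “translating brain activity into patterns that identify a sound or a voice” means operationally.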

Jun 28, 2019

Diamond on silicon chips are running at 100 Gigahertz and can also make power chips for directing 10,000 volts

Posted by in categories: computing, mobile phones

Circa 2016


Diamond computer chips running at 100 GHz have been demonstrated by Akhan Semiconductor. They are currently using design rules in the hundreds of nanometers.

Developers are focusing on power applications on 12-inch wafers and hope to drive down production costs with higher volumes. Power devices are moving into pilot production at a fab. They are using the fab-lite model, that is, producing small- to medium-sized runs themselves, and will then transfer their process to foundries when they ramp up to volume production.

Continue reading “Diamond on silicon chips are running at 100 Gigahertz and can also make power chips for directing 10,000 volts” »

Jun 28, 2019

Optimal quantum computation linked to gravity

Posted by in categories: computing, quantum physics

Information and gravity may seem like completely different things, but one thing they have in common is that they can both be described in the framework of geometry. Building on this connection, a new paper suggests that the rules for optimal quantum computation are set by gravity.

Physicists Paweł Caputa at Kyoto University and Javier Magan at the Instituto Balseiro, Centro Atómico de Bariloche in Argentina have published their paper on the link between quantum computation and gravity in a recent issue of Physical Review Letters.

In the field of quantum computing, one of the main ideas is minimizing the cost (in terms of computational resources) to solve a problem. In 2006, Michael Nielsen demonstrated that, when viewed in the context of differential geometry, computational costs can be estimated by distances. This means that minimizing computational costs is equivalent to finding minimal “geodesics,” which are the shortest possible distances between two points on a curved surface.
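
Schematically, and in notation chosen here for illustration rather than taken from Caputa and Magan's paper, Nielsen's identification reads:

```latex
\mathcal{C}(U) \;=\; \min_{H(t)} \int_{0}^{1} \sqrt{\langle H(t), H(t)\rangle}\,\mathrm{d}t,
\qquad
U \;=\; \mathcal{P}\exp\!\Big(-i \int_{0}^{1} H(t)\,\mathrm{d}t\Big),
```

that is, the cost of a target unitary U is the length of the shortest control path H(t) that synthesizes it under some chosen metric ⟨·,·⟩, so minimizing computational cost is exactly the geodesic problem the article describes.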

Jun 28, 2019

Stacking Graphene Creates Entirely New Quantum States

Posted by in categories: computing, quantum physics

The discovery could help overcome a major quantum computing hurdle.

Jun 27, 2019

Physicists ‘teleport’ logic operation between separated ions

Posted by in categories: computing, particle physics, quantum physics, space

Physicists at the National Institute of Standards and Technology (NIST) have teleported a computer circuit instruction known as a quantum logic operation between two separated ions (electrically charged atoms), showcasing how quantum computer programs could carry out tasks in future large-scale quantum networks.

Quantum teleportation transfers data from one quantum system (such as an ion) to another (such as a second ion), even if the two are completely isolated from each other, like two books in the basements of separate buildings. In this real-life form of teleportation, only quantum information, not matter, is transported, as opposed to the Star Trek version of “beaming” entire human beings from, say, a spaceship to a planet.

Teleportation of quantum data has been demonstrated previously with ions and a variety of other quantum systems. But the new work is the first to teleport a complete quantum logic operation using ions, a leading candidate for the architecture of future quantum computers. The experiments are described in the May 31 issue of Science.
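
For readers who want to see the mechanics, below is a minimal numpy sketch of textbook single-qubit state teleportation, not the NIST gate-teleportation experiment itself; the circuit conventions, variable names, and example state are all illustrative:

```python
import numpy as np

# Single-qubit operators (qubit 0 is the leftmost / most significant qubit).
I2 = np.eye(2, dtype=complex)
X = np.array([[0, 1], [1, 0]], dtype=complex)
Z = np.array([[1, 0], [0, -1]], dtype=complex)
H = np.array([[1, 1], [1, -1]], dtype=complex) / np.sqrt(2)
P0 = np.array([[1, 0], [0, 0]], dtype=complex)   # projector |0><0|
P1 = np.array([[0, 0], [0, 1]], dtype=complex)   # projector |1><1|

def op(gates):
    """Tensor a list of single-qubit operators into one 8x8 three-qubit operator."""
    out = np.array([[1.0 + 0j]])
    for g in gates:
        out = np.kron(out, g)
    return out

def cnot(control, target, n=3):
    """CNOT from projectors: do nothing if control is 0, apply X to target if control is 1."""
    a = [I2] * n
    a[control] = P0
    b = [I2] * n
    b[control] = P1
    b[target] = X
    return op(a) + op(b)

# Qubit 0 holds the state to teleport; qubits 1 and 2 start in |00>.
psi = np.array([0.6, 0.8j], dtype=complex)                  # arbitrary normalized state
state = np.kron(psi, np.array([1, 0, 0, 0], dtype=complex))

# Share a Bell pair between qubits 1 and 2, then run the teleportation circuit.
state = cnot(1, 2) @ (op([I2, H, I2]) @ state)
state = op([H, I2, I2]) @ (cnot(0, 1) @ state)

# "Measure" qubits 0 and 1 by sampling one outcome according to the Born rule.
rng = np.random.default_rng(1)
branches = {}
for m0 in (0, 1):
    for m1 in (0, 1):
        proj = op([P1 if m0 else P0, P1 if m1 else P0, I2])
        amp = proj @ state
        branches[(m0, m1)] = (float(np.vdot(amp, amp).real), amp)

outcomes = list(branches)
probs = np.array([branches[o][0] for o in outcomes])
m0, m1 = outcomes[rng.choice(len(outcomes), p=probs / probs.sum())]
collapsed = branches[(m0, m1)][1] / np.sqrt(branches[(m0, m1)][0])

# Send the two classical bits (m0, m1) and apply the conditional corrections to qubit 2.
if m1:
    collapsed = op([I2, I2, X]) @ collapsed
if m0:
    collapsed = op([I2, I2, Z]) @ collapsed

# Qubit 2 now carries the original state; qubits 0 and 1 are left in |m0 m1>.
start = (m0 << 2) | (m1 << 1)
print("measurement results:", m0, m1)
print("recovered state on qubit 2:", np.round(collapsed[start:start + 2], 6))
print("original state:            ", psi)
```

Whichever two classical bits come out of the measurement, the corrected third qubit ends up in the original state, which is the concrete sense in which only quantum information, not matter, moves from sender to receiver.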