Quantum cognition is a new research program that uses mathematical principles from quantum theory as a framework to explain human cognition, including judgment and decision making, concepts, reasoning, memory, and perception. This research is not concerned with whether the brain is a quantum computer. Instead, it uses quantum theory as a fresh conceptual framework and a coherent set of formal tools for explaining puzzling empirical findings in psychology. In this introduction, we focus on two quantum principles as examples to show why quantum cognition is an appealing new theoretical direction for psychology. The first is complementarity, which suggests that some psychological measures have to be made sequentially and that the context generated by the first measure can influence responses to the next one, producing measurement order effects. The second is superposition, which suggests that some psychological states cannot be defined with respect to definite values; instead, all possible values within the superposition have some potential for being expressed. We present evidence showing how these two principles work together to provide a coherent explanation for many divergent and puzzling phenomena in psychology. (PsycInfo Database Record © 2020 APA, all rights reserved)
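The order effects described in the abstract have a simple formal core: sequential questions are modeled as non-commuting projectors, so the probability of answering "yes" to both depends on which question comes first. A minimal numerical sketch (the belief state and question directions below are invented for illustration, not taken from the paper):

```python
import numpy as np

# Two incompatible "yes" answers modeled as non-commuting projectors
# in a 2-D state space (hypothetical values, for illustration only).
P_A = np.array([[1.0, 0.0], [0.0, 0.0]])      # projector onto |0>
theta = np.pi / 5
b = np.array([np.cos(theta), np.sin(theta)])
P_B = np.outer(b, b)                           # projector onto |b>

psi = np.array([np.cos(0.9), np.sin(0.9)])     # initial belief state

def p_sequence(first, second, state):
    """Probability of answering 'yes' to both questions, asked in this order."""
    return np.linalg.norm(second @ (first @ state)) ** 2

p_ab = p_sequence(P_A, P_B, psi)   # A asked first, then B
p_ba = p_sequence(P_B, P_A, psi)   # B asked first, then A
print(p_ab, p_ba)                  # the two orders give different probabilities
```

Because `P_A @ P_B != P_B @ P_A`, the two question orders yield different joint probabilities, which is the formal signature of a measurement order effect.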
What is quantum cognition
This second position, while certainly not inconsistent with realism per se, turns upon a distinction involving a notion of “observation”, “measurement”, “test”, or something of this sort—a notion that realists are often at pains to avoid in connection with fundamental physical theory. Of course, any realist account of a statistical physical theory such as quantum mechanics will ultimately have to render up some explanation of how measurements are supposed to take place. That is, it will have to give an account of which physical interactions between “object” and “probe” systems count as measurements, and of how these interactions cause the probe system to evolve into final “outcome-states” that correspond to—and have the same probabilities as—the outcomes predicted by the theory. This is the notorious measurement problem.
In fact, Putnam advanced his version of quantum-logical realism as offering a (radical) dissolution of the measurement problem: According to Putnam, the measurement problem (and indeed every other quantum-mechanical “paradox”) arises through an improper application of the distributive law, and hence disappears once this is recognized. This proposal, however, is widely regarded as mistaken.[4]
As mentioned above, realist interpretations of quantum mechanics must be careful in how they construe the phrase “the observable A has a value in the set B”. The simplest and most traditional proposal—often dubbed the “eigenstate-eigenvalue link” (Fine [1973])—is that this statement holds if and only if a measurement of A yields a value in the set B with certainty, i.e., with (quantum-mechanical!) probability 1. While this certainly gives the statement a realist interpretation,[5] it does not provide a solution to the measurement problem. Indeed, we can use it to give a sharp formulation of that problem: even though A is certain to yield a value in B when measured, unless the quantum state is an eigenstate of the measured observable A, the system does not possess any categorical property corresponding to A’s having a specific value in the set B.
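The eigenstate-eigenvalue link can be stated compactly in standard notation (the projector symbol below is ours, not the excerpt's):

```latex
% Eigenstate-eigenvalue link: let P^{A}_{B} denote the spectral
% projection of the observable A onto the Borel set B. The state
% \psi assigns the property "A has a value in B" exactly when the
% measurement outcome is certain:
\text{``$A$ has a value in $B$'' holds in } \psi
\iff \langle \psi \mid P^{A}_{B}\,\psi \rangle = 1
\iff P^{A}_{B}\,\psi = \psi .
```

The second equivalence holds because a projector has expectation 1 in a state only when that state lies entirely in its range, i.e., when ψ is an eigenstate of A confined to B.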
Humans can sometimes be hard to understand, much like quantum physics — unless you watch this channel regularly, of course. That’s why a mathematician has proposed the idea of “quantum cognition”. What is this so-called quantum cognition? Does it explain why humans make irrational decisions? Let’s have a look.
Combinatorial optimization problems (COPs) arise in many fields, such as logistics, supply chain management, machine learning, materials design and drug discovery, wherever the optimal solution to a complex problem must be found. These problems are usually very computationally intensive on classical computers, so solving COPs using quantum computers has attracted significant attention from both academia and industry.
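A common first step in handing a COP to quantum hardware is to recast it as a QUBO (quadratic unconstrained binary optimization) problem, which annealers and variational circuits can then minimize. A small sketch, using Max-Cut on a toy graph as the COP (the graph and the brute-force solver are ours, for illustration; a quantum device would replace the exhaustive search):

```python
import itertools
import numpy as np

# Max-Cut on a toy 4-node graph, encoded as a QUBO:
# minimize x^T Q x over binary vectors x, where each edge (i, j)
# contributes -x_i - x_j + 2*x_i*x_j (i.e., -1 when the edge is cut).
edges = [(0, 1), (1, 2), (2, 3), (3, 0), (0, 2)]
n = 4

Q = np.zeros((n, n))
for i, j in edges:
    Q[i, i] -= 1
    Q[j, j] -= 1
    Q[i, j] += 2

def qubo_energy(x):
    x = np.array(x)
    return float(x @ Q @ x)

# Classical brute force over all 2^n assignments; quantum approaches
# (annealing, QAOA) search the same energy landscape without enumeration.
best = min(itertools.product([0, 1], repeat=n), key=qubo_energy)
print(best, -qubo_energy(best))   # -energy = number of edges crossing the cut
```

For this graph the minimum-energy assignment separates nodes {1, 3} from {0, 2}, cutting 4 of the 5 edges; the exponential growth of the search space with `n` is exactly why quantum approaches are attractive here.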
Over the past decade, organic luminescent materials have been recognized by academia and industry alike as promising components for light, flexible and versatile optoelectronic devices such as OLED displays. However, it is a challenge to find suitably efficient materials.
To address this challenge, a joint research team has developed a novel approach combining a machine learning model with quantum-classical computational molecular design to accelerate the discovery of efficient OLED emitters. This research was published May 17 in Intelligent Computing.
The optimal OLED emitter discovered by the authors using this “hybrid quantum-classical procedure” is a deuterated derivative of Alq3 and is both extremely efficient at emitting light and synthesizable.
Ever since its existence was first inferred, dark matter has remained invisible to scientists, despite the launch of multiple ultra-sensitive particle detector experiments around the world over several decades.
Now, physicists at the Department of Energy’s (DOE) SLAC National Accelerator Laboratory are proposing a new way to look for dark matter using quantum devices, which might be naturally tuned to detect what researchers call thermalized dark matter.
Most dark matter experiments hunt for galactic dark matter, which rockets into Earth directly from space, but another kind might have been hanging around Earth for years, said SLAC physicist Rebecca Leane, who was an author of the new study.
Entanglement is a widely studied quantum physics phenomenon, in which two particles become linked in such a way that the state of one affects the state of the other, irrespective of the distance between them. When studying systems composed of several strongly interacting particles (i.e., many-body systems) in two or more dimensions, numerically predicting the amount of information shared between these particles, a measure known as entanglement entropy (EE), becomes highly challenging.
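For a small system, entanglement entropy is straightforward to compute directly: trace out part of the system and take the von Neumann entropy of the reduced density matrix. A minimal example (not from the article) for a two-qubit Bell state:

```python
import numpy as np

# Entanglement entropy of the Bell state (|00> + |11>)/sqrt(2):
# trace out qubit B, then take the von Neumann entropy of rho_A.
psi = np.array([1.0, 0.0, 0.0, 1.0]) / np.sqrt(2)

rho = np.outer(psi, psi.conj()).reshape(2, 2, 2, 2)  # indices: a, b, a', b'
rho_A = np.einsum('abcb->ac', rho)                   # partial trace over qubit B

eigvals = np.linalg.eigvalsh(rho_A)
eigvals = eigvals[eigvals > 1e-12]                   # drop numerical zeros
entropy = -np.sum(eigvals * np.log2(eigvals))        # von Neumann entropy, in bits
print(entropy)   # 1.0 bit: the qubits share one full bit of entanglement
```

The difficulty the article describes is that this brute-force route scales exponentially: for a 2-D many-body system the state vector, and hence the reduced density matrix, quickly becomes too large to store, let alone diagonalize.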
More than two decades ago, scientists predicted that at ultra-low temperatures, many atoms could undergo ‘quantum superchemistry’ and chemically react as one. They’ve finally shown it’s real.
As transistors are made ever tinier to fit more computing power into a smaller footprint, they bump up against a big problem: quantum mechanics. Electrons get jumpy in small devices and leak out, which wastes energy while degrading performance. Now a team of researchers is showing that it doesn’t have to be that way. With careful engineering, it’s possible to turn electrons’ quantum behavior into an advantage.
A team of English, Canadian, and Italian researchers has developed a single-molecule transistor that harnesses quantum effects. At low temperatures, the single-molecule device shows a strong change in current for only a small change in gate voltage, approaching a physical limit on the subthreshold swing. Getting near or beyond this limit will allow transistors to be switched with lower voltages, making them more efficient and generating less waste heat. The research team, including physicists at Queen Mary University of London, achieved this by taking advantage of how quantum interference alters the flow of current in single molecules.
“We’ve demonstrated, in principle, that you can use destructive quantum interference for something useful.” —Jan Mol, Queen Mary University of London.
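The physical limit at issue is the thermionic ("Boltzmann") bound on subthreshold swing, SS = ln(10)·kT/q, roughly 60 mV per decade of current at room temperature. A quick back-of-the-envelope check (standard device physics, not a figure from the article):

```python
import math

# Thermionic limit on subthreshold swing: SS = ln(10) * kT/q.
# Conventional transistors cannot switch more steeply than this at a
# given temperature; quantum-interference devices aim to approach or
# beat it.
k = 1.380649e-23      # Boltzmann constant, J/K
q = 1.602176634e-19   # elementary charge, C
T = 300.0             # room temperature, K

ss_limit = math.log(10) * k * T / q        # volts per decade of current
print(f"{ss_limit * 1e3:.1f} mV/decade")   # ~59.5 mV/decade at 300 K
```

The limit shrinks linearly with temperature, which is one reason the single-molecule device described above was operated cold.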