
We use a convolutional neural network (CNN) to study cosmic string detection in cosmic microwave background (CMB) flat sky maps with Nambu-Goto strings. On noiseless maps we can measure string tensions down to order $10^{-9}$; however, when noise is included we are unable to measure string tensions below $10^{-7}$. Motivated by this impasse, we derive an information-theoretic bound on the detection of the cosmic string tension $G\mu$ from CMB maps. In particular, we bound the information entropy of the posterior distribution of $G\mu$ in terms of the resolution, noise level, and total survey area of the CMB.
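One generic way such a bound can arise (a schematic sketch, not necessarily the paper's exact expression): treat the survey as $N \approx A/\theta^2$ independent pixels of a noisy Gaussian channel, where $A$ is the survey area and $\theta$ the resolution. The mutual information between the map and $G\mu$ is then capped by the per-pixel channel capacity, so the posterior entropy cannot drop below

$$H(G\mu \mid \mathrm{map}) \;\ge\; H(G\mu) \;-\; \frac{A}{\theta^{2}} \cdot \frac{1}{2}\log_{2}\!\left(1 + \frac{s^{2}}{\sigma^{2}}\right),$$

with $s$ the string-induced signal per pixel and $\sigma$ the noise level.

On the detection side, here is a minimal sketch of a CNN regressor for the string tension, assuming simulated $64\times64$ flat-sky temperature patches; the architecture, sizes, and the choice of regressing $\log_{10} G\mu$ are illustrative assumptions, not the authors' exact setup:

```python
# Illustrative CNN regressor for the string tension from CMB patches.
# All sizes and layer choices here are assumptions for the sketch.
import torch
import torch.nn as nn

class StringTensionCNN(nn.Module):
    def __init__(self):
        super().__init__()
        self.features = nn.Sequential(
            nn.Conv2d(1, 16, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 64 -> 32
            nn.Conv2d(16, 32, kernel_size=3, padding=1), nn.ReLU(),
            nn.MaxPool2d(2),                      # 32 -> 16
            nn.Conv2d(32, 64, kernel_size=3, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1),              # global average pool
        )
        self.head = nn.Linear(64, 1)              # predicts log10(G*mu)

    def forward(self, cmb_patch):
        x = self.features(cmb_patch)
        return self.head(x.flatten(1))

# Toy usage: a batch of 8 random 64x64 "temperature patches".
maps = torch.randn(8, 1, 64, 64)
log_gmu_pred = StringTensionCNN()(maps)           # shape (8, 1)
```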

Google, Microsoft, and Amazon are teaming up with Big Oil to squeeze more oil and gas out of the ground using machine learning technology.

Sources:
Brian Merchant (Gizmodo) https://gizmodo.com/how-google-microsoft-and-big-tech-are-au…1832790799
Christopher M. Matthews (Wall Street Journal) https://www.wsj.com/articles/silicon-valley-courts-a-wary-oil-patch-1532424600
Matt Novak (Gizmodo) https://paleofuture.gizmodo.com/article-from-1975-the-world-…1732903871
Kasia Tokarska
Daniel Civitarese
Ghassan AlRegib — https://ghassanalregib.info/

Google, Microsoft, and Amazon have been very vocal about their efforts to reduce the world’s dependence on fossil fuels. But as The Wall Street Journal and Gizmodo have reported, these same companies are currently teaming up with the fossil fuel industry to help it squeeze as much oil and gas out of the ground as possible.

Oil has always been hard to find and hard to extract, and so the industry has teetered precariously on the edge of profitability several times over the course of its history. Over and over again, experts have predicted that we’ll soon run out of accessible, affordable oil — but so far, they’ve been wrong. Just when things look bleakest for black gold, new technology swoops in to keep the industry afloat.

Experiments in rodents have revealed that engrams exist as multiscale networks of neurons. An experience becomes stored as a potentially retrievable memory in the brain when excited neurons in a brain region such as the hippocampus or amygdala become recruited into a local ensemble. These ensembles combine with others in other regions, such as the cortex, into an “engram complex.” Crucial to this process of linking engram cells is the ability of neurons to forge new circuit connections, via processes known as “synaptic plasticity” and “dendritic spine formation.” Importantly, experiments show that the memory initially stored across an engram complex can be retrieved by its reactivation but may also persist “silently” even when memories cannot be naturally recalled, for instance in mouse models used to study memory disorders such as early stage Alzheimer’s disease.
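As a loose computational analogy (not a biological model), this store-and-reactivate behavior resembles a Hopfield attractor network: a Hebbian rule strengthens the connections among co-active units, and later presenting even a partial cue pulls the network back to the full stored pattern, much as reactivating part of an ensemble retrieves the memory. A minimal sketch:

```python
# Toy Hopfield-network analogy for engram storage and reactivation.
import numpy as np

rng = np.random.default_rng(0)
n = 100
engram = rng.choice([-1, 1], size=n)        # the stored activity pattern

# Hebbian "plasticity": strengthen connections between co-active units.
weights = np.outer(engram, engram) / n
np.fill_diagonal(weights, 0)

# Partial reactivation cue: corrupt 30% of the pattern.
cue = engram.copy()
flip = rng.choice(n, size=30, replace=False)
cue[flip] *= -1

# Recurrent dynamics settle back onto the stored pattern.
state = cue
for _ in range(10):
    state = np.sign(weights @ state)
    state[state == 0] = 1

print("overlap with stored engram:", (state == engram).mean())  # ~1.0
```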

“More than 100 years ago Semon put forth a law of engraphy,” wrote Josselyn, Senior Scientist at SickKids, Professor of Psychology and Physiology at the University of Toronto, and Senior Fellow in the Brain, Mind & Consciousness Program at the Canadian Institute for Advanced Research (CIFAR), and Tonegawa, Picower Professor of Biology and Neuroscience at the RIKEN-MIT Laboratory for Neural Circuit Genetics at MIT and Investigator of the Howard Hughes Medical Institute. “Combining these theoretical ideas with the new tools that allow researchers to image and manipulate engrams at the level of cell ensembles facilitated many important insights into memory function.”

“For instance, evidence indicates that both increased intrinsic excitability and synaptic plasticity work hand in hand to form engrams and that these processes may also be important in memory linking, memory retrieval, and memory consolidation.”

For as much as the field has learned, Josselyn and Tonegawa wrote, there are still important unanswered questions and untapped potential applications: How do engrams change over time? How can engrams and memories be studied more directly in humans? And can applying knowledge about biological engrams inspire advances in artificial intelligence, which in turn could feed back new insights into the workings of engrams?


If you’re interested in mind uploading, I have a book I highly recommend: Rethinking Consciousness, by Michael S. A. Graziano, a Princeton University professor of psychology and neuroscience.

Early in his book, Graziano writes a short summary:

“This book, however, is written entirely for the general reader. In it, I attempt to spell out, as simply and clearly as possible, a promising scientific theory of consciousness — one that can apply equally to biological brains and artificial machines.”

The theory is Attention Schema Theory.

I found this work compelling because one of the main issues in mind uploading is how to make an inanimate object (like a robot or a computer) conscious. Graziano’s Attention Schema Theory provides a methodology.
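To make the idea concrete, here is a deliberately loose toy sketch (my own reading of the theory, not Graziano's formalism): an agent computes a graded attention weighting over its inputs and also maintains a simplified schema of that process, and its report of "what it is aware of" is generated from the schema rather than from the raw attentional state.

```python
# Toy illustration of an attention schema: a lossy self-model of attention.
import numpy as np

def softmax(x):
    e = np.exp(x - x.max())
    return e / e.sum()

stimuli = {"red light": 2.0, "soft hum": 0.5, "itch": 1.2}  # salience scores

# The attention process itself: a graded weighting over inputs.
names = list(stimuli)
weights = softmax(np.array([stimuli[k] for k in names]))

# The attention *schema*: a simplified, lossy model of that process --
# just "what am I focused on, and roughly how strongly".
focus = names[int(weights.argmax())]
schema = {"focus": focus,
          "strength": "strong" if weights.max() > 0.5 else "diffuse"}

# The agent reports from its schema, not from the raw weights.
print(f"I am aware of the {schema['focus']} (attention feels {schema['strength']}).")
```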