
You might wonder, at some point today, what’s going on in another person’s mind. You may compliment someone’s great mind, or say they are out of their mind. You may even try to expand or free your own mind.

But what is a mind? Defining the concept is a surprisingly slippery task. The mind is the seat of consciousness, the essence of your being. Without a mind, you cannot be considered meaningfully alive. So what exactly, and where precisely, is it?

Traditionally, scientists have tried to define the mind as the product of brain activity: The brain is the physical substance, and the mind is the conscious product of those firing neurons, according to the classic argument. But growing evidence shows that the mind goes far beyond the physical workings of your brain.

Read more

Tilt Brush example. — Pictures courtesy of HTC Vive

LONDON, Dec 27 — From January 11 to 14, 2017, the Royal Academy of Arts in London will present the first ever 3D-printed artworks in virtual reality, produced in collaboration with HTC Vive.

Artists from the Royal Academy and its alumni will create artwork using the HTC Vive virtual reality platform; visitors to the exhibition will be able to experience these creations in real time, “fully immersing themselves in the virtual piece.”

Read more

Breaches, hacking, ransomware, cyber threats, weaponized AI, and smart toothbrushes are but a few examples of scary tech out there to make your day less than fantastic.

Weapons systems that think on their own are in production, and governments are racing to catch up on how to regulate these fast-moving advancements.

Police and military forces already use drones and robots to eliminate threats, but (as far as we know) that hardware is controlled by humans.

Read more

Excellent read on the brain’s inhibitory versus excitatory circuits in the processing of smells.


Summary: Inhibitory neurons form neural networks that become broader as they mature, a new study reports.

Source: Baylor College of Medicine.

Scientists have discovered that networks of inhibitory brain cells or neurons develop through a mechanism opposite to the one followed by excitatory networks. Excitatory neurons sculpt and refine maps of the external world throughout development and experience, while inhibitory neurons form maps that become broader with maturation. This discovery adds a new piece to the puzzle of how the brain organizes and processes information. Knowing how the normal brain works is an important step toward understanding the nature of neurological conditions and opens the possibility of finding treatments in the future. The results appear in Nature Neuroscience.

Nice: using a gene regulatory protein from yeast as a way to reduce the work required to make cell-specific perturbations.


The human brain, the most complex object in the universe, has 86 billion neurons with trillions of yet-unmapped connections. Understanding how it generates behavior is a problem that has beguiled humankind for millennia, and is critical for developing effective therapies for the psychiatric disorders that incur heavy costs on individuals and on society. The roundworm C. elegans, measuring a mere 1 millimeter, is a powerful model system for understanding how nervous systems produce behaviors. Unlike the human brain, it has only 302 neurons, and its neural wiring of 6,000 connections has been completely mapped, making it the closest thing to a computer circuit board in biology. Despite its relative simplicity, the roundworm exhibits behaviors ranging from simple reflexes to the more complex, such as searching for food when hungry, learning to avoid food that previously made it ill, and social behavior.

Understanding how this dramatically simpler nervous system works will give insights into how our vastly more complex brains function and is the subject of a paper published on December 26, 2016, in Nature Methods.

Read more

Not sure that I would claim 2016 as the year that AI exploded; I believe a better term for 2016 is the year that AI reinvented itself. I still see us in an evolution trend in 2017: we need to see far more AI technology embedded in our back-office platforms and apps than we have today before we can claim a real AI explosion. Once we start seeing more IT organizations and CxOs embracing it to lower their operational costs, then we can claim we’re in an explosion.


Artificial intelligence isn’t a new concept. It is something that companies and businesses have been trying to implement (and something that society has feared) for decades. However, with all the recent advances in democratizing artificial intelligence and using it for good, almost every company started to turn to this technology in 2016.

The year started with Facebook’s CEO Mark Zuckerberg announcing his plan to build an artificially intelligent assistant to do everything from adjusting the temperature in his house to checking up on his baby girl. He worked throughout the year to bring his plan to life, with an update in August that stated he was almost ready to show off his AI to the world.

In November, Facebook announced it was beginning to focus on giving computers the ability to think, learn, plan and reason like humans. In order to change the negative stigma people associate with AI, the company ended its year with the release of AI educational videos designed to make the technology easier to understand.

Since their development in 1960, lasers have become an indispensable tool supporting our modern society, finding use in fields such as medicine, information, and industry. Thanks to their compact size and energy efficiency, semiconductor lasers are now one of the most important classes of laser, making possible a diverse range of applications. However, the threshold current of a typical semiconductor laser—the minimum electrical current required to induce lasing—increases with temperature. This is one of a number of disadvantages that can be overcome by using quantum dot lasers. Professor Yasuhiko Arakawa of the Institute of Industrial Science at the University of Tokyo has been researching quantum dot lasers for about 35 years, from their conception to commercialization.
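The temperature sensitivity mentioned above is often captured by an empirical relation: the threshold current roughly follows I_th(T) = I_0·exp((T − T_ref)/T_0), where T_0 is the device’s “characteristic temperature.” Quantum dot lasers are attractive partly because their T_0 can be very large, making the threshold nearly temperature-independent. Below is a minimal sketch of that relation, with invented parameter values chosen only to show the trend.

```python
import math

def threshold_current(temp_c, i0_ma, t0, t_ref_c=20.0):
    """Empirical model I_th(T) = I0 * exp((T - T_ref) / T0).

    T0 is the characteristic temperature: the larger it is, the less the
    threshold current grows as the laser heats up.
    """
    return i0_ma * math.exp((temp_c - t_ref_c) / t0)

# Illustrative, made-up parameters: a conventional laser with a modest T0
# versus a near-ideal quantum dot laser whose T0 is very large.
for temp_c in (20, 50, 80):
    conventional = threshold_current(temp_c, i0_ma=10.0, t0=60.0)
    quantum_dot = threshold_current(temp_c, i0_ma=10.0, t0=1000.0)
    print(f"{temp_c:3d} C  conventional: {conventional:5.1f} mA   quantum dot: {quantum_dot:5.1f} mA")
```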

An electron trapped in a microscopic box

Sunlight is composed of light of various colors. The property that determines the color of light is its wavelength, or in other words, the distance between two successive wave peaks or troughs. The location of the peaks and troughs in the waveform is known as its phase. As a laser emits light waves in a uniform phase at the same wavelength, the light can be transmitted as a beam over long distances at high intensity.
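To see in numbers why a uniform phase at a single wavelength gives high intensity, here is a toy sketch (not from the article, with arbitrary values): N waves added in phase produce an intensity that grows like N², while the same N waves with random phases only reach an intensity of roughly N.

```python
import cmath
import random

def total_intensity(phases):
    """Add unit-amplitude waves as complex phasors and return |sum|^2."""
    field = sum(cmath.exp(1j * p) for p in phases)
    return abs(field) ** 2

n_waves = 1000
in_phase = total_intensity([0.0] * n_waves)  # laser-like: identical phase
random_phase = total_intensity(
    [random.uniform(0.0, 2.0 * cmath.pi) for _ in range(n_waves)]
)  # incoherent light: phases scattered at random

print(f"in-phase intensity:     {in_phase:9.0f}  (~N^2 = {n_waves**2})")
print(f"random-phase intensity: {random_phase:9.0f}  (~N   = {n_waves})")
```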

Read more

What’s next? Nanocavities in a diamond for small devices.


Researchers have developed a new type of light-enhancing optical cavity that is only 200 nanometers tall and 100 nanometers across. Their new nanoscale system represents a step toward brighter single-photon sources, which could help propel quantum-based encryption techniques under development.

Quantum encryption techniques, which are seen as likely to be central to future data encryption methods, use individual photons as an extremely secure way to encode data. A limitation of these techniques has been the ability to emit photons at high rates. “One of the most important figures of merit for single-photon sources is brightness — or collected photons per second — because the brighter it is, the more data you can transmit securely with quantum encryption,” said Yousif Kelaita of Stanford University.

In the journal Optical Materials Express, from The Optical Society (OSA), Kelaita and his colleagues show that their new nanocavity significantly increased the emission brightness of quantum dots — nanometer-scale semiconductor particles that can emit single photons.
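As a rough, hypothetical illustration of that figure of merit (numbers invented, not taken from the paper): a single-photon emitter can produce at most one photon per excited-state lifetime, so the detected rate is bounded by (1 / lifetime) × (collection efficiency), and a cavity that shortens the emitter’s lifetime raises that bound.

```python
def max_photon_rate(lifetime_ns, collection_efficiency):
    """Rough upper bound on detected single photons per second:
    at most one photon per radiative lifetime, scaled by the fraction collected."""
    emission_rate_hz = 1.0 / (lifetime_ns * 1e-9)
    return emission_rate_hz * collection_efficiency

# Hypothetical quantum-dot parameters, purely for illustration.
bare_lifetime_ns = 1.0        # emitter outside any cavity
lifetime_reduction = 10.0     # assumed speed-up from the nanocavity
cavity_lifetime_ns = bare_lifetime_ns / lifetime_reduction

print(f"without cavity: {max_photon_rate(bare_lifetime_ns, 0.1):.2e} photons/s")
print(f"with cavity:    {max_photon_rate(cavity_lifetime_ns, 0.1):.2e} photons/s")
```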

Caloric restriction can help tumour suppression.


Tumor suppressors stop healthy cells from becoming cancerous. Researchers from Charité — Universitätsmedizin Berlin, the Medical University of Graz and the German Institute of Human Nutrition in Potsdam-Rehbruecke have found that p53, one of the most important tumor suppressors, accumulates in liver after food withdrawal. They also show that p53 in liver plays a crucial role in the body’s metabolic adaptation to starvation. These findings may provide the foundation for the development of new treatment options for patients with metabolic or oncologic disorders. Results of this study have been published in The FASEB Journal.

Previously described as the ‘guardian of the genome’ and voted ‘Molecule of the Year’ in 1993, p53 is one of the most important proteins regulating cell growth and a major focus for oncology research. It is a protein that has the ability to interrupt the cell cycle and block the division of diseased cells. In order to better understand its physiological regulation, the researchers led by Prof. Dr. Michael Schupp from Charité’s Institute of Pharmacology studied the regulation and function of p53 in normal, non-cancerous cells. After withholding food from mice for several hours, the researchers were able to show that p53 protein accumulates in the liver. In order to determine which cell type causes this accumulation, the researchers repeated the experiment using cultured hepatocytes. They found that the starvation-induced accumulation of p53 was indeed detectable in hepatocytes, irrespective of whether these cells were of mouse or human origin.

“Our data also suggest that the accumulation of p53 is mediated by a cellular energy sensor, and that it is crucial for the metabolic changes associated with starvation,” explains Prof. Michael Schupp. The researchers were able to show that mice with an acute inactivation of the p53 gene in liver had difficulty adapting their metabolism to starvation. “Food intake seems crucial in determining the protein levels of p53 in liver, and p53 also plays an important role in normal liver metabolism,” says Prof. Schupp. The researchers are planning to study whether their observations are limited to liver cells, or whether this p53 accumulation also occurs in other tissues and organs.