Mentally count the windows in your home. Did you close your eyes? Visualize your house’s layout in your head? I did, when I tried this task. But some people, researchers have discovered, seem to be incapable of producing and holding such images in their mind’s eye. (They’re also perfectly capable of answering the window question.)
“Here, we show for the first time that continuously spoken speech can be decoded into the expressed words from intracranial electrocorticographic (ECoG) recordings.” Read more
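To make the idea of "decoding speech from ECoG recordings" concrete, here is a minimal, purely illustrative sketch: classify word labels from band-power features of electrode recordings using a linear model. The synthetic data, feature layout, and classifier are my own assumptions for illustration, not the pipeline used in the study quoted above.

```python
# Toy sketch of neural speech decoding: predict word identity from
# electrode band-power features. All data here is random noise, so the
# decoder will hover around chance; the point is only the pipeline shape.
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)

# Assumed shapes: 200 utterances, 64 electrodes, 50 time bins of
# high-gamma band power each, labelled with one of 10 words.
n_trials, n_electrodes, n_bins, n_words = 200, 64, 50, 10
band_power = rng.normal(size=(n_trials, n_electrodes, n_bins))
word_labels = rng.integers(0, n_words, size=n_trials)

# Flatten each trial's electrode-by-time grid into one feature vector.
X = band_power.reshape(n_trials, -1)
X_train, X_test, y_train, y_test = train_test_split(
    X, word_labels, test_size=0.25, random_state=0)

# A linear classifier stands in for the statistical models that map
# neural activity patterns to word identities.
decoder = LogisticRegression(max_iter=1000).fit(X_train, y_train)
print(f"held-out accuracy: {decoder.score(X_test, y_test):.2f}")
```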
The FDA-approved BrainPort V100 translates visual images into vibrations that can be felt on the tongue to help users better understand their surroundings.
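The general idea behind this kind of sensory substitution can be sketched in a few lines: downsample a camera frame to a small grid and map pixel brightness to per-electrode stimulation intensity. The grid size and scaling below are assumptions for illustration, not the BrainPort's actual specification.

```python
# Illustrative sensory-substitution mapping: grayscale frame -> small
# grid of stimulation intensities (one value per tongue electrode).
import numpy as np

def frame_to_tongue_pattern(frame: np.ndarray, grid: int = 20) -> np.ndarray:
    """Reduce a grayscale image (H x W, values 0-255) to a grid x grid
    array of stimulation intensities in [0, 1]."""
    h, w = frame.shape
    ys = np.linspace(0, h, grid + 1, dtype=int)
    xs = np.linspace(0, w, grid + 1, dtype=int)
    pattern = np.empty((grid, grid))
    for i in range(grid):
        for j in range(grid):
            # Average-pool each block of pixels into one electrode value.
            block = frame[ys[i]:ys[i + 1], xs[j]:xs[j + 1]]
            pattern[i, j] = block.mean() / 255.0
    return pattern

# Example: a bright square on a dark background becomes a matching patch
# of stronger stimulation in the output grid.
frame = np.zeros((240, 320), dtype=np.uint8)
frame[80:160, 120:200] = 255
print(frame_to_tongue_pattern(frame).round(1))
```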
My new story for Vice Motherboard on how in the near future we will edit our realities to suit our tastes and desires:
“In a sense, all four pillars of the mind-uploading roadmap—mapping the brain’s structure and function, creating the software and hardware to emulate it—are now areas of active research. If we take Koene’s optimistic view, within a decade, we may have the technological capacity to fully map and emulate a very simple brain—say, that of a Drosophila fruit fly, which contains roughly 100 thousand neurons.”
Brain activity recorded by electrocorticography electrodes (blue circles). Spoken words are then decoded from neural activity patterns in the blue/yellow areas. (credit: CSL/KIT)
If there were a drug that meant you never had to sleep again, would you take it? Would those who didn’t need to sleep have special advantages over those who did? All that and a side of zombies, in this week’s episode of Meanwhile in the Future.
In the Culture novels by Iain M. Banks, futuristic post-humans install devices in their brains called “neural laces.” A mesh that grows with your brain, a neural lace is essentially a wireless brain-computer interface. But it’s also a way to program your neurons to release certain chemicals with a thought. And now, there’s a neural lace prototype in real life.