More on Problems of Uploading an Identity
https://lifeboat.com/blog/2011/08/more-on-problems-of-uploading-an-identity
Sat, 20 Aug 2011

The vulnerability of the bio body is the source of most threats to its existence.

We have looked at the question of uploading the identity by uploading the memory contents, on the assumption that the identity is contained in the memories. I believe this assumption has been shown to be almost certainly wrong.

What we are concentrating on is the identity as the viewer of its perceptions, the centroid or locus of perception.

It is the fixed reference point. And the locus of perception is always Here, and it is always Now. This is abbreviated here to 0,0.

What more logical place to find the identity than where it considers Here and Now to be – its residence in space-time?

It would surely be illogical to start searching for the identity in a place it considers to be Somewhere Else, or at a time it considers to be Another Time.

We considered the fact that the human being accesses the outside world through its senses, and that its information processing system is able to present that information as being “external.” A hand is pricked with a pin. The sensory information – a stream of neural impulses, all essentially identical – progresses to the upper brain, where the pattern is read and the sensation of pain is felt. That sensation, however, is projected or mapped back onto the exact point it originated from.

One feels the pain at the place the neural disturbance came from. It is an illusion — a very useful illusion.

In the long, slow progress of evolution from the single cell to the human organism – and to the logical next step, the “android” (we must find a better word) – this mapping function must be one of the most vital survival strategies. If a predator is gnawing at your tail, it is smart to know where the pain is coming from.

It wasn’t just structure that evolved, but “smarts” too… smarter systems.

Each sensory channel conveys not just sensory information but also information about where that information came from – a set of outgoing information vectors, matched by a complementary set of incoming vectors. The vectors from the visual, auditory, tactile, and other channels all converge on one location – the locus of perception. And the channels cross-correlate. The hand is pricked – we immediately look at the place the pain came from. And one can “follow one’s nose” to find where the barbecue is.
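
As a rough illustration of the vector picture, here is a minimal sketch in Python. The data structure and the coordinates are invented for illustration: each sensory event carries both a signal and the location (relative to the perceiver at 0,0) it is mapped back onto, and a second channel can be oriented toward that same point – the prick-then-look reflex described above.

```python
# Toy model of the "outgoing/incoming vector" idea. The names and numbers
# here are illustrative only, not taken from any neuroscience library.
from dataclasses import dataclass

@dataclass
class SensoryEvent:
    channel: str      # e.g. "tactile", "visual", "auditory"
    intensity: float  # strength of the sensation
    origin: tuple     # (x, y) location the sensation is mapped back onto

def orient_toward(event: SensoryEvent) -> tuple:
    """Cross-correlation between channels: the direction another channel
    (e.g. gaze) should point to inspect the event's origin."""
    x, y = event.origin
    norm = (x * x + y * y) ** 0.5 or 1.0
    return (x / norm, y / norm)  # unit vector outward from the locus at (0, 0)

# A pin prick on the left hand: the pain is *felt* at the hand's location,
# and the visual channel is immediately directed to that same point.
pin_prick = SensoryEvent(channel="tactile", intensity=0.9, origin=(-0.4, 0.6))
print("look toward:", orient_toward(pin_prick))
```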

Dr Shu can use his left hand and arm and his right hand and arm in coordination to lift the $22M Ming vase he is in the process of stealing.

Left/right coordination — so obvious and simple it gets overlooked.

A condition known as synesthesia [http://hplusmagazine.com/editors-blog/sight-synesthesia-what…be-rewired ] provides an example of how two channels can become confused – for example, being able to see sounds or hear movement.

Perhaps the most interesting example is the rubber hand experiment from UC Riverside. In it, the subject places their hands palm down on a table; the left arm and hand are screened off, and a substitute left “arm” with a rubber hand is installed. After a while, the subject reacts as though the substitute were their real hand.

It is on YouTube at https://www.youtube.com/watch?v=93yNVZigTsk.

This phenomenon has been attributed to neuroplasticity.

A simpler explanation would be changed coordinates – something people who row or ride bicycles are familiar with, even if they have never analysed it. The vehicle becomes part of oneself; it becomes part of the system, an extension. What about applying the same sense on a grander scale? Such a simple and common observation may have just as much relevance to the next step in evolution as the number of teraflops.

So we can get the sensory vectors to be re-deployed. But one of the fundamental questions is: can we get the 0,0 locus, the centroid of perception, to shift to another place?
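
Continuing the sketch above (and only as a sketch, with invented numbers), “changed coordinates” can be read as re-expressing every sensory origin relative to a new reference point. If that re-mapping is applied consistently across all channels, the 0,0 locus has, in effect, moved:

```python
# Hypothetical illustration of a shifted locus of perception: express each
# event's origin relative to a new zero point instead of the old one.
def remap(origin: tuple, new_locus: tuple) -> tuple:
    """Return the event's origin in the coordinates of the shifted locus."""
    return (origin[0] - new_locus[0], origin[1] - new_locus[1])

# The cyclist's "body" extends to the front wheel: a jolt arising at the
# road surface is re-mapped so the wheel, not the skull, acts as 0,0.
front_wheel = (0.0, 1.2)         # invented new reference point
bump = (0.1, 1.3)                # where the jolt physically arises
print(remap(bump, front_wheel))  # roughly (0.1, 0.1): felt as "right here"
```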

Our environment, the environment we live in, is made of perception. Outside there may be rocks and rivers and rain and wind and thunder… but not in the head. Outside this “theater in the head,” there is a world of photons and particles and energy and radiation — reality — but what we see is what is visible, what we hear is what is audible, what we feel is what is tangible … that is our environment, that is where we live.

However, neurons do not emit any light, they do not make any sound, and they are not a source of pressure or temperature – so what the diddly are we watching and listening to?

We live in a world of perception. Thanks to powerful instrumentation and a great deal of scientific research, we know that behind this world of perception there are neurons, unnoticed by us, working away all the time to provide us with colors and tones and scents….

But they do not emit colors or tones or scents – the neuronal language is binary – fired or not fired.

Somewhere the neuronal binary (Fired/Not Fired) language has to be translated into the language of perception – the range of colors, the range of tones, the range of smells… these are each continuous variables, not two-state variables as in the language of neurons.
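
For concreteness, here is a sketch (in Python, with invented spike data) of the first and least controversial half of such a translation: rate coding, in which the frequency of identical all-or-nothing spikes recovers a continuous variable. It illustrates the gap rather than closing it – how that continuous rate then becomes an experienced color, tone, or smell is precisely the question left open here.

```python
# Spikes are all-or-nothing (1 = fired, 0 = not fired); a continuous value
# can be recovered from their rate. This is only one textbook candidate for
# the first translation step, not a claim about how perception arises.
def firing_rate(spike_train: list, window_s: float) -> float:
    """Firing rate in Hz over a recording window of window_s seconds."""
    return sum(spike_train) / window_s

# 100 ms of (invented) activity from one afferent: 7 spikes -> 70 Hz, a
# continuous quantity that downstream stages could map onto, say, pressure.
train = [1, 0, 0, 1, 1, 0, 1, 0, 1, 1, 0, 0, 1, 0, 0, 0]
print(firing_rate(train, window_s=0.1), "Hz")
```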

There has been a great flurry of research activity in the area of neurons, and what was considered “Gospel” 10 years ago is no longer so.

IBM, and ARM in the UK, have announced (summer 2011) prototype brain-like chips with hyper-connectivity – a step in the right direction, but the fundamental question of interpretation/translation is side-stepped.

I hope someone will prove me wrong, but I am not aware of anyone doing any work on the translator question. This is a grievous error.

(To be continued)
