I’ve been researching the relationship between biological neurons and the nodes in artificial neural networks. It is repeatedly claimed that neurons can perform complex information processing that vastly exceeds what a simple activation function in a neural network does.
The resources I’ve read so far suggest nothing fancy is happening in a neuron. The neuron sums the incoming signals from its synapses and fires when the sum passes a threshold. This is identical to the simple perceptron, the precursor to today’s fancy neural networks. If there is more to a neuron’s operation than this, I am missing it due to a lack of familiarity with the neuroscience terminology. I’ve also perused this Stack Exchange and haven’t found anything.
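To be concrete, this is the model I have in mind, written as a minimal perceptron-style sketch (the weights, threshold, and inputs below are arbitrary values I made up for illustration, not taken from any neuroscience source):

```python
# A perceptron-style unit: weighted sum of inputs, then a hard threshold.
# Weights, threshold, and example inputs are arbitrary illustrative values.

def perceptron(inputs, weights, threshold):
    total = sum(w * x for w, x in zip(weights, inputs))
    return 1 if total >= threshold else 0  # 1 = "fires", 0 = stays silent

# Example: three synapses with different strengths
print(perceptron(inputs=[1, 0, 1], weights=[0.4, 0.9, 0.3], threshold=0.5))  # -> 1
```

My question is whether a real neuron does anything that cannot be captured by something like this sum-and-threshold picture.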
If someone could point me to a detailed resource that explains the various complex ways a neuron processes incoming information, in particular what makes a neuron a more sophisticated information processor than a perceptron, I would be grateful.