> while indeed we can’t simulate the whole of a neuron, why would we want to do that? I think that is backwards. We only have to model the actually important function of a neuron.
Yeah so I think this is where we fundamentally differ. It seems like your assumption is that neurobiology is fundamentally messy and inefficient, and we should be able to dispense with the squishy bits and abstract out the real core "information processing" part to make something more efficient than a brain.
So if that's your assertion, what would that look like? What would be the subset of a neuron that we could simulate which would represent that distillation of the information processing part?
Because my argument would be, the squishy, messy cellular anatomy is the core information processing part. So if we try to emulate neural processing with the assumption that a whole neuron is the base unit, we will miss a lot of that micro-level processing which may be essential to reaching the utility and efficiency achieved by the human brain.
I'm not against the idea that whatever brains we happened to evolve are not the most efficient structure possible. But my position would be that we're probably quite far, in terms of current computing technology, from being able to build something better. I would imagine we might have to bioengineer better neurons if we really want to compete with the real thing, rather than trying to simulate it in software.
I can’t think of any field of research where the model used is completely accurate. At some point we will have to leave behind the messy real world. While a simple weighted node is insufficient for modeling a neuron, there are more complex models that are still orders of magnitude less complex than simulating every single interaction among however many moles of molecules are involved (which, as far as I know, we can’t even do on a few-molecule basis, let alone at such a huge volume).
But I feel I may be misrepresenting your point now. To answer your question: maybe a sufficient model (sufficient to reproduce some core functionality of the brain, e.g. forming memories) would be one that incorporates a weight for each kind of signal (neurotransmitter) it can process, along with a per-signal fatigue model. Perhaps we could also add the notable major interactions between pathways (e.g. activation of one temporarily decreasing the weight of another; though in a way, bias in the very basic NNs already does something like this). But to be honest, such a construction would be valuable even with arbitrary types of signals; there's no need to model it exactly on existing neurotransmitters. I think most of the properties interesting from a GAI perspective are emergent ones, and whether dopamine does this or that is an implementation detail of human brains.
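To make that concrete, here is a minimal sketch of the kind of model I mean: one weight and one fatigue state per signal type, with heavily-used pathways weakening temporarily and recovering when idle. All names and constants here are made up for illustration; the signal types are abstract labels, not real neurotransmitters.

```python
import math

class MultiSignalNeuron:
    """Toy neuron with a per-signal weight and a per-signal fatigue state.

    Hypothetical sketch: signal types are abstract ("A", "B"), and the
    fatigue/recovery constants are arbitrary, not fitted to biology.
    """

    def __init__(self, signal_types, fatigue_rate=0.2, recovery_rate=0.05):
        self.weights = {s: 1.0 for s in signal_types}
        self.fatigue = {s: 0.0 for s in signal_types}  # 0 = fresh, 1 = exhausted
        self.fatigue_rate = fatigue_rate
        self.recovery_rate = recovery_rate

    def step(self, inputs):
        """inputs: {signal_type: level}. Returns activation in (0, 1)."""
        total = 0.0
        for s, level in inputs.items():
            # A fatigued pathway contributes less, regardless of its weight.
            effective = self.weights[s] * (1.0 - self.fatigue[s])
            total += effective * level
            # Heavy use of a pathway fatigues it.
            self.fatigue[s] = min(1.0, self.fatigue[s] + self.fatigue_rate * level)
        # All pathways recover a little each step.
        for s in self.fatigue:
            self.fatigue[s] = max(0.0, self.fatigue[s] - self.recovery_rate)
        return 1.0 / (1.0 + math.exp(-total))  # squash to (0, 1)
```

Driving the same signal type repeatedly produces a declining response, which a plain fixed-weight node can't show.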
So just to unpack this a little - there's a lot of different mechanisms going on in neural computation.
For instance one of those is spike-timing dependent plasticity. Basically the idea is that the sensitivity of a synapse gets up-regulated or down-regulated depending on the relative timing of the firing of the two neurons involved. So in the classic example, if the up-stream neuron fires before the down-stream neuron, the synapse gets stronger. But if the down-stream neuron fires first, the synapse gets weaker.
Another one is synchronization. It appears that the firing of groups of neurons which are, for instance, representing the same feature becomes temporally synchronized. I.e. you could have different neural circuits active at the same time in the brain, but oscillating at different frequencies.
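The standard toy model for this kind of effect is the Kuramoto model: each oscillator drifts at its own natural frequency but is pulled toward the group's mean phase, and above a critical coupling strength the group phase-locks. This is a generic illustration of synchronization, not a claim about how real neural circuits implement it:

```python
import math

def kuramoto_step(phases, natural_freqs, coupling, dt=0.01):
    """One Euler step of the Kuramoto model. Each oscillator advances at
    its own natural frequency plus a coupling term pulling it toward the
    phases of the others."""
    n = len(phases)
    new_phases = []
    for i in range(n):
        pull = sum(math.sin(phases[j] - phases[i]) for j in range(n)) / n
        new_phases.append(phases[i] + dt * (natural_freqs[i] + coupling * pull))
    return new_phases

def coherence(phases):
    """Kuramoto order parameter r in [0, 1]: 1 = perfectly synchronized."""
    n = len(phases)
    re = sum(math.cos(p) for p in phases) / n
    im = sum(math.sin(p) for p in phases) / n
    return math.hypot(re, im)
```

Run a few thousand steps with strong coupling and oscillators starting at scattered phases, and the coherence climbs toward 1: the group synchronizes even though no unit's frequency is identical.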
Another interesting mechanism is how dopamine works in the Nucleus Accumbens. Here you have two different types of receptors at the same synapses: one of them is inhibitory, and is sensitive at low concentrations of dopamine. The other is excitatory, and is sensitive at high concentrations. What this means is that at a single synapse, the same up-stream neuron can either increase or decrease the activation of the down-stream neuron: if the up-stream neuron is firing just a little, the inhibitory receptors dominate. But if it's firing a lot, the excitatory receptors take over, and the down-stream neuron starts to activate more. What kind of connection weight in an ANN can model that?
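To illustrate the puzzle: the net effect of such a synapse flips sign with the up-stream firing rate. Here is a toy model of that idea (the saturating dose-response curves and all the gain/threshold constants are made up, not real receptor kinetics):

```python
def dual_receptor_effect(firing_rate, inh_gain=1.0, exc_gain=2.0,
                         inh_k=0.5, exc_k=2.0):
    """Net post-synaptic effect of one synapse with two receptor pools.

    Toy model of the low/high dopamine-sensitivity idea. The inhibitory
    pool saturates at low input (small half-saturation inh_k); the
    excitatory pool only engages at high input (large exc_k). The net
    sign therefore flips as the up-stream firing rate rises -- something
    a single fixed scalar weight cannot reproduce.
    """
    def saturating(x, k):
        return x / (x + k)  # simple saturating dose-response curve

    return (exc_gain * saturating(firing_rate, exc_k)
            - inh_gain * saturating(firing_rate, inh_k))
```

At a low firing rate the function returns a negative (inhibitory) value; at a high rate, a positive (excitatory) one. A plain ANN edge is a fixed scalar, so its sign never depends on input magnitude; you would need a nonlinear, input-dependent transfer function per connection to mimic this.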
My overall question would be, do you think back-propagation and Markov chains are really sufficient to account for all that subtlety we have in neural computation, especially when it comes to specific timing and frequency-dependent effects?
If Markov processes won’t cut it, a Turing machine will. And an ANN can approximate a Turing machine.
To boil it down, if you really want to argue that the behaviour of a neuron can’t be simulated by an ANN, you’re arguing that a neuron is doing something non-computable. At which point you might as well argue it’s magical.