Electrical properties of dendrites help explain our brain’s computing power (news.mit.edu)
180 points by magoghm on Oct 24, 2018 | 48 comments


They're comparing human brains to rodent brains here. Suzana Herculano-Houzel did some interesting work recently that showed that human (and primate, and bird) brains have neuron counts that scale linearly with volume while most mammals only scale something like the 3/4 power of volume. So I wonder how a crow or lemur brain would look in this regard.


Dr. Herculano-Houzel is a real-life Indiana Jones of the brain. She's an absolute jewel in science. I remember her telling a group of us about her efforts in Botswana(?) to get elephant brain specimens. Normally, to preserve a brain, you pump fixative in via the arteries at low concentrations. Even in a mouse, this takes a few hours. For megafauna, it can take a few days.

To get good elephant data, she told us, you have a new problem: scavengers. You take the elephant (per game-park regulations) and then get the skull off and gain access to the massive arteries. Then you have to find a tall enough tree to hang fixative from, as you can't get a pump out into the veld that will run for days. You just have to let gravity do the work. Then you slowly fix the brain in the skull in some random place in the bush. However, the rest of the elephant is very good at attracting lions, hyenas, etc. at all hours of the day. So you have to defend the skull from very hungry predators. It's not a 'fun' experience, in her recollection. I remember something about 4-bore rifles being used a fair bit.

Finally you have to get the brain out of the country, which, depending on international regulations and local bribery customs, can also be difficult.

Honestly, this woman is AMAZING. Here's her website: http://www.suzanaherculanohouzel.com/lab


Oddly, we know a lot more about the electrophysiology of neurons in mice than in humans. Compartmentalization of synaptic integration is known to have computational consequences, e.g. see Bartlett Mel's work: http://www.pnas.org/content/111/1/498.short


Well, as the article mentions, getting a viable sample "to play with" seems to be hard.


I don't find it odd; I don't plan to volunteer my living brain for patch clamp experiments :)


I don't think you can do that.


Wouldn't one use two-photon microscopy? Patch clamp is quite limited in comparison.

There is also the combination of both methods. Not surprisingly, they describe how hard the clamping is in a living brain (the two-photon microscopy part is much easier): https://www.the-scientist.com/daily-news/robotic-patch-clamp...

Here is an article describing "three dimensional two-photon brain imaging in freely moving mice using a miniature fiber coupled microscope": https://www.nature.com/articles/s41598-018-26326-3 (the experiments that I was aware of were all still done on a sedated animal with a fixed head).


When studying the propagation of dendritic signals, you would like to know the subthreshold voltage response at the soma. Since no spikes are elicited, there is no calcium signal. That's why you need patching.


I was unclear. My comment was just a tongue-in-cheek way of saying: it's not odd that we don't have much electrophysiological knowledge of single human neurons, because of the practical and ethical problems that come with brain slice electrophysiology on humans -- you need live tissue, which is hard to get.

What you've linked to deals with "in vivo" rather than brain slice ("in vitro") electrophysiology, which in humans would encounter the same issues. Cool techniques nonetheless.


There's quite a lot of fluff here, but I think the actual story is that some MIT researchers have been using novel techniques to do electrophysiology on dendrites?

Which is cool; I just wish the article had gone into more detail on that. (and maybe included references :-) )


It’s not an article, it’s a press release from MIT’s publicity office full of stuff reporters can pull out to write their own articles without attempting to understand the paper.

The actual paper is here: https://www.cell.com/cell/pdf/S0092-8674(18)31106-1.pdf


thank you! :-)


Dendritic recordings per se aren't novel, but the human brain aspect here is.



Would be interesting to see if genetically engineered rodents with similar neuronal characteristics have improved intelligence (for some measure of intelligence).


> electrical signals weaken more as they flow along human dendrites, resulting in a higher degree of electrical compartmentalization, meaning that small sections of dendrites can behave independently from the rest of the neuron

Reading this, I just realized that all this time I was using the wrong computing analogy to understand the brain: neurons are not the transistors of the brain, dendrites are!


Dendrites are just a specific part of a neuron (1), so that analogy is no more correct.

More importantly, neurons are nothing like transistors. Each neuron contains significant computing power. For example, in the retina of a frog there is a neuron that accepts as inputs dendrites from an area of light-sensitive cells (rods and cones) and only fires when a small, round, moving object is in the field of view. These neurons are colloquially called "bug perceivers" (2); a toy sketch of the idea follows below.

1 https://en.wikipedia.org/wiki/Neuron

2 "What the Frog's Eye Tells the Frog's Brain" (1959)


To be honest, using a transistor analogy may keep you on the wrong path for a long time as well :) There is not a lot of "digital" going on in those cells...


Wait - transistors are not digital, they're analog. And neurons are terribly digital, acting as rate controllers and pulse counters and dividers and so on.


Any digital analogy fails, since the biological signals attenuate with distance from source. A digital gate acts as a repeater, re-boosting the signal strength to original level. That is not the case with neurons/dendrites.

Perhaps a better model would treat each neuron as a current source instead of a voltage source. The more parallel paths that current is funneled into, the smaller it gets, until it is below the threshold of detection. You can have your transistor model allowing/blocking that current, but never boosting it back up.
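
A minimal sketch of that current-divider intuition in Python; every number is invented for illustration:

    # Neuron as a current source feeding parallel branches (idealized
    # even splitting). Past some branching factor the per-branch
    # current falls below an assumed detection floor.
    DETECTION_THRESHOLD_NA = 0.05  # assumed floor, in nA

    def branch_current(source_na, n_branches):
        return source_na / n_branches  # current divider

    for n in (1, 2, 8, 32):
        i = branch_current(1.0, n)
        print(f"{n:2d} branches: {i:.4f} nA each, detectable: {i >= DETECTION_THRESHOLD_NA}")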


> the biological signals attenuate with distance from source

The voltage signal is propagated in axons sometimes over very long distances (a meter, e.g. from the foot to the spine). It is renewed (see "Nodes of Ranvier") along the path for as long as needed, at intervals that depend on how, or whether, the axon is myelinated. That's why the speed is so slow (ca. 120 m/s max., often much slower) - it's actual ion movement through lots and lots of opening and closing voltage-gated channels all along the axon, compared to a purely electrical signal like in a metal wire. This means that while in a wire the electric field spans from start to end, leading to electrons exiting at the far end pretty much instantaneously, in an axon the electric field is only local, and only strong enough to trigger anything for about 1-2 millimeters at best. Then a new set of channels further up the axon has to be triggered by that field to renew the signal by letting ions in.

Attenuation is more important in dendrites and neuron bodies: the spatial spread of simultaneously incoming action potentials (via connected axons from other neurons) plays a role in determining whether the threshold is reached that would trigger this neuron to fire an action potential of its own. Since a dendritic arbor can be quite extensive and have lots of branches, location matters a lot: where exactly along the branches an incoming signal attaches.
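
For the dendritic side, the classic cable-theory caricature of that passive decay is an exponential with a length constant. A minimal Python sketch, assuming a made-up (but plausible-order) length constant; note that, unlike in the axon above, nothing here regenerates the signal:

    import math

    # Passive (electrotonic) attenuation along a dendrite:
    # V(x) = V0 * exp(-x / lambda). No regeneration, unlike an axon.
    LAMBDA_UM = 300.0  # length constant in micrometers (assumed value)

    def attenuated_mv(v0_mv, distance_um):
        return v0_mv * math.exp(-distance_um / LAMBDA_UM)

    for d_um in (0, 100, 300, 600, 1000):
        print(f"{d_um:4d} um from the synapse: {attenuated_mv(10.0, d_um):5.2f} mV")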


True. However, logic gates constructed from transistors are digital. I'm wondering what the corresponding higher-level neuronal structure would be...


Neurons are already logic gates. And FFT devices, and discriminators and ...


So basically neurons are ASICs, partially digital and partially analog.


They're basically cogs, rods and gears.

Seriously, molecular biologists devote decades of study and publish volumes figuring out how some small virus 'works'. A human neuron is orders of magnitude more complex than such a virus. To dismiss it as 'like an _____' is a (sometimes necessary) analogy of very little explanatory depth.


I'm not dismissing things. In fact, JoeAltmaier is correct that my analogy here was in the context of the usual "neuron as signal adder" / "neuron as transistor".

And yes, I'm aware neurons are complex. But that does not mean all of that complexity is relevant to the task of thinking; all living cells have machinery related to self-replication, self-maintenance and survival in a biological system.

Or, to use another analogy: a smartphone is orders of magnitude more complicated than a flashlight. Yet in the context of people illuminating their way during the night, a smartphone is just a flashlight.


Metaphors help understanding. A neuron is more on the order of an entire logic subsystem, than a transistor or a gate. That's a reasonable comparison-of-complexity statement.


Memistor/Memristor


https://en.wikipedia.org/wiki/Memristor

Many synapses behave very much like memristors, though caveats apply.
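
For the curious, a minimal memristor-flavored sketch in Python: the synaptic "weight" is a conductance that drifts with the history of current through the device. Parameters are toy values, not fitted to any real synapse:

    # Conductance as state that depends on charge history, loosely
    # analogous to use-dependent synaptic plasticity.
    class MemristiveSynapse:
        def __init__(self, g_min=0.1, g_max=1.0, rate=0.05):
            self.g = g_min  # conductance, the synaptic "weight"
            self.g_max, self.rate = g_max, rate

        def pass_current(self, voltage):
            current = self.g * voltage
            # conductance grows with the charge that just flowed
            self.g = min(self.g_max, self.g + self.rate * current)
            return current

    syn = MemristiveSynapse()
    print([round(syn.pass_current(1.0), 3) for _ in range(5)])
    # current creeps upward with use: repeated activity strengthens it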


That's oversimplified and usually how it's taught at the undergrad level. Neurons and especially the networks they form are more analogous to analog electronics, but both analogies end up failing at some point.


IAMNANS (I am not a neuroscientist), but my very simple understanding is that, to a reasonable approximation, inputs are analog (there's a lot of nuance in whether and how much signal makes it into a neuron) and outputs are digital (the neuron fires or doesn't), leading to all kinds of interesting circuitry.
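
The textbook cartoon of that picture is the leaky integrate-and-fire model: analog integration in, all-or-nothing spikes out. A minimal Python sketch with arbitrary constants:

    # Leaky integrate-and-fire: the "analog in, digital out" cartoon.
    def lif(inputs, leak=0.9, threshold=1.0):
        v, spikes = 0.0, []
        for x in inputs:        # analog input per time step
            v = leak * v + x    # leaky integration
            if v >= threshold:  # all-or-nothing output
                spikes.append(1)
                v = 0.0         # reset after firing
            else:
                spikes.append(0)
        return spikes

    print(lif([0.3, 0.3, 0.3, 0.3, 0.0, 0.9, 0.9]))
    # -> [0, 0, 0, 1, 0, 0, 1]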


From what I know, neurons typically click on or off, so they are digital in that way. Signal strength is implied by the frequency of those clicks, so that's analog. Transistors are analog electronics, but they are used to make digital switches, so it's sort of the reverse.

I guess my point is that calling neurons analog or digital is not a very good analogy and should probably be avoided.
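
That said, the "signal strength is implied by the frequency" part is easy to see in a toy accumulator: a constant analog drive into a fixed threshold fires at a rate proportional to the drive. A sketch with invented numbers:

    # Rate coding in miniature: the analog drive is recovered as a
    # spike frequency at the output.
    def firing_rate(drive, threshold=1.0, steps=1000):
        v, spikes = 0.0, 0
        for _ in range(steps):
            v += drive
            if v >= threshold:
                spikes += 1
                v -= threshold  # keep the remainder
        return spikes / steps

    for drive in (0.01, 0.05, 0.2):
        print(f"drive {drive:.2f} -> rate {firing_rate(drive):.2f} spikes/step")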


It's a mixed-signal, general-purpose ASIC done in wetware. It combines both digital and analog traits. Some of the most interesting research I saw in hardware CompSci does the same. It never goes mainstream, since industry prefers things that are easy to automate and scale up, for obvious reasons.


Transistors also switch fully on and off in, say, a switched-mode power supply, or a class D amplifier, or any other pulse-width-modulation-like scenario. That is all considered analog.


But whether or not the incoming signals ever lead to a new action potential depends on so many factors (spatial, temporal, chemical, biological, ... and it's all dynamic on many levels), all quite diverse and all arguably "analog", that it's hard to call anything "digital". The action potential, that thing that brings the "digital" analogy to mind, really is a very small part of that system.


The brain is analogous to an analog analogue of digital computing?


It is and it isn't. There is a two-layer model of the neuron that captures some properties, like the firing rates of hippocampal neurons, but there are also alternative models, e.g.: https://static1.squarespace.com/static/5267aed6e4b03cb52f5e0...
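
For anyone wondering what "two-layer" means operationally: each dendritic subunit sums its synapses and applies its own nonlinearity, and the soma then combines the subunit outputs, so a single neuron acts like a tiny two-layer network. A Python sketch with invented weights (not the paper's actual model):

    import math

    def sigmoid(x):
        return 1.0 / (1.0 + math.exp(-x))

    def two_layer_neuron(subunit_inputs, soma_threshold=1.5):
        # layer 1: each dendritic branch applies its own nonlinearity
        branch_out = [sigmoid(sum(inputs)) for inputs in subunit_inputs]
        # layer 2: the soma sums branch outputs and decides to fire
        return sum(branch_out) >= soma_threshold

    print(two_layer_neuron([[0.5, 0.4, 0.3], [0.1, -0.2]]))  # False
    print(two_layer_neuron([[0.5, 0.4, 0.3], [0.9, 0.8]]))   # True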


Thanks for this! I'm not in the neuroscience field, but the title itself tells me that I've a misconception that needs correcting.


Does this imply that the neuron is more of a power supply, and that the properties of sections of dendrites (not the whole dendrite) and their connections are what do the actual computation? If so, that adds a layer of messy complexity, which is fairly typical of living things.


Both. Because neuroscience is hilariously complex and a big "fuck you" to neuroscientists.

Computation happens at many levels - among dendrites, among axons (some neurons have multiple axons as well!), axonal branches, within the neuron, and also among various networks that arise (behavioral and structural networks). It's all a bit much, if you ask me.


I kinda wonder if neuro-atypical human neurons behave about the same or differently -- for example, autism, ADHD, etc.


How bad is this news for studies of human neurodegenerative diseases in animal models?


Not really. Much of the research doesn't focus on the electrophysiology of single neurons, but rather how networks are impaired and what causes the impairment.


Is it possible to strip the Facebook click ID from URLs submitted to Hacker News?

The link for the story includes the following: fbclid=IwAR2szOstJ6_hkoar2mo8NkXXMaOnfnIS5rFq5YNcOPf397n5HctnSUCGHjk#.W86CSfP5s-9.facebook


Thanks! We've added that to our list of strippy things (AMP is up next) and removed it in this case.


For real though, this is kind of gross and it'd be nice (and super easy) if hn could strip this off.


Funny how neuroscientists use terms such as "computation" and "(more) computing power" with no proper definition.


Floating point operations, and more FLOPS.

Definitely.



