Calling it "compute" might be part of the issue: insects aren't (even partially) digital computers.

We might or might not be able to emulate what they process on digital computers, but emulation implies a performance loss.

And this doesn't even cover inputs/outputs (some of which might already be good enough for some tasks, like the article's example of remotely operated machines).



> Calling it "compute" might be part of the issue: insects aren't (even partially) digital computers.

I have trouble with that. I date from the era when analogue computers were a thing. They didn't have a hope of keeping up with digital 40 years ago, when clock speeds were measured in kHz and a flip-flop took multiple mm². Now digital computers are literally tens of thousands of times faster and billions of times smaller.

The key weakness of analogue isn't speed, power consumption, or size; analogue excels in all those areas. Its problem is that the signal degrades at each step. You can only chain a few steps together before it all turns to mush. Digital can chain an unlimited number of steps, of course, and because of that it can emulate any analogue system with reasonable fidelity. We can emulate the weather a few days out, and that is one of the most insanely complex analogue systems you are likely to come across.
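
A toy sketch of that point (my own illustration, nothing from the parent): push the same values through a chain of noisy stages, once as raw continuous values and once re-thresholded back to 0/1 at every step. The noise level and step count are made up; the only thing they show is that the analogue copy drifts while the digital one stays exact.

    import random

    def analogue_chain(signal, steps, noise=0.05):
        # Each stage copies the value imperfectly; the errors compound.
        for _ in range(steps):
            signal = [s + random.gauss(0, noise) for s in signal]
        return signal

    def digital_chain(bits, steps, noise=0.05):
        # Each stage suffers the same noise, but re-thresholding restores clean 0/1 levels.
        for _ in range(steps):
            bits = [1 if (b + random.gauss(0, noise)) > 0.5 else 0 for b in bits]
        return bits

    original = [0, 1, 1, 0, 1, 0, 0, 1]
    print(analogue_chain([float(b) for b in original], steps=100))  # values have drifted
    print(digital_chain(original, steps=100))                       # still exactly the original bits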

Emulating analogue systems using lots of digital steps costs you size and power, of course, and in a robot we don't have unlimited amounts of either. However, right now, if someone pulled off the things he is talking about while hooked up to an entire data centre, we'd be ecstatic. That means we can't even solve the problem given unlimited power and space. We truly don't have a clue. (To be fair, this isn't true any more if you consider Waymo to be a working example. But it's just one system, and we haven't figured out how to generalise it yet.)

By the way, this "analogue loses fidelity" problem applies to all systems, even insects. The solution is always the same: convert it to digital. And it has to happen very early. Our brains are only 10 neurons deep, as I understand it. They are digital. 10 steps is far too many for analogue. It's likely the very first processing steps in all our senses, such as eyesight, are analogue. But before the information leaves the eyeball it has already been converted to digital pulses running down the optic nerve.

It's the same story everywhere. This is true for our current computer systems too, of course. Underneath, MLC flash uses multiple voltages, QAM is an encoding of multiple bits in a sine wave, and a pixel in a camera is the output from multiple sensors. We do some very simple analogue manipulation on it, like amplification, then convert it to digital before it turns to mush.
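
To make the QAM aside concrete, here's a minimal sketch assuming the simplest case (4-QAM/QPSK, two bits per symbol): each pair of bits picks one of four phase points, and the symbol rides on the I/Q mix of one sine-wave period. The constellation mapping and sample count are arbitrary choices for illustration, not any real modem's parameters.

    import math

    # 4-QAM (QPSK): two bits per symbol, each pair mapped to one of four phase points.
    CONSTELLATION = {
        (0, 0): (1, 1),
        (0, 1): (-1, 1),
        (1, 1): (-1, -1),
        (1, 0): (1, -1),
    }

    def modulate(bits, carrier_hz=1.0, samples_per_symbol=8):
        samples = []
        for k in range(0, len(bits), 2):
            i, q = CONSTELLATION[(bits[k], bits[k + 1])]
            for n in range(samples_per_symbol):
                t = n / samples_per_symbol
                # One carrier period per symbol: I rides the cosine, Q rides the sine.
                samples.append(i * math.cos(2 * math.pi * carrier_hz * t)
                               - q * math.sin(2 * math.pi * carrier_hz * t))
        return samples

    waveform = modulate([0, 1, 1, 0])  # four bits -> two symbols on one carrier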


I see your point and mostly agree, but it looks like we still need different words for the way neurons are digital compared to how transistor-based (binary) computers are digital...



