
"Hardness" is a difficult quantity to define if you venture beyond "humans have been trying to build systems to do this for a while, and haven't succeeded".

Insects have succeeded in building precision systems that combine vision, smell, touch and a few other senses. I doubt finding a juicy spider and immobilising it is that much more difficult than finding a door knob and turning it, or folding a T-shirt. Yet insects accomplish it with, I suspect, far less compute than modern LLMs. So it's not "hard" in the sense of requiring huge compute resources, and certainly not a lot of power.

So it's probably not that hard in the sense that it's well within the capabilities of the hardware we have now. The issue is more that we don't have a clue how to do it.



Calling it "compute" might be part of the issue : insects aren't (even partially) digital computers.

We might or might not be able to emulate what they process on digital computers, but emulation implies a performance loss.

And this doesn't even cover inputs/outputs (some of which might be already good enough for some tasks, like the article's example of remotely operated machines).


> Calling it "compute" might be part of the issue : insects aren't (even partially) digital computers.

I have trouble with that. I date from the era when analogue computers were a thing. They didn't have a hope of keeping up with digital 40 years ago, when clock speeds were measured in kHz and a flip-flop took multiple mm². Now digital computers are literally tens of thousands of times faster and billions of times smaller.

The key weakness of analogue isn't speed, power consumption, or size; they excel in all those areas. Their problem is that the signal degrades at each step. You can only chain a few steps together before it all turns to mush. Digital can chain an unlimited number of steps, of course, and because it's unlimited it can emulate any analogue system with reasonable fidelity. We can emulate the weather a few days out, and that is one of the most insanely complex analogue systems you are likely to come across.
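A toy sketch of what I mean (illustrative Python only; the stage count and noise level are made up): if every stage adds a little noise, a chained analogue value drifts away, while snapping back to a clean discrete level after each stage keeps it intact.

    import random

    def analogue_chain(signal, stages=100, noise=0.05):
        """Pass a value through repeated noisy stages; the error compounds."""
        for _ in range(stages):
            signal += random.gauss(0, noise)   # each stage adds a little noise
        return signal

    def digital_chain(bit, stages=100, noise=0.05):
        """Same noisy stages, but re-threshold to 0/1 after each one."""
        level = float(bit)
        for _ in range(stages):
            level += random.gauss(0, noise)
            level = 1.0 if level > 0.5 else 0.0   # restoration: snap back to a clean level
        return level

    print(analogue_chain(1.0))  # drifts well away from 1.0 after 100 stages
    print(digital_chain(1))     # still exactly 1.0, barring a very unlucky stage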

Emulating analogue systems using lots of digital steps costs you size and power, of course, and in a robot we don't have unlimited amounts of either. However, right now if someone pulled off the things he is talking about while hooked up to an entire data centre we'd be ecstatic. That means we can't even solve the problem given unlimited power and space. We truly don't have a clue. (To be fair, this isn't true any more if you consider Waymo to be a working example. But it's just one system, and we haven't figured out how to generalise it yet.)

By the way, this "analogue loses fidelity" problem applies to all systems, even insects. The solution is always the same: convert it to digital, and it has to happen very early. Our brains are only about 10 neurons deep, as I understand it, and they are digital; 10 steps is far too many for analogue. It's likely the very first processing steps in all our senses, such as eyesight, are analogue, but before the information leaves the eyeball it's already been converted to digital pulses running down the optic nerve. It's the same story everywhere, and it's true for our current computer systems too. Underneath, MLC flash uses multiple voltages, QAM is an encoding of multiple bits in a sine wave, and a pixel in a camera is the output from multiple sensors. We do some very simple analogue manipulation on it, like amplification, then convert it to digital before it turns to mush.
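For what it's worth, that "digitise as early as possible" pipeline looks roughly like this (a hypothetical Python sketch; the gain, bit depth and full-scale voltage are made-up numbers): one simple analogue step, then an ADC, and everything downstream of the ADC is exact copies of an integer code.

    def amplify(v, gain=10.0):
        """Analogue front-end step: amplification (still a continuous value)."""
        return v * gain

    def adc(v, bits=8, full_scale=5.0):
        """Quantise to a digital code as early as possible; after this, copies are exact."""
        levels = 2 ** bits - 1
        clamped = max(0.0, min(v, full_scale))
        return round(clamped / full_scale * levels)

    sample = amplify(0.0123)   # raw sensor voltage, amplified but still analogue
    code = adc(sample)         # digital from here on; no further mush
    print(sample, code)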


I see your point and mostly agree, but it looks like we still need different words for the way neurons are digital compared to how transistor-based (binary) computers are digital...


Well, the magic of the transformer architecture is that if the rules exist and are computationally tractable, the system will find them in the data, and we don't have to have a clue. So: how much data do we need?



