I don't mean just vacuum tubes or even electronics at all. Mechanical analog computing is insane when you get down to it. You have special shapes that move against each other and do calculus.
We make these mechanical models as analogs of more complex physical systems. We can turn huge calculations into relatively simple machines. That we can roll two weirdly shaped gears together and get an integral out says to me something very profound about the universe. I find it to be one of the most beautiful concepts in all of the sciences.
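To make that concrete, here's a minimal numerical sketch (Python; the step size and test function are my own choices) of what the classic wheel-and-disc integrator computes: a small wheel rides on a spinning disc at radius y from the center, so its rotation rate is proportional to y, and its accumulated rotation is the integral of y:

    # Sketch of a wheel-and-disc integrator, the mechanism behind
    # Kelvin's harmonic analyzer. The wheel sits at radius y(x) from
    # the disc's center, so it turns at a rate proportional to y(x)
    # times the disc's rate; its total rotation accumulates the
    # integral of y with respect to the disc angle x.
    import math

    def wheel_and_disc(y, x_end, dx=1e-4):
        wheel = 0.0   # accumulated wheel rotation
        x = 0.0       # disc angle
        while x < x_end:
            wheel += y(x) * dx   # wheel speed ~ y(x) * disc speed
            x += dx
        return wheel

    # Rolling through sin(x) from 0 to pi should read out ~2.0.
    print(wheel_and_disc(math.sin, math.pi))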
What's even more wild is that we can take those mechanical analogs of physical systems and build an electronic analog out of vacuum tubes. That vacuum tubes work at all is just completely insane, but it's some absolutely beautiful physics.
And yes, there are equally beautiful problems that can only be solved in the digital domain, but it just doesn't speak to me in the same way. The closest thing is the bitwise black magic like fast inverse square root from a special constant and some arithmetic. Besides, that's more a property of number systems than it is of digital computation.
I understand how and why digital took over, but I can't help but feel like we lost something profound in abandoning analog.
Tide height is a function of the Earth/Sun/Moon system. The Earth and Moon aren't at a fixed distance from each other, and neither are the Earth and Sun, so every day's tide is unique, but you can predict the range.
The analog way to do it is to make a gear for each component of the system and synchronize all of them. Then you use them all to drive one final gear, which shows you the prediction for the time you've chosen.
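A toy version in Python makes the idea concrete. Each constituent below stands in for one gear train; the speeds are the standard tidal constituent speeds, but the amplitudes and phases are invented for illustration (the real machines were set from measured harmonic constants for a specific port):

    import math

    # (speed in degrees/hour, amplitude in meters, phase in degrees)
    # M2/S2/K1 speeds are the standard constituent speeds; amplitude
    # and phase values here are made up, not real harmonic constants.
    CONSTITUENTS = [
        (28.984, 1.20,  40.0),   # M2: principal lunar semidiurnal
        (30.000, 0.45, 110.0),   # S2: principal solar semidiurnal
        (15.041, 0.30, 200.0),   # K1: lunisolar diurnal
    ]

    def tide_height(t_hours, mean_level=2.0):
        # The "final gear": sum every constituent's contribution.
        h = mean_level
        for speed, amp, phase in CONSTITUENTS:
            h += amp * math.cos(math.radians(speed * t_hours - phase))
        return h

    for t in range(0, 25, 6):
        print(f"t={t:2d}h  height={tide_height(t):.2f} m")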
I used to know nothing about Lord Kelvin except that he said things like "It seems, therefore, on the whole most probable that the sun has not illuminated the earth for 100,000,000 years, and almost certain that he has not done so for 500,000,000 years"[1] and allegedly "everything which can be discovered, has been discovered". Then last year's Veritasium video on YouTube[2] about analog computers taught me that he invented tide-predicting analog computers to "substitute brass for brains" by adding sinusoidal curves, plus a mechanical integrator to separate the individual sinusoidal frequencies back out of the sum.
I know, I've had my eye on this topic for a while.
Honestly it seems like a perfect application. Neural networks are analog systems. An analog computer can represent neurons very accurately and the entire network is inherently parallel, for free!
I can't wait to see what comes out of this research
My research was in this direction. We already know that these analog neural chips could be orders of magnitude faster than digital equivalents. There has also been a lot of military research in this area for a few decades. However, architecture innovations move much faster at the software level, and dedicated hardware approaches haven't been able to catch up. Once things slow down on the software side, hardware LLMs could gradually become the norm.
I didn't make that connection until my late 20s and when I finally did, it radically changed how I look at and understand analog systems.
In today's world, we still build analogs, we just coerce them into strictly numerical, digital models. I don't know if you can call it better or worse, but digital is definitely less magical and wondrous than mechanical analog systems.
Nature, almost completely analog, has been around a thousand times longer than humans. How many times has 'evolution' used digital methods to accomplish something? Perhaps we've chosen to switch to digital because we're in a hurry and it's easier ... in hopes of some day asymptotically approaching the advantages of analog.
The main reason is that digital computers are so incredibly, overwhelmingly more flexible than analog. Analog computers are (generally) bespoke single-purpose devices. It really isn't too far off to imagine analog computers as code made physical, with all that entails.
Imagine writing a program if every time you wanted to change something you had to cut a new gear, or design a new mechanism, or build a new circuit. Imagine the sheer complexity of debugging a system if instead of inspecting memory, you have to disassemble the machine and inspect the exact rotation of hundreds of gears.
Analog computing truthfully doesn't have enough advantages to outweigh the advantage of digital: you have one truly universal machine that can perform any conceivable computation with nothing but pure information as input. Your application is a bunch of binary information instead of a delicate machine weighing tens to hundreds of pounds.
Analog computing is just too impractical for too little benefit. The extra precision and speed is almost never enough to be worth the exorbitant cost and complexity.
Yeah, but ChatGPT is using $700k per day in compute (or was as of April). Someone's going to make an analog machine that uses mirrors and light interference or something to do self-attention and become very, very wealthy.
Neural networks are a very good application for analog computing (imo). You have a ton of floating point operations that all need to happen more or less simultaneously. And what are floating point numbers if not digital approximations of analog values? :)
This can be implemented as a network of transistors on a chip, but driven in the linear region instead of trying to switch them on as hard as possible as fast as possible. Which is, I believe, what researchers are trying to do.
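As a sketch of why that's attractive: store each weight as a conductance, apply each input as a voltage, and Ohm's law plus Kirchhoff's current law perform the whole matrix-vector product at once. Here's the idealized math in Python (all numbers are illustrative, and the 1% mismatch figure is just an assumption):

    import numpy as np

    # Analog crossbar multiply-accumulate: each weight is stored as a
    # conductance G, each input applied as a voltage V. Ohm's law gives
    # I = G*V at every crosspoint, and Kirchhoff's current law sums the
    # currents on each output line, so the whole matrix-vector product
    # happens in parallel, in the physics.
    weights = np.array([[0.5, -1.0,  0.3],
                        [1.2,  0.4, -0.7]])   # -> conductances
    inputs  = np.array([0.8, -0.2,  1.0])     # -> voltages

    ideal = weights @ inputs                  # output currents

    # Real devices are imperfect; model, say, 1% conductance mismatch.
    noisy_weights = weights * (1 + 0.01 * np.random.randn(*weights.shape))
    print(ideal, noisy_weights @ inputs)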
There are also some interesting ideas about photonic computing, but I'm not sure if that's going anywhere.
A few months back, someone on YouTube attempted to design a mechanical neural network as a single 3D printed mechanism. It ended up not working, but the concept was pretty solid.
Perhaps that's only because we haven't begun to understand analog yet. And our crude original perceptions have long suffered for being ignored. For example, I have yet to actually hear any digital music ... that didn't have to pass through a D-to-A converter. Hell, we may even learn that braining is not really the product of individual neurons at all, but a coordinated ballet oscillating like seaweed. I'll go bigger: is consciousness analog?
I'm not sure you understand what we're talking about. You seem to be talking about analog electronics, where I'm talking about computation with mechanical or electrical analogs of physical systems.
Both domains are extremely well understood. Analog electronics is an incredibly deep field, and forms the foundations of basically all of our electronic infrastructure. For instance, the transceivers that power cell stations are all analog and are incredibly complex. This stuff would seem like alien magic to anyone from even 30 years ago. The sheer magnitude of complexity in modern analog circuits cannot be overstated.
As for analog computing, well, it's just math. We can design analog computers as complex as our understanding of the physics of the system we want to model. There's not really any magic here. If we can express a physical system in equations, we can "simply" build a machine that computes that equation.
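The textbook example: to "compute" simple harmonic motion, y'' = -y, you wire two integrators and an inverter in a loop and let the voltages evolve. This Python loop is a crude discrete stand-in for that patch panel (step size and initial conditions are my own choices):

    import math

    # Analog computers solve y'' = -y with two integrators in a loop:
    # y'' feeds an integrator producing y', which feeds a second
    # integrator producing y, which is inverted back into y''.
    dt = 1e-3
    y, dy = 1.0, 0.0                     # "initial capacitor charges"
    for _ in range(int(2 * math.pi / dt)):
        ddy = -y                         # the inverter: y'' = -y
        dy += ddy * dt                   # first integrator
        y  += dy * dt                    # second integrator

    print(y)   # after one full period, back near the initial 1.0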
> I have yet to actually hear any digital music ... that didn't have to pass through a D-to-A converter.
This is simply not true. There are plenty of ways to turn a digital signal into sound without an intermediate analog stage. See PC speakers, piezo buzzers, the floppotron. You can also just pump a square wave directly into a speaker and get different tones by modulating the pulse width.
The reason we use an intermediate analog stage for audio is that direct digital drive sounds like total trash. I won't go too far into it, but the physics means you can't faithfully reproduce a wide range of frequencies this way, and you always get high-frequency harmonics that sound like static.
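You can see that static directly: an ideal square wave is the fundamental plus an endless ladder of odd harmonics whose amplitudes fall off only as 1/n. A quick FFT sketch in Python (sample rate and tone frequency are arbitrary choices):

    import numpy as np

    # A square wave at f contains odd harmonics f, 3f, 5f, ... with
    # amplitudes 4/(n*pi); that slowly-decaying high-frequency energy
    # is the harsh "static" of driving a speaker digitally.
    fs, f = 48000, 440                    # sample rate, tone frequency
    t = np.arange(fs) / fs                # one second of samples
    square = np.sign(np.sin(2 * np.pi * f * t))

    spectrum = np.abs(np.fft.rfft(square)) / (fs / 2)
    freqs = np.fft.rfftfreq(fs, 1 / fs)
    for n in (1, 3, 5, 7):
        k = np.argmin(np.abs(freqs - n * f))
        print(f"{n * f:5d} Hz  amplitude {spectrum[k]:.3f}")
    # Prints roughly 1.27, 0.42, 0.25, 0.18 -- the 4/(n*pi) series.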
Edit: I didn't notice your username before. All 8-bit systems make heavy use of the square wave voice, which is a digital signal, but it's typically passed through an analog filter to make it sound less bad. Music on, e.g., the first IBM PCs was purely digital, played through a piezo beeper on the motherboard.
No, that's not really the problem. You can implement branching of a sort in analog, but branching isn't a very useful concept here.
The strength of digital is that your logic is implemented as information instead of physical pieces. Your CPU contains all the hardware to perform any operation, and your code is what directs the flow of information. When you get down to bare basics, the CPU is a pretty simple machine without much more complexity than a clockwork mechanism. It's an extremely fascinating subject and I very highly recommend Ben Eater's breadboard CPU videos on YouTube. But I digress.
The real trick is that digital computers are general purpose. They can compute any problem that is computable, with no physical changes. It's purely information that drives the system. An analog computer is a single-purpose device[0] designed to compute a very specific set of equations which directly model a physical system. Any changes to that set of equations requires physical changes to the machine.
[0] General purpose analog computers do exist, but generally they're actually performing digital logic. There have only been a few general purpose true-analog computers ever designed, AFAIK. See Babbage's difference engine, which is mechanical but performs digital, not analog, arithmetic.
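To make "logic as information" concrete, here's a toy machine in Python: the loop below never changes, yet it computes whatever function the program data describes. Swapping applications on an analog computer means new gears; here it means new bytes:

    # A fixed "machine" whose behavior is driven entirely by data.
    def run(program):
        stack = []
        for op, arg in program:
            if op == "push":
                stack.append(arg)
            elif op == "add":
                stack.append(stack.pop() + stack.pop())
            elif op == "mul":
                stack.append(stack.pop() * stack.pop())
        return stack.pop()

    # Two different "applications", zero physical changes:
    print(run([("push", 2), ("push", 3), ("add", None)]))   # 5
    print(run([("push", 2), ("push", 3), ("mul", None),
               ("push", 4), ("add", None)]))                # 10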
DNA is digital. I think the crucial digital feature is the ability to get an exact result from imperfect components, which is especially important for self-replicating systems. Instead of a calculation that is always off by 1%, you can have a perfect result 99% of the time. And you can improve MTBF by stacking error correction on top, without necessarily having to improve manufacturing tolerances.
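The simplest version of that stacking trick is a repetition code: store each bit three times and majority-vote. A per-copy error rate p becomes roughly 3p^2, so 1% noise turns into about 0.03%. A quick Python sketch (error rate and trial count are arbitrary):

    import random

    def noisy(bit, p=0.01):
        # An imperfect component: flips the bit with probability p.
        return bit ^ (random.random() < p)

    def voted(bit, p=0.01):
        # Triplicate and majority-vote: wrong only if 2+ copies flip,
        # which happens with probability ~3p^2.
        return 1 if sum(noisy(bit, p) for _ in range(3)) >= 2 else 0

    trials = 200_000
    raw = sum(noisy(1) != 1 for _ in range(trials)) / trials
    ecc = sum(voted(1) != 1 for _ in range(trials)) / trials
    print(f"raw error:   {raw:.4%}")   # ~1%
    print(f"voted error: {ecc:.4%}")   # ~0.03%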
DNA copying always introduces errors. Organisms have quite a few error correcting mechanisms to mitigate damage from bad copies.
Most DNA errors turn out to be inconsequential to the individual. If a cell suffers catastrophic errors during reproduction, it typically just dies. Same for embryos: they fail to develop and get reabsorbed. Errors during normal RNA transcription tend to encode an impossible or useless protein that usually does nothing. Malformed RNA can also get permanently stuck in the ribosome meant to decode it, but this also has no real effect; that ribosome floats around uselessly until it's broken down and replaced, and you've got a nearly inexhaustible supply of them.
DNA and all the machinery around it is surprisingly messy and imprecise. But it all keeps working anyway because organisms have billions or trillions of redundant copies of their DNA.
*take with a grain of salt, I last studied this stuff many years ago.
You may conceptualize it as digital, as most of our modern mythology does with appearances these days. But does that really correspond with the ding-an-sich? Or, again, how much analog development happened before our rush to commoditize everything as quickly as possible?
'Imperfect components' is a value judgement. Apparently an analog world was a necessary part of self-replicating 'mechanisms' arising while floating in the analog seas.
I agree. I remember climbing into the turret of the USS Massachusetts and playing with the ranging computer. It was just impressive that a geared device could do pretty complicated math in real time.
Electronic analog computing is also still being researched, e.g. https://arxiv.org/abs/2006.13177 ("Analog multiplication is carried out in the synapse circuits, while the results are accumulated on the neurons' membrane capacitors. Designed as an analog, in-memory computing device, it promises high energy efficiency").
As for something you can easily get your hands on, micrometers are incredible. A simple screw and graduated markings on the shaft and nut give you incredibly precise measurements. You can also find mechanical calculators (adding machines) on eBay. But those really aren't very sexy examples of the concepts.
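For a sense of the numbers: if I remember the standard design right, a typical metric micrometer has a 0.5 mm thread pitch and a thimble divided into 50 graduations, so each mark is 0.5 mm / 50 = 0.01 mm of spindle travel, all from one well-cut screw.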
Analog computers aren't very common anymore. Your best bet is visiting one of the computer museums that house antique machines. Or watching YouTube videos of people showing them off. There's plenty of mechanical flight computers in particular on YouTube.
If you have access to a 3D printer, there are plenty of mechanisms you can print. The Antikythera mechanism is a very interesting celestial computer from ancient times, and 3D models of it exist online.
Look around on YouTube. There's some fascinating videos from the 1950s on US Navy mechanical firing computers.
These machines can calculate ballistic trajectories with incredible accuracy, accounting for the relative motion of the ships, wind speed, and even the curvature of the earth. Those calculations are not at all trivial!
Hah, this is the exact video I was referring to in the sibling comment. This is what really captured my imagination with regard to mechanical computers.