Guess What? Analog Computing May Be Coming Back
See? I told you technology advances in cycles
As someone who has spent a fair amount of my career rummaging around legacy systems, I’ve learned not to dismiss old paradigms and approaches. They always come back. Marketing gurus repackage them with different terms and pretend it’s something new, but technology advances in cycles, not in a straight line. The old ways come back as market forces shift our focus between capacity and transaction speed.
I wrote a whole chapter on this in my book Kill It With Fire, and now I have another example to add to my list. Something I never seriously thought I’d find myself saying …
Analog computing may be coming back.
What Is Analog Computing?
The first thing most people think of when they hear the word “analog” is a mechanical or a manual process — cranks and gears and whatnot. But analog actually refers to continuous measurements of physical quantities. The term analog itself comes from the Greek word for ratio or proportion. Rather than running calculations by translating an intermittent signal into numbers as digital computers do, analog computers use measurements of variables like weight, speed, or pressure to calculate proportional changes in other measurements. Think about how a thermometer works. Heat pushes mercury up a channel, which produces a measurement. The more heat, the higher the measurement.
The easiest way to understand the difference between analog and digital computers is to think about an input they can both process: voltage. A digital computer interprets voltage as on/off states, and then our software treats those on/off states as binary numbers. We compute things based on these symbolic representations. Although the line between on and off isn’t exactly at 0V and may vary depending on circuit design, increasing the voltage once those design decisions are made doesn’t change the output.
In analog computers, an increase in voltage leads to a different output. An analog amplifier gets louder with higher voltage, for example.
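The distinction can be sketched in a few lines of code. This is purely illustrative (the threshold value and gain are made-up numbers, not from any real circuit): the digital reader collapses voltage into a symbol, so raising an already-“on” voltage changes nothing, while the analog amplifier’s output scales proportionally with its input.

```python
THRESHOLD_V = 1.5  # hypothetical logic threshold; real circuits vary

def digital_read(voltage):
    """Digital: anything above the threshold is a 1, anything below is a 0.
    Raising 2.0V to 3.0V changes nothing -- the symbol is still 1."""
    return 1 if voltage >= THRESHOLD_V else 0

def analog_amplifier(voltage, gain=4.0):
    """Analog: output is proportional to input. More voltage in,
    more signal out -- like an amplifier getting louder."""
    return gain * voltage

for v in (0.2, 2.0, 3.0):
    print(f"{v}V -> digital: {digital_read(v)}, analog: {analog_amplifier(v)}")
```

Running it shows both 2.0V and 3.0V digitize to the same symbol, while the analog output keeps growing with the input.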
Why Is It (Potentially) Coming Back?
In a word (…well, two words): optical computing.