the analog art

The ‘Operateur Mathematique Electronique 12’, analog computer built by the Societe d’Electronique et d’Automatisme in 1949.

analog computing

Analog computers have intrigued me ever since I saw one for the first time and had no clue how to operate it, despite the fact that it looked very familiar with its patchpanel and knobs, superficially not very different from an analog modular synthesizer for sound. The interest triggered by that experience remained dormant for about twenty years, but in the last seven or eight months it has finally escalated into lots of reading and soldering. As of this post, that reading and soldering is officially in preparation for a project that until further notice will be called #59, and that will most likely at least result in something like a film, and perhaps also in an installation, performance or another form of presentation if that turns out to be more appropriate. In this post I would like to give a short overview of analog computing and my interest in it, which means that I will be skimming over many topics that deserve more attention.

One of the reasons I’m interested in analog computers is still related to that first experience: the fact that they are different from what I know (which is clearly a motivation that will diminish the longer I work on this…). That ‘otherness’ is of course the result of analog computers being completely forgotten nowadays, which is a strange fact in itself. Analog computers were rather common and important between 1950 and 1975; they made supersonic airplanes, missiles and nuclear reactors possible, calculated Dutch flood defenses and helped put man on the moon. But the rare mentions of analog computers in most books on the history of computing (and for instance in the Wikipedia entry for “Computer”) mostly refer to the mechanical devices that were invented before the Second World War and that did none of these things. Interesting machines for sure, but rather marginal compared to those that existed alongside the early digital computers. The electronic analog computers are generally left out of such historical overviews, I suppose because they do not fit into a linear account leading from tally sticks to cloud computing. It is strange that the inherently multi-tasking nature of history seems to be so hard to deal with, and even stranger that it is still acceptable in the history of technology to reduce it to an unbroken line of precursors leading to the apparently unavoidable present. One straightforward reason to talk about electronic analog computers is simply that they were around for much longer than the mechanical ones that made it into the history books: it took until 1975 or 1980 before digital computers could do what analog computers were mostly used for; only then did they finally disappear, and were the cultures and practices associated with them absorbed into the culture around digital simulation.
The following quote comes from James Small’s book (see sources below), where he writes about ‘Project Cyclone’, one of the large US-military-funded analog computer projects. It gives an idea of how different the relationship between analog and digital computing was around 1955-56: “In keeping with project goals to improve and to verify the accuracy of results, problems were routinely re-run on digital computers. In a typical application – the simulation of a guided missile in three dimensions – the average run time for a single solution on the electronic analogue computer was approximately one minute. The check solution for this problem by numerical methods on an IBM CPC (Card Programmed Calculator) took 75 hours to run.”

Overview of ‘Project Typhoon’ analog computer, one of the large US military projects around 1950 (note the three-dimensional rocket model in the foreground).

The term ‘Analog Computer’ covers a bewildering variety of devices that seem to have little in common and that can be classified in many different ways. There are the mechanical ones, ranging from calculating aids like slide rules, nomographs, calculating discs and planimeters to much more complicated devices such as tide predictors, harmonic synthesizers and gun directors. The most sophisticated of these mechanical machines were the very precise and versatile differential analyzers built at MIT by Vannevar Bush in the 1930’s (some later European versions of which were actually built out of Meccano parts). These were the first ‘general purpose analog computers’ ever built. Then there are the electronic analog computers I will mostly be writing about in this post, but also the earlier electric ‘network analyzers’, and a whole range of electric set-ups involving conductive fluid, paper or foam. Apart from the mechanical and electr(on)ic computers, there is still a crazy fauna of intriguing, exotic devices involving compressed air, heat, polarized light, elastic rubber sheets, film projectors or soap bubbles that have also been called analog computers at some point or other. And more recent proposals for biological or quantum analog computers can now be added to the list too.
With such variety, the word ‘analog’ had already lost its original meaning somewhere in the 1950’s and came to be used for machines in which quantities are represented by some continuously variable physical property. The more common and dominant digital computing became, the more analog came to be defined in terms of its being non-discrete. Something of a richer use of ‘analog’ has been preserved in what I think is the most relevant distinction in analog computers: the distinction between ‘direct analog’ computers and ‘indirect analog’ or ‘functional analog’ computers. Both categories have aspects I am looking to incorporate into my project.

past, present and future of analog computing according to George Philbrick Researches.

direct analog

‘Direct analog’ electr(on)ic computers are devices that are so analog that to 21st-century ears they are perhaps better called ‘electr(on)ic models’ than computers. A definition of ‘direct analog’ could be that one or more physical properties of the computing system are an analog of the same number of properties of the system being computed. For instance, the equations describing the relations between current, inductance, capacitance and resistance in an electrical circuit can be mapped onto the equations describing the relations between applied force, mass, elasticity and friction in a mechanical system. This means that you can make an electrical model of a mechanical system using coils, capacitors and resistors, and measure voltages and currents in the electrical model to make predictions about the mechanical system. In this particular example it would be much easier to change a component in an electrical circuit than to change a complex parameter such as elasticity in a mechanical setup. And this is just one of many equivalences between groups of physical properties that can be exploited, see the table below (do click on the pictures to see more detail!).
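To make that particular example concrete, this is the classic ‘force-voltage’ correspondence between a series RLC circuit and a damped mass-spring system (a textbook illustration, not taken from the chart below):

```latex
% series RLC circuit driven by a voltage V(t), with q the charge on the capacitor:
L\,\ddot{q} + R\,\dot{q} + \tfrac{1}{C}\,q = V(t)
% damped mass-spring system driven by a force F(t), with x the displacement:
m\,\ddot{x} + c\,\dot{x} + k\,x = F(t)
% the two equations are identical under the mapping:
L \leftrightarrow m, \quad R \leftrightarrow c, \quad \tfrac{1}{C} \leftrightarrow k, \quad q \leftrightarrow x, \quad \dot{q} = i \;\leftrightarrow\; \dot{x} = v
```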

overview of analogous parameters in various physical systems (from McMaster, Merrill, List, “Analogous Systems in Engineering Design”, Product Engineering, Jan. 1953)

An interesting set of analogies mentioned in that chart is the one between the curves of potential around electrodes in a resistive material and fields of various kinds. Such fields can represent the flow of air around objects or the propagation of water through porous rock or soil, for instance. The electrodes used in these models are usually metal and have special shapes that relate to the system being modelled. The resistive material was often an electrolytic tank: a reservoir filled with water or some kind of solution, which meant that the models could also be three-dimensional. Another popular setup, strictly two-dimensional, used a special kind of conductive paper called “Teledeltos” paper (originally designed as a kind of graphing paper for special recorders), with metal strips or conductive paint as the electrodes. In both of these techniques, a measuring probe is moved around in the tank or over the paper to measure the values in the modelled field and plot the curves.
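For comparison, the digital way of doing what the tank or the paper does physically is to solve Laplace’s equation numerically. A minimal sketch (grid size and electrode positions are made up for illustration):

```python
import numpy as np

# potential on a square 'sheet', a stand-in for a piece of conductive paper
N = 60
V = np.zeros((N, N))
fixed = np.zeros((N, N), dtype=bool)

# two hypothetical electrodes: one strip held at +1 V, one at -1 V
V[15, 10:50] = 1.0;  fixed[15, 10:50] = True
V[45, 10:50] = -1.0; fixed[45, 10:50] = True

# relaxation: each interior point becomes the average of its four neighbours,
# which is the same equilibrium a resistive sheet settles into physically
for _ in range(5000):
    Vnew = V.copy()
    Vnew[1:-1, 1:-1] = 0.25 * (V[:-2, 1:-1] + V[2:, 1:-1] +
                               V[1:-1, :-2] + V[1:-1, 2:])
    V = np.where(fixed, V, Vnew)

# 'probe' readings along a line crossing both electrodes
print(np.round(V[10:51:5, 30], 2))
```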

field computing setup using conductive paper at Rijkswaterstaat (Dutch Water Authority), The Hague, early 1950’s.

Such field problems, and more complicated – non-linear – ones, could also be modelled on ‘Network Analyzers’ (not to be confused with more recent devices that carry the same name). These often huge machines were first built by US companies such as General Electric and Westinghouse starting in the twenties and thirties, initially in order to study the behaviour of power grids consisting of many interconnected elements. They started as a kind of toolkit for building scale models of those grids: basically a big patchboard connecting different resistors in different ways, and devices to measure and plot the results. Later and more complicated versions of these setups were still essentially big patchboards connecting passive elements, but apart from resistors these now also included capacitors and inductors (coils), and often signal generators were added to provide inputs to the network. This made them suitable for many more applications; they could be used for building any type of direct electrical analog, especially those for which many elements were necessary. These devices continued to be built until the mid-1950’s; after that they were largely overtaken by functional analog computers, even though some of them remained in use well into the 1970’s.

Harold Hazen’s network analyzer at MIT, around 1930.

There are a number of things I find interesting about the direct analogs (apart from the great use of ‘analog’ as a noun). Of the two kinds of analog computers, the direct analogs include the most variation in techniques and approaches and are in that sense close to the more exotic non-electrical analogs. I find that variety fascinating, and it made me realize how incredibly standardized our desktop and mobile computers are in comparison. On a more philosophical level I like the idea of using the equivalences between processes happening in different materials; it is that kind of analogy that I think has the potential to provoke new insights, for instance because it helps one look at known things from an uncommon perspective, or because an analogy breaks down unexpectedly. A third aspect I find interesting is that these processes are called computation but are at the same time so rooted in the properties of the materials being used; descriptions of field models using electrolytic tanks combine mathematical equations full of integrals with nitty-gritty details about the shape and composition of the electrodes, or the best amount of soap to add to keep the measuring probe from disturbing the surface. These three things combined make me think that a still very valuable aspect of these direct analogs is the formulating and materializing of the analogies; I can imagine there are many cases where this process is more interesting than the numerical results the analogs produce.

direct analog simulation of airflow.

A big question I had about direct analogs is why these setups are called computers at all. Part of an answer I can imagine is that they were called computers because the early digital devices that were starting to be used for similar purposes were also called computers, and adopting the terminology was necessary to even have a discussion about the relative merits of different methods. So I suspect that some of the older techniques would have been called ‘modelling’ before they were called ‘computing’, and might now be called ‘modelling’ again if they are still around.
That is one possible answer, but a more interesting answer involves thinking about what computation actually is. That is surely the subject of many books and discussions I have not yet read, but investigating these models-called-computers is certainly thought-provoking in this respect. Any computing can perhaps be defined as a measurement that is done after performing a mechanical procedure on some initial conditions. In this sentence, ‘mechanical’ means a procedure that does not need understanding and that gives predictable results, and for the computation to be of use, the ‘mechanics’ should perform a desirable mapping from input to output. What I like about this view of computation is that it involves a measurement, which involves an observer with an agenda. This view of computation also works for digital computing, except that there the measurement has been made rather trivial and the procedures in the ‘model’ have been reduced to agglomerations of the smallest imaginable trivia too. But they are still there.

functional analog

The second group of analog computers are called ‘indirect analog’ or ‘functional analog’ computers. In these, physical properties of the computing system stand for values in a mathematical equation, and the modules of the computer perform mathematical operations on those values. So where in a direct analog computer all voltages in the system would represent the same property, for example displacement, in a functional analog computer there could be one voltage representing displacement and other voltages representing speed or acceleration. Functional analog computers mainly consist of adders, inverters and integrators, and the values are in most cases represented by DC voltages. It makes sense that these machines have much less of the variety that direct analogs have; the variety that is there mostly concerns the internal workings of the modules, not their functions, since these correspond to a small number of mathematical operators.
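A toy illustration of how such a machine is ‘programmed’ (my own sketch, not taken from any manual): to solve m·x'' + c·x' + k·x = 0 you rewrite it as x'' = -(c·x' + k·x)/m and patch a summer feeding two integrators in series, with the integrator outputs fed back into the summer through coefficient potentiometers. Emulated numerically:

```python
# numerical emulation of a functional analog patch for  m*x'' + c*x' + k*x = 0,
# rewritten as  x'' = -(c*x' + k*x) / m   (coefficient values are arbitrary examples)
m, c, k = 1.0, 0.4, 9.0
dt = 0.001                     # 'machine time' step of the emulation

x, v = 1.0, 0.0                # initial conditions, set on the two integrators
for step in range(10001):
    a = -(c * v + k * x) / m   # the summer, with coefficient pots for c, k and 1/m
    v += a * dt                # first integrator: acceleration -> velocity
    x += v * dt                # second integrator: velocity -> displacement
    if step % 1000 == 0:
        print(f"t = {step * dt:4.1f}   x = {x:+.3f}")   # a damped oscillation
```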

Helmut Hoelzer’s functional analog computer from 1941.

Two important early devices in this history are George Philbrick’s “Polyphemus” and Helmut Hoelzer’s “Mischgeraet”. Helmut Hoelzer started working for the German army in 1939 and came up with the idea for a fully electronic guidance system inside the V2 rocket. This was a circuit using capacitors and resistors to do integration and differentiation, for which he developed several novel techniques. The outcomes of these calculations were ‘mixed together’ in an addition circuit, hence the name ‘Mischgeraet’. Hoelzer also realized that the same approach could be used to make a reconfigurable device for more general calculations, and built one around 1941. This is generally considered to be the first electronic functional analog computer. It consisted of electronic integrators, differentiators and adders, together with a mechanical function generator, and it was used to calculate trajectories. Using the same elements, Hoelzer also made a simulator for the whole V2 rocket system. Immediately after the war, Helmut Hoelzer was brought to the US as part of ‘Operation Paperclip’, together with other rocket scientists like Wernher von Braun, taking one of his analog computers with him. It is said that this computer was still in use until the late 1950’s, and that it served as the basis for another analog computer built at the Jet Propulsion Laboratory around 1950. Even though Hoelzer’s devices were the first, their influence on later analog computers seems to have been rather marginal. In Germany, research and development of analog computers was forbidden for ten years after the war. For that reason, and given the fast technological development in that time, I would suppose that the first analog computers by Telefunken in West Germany, and even the first analog computers in East Germany, were based more on American and British examples than on Hoelzer’s inventions. Hoelzer had a long career at NASA and kept working on missile guidance systems and computing, but as far as I know he was not directly involved with the development of analog computing in the US; the important US developments were happening elsewhere.

"Polyphemus" by Georges Philbrick with two different front plates.
“Polyphemus”, made by George Philbrick in 1938, shown with two different front plates.

Even though “Polyphemus” was a much simpler device than Hoelzer’s first computer, it is much more connected to the later history of analog computing, and for that reason I would consider it historically much more important. It was built in 1938 by George Philbrick, a visionary engineer, businessman and great writer with a flair for metaphors taken from the classics. “Polyphemus” was a simulator, and one could say that it was halfway between a direct analog and a functional analog computer. It was an electronic model with a fixed structure and a number of controls, and it used an oscilloscope as its only form of output. By changing the legends on the front of the device, the model could be applied to different systems, for instance to a system consisting of water conduits, pumps, tanks and a regulator, or to a thermal system with equivalent components. It was mostly used as a training and demonstration device and remained functional until the 1970’s. As far as I know it was the first device to work according to what was later called ‘repetitive operation’ or ‘rep-op’. This means that the computer or simulator performs its calculations 10 times per second or more, so that the calculated curves can be shown on an oscilloscope. In contrast, the early network analyzers and many of the later analog computers work in ‘normal’ or ‘slow’ mode and use xy-recorders (that plot curves on paper) or voltmeters for output. In ‘rep-op’, when the user turns one of the controls, the changing behaviour of the simulated system can be experienced in real time. This focus on interactive exploration remained central to Philbrick’s later work in analog computing.

system consisting of Philbrick units, 1957.

The main building block of an indirect analog computer is the operational amplifier or op amp. This is still one of the most common components in analog electronics, and it derives its name from the fact that it was invented to perform mathematical operations, something I did not realize until I started reading about the history of analog computing. The groundwork for the invention of the op amp was done by engineers working on long-distance phone lines, with the invention of the feedback amplifier by Harold Black in 1927 and its theoretical development by Nyquist and Bode in the thirties. But it was not until rather late in the Second World War that it became clear that problems in telecommunications and in the positioning of anti-aircraft guns overlapped within more general theories of feedback and stability, and that these had much wider applications. A large group of people, including Norbert Wiener, George Philbrick and Harold Hazen, was working on this topic by the end of the war, and early versions of these ideas found their way into gun directors and especially into the use of radar to point anti-aircraft guns. One of the first publications of this wartime research was an article by Ragazzini, Randall and Russell from 1947, under the title “Analysis of Problems in Dynamics by Electronic Circuits”. In it they describe how they realized that a standard, stabilized feedback amplifier could also be used for computation, since it can perform addition, integration and differentiation depending on how it is connected. After this realization, they started to use these amplifiers to model missiles and their guidance systems, and a student of Ragazzini, Loebe Julie, developed a significantly improved amplifier that is the direct precursor of the op amps that still exist. Their work at Columbia University was the basis of the American work on electronic analog general purpose computers, and it was the beginning of Project Cyclone, one of the big military projects to develop computation and simulation that were started immediately after the war. Because most of this was classified at the time, this part of the history is actually still rather obscure today.
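The standard textbook relations give an idea of how the same amplifier becomes a summer or an integrator depending on the feedback element, a resistor or a capacitor:

```latex
% inverting summer (feedback resistor R_f, input resistors R_1 and R_2):
V_{out} = -\left( \frac{R_f}{R_1} V_1 + \frac{R_f}{R_2} V_2 \right)
% inverting integrator (input resistor R, feedback capacitor C):
V_{out}(t) = -\frac{1}{RC} \int_0^t V_{in}(\tau)\, d\tau \;+\; V_{out}(0)
```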

The manufacturing of EAI analog computers at the end of the 1950’s.

Until the beginning of the fifties, most analog computing installations were custom-built, from the large-scale military projects in the US and UK to the much smaller installations built in many universities around the world. They were used for many different applications, but economically the biggest driver of the development of this technology was the military research into supersonic flight. Mostly as a side-effect of this research, analog computing technology also became commercially available around 1950. Early manufacturers were the companies that were building the analog computers for the large military projects, such as Reeves (REAC, Reeves Electronic Analog Computer), Boeing (BEAC, Boeing Electronic Analog Computer) and the Goodyear Aircraft Corporation (GEDA, Goodyear Electronic Differential Analyzer). After the mid-fifties this market was gradually taken over by electronics companies such as Systron Donner, Beckman, Applied Dynamics, Telefunken and especially Electronic Associates Inc. (EAI), which was for a long time the largest company worldwide building and developing analog computers.
These commercial machines differentiated into large, expensive installations for research institutes on the one hand, and tabletop models for education and smaller companies on the other. Large installations would consist of 50 to 500 adder/integrator modules and were used in government research institutes, universities and large companies. From the end of the fifties until the beginning of the seventies, these machines gradually evolved into hybrid computer systems, in which a digital computer mostly took care of the patching of the various analog computing elements, the setting of initial conditions, and the storage and analysis of results. For some time, this kind of hybrid computer was seen as the future of computing, combining the best of both worlds (the history of the analog-digital debate in computing is an interesting but much larger topic I am not writing about here).
The much smaller tabletop models would consist of about five to ten adder/integrator modules and perhaps a few multipliers. They were made for smaller companies, but especially for education. The last analog computers to be commercially available were built by Comdyna; their GP-6 was developed and first sold in 1968, and production continued until 2004 (I own one from the seventies…).
I think it is correct to say that the main developments in the history of analog computers took place in the US, largely because of US military spending. But analog computers were built and used everywhere, with large and competitive companies developing them in Germany, the UK and Japan as well, and significant activity in the Soviet Union, France, Poland, Czechoslovakia, Norway and Brazil (and probably a lot more countries I have not yet read about).

EAI analog computer installation at the Deutsches Zentrum für Luft- und Raumfahrt.
overview of the NASA General Purpose Simulator, used during the development of the Saturn V rocket, early 1960’s.
more detailed view of NASA’s General Purpose Simulator.

What I find interesting about these functional analog computers is that they are really computers that can be programmed, but based on a set of operators very different from the ones digital computing is built on. The programming happens through patch cables, and the resulting system remains inherently parallel and fast; the limits of speed and precision are set by the bandwidth and noise levels of the amplifiers used. I also find it interesting that these are computers without data storage and without memory, which means that the emphasis lies on the modelling of equations and system behaviour. Because of this, ideas about ‘information processing’ are mostly absent, which I find a refreshing relativization of many current and crude metaphysical ideas inspired by the concept of ‘information’ as some kind of absolute quality.

preparing analog computing programmes on removable patchpanels

more recent analog computing

Even though the kinds of analog computers discussed above had almost completely vanished by the 1980’s, other practices around analog computing have continued to develop and emerge. I would like to point out just a few examples I happen to know about, but I am sure there is much more and I want to do more research into this.
In the second half of the eighties, a lot of work was done around analog neural networks, involving custom hardware. Such experiments go back to at least 1960 (see for instance the fascinating work of Bernard Widrow), but the field was given a huge push by Carver Mead when it became possible to make very large circuits on a chip. His best-known projects from that time were a ‘silicon retina’ and an ‘electronic cochlea’, both using rather large networks of analog computing elements on a single VLSI chip. Such work on ‘neuromorphic’ hardware has grown a lot since, and now many institutes and companies are working on similar devices, desirable because of their speed and robustness. More playful spin-offs of this kind of work are the analog circuits that Mark Tilden uses to control his robots and the analog networks that have been built by several artists, of which the Analog Artificial Neural Network pieces by artist Phillip Stearns are a good example.

Another very intriguing line of work is almost purely theoretical and goes back to a paper written by Claude Shannon in 1941 under the title ‘The Mathematical Theory of the Differential Analyzer’, discussing what can perhaps best be explained as a kind of analog equivalent of a Turing machine. In that paper he proves that a very large class of equations can be solved by a General Purpose Analog Computer (GPAC) consisting of adders and integrators, the elements that make up both the mechanical differential analyzers he was writing about and the functional analog computers discussed above. Shannon’s proofs were refined and completed by Marian Pour-El in 1974, and the limitations of the GPAC were further theorized by Lee Rubel at the end of the 1980’s, when he proposed the Extended Analog Computer (EAC) that would not have these limitations. Like the GPAC, this is a ‘conceptual computer’ meant for thinking about the computability of real-valued functions, and he proved that the EAC would be able to compute any differentially algebraic function of any finite number of real variables.
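A concrete example of what ‘differentially algebraic’ means here (my own illustration): the sine function satisfies an algebraic differential equation, so a GPAC generates it with two integrators in a feedback loop and a sign inversion; a function like Euler’s Gamma function satisfies no such equation and falls outside what the GPAC can produce, which is part of what Rubel’s EAC was meant to remedy.

```latex
% sine as the canonical GPAC example: two integrators in a feedback loop
y'' = -y, \qquad y(0) = 0, \quad y'(0) = 1 \quad\Longrightarrow\quad y(t) = \sin t
```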
A related question is whether analog computing is (or can be) in any sense more powerful than discrete computing. Hava Siegelmann has proven that a certain class of finite, real-valued, recurrent neural networks is more powerful than a Turing machine, which points us to the wonderful world of super-Turing powers and hypercomputation. I have not read any of her work yet, nor any of the work by others researching similar topics, but it is high on the list.

top and bottom of Jonathan Mills’ second prototype Extended Analog Computer, showing piece of conductive foam

I did read most of the papers published by Jonathan Mills, a researcher in Bloomington (and Bristol), who seems to have been mostly (or at least publicly) active until a few years ago. He built several prototypes of hardware to realize Lee Rubel’s Extended Analog Computer, which would be a formidable device, super-Turing or not. Mills uses ‘Lukasiewicz logic arrays’ to replace the standard functional analog computer units such as integrators, adders and multipliers, but the most intriguing part of his prototypes basically amounts to a direct field analog. This takes the shape of a piece of conductive foam or ‘empty’ silicon and some electrodes, prompting funny anecdotes about the mad professor building analog computers out of the foam used to wrap the digital computers of the department next door. I do hope he vanished from the scene to work on the mass production of these devices!

analog computing and interactive exploration

In their assessments of analog computing, both Mindell and Small (see sources below) point to the connection between analog computing and a hands-on culture of interactive simulation that itself has strong links to engineering. Small refers to a very interesting book by Eugene S. Ferguson (“Engineering and the Mind’s Eye”, MIT Press, Boston, 1992) that deals with the contribution of non-verbal understanding and ‘feel’ to engineering judgements. These aspects used to be an integral part of engineering education, and Ferguson fears they are being replaced by a dangerous over-reliance on abstract design theories and mathematics. His fears closely resemble some of the arguments that were made for analog computing in the debate about the relative merits of analog and digital computers; the first digital computers could not be operated interactively, so it was felt they brought a loss of ‘feel’ and ‘insight’ with them.
Two aspects of analog computers that are important for hands-on simulation are the idea of analogy that permeated the early thinking about analog computing, and the fact that analog computers can be fast enough to respond in real time. George Philbrick wrote extensively about both and made them central to the philosophy of the company he founded in 1947. Already during the Second World War, Philbrick suggested a ‘supersimulator’: “an extremely general and flexible assembly, covering every conceivable type of system, which could be adapted to any particular problem simply by the manipulation of conveniently provided organizational controls.” By means of such a device “years of experience, and of trial and error, on the development of controls and dynamic components could thus be collapsed into hours”. He called the newsletter of his company “The Lightning Empiricist” and wrote about how real-time analogs make it possible for the user to get a feel and non-verbal understanding of the behaviour of the modelled system, even without necessarily being able to describe it mathematically. In a much later text he distinguishes three types of simulators: training simulators to familiarize users or controllers with the behaviour of a system, educational simulators to demonstrate systems in order to explain them, and developmental simulators as aids in attempts to understand system behaviours.
In his book, Charles Care (see also the sources below) proposes a history of computers as ‘modelling machines’, as opposed to the current histories of computers as ‘information machines’. He proposes this as an alternative view that would give analog computers their rightful place. It is interesting to note that the disappearance of analog computers seems to coincide with a more widespread use of digital computer visualizations and simulations, suggesting a continuity of development and at least the possibility that some parts of analog computing culture were incorporated into these later practices. I have come across several examples of people or institutes that gradually switched from analog computing to digital simulation and visualization during the time that analog computing disappeared.

the Sandin Image Processor at the SAIC in Chicago (picture by Rosa Menkman).

An interesting question is how this hands-on engineering culture relates to the hands-on culture of the electronic arts, both analog and digital. This too is a much larger topic I will not address here, but in this context it brings me to a few much more concrete historical questions I am curious about, all related to the question whether the analog computer can be considered the ancestor of the later modular synthesizers for sound and video.
Some of these concrete questions relate to my first experience with an analog computer: where does the patchpanel in a modular synthesizer actually come from? Does it come from analog computers, from telephone switchboards, or perhaps from some other patching technology closer by? Did the engineers behind the first modular synthesizers look at analog computers? And are the virtual patchcords in visual programming languages such as Max/MSP/Jitter inspired by real-world patchcords or by flowcharts?
Another concrete question relates to one of the pioneers of video synthesis, Dan Sandin. In a famous short video from 1973, he explains his Image Processor and says: “… a compact way of saying what it is, is to say that it is a general purpose, … patch-programmable analog computer … optimized for processing video information”. I have always found it rather pedantic to explain this machine by referring to it as an analog computer, but now that I know more about this history, this might actually just be how it was seen at the time. In art circles, Dan Sandin is mostly known as a video art pioneer, but he devoted most of his career to scientific simulation and visualization at a time when both analog and digital technologies existed, and I would be surprised if he never saw an analog computer. Was the analog computer one of the inspirations for the Sandin Image Processor? I guess I should write him to ask!

#59

As said before, my interest in analog computing is connected to work on my project #59, and I don’t want to finish without briefly mentioning my main motivations and progress so far. I imagine that this project will at least result in a film / single-channel work of some kind, but it will be a long-running project of which I am just at the beginning, which means that everything is still pretty much wide open and nothing has been excluded yet.

picture of my analog setup last March.

Practical activity until now has mostly consisted of learning about electronics by building several utility modules to work with analog signals and turn them into a VGA signal that corresponds to HD video. About two months ago I started to build a functional analog computer (hopefully in at least two senses of ‘functional’) that can work at HD video speeds. This will consist mostly of the classical adder/integrator units, inverters, multipliers, comparators and switches.
Another line of investigation, which I have pursued a lot less so far, is to find an interesting way to explore the materiality of the analog devices I am building. For this reason I am very interested in the direct analog devices I briefly pointed out above, partly because of their ‘sculptural’ possibilities, but also because I suspect that such devices will exhibit disturbances that have the potential to be very interesting. I am interested in devices that are more open to their surroundings, inspired by legendary devices such as Gordon Pask’s chemical computers and Stafford Beer’s early experiments with biological computing.
In the past months I have learned a lot, but I still find it refreshing to think of analog ways to achieve things, or to simply explore analog computing. Perhaps more importantly, it is also new (and refreshing) for me to think of images as one-dimensional signals as opposed to two-dimensional film frames or zero-dimensional pixels. Working with images as signals poses rather strict but interesting limitations, and also triggers the need to explore different ways of thinking about composition and development over time. But in any case, for now, building or connecting modules is a welcome change from writing code. Apart from the simply enjoyable activity of soldering and building, it is becoming a different way for me to explore ideas. The resistances one encounters are different, and different types of mistakes, accidents and serendipity occur.
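A minimal sketch of what I mean by treating an image as a one-dimensional signal (resolution and frame rate are simplified examples; real video adds sync pulses and blanking intervals):

```python
import numpy as np

# a small synthetic 'frame': a bright diagonal on a dark background
h, w = 72, 128
frame = np.zeros((h, w))
for y in range(h):
    frame[y, (y * w) // h] = 1.0

# raster scanning turns the 2-D frame into a 1-D signal, line after line;
# in an analog system this is simply a voltage varying in time
signal = frame.reshape(-1)

# at 25 frames per second this many samples pass per second, which gives
# an idea of the bandwidth the analog computing modules have to handle
print("samples per frame:", signal.size)
print("samples per second at 25 fps:", signal.size * 25)

# the 'display' end reconstructs the frame by cutting the signal back into lines
reconstructed = signal.reshape(h, w)
assert np.array_equal(frame, reconstructed)
```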

recent early test for #59

some sources

In the fifties and especially the sixties, many books were published about analog computing, mostly in English and German. Since demand at this moment in time is almost completely zero, it is now relatively easy to build up a small library on this topic in these two languages; only the earlier books are hard to find, as are books in languages other than English and German.
Please note that I am still very much interested in any material about analog computing that was published before 1960 or outside the US and (West-)Germany. I am also very interested in actual analog computers you might have lying around, small or large! You can contact me here.

own modest library on analog computing.

The main primary sources for this post were:

  • Henry M. Paynter, “A Palimpsest on the Electronic Analog Art, being a collection of reprints of papers and other writings which have been in demand over the past several years”, George A. Philbrick Researches Inc., Boston, 1955.
    A great and inspiring collection of early papers that are almost impossible to find otherwise.
  • Stanley Fifer, “Analogue Computation, Volume I-IV”, McGraw-Hill, New York, 1961.
    The most extensive of the early books on analog computing, written by the former director of Project ‘Cyclone’, 1331 pages in four volumes covering all aspects of functional analog computers and some discussion of direct analog techniques.
  • Francis J. Murray, “Mathematical Machines, Volume II: Analog Devices”, Columbia University Press, New York, 1961.
    A great overview of analog computing techniques, including pointers to many very exotic direct analogs.

and the main secondary sources:

  • James S. Small, “The Analogue Alternative; The Electronic Analogue Computer in Britain and the USA, 1930-1975”, Routledge, London, 2001.
    The only solid historical overview of electronic analog computers I am aware of. Great overview of technical principles and development, economic history and the analog versus digital debate.
  • David A. Mindell, “Between Human and Machine; Feedback, Control and Computing before Cybernetics”, Johns Hopkins University Press, Baltimore, 2002.
    Amazing study of the history of gun control and the different strands of research that came together in the mechanical and electronic gun directors that were used at the end of the second world war.
  • Bernd Ulmann, “Von der Raketensteuerung zum Analogrechner; Helmut Hoelzers Peenemuender Arbeiten und ihr Erbe”, slides for a colloquium in Hamburg.
    Great and detailed source of information about the work of Helmut Hoelzer.
  • Bruce J. MacLennan, “Review of Analog Computing”, Technical Report UT-CS-07-601, University of Tennessee, Knoxville, 2007.
    (Unedited draft of entry 19, pp. 271-294 in Robert A. Meyers et al. (ed.), “Encyclopedia of Complexity and System Science”, Springer, 2009).
    Very interesting overview of recent interest in analog computing and its links with unconventional computing and hypercomputation / super-Turing ideas.

other books I recommend:

  • Granino A. Korn and Theresa M. Korn, “Electronic Analog Computers, Second Edition”, McGraw-Hill, New York, 1956.
    One of the early overviews of the field and a great introduction. It exists in several editions; there is also a later version that includes a discussion of hybrid computers.
  • Charles Care, “Technology for Modelling: Electrical Analogies, Engineering Practice, and the Development of Analogue Computing”, Springer, New York, 2010.
    An interesting book that develops several original ideas on the history of computing and discusses a number of case studies of applications of analog computing.
  • Bernd Ulmann, “Analogrechner; Wunderwerke der Technik – Grundlagen, Geschichte und Anwendung”, Oldenbourg Wissenschaftsverlag GmbH, Oldenburg, 2010.
    A book I have not (yet) seen, but based on his site (see below) I am confident it is great.

collections online:

  • Indicative of the lowly status of this history is the fact that there is no institutional museum devoted to analog computers; some computer museums have one or two, tucked away in a corner with other exotic devices, but the only real museum devoted to these machines is the personal project of Bernd Ulmann. I really have to go there and visit! As a counterpart to his physical collection of machines, Bernd Ulmann hosts an extensive site with a lot of documentation. The library section especially is pure gold.
  • Another great online museum is the one maintained by Doug Coward, with a good overview of machines made by different manufacturers.
  • Joe Sousa does a great job of collecting the work of George Philbrick and his company online in the Philbrick Archive.