Abstract
We discuss two different ways in which the term “analog” (as opposed to “digital”) is used in the methodology of computer science and of related engineering disciplines. We show that formal models of computation over the real numbers do provide an explication of the intuition that certain devices operating on continuous quantities perform computations. We call this “the analog continuous thesis” (“the AN-C thesis”) and show how it parallels other theses used to explicate computation, such as the Church-Turing thesis and the Cobham-Edmonds thesis.