Last week we read the paper “Algorithm” by Andrew Goffey.

link to Goffey’s text

Goffey contributed this essay to explore different ways of defining the word; it is one of 36 essays in ‘Software Studies: A Lexicon’ [1].

He starts the paper by offering the following definition of an algorithm (A.): “an unambiguous specification of how to solve a class of problems in a well-defined formal language [2] for calculating a function, for data processing or automated reasoning tasks”. Goffey (G.) also heads his paper with the maxim “Algorithm = logic + control.” [3]
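To make that maxim concrete (a sketch of my own, not taken from Goffey’s text), the Python fragment below keeps the logic fixed, “is x present in the sorted list xs?”, while swapping the control: a linear scan versus a binary search. Both answer the same logical specification; they differ only in how the computation is steered.

```python
# Illustration of Kowalski's "Algorithm = Logic + Control":
# the logic (membership in a sorted list) stays the same,
# while two different control strategies compute the answer.

def contains_linear(xs, x):
    """Control strategy 1: scan every element in order."""
    for item in xs:
        if item == x:
            return True
    return False

def contains_binary(xs, x):
    """Control strategy 2: repeatedly halve the search interval
    (assumes xs is sorted)."""
    lo, hi = 0, len(xs) - 1
    while lo <= hi:
        mid = (lo + hi) // 2
        if xs[mid] == x:
            return True
        elif xs[mid] < x:
            lo = mid + 1
        else:
            hi = mid - 1
    return False

xs = [1, 3, 5, 7, 9, 11]
assert contains_linear(xs, 7) == contains_binary(xs, 7) == True
assert contains_linear(xs, 4) == contains_binary(xs, 4) == False
```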
Goffey points out that there is not only the scientist’s or mathematician’s view of A. as a “theoretical entity … having an autonomous existence independent … of implementation details … in (class) libraries …”, but also the software engineer’s view of A. as “having pragmatic efficiency to solve a particular problem”.
G. argues that it is not enough to define algorithms as existing in a purely mathematical, abstract sense: we cannot view A. in total isolation in the real world, for they affect and are affected by the things around them. He points out specifically how A. depend on data, and how A. and data are useless without each other. The purely ‘scientific’ or ‘software engineer’ definitions also fail to tell us much about the social, cultural, and political roles algorithms play.

G. approaches the definition from a philosophical standpoint; he describes programming languages as ‘syntactic artefacts’ that exist as machines, a series of steps performed to produce an actual outcome. He goes on to illustrate the problem of bridging the purely theoretical definition of A. and the ‘real’ physical world, referring in his paper to Foucault.
G. proposes that A. be treated as a (series of) statement(s), in the sense Foucault gives the term in ‘The Archaeology of Knowledge’ [4]. That is too dense a work to unpack in this short summary; nevertheless, thinking of an algorithm as a statement in Foucault’s sense undermines the sharp differentiation between the purely theoretical and the practical definitions. This returns us to the point that A. surely cannot exist in a void, independent of extrinsic factors.
The paper, written in 2006, also shows how rapidly the world is changing; coincidentally, Hinton, Osindero and Teh [5] published a paper in 2006 proposing a many-layered neural network that learns in an unsupervised way and is then fine-tuned with back-propagation.
This contrasts with rule-based A., which is largely grounded in formal logic: the Hinton paper heralds the onset of machine learning, giving computers the ability to refine their A. with no human intervention. Goffey does not bring this into his essay, which is a shame, as he misses possibly the greatest leap in technology since the introduction of computers themselves.
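To illustrate that contrast (a minimal sketch of my own, not drawn from either paper, and far simpler than the deep belief nets of Hinton et al.), the toy network below is never given the rule y = 2x + 1; it adjusts its own parameters by back-propagation until it has, in effect, discovered the rule for itself.

```python
# A tiny network "refining its own algorithm": no human writes the rule,
# the parameters are adjusted by back-propagation of the error gradient.
import numpy as np

rng = np.random.default_rng(0)

# Toy data: inputs x and targets y = 2x + 1, the rule the network
# must discover for itself rather than being told.
x = rng.uniform(-1, 1, size=(64, 1))
y = 2 * x + 1

# One hidden layer with a tanh non-linearity.
W1 = rng.normal(scale=0.5, size=(1, 8)); b1 = np.zeros(8)
W2 = rng.normal(scale=0.5, size=(8, 1)); b2 = np.zeros(1)
lr = 0.1

for step in range(2000):
    # Forward pass.
    h = np.tanh(x @ W1 + b1)
    pred = h @ W2 + b2
    loss = np.mean((pred - y) ** 2)

    # Backward pass: propagate the squared-error gradient.
    grad_pred = 2 * (pred - y) / len(x)
    grad_W2 = h.T @ grad_pred
    grad_b2 = grad_pred.sum(axis=0)
    grad_h = grad_pred @ W2.T
    grad_pre = grad_h * (1 - h ** 2)   # derivative of tanh
    grad_W1 = x.T @ grad_pre
    grad_b1 = grad_pre.sum(axis=0)

    # The machine re-writing its own parameters, no human intervention.
    W1 -= lr * grad_W1; b1 -= lr * grad_b1
    W2 -= lr * grad_W2; b2 -= lr * grad_b2

    if step % 500 == 0:
        print(f"step {step:4d}  mean squared error {loss:.4f}")
```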
The machine writes and re-writes its own A. with no human intervention. AI machines refining their own algorithms, with little chance of our deciphering what they have done or how, pose new questions. With AI there is only a slim possibility of understanding how the machine-created A. work: they are a black box. Does this pose a threat to humans, or might it instead save us and the planet?

[1] Fuller, Matthew (ed.). Software Studies: A Lexicon. Cambridge, Massachusetts and London, England: The MIT Press (2006).
[2] Rogers, Hartley Jr. (1987). Theory of Recursive Functions and Effective Computability. Cambridge, Massachusetts: The MIT Press, p. 2.
[3] Kowalski, Robert (1979). “Algorithm = Logic + Control”. Communications of the ACM 22 (7), July 1979.
[4]Foucault, Michel. 1969. The Archaeology of Knowledge. Trans. A. M. Sheridan Smith. London and New York: Routledge, 2002. ISBN 0-415-28753-7.
[5] Hinton, G. E.; Osindero, S.; Teh, Y. W. (2006). “A Fast Learning Algorithm for Deep Belief Nets”. Neural Computation 18 (7): 1527–1554. PMID 16764513. doi:10.1162/neco.2006.18.7.1527.