Lexicon / algorithmization

Douglas Edric Stanley

2005.07.27

The process whereby various phenomena are rendered accessible to algorithmic analysis, and eventually algorithmic manipulation.

With the invention of the Turing Machine, a new form of measurement appeared, namely that of computability. So too does algorithmization appear to us as a measurable quality, only in this case in relation to concrete programmable machines. Where the Turing Machine measured computability, programmable algorithm machines can be the measure of algorithmization or, to further bastardize the terminology, algorithmability. This measurement is not of any specific algorithm itself, but of the degree to which an event, person, place, or thing can be entered into an algorithmic process. Even processes themselves, algorithmic or not, can be algorithmized.

Often algorithmization requires a first step of either digitizing or tagging whatever is to be algorithmized. While these two possible steps are in most cases the essential prerequisites to the creation of an algorithmic machine, they are not to be confused with the process of algorithmization itself. This is a common misconception, which is further exacerbated by the fact that many exhibits and publications use a common term — the digital — to describe an infinite range of very different works of art (cf. 01.01.01, ZeroOne, Digital Arts Museum, Villette Numérique, et cætera). The digital is, for our purposes, a mostly quantitative material, not a qualitative one, and we would even go so far as to suggest that a digitized object remains ultimately inactive until entered into some sort of algorithmic machine; as such it is merely a material that simplifies the operation described by the algorithm.
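
To make this claim a little more palpable (a purely illustrative sketch, in Python, with invented data): the same digitized material, a bare run of bytes, acquires no quality of its own until some algorithmic machine takes hold of it; one procedure reads it as text, another as numeric intensities.

```python
# Illustrative sketch with hypothetical data: the digitized object is only
# a sequence of bytes, inert until an algorithm interprets it.
data = bytes([72, 101, 108, 108, 111, 33, 10, 0])

# One algorithmic machine treats the bytes as characters of text...
as_text = data.decode("ascii", errors="replace")

# ...another treats the very same bytes as grayscale intensities.
as_pixels = [b / 255 for b in data]

print(as_text)    # the material rendered as characters
print(as_pixels)  # the same material rendered as numbers between 0 and 1
```

Nothing in the bytes themselves decides between these two readings; the quality comes from the algorithm that operates on them.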

Algorithmization is therefore the process of rendering the world accessible to algorithms. This can simply mean giving computer programs access to a flat database of digital materials, otherwise known as data, and as a consequence conforming that data to the program. But it can further imply the creation of interoperability protocols, allowing one type of data to be spliced, compared, modified, or compressed alongside data of a wholly different type. Before the data has been seized by any specific algorithm, it has already entered into a process of algorithmization, making it compatible with some future algorithm. For example, algorithmization could describe the process whereby we conform an image to a data format similar to that of a piece of text, as in the PostScript protocol. The creation of a data protocol is therefore one of the fundamental forms of algorithmization, and further suggests why digitization is a minor process in the overall picture of a computer in operation. As many contemporary computer users know, having a digital image is useless unless you or the computer knows what format that image has been saved in. These formats are the conformance keys that give programs access to the data.
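
A rough sketch of this last point (illustrative only; the file path is hypothetical, though the PNG and JPEG signatures are the standard ones): a program typically decides what it can do with a digital image by checking the format signature at the head of the file, the conformance key that opens the data to its algorithms.

```python
# Illustrative sketch: a file's format signature ("magic number") is what
# lets a program recognize the data and hand it to the right algorithm.
def detect_format(path):
    with open(path, "rb") as f:
        header = f.read(8)
    if header.startswith(b"\x89PNG\r\n\x1a\n"):
        return "PNG"
    if header.startswith(b"\xff\xd8\xff"):
        return "JPEG"
    return "unknown"  # digital, yet inaccessible to the image algorithms

print(detect_format("some_image.png"))  # hypothetical file
```

Without such a key the bytes remain digital but closed: no algorithm knows how to conform them to its operation.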