As one knows without saying, we do not write anymore. The crazy kind of software engineering that was writing suffered from an incurable confusion between use and mention.
-- Friedrich Kittler
We write and read software, we write and read about software, and, quite often, we write and read about writing software. However, there was never writing without computing, as alphabetic writing itself is a computation of speech. If we want to clean up the muddle around our ideas of the gesture of writing software, we have to start where the writing begins. At the dawn of writing, around ten thousand years ago, there was nothing to compute but syntax-free digital data: tokens relating to things.
Yes, alphabetic writing traces back to a data processing system used for administration ten thousand years ago in Mesopotamia. The storage of goods begets the storage of data. Thus clay tokens of different shapes are stored and processed in one-to-one correspondence with the goods.
The tokens allow storage and alteration of digital memory, and thus the implementation of finite state machines run on human accountants. What is often missed about them is their relation to the things they stand for.
For example, an ovoid token manifests a jar of olive oil. The fluid oil is divided and digitalised into jars and tokens. The tokens and jars are separate and correspond to each other. The information about the substance is kept and processed separately from the matter.
The ovoid token is not as nourishing as olive oil, it is oblivious to its existence, and it is effectively decoupled from it. It is stripped of all the existential value and meaning of the oil, as well as its particularity. This designifying of both the existential meaning and the particular physicality of the oil, which would not be possible without the physical dividing of the oil into pots, is the source of the token's use. For instance, it enables us to store the count of tokens decoupled from the count of jars. We can reconcile them by counting, which means assigning a token to a vessel via a sequence of gestures which form a programme and could be described as an implementation of a simple state machine. Thanks to the decoupling, we notice that the count of tokens is not equal to the count of pots. We infer that a jar has been taken from the storage. Being about the oil means being effectively decoupled from the oil and significantly less oily.
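The accountant's reconciliation can be sketched as such a simple state machine. The following Python sketch is purely illustrative, not a historical reconstruction - one gesture per step, pairing a token with a jar until one side runs out:

```python
def reconcile(tokens, jars):
    """Pair each token with a jar, one gesture at a time."""
    tokens, jars = list(tokens), list(jars)
    # Each loop iteration is one gesture: set a token beside a jar.
    while tokens and jars:
        tokens.pop()
        jars.pop()
    # Leftover tokens mean jars were taken; leftover jars went unrecorded.
    if tokens:
        return ("jars missing", len(tokens))
    if jars:
        return ("jars unrecorded", len(jars))
    return ("reconciled", 0)

# Five ovoid tokens in the envelope, four jars on the shelf:
print(reconcile(["ovoid"] * 5, ["jar"] * 4))  # → ('jars missing', 1)
```

The inference that a jar has been taken rests entirely on the decoupling: the tokens are counted, never the oil.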
Soon the tokens are put in envelopes, the envelopes marked with two-dimensional signs corresponding to the three-dimensional tokens. The envelopes become immutable data stores for transporting the tokens. As data stores, they are decoupled from the causality of the world and securely transported, in time and space, between computations. Forthwith the tokens vanish from inside the envelopes, which now often register debt. To write the debtors' names, marks are used as placeholders for sound. With the advent of the alphabet, sound itself is digitalised. The signs called alpha, beta and gamma used to mean an ox, a house and a club - the staples of settled life. Now, registering sounds, they acquire a new meaning. The ox, house and club are no more. Alpha, beta and gamma take over.
Owning an ox is an existential hurdle. It might be a powerful prime mover enabling excellent yields, yet it needs its own food, it needs shelter, it is an individual with whom we have a meaningful relationship. The ox is significant, it is full of meaning. The inscription saying I owe an ox to a temple abstracts all this meaning, along with its significant existential baggage, away. It decouples from and designifies the existence and particularity of the ox. This abstraction still has existential meaning to me, and to the ox. The ox will be killed for celebrations, I will be punished if I do not bring in the ox - yet it is a new existential quality in the world, which writes digital words on top of the lived experience it diminishes.
The two basic properties of the design of token systems - the designifying of existence and the effective decoupling - are the source of the use, meaning and significance of any writing system. Thus they are determinative of writing and of computation itself.
The most influential philosophical schools of the twentieth century - the analytical and the existential - were both blind to these properties of registering systems.
Analytical philosophy tried to create an effective theory of meaning, which requires a coupling between a token system and the thing it decouples from.
The existential philosophers wanted to address existential meaning via a written discourse or a theory of signs. Thus they wanted a token system which does not designify but signifies instead.
Even if the alphabet could not express such languages, alphanumerics can.
A language which can describe causal interactions and actually cause them, which can affect our existence without human mediation, which is to say an effective and meaningful language, is called a programming language.
Such a synthetic language does not have any relation to speech or our natural language, except for the moments we arrange the tokens to cause hallucinations of common understanding called connotations when naming variables, functions and abstract data types. The programming language is a system of enumerable tokens and a syntax which can be effectively coupled to a physical machine processing analogue electrical signals. Such an analogue machine is called a universal digital computer, as it is described by engineers using enumerable values and can run any computation which can be expressed as a system of digital tokens.
With the advent of stored-programme computers, token systems not only store data or describe programmes for their processing but actually cause the prescribed computation to occur. It is a bit like ancient magic, where inscriptions were believed to have a causal relationship to the world. Except today words make stuff happen for real. For instance, they can make a computer heat up.
Before we apply what we have learned about the dawn of writing to the practice of software development, let us clean our concepts up. By describing something, we designify it and decouple from it. By designifying something, we make it existentially bearable. For example, by processing a document designifying a person, a bureaucrat makes the suffering they cause sustainable. By decoupling from a thing, we make it manageable. For instance, by using a todo list, I make doing my shopping possible.
When we talk about writing software, we may speak of writing documentation, writing code, writing tests or modifying some parts of our build system. We call it writing software as the alphanumerics we arrange are about and cause a process running on some analogue machine. The stuff which we write, when we write software, describes the runtime. It means that it decouples and designifies it. Yet, we write software to make runtime happen.
The documentation describes the software process. It is useful as it is decoupled from the process and the code and thus designifies both of them. If we read the API documentation, we can interact with the process ignoring the specifics of the implementation and the runtime. We can couple the documentation to the implementation by autogenerating it from the code. This way, we lose some use. For example, no longer can we reconcile the documentation with the process coupled to the code. It is as if we were no longer able to check the count of the jars of olive against the past value.
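Coupling the documentation to the code can be sketched with Python's standard `inspect` module; the `store_jar` function and the output format here are hypothetical stand-ins:

```python
import inspect

def store_jar(count: int) -> int:
    """Record `count` new jars and return the running total."""
    ...

def generate_docs(*functions):
    """Autogenerate API documentation from the code itself."""
    lines = []
    for fn in functions:
        # The documentation is derived from, and thus coupled to, the code.
        lines.append(f"{fn.__name__}{inspect.signature(fn)}: {inspect.getdoc(fn)}")
    return "\n".join(lines)

print(generate_docs(store_jar))
# → store_jar(count: int) -> int: Record `count` new jars and return the running total.
```

The coupling guarantees the documentation matches the code, and by the same stroke removes the redundancy that made reconciliation possible.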
The code describes the software process. It is useful as it can be coupled to the process. By running it through the build system and deploying, we can cause a process on the machine. The fact that code can effectively cause a runtime to happen is the most revolutionary of its properties.
Yet, one who modifies the memory of a running process to change its operation or directly inscribes machine code into a computer memory does not write software.
The meaning of code is the designification of and the decoupling from and within the runtime. Thanks to the decoupling, we can manipulate the code without modifying any existing runtime or element of it. Thanks to the designifying, we can change the described aspects of the runtime without the existential burden of accidentally affecting the rest of it. For instance, adding a Data Transfer Object to our code makes the runtime domain boundaries decoupled and insignificant to the rest of the runtime. Even if the data descriptions we create are isomorphic, their sole existence as separate tokens in the codebase reduces the burden on the developer. Concerning software, it is possible to grasp the difference between abstracting and designifying. When abstracting, we shrink the codebase and make code reusable; in turn, we multiply the points of contact between some tokens in the codebase and the runtime. To designify a part of the runtime, we add code which might not even affect it. Designification of the runtime introduces concrete points of contact between the code and the processes caused by the code.
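A minimal Python sketch of such a designifying description - the names `Ox` and `OxDto` are hypothetical, and the two classes are deliberately isomorphic:

```python
from dataclasses import dataclass

@dataclass
class Ox:
    """The domain object, entangled with the rest of the runtime."""
    name: str
    weight_kg: float

@dataclass
class OxDto:
    """An isomorphic description: a separate token at the domain boundary."""
    name: str
    weight_kg: float

def to_dto(ox: Ox) -> OxDto:
    # Inside the boundary, Ox may change freely; the rest of the runtime
    # only ever sees the OxDto, and might not be affected at all.
    return OxDto(name=ox.name, weight_kg=ox.weight_kg)

print(to_dto(Ox(name="alpha", weight_kg=540.0)))
```

The extra tokens do not shrink the codebase; their use is the boundary they make insignificant.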
The tests describe a relationship between the software process and the code. Tests say: the thing expressed by x should cause the outcome prescribed by y. This is a bizarre relationship, as it means the interconnection between the tokens of the tests and the tokens of the software is not formal. The two token arrangements are related not by properties at the syntax level, as types and values are, but by the existential relationship of the processes they cause. We can see it by testing actual performance or undefined behaviour. In both cases, the tests can tell us things far beyond the reach of a formal analysis of the code. By decoupling from both the software process and the code, tests enable redundancy in descriptions. By designifying both the software process and the code, they allow for manipulating each of them independently. Thanks to tests, one can change the desired shape of the runtime or the state of the code alone.
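A sketch of such a non-formal relationship, assuming a hypothetical `lookup` function: nothing in the types of the code promises the timing the test demands; the assertion holds or fails only through the process the code causes.

```python
import time

def lookup(store: dict, key):
    """The code under test: a plain dictionary lookup."""
    return store.get(key)

def test_lookup_is_fast_enough():
    # The relation between this test and `lookup` is not syntactic:
    # no type or value connects them, only the runtime they cause.
    store = {i: str(i) for i in range(100_000)}
    start = time.perf_counter()
    for i in range(10_000):
        lookup(store, i)
    elapsed = time.perf_counter() - start
    assert elapsed < 1.0, f"too slow: {elapsed:.3f}s"

test_lookup_is_fast_enough()
print("ok")
```

Swap the dictionary for a list and the tokens of the test need not change at all; only the process they are coupled to may now fail them.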
All the elements which we did not designify by tests - from null values to performance - can turn into significant problems.
The build and deployment systems describe the causal process which causes the runtime to happen out of the codebase. For example, they can connect the tests with the runtime by not emitting an artefact on test failure, thus making the tests an equal of the code as the cause of a process. They can decouple our code from the compiler input by preprocessing it, and designify the existential burden of runtime dependencies by making them immutable. We design our build systems to allow us to build and deploy, to introduce and obsolete runtimes fast, as though we would like to diminish their significance.
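Such a connection between the tests and the artefact can be sketched in a few lines of Python; the commands and the artefact here are stand-ins, not a real build system:

```python
import subprocess
import sys

def build(test_cmd, emit_artifact):
    """Emit the artefact only if the tests pass."""
    # A failing test, like a failing compilation, leaves no runtime behind:
    # the tests become an equal of the code as the cause of the process.
    result = subprocess.run(test_cmd, stderr=subprocess.DEVNULL)
    if result.returncode != 0:
        return "no artefact: tests failed"
    return emit_artifact()

passing = [sys.executable, "-c", "assert 1 + 1 == 2"]
failing = [sys.executable, "-c", "assert 1 + 1 == 3"]
print(build(passing, lambda: "artefact emitted"))  # → artefact emitted
print(build(failing, lambda: "artefact emitted"))  # → no artefact: tests failed
```

The gate makes the tests causally, not merely descriptively, prior to the runtime.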
Just as the tokens were data in a human-powered finite state machine to manipulate oxen, our codebases are tokens in a human- and machine-powered finite state machine to control runtimes.
Thus the purpose of codebases containing documentation, code, tests and build systems is to make the runtime process insignificant and malleable. The gesture of writing code is not so much about making code run as about decoupling from and designifying the running software - about making deploying it and changing the way it runs intellectually and emotionally sustainable.
Writing software is not about nice naming, pythonic code, autogenerating everything, smashing stuff and making it run. It is not the same as the art of the choreography of gestures called computer programming. It is not the same as the skill of bending computer systems to your will called hacking.
Writing software is about making a system of tokens with multiple redundancies, mutual designification and decoupling.
We write software to make the manipulation of runtimes existentially bearable.