The moral of the story can only be that a nontrivial algorithm is just nontrivial, and that its final description in a programming language is highly compact compared to the considerations that justify its design: the shortness of the final text should not mislead us!
the accusation that "I had made programming difficult". But the difficulty has always been there, and only by making it visible can we hope to become able to design programs with a high confidence level, rather than "smearing code", i.e., producing texts with the status of hardly supported conjectures that wait to be killed by the first counterexample.
None of the programs in this monograph, needless to say, has been tested on a machine.
When programming languages emerged, the "dynamic" nature of the assignment statement did not seem to fit too well into the "static" nature of traditional mathematics.
one", but even "the true one". In view of the fact that we cannot even define what a Turing machine is supposed to do without appealing to the notion of repetition,
For the absence of a bibliography I offer neither explanation nor apology.
Executional abstraction is so basic to the whole notion of "an algorithm" that it is usually taken for granted and left unmentioned. Its purpose is to map different computations upon each other. Or, to put it in another way, it refers to the way in which we can get a specific computation within our intellectual grip by considering it as a member of a large class of different computations;
The simplicity of the cardboard mechanism is a great virtue, but it is overshadowed by two drawbacks, a minor one and a major one. The minor one is that the mechanism can, indeed, be used for producing the greatest common divisor of 111 and 259, but for very little else. The major drawback, however, is that, no matter how carefully we inspect the construction of the mechanism, our confidence that it produces the correct answer can only be based on our faith in the manufacturer: he may have made an error, either in the design of his machine or in the production of our particular copy.
We could do the arithmetic ourselves, but of course it is better if the machine could do that for us. If we then want to believe the answer, we should be able to convince ourselves that the machine compares and subtracts correctly.
Instead of considering the single problem of how to compute the GCD(111, 259), we have generalized the problem and have regarded this as a specific instance of the wider class of problems of how to compute the GCD(X, Y).
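The method the cardboard machine embodies — repeatedly comparing and subtracting — can be written down as a program for the general problem GCD(X, Y). The following sketch in Python is my own illustration, not a program from the monograph (whose programs, as the author notes, were never run on a machine):

```python
def gcd(x, y):
    # Repeatedly replace the larger of the two values by their difference;
    # the greatest common divisor is invariant under this step, and when
    # the two values are equal, that common value is the answer.
    while x != y:
        if x > y:
            x = x - y
        else:
            y = y - x
    return x

print(gcd(111, 259))  # 37
```

For the specific instance GCD(111, 259) this yields 37, but the same text answers the whole class of problems GCD(X, Y) for positive X and Y.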
when asked to produce one or more results, it is usual to generalize the problem and to consider these results as specific instances of a wider class. But it is no good just to say that everything is a special instance of something more general!
The machine that upon request can produce as answer the value of all sorts of funny functions of 111 and 259 becomes harder to verify as the collection of functions grows.
The freedom to replace one pebble with a two-dimensional freedom of position by two half-pebbles with a one-dimensional freedom of position is exploited in the suggested two-register machine. From a technical point of view this is very attractive; one only needs to build registers able to distinguish between 500 different cases ("values") and by just combining two such registers, the total number of different cases is squared! This multiplicative rule enables us to distinguish between a huge number of possible total states with the aid of a modest number of components with only a modest number of possible states each.
we have insisted upon a compact justification for the rules of the game and this implies that the rules of the game themselves have to be compact.
It is a fascinating thought that this chapter could have been written while Euclid was looking over my shoulder.
The power of a formal notation should manifest itself in the achievements we could never do without it!
such a notation technique should enable us to define algorithms so unambiguously that, given an algorithm described by it and given the values for the arguments (the input), there should be no doubt or uncertainty as to what the corresponding answers (the output) should be.
It is then conceivable that the computation is carried out by an automaton that, given (the formal description of) the algorithm and the arguments, will produce the answers without further human intervention. Such automata, able to carry out the mutual confrontation of algorithm and argument, have indeed been built. They are called "automatic computers".
natural languages, non-formalized as they are, derive both their weakness and their power from their vagueness and imprecision.)
the fact that programming languages could be used as a vehicle for instructing existing automatic computers, has for a long time been regarded as their most important property. The efficiency with which existing automatic computers could execute programs written in a certain language became the major quality criterion for that language!
it is not unusual to find anomalies in existing machines truthfully reflected in programming languages, this at the expense of the intellectual manageability of the programs expressed in such a language (as if programming without such anomalies was not already difficult enough
our algorithms could actually be carried out by a computer as a lucky accidental circumstance that need not occupy a central position in our considerations.
If you are convinced of the usefulness of the procedure concept and are surrounded by implementations in which the overhead of the procedure mechanism imposes too great a penalty, then blame these inadequate implementations instead of raising them to the level of standards!
I view a programming language primarily as a vehicle for the description of (potentially highly sophisticated) abstract mechanisms.
the algorithm's outstanding virtue is the potential compactness of the arguments on which our confidence in the mechanism can be based. Once this compactness is lost, the algorithm has lost much of its "right of existence" and therefore we shall consciously aim at retaining that compactness.
They are truly "names" in the sense that by inspecting the sequence "one, two, three" no rule enables us to derive that the next one will be "four". You really must know that. (At an age that I knew perfectly well how to count -in Dutch- I had to learn how to count in English, and during a school test no clever inspection of the words "seven" and "nine" would enable me to derive how to spell "eight", let alone how to pronounce it!)
The total number of points in that state space is the product of the number of points in the state spaces from which it has been built (that is why it is called the Cartesian product). Whether such a register is considered as a single variable with 10^8 different possible values, or as a composite variable composed out of eight different ten-valued variables called "wheels" depends on our interest in the thing.
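In executable form (my own illustration, with names of my choosing), the multiplicative rule and the equivalence of the two views of the register look like this:

```python
# A register of eight ten-valued "wheels" vs. one variable with 10^8
# values: by the multiplicative rule the two state spaces have equal
# size, and each wheel configuration corresponds to exactly one value
# of the single variable.
wheels, values = 8, 10
total = values ** wheels
print(total)  # 100000000

def encode(config):
    # config: tuple of eight digits, most significant wheel first.
    n = 0
    for w in config:
        n = n * values + w
    return n

print(encode((0, 0, 0, 0, 0, 1, 1, 1)))  # 111
```

Which of the two views one adopts — one 10^8-valued variable or eight ten-valued wheels — is, as the text says, purely a matter of our interest in the thing.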
Many a programmer's decisions have to do with the introduction of state spaces with coordinate systems that are appropriate for his goal
the design of such a system is a goal-directed activity, in other words that we want to achieve something with the system.
the set of possible post-conditions is in general so huge that this knowledge in tabular form (i.e. in a table with an entry for each R wherein we would find the corresponding wp(S, R)) would be utterly unmanageable, and therefore useless.
For a fixed mechanism S such a rule, which is fed with the predicate R denoting the post-condition and delivers a predicate wp(S, R) denoting the corresponding weakest precondition, is called "a predicate transformer". When we ask for the definition of the semantics of the mechanism S, what we really ask for is its corresponding predicate transformer.
Secondly -and I feel tempted to add "thank goodness"- we are often not interested in the complete semantics of a mechanism. This is because it is our intention to use the mechanism S for a specific purpose only, viz. for establishing the truth of a very specific post-condition R for which it has been designed. And even for that specific post-condition R, we are often not interested in the exact form of wp(S, R); often we are content with a stronger condition P,
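A predicate transformer need not be given as a table: for an assignment it is a simple substitution rule. The following executable modeling is my own illustration, not the monograph's notation — a predicate is taken as a boolean function of the state, and the state as a mapping from variable names to values:

```python
# For the assignment "var := expr", the weakest precondition
# wp("var := expr", R) is R with expr substituted for var --
# realized here by evaluating R in the updated state.
def wp_assign(var, expr, R):
    """Predicate transformer of 'var := expr(state)' applied to R."""
    return lambda state: R({**state, var: expr(state)})

# Example: wp("x := x + 1", x > 0) is x + 1 > 0, i.e. x >= 0.
R = lambda s: s["x"] > 0
P = wp_assign("x", lambda s: s["x"] + 1, R)
print(P({"x": 0}))   # True:  0 >= 0
print(P({"x": -1}))  # False: -1 < 0
```

One rule thus captures what a table with an entry for every R never could.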
For many years thereafter I have regarded the irreproducibility of the behaviour of the nondeterministic machine as an added complication that should be avoided whenever possible. Interrupts were nothing but a curse inflicted by the hardware engineers upon the poor software makers!
program testing can be quite effective for showing the presence of bugs, but is hopelessly inadequate for showing their absence.
any design discipline must do justice to the fact that the design of a mechanism that is to have a purpose must be a goal-directed activity. In our special case it means that we can expect our post-condition to be the starting point of our design considerations. In a sense we shall be "working backwards".
in practice the notion of a liberal pre-condition is a quite useful one. If one implements, for instance, a programming language, one will not prove that the implementation executes any correct program correctly; one should be happy and content with the assertion that no correct program will be processed incorrectly without warning -provided,
Once this tool has been developed, we shall consider how it can be bent into one allowing us to talk about liberal pre-conditions to the extent we are interested in them.
If one so desires one can approach the problem of programming language design from out of that corner. In such an approach the -rather formal- starting point is that the rules for constructing predicate transformers must be such that whatever can be constructed by applying them must be a predicate transformer enjoying the properties 1 through 4 from the previous chapter "The Characterization of Semantics", for if they don't, you are just massaging predicates in a way such that they can no longer be interpreted as post-conditions and corresponding weakest preconditions respectively.
Note. Those who think it a waste of characters to introduce an explicit name such as "skip" for the empty statement while "nothing" expresses its semantics so eloquently, should realize that the decimal number system was only possible thanks to the introduction of the character "0" for the concept zero.
We shall not try to demonstrate this and leave it to the reader's taste whether he will regard this as a trivial or as a deep mathematical result.
At first sight such a cyclic definition seems frightening, but upon closer inspection we can convince ourselves that, at least from a syntactic point of view, there is nothing wrong with it.
It is tradition to denote the composite object corresponding to that predicate transformer by "S1; S2" and we define wp("S1; S2", R) == wp(S1, wp(S2, R))
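In the same executable modeling as before (my own illustration), a statement is identified with its predicate transformer — a function from post-conditions to weakest preconditions — and "S1; S2" becomes ordinary function composition, applied from right to left:

```python
# A statement is modeled as its predicate transformer: a function
# mapping a post-condition R to the weakest precondition wp(S, R).
def skip(R):
    return R                        # wp("skip", R) = R

def assign_x_plus(k):
    # Transformer of "x := x + k": substitute x + k for x in R.
    return lambda R: (lambda s: R({**s, "x": s["x"] + k}))

def seq(S1, S2):
    # wp("S1; S2", R) = wp(S1, wp(S2, R)): first transform R by S2,
    # then transform the result by S1.
    return lambda R: S1(S2(R))

R = lambda s: s["x"] == 5
S = seq(assign_x_plus(2), assign_x_plus(3))   # x := x + 2 ; x := x + 3
print(S(R)({"x": 0}))  # True: starting at x = 0 establishes x = 5
```

Note the order: the post-condition travels backwards through the composite, through S2 first and then through S1, which is exactly what the definition prescribes.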
A third reason is precisely that most of us are so familiar with them that we have forgotten how, a long time ago, we have convinced ourselves of their correctness:
There has been a time that technical considerations exerted a very strong pressure towards minimization of the number of flip-flops used and to use more than six flip-flops to build a finite-state machine with at most 64 states was at that time regarded, if not as a crime, at least as a failure.
They were conceived at a time when it was the general opinion that it was our program's purpose to instruct our machines, in contrast to the situation of today in which more and more people are leaning towards the opinion that it is our machines' purpose to execute our programs.
Any effort to evaluate the value of a variable having that unique, special value "UNDEFINED" can then be honoured by program abortion and an error message. Upon closer scrutiny, however, this simple proposal leads to logical problems; for instance, it is then impossible to copy the values of any set of variables. Efforts to remedy that situation include, for instance, the possibility to inspect whether a value is defined or not. But such ability to manipulate the special value -e.g. the value "NIL" for a pointer pointing nowhere- easily leads to confusions and contradictions: one might discover a case of bigamy when meeting two bachelors married to the same "nobody". Another way out, abolishing the variables with undefined values, has been the implicit initialization upon block entry not with a very special, but with a very common, almost "neutral" value (say "zero" for all integers and "true" for all booleans). But this, of course, is only fooling oneself; now detection of a very common programming error has been made impossible by making all sorts of nonsensical programs artificially into legal ones.
The fact that the three repetitive constructs, separated by semicolons, now appear in an arbitrary order does not worry me: it is the usual form of over-specification that we always encounter in sequential programs prescribing things in succession that could take place concurrently.
The moral of the story is that once one has an algorithm, one should not be content with it too soon, but investigate whether it can still be massaged. When one has made such reconsiderations a habit, it is unlikely that the notion of "ultraviolet" would in this example have escaped one's attention.
The concerns to be separated could be called "the mathematical concerns" and "the engineering concerns". With the mathematical concerns I refer to the program's correctness; with the engineering concerns I refer mainly to the various cost aspects -such as storage space and computation time- associated with program execution.
There exist, regretfully enough, machines in which the continuous check that the simulation of the behaviour of the UM is not beyond their capacity is so time-consuming, that this check is suppressed for the supposed sake of efficiency: whenever the capacity would be exceeded by a correct execution, they just continue -for the supposed sake of convenience- incorrectly.
Once the automatic computer was there, it was not only a new tool, it was also a new challenge and, if the tool was without precedent, so was the challenge.
we are faced with the challenge of discovering new (desirable) applications, and this is not easy, because the applications could be as revolutionary as the tool itself.
For anyone interested in the human ability to think difficult thoughts (by avoiding unmastered complexity) the programming task provides an ideal proving ground.
The purpose of thinking is to reduce the detailed reasoning needed to a doable amount, and a separation of concerns is the way in which we hope to achieve this reduction.
The crucial choice is, of course, what aspects to study "in isolation", how to disentangle the original amorphous knot of obligations, constraints, and goals into a set of "concerns" that admit a reasonably effective separation.
Bertrand Russell's verdict: "The advantages of the method of postulation are great; they are the same as the advantages of theft over honest toil".
My interest in formal correctness proofs was, and mainly still is, a derived one. I had witnessed many discussions about programming languages and programming style that were depressingly inconclusive. The cause of the difficulty to come to a consensus was the absence of a few effective yardsticks in whose relevance we could all believe. (Too much we tried to settle in the name of convenience for the user, but too often we confused "convenient" with "conventional", and that latter criterion is too much dependent on each person's own past.)
And, above all, length of a formal proof is an objective criterion: this objectivity has probably been more effective in reaching a comfortable consensus than anything else, certainly more effective than eloquence could ever have been. The primary interest was not in formal correctness proofs, but in a discipline that would assist us in keeping our programs intelligible, understandable, and intellectually manageable.
I prefer to view formal methods as tools, the use of which might be helpful. I have tried to present programming rather as a discipline than as a craft.
it does not suffice to design a mechanism of which we hope that it will meet its requirements, but that we must design it in such a form that we can convince ourselves -and anyone else for that matter- that it will, indeed, meet its requirements.
As remarked above, the purpose of thinking is to reduce the detailed reasoning needed to a doable amount. The burning question is: can "thinking" in this sense be taught? If I answer "No" to this question, one may well ask why I have written this book in the first place; if I answer "Yes" to this question, I would make a fool of myself, and the only answer left to me is "Up to a point".