 ## CS 334 Programming Languages Spring 2002 Lecture 21

### Axiomatic Semantics

{P} Stat {Q} means: if P is true before Stat is run, and Stat halts, then Q will be true after it halts. (This is partial correctness: nothing is claimed if Stat fails to halt.)

Axioms and Rules:

```
{P [expression / id]} id := expression  {P}

If {P & B} stats {P}, then {P} while B do stats {P & not B}

If {P} S1 {Q}, {R} S2 {T}, and Q => R,
then {P} S1; S2 {T}

If {P & B} S1 {Q} and {P & not B} S2 {Q},
then {P} if B then S1 else S2 {Q}

If P => Q, R => T, and {Q} S {R},
then {P} S {T}
```

To prove a program correct, show

```
{Precondition} Prog {Postcondition}
```

Usually easiest to work backwards from Postcondition to Precondition.
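Read computationally, the assignment axiom says the precondition is the postcondition with the expression substituted for the variable. A small Python sketch (not from the notes; the name `wp_assign` and the encoding of predicates as functions from a state dict to a boolean are ours):

```python
# Sketch: the assignment axiom {P[e/x]} x := e {P}, with predicates
# modeled as Python functions from a state (a dict) to a bool.

def wp_assign(x, e, P):
    """Weakest precondition of 'x := e' with respect to postcondition P:
    P with the expression e substituted for x."""
    return lambda s: P({**s, x: e(s)})

# Postcondition P: y = 10
P = lambda s: s["y"] == 10
# Statement: y := y + 1, so P[y+1/y] is y + 1 = 10, i.e. y = 9
pre = wp_assign("y", lambda s: s["y"] + 1, P)

assert pre({"y": 9})        # precondition holds when y = 9
assert not pre({"y": 5})    # and fails otherwise
```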

Ex:

```
{Precondition: exponent0 >= 0}
base <- base0
exponent <- exponent0
ans <- 1
while exponent > 0 do
    {assert: ans * (base ** exponent) = base0 ** exponent0
             & exponent >= 0}
    if odd(exponent) then
        ans <- ans * base
        exponent <- exponent - 1
    else
        base <- base * base
        exponent <- exponent div 2
    end if
end while
{Postcondition: exponent = 0
                & ans = base0 ** exponent0}
```
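The loop invariant can also be checked empirically. A Python transcription of the program (ours, not part of the notes), asserting the invariant on every iteration:

```python
# Sketch: the fast-exponentiation program from the notes, with the
# loop invariant ans * base**exponent == base0**exponent0 checked
# at the top of each iteration.

def power(base0, exponent0):
    assert exponent0 >= 0                     # precondition
    base, exponent, ans = base0, exponent0, 1
    while exponent > 0:
        # invariant from the notes
        assert ans * base ** exponent == base0 ** exponent0
        assert exponent >= 0
        if exponent % 2 == 1:                 # odd(exponent)
            ans = ans * base
            exponent = exponent - 1
        else:
            base = base * base
            exponent = exponent // 2          # exponent div 2
    # postcondition: exponent = 0 and ans = base0 ** exponent0
    assert exponent == 0 and ans == base0 ** exponent0
    return ans

assert power(3, 5) == 243
assert power(2, 10) == 1024
```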

Let us show that:

```
P  =  ans * (base ** exponent) = (base0 ** exponent0) & exponent >= 0
```
is an invariant assertion of the while loop.

The proof rule for a while loop is:

```
If {P & B} S {P}  then  {P} while B do S {P & not-B}
```
We need to show P above is invariant (i.e., verify that {P & B} S {P}).

Thus we must show:

```
{P & exponent > 0}
if odd(exponent) then
    ans <- ans * base
    exponent <- exponent - 1
else
    base <- base * base
    exponent <- exponent div 2
end if
{P}
```
However, the if..then..else.. rule is:
```
If {P & B} S1 {Q} and {P & not-B} S2 {Q} then
{P} if B then S1 else S2 {Q}.
```
Thus it will be sufficient if we can show
```
(1) {P & exponent > 0 & odd(exponent)}
    ans <- ans * base; exponent <- exponent - 1  {P}
```
and
```
(2) {P & exponent > 0 & not-odd(exponent)}
    base <- base * base; exponent <- exponent div 2  {P}
```
But these are now relatively straightforward to show. We do (1) in detail and leave (2) as an exercise.

Recall the assignment axiom is {P[exp/X]} X := exp {P}.

If we push P "back" through the two assignment statements in (1), we get:

```
{P[ans*base/ans][exponent - 1/exponent]}
ans <- ans * base; exponent <- exponent - 1  {P}
```
But making these substitutions in P, the precondition becomes:
```
ans * base * (base ** (exponent - 1)) = base0 ** exponent0
& exponent - 1 >= 0
```
which can be rewritten using rules of exponents as:
```
ans * (base ** exponent) = base0 ** exponent0 & exponent >= 1
```
Thus, by the assignment axiom (applied twice), we get
```
(3) {ans * (base ** exponent) = base0 ** exponent0 & exponent >= 1}
    ans <- ans * base; exponent <- exponent - 1  {P}
```

Because we have the rule:

```
If {R} S {Q} and R' => R  then  {R'} S {Q}
```
To prove (1), all we have to do is show that
```
(4) P & exponent > 0 & odd(exponent) =>
    ans * (base ** exponent) = base0 ** exponent0
    & exponent >= 1
```
where P is
```
ans * (base ** exponent) = (base0 ** exponent0) & exponent >= 0.
```
Since ans * (base ** exponent) = (base0 ** exponent0) appears in both the hypothesis and the conclusion, there is no problem with that part. The only difficulty is to prove that exponent >= 1.

However, exponent > 0 & odd(exponent) => exponent >= 1 (indeed exponent > 0 alone suffices, since exponent is an integer).

Thus (4) is true, and hence (1) is true.

A similar proof shows that (2) is true, and hence that P truly is an invariant of the while loop!

Axiomatic semantics is due to Floyd and Hoare; Dijkstra was also a major contributor. It was used to define the semantics of Pascal [Hoare & Wirth, 1973].

Too high level to be of much use to compiler writers.

Perfect for proving programs correct.

## Denotational Semantics

Gives a mathematical definition of the meaning of programming constructs: find the denotation of each syntactic element.

E.g. (4 + 2), (12 - 6), and (2 * 3) all denote the same number.

Developed by Scott and Strachey in the late 60's and early 70's.

A program is defined as a mathematical function from states to states. These functions are used to derive properties of programs (e.g. correctness, soundness of the typing system, etc.).

Start with the functions defined by simple statements and expressions, then combine these to get the meaning of more complex statements and programs.

Tiny

Syntactic Domains:

```
I in Ide
E in NumExp
B in BoolExp
C in Command
```

Formal grammar:

```
E ::= 0 | 1 | read | I | E1 + E2 | fn x => E | E1 (E2)
B ::= true | false | E1 = E2 | not B
C ::= I := E | output E | if B then C1 else C2 |
      while B do C | C1; C2
```

Semantic Domains:

```
State   =   Memory x Input x Output
Memory  =   Ide -> [Value + {unbound}]
Input   =   Value*
Output  =   Value*
Value   =   Nat + Bool + (Nat -> Value)
```
where Nat = {0, 1, 2, ...} is as defined above, and Bool = {true, false}

We assume that the following built-in functions are defined on the above domains:

```
and : Bool x Bool -> Bool,
if...then...else... : Bool x Y x Y -> Y + {error},
    for any semantic domain Y,
=  : Value x Value -> Bool,
hd : Value* -> Value,
tl : Value* -> Value*,
```
where each of these has the obvious meaning.

In the denotational semantics given below, we use s or (m, i, o) for a typical element of State, m for Memory, i for Input, o for Output, and v for Value.

Denotational Definitions:

We wish to define:

• E: NumExp -> denotations (or meanings) of numeric expressions
• B: BoolExp -> denotations (or meanings) of Boolean expressions
• C: Command -> denotations of commands

Note that the valuation of an expression may

1. result in an error,
2. depend on the state, or
3. cause a side effect.

Therefore we will let E have the following functionality:

```
E : NumExp -> [State -> [[Value x State] + {error}]]
```
where we write
```
E [[E]]s = (v,s'), where v is E's value in s and s' is the state
                   after evaluation of E,
or       = error, if an error occurs.
```

B and C are defined similarly. The three functions are defined below.

Define E : NumExp -> [State -> [[Value x State] + {error}]] by:

```
E [[0]]s = (0,s)

E [[1]]s = (1,s)

E [[read]](m, i, o) = if (empty i) then error
                      else (hd i, (m, tl i, o))

E [[I]](m, i, o) = if m I = unbound then error
                   else (m I, (m, i, o))

E [[E1 + E2]]s = if (E [[E1]]s = (v1,s1) & E [[E2]]s1 = (v2,s2))
                 then (v1 + v2, s2)
                 else error

E [[fn x => E]]s = (fun n in Nat. E [[E]](s[n/x]), s)

E [[E1 (E2)]]s = if (E [[E1]]s = (f,s1) & E [[E2]]s1 = (v,s2))
                 then f v
                 else error
```
where s[n/x] is s with its memory updated to bind x to n.
Note difference in meaning of function here from that of operational semantics!

Define B: BoolExp -> [State -> [[Value x State] + {error}]] by:

```
B [[true]]s = (true,s)

B [[false]]s = (false,s)

B [[not B]]s = if B [[B]]s = (v,s') then (not v, s')
               else error

B [[E1 = E2]]s = if (E [[E1]]s = (v1,s1) & E [[E2]]s1 = (v2,s2))
                 then (v1 = v2, s2)
                 else error
```

Define C : Command -> [State -> [State + {error}]] by:

```
C [[I := E]]s = if E [[E]]s = (v,(m,i,o)) then (m[v/I],i,o)
                else error
```
where m[v/I] is identical to m except that the value of I is v.
```
C [[output E]]s = if E [[E]]s = (v,(m,i,o)) then (m,i,v.o)
                  else error
```
where v.o is the result of attaching v to the front of o.
```
C [[if B then C1 else C2]]s = if B [[B]]s = (v,s')
                              then if v then C [[C1]]s'
                                   else C [[C2]]s'
                              else error

C [[while B do C]]s = if B [[B]]s = (v,s')
                      then if v then if C [[C]]s' = s''
                                     then C [[while B do C]]s''
                                     else error
                           else s'
                      else error

C [[C1; C2]]s = if C [[C1]]s = error then error
                else C [[C2]] (C [[C1]]s)
```
End Tiny
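The valuation functions above translate almost line by line into a functional program. A Python sketch of a fragment of Tiny (our own encoding, not from the notes: states are (memory, input, output) triples, terms are nested tuples, and error is a sentinel value; fn-expressions and booleans are omitted for brevity):

```python
# Sketch: a fragment of Tiny's denotational semantics in Python.
# Covers literals, read, identifiers, +, assignment, output, sequencing.

ERROR = "error"

def E(exp, s):                    # E : NumExp -> State -> (Value, State) or error
    m, i, o = s
    if exp[0] == "num":           # the literals 0 and 1
        return (exp[1], s)
    if exp[0] == "read":
        return ERROR if not i else (i[0], (m, i[1:], o))
    if exp[0] == "ide":
        return ERROR if exp[1] not in m else (m[exp[1]], s)
    if exp[0] == "+":             # evaluate E1, then E2 in E1's state
        r1 = E(exp[1], s)
        if r1 == ERROR:
            return ERROR
        v1, s1 = r1
        r2 = E(exp[2], s1)
        if r2 == ERROR:
            return ERROR
        v2, s2 = r2
        return (v1 + v2, s2)

def C(cmd, s):                    # C : Command -> State -> State or error
    if cmd[0] == ":=":
        r = E(cmd[2], s)
        if r == ERROR:
            return ERROR
        v, (m, i, o) = r
        return ({**m, cmd[1]: v}, i, o)      # m[v/I]
    if cmd[0] == "output":
        r = E(cmd[1], s)
        if r == ERROR:
            return ERROR
        v, (m, i, o) = r
        return (m, i, (v,) + o)              # v.o: attach v to the front of o
    if cmd[0] == ";":
        s1 = C(cmd[1], s)
        return ERROR if s1 == ERROR else C(cmd[2], s1)

# x := read; output x + 1   run on input (41,)
prog = (";", (":=", "x", ("read",)),
             ("output", ("+", ("ide", "x"), ("num", 1))))
m, i, o = C(prog, ({}, (41,), ()))
assert m == {"x": 41} and o == (42,)
```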

Notice that the definition of while is recursive.

Thus, if B [[B]]s = (true, s') and C [[C]]s' = s'', then C [[while B do C]]s = C [[while B do C]]s''.

The solution involves computing what is known as a least fixed point.
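The least fixed point can be made concrete: the meaning of a while loop is the limit of its finite unrollings, starting from the everywhere-undefined function. A Python sketch (our own illustration, with None standing for "undefined"):

```python
# Sketch: C[[while x > 0 do x := x - 1]] as the least fixed point of
# the functional W(f) = lambda s: f(body(s)) if cond(s) else s.
# The approximations W(bottom), W(W(bottom)), ... are the finite
# unrollings; bottom is defined nowhere (here: always returns None).

def cond(s): return s["x"] > 0
def body(s): return {**s, "x": s["x"] - 1}

def W(f):                       # the functional whose fixed point defines the loop
    return lambda s: (f(body(s)) if cond(s) else s)

w = lambda s: None              # bottom: the everywhere-undefined function
for _ in range(5):              # build the 5th approximation W^5(bottom)
    w = W(w)

assert w({"x": 3}) == {"x": 0}  # defined: the loop ends within 4 iterations
assert w({"x": 99}) is None     # still undefined: needs more unrollings
```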

Denotational semantics was extremely popular during the 70's and 80's, but has generally fallen out of favor, partly because it is not as good for modelling concurrent systems. Natural operational semantics is very similar, but does a better job with concurrency and is easy to convert to an interpreter.

Which is best?

No good answer, since the approaches have different uses.

The definitions are complementary, and can be compared for consistency.

Programming language definitions are usually still given in English; formal definitions are often too hard to understand or require too much sophistication. They are, however, gaining much more acceptance, and semantics is now a relatively standard introductory graduate course in CS curricula.

Some success at using formal semantics (either denotational or operational) to automate compiler construction (similar to use of formal grammars in automatic parser generation).

Semantic definitions have also proved to be useful in understanding new programming constructs.

Semantics and type theory now being used to prove security properties of code.

## More on object-oriented programming

### Modifying the interpreter

Compare the implementations of the interpreters in ML and Java, and suppose that each included both prettyprint and evaluation functions.

1. For which is it easier to add a new type of term? Why?

2. For which is it easier to add a new type of function on terms? Why?

This reflects an important difference between functional and object-oriented languages. In functional languages it is straightforward to add new functions, because all the pieces of each function are contiguous, but hard to add new alternatives to a data type. With object-oriented languages it is easy to add new alternatives (just add a new class implementing the interface), but hard to add new functions, because that requires modifications to all of the classes representing the alternatives. It would be nice to have a language that makes it easy to do both!
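This tradeoff is often called the expression problem. A small Python sketch (our own example, not from the course interpreters) shows both styles in miniature:

```python
# Sketch of the tradeoff ("expression problem"), in Python.

# Functional style: one function per operation, one case per alternative.
# Adding another operation is easy; adding a new kind of term means
# editing every existing function.
def eval_term(t):
    if t[0] == "num": return t[1]
    if t[0] == "add": return eval_term(t[1]) + eval_term(t[2])

def pretty(t):
    if t[0] == "num": return str(t[1])
    if t[0] == "add": return f"({pretty(t[1])} + {pretty(t[2])})"

# Object-oriented style: one class per alternative, carrying all operations.
# Adding Mul below touches nothing else; adding a new operation would
# mean editing Num, Add, and every other class.
class Num:
    def __init__(self, n): self.n = n
    def eval(self): return self.n
    def pretty(self): return str(self.n)

class Add:
    def __init__(self, l, r): self.l, self.r = l, r
    def eval(self): return self.l.eval() + self.r.eval()
    def pretty(self): return f"({self.l.pretty()} + {self.r.pretty()})"

class Mul:                        # new alternative: no existing code changed
    def __init__(self, l, r): self.l, self.r = l, r
    def eval(self): return self.l.eval() * self.r.eval()
    def pretty(self): return f"({self.l.pretty()} * {self.r.pretty()})"

assert eval_term(("add", ("num", 1), ("num", 2))) == 3
assert Mul(Add(Num(1), Num(2)), Num(4)).eval() == 12
assert Mul(Num(2), Num(3)).pretty() == "(2 * 3)"
```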

### Things are even worse!

Did any of you write your interpreter by using inheritance to extend classes of PrettyPrinter? Why was that hard?

Because of these problems, most instead modified the interfaces and classes (or started over). This violates the "open-closed" principle: classes should be open to extension, but closed to modification of the class itself. Problems with modification include the impact on subclasses, users, etc. This can't always be avoided, but it is desirable when possible.
