Part III: Abstractions of State
In the previous part, we characterized each type of data by the
collection of operations that could be performed on data of that type.
However, there were only two fundamental kinds of operations: those
that constructed new things and those that asked questions about
existing things. In this part, we'll add a third, fundamentally
different, kind of operation, one that modifies an existing object.
This kind of operation also raises the possibility that two concurrently active
computations will interact because one can modify an object the other
is using. Therefore, at the conclusion of this part we'll examine
concurrent, interacting computations.
We will lead into our discussion of changeable objects by looking at
how computers are organized and how they carry out computations. The
relevance of this discussion is that the storage locations in a computer's memory
constitute the fundamental changeable object. We'll look next at how
numerically indexed storage locations, like a computer's memory, can
be used to make some computational processes dramatically more
efficient by eliminating redundant recomputations of subproblem
results. Then we'll look at other forms of modifiable objects, where
the operations don't reflect the computer's numbered storage
locations but rather reflect application-specific concerns. We'll
then develop this technique into object-oriented programming by blending in the
idea that multiple concrete kinds of objects can share a common
interface of generic operations. Finally, we will show how to
transplant these same ideas into another programming language, Java,
which we will use to introduce programs that have concurrent,
interacting components.
Chapter 11: Computers with Memory
We first address the questions of what a computer is and how
it comes to compute by presenting a simplified RISC architecture and
showing how to program in its assembly language. We call attention
to memory as a numbered collection of storage locations, and use this
as motivation for introducing Scheme's vectors (one-dimensional
arrays) as a way to access storage from within the high-level
language. In the application section, we use vectors to program a
simulator for our RISC architecture in Scheme.
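Scheme's vector operations make the analogy concrete; as a minimal
sketch (our own illustration, not taken from the chapter, with the
name memory chosen to suggest the analogy):

    (define memory (make-vector 8 0)) ; eight storage locations, all 0
    (vector-set! memory 3 42)         ; store 42 in location number 3
    (vector-ref memory 3)             ; fetch it back => 42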
Chapter 12: Dynamic Programming
We show how storage, in the form of vectors, can be used to
dramatically improve the efficiency of algorithms through the closely
related techniques of memoization and dynamic programming. The
starting point is a tree-recursive procedure that repeatedly
recomputes solutions to identical subproblems. In memoization, a
table is added that records, for each subproblem, whether it has
already been solved and, if so, what the solution was; consulting
this table avoids re-solving the same subproblem. Dynamic programming
is a variation on this theme in which the "top-down" or
"demand-driven" tree-recursive structure of the computation is
replaced by a systematic "bottom-up" or "supply-driven" filling in of
the table.
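As a concrete sketch of both techniques (our own illustration, using
the classic Fibonacci computation rather than an example from the
book), the vector table below plays the role of the table just
described:

    ;; Memoization: the "top-down" variant.  For each subproblem k,
    ;; table holds #f if k is not yet solved, else the solution.
    (define (memo-fib n)
      (let ((table (make-vector (+ n 1) #f)))
        (define (fib k)
          (if (< k 2)
              k
              (or (vector-ref table k)
                  (let ((result (+ (fib (- k 1)) (fib (- k 2)))))
                    (vector-set! table k result)
                    result))))
        (fib n)))

    ;; Dynamic programming: the "bottom-up" variant systematically
    ;; fills in the same table from small subproblems to large.
    ;; (Assumes n >= 1.)
    (define (dp-fib n)
      (let ((table (make-vector (+ n 1) 0)))
        (vector-set! table 1 1)
        (do ((k 2 (+ k 1)))
            ((> k n) (vector-ref table n))
          (vector-set! table k (+ (vector-ref table (- k 1))
                                  (vector-ref table (- k 2)))))))

Either version takes time proportional to n, whereas the original
tree-recursive procedure takes exponential time.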
The application section shows how these techniques can be used to
format paragraphs into lines in an aesthetically pleasing way.
Chapter 13: Object-Based Abstractions
We combine the notion of storage locations from chapter 11
with the earlier notion of abstract data types from chapter 6 to
introduce objects with state. Each class of objects has a particular
set of operations on it, some of which may mutate the object's state.
We focus on representation invariants: invariant properties that
describe the underlying internal representation of an abstraction that
has state. If the class's constructors establish the invariants and
the mutators preserve them, then the invariants can be counted on to
hold at all times. The application section uses object-based
programming to construct a new, more efficiently enlargeable version
of the movie database.
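As a minimal sketch of the idea (our own example, not the book's
movie database), consider a bank-account abstraction represented as a
one-element vector, with the invariant that the stored balance is
never negative:

    (define (make-account initial-balance)   ; constructor establishes it
      (if (< initial-balance 0)
          (error "initial balance must be non-negative")
          (vector initial-balance)))

    (define (account-balance account)        ; selector asks a question
      (vector-ref account 0))

    (define (withdraw! account amount)       ; mutator preserves it
      (if (> amount (account-balance account))
          (error "insufficient funds")
          (vector-set! account 0
                       (- (account-balance account) amount))))

Because every operation that can create or mutate an account keeps
the balance non-negative, code using accounts may rely on the
invariant without rechecking it.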
Chapter 14: Object-Oriented Programming
We extend object-based programming to true object-oriented
programming by reintroducing the generic operations from chapter 9
(operations that can be applied uniformly to objects of varying
classes), and then introducing class hierarchies as a way of
abstracting out the commonality amongst classes. In addition to
showing how object-oriented programming can be used, we show how it
can be efficiently implemented. (The implementation section can be
skipped without loss of continuity.) The application section uses
object-oriented programming to develop an "adventure game"
simulation.
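As a rough sketch of a generic operation (ours, not the book's
implementation technique), message-passing style suffices: each
object is a procedure that dispatches on an operation name, and the
hypothetical shape classes below share the area interface:

    (define (make-square side)
      (lambda (op)
        (cond ((equal? op 'area) (* side side))
              (else (error "unknown operation" op)))))

    (define (make-circle radius)
      (lambda (op)
        (cond ((equal? op 'area) (* 3.14159 radius radius))
              (else (error "unknown operation" op)))))

    (define (area shape) (shape 'area))  ; generic: works on either class

    (area (make-square 3))   ; => 9
    (area (make-circle 1))   ; => 3.14159 (approximately)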
Chapter 15: Java, Applets, and Concurrency
We show how the object-oriented programming ideas from the
previous chapter can be transplanted into another programming
language, Java. With Java as a tool, we then introduce two new kinds
of programs: programs that respond to direct graphical manipulation by
the user, and programs that divide their attention between multiple
"threads" of execution. We show that if one isn't careful,
concurrent (multi-threaded) programs can sporadically misbehave when
unusual timing coincidences occur, and show how this problem can be
prevented. In particular, we emphasize the use of synchronization
disciplines that build directly on our earlier material on
representation invariants.
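The book's examples for this chapter are in Java; purely as a hedged
sketch of the same idea in Scheme, assuming an implementation that
provides SRFI-18 threads (such as Gambit), a mutex can make a
read-modify-write update atomic:

    (define counter 0)
    (define lock (make-mutex))

    (define (increment!)             ; without the mutex, two threads'
      (mutex-lock! lock)             ; read-modify-write updates could
      (set! counter (+ counter 1))   ; interleave and lose an increment
      (mutex-unlock! lock))

    (define threads
      (list (make-thread increment!) (make-thread increment!)))
    (for-each thread-start! threads)
    (for-each thread-join! threads)
    counter                          ; => 2, reliably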
This is a portion of the annotated table of contents
for the textbook Concrete Abstractions: An
Introduction to Computer Science Using Scheme, by Max Hailperin, Barbara
Kaiser, and Karl Knight. Free access to the full
text in PDF form and to supporting materials is also
available.