PLEASE NOTE: I am not a language designer, and I am not a compiler
writer. These opinions are based solely on my experiences -- and my
frustrations -- with the current choice of programming languages.
I don't plan on making this essay a history of programming languages,
but to briefly -- and simplistically -- summarize things, high-level
computer languages were invented in the late 1950s. What followed was
an explosion of different computer languages, most of which died away.
One computer language, Algol, became very influential in the design of
languages that followed it. One Algol successor was particularly
influential: Simula, which introduced object-oriented programming.
Note that object-oriented programming took about 15-20 years to be
adopted by programmers for mainstream use.
In the 1970s, several other important programming language concepts
were developed, most notably exceptions, parameterized types and
concurrency. These concepts were adopted by the Ada language in
1980. But Ada was never really successful outside of the military,
and the concepts it expounded are only now being adopted by mainstream programmers.
Since then, there haven't been any significant new features in
programming languages. Two of the most widely adopted languages
today, C++ and Java, are fairly recent, but they contain mixes of
features found in older languages, and don't have any significant new features of their own.
What these languages have in common is that they are all closely tied
to the von Neumann architecture. Variables refer to words of memory.
The flow of control is linear, structured using if and while
statements (or syntactic variations of them), and subroutine calls
are synchronous.
However, in the last 40 years, computer hardware has become
significantly more complicated. Data is no longer held in fixed
memory locations, but instead can exist in various levels of cache, or
in registers. Computer hardware can execute instructions in a
different order than specified by the programmer. However, all of
this is hidden from the programmer, and the same simple computer model
is still presented.
The types of programming problems have also changed in the last 40
years. Instead of simple batch programming jobs with very few inputs
and outputs, today's applications are GUI-based and highly
interactive. The computer system often has multiple CPUs, and the
program should be efficiently spread across them. The program itself
might be distributed, and run across many computers which may be
anywhere on the globe.
The one key idea here is that a computer program has to deal with
events from many different input sources, all of which can occur
asynchronously. The program often spends less time performing
computations, and more time switching data between its various input
and output sources. The channels often have high latencies, and the
program usually spends much of its time waiting for events to happen.
Using today's computer languages, there are basically two ways of
handling this type of problem. One is to have a main loop in your
program -- an event loop -- that listens to all of the program's input
sources and then dispatches the data using subroutine calls,
i.e. callbacks. The problem is that dealing with stateful data using
callbacks is extremely difficult. The other way is to use threads,
where each thread listens to its own input source, and deals with its
own state. The problem here is that threads are a very low-level
concept. Transferring data between threads is difficult. It is very
easy to write code which is not "thread-safe", and the compiler will
provide the programmer with little guidance in writing safe code.
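To make the two styles concrete, here is a minimal sketch in Python (the names sources, callbacks and handler are invented for the example); the first function is the event-loop approach, the second is the thread-per-source approach:

    import select, threading

    # Style 1: a single event loop multiplexes every input source and
    # dispatches each event to a callback.  Any state a callback needs
    # between events has to live in an external table, which is exactly
    # what makes stateful protocols awkward in this style.
    def event_loop(sources, callbacks):
        state = {src.fileno(): {} for src in sources}
        while True:
            ready, _, _ = select.select(sources, [], [])
            for src in ready:
                line = src.readline()
                callbacks[src.fileno()](line, state[src.fileno()])

    # Style 2: one thread per input source.  Each thread keeps its own
    # state on its stack, but sharing results between threads needs
    # locks, and the compiler gives no help in getting the locking right.
    def serve_with_threads(sources, handler):
        def worker(src):
            while True:
                handler(src.readline())
        for src in sources:
            threading.Thread(target=worker, args=(src,), daemon=True).start()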
The underlying problem with both of these solutions lies in the
programming language. While the languages let the programmer write
code where the flow of execution is structured, they provide little
help for structuring the flow of data. Callbacks and threads are
bad because they obscure the flow of data through the program.
This is a problem that has long been recognized by computer
scientists, yet ignored by programmers. In 1977, John Backus gave his
Turing Award lecture, "Can Programming Be Liberated From the von
Neumann Style? A Functional Style and its Algebra of Programs."  In
this lecture he proposed FP, a functional programming language. The
key features of his language were that it had no variables and no
side-effects. The good part about this design is that it solves our
dataflow problem -- the dataflow is now highly structured. However,
it still exhibits the same problems of state management that we had before.
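As a rough illustration of what structured dataflow buys you, the same idea can be sketched in Python as a pipeline of pure functions (the stage names are made up for the example); the route each value takes through the program is visible in the composition itself, but there is still nowhere natural to keep state that spans events:

    from functools import reduce

    def compose(*stages):
        # Build a pipeline: the output of each stage feeds the next.
        return lambda x: reduce(lambda value, stage: stage(value),
                                stages, x)

    # Each stage is a pure function with no side effects.
    parse     = lambda line: line.split(',')
    validate  = lambda fields: [f.strip() for f in fields if f.strip()]
    summarize = lambda fields: {'count': len(fields)}

    process = compose(parse, validate, summarize)
    print(process("a, b, , c"))   # {'count': 3}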
Another good approach, which would solve our state management problem,
would be coroutines. Coroutines were described by Knuth in The Art
of Computer Programming. They were implemented in Simula, yet sadly,
not in the languages that evolved from it. However, coroutines are
starting to gain in popularity, mostly through the efforts of the
Stackless Python project.
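Modern Python's generators already give a taste of the coroutine style (Stackless goes considerably further). In this small sketch, the coroutine keeps its state in an ordinary local variable between events, instead of stashing it in an external table the way a callback must:

    def line_counter():
        # State (the running count) lives in a local variable and
        # survives across invocations; each value sent in is one event.
        count = 0
        while True:
            line = yield count
            count += 1

    counter = line_counter()
    next(counter)                  # prime the coroutine
    print(counter.send("first"))   # 1
    print(counter.send("second"))  # 2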
Another problem with both function calls and coroutines as a dataflow
mechanism is that they are synchronous. A program must explicitly
transfer control to a (sub/co)routine, and control must be explicitly
returned. This can lead to tremendous inefficiencies when writing distributed programs.
A popular model for distributed programming is the remote procedure
call, whether implemented as Sun-RPC, RMI, CORBA or SOAP. This
purports to make distributed programming simple by letting the
programmer use a mechanism that is readily available in her
programming language of choice -- the subroutine call. However,
trying to simulate a synchronous subroutine call over an asynchronous
message passing channel is extremely inefficient. A program must
block while it waits for a result. Of course, you can thread your
program to get around this, but at that point the alleged benefit of
the RPC model -- its simplicity -- starts to become illusory. The
problem is that you are trying to bend the distributed programming
problem to fit the limitations of the programming language, rather
than the other way around.
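A small sketch of the mismatch, using Python's standard xmlrpc client against a hypothetical server URL and method name: each call blocks its thread for the full network round-trip, so overlapping several independent requests forces us straight back onto threads, the very mechanism RPC was supposed to hide:

    import threading
    import xmlrpc.client

    def fetch(item_id, results):
        # Hypothetical endpoint and method; the point is only that the
        # call blocks this thread for the full network round-trip.
        server = xmlrpc.client.ServerProxy("http://example.com/rpc")
        results[item_id] = server.lookup(item_id)

    # To overlap the latencies we are forced back onto threads.
    results = {}
    threads = [threading.Thread(target=fetch, args=(i, results))
               for i in range(3)]
    for t in threads:
        t.start()
    for t in threads:
        t.join()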
My ideal programming language would have the following features:
- The flow of data would be explicit and structured.
- The program can easily handle multiple inputs and outputs.
- The program can handle input and output asynchronously.
- Input and output can be pipelined at all stages to deal with high-latency channels.
- Programs should be easily scalable across many CPUs and across networks.
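No mainstream language has all of this built in, but asyncio in current Python can at least sketch the shape of it. In the toy pipeline below (the stages and queue sizes are invented), the explicitly connected queues make the flow of data visible, every stage runs asynchronously, and the bounded queues pipeline the stages so no one of them runs far ahead of the others:

    import asyncio

    async def read_stage(out_q):
        # Produce events; in a real program these would arrive from
        # sockets, files, or user input.
        for i in range(5):
            await out_q.put(i)
        await out_q.put(None)               # end-of-stream marker

    async def transform_stage(in_q, out_q):
        while (item := await in_q.get()) is not None:
            await out_q.put(item * item)
        await out_q.put(None)

    async def write_stage(in_q):
        while (item := await in_q.get()) is not None:
            print(item)

    async def main():
        # The queues make the dataflow explicit:
        # read -> transform -> write.
        q1 = asyncio.Queue(maxsize=2)
        q2 = asyncio.Queue(maxsize=2)
        await asyncio.gather(read_stage(q1),
                             transform_stage(q1, q2),
                             write_stage(q2))

    asyncio.run(main())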
The limitations of current programming languages are only going to
become worse as computer architectures evolve. At a recent
conference, Ivan Sutherland described the architecture of his
asynchronous CPU, FleetZero:
FleetZero replaces synchronous arithmetic operations with
data-movement operations among asynchronous processor blocks
called ships. In place of synchronous chips' instructions are
binary codes that specify the routes data items need to take
among the processor blocks to accomplish the same tasks that
synchronous chips perform via sequential operations. Software
compilers would manage on-chip communications, routing data among
the asynchronous blocks.
This CPU would implement in hardware all of the features I described
above. It would be a shame if we didn't have a programming language
to take advantage of it.
 John Backus: Can Programming Be Liberated From the von Neumann
Style? A Functional Style and its Algebra of Programs. CACM 21(8):
613-641 (1978). (Can anyone find a link to this article?)
 Donald E. Knuth: The Art of Computer Programming, Volume 1: Fundamental Algorithms (Section 1.4.2, Coroutines). Third Edition, Addison-Wesley, 1997.
 "Scrap system clock, Sun exec tells Async". EE Times, 03/19/01.