What has Symbolics done for me lately?

Jay Nelson jay@REDACTED
Wed Feb 26 08:02:54 CET 2003


At 07:05 PM 2/25/03 +0100, you wrote:
>Just a stupid question, if that environment was that
>great, why has nobody cloned it in software?
>Or was it related to the hardware in some way?

A disarmingly simple question to ask.  Why,
pray tell, has erlang not displaced Java or
even Visual Basic?
;-)

>Or is some spectacular feature of it, one that
>I post-Symbolics-born take for granted, in
>use in some software today?

This gave me the same feeling when at lunch with a
20-something young lady I referred to an event that
happened "around the time MTV started".  The look
on her face said, "Oh my god... you were around
when electricity was discovered!?"  I'm not really as
old as I sound -- at least _I_ don't think so...

Thanks Michael for the link to the paper.  I skimmed
it and it reminded me of a lot that was happening then.

A short little ramble of my own opinions:

The Lisp Machine and Symbolics machines were
hardware solutions that were necessary to make
Lisp perform competitively at the time.  Symbolics
used a 40-bit custom tagged architecture: every
word had 32 bits of data plus 4 for error detection
and another 4 for type tagging and garbage collector
marking, and execution of an instruction proceeded
in parallel with the datatype check, to be invalidated
by a hardware trap if the check failed.  At the time
Sun was a 16-bit machine and the IBM PC was only a
few years old.  The 640K barrier hadn't been broken
on the PC, and I seem to recall the Symbolics having
24MB or 48MB of memory.  When 640x480 was high
resolution, these babies had B&W monitors bigger
than an A4 sheet of paper (with similar proportions,
vertically oriented), three-button mice (still rare
then), and resolution above 1K or 2K (I remember
using big full-size laser disks to display detailed
topographic maps of the world on a custom color
monitor we had for one machine).  The Symbolics
tackled problems that the other machines couldn't
handle at all, yet it still ran regular LISP code
nearly as fast as the other equipment ran C code.
The project I worked on was about 250K lines of
LISP, developed by a team of 5 or 6 -- an impossible
task in any other language on any other hardware.
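
Just to make the tagged-word idea concrete, here is a little
sketch in Erlang bit syntax.  The field layout is made up for
illustration (it is not the real Symbolics word format), but
it shows the flavor of keeping a type tag next to the data
and trapping on a mismatch:

    -module(tagged_word).
    -export([decode/1, add/2]).

    -define(FIXNUM_TAG, 0).  %% hypothetical tag value

    %% Pretend layout: 4 bits of ECC, 4 bits of type/GC tag,
    %% 32 bits of data, packed into a 40-bit (5-byte) word.
    decode(<<ECC:4, Tag:4, Data:32>>) ->
        {ecc, ECC, tag, Tag, data, Data}.

    %% The "type check alongside execution" idea in software:
    %% operate on the data and let a bad tag raise a trap.
    add(<<_:4, ?FIXNUM_TAG:4, A:32>>, <<_:4, ?FIXNUM_TAG:4, B:32>>) ->
        A + B;
    add(_, _) ->
        erlang:error(type_trap).

So add(<<0:4, 0:4, 1:32>>, <<0:4, 0:4, 2:32>>) gives 3, and any
other tag value hits the software "trap".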

The big thing was that every line of code running
on the machine -- the OS, the editors, everything --
was written in Symbolics Lisp and Flavors (the OO
enhancements).  All the source code came with the
machine, and you could hit the Suspend key (yes, a
real key), modify the OS code that was executing
(say, the scheduler, if your process was not getting
enough CPU time), and hit the Resume key (a different
key) to restart the OS at the line of code that was
suspended.  Everything was open, and you could even
send code to run dynamically on a remote machine
(there is a small Erlang aside on that below).
Today, security concerns have removed the possibility
of the things we used to do as fun hacks.  The roots
pre-dated the Free Software Foundation, although the
ideas came from the same environment (MIT).
[The only comparable machine I worked on was the
Lilith, which ran Niklaus Wirth's then-new Modula-2
in a similar manner.  The billiards application that
came with it was truly amazing at the time (1984),
and the machine was used for CAD/CAM as an
alternative to the LISP-based original AutoCAD
software.]  Symbolics advanced the state of the art
in CAD/CAM, 3D modelling and animation (as in
multi-light-source shading and the scripting of the
TV network logos that fly through space).
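
The aside on sending code to a remote machine: in Erlang
today that is one line.  A minimal sketch, assuming two nodes
that are already connected, with made-up names a@host and
b@host:

    %% Spawn a fun on the other machine in the cluster.
    Remote = spawn('b@host', fun() -> io:format("on ~p~n", [node()]) end).

    %% Or make a synchronous remote call and get the result back.
    Result = rpc:call('b@host', erlang, node, []).

Not the same as patching a live OS at the Suspend key, but the
same spirit.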

The problem was that one machine started at $40K
and they ran up to $150K or so, but they weren't
servers; they were workstations for individual
programmers.  Sun was still on the Sun 2 or 3, but
those machines were getting faster and people were
saying Lisp could run almost fast enough on them.
Meanwhile, Symbolics delivered a full C
interpreter(!) in which you could modify code in the
debugger and resume with new variables, or even new
functions, defined.  C on a LISP machine is a hard
sell.

The bottom line was that custom hardware always
loses to commodity hardware.  Disruptive technology
is cheaper and less capable, but good enough as an
alternative.  The target market of Symbolics was too
small, and the performance of Common Lisp on Sun
hardware got too good (even though it was nowhere
near what Symbolics had).  Meanwhile, free C
compilers on every platform made C the standard.
There is a good paper on why Common LISP lost out to
C, written by Richard Gabriel, who headed Lucid at
the time:

http://www.dreamsongs.com/WorseIsBetter.html
(Browse the site, Gabriel has some amazing insight.)


What are you using today that you could credit to
Symbolics or LISP?  Quite a lot more than you would
expect:

1) LISP was one of the first languages, back in the
50's (after Fortran and before Cobol and APL), and
the only language to survive the change to structured
programming in the 70's and to OO plus message
passing in the 80's, adapting every trend into the
language itself while retaining its original roots.
It taught language designers how a language can
evolve.

2) erlang to me has a lot in common with LISP in the
way lists work, the recursion, fun objects, and the
vast library of functions (Common LISP had over
900 BIFs).  I credit the Standard Template Library
(STL) as a direct admission that C++ couldn't compete
without all the BIFs it was missing relative to CLOS
(and boy, do they feel unnatural in C/C++).

3) The Meta Object Protocol (MOP) of the Common LISP
Object System introduced the concept of reflective
and introspective languages (although Smalltalk took
a first cut at it) -- all the Java introspection
machinery comes directly from these concepts.  [Read
the book by Gregor Kiczales and Jim des Rivieres,
"The Art of the Metaobject Protocol", to understand
how to implement an object language with the fullest
power possible.]

4) Garbage collection technology went through many
generations (pun intended) because of LISP.  Java
owes that to LISP as well, and presumably erlang too.

5) Exceptions, catch and throw were new to Common
LISP, and worked well with unwind-protect.

6) I believe object-oriented databases also came out
of this work (could be wrong on that one, but I remember
using one on Symbolics in 1986).

7) A lot of compiler optimization, and compiler hints
via type declarations, were developed, especially in
the CMU Python compiler; also compiler memoization
and lazy evaluation, interpreter advances, and code
inspectors.  (There is still no real equivalent of
'advising' a compiled piece of code: essentially
wrapping a fun around it that proxies the data in
and out, so that you could redefine a function on a
compiled image where the source was not available --
see the little sketch after this list.)

8) The Symbolics Z-macs (Zeta Lisp) environment has
surely spawned some of the ideas that are now in the latest
emacs.

9) CLIM's mouseover and context-sensitive help
pre-dated Windows, though it can probably be traced
back to Xerox's work on their word-processing
machines of the late 70's.

10) Hypertext documentation was really big on the
Symbolics; nothing else had it.  Items 9 and 10
certainly helped define HTML and web browsing.
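
To make the 'advising' idea in point 7 concrete, here is the
shape of it in Erlang -- a toy sketch of the concept only, not
a claim that it matches what the Symbolics gave you (that
worked on already-compiled code in a running image).  The
module and function names are made up:

    -module(advice).
    -export([around/3]).

    %% Wrap an existing fun F with Before/After hooks, proxying
    %% the argument list in and the result out, without needing
    %% F's source.
    around(F, Before, After) ->
        fun(Args) ->
            Before(Args),
            Result = apply(F, Args),
            After(Args, Result),
            Result
        end.

Used like so:

    Advised = advice:around(fun lists:reverse/1,
                  fun(Args) -> io:format("in:  ~p~n", [Args]) end,
                  fun(_, Res) -> io:format("out: ~p~n", [Res]) end),
    Advised([[1,2,3]]).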

In general, a lot of the advances were absorbed and parts
were incorporated everywhere.  A lot of the influence is
now untraceable, but wouldn't have happened without the
LISP trend which was advanced greatly by the existence
of Symbolics.  You never know what you are a part of
until it is over with and you look back with perspective.
I've also noticed that some things just happen.  They can't
be recreated in another era -- the world changes.

On the other hand, maybe it is all just nostalgia.

erlang is giving me the same pleasure I had back then.
Maybe distel will put me back in the same frame of mind
for developing code that the Symbolics gave me; I'll
have to try it.

erlang is either on the cusp of the realization that
the Internet is a distributed network of machines, or
it too will be overlooked and consigned to the trash
heap of history in favor of some overblown XML
simulation of networked processes.

Learn early in your career that the best solution doesn't
always win.  Sometimes (more often than we care to
admit) worse is better.

jay


