The way to Lisp
Lisp has come in from the cold. After years enduring the icy Artificial Intelligence (AI) winter, a period in the late eighties when Lisp was dumped by many thanks to associations with over-hyped AI research, and a frosty decade in academic obscurity, the open source community has dusted down this unique language, and freshened her up with lemon odour. Or rather Lemonodor.com, a high profile Web site which actively promotes and describes Lisp use and can readily be considered the Slashdot of the Lisp world, has performed this service for the community.
Like a proud mother, Lisp, probably the oldest language still in use today, can view her children and remark on how their best features all stem from her genes or rather memes. Flexibility and speed of development are her watchwords, and key enabling features of the language, such as run-time typing and an interactive approach to interpretation or compilation have been adopted by newer languages such as Python. Yet Python misses the killer core or inner beauty which so many celebrate in Lisp.
To approach this core takes more than simply enumerating Lisp's good points. To hijack the words of Python guru Tim Peters, "Good languages aren't random collections of interchangeable features: they have a philosophy and internal coherence that's never profitably confused with their surface features."
Understanding Lisp is a gradual process, involving some immersion in the history, philosophy and culture of Lisp, as well as gentle involvement with coding to further elucidate core concepts such as recursion and macros. Yet Lisp, with its array of dialects, libraries and implementations, is far from the easiest language to get up and running with, in contrast to Python, which presents an obvious one-stop shop. And although a clear conceptual understanding can readily be sought in printed works such as The Little Lisper or in Abelson and Sussman's seminal Structure and Interpretation of Computer Programs (SICP), practical advice to support this knowledge is sorely lacking within the community. As we'll see, certain initiatives are under way to resolve these issues, but aside from evangelising Lisp, an effort which is close to achieving its goals, more work needs to be done in this direction, and this short article aims only to provide an overview for the intelligent and necessarily intrigued beginner.
No-one has been more instrumental in persuasively arguing the case for Lisp than coder and writer Paul Graham, whose own work with Viaweb is a good testament both to the power of Lisp as an expressive language and to Lisp's relevance within enterprise-level, large-scale projects. Graham's essays and books provide a singular practical and conceptual resource, elegantly tying real-world arguments into a neat conceptual bundle, well wrapped up by his strong theoretical grounding and knowledge of the language. He speaks from experience, and those who paraphrase Graham without his depth of understanding will always sound a bit hollow. Nevertheless, Lisp's features and advantages must be outlined, and these can readily be pinned down both to a decent level of abstraction and to highly usable abstractions themselves, such as closures and macros. It's worth remembering that Lisp belongs within the family of functional languages, which implies modularity, an essential requirement for large-scale, complex projects. Such abstractions, and features such as automatic memory management, courtesy of built-in garbage collection, readily enable new ways of programming, and this is Lisp's great advantage.
Sure, the semi-interpreted nature of Lisp, with functions able to be tried and tested at the interactive REPL (Read Eval Print Loop) prompt, or so-called top level, assists in rapid development, and Graham proffers amusing examples of improving server software on the fly, but Lisp's real advantage remains its extensibility, which stems from a core feature of the language: the fact that Lisp programs are expressed as Lisp data structures. Indeed, John McCarthy, a key figure within AI and inventor and primary implementor of Lisp, remarks in an essay on the early history of Lisp that "One can even conjecture that Lisp owes its survival specifically to the fact that its programs are lists, which everyone, including me, has regarded as a disadvantage." Hacker Pascal Costanza argues that this core feature makes Lisp much more than just another Turing-complete language, and this notion, which embeds a complete theory of computation within Lisp, is a rich seam to mine, with demanding papers on the subject furnished by the likes of Graham, Sussman and Guy Steele, another major player in the early life of Lisp, and co-inventor of the intriguing Connection Machine.
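The code-is-data point can be seen in a couple of lines at any Common Lisp REPL; a minimal sketch (the variable name *expr* is purely for illustration):

```lisp
;; A quoted expression is not evaluated: it is simply a list,
;; and can be taken apart with ordinary list operations.
(defparameter *expr* '(+ 1 2 3))

(first *expr*)   ; => +        (a symbol)
(rest *expr*)    ; => (1 2 3)  (a list of numbers)

;; Since programs are lists, new code can be built with list
;; functions and handed straight to EVAL.
(eval (cons '* (rest *expr*)))  ; => 6
```

The same list operations that work on application data work on program text, which is precisely the property that macros, covered below, rely on.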
More to Lisp than lists
Extensibility here means both extending the Lisp language itself in a completely transparent manner, thus building a domain-specific language for the application, and providing the facility for others to readily extend and customise an application. In the first instance, the equivalence between software and data allows for coding custom languages with powerful abstractions, and in the latter case, this form of what Graham calls bottom-up programming naturally results in extensible software, with GNU Emacs as the prime example. From this perspective, Lisp isn't about writing applications, it's about writing languages. And, given the growing complexity of both hardware and contemporary software systems, this heavily modular, high-level approach to programming and systems architecture is seriously compelling. Indeed, pioneers Sussman and Abelson preface their essential SICP volume with clear indications of how to control complexity, through establishing conventional interfaces, and establishing new languages.
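As a flavour of the bottom-up style, here is a hypothetical three-line markup vocabulary; the function name and format string are ours, not from any library, but once tag exists, page generation can be written in that vocabulary rather than in raw string handling:

```lisp
;; Build a small domain vocabulary, then program in it.
(defun tag (name &rest contents)
  "Wrap CONTENTS in an opening and closing NAME tag."
  (format nil "<~(~a~)>~{~a~}</~(~a~)>" name contents name))

(tag 'p "Lisp " (tag 'em "extends") " itself.")
;; => "<p>Lisp <em>extends</em> itself.</p>"
```

Each new definition raises the vocabulary of the language towards the problem domain, which is exactly what Graham means by building the language up towards the application.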
And, alongside those who claim that Lisp is unusable due to an arcane syntax which multiplies parentheses, some would still maintain that Lisp is slow, and argue thus that the need for speed has lost out in the battle with complexity. But, given that nearly all modern implementations compile to native or byte code, whether on the fly or interactively on demand, speed is rarely an issue. And the speed of prototyping or coding is a further factor which should also enter the equation. With Lisp, at least one has greater options for exploration and prototyping, and, if need be, optimisations can be furnished later in the day. Such notions of Lisp as sluggish belong to an old-fashioned view which focuses on syntax and on lists, arguing that LISP stands merely for LISt Processing, and which corrals this powerful language within an academic prison. The new view of Lisp is that, given automatic highlighting and indentation, parentheses and other syntactical issues disappear. Lisp is a far more flexible language than the acronym would suggest.
A further consequence of the collapse of software and data, elsewhere treated as quite distinct animals, is that Lisp really does live in the land of the meta, and that's a place where a good few sophisticated coders and theorists like to hang out. Douglas Hofstadter, in his seminal mathematical and meta-mathematical work, Gödel, Escher, Bach: An Eternal Golden Braid, provides many mind-stimulating adventures at the meta-level, and Lisp makes for a very natural fit here. And yet another consequence of the much-vaunted data/software equivalence is the unusual defining quality that Lisp can be written in itself. In practice, this can be achieved in very few lines of code, and the resulting beast is the rather frightening metacircular interpreter or metacircular evaluator. This creature lies at the very heart of an understanding of the history and conceptual underpinnings of Lisp, and writing such an interpreter forms a useful exercise for the novice.
Again this is another rich seam well worthy of further investigation, and the intrigued could start with Graham's excellent paper, The Roots of Lisp, or plough straight into Abelson and Sussman's work. Indeed, in their heyday, these pioneers were churning out metacircular evaluators for subsets and dialects of Lisp at an alarming rate, and their work forms an important link between the more exciting aspects of mathematics, philosophy and computer science. Another valuable starting point here would be the common assertion that the proof of Gödel's incompleteness theorem, which is essential to an understanding of AI, would have been easier had Lisp been invented first, given Lisp's predilection for the metacircular. And just before any unthinking C coders chime in, a C compiler written in C, which can be used for bootstrapping, does not belong in the realm of the metacircular, since a metacircular evaluator does not spell out precise semantics for each construct, but instead defers to the language it is written in. The common parallel is with looking up a word in the dictionary, and finding that the definition uses the original word. That is how things work with a Lisp written in Lisp: eval, which quite obviously evaluates expressions, is implemented by calling eval. In contrast, a C compiler must specify detailed and precise semantics for each and every construct, and take care of boring old parsing. The REPL defines all that is needed to build a Lisp interpreter: read an expression, evaluate it and then print the results. It has to be admitted that there's a certain beauty and simplicity at work here, and Lisp is certainly unique in this respect.
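To give a taste of the shape of such an evaluator, here is a deliberately tiny sketch handling only numbers, quote, if and two arithmetic operators; a genuine metacircular evaluator, as in SICP or The Roots of Lisp, also handles variables, lambda and environments, but the recursive shape is the same:

```lisp
;; A toy evaluator for a small Lisp subset. Note the recursion:
;; evaluating a compound form means evaluating its parts.
(defun toy-eval (expr)
  (cond ((numberp expr) expr)                         ; numbers evaluate to themselves
        ((eq (first expr) 'quote) (second expr))      ; quoted data is returned unevaluated
        ((eq (first expr) 'if)                        ; conditional: evaluate one branch only
         (if (toy-eval (second expr))
             (toy-eval (third expr))
             (toy-eval (fourth expr))))
        ((eq (first expr) '+)                         ; primitive operators
         (apply #'+ (mapcar #'toy-eval (rest expr))))
        ((eq (first expr) '*)
         (apply #'* (mapcar #'toy-eval (rest expr))))
        (t (error "Unknown form: ~s" expr))))

(toy-eval '(+ 1 (* 2 3)))   ; => 7
```

Extending this with symbols looked up in an environment, and with lambda to create functions, is precisely the exercise that turns a toy into a metacircular evaluator.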
A good deal of this simplicity stems from Lisp's roots in the theoretical work of John McCarthy in the 1950s, which touches on all the rich thematics wrapped up by Gödel's work in the sphere of mathematics. Both McCarthy and Graham write well on this early history of the language, and their texts make for essential reading. McCarthy did not set out to design and create a programming language to meet specific programming needs or satisfy a problem domain; rather, he was interested in mathematical notation and expressing theory. This makes Lisp unique in the field of programming, and quite distinct from the functionality associated with C or C++. Lisp is a flexible, theoretical language which is primarily expressive.
Rather than jogging through the history of Lisp, which is well rehearsed elsewhere by the likes of McCarthy and Steele, Graham's Roots of Lisp paper presents a conceptual walk-through of the birth of Lisp, with McCarthy's notation translated into Common Lisp code, and along the way he provides a good description of the primitive Lisp forms, which are function calls or macros, before arriving at a detailed explication of an eval function written in Lisp. Graham describes Lisp's elegant syntax and notation, and key terms such as expression, form, list and atom. Alongside the six other primitive operators, the quote operator, which has obvious parallels with quotation in the English language, is well described as functioning to distinguish data from code. The lambda notation for denoting functions is clearly elaborated, and with great elegance, Graham whips out a surprise Lisp eval, written using functions built from only seven primitives. Further functions can be elaborated and evaluated using this eval, which can readily be transformed towards a contemporary Lisp, and thence bent towards implementations which can easily furnish abstractions such as object-oriented programming (OOP). Indeed, in his excellent ANSI Common Lisp, Graham shows as an exercise how a minimal OOP can be implemented in Common Lisp, without using CLOS (Common Lisp Object System) features. His preliminary language is implemented in just eight lines of code.
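Quote and lambda are easily illustrated; a quick sketch at any Common Lisp REPL:

```lisp
;; QUOTE distinguishes data from code:
(+ 1 2)    ; => 3         evaluated as a form
'(+ 1 2)   ; => (+ 1 2)   quoted: just data, a three-element list

;; LAMBDA denotes an anonymous function, here applied directly...
((lambda (x) (* x x)) 5)                ; => 25

;; ...or passed as a value like any other, here to MAPCAR.
(mapcar (lambda (x) (* x x)) '(1 2 3))  ; => (1 4 9)
```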
Under the mini OOP embedded language, improving the syntax of message calls to make them read more like Lisp brings us to the rather more complex meta-world of macros. Once again macros come courtesy of the uniform treatment of code and data as forms for manipulation. Graham has devoted the whole of the huge On Lisp volume to macros, which he considers of great importance within the paradigm of bottom-up programming, and macros are certainly an essential, if hard-to-learn, feature which allows for writing programs that write programs. Macros are quite simply functions which transform expressions, and they can themselves call other functions and make use of other macros; a heady brew indeed, whose power of transformation is unheard of elsewhere. To clear up any confusion, macros under Lisp have little to do with their C-based namesakes, which perform purely textual substitutions. Macros allow the language to play with its own readily accessible internals as data, and a good many Common Lisp operators are implemented as macros themselves. Understanding macros is one thing, and making good use of them perhaps even harder, but the definition that macros are simply operators implemented by transformation, together with a few example expansions, which can readily be tested with the macroexpand-1 function, should set the beginner on the right track.
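A minimal example of the idea (Common Lisp already provides unless; the when-not macro here is purely illustrative):

```lisp
;; A macro is a function from expressions to expressions: the
;; backquote template below builds a new IF form before the code runs.
(defmacro when-not (test &body body)
  `(if ,test nil (progn ,@body)))

;; MACROEXPAND-1 shows the transformation without running it:
(macroexpand-1 '(when-not done (print "working")))
;; => (IF DONE NIL (PROGN (PRINT "working")))

(when-not (> 1 2) 'yes)   ; => YES
```

Inspecting expansions this way, one macro at a time, is the standard technique for learning what a macro actually writes on your behalf.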
Though Lisp's history post-McCarthy does make for interesting reading, with colourful anecdotes peppering the story of computer science's most philosophical language and furnishing a classic narrative of riches to rags and back again, little of it is directly relevant to the contemporary Lisper, aside perhaps from intriguing material covering the hardware-implemented Lisp Machines and their associated development environments such as Genera, which few contemporary IDEs can even dream of competing with. It's also worth bearing in mind that, given the flexibility and extensibility which make it easy to create quite radically different dialects, Lisp should really be considered a family of languages rather than a language in its own right. And until the early-to-mid 80s the Lisp world was seriously splintered, with competing dialects and implementations proliferating. To address these issues, hardcore Lisp hackers gathered to standardise a new language, Common Lisp, which is the main Lisp in use today, alongside Scheme, an unusual, elegant dialect created by Sussman and Steele in the late 70s. Common Lisp is well specified in Common Lisp the Language, or CLtL for those in the know, authored by Guy Steele. ANSI standardisation for Common Lisp followed a few years later.
Thus, one of the first choices facing the novice Lisp coder, before even considering free implementations, is whether to ride with Common Lisp or Scheme. There can be no easy answer, and the question has probably fed more flame wars in both communities than any other issue. Researching the culture of both dialects can throw interesting light on theoretical issues under both languages, and it's relatively easy to grasp the fundamental differences in feel and approach. Scheme does have a particularly interesting history, and its creation is considered of seminal importance within the history of computing, resulting as it does from an attempt to understand Actors, Carl Hewitt's message-passing model of computation.
Scheme can be viewed as a more minimal language than Common Lisp, more elegant and crystalline, as opposed to the baroque, full-featured Common Lisp. But there are great similarities between the languages, and core features of Scheme such as continuations, which freeze the state of a computation for later use, can be approximated with macros under Common Lisp. Once again, it seems that languages cannot so readily be boiled down to a feature set. The relative size of Scheme is also an issue. Given that Lisp doesn't really bother about the difference between built-in functions and user-defined functions, it's a tough call to decide where the core language ends and library functions begin. Under ANSI Common Lisp, the piece of string is certainly seen as being a good deal longer than under Scheme, but to do any useful work Schemers may have to take on board some supplementary libraries. It is perhaps more fitting to investigate specific implementations, with PLT Scheme as a hot favourite on that side of the Lisp fence. Scheme does have a lot of things going for it, and contrary to the argument that Scheme's simplicity and beauty don't play well with real-world issues, it is totally possible to produce enterprise-level apps under Scheme. That said, the free software Common Lisp community, grouped around hearty resources such as cliki.net and Lemonodor, just seems much more active, though there is a good deal of crossover, with seasoned hackers turning their hands to both dialects. For the sake of simplicity, Scheme will remain outside the scope of these articles, with the caveat that an understanding of the conceptual underpinnings of Scheme, and of concepts such as continuations and lazy evaluation, can prove suitably enriching for any hacker.
Beyond this essential conceptual background, what the newbie sorely needs to know are the specifics of respected implementations and how to get up and running with these most efficiently. Sure, you can fire up, say, CLISP, a pleasant, simple Common Lisp implementation, straight from the command line and throw a few Lisp expressions at it, but to get some decent work done you'll need a more powerful implementation which integrates well with an editor such as GNU Emacs to form an efficient IDE. In part two, we'll pit SBCL against CMUCL, the two favoured free implementations, integrate these tightly with GNU Emacs using a touch of SLIME, throw packaging and packages into the mix and touch on embedding and extending with Lisps such as librep and functionalities such as FFI (Foreign Function Interface).
The Lemonodor effect
The Common Lisp community has certainly gone from strength to strength in the last two years, with blogs and wikis as the primary media for sharing enthusiasm, inspiration and information. There are blogs by and for newbies, and blogs from old hands such as Rainer Joswig and Daniel Barlow. However, the Slashdot of all Lisp weblogs, if such a thing can be imagined, is surely Lemonodor, clearly signalling that Lisp is the new rock and roll with a mix of hardcore news from the Lisp community, John Wiseman's artistic LA lifestyle reports, and a marked emphasis on Lisp within the field of robotics.
Lushly produced, with intriguing illustrations often loosely associated with the topic at hand, Lemonodor is a legend in the land of Lisp. And a mention by Wiseman of a site surely results in record traffic, if not of Slashdotting proportions. Even one of the best free Common Lisp IDEs, SLIME, makes mention of achieving Lemonodor fame in its startup routines. Inspired by Planet GNOME, Planet Lisp acts as a meta-blog, collecting essential content from a huge number of Lisp-related weblogs. Despite sometimes straying from the Lisp path into territory which may indeed rival the poor signal-to-noise ratio of Slashdot, Planet Lisp does make for a decent daily immersion in Lisp culture.
With plentiful RSS feeds and much cross-linking, the Lisp world is well connected, and the greatest resource of them all, cliki.net, collects many of these links and blogs and provides further essential resources. Cliki.net, powered by Daniel Barlow's CLiki engine, a Common Lisp wiki, is well worth checking out changelog-wise on a daily basis. The Practical Lisp page is a useful starting point for newbies, if somewhat dated in respect of SLIME use, but cliki.net does provide links or information on practically every aspect of Lisp culture and practice. Parodying or mirroring the evangelism of a religious Web site, another CLiki, the ALU (Association of Lisp Users) wiki, is packed with so-called "Road to Lisp" pages which demonstrate the fervent admiration which Lisp provokes. Unfortunately, the ALU wiki has recently been the subject of prolonged spam attacks, and its active life does appear to be in danger. Other CLikis of note include the *hyper-cliki*, an annotatable Common Lisp reference, and the wonderfully funky TUNES project CLiki, which outlines a range of projects and resources towards the creation of a free reflective computing system, or Lisp-based OS. Other essential online resources linked from cliki.net include the Common Lisp Cookbook, an attempt to create a community resource paralleling the Perl Cookbook approach, and both Successful Lisp and Practical Common Lisp, two essential online works.
See also the sequel to this article: The Road to Lisp
Practical Common Lisp
Common Lisp Cookbook