Requiem for Bell Labs, Center 1127

The cultures that produced modern technology are like history itself: largely dead or transmuted beyond recognition. There are those among us who believe modern technology has no need of institutions, states, governments, or the tenuous support of organised religion.

It remains for us to find the determinants of modern technological development. We should also be prepared to accept that those determinants have changed over time. The first half of the 19th century saw some astonishing developments, from the commercial development of the steam engine to mechanical calculators and the inevitable breech-loading cannon.

What is surprising about all of them is that they depended not so much on the prowess of individuals leading groups of workers and engineers (engineering only became a socially acceptable profession during the Napoleonic era in France), or even on the talents of inventors, as on the dissemination of expertise across large segments of the population. The principles behind all these marvels of technical ingenuity, and in some cases their implementations, had been thought of before, but the knowledge was lost and the skills faded simply because the necessary infrastructure and personnel could not be educated and paid for.

Aliens Have Their Own Languages

Computer science and its resulting engineering practices are very different, since its progenitors ended up inventing not only a new culture but also the medium within which they could express their concepts far more fluently than on paper. Computer scientists and programmers invented their own languages, like LISP, C and formal notations, and their own media, like the Internet and blogging.

The standards that other engineering professions took centuries to produce have emerged faster in computing than in any other profession in history: architects and civil engineers needed thousands of years and several collapses of civilisations before provably sound structural standards appeared. Complaints about the practices of programmers are less and less justified, simply because programmers are working with better tools and quickly acquire better practices through the large amounts of open source code available globally.

Computers had to rely on hardware whose complexity astounded and frustrated its creators from the moment the first machines started operation. The first computers were clearly the result of either academic efforts in the US or of military and government-operated laboratories, particularly after the Second World War. Zuse's Z3, the first Turing-complete machine ever built, was the great exception to the rule, but Zuse's straitened financial circumstances were as much a result of political priorities as of the short-sightedness of the research policies of a German state that was in the process of eliminating much of its elite on ethnic and political grounds.

Big Science, Bit Engineering

The post-war period was different: large government projects and incipient interest from mid-size and large corporations were fuelled by reconstruction, the Cold War rivalry, and the overwhelming sense that engineers and scientists were going to solve the problems of the world. Techno-utopias trickled down via television and the burgeoning sales of science fiction, which soon made up 10 per cent of all fiction sales in the English- and French-speaking world. Computers and their iconic value as central repositories of data, and indeed as sources of benevolent management, were concepts that were as yet unquestioned.

Only a few thousand people had access to computers in a commercial setting, and it was fairly unlikely that they would use them for small-scale purposes like text processing. Researchers had different priorities and usually shared access to early mainframes, a strong motivation to develop time-sharing systems. That we would all use computers to write letters and send them across world-wide networks was a dream, but hardly more than that. The notion of global networking was tied to telecommunications; the sharing of data on a large scale was a pipedream, even though hypertext had been theorised about since the late 1940s.

What Research Was For

The culture of research was dominated by two concerns. The first always was, and still is, blue-sky research, done for its own sake. It is a bit difficult to remember today, when governments and corporations interfere in the running of universities for the sake of so-called markets, that universities used to be places where education was directed towards the formation of minds, not the latest corporate fads. The ability to conduct independent research was considered a result of that education, even though the more costly areas of scientific research were already beginning to be dominated by the second concern, military and commercial interests: John von Neumann's earliest computer, built on the architecture that bears his name, was used to predict the shape and impact of the earliest American H-bomb detonations. Both industry and the Pentagon were prominent among his backers although, and this is a point that cannot be repeated too often, the computer's specifications were freely accessible from the start. Surprisingly, few historians of computer science seem to be aware of that fact.

That Rare Thing, A Programmer

But the late 1960s changed all that. At the time, industrial research and its corporate incubators were not that well known. Although today we tend to be more aware of IBM's research labs and Google's incipient attempts to play a major role in computer science research, we tend to forget that the early history of computer science played out very much in a non-academic context, simply because what we would consider computer science today had neither an institutional nor an intellectual identity. Internationally, universities only began to produce computer science graduates in the early 1960s - and even by the early 1970s, a PhD in computer science was considered a rarity. Much of the body of theory underpinning modern computer science, including modern automata theory and relational algebra, did not even exist in its modern form then: Cook's theorem dates from 1971, and database theory did not find any applications before the early 1970s. Algorithmics as a separate body of knowledge was only worked out in the 1970s and 80s, producing a stream of results ultimately collected and systematised in Knuth's The Art of Computer Programming.

Some programming paradigms were, of course, very much a product of even later developments, and are still in the process of maturing. Indeed, when staffing a computer science research lab, corporate managers, who often had science backgrounds themselves, felt that a background in physics or electrical engineering was more relevant than specialisation in any area of computer science. Up to the late 1980s, many computer scientists would have been hard-pressed to define a dividing line between mathematics and computer science.

In other words, the major backers of modern computer science - defence organisations, secret services, international corporations and, to a lesser extent, academic scientists - founded departments within industrial research laboratories that developed larger software systems, simply because software needed to be written by groups of programmers, not by lonely geniuses in cramped backrooms. They did so by relying on the intellectual and material resources available among mathematicians, physicists and electrical engineers. There was a certain influx from mathematics departments, although, given the importance of mathematicians to cryptography and information theory, industrial laboratories did not become a major employer for them until much later. It helps to take into account the very strong bias of working mathematicians against computers, which they tended to regard as irrelevant to their work: automatic theorem proving was worked on at MIT, but was not practical until the early 1980s.

Why Researchers Don't Like Engineers

The precise reasons for these developments are not very well known and are the subject of much controversy among historians of computer science. Some research laboratories, however, ended up in the spotlight, simply because some of their systems had a major impact on the industry and, in some cases, formed the basis of modern communications. We mentioned the von Neumann machine, but its institutional setting was an anomaly: the Institute for Advanced Study (IAS) in Princeton and its personnel were less than thrilled about having to share their precious cafeteria with engineers. Needless to say, many of the "engineers" were mathematicians, and on occasion ex-philosophers, in disguise; some of them were quite capable of appreciating the presence of a Kurt Gödel in their midst. His reactions have not been recorded, but the IAS administration was rather sniffy about it all.

IBM's research laboratories were rather less research-oriented at the time than they are today: early attempts at producing operating systems suffered less from insufficient academic input than from management problems, a state of affairs that provided one of the managers, Fred Brooks, with much material for the now legendary "The Mythical Man-Month".

Xerox PARC was a fairly curious creation: like so many research organisations founded by large corporations, its researchers produced miracles, while management actually expected something more pedestrian. Laptops, object-oriented programming, Ethernet, wireless networking and computer science education all found their most fervent supporters, and some working applications, there. British research organisations suffered from similar problems: GCHQ in Cheltenham sat on an early version of public key cryptography, and the theory of object-oriented and practical functional programming was worked out at universities and companies across the land, without any appreciable impact on the IT industry within the UK.

Although the UK government, and in particular its tax authorities, enthusiastically adopted computers in the 1960s, programming remained a low-prestige occupation. Few researchers followed Alan Turing's lead in AI theory; indeed his name, and his astonishing influence on the very foundations of computer science and on the course of World War II, remained practically unknown outside the circle of working computer scientists until the early 1980s.

Why Telcos Need Boffins

Which leaves us with that "other" fulcrum of modern computer science research: telecommunications companies. It is almost impossible to imagine today, but there was a time when computers were regarded as rather superfluous to the work of telecommunications and media companies. RCA partly funded the early von Neumann machine, but Bell Labs, which had no involvement in it, was hampered in its computing endeavours under its early monopolistic umbrella, since it was constrained for legal reasons from independently pursuing new computer and programming projects.


This state of affairs did not stop the company, however, from taking an active interest in MULTICS (Multiplexed Information and Computing Service), a time-sharing operating system initiated by ARPA's Information Processing Techniques Office in 1965. AT&T, the Massachusetts Institute of Technology and General Electric all signed up to create a system that would enable hundreds of users to work on a central server.
MULTICS was delayed for a number of years, although once it was finally released, it powered mainframes for a quarter of a century, until the last installation was shut down in October 2000. MULTICS throws the dilemma of military-commercial-academic collaboration into sharp relief: the reactions to the inevitable delays were extremely different. Bell pulled out in 1969, leaving the field to MIT and GE.

Bell Labs At Its Height

Inadvertently, Bell had just created the conditions for one of the most popular, if not the most important, technological and scientific developments spawned within the culture of Bell Labs: the team initially working on MULTICS redirected its efforts to create a new operating system that in part mimicked MULTICS in emasculated fashion, leading to the often misunderstood pun of "Unix".
Bell Labs was probably one of the most intriguing places to do research in the world, short of dodging Soviet party cadres at the Moscow Academy of Sciences under Brezhnev - even though many contemporary commentators seemed to assume that industrial research was on its last legs.

Bell Labs pulled out of MULTICS, leaving some of the main programmers with little direction. Center 1127, as the group was eventually known (originally simply department 127, until a Labs-wide reorganisation in the early 1980s), consisted of some of the most hallowed names in computer science research; but we should recall that back then they were hardly more than recently minted PhDs with divergent specialties. It is also interesting that some of them seemed far more interested in producing applications than in what we would call operating systems or kernels.

Operating systems as we know them are actually a fairly recent invention: most computers had so little memory and computing power that for all intents and purposes they had to be considered application-specific devices. Operating systems only make sense if various applications, or their users, compete for scarce computing resources. The ability to share can of course be written into software or engineered into hardware: early Unix implementations dealt with this problem in various ways.

Geniuses Galore

The roles of Dennis Ritchie, Ken Thompson and Brian Kernighan in the development of Unix are extremely well known, so we should use this opportunity to focus on some of the unsung heroes of Unix development. Al Aho, the man who made regular expressions ubiquitous in Unix applications, is a good example. He was a member of Bell Labs' mathematics centre, and found Unix development a better way to write papers about automata theory: he discovered early on that much of the published work on automata theory and formal languages wasn't very well done technically, and he set about improving and straightening out many papers on automata theory published during the 1950s and 60s. He went on to become the A in the pattern scanning language AWK.

Automata Theory, Compiler Practice

When he started working with the Unix group under the auspices of its manager Doug McIlroy, he used the insight - natural to anyone with a nodding acquaintance with formal language theory - that automata as classified in the Chomsky hierarchy make it perfectly natural to parse programming languages. It is also possible to construct parsers automatically, provided the results of automata theory are adhered to strictly. Yacc, written by Steve Johnson, and lex, written by Mike Lesk together with an intern named Eric Schmidt, with major theoretical input from Aho, the resident automata theory expert, made the automatic construction of lexers, parsers and, building on both, compilers possible: as Aho was fond of pointing out, yacc turned compiler building from an exercise worthy of a PhD thesis into a pleasant diversion lasting barely a week.
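The division of labour that lex and yacc automate - a regular-expression lexer feeding tokens to a grammar-driven parser - can be illustrated with a hand-written sketch in Python. The token names and the toy grammar below are invented for illustration, not taken from any actual yacc specification:

```python
import re

# Lexer: regular expressions (a regular language) are enough to split
# the input into tokens -- the job lex automates from a specification.
TOKEN_RE = re.compile(r"\s*(?:(\d+)|(.))")

def tokenize(text):
    tokens = []
    for number, op in TOKEN_RE.findall(text):
        tokens.append(("NUM", int(number)) if number else ("OP", op))
    tokens.append(("EOF", None))
    return tokens

# Parser: the nested structure of expressions needs a context-free
# grammar -- the job yacc automates.  Toy grammar (illustrative):
#   expr : term (('+'|'-') term)* ;   term : NUM ;
def parse(tokens):
    pos = 0
    def term():
        nonlocal pos
        kind, value = tokens[pos]
        assert kind == "NUM", "expected a number"
        pos += 1
        return value
    result = term()
    while tokens[pos] in (("OP", "+"), ("OP", "-")):
        op = tokens[pos][1]
        pos += 1
        result = result + term() if op == "+" else result - term()
    return result

print(parse(tokenize("1 + 2 + 3 - 4")))  # prints 2
```

A yacc user writes only the grammar rules and actions; the generator produces the state machinery that the hand-written functions above stand in for.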

Modern compiler building has progressed since, but yacc and lex had such an influence on compiler construction that many modern languages are first prototyped using yacc or bison, its GPL-ed successor, before more recent advances in code optimisation or the merging of lexing and parsing are applied.

Within the context of Unix development, this meant that languages like PL/I, in which MULTICS had been programmed, and FORTRAN were no longer a necessary evil: C could simply be extended to fit a new purpose, and programmers in and outside the Unix world went on to build languages like Tcl and Perl.

G(awk)ing at Text

Peter J. Weinberger, the W in AWK, was another mathematician who realised that the core of computer science needed alignment and recovery from the period of unstructured development between the 1970s and 1980s. By 1977, the Unix group had changed its emphasis from being a group of filesystem specialists trying to solve problems relating to multi-user access to a group spearheading the development of a fully-fledged operating system for major industrial customers. The culture was still unique: essentially a group of highly educated equals who wrote and fixed code in daily - and sometimes nightly - rituals. Weinberger did not want to change the filesystem's fundamental structure or the way in which new programming languages were designed: he realised that it was not entirely obvious how to extract information from Unix files, then process and structure that information, without building huge command lines. In a way, AWK represents a slight departure from the Unix toolbox philosophy, since it combines various tasks into a subdomain and dedicates what amounts to a programming language to it. But it was recognised even then that certain task sequences lent themselves to treatment by domain-specific or scripting languages - the latter term not being in evidence before the mid-1980s.

AWK used the magic of regular expressions to scan for patterns, but since regular expressions in their mathematical formulation permit neither recursion nor, therefore, memory, the results of pattern-scanning operations had to be structured and saved to something that resembled database reports. File I/O entered the realm of Unix tools, and building text processing tool chains became a much more rational affair. Weinberger later wrote a FORTRAN I/O library, of which he seems less proud than of his other endeavours.
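AWK's pattern-action model, and the associative arrays that feed its end-of-input reports, can be approximated in a few lines of Python. This is a loose, hypothetical sketch of the idea, not AWK's actual implementation:

```python
import re
from collections import Counter

# Roughly the AWK program:
#   /error/ { count[$1]++ }
#   END     { for (k in count) print k, count[k] }
# A pattern is tested against each input line; on a match, the action
# updates an associative array, which the END block turns into a report.
def report(lines, pattern=r"error"):
    counts = Counter()
    for line in lines:
        if re.search(pattern, line):        # the pattern ...
            counts[line.split()[0]] += 1    # ... triggers the action
    return dict(counts)                     # the END-block "report"

log = [
    "daemon error: disk full",
    "kernel warning: ignored",
    "daemon error: disk still full",
]
print(report(log))  # prints {'daemon': 2}
```

The regular expression does the scanning, while the dictionary supplies the memory that pure regular languages lack - exactly the combination the article describes.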

Doug Doesn't Like MULTICS

The long-time manager of the Unix group, Doug McIlroy, has been at pains over the years to point out that while Unix filled the hole left by MULTICS, it was by no means certain that the researchers and programmers would form a group building an operating system: after all, building an operating system was not the goal of theorists who happened to be programmers. At the time, they were not even working in the same department. Neither was it particularly encouraged by Bell Labs management, who were afraid that they would end up with another MULTICS-sized multi-year project without tangible results. Like MULTICS, Unix became a multi-year project, but one that produced results very early.

The culture of research, rather than the pressure to produce tangible commercial results, led to the creation of Unix and more than one of its derivatives. BSD Unix was another, although strictly speaking an academic, attempt at creating a free Unix variety, one that incorporated the TCP/IP stack while it was still in its experimental stage.

Unix was an Accident

Unix was also the result of a culture that took ideas from wherever they could be found: although text processing and databases were not commercially valuable projects at the time the first Unix applications were created, no-one stopped the researchers and programmers from spending years perfecting their formatting programs and regex engines. The first Unix text formatter was roff; troff followed, along with preprocessors like eqn for mathematical formulae - and Unix and Linux have come a long way in text and document processing since then. Neither was it terribly important to the designers of Unix whether large numbers of users could be persuaded to adopt the system as their main operating system. It was far more important to Center 1127 that they produce an operating system that ran on as many platforms as possible, a design decision that led to parts of the OS being written not just in two but in three programming languages: PDP assembler, B and C. B was a BCPL dialect, although it never became a system standard. C was designed with portability in mind, although it retained some assembler-like capabilities.

Neither did Unix become the final stop for Center 1127. Although Unix was an advance for its time, it had always suffered from hold-overs and concerns that mattered in the 1960s and early 70s: the fact that pipes counted as a major advance in handling I/O between programs points towards this. Unix was a file-oriented OS, and started its life as an attempt to fix the abandoned MULTICS filesystem. But by the mid-1980s, life as the Unix programmers knew it had changed beyond recognition: time-sharing systems were no longer the latest fad; minicomputers and the (at the time) much-maligned PC architecture were on the horizon. TCP/IP had been implemented, and academic networks were running on early Ethernet and similar networking standards.
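The pipe mechanism that Unix made famous simply connects one process's standard output to another's standard input. A minimal Python illustration, assuming POSIX `printf` and `tr` binaries are available on the system:

```python
import subprocess

# A sketch of the shell pipeline `printf '%s' ... | tr a-z A-Z`:
# the first process's standard output is connected, via a pipe,
# to the second process's standard input.
def shout(text):
    producer = subprocess.Popen(["printf", "%s", text],
                                stdout=subprocess.PIPE)
    consumer = subprocess.Popen(["tr", "a-z", "A-Z"],
                                stdin=producer.stdout,
                                stdout=subprocess.PIPE)
    producer.stdout.close()  # let the consumer see EOF when producer exits
    out, _ = consumer.communicate()
    producer.wait()
    return out.decode()

print(shout("unix pipes"))  # prints UNIX PIPES
```

The elegance of the original idea is that neither program knows, or needs to know, that the other exists; the kernel buffers the bytes in between.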

Instead of compute servers driving fairly dumb terminals, workstations became the standard, and the problem of graphics, with all its consequences for user interfaces, was added to the now-traditional Unix architecture.

Plan 9 From Bell Labs

Unix was not built to handle all these problems at once, and despite undoubted advances in processor speed, new approaches were needed. The new operating system coming out of Center 1127 was called Plan 9, after a science fiction movie reputed to be the worst ever made. Plan 9 was conceived as a distributed operating system that used many specialised servers underneath an interface whose only metaphor was the file. Sitting in front of a Plan 9 terminal is a fairly unusual experience, since all resources, including graphics, individual applications and whatever the network makes available to each and every user, map not to one machine but to many.

Some of the old Unix group were still at Bell Labs, and indeed some programmers still work at the same location to this day. But Plan 9 in some ways proved to be the swan song of the original Unix group. Plan 9 development continues, but Center 1127 ceased to exist in mid-1995.

Ken Thompson

Ken Thompson spent almost his entire professional life at Bell Labs. Right after he earned a master's degree in electrical engineering in 1966, he joined Bell Labs to work on MULTICS. If there is a father of Unix, as opposed to its godfather, Doug McIlroy, Thompson is probably the one who deserves the accolade.

He started to rewrite the filesystem inherited from MULTICS in 1969, creating almost by accident the system known today as Unix. Unix originally ran on a PDP-7 and was ported to a PDP-11 in 1971. The bureaucratic struggle to get a more powerful machine, while management strove to stop Unix from becoming another MULTICS-like never-ending project, has been recounted many times; what remains astonishing is the friendship and openness among the programmers who put Unix together. There was little sense of hierarchy, even though pipes came pretty close to being imposed by managerial fiat. It must be one of the more fortuitous management decisions in IT history.

Although C wasn't really his idea - Ken Thompson was a compiler expert, not a language designer - he rewrote Unix in C in 1973, and it was ultimately his version of Unix that made BSD Unix possible. Thompson was a visiting professor at UC Berkeley in 1975/76 and brought the tapes with him.

Later he was responsible for the creation of Plan 9, and his creations also include UTF-8, now the standard character encoding for Linux internationalisation.

He also developed a world-beating chess machine and shortly afterwards received the ACM's Turing Award. Ken Thompson retired in 2000 and took a fellowship with Entrisphere, a company founded by a Plan 9 and Inferno programmer, Phil Winterbottom; he has since left Entrisphere for Google.

Dennis Ritchie

Dennis Ritchie is that rare creature in computing: the self-effacing genius. He designed and implemented C, the language in which Unix and almost every other operating system of note (except OS/2) has been coded. He took a PhD in mathematics at Harvard and joined Bell Labs in 1967 to work for the mathematics centre. He became a member of the Computing Sciences Research Centre and has worked there ever since.

Ritchie usually credits others with pushing developments forward, and often describes his own involvement as more or less accidental. After creating C and influencing much of the standards work that went into C89/90, he slowly withdrew from actual programming and became manager of the Systems Software Research group within Bell Labs. He oversaw both the Plan 9 and Inferno efforts, for which he apparently signed some of the cheques. Further involvement has been rumoured, but he has downplayed his actual role. That Plan 9 never quite became the success many of Bell Labs' programmers would have wished for probably had a lot to do with the difficulty of changing the world of operating systems without huge numbers of volunteers or an enormous budget to begin an IT revolution.

Much of Unix's success was due to the fact that it was not only written in a programming language that made portability easy but, once rewritten in C, was disciplined and modular in structure.

Ritchie was rather heavily involved in the production of most Unix manuals, and co-wrote the original book on C, known as K&R (Kernighan and Ritchie). His writing style influenced generations of authors of programming books and technical writers.

Frank Pohlmann



The W in AWK is Peter J. Weinberger, not Stephen C. Weinberger.

Ken Thompson left Entrisphere and is now at Google.

Correction acknowledged

Many thanks. The correction will appear in the article on Monday!



more corrections

yacc was written by steve johnson (heh, steven c, actually) not al or doug. lex was written by mike lesk and an intern named eric schmidt.

center 1127, not unit 1127 (and it was originally just 127 until a labs-wide reorg in the early 80's).


I conflated Steven C Johnson and Peter J Weinberger for some reason. Doug suggested the yacc project, to my knowledge.

Overall responsibility was with Doug, but Al Aho quite definitely had major input in lex and yacc. In fact he was the residing automata theory expert at Bell Labs! The documentation for yacc and lex was written by the people you mention, so as to the actual code, I or someone else would have to check who was actually writing which code and at what time.



Peter Weinberger (not Stephen)

K&R book

One of the most concise, best written language guides I have ever found.

Funny story: my wife took a C course at a local state college. She came home with the book, and I laughed out loud. It was a typical college book: heavy, two and a half inches thick, with oversized pages of coated paper, with loads of color. The title was something like "C Programming with Breadth". In addition to a very detailed description of C, with loads of helpful problem sets, there were color photos and short descriptions of how computers help us in our daily life (the "breadth", I suppose).

After I stopped laughing, I showed her my copy of K&R and told her it was all you need to learn C.
