Worse is not better! March 11, 2010 | 11:36 pm

This is something that has been bothering me for a while. Never, with the possible exception of “Democrats are weak” (thank god we had a real conservative as president when we fought global fascism, and not some lily-livered, spineless liberal Democrat), have I ever seen a baseless criticism so eagerly adopted by the very people it criticizes as the Unix-hating diatribe The Rise of Worse is Better. And baseless sour grapes it is.

First of all, any system with enough size, complexity, and age to be interesting is going to have warts. It’s inevitable: all things made by imperfect humans in this imperfect universe are imperfect. And Unix has its share of warts, starting with the fact that creat(2) is spelled without the final e. This applies to Lisp as well! Start with the mess that is equality. But the existence of warts does not prove that the system lacks elegance in its design and beauty in its core principles. This applies to Unix as well as it applies to Lisp.

Oh, wait, you were talking about the platonic ideal of Lisp that exists in your head (and nowhere else)? Yes, your vaporware is better than my decades-old shipping systems. Bully for you.

Corollary to Greenspun’s 10th rule: any sufficiently powerful I/O library is reinventing half of Unix’s pipes and filters design. Generally badly.
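The pipes-and-filters design the corollary refers to is just small programs composed over stdin/stdout. A minimal sketch (the sample text is made up for illustration):

```shell
# Count distinct words: each stage is a small filter that reads
# stdin and writes stdout, composed with pipes.
printf 'to be or not to be\n' \
  | tr -s '[:space:]' '\n' \
  | sort \
  | uniq \
  | wc -l
# prints: 4
```

Any I/O library that grows filtering, buffering, and composition features is re-deriving this, usually without the clean process boundaries.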

But even more to the point, whether a given behavior is a feature or a misfeature (“wart”) can only be determined in its relationship to the rest of the system; it cannot be determined in isolation. Let me give you an example, indeed, let me use Richard Gabriel’s own example. He says:

Two famous people, one from MIT and another from Berkeley (but working on Unix) once met to discuss operating system issues. The person from MIT was knowledgeable about ITS (the MIT AI Lab operating system) and had been reading the Unix sources. He was interested in how Unix solved the PC loser-ing problem. The PC loser-ing problem occurs when a user program invokes a system routine to perform a lengthy operation that might have significant state, such as IO buffers. If an interrupt occurs during the operation, the state of the user program must be saved. Because the invocation of the system routine is usually a single instruction, the PC of the user program does not adequately capture the state of the process. The system routine must either back out or press forward. The right thing is to back out and restore the user program PC to the instruction that invoked the system routine so that resumption of the user program after the interrupt, for example, re-enters the system routine. It is called “PC loser-ing” because the PC is being coerced into “loser mode,” where “loser” is the affectionate name for “user” at MIT.

In Unix terminology, what he’s talking about is that in Unix, when you’re in a system call and you receive a signal, after the signal handler returns the system call exits as well (with errno set to EINTR). According to Richard, this is just obviously wrong, and more than that, so terribly and blatantly wrong that by itself it suffices to condemn all of Unix.

And I note that the story is about as well sourced as your average Washington Post article.

But let’s consider this feature in terms of its relationships with the rest of the system. More specifically, let’s consider what would happen if we “fixed” this feature. Currently you can slap a timeout on almost any system call (or sequence of system calls) in this manner: call the alarm(2) system call, which arranges for your process to be sent a signal after so many seconds, and then perform the system call. If the system call completes in time, another call to alarm(2) with a timeout of 0 disables the alarm. But if the system call takes too long, the alarm fires off a signal, which kicks you out of the system call. If we were to fix this “misfeature”, we would have to add timeout arguments to every single system call that could block in some implementation. Like Lisp, Unix has been reinvented and reimplemented many times over the years, so you would have to cover not only the current implementations but possible future implementations as well. Or simply resign yourself to waiting for the result to come back, even if that takes forever.

I had to say “almost any system call” above because, I am sad to say, there are some system calls that block the process so completely that signals can’t break it out of waiting for them. NFS, I’m looking at you. That is a legitimate wart with no redeeming purpose. But it also serves to demonstrate why this “misfeature” isn’t really a misfeature at all.

Another point I’d like to make is that Lisp is a programming language, while Unix is an operating system. How is comparing them not comparing apples and oranges? Because Lisp tolerates nothing else. It’s a programming language that wants to exclude everything else, including the operating system, so it has to be an operating system as well. From the very beginning, Unix supported two different languages: C and sh. You want to talk about warts, let’s talk about sh, but the important effect this had was to inculcate a distinction between operating system and language. Which is why most programming on Unix today is done not in C or sh, but in C++, Java, Perl, Python, Ruby, Haskell, even Lisp.

Yes, Lisp. Because I misspoke: it’s not really Lisp that has a problem with everything not-Lisp, it’s Lisp advocates. Unix has probably done more to promote Lisp than Lisp ever did. Consider: the Unix text editor, to the extent that there is one, has to be vi. The core of vi (vi is just the visual interface to the ex editor, which is itself an extension of the original ed) is as bound up with the history of Unix as C is, if not more so. Obscure command names like grep, tr, and sed make sense if you know vi/ex/ed (and if Lisp advocates wish to hold obscure names up as a problem with Unix, might I remind them of car and cdr, core Lisp functions named after registers of a machine that stopped being built before I was born).

But vi is probably not the most popular editor on Unix these days. That honor belongs to Emacs, which is a port of a Lisp development environment/operating system interface. Unix, with Emacs, is bringing Lisp to millions. And Clojure, Lisp on the JVM, is bringing it to yet more people. You might think this would be cause for celebration. Think again.

Normally, being insulted by Lisp programmers wouldn’t faze me much (or at all). But what really gets my goat about this essay is how it’s been adopted by other people as a license to suck. The logic, such as it is, goes like this: “this guy wrote an article about how Unix sucked and Lisp didn’t, and that’s why Unix succeeded and Lisp didn’t. Well, we’re going to succeed even bigger than Unix, because we suck even worse! We rock because we suck!” No. Stop. Wrong. Worse is not better.

  • http://meekostuff.net Sean Hogan

    Ha ha :)

    I suspect the language of the future will be an unexpected but elegant merging of unix and lisp that will be easily parallelizable and provide the simplest expression of complex problem domains. (And I’m not even joking or stoned)

    By the way, the universe is not (fundamentally) imperfect – it’s just complex.
    Here’s a perfect universe : 0
    It’s yours, you can keep it.


  • http://enfranchisedmind.com/blog/posts/author/candide/ Robert Fischer

    When I read that Loper OS post, I thought it was satire. It reads like something I’d write on April Fool’s Day, if I felt like taking a shot at Lispers. Surreal.

  • http://www.loper-os.org S. Datskovskiy

    I am the author of the much-hated “Thumbs Down For Clojure” post.
    I would like to confirm that it is not, in fact, a satire.

    I actually believe Clojure to be a massive technological step backwards.

The language is marred by a number of ugly compromises (lack of reflectivity; weird syntactic warts; lack of tail recursion; lack of reader macros; spewing of Java stack traces; many others) for dubious gain (still less than ~1/2 the runtime speed of Common Lisp). None of this would bother me if the language didn’t have a massive cult following, consisting of people who have never used Common Lisp (much less a Lisp Machine) and actually believe that Clojure has “picked up the torch” of Lisp.

    Judging by the sheer vitriol that my little post continues to attract even now, these criticisms are not baseless.

    Slightly off subject: there is a Symbolics Lisp Machine emulator floating around the Net. I highly recommend giving it a spin. See what a non-braindamaged operating system is like, just once in your life. Unix (and all of its cheap imitations) is a 1970s OS well past its sell-by date. Demonstrably technologically superior alternatives exist – have existed – for decades.