as simple as possible...
… but not simpler.
Einstein's (attributed) quote has become an aphorism, taken for granted by every mathematician or physicist i've ever met (to mention two kinds of people i've often worked with). One would expect the same attitude from a community that coined the term 'no silver bullet', and yet, since i got into computer science, first for fun and later on for a living, i've found lots of people with, er, a different viewpoint. Take for instance this excerpt from Lisp is sin, a widely cited and discussed article by Sriram Krishnan:
In Visual Studio, we look at 3 distinct categories of programmers. We call them Mort, Elvis and Einstein - the 3 personas we take into consideration whenever we create any developer technology. What is difficult for some of us geeks to comprehend sometimes is - all 3 of them are equally important. When you look at technology like Windows Workflow Foundation, one of the driving forces was to let non-geeks build software. Not everyone needs to be a Raymond Chen or a Dave Cutler. Not everyone needs to understand the difference between the various GC algorithms. However, everyone needs the ability to be productive. And everyone needs to be able to get software working without needing a CS degree.
We cannot afford to restrict software development only to those who know Windows kernel internals or those who can understand what a continuation is. It's not that other people are not smart - they just have better things to do. That's the key piece of understanding I find missing sometimes.
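(As an aside, since the continuation is the canonical scary concept here: there is less mystery to it than the quote suggests. A continuation is just "the rest of the computation", reified as an ordinary procedure. A minimal sketch in Scheme, my example, not Sriram's:

    ;; call/cc passes the current continuation to its argument.
    ;; Invoking that continuation abandons the rest of the body and
    ;; makes the whole call/cc expression return the given value.
    (define (safe-div num den)
      (call-with-current-continuation
        (lambda (escape)
          (if (zero? den)
              (escape 'division-by-zero)   ; bail out early
              (/ num den)))))

    (safe-div 10 2)   ; => 5
    (safe-div 10 0)   ; => division-by-zero

That's all there is to it: an early exit, generalized.)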
Nonsense, if you ask me. And yet, i've been hearing this same argument, in different guises, time and again since i got into computer science. Let's apply the same line of reasoning to other disciplines, and see how well it fares:
Hey Albert, your General Relativity is awesome but, you know, with all that jazz about differential geometry and curved spacetimes, it's too hard; we're not as smart as you, pal, so we'd better use Newtonian or Aristotelian mechanics to calculate those GPS satellite orbits and get going with other important things we need to do. Hope you understand, Albert.
Well Santiago, your ideas about neurons and surgery sound pretty deep and mystifying, but please, think of the Galens among us: we don't have the time to investigate every new fad, and, anyway, we wouldn't understand it if we did. Know what? We'll keep using our good old cures and stay away from untrodden paths. Our healing parchments are a bit of a hack, but they get the job done… most of the time, that is.
Does it make any sense? Now, maybe you think that i am exaggerating, and that the comparisons above stretch the point a bit too far. If so, take a second to look back at the people who made your nice computing environment possible. Take a look at Charles Babbage's visions; read about Alan Turing and Alonzo Church or John von Neumann; admire the elegance of McCarthy's original LISP (1960), a taste of which is sketched below; prepare to be surprised by the things the people in Douglas Engelbart's Augmentation Research Center were doing during the sixties; try to find a modern drawing program that matches Sketchpad's algorithms (or see it in action in this presentation by Alan Kay); follow the fascinating development of the overlapping windows interface, hand in hand with Smalltalk's history back at Xerox PARC, and do it from the horse's mouth; feel the thrill of the people who went beyond Xerox's bigwigs' shortsightedness and on to making a dent in the universe: it was 1984, the same year the Lisp machine wars culminated in the creation of the GNU project, which was all about ideals, about empowering people, about freedom. When you're done, tell me whether i'm going overboard in drawing parallels between computer science and physics or medicine!
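(About that elegance: the heart of McCarthy's paper is a Lisp evaluator written in a page of Lisp. What follows is my own loose transcription into modern Scheme, with lexical closures and if standing in for the original dynamic binding and cond, so take it as an homage rather than the 1960 text:

    ;; An environment is an association list of (symbol . value) pairs.
    (define (eval* exp env)
      (cond ((symbol? exp) (cdr (assq exp env)))      ; variable lookup
            ((not (pair? exp)) exp)                   ; self-evaluating atom
            ((eq? (car exp) 'quote) (cadr exp))       ; (quote x)
            ((eq? (car exp) 'if)                      ; (if test then else)
             (if (eval* (cadr exp) env)
                 (eval* (caddr exp) env)
                 (eval* (cadddr exp) env)))
            ((eq? (car exp) 'lambda)                  ; delay evaluation,
             (list 'closure exp env))                 ; capturing the env
            (else                                     ; application
             (apply* (eval* (car exp) env)
                     (map (lambda (a) (eval* a env)) (cdr exp))))))

    (define (apply* f args)
      (if (and (pair? f) (eq? (car f) 'closure))
          (let ((params (cadr (cadr f)))              ; formal parameters
                (body   (caddr (cadr f)))             ; function body
                (env    (caddr f)))                   ; captured environment
            (eval* body (append (map cons params args) env)))
          (apply f args)))                            ; host primitive

    ;; ((lambda (x) (* x x)) 5) evaluates to 25:
    (eval* '((lambda (x) (* x x)) 5) (list (cons '* *)))

A whole language, fitting in your head. That's the legacy we're talking about.)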
All those people had a vision, a dream, and pursued it with an amazing display of hard work, stubbornness and intelligence. They took no prisoners, and by the late eighties had pushed that new thing called, for want of a better name, Computer Science to its modern standards.
Then winter came. Not just the AI winter. Compare the swift pace of CS development between 1960 and 1980 with the subsequent advances in the field. We're using the same metaphors, the same kinds of systems, that we inherited from those guys and gals. Why, we even envy the power of Lisp Machines these days. It's been a long, cold winter for CS. And the main reason was the appearance of the mentality i'm criticising in this post, what Alan Kay aptly calls, in a recent interview, a pop culture of computers:
Perhaps it was commercialization in the 1980s that killed off the next expected new thing […] But a variety of different things conspired together, and that next generation actually didn't show up. One could actually argue—as I sometimes do—that the success of commercial personal computing and operating systems has actually led to a considerable retrogression in many, many respects. You could think of it as putting a low-pass filter on some of the good ideas from the '60s and '70s, as computing spread out much, much faster than educating unsophisticated people can happen. In the last 25 years or so, we actually got something like a pop culture, similar to what happened when television came on the scene and some of its inventors thought it would be a way of getting Shakespeare to the masses. But they forgot that you have to be more sophisticated and have more perspective to understand Shakespeare. What television was able to do was to capture people as they were. So I think the lack of a real computer science today, and the lack of real software engineering today, is partly due to this pop culture.
Dead on, i say. People advocating making programming simpler than possible are the hallmark of this pop culture. And when corporate and economic interests enter the picture, things get even worse. The Lisp is sin essay goes on to say:
I frequently see on Slashdot "Windows is designed for stupid users". That is quite insulting to the millions of moms and dads, teachers and lawyers and people from other walks of life who use Windows or even the Mac. If we mandated that every new user understand Windows' command line syntax or Emacs, we would have failed as an industry - we would have locked out the rest of the world.
In my opinion, this totally misses the point. There's nothing wrong with making computers simpler for users. On the contrary, that's probably what this endeavour is all about. Alan Kay saw it; Apple took heed with its 'computer for the rest of us' mantra. But it does not follow that there must be a CS for the rest of us. Making all this amazing technology possible takes effort, and needs a high level of sophistication. Alan didn't try to create systems usable by children by inventing PHP. He created Smalltalk striving to improve Lisp, he studied Piaget and Papert, he has degrees in maths and biology. And he needed all that, and then some.
The (trivial) point i'm trying to make is that not everybody has what it takes to be a programmer, just as not everybody can be a singer or a painter (as an aside, i tend to agree with the opinions that link programming and art). As a matter of fact, good programmers are rare, and need quite a peculiar combination of skills and talents. Donald Knuth has put it far better than i could in the essay Theory and Practice, II (from his Selected Papers on Computer Science):
The most important lesson [after developing TeX], for me, was that software is hard; and it takes a long time. From now on I shall have significantly greater respect for every successful software tool that I encounter. […] Software creation not only takes time, it's also much more difficult than I thought it would be. Why is this so? I think the main reason is that a longer attention span is needed when working on a large computer program than when doing other intellectual tasks. A great deal of technical information must be kept in one's head, all at once, in high-speed random-access memory somewhere in the brain.
We don't solve the painter's problem by complaining that perspective is hard to grasp and that people had better use flat icons. In the same way, we shouldn't be calling for a trivialisation of CS, whether in academia or in the industry. The 'we would have failed as an industry' bit in the Sriram quote above is really sad: we're sacrificing an admirable legacy in the name of industry and corporate profit. The most remarkable feat of our current industry leaders is to have convinced the rest of the world that having software systems that eat incredible amounts of resources and explode without reason every now and then is part of an acceptable, even top-notch, technology. Fortunately, other disciplines show far more respect for the people who, ultimately, are paying their wages.
If you've got this far, you already have one of the qualities needed to become a programmer: stamina. You'll need more. Be prepared to study hard, to learn maths, to live in abstract worlds. If you feel that you have "more important things to do", well, that's all very well, but don't ask the rest of us to dumb down the subject so that everybody can be a programmer. Lisp is not a sin. The sin would be to betray the dreams, ideals and hard work of the people that have taken us this far. We owe that to them, and to ourselves.
To end this never-ending diatribe, let me add a couple of things: first, i should apologize for taking Sriram as the scapegoat for a long-honed rage: his essay contains many good points worth reading and mulling over; second, i hope you're not thinking this is just an arrogant rant by an old fart: i'm not that old.