Monday, October 15, 2007


I've set up an alternate feed that splices my bookmarks with this blog's entries. You can find it here.

Saturday, October 13, 2007


"Formal methods will never have any impact until they can be used by people that don’t understand them."
Tom Melham

"The original study that showed huge variations in individual programming productivity was conducted in the late 1960s by Sackman, Erikson and Grant (1968). They studied professional programmers with an average of 7 years' experience and found that the ratio of initial coding time between the best and worst programmers was about 20 to 1, the ratio of debugging times over 25 to 1, of program size 5 to 1, and of program execution speed about 10 to 1. They found no relationship between a programmer's amount of experience and code quality or productivity.

Although specific ratios such as 25 to 1 aren't particularly meaningful, more general statements such as "There are order-of-magnitude differences among programmers" are meaningful and have been confirmed by many other studies of professional programmers (Curtis 1981, Mills 1983, DeMarco and Lister 1985, Curtis et al. 1986, Card 1987, Boehm and Papaccio 1988, Valett and McGarry 1989, Boehm et al. 2000)."

"If you look at the way software gets written in most organizations, it's almost as if they were deliberately trying to do things wrong. In a sense, they are. One of the defining qualities of organizations since there have been such a thing is to treat individuals as interchangeable parts. This works well for more parallelizable tasks, like fighting wars. For most of history a well-drilled army of professional soldiers could be counted on to beat an army of individual warriors, no matter how valorous. But having ideas is not very parallelizable. And that's what programs are: ideas."

"Software entities are more complex for their size than perhaps any other human construct because no two parts are alike (at least above the statement level). If they are, we make the two similar parts into a subroutine--open or closed. In this respect, software systems differ profoundly from computers, buildings, or automobiles, where repeated elements abound. [...]

The complexity of software is an essential property, not an accidental one. Hence, descriptions of a software entity that abstract away its complexity often abstract away its essence. For three centuries, mathematics and the physical sciences made great strides by constructing simplified models of complex phenomena, deriving properties from the models, and verifying those properties by experiment. This paradigm worked because the complexities ignored in the models were not the essential properties of the phenomena. It does not work when the complexities are the essence."
Fred Brooks
No Silver Bullet

"Architecture is intended to be facilitative, of course, in that a good architecture should enable developers to build applications quickly and easily, without having to spend significant amounts of time re-inventing similar infrastructure across multiple projects. [...]

But an architecture is also intended to be restrictive, in that it should channel software developers in a direction that leads to all of these successes, and away from potential decisions that would lead to problems later. In other words, as Microsoft's CLR architect Rico Mariani put it, a good architecture should enable developers to "fall into the pit of success", where if you just (to quote the proverbial surfer) "go with the flow", you make decisions that lead to all of those good qualities we just discussed. "

"The more interesting your types get, the less fun it is to write them down!"

"If you don’t know the difference between a group, a ring, and a field, you have no business overloading operators.

Now I’m not saying that one has to take a course in abstract algebra before you can be a competent programmer. You don’t, as long as the language you program in doesn’t support operator overloading (or as long as you’re wise enough not to use it if it does). However, since most programmers didn’t get beyond ODEs in college (if indeed they got that far–some of my comp sci friends struggled mightily with calculus and had to retake it repeatedly), one can’t responsibly design a language that requires mathematical sophistication in the 99th percentile for proper use."

Elliotte Rusty Harold
Operator Overloading: Trickier Than it Looks
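As a quick illustration of why those algebraic laws matter (my own example, not Harold's): even the built-in overloads of `+` already break laws that programmers instinctively assume hold. A small Python sketch:

```python
# Float '+' is not associative: rounding error breaks (a + b) + c == a + (b + c).
left = (0.1 + 0.2) + 0.3
right = 0.1 + (0.2 + 0.3)
print(left == right)   # → False

# String '+' is associative but not commutative: strings under '+' form a
# monoid, not a group, so there is no sensible "subtraction" of strings.
print("ab" + "cd" == "cd" + "ab")  # → False
```

If the standard overloads are already this subtle, a user-defined `+` that quietly violates the laws its syntax suggests is that much worse.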

"You used to start out in college with a course in data structures, with linked lists and hash tables and whatnot, with extensive use of pointers. Those courses were often used as weedout courses: they were so hard that anyone that couldn't handle the mental challenge of a CS degree would give up, which was a good thing, because if you thought pointers are hard, wait until you try to prove things about fixed point theory.

All the kids who did great in high school writing pong games in BASIC for their Apple II would get to college, take CompSci 101, a data structures course, and when they hit the pointers business their brains would just totally explode, and the next thing you knew, they were majoring in Political Science because law school seemed like a better idea. I've seen all kinds of figures for drop-out rates in CS and they're usually between 40% and 70%. The universities tend to see this as a waste; I think it's just a necessary culling of the people who aren't going to be happy or successful in programming careers."

Saturday, October 06, 2007

Four books

Another blog post, still no inspiration for anything creative, so let's just rehash that old bloggers' recipe of stuffing some "cultural" reviews in a post and hoping it passes for content. Excited yet?

Concepts, Techniques, and Models of Computer Programming

First up is Peter Van Roy's "Concepts, Techniques, and Models of Computer Programming". Don't let that big title scare you away. The book is pretty hefty in itself, but don't let that scare you either; it is a great read. But what is it about, you may ask? Well, CTM - as it is affectionately called - could sit comfortably on the "programming paradigms" shelf, alongside Sebesta and Kamin: books that aim to take the reader on a stroll through the computing Zoo, allowing him or her to gaze in awe at the strength of higher order functions, be amused by the quirkiness of dataflow variables, marvel at the elegant logical predicates lying under a sunny...

Ok, I took the metaphor too far, sorry about that. What I was trying to say is that CTM doesn't limit itself to enumerating paradigms, accompanying each with a brief description and a couple of examples and leaving it at that. Van Roy's text goes further, discussing in reasonable depth the programming techniques applicable to each computation model (the authors prefer to avoid the term "paradigm") and, more than that, advising the reader on how best to integrate them.
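For readers who haven't met dataflow variables before, here is a rough sketch of the idea - mine, not the book's, and in Python rather than CTM's Oz: a single-assignment variable whose readers block until someone binds it.

```python
import threading

class DataflowVar:
    """Single-assignment variable: read() blocks until bind() supplies a value."""
    def __init__(self):
        self._bound = threading.Event()
        self._value = None

    def bind(self, value):
        if self._bound.is_set():
            raise ValueError("dataflow variable is already bound")
        self._value = value
        self._bound.set()

    def read(self):
        self._bound.wait()   # implicit synchronization: wait for the binding
        return self._value

x = DataflowVar()
threading.Thread(target=lambda: x.bind(42)).start()
print(x.read())  # → 42
```

In Oz this synchronization is built into every variable, which is what makes declarative concurrency feel so natural there; the class above only mimics the flavor.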

The technical approach that enables this leveling is to describe the models in terms of a kernel language that is expanded throughout the book. Each chapter shows how the kernel language needs to be augmented to support the required features, how it is interpreted by an abstract machine and what syntactic sugar can be added on top of the kernel to ease programming.
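To make the syntactic-sugar idea concrete, here is a toy analogy in Python (my own; CTM's kernel language is a subset of Oz): a "function" can be treated as sugar for a kernel-level procedure that binds its result to an extra output argument.

```python
# Kernel form: only procedures exist; a "function call" is sugar for
# calling a procedure with a fresh output cell and reading it back.

def square_proc(x, out):         # kernel-level procedure, no return value
    out.append(x * x)            # "bind" the result to the output cell

def call_as_function(proc, *args):
    """Desugar a function-style call into the kernel procedure form."""
    out = []                     # fresh, unbound output cell
    proc(*args, out)
    return out[0]

print(call_as_function(square_proc, 7))  # → 49
```

The book plays this game in earnest: each convenience of the full language is shown to be a mechanical translation into the kernel, which keeps the abstract machine small.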

It would not be a fair review if I didn't relate at least one negative point, but it is a minor one. I think the approach to logic/relational programming would be more representative of the usual intent if the language were more predicate-and-fact based. Or, to put it in other words, I like the Prolog syntax better than the "Relational Oz" one. As the authors explain, both approaches are semantically equivalent in their core, so I'm nitpicking. Overall, I can safely say that I recommend this book. It is, if you pardon the cliché, an eye-opener, making it clear that the "mainstream" imperative and stateful programming model is but one of many equally significant alternatives.

Engines of Logic

If you've ever been subject to any formal instruction in computing (or "informatics" or Information Systems or whatever), you probably had to endure at least one lecture on the "history of computing", which usually amounts to a lengthy enumeration of machines. If you were particularly unlucky, it started with some blabber about the abacus back in who-the-fuck-cares AD, and it invariably went on to spend a great deal of time discussing punched cards and looms. Yeah, freaking looms! I'm sure Joseph Marie Jacquard was a swell guy and all, but is a rudimentary mechanical input system all that important in the grand scheme of things? My answer, of course, is no. As Dijkstra put it: "Computer science is as much about computers as astronomy is about telescopes". And that is why Engines of Logic is such a great little book: it seeks to give an account of the history of ideas that culminated in modern computing.

We see how Leibniz's utopia of a machine to automate human reasoning, up to the point of forever settling all disputes and intellectual arguments, evolved into a series of formal mathematical systems for "calculating with thoughts" (mathematical logic) at the hands of such great men as Boole, Frege, Cantor, Gödel, and others, culminating in the notion of "universal computers" and their actual realization. The book reads like a good popular science work, with amusing biographical anecdotes scattered throughout the nine chapters. However, contrary to many works in this genre, Engines of Logic is not afraid of stating formulas and proving theorems when deeper insight is required*. Check out a small excerpt from the chapter on David Hilbert for a sample of the lighter side of the book:
During my own graduate student days in the late 1940s, anecdotes about Göttingen in the 1920s were still being repeated from one generation of students to the next. We heard about the endless cruel pranks that Carl Ludwig Siegel played on the hapless Bessel-Hagen, who remained ever gullible. My own favorite story was about the time that Hilbert was seen day after day in torn trousers, a source of embarrassment to many. The task of tactfully informing Hilbert of the situation was delegated to his assistant, Richard Courant. Knowing the pleasure Hilbert took in strolls in the countryside while talking mathematics, Courant invited him for a walk. Courant managed matters so that the pair walked through some thorny bushes, at which point Courant informed Hilbert that he had evidently torn his pants on one of the bushes. "Oh no," Hilbert replied, "they've been that way for weeks, but nobody notices".
Also of note in the paragraph I quoted is the personal touch given at times by the author, Martin Davis. He is a theoretical computer scientist, with the distinction of having been at Princeton back in the 1950s, in the company of chaps like John von Neumann, Kurt Gödel, Hermann Weyl and Albert Einstein. As an author, Davis is probably best known for more technical books on computability and complexity. But please, make no mistake: this is emphatically not an academic textbook; it goes to great pains to clearly explain subtle concepts like Cantor's diagonal method, achieving a balance between rigor and ease that is hard to come by**.
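For the curious, the diagonal method mentioned above fits in a few lines (a standard sketch, not Davis's exact presentation):

```latex
Suppose $f\colon \mathbb{N} \to \{0,1\}^{\mathbb{N}}$ enumerates infinite
binary sequences. Define the ``diagonal'' sequence $d$ by
\[
  d(n) = 1 - f(n)(n).
\]
Then $d$ differs from $f(n)$ at position $n$ for every $n$, so $d$ is not in
the range of $f$: no enumeration lists all binary sequences, and hence
$\{0,1\}^{\mathbb{N}}$ is uncountable.
```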


1984

It is sad that I only got around to reading this book now. "Now" meaning late 2006, as these reviews are a little bit behind schedule... Anyway, as I'm having a hard time finding worthy adjectives, I guess something I could say is that after finishing 1984 I felt utterly stunned. It is powerful and it is important, so put it on your reading list if you haven't already.

A final observation is that the edition I'm linking to - a combined printing of Animal Farm and 1984 published by Harcourt - is cheap and pretty good. The preface is signed by Christopher Hitchens.

Snow Crash

I'm getting lazy (well, lazier) so this will be short: good book, so-so plot, so-so characters, awesome ambiance.

* To be fair, some of the trickiest proofs for non-crucial topics are left to endnotes. Still, those notes are far easier to read than most academic mathematical tomes.
** Off the top of my head, I can only think of Nagel and Newman's book on Gödel's proof.