Monday, September 03, 2007
Concurrency and the demand for computing
I find myself thinking along similar lines with regard to concurrency. For the sake of analysis, let's split the space of applications into server-based and client-based. Members of the first group basically deal with responding to requests coming over the network. This means there is a naturally high degree of parallelism, and, of course, this has been exploited for a long time. The typical scenario is some serial business application code atop a middleware platform that handles threading and I/O. So on the server front, the “multicore revolution” will have little impact on most software development efforts. Desktop software developers don't have such luck: the era of surfing on Moore's Law is really over. And so what? The way I see it,* raw computing has ceased to be an important bottleneck; long gone are the days of watching a crude hourglass animation while the CPU labored away. Not that we do any less waiting now, it's just that these days we spend our time waiting for the network.
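A request-per-thread sketch makes that "serial code atop threaded middleware" point concrete. This is a minimal illustration in Python; `handle_request` and `serve` are hypothetical names standing in for the business code and the middleware platform, respectively:

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical business logic: plain serial code, no threads or locks in sight.
def handle_request(name):
    return f"Hello, {name}"

# Stand-in for the "middleware platform": it owns the thread pool and the
# fan-out, invoking the serial handler once per incoming request.
def serve(requests, workers=4):
    with ThreadPoolExecutor(max_workers=workers) as pool:
        return list(pool.map(handle_request, requests))

print(serve(["Ann", "Bob"]))  # requests run in parallel; results keep order
```

The developer of `handle_request` never touches a thread; the parallelism comes for free from the volume of independent requests, which is exactly why extra cores are easy money on the server side.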
Anyway, maybe the concurrency boogieman is less scary than we think.
*(this is a blog, after all)
Monday, August 27, 2007
The Wealth of Software
What drew me into this weary debate is the historical perspective offered by Herb Sutter in this recent post. He argues that we are seeing a manifestation of a cycle where computing moves periodically between the center and the edges, a movement driven by an imbalance between resources:
"More specifically, it's the industry constantly rebalancing the mix of several key technology factors, notably:
- computation capacity available on the edge (from motes and phones through to laptops and desktops) and in the center (from small and large servers through to local and global datacenters)
- communication bandwidth, latency, cost, availability, and reliability"

While this seems reasonable enough, I think there is an element missing: he scarcely mentions the role of the applications that run on those systems. The article presents a purely supply-side analysis of the computing marketplace, to put it in "economic" terms. To illustrate the importance of the demand side, I expanded Herb's chronology table with important application classes of each epoch:
| Era/Epoch   | The Center                 |                            | The Edge                           | Apps                                                                        |
|-------------|----------------------------|----------------------------|------------------------------------|-----------------------------------------------------------------------------|
| Precambrian | ENIAC                      |                            |                                    | Military calculations.                                                      |
| Cambrian    | Walk-up mainframes         |                            |                                    | Huge business batch processing.                                             |
| Devonian    |                            |                            | Terminals and time-sharing         | Big business batch processing.                                              |
| Permian     |                            | Minicomputers              |                                    | Scientific computation, maybe? I don't know…                                |
| Triassic    |                            |                            | Microcomputers, personal computers | Spreadsheets, desktop publishing.                                           |
| Jurassic    |                            | File and print servers     |                                    | Departmental or small-business DB apps (think video rental service software). |
| Cretaceous  | Client/Server, server tier | Client/Server, middle tier |                                    | OLTP (for instance, bank account management).                               |
| Paleocene   |                            |                            | PDA                                | PIM.                                                                        |
| Eocene      | Web servers                |                            |                                    | Web portals (Yahoo!, Excite, …).                                            |
| Oligocene   |                            |                            | ActiveX, JavaScript                | Web-based apps (Hotmail, many ASP/JSP/PHP db apps).                         |
| Miocene     | E-tailers                  |                            |                                    | ?                                                                           |
| Pliocene    |                            |                            | Flash                              | Fancy web apps (Flickr, Google Maps).                                       |
| Pleistocene | Web services               |                            |                                    | Google Data, DabbleDB?                                                      |
| Holocene    |                            |                            | Google Gears                       | Now what?                                                                   |
Now, what do we fill in that last cell? What are the killer apps of the RIA platforms? There is no clear answer, but I see basically two niches that can be a good fit for the space: apps that handle audiovisual media (YouTube, Picnik, etc.) and apps that require rich modes of interaction (Google Earth). It's important to bear in mind that media-intensive operations are expensive all around, from server storage space to quality digital video cameras for the users. Also, in many cases AJAXian alternatives exist (see PX8N or any other Web 2.0 reflective-logoed startup on TechCrunch). As for the other niche, applications using novel user interaction features, it seems cool in theory, but apart from a handful of HCI journals there is very little action in this space nowadays. And that's probably good, because most attempts at UI innovation fall flat on their faces, as DHH eloquently argues in this podcast. All in all, skepticism is healthy as usual, but I don't see the door shut on a richer software landscape.