Silicon Dale

Parkinson's Law

It is a strange fact that through most of the past 50 years of computer development, the technology has always fallen just that little bit short of what is required to handle the problem in hand (whatever the problem might have been!). Doing multivariate statistics on geochemical data in the late 1960s and early 1970s, with a few hundred samples and 20 or 30 analysed elements, stretched the computers of the day to their limits. Yet those limits - maybe 32K bytes of memory and a few megabytes of disk space - today look laughably small, certainly far less than in a palm-top personal organiser. Yet if one were to attempt similar statistical studies today, on similar data volumes, one would find that the computer resources now required are, strangely, much larger than those of 30 years ago.

This phenomenon was first recognised during the 1970s, when the computer industry's market leader IBM released a succession of ever larger, ever more powerful mainframe computers in the 360/370 series. What was happening, of course, was that the operating systems were undergoing constant development and growth. New features were progressively added, but old features had to be retained (ostensibly) to ensure backward compatibility. Integration of new features into old software architectures itself led to problems, which had to be resolved by adding further software whose only purpose was to resolve the conflicts. Of course, the increasing processor demands made by the operating system required the introduction of more powerful processors and increasing memory and disk capacity.

The same cycle has been repeated with microcomputers. The simple, compact, and efficient CP/M system of the early 1980s allowed early development of systems such as Datamine on desktop machines which had only a little more power than the mainframes of the late 1960s, and which are laughably small in terms of memory, processor speed, or disk size even by comparison with today's ordinary home PC. However, one could argue that the biggest mistake in computing history was IBM's decision to contract the small software company Microsoft to develop an operating system for its new "personal computer" - the IBM PC. This system, known by IBM as PC-DOS and by third-party licensees as MS-DOS, started life looking very like CP/M. The major difference was that CP/M was written for 8-bit chips - the Intel 8080 and the Zilog Z80 - while MS-DOS was written for the 16-bit Intel 8086 chip. However, it is clear that Bill Gates was well aware of IBM's 1970s market strategy. He had clearly also seen that the computer hardware business is hard - it means actually making and selling boxes - while the software business is fundamentally different: production costs are negligible, so if your marketing is right, it is very easy to make huge profits very quickly. Incidentally, that is the same conclusion we came to in setting up Datamine: selling mining software was potentially much more profitable than selling consultancy services, since the R&D work need be done only once for many customers, leading in theory to unlimited margins, while there is a strict ceiling to the percentage returns that can be made from consultancy because of the finite number of hours in a day and days in the year.

Of course, the several versions of DOS were only the beginning of the story. The appearance of a new user interface, "Windows" - similar to ideas developed in the X-Windows and Motif GUI standards in the Unix world - opened the door to the development by Microsoft of rampant bloatware. During the whole of the 1990s, and continuing today with no sign of an end, the Windows operating system has steadily grown in size, complexity, and the demands it makes on hardware resources. The result is that a Fortran program which in 1970 would run happily (and fast) on an ICL KDF9 or a CDC 3600 mainframe now requires many times the memory and many times the disk space to run on a modern PC. The big difference, of course, is the relative cost of the machines. However, even a casual glance at Task Manager will show that an enormous proportion of a modern PC's capabilities is eaten up by the Microsoft operating system. Again, as with IBM in the 1970s, a lot of features are preserved to maintain backwards compatibility, and there are many other things in there besides. How about restoring some of the simple, reliable, and effective operating systems of the past? CP/M - although as an 8-bit system it suffered from address-space limitations, and was only a single-tasking operating system - was one of the easiest operating systems to learn and to use. Its predecessors, the PDP-11 DOS and RSTS from Digital Equipment Corporation, were equally simple and rather more powerful. For a multi-tasking system, it would be difficult to beat the original Unix. Maybe Apple have the answer in MacOS, a Unix derivative. Unix itself was developed originally in the 1970s, but its core remains very little changed after 25 years. Some of the commands are strange, but a little practice makes them familiar, and their oddness can actually help the user to remember them. There are many who believe - or at least hope - that there is a solution, in the form of a small, well-designed, simple operating system known as Linux.
This is an open-source implementation of Unix, and held (indeed still holds) great promise. However, there are signs that with increasing involvement of commercial vendors, Linux is already well on the path to bloatware, and with its open-source philosophy allowing anyone to extend it, this cannot really be controlled.

Maybe we just have to hope that Moore's Law (the doubling of computer power every eighteen months) continues to hold into the future, and that computer hardware capabilities stay ahead of the expansion of operating systems. Perhaps there is some hope for those of us who learned to program when saving a kilobyte or two was a necessity rather than a luxury. If operating systems continue to expand unchecked, there will again be a real premium on designing compactness into our applications. It's all very well using visual development environments - but they tend to be provided by those guilty of the operating system bloat, and they generate wastefully large application code. Hand-crafted code has a number of advantages, not least that if properly written it can be read and understood by people. I would argue that 100 lines of well (hand-)written code is infinitely better than 1000 lines of automatically generated Visual Basic or C++ to do the same job.

Stephen Henley
Matlock, England

Copyright © 2002 Stephen Henley
Parkinson's Law: Earth Science Computer Applications, v.18,no.3,p.1-2