Nowadays, it’s impossible to talk about desktop computers without Microsoft’s name coming up sooner or later. This company has thoroughly taken over that world: unless you’re some kind of bearded geek who avoids their software, if they tell your computer to commit suicide, it will.
The story of Microsoft begins with the emergence of personal computers, and with two clever guys called Bill Gates and Paul Allen who loved to mess around with computers. They were popular in the computing world at the time because they wrote useful software (tools that made writing other software much easier, a spreadsheet, a word processor…) for a low-cost computer called the Altair 8800 (which didn’t even include a keyboard or a monitor). When the people at IBM – who ruled a big part of the computer world back then – took an interest in micro-computers and started building what would become a big hit, the IBM PC, they were attracted by this popularity and licensed some of Gates’s software. And when they failed to get the major operating system of the day, CP/M, they went back to Gates and asked him if he could provide them with an OS. Gates found one, bought it, and hired its author to adapt it to IBM’s needs. MS-DOS was born.
The first releases of MS-DOS essentially provided the file abstraction (i.e. the ability to treat disk space as being divided into chunks of data called files, letting one ignore the actual physical structure of the floppy disk being used), some primitive tools to manipulate internal memory and text files (i.e. files containing text and numbers), and basic configuration routines – all of it stolen from the dominant OS of the day, CP/M. One used it by typing commands on a keyboard, pressing Return, and reading the resulting dump of text on the screen, an interface called the command line. However, it introduced a new commercial idea that was here to stay: selling an operating system bundled with a computer, betting that users won’t bother installing another one if something is already there and working.
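To make the command-line idea concrete, here is a toy sketch of a read-and-dispatch loop in the spirit of DOS’s command interpreter. This is purely illustrative: the commands mimic DOS’s `DIR`, `TYPE` and `DEL`, but the “disk” is just a Python dictionary, nothing like the real thing.

```python
# Toy sketch of a command-line interpreter in the spirit of early
# MS-DOS's COMMAND.COM: read a line of text, dispatch on the command
# name, print a textual result. The fake "disk" is a dict mapping
# upper-case file names to their contents (an assumption for brevity).

def run_command(line, files):
    """Interpret one command against the fake disk and return its output."""
    parts = line.split()
    if not parts:
        return ""
    cmd, args = parts[0].upper(), parts[1:]
    if cmd == "DIR":                       # list files, like DOS's DIR
        return "\n".join(sorted(files))
    if cmd == "TYPE" and args:             # dump a file's contents
        return files.get(args[0].upper(), "File not found")
    if cmd == "DEL" and args:              # delete a file
        files.pop(args[0].upper(), None)
        return ""
    return "Bad command or file name"      # DOS's famously terse error

disk = {"README.TXT": "hello", "GAME.EXE": "\x90\x90"}
print(run_command("DIR", disk))            # lists both file names
print(run_command("TYPE readme.txt", disk))
```

The whole interaction is text in, text out: that is all a command-line interface is, and it is why it ran fine on the modest hardware of the day.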
Subsequent releases added support for higher-capacity data storage, then directories (a kind of box which may contain files and other directories, and hence a hierarchical way to organize files), then more languages, then clocks (a chip whose job is to measure time, useful not only for calendar- and watch-like applications but also for real-time software) and even higher-capacity storage, then computer networks. After that, DOS 4 – ten times larger than the first release of DOS, at about 110 KB of memory usage (roughly 1/30 the size of a typical MP3 file) – introduced the DOS Shell, a new optional way to explore files that was a little less linear than the command line, and a direct ancestor of Windows’s file explorer.
Later releases of MS-DOS, besides adding support for yet more storage space (again…) through multiple hard drive management and a new file system, and handling more RAM, introduced three new tools: one to compress files in order to save disk space (at the cost of slower data access), one to check the hard drive for errors (caused by an improper shutdown, for example) and try to fix them, and a simple antivirus, MSAV.
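The space/time trade-off behind that compression tool is easy to demonstrate. In the sketch below, zlib merely stands in for DOS’s actual compression scheme (which it was not): the data takes less space on “disk”, but every read now costs extra CPU time to decompress.

```python
# Sketch of the trade-off behind DOS-era disk compression tools:
# compressed files occupy less disk space, but reading them back
# requires decompression work. zlib is used here purely as a stand-in
# for whatever scheme DOS actually used.
import zlib

text = b"The quick brown fox jumps over the lazy dog. " * 200
compressed = zlib.compress(text)

print(len(text), "bytes raw")
print(len(compressed), "bytes compressed")   # much smaller: the text is repetitive

# The data survives the round trip, at the price of extra CPU work on each read.
assert zlib.decompress(compressed) == text
```

Repetitive data like this compresses extremely well; the win is smaller (and the per-read cost relatively larger) on data that is already dense.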
As one may figure out, at this stage DOS wasn’t really evolving anymore. These last additions were secondary, questionable features, added mostly to sell new releases of the operating system and make money from it, plus maintenance work to keep up with new hardware. MS-DOS had reached a mature stage of evolution, and was somewhat left behind while Microsoft worked on its new product, Windows, which initially ran on top of DOS and which we’ll describe in a later article.
Now that we have described the features and evolution of MS-DOS, we can discuss them. A few points stand out:
- Most of the development effort on the hardware abstraction side went into file and storage space management: some releases of DOS took months or years of development just to add support for newer, higher-capacity diskettes, while OS support for other kinds of hardware improved very little. This may be partly explained by the fact that, at the time, everyone was fine with letting programs deal with the bare hardware. Hardware manufacturers made this much easier by documenting how to interact with their hardware, and by shipping hardware that was dead easy to drive for anyone used to assembly language. Standards were the rule rather than the exception, so if your program worked on one computer, you could be 95% sure it would work on other hardware without modifying it in any way. There wasn’t much viral software around to harm a machine given direct access to it, especially since people didn’t download and run random software from the internet – the internet didn’t exist for most people. On the other hand, performance was a critical issue, and no program runs faster on a given piece of hardware than one written specifically for it, driving it directly.
Part of that wasn’t true for storage hardware. First, there was no dominant storage medium, but rather a jungle of incompatible technologies (multiple kinds of tapes, various flavors of diskettes, the first hard disk drives…), all sharing the common characteristic of being awful to manipulate directly. Second, performance wasn’t such an issue for file storage: if you store something, it’s not to use it right away, and your programs don’t spend their time reading and writing files on a diskette when main memory is so much faster. Last, file storage and manipulation was about the only thing an average unskilled user – trying to use software rather than write it – was forced to learn, in order to find and run programs or to save text files to disk, so the process had to be as simple as possible. The DOS Shell, whose only purpose was to better visualize the hierarchical structure of directories and find files quicker, perfectly illustrates that.
- Little to no planning; features came as needed: let’s see… As DOS aged, two different file systems – ways to implement the file abstraction – were used, one after the other, called FAT12 and FAT16. FAT16 was introduced to get past FAT12’s maximum disk capacity. The two were kept extremely close to each other to maximize compatibility with older programs, so close that telling FAT12 and FAT16 apart is genuinely difficult, yet at the same time they are structurally incompatible. This is a typical example of a hack: a modification a developer introduces when he realizes he messed up, but doesn’t want to write a new design document and go through other such rigorous, tedious design work – he just wants to fix the one thing that doesn’t work. Perhaps the most common source of bugs in software is people forgetting about a hack (or never knowing about it) and pushing it to its limits.
At the time of DOS, they couldn’t plan ahead, since they had no clue how the computer business would evolve, so they introduced gradual changes, leading to an accumulation of hacks, and hence to a multiplication of bugs. Let’s anticipate the following articles by saying that this is one of the reasons that led to the later abandonment of DOS: managing, correcting, and more generally modifying it had become too complicated, since no one could fully understand how it actually worked. The operating system needed a complete rewrite.
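A back-of-the-envelope calculation shows why FAT12’s capacity limit bit so hard. Each entry in the file allocation table addresses one cluster of disk space, so the entry width caps how many clusters – and hence how many bytes – a volume can hold. The figures below are a simplification (the real formats reserve a handful of entries for bookkeeping, and cluster sizes varied), chosen purely for illustration.

```python
# Simplified illustration of the FAT12 -> FAT16 capacity jump:
# a FAT entry addresses one cluster, so wider entries mean more
# addressable clusters and thus bigger volumes. Cluster sizes here
# are typical illustrative values, not a spec; real FATs also reserve
# a few entry values, which this ignores.

def max_volume_bytes(entry_bits, cluster_bytes):
    # Number of addressable clusters times the size of each cluster.
    return (2 ** entry_bits) * cluster_bytes

fat12 = max_volume_bytes(12, 4 * 1024)    # 12-bit entries, 4 KiB clusters
fat16 = max_volume_bytes(16, 32 * 1024)   # 16-bit entries, 32 KiB clusters

print(fat12 // 2**20, "MiB")              # 16 MiB
print(fat16 // 2**20, "MiB")              # 2048 MiB, i.e. 2 GiB
```

Going from 12-bit to 16-bit entries multiplies the addressable cluster count by 16; combined with larger clusters, that is the difference between a volume measured in megabytes and one measured in gigabytes – without changing the overall structure of the format, which is exactly why the hack was tempting.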
From the history of DOS, we may extract the following keys to its success and subsequent fall:
- Always focus on what’s important first…: the reasons for this one are pretty obvious.
- …but don’t neglect the OS’s theoretical foundation and design: or, sooner or later, hacks will occur.
- Make the documentation simple, complete, and easily accessible: you’ll save people a lot of trouble, and prevent duplicated effort along the way. Hardware manufacturers knew that, at the time.
- Don’t neglect the average user: after all, there are a lot more users than there are developers, so you know which group drives the sales.
- Cleanness is the friend of reliability: hackish code always seems fine at the time, but it is the source of most of computing’s lack of reliability nowadays.
- Compatibility may be the enemy of cleanness if the software is poorly designed to start with: this will be even more obvious once we study the story of Windows.
- The pursuit of maximal performance may be an enemy too: there are times when only a hack will make a program faster. Though if I were you, I’d choose reliability and simplicity of design over performance, as long as said performance is sufficient.
With these last three points, one figures out that OS design is often made of dilemmas. This is not the last time I’ll be saying it. Thanks for reading!
Credits: information on the early history of Gates, Allen, and Microsoft comes from http://americanhistory.si.edu/collections/comphist/gates.htm
Information on the evolution of MS-DOS comes from http://fr.wikipedia.org/wiki/DOS