Microsoft is a Y2K problem. Here’s why.
Some people have speculated that Microsoft may finally ‘get it
right’ with Windows 2000, which would have dire consequences for
the Linux phenomenon. In other words, if Windows 2000 turns out to be a reliable, high-quality operating system (unlike its predecessors Windows 3.1, Windows 95, and Windows 98), then Linux
will fade in importance, since it will no longer have weak
prey.
These people are operating under a fallacious assumption. In this writer’s opinion, Microsoft does not know how to get it right, and so it never will. It is a question of culture — the myriad
intertwined set of human activities behind any technological or
social movement.
I’m 45 years old and have been programming for thirty of those years. I have been programming professionally using Microsoft products for the last nine years (since Windows 3.0) and have programmed on UNIX more or less continuously since 1976, when UNIX was entering its seventh year of evolution. I think Linux is the archetypal quality UNIX of our time, but my comments apply equally to FreeBSD and to the culture of UNIX in general.
The main difference I see between Linux and Windows is that with Windows, one is always having to deal with poorly specified APIs (Application Programming Interfaces) and poorly written documentation that mixes marketing hype with fact and is always, always incomplete.
Open a page of Windows technical documentation and you will see statements like: “When you open a form bound to a record set, the fields on the form will display the values in the current record.” OK, fine, but what if the record set is empty and there is no current record? (I.e., there is no antecedent for the anaphoric reference ‘the current record’.) Well, good luck. The documentation is mute on the subject. This is a distilled, stylized version of a case I ran into using Microsoft Access.
It is typical, not rare. Always, without exception, with Microsoft specs one is forced to play a guessing game and do what could be dubbed ‘programming by experimentation’. True, with any system, some degree of learning is accomplished by guessing and experimenting, and any specification ever written contains some fine points which are subject to interpretation, i.e., guessing. But with Microsoft products, one has to guess all the time.
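To make the point concrete, here is a minimal sketch, in the Access flavor of Visual Basic, of the kind of guard one learns to write only by experimentation. The procedure name, the table name, and the message text are all hypothetical; the question the documentation never answers is precisely what a bound form will show when the record set comes back empty.

    ' A minimal sketch, assuming an Access VBA module with DAO available.
    ' The procedure, the "Orders" table, and the message are hypothetical;
    ' the point is the guard one only discovers the need for by experimenting.
    Sub ShowCurrentRecordGuard()
        Dim rs As Recordset
        Set rs = CurrentDb.OpenRecordset("SELECT * FROM Orders")

        If rs.BOF And rs.EOF Then
            ' Empty record set: there is no "current record", and the
            ' documentation never says what a bound form will display here.
            MsgBox "No records to display."
        Else
            rs.MoveFirst
            ' ... work with the fields of the current record ...
        End If

        rs.Close
    End Sub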
When I first broke open the shrink wrap on Visual Basic 5, I was not prepared for the fact that the language had been modified, that it was not backward compatible with VB4, and that once I recompiled my current VB4 projects under VB5, the system would convert my code to the new language and I would be unable to run my existing code under VB4! Yet that is exactly what happened. There is no published specification of the Visual Basic language, as far as I can tell.
Recently I needed to deploy the Microsoft Index Server for a client. My workstation runs NT Server, so I clicked on the Index Server Help menu to remind myself about what the Index Server does. Nothing happened. Click. Nothing. (Or maybe it complained that some help file was missing; I don’t remember.) I was not surprised. My NT system was, in Microsoft time, rather ripe, having
been installed about one year earlier. And I am used to things not
working in Windows, aren’t you? So I was facing the standard ‘do I
reinstall from scratch or do I upgrade’ dilemma that all Windows
users experience every year or so. I chose to upgrade, from NT
Service Pack 3 to Service Pack 4. I went out to the Microsoft Web
site and tried to decipher the marketing hype in order to determine
just what the Index Server consists of and how to upgrade it. It was like a Chinese puzzle trying to figure out which option pack and/or service pack to install first. I will spare you the details of my ensuing pain, but a couple of iterations of 80MB downloads and four or five passes through the installation script later, the process actually converged. I had my Index Server back. The user interface for web server administration was completely different from before, and Index Server itself had (and has) obnoxious bugs (cf. DejaNews, search on [“Index Server” problem]). But the Help menu did, now, work. A day gone by.
It is a question of feel: what the system ‘feels like’ when you try to perform system administration, when you want to program it, even when you simply want to use it.
Of course, it is not just the documentation. Basic design
mistakes abound. Let me give you a simple example which has been
with us since DOS. And note that even the newest Microsoft
operating system, Windows NT (‘New’ Technology — gag), has a DOS
command prompt window. On a DOS command line, when you type in a command, back up with the left arrow, and then start typing to make your correction, you overwrite the characters that were already there. With UNIX, you insert new characters at the insertion point, and the characters which were there are shifted to the right. The DOS decision is wrong because typically, when you make a typo, you wish to go back and replace N characters by M different characters, where M is not equal to N. Hence, you want to go back, delete the mistake, and enter the new characters. With UNIX you do just that. With DOS, you always have to hit the INSERT key to get into insert mode before you start typing; otherwise you end up typing over good text before you remember that your command line editor was designed by Bill Gates and Company and you have just clobbered some good part of the line you are correcting. All, and I mean all, Microsoft design is permeated with such slightly off-kilter conceptions.
It gets old.
After years of this, I realized something one day. DOS was
invented by BASIC programmers. The company Microsoft was founded by
BASIC programmers. They were programming with BASIC and foisting 8-character file name length limits on the world starting in 1980. In 1980, UNIX was already 10 years old, had long since introduced hierarchical directories and far more generous file name lengths, had multitasking, worked on 16-bit computers, did not have an idiotic 1MB memory limit, etc. etc. etc. What we have here is that some guys in Albuquerque got lucky, were good hackers (with BASIC interpreters), were great marketers, and took over the planet.
Now, I don’t have anything against BASIC, per se. It was my first programming language, in high school. It is good for doing lots of stuff. It dominated as it has because it is interpreted, and (like LISP and Perl) an interpreter is a very convenient thing to have around for writing small programs fast. OK, fine, BASIC has its place, and there is a reason that non-computer-scientists took to it like ducks to water.
But still, why are we stuck with Visual Basic in 1999? Well, for the same reason. It is easy to program the first 90% of a task, and Microsoft, over the years, has managed to hack in enough of a semblance of modern programming constructs that one can manage, via programming-by-experimentation techniques, to cobble together a solution.
Entire industries were born to remedy mistakes embedded within DOS: memory managers, file system administration tools, and myriad utilities were provided by a host of third-party companies who owed their very existence to those design mistakes. People who install Windows are constantly asking their mechanically inclined or engineering-savvy friends to help them reformat disks, reinstall programs, and find their way through the maze that is Windows, all as a favor, and at no cost to Bill Gates and Company. It could be argued that the cost to the economy in time lost circumventing and working around the frustratingly bad design of DOS and Windows far exceeds the personal fortune of Bill Gates.
We have left unanswered the question of why Microsoft has
succeeded so well in spite of these design flaws. Although that is
a separate subject, it deserves comment. Did Microsoft succeed
because they got product to market faster by cutting corners? Maybe
sometimes, but, in general, putting out a bad piece of software
actually increases the cost to the producer in the long run. I
think the real reason behind the success of Microsoft is what one
friend of mine calls ‘survival of the adequate’. If you can put
something out which has glitches but works, if it runs on the
Industry Standard Architecture (ISA bus), if it costs less than
products by and for yuppies (the Macintosh), if some genuinely good
applications are somehow made to run on it, then the cost-conscious
public buys it, naively (and understandably) expecting it to work.
Finally, by fiat you draft the community of programmers into a constant state of beta-testing your products, get them all to do programming by experimentation to figure out how to use your products and to develop further genuinely good applications for them, much as a rat figures out how to get through a maze, and voilà: world domination.
It used to work that way. But Linux is here now. Linux is a better operating system and, if my thesis is correct, it always will be. Linux also costs even less, so the public will like that. Many programmers have awoken to the con job which was pulled on them. Coordinated efforts to produce a good-looking GUI desktop with standards-based application interoperability are rapidly being deployed on Linux. If the applications start to come around, then goodbye system crashes and welcome Linux.
Will Windows 2000 solve the problem? I can only speak from
personal experience. I use both NT Server 4.0 (build 1381, Service Pack 4) and Linux (Red Hat 5.2, kernel 2.0.36). I mostly use NT Server as my workstation, but I do run it as a server to test some development. I run one Linux box both as a workstation and as a file, web, database, and SETI@home server. I am typing this very document into an Emacs running on Linux but displayed on an X server running on NT.
I have seen numerous situations where Internet Explorer or the
File Explorer crashes and takes NT down with it. Happened under
Service Pack 3. Happens under Service Pack 4.
In contrast, I have seen Netscape crash many times in Linux,
yes, but never did a program which crashed in Linux take
down the operating system with it. My Linux machine has been up for
over six months, across multiple reconfigurations of virtual IPs, installation of new versions of basic server software, and a switch from a dedicated analog modem connection to a DSL connection, without once needing to be rebooted.
If Windows 2000 really is Windows NT, then I’m not optimistic
that Microsoft is about to ‘get it right’.
I urge you to view Microsoft much the way you view the Y2K problem. A bunch of hacks were made many, many years ago which, over time, have been patched and grown over. Under it all is the corporate identity which, after all, comes from the originators of the corporation. Time buries the clunky designs, the inability to write good English, the seeming utter ignorance of the concept of a complete specification, and the arrogance of either ignoring, not caring about, or trying to outdo standards.
But the corporate identity, the corporate culture fostered by Microsoft, remains. The players change, but the game stays the same.
UNIX came out of the universities and the corporate research labs. It was originated by the brilliance of the likes of Ken Thompson at Bell Labs, extended later by Bill Joy and the U.C. Berkeley Computer Science department, and augmented and improved up through today by a cast of hundreds, if not thousands. Stallman, Torvalds, Gosling, and so many, so many good artist-scientist-engineers have made Linux, FreeBSD, Apache, and all the open source efforts the new paradigm — a different culture has coexisted alongside the commercial hacks of DOS.
The ‘feel’ one has when programming, using, or reading about UNIX is different from the feel of Windows because it is a question of that culture: all of the myriad underlying skills, the computer science background, the design decisions. It is hard to pin down to any one thing, or even to a finite list.
Windows 2000 will be a Y2K problem which cannot be fixed, because it carries with it all the trappings of the culture which has produced it over the past twenty years.
Linux, open source, Internet standards, and just plain good science may yet change the rules. Then, finally, we will be
playing a new game, and the referee won’t be Bill Gates.