Community: Rebuttal of "RedNova: WinServer Vs. Linux Study Generates Fire"

Jun 14, 2005, 23:30
(Other stories by Peter Surda)

[ Thanks to Peter Surda for this article. ]

I read this paper on Sunday, before it was posted on LT. It says, among other things, "we are genuinely committed to an open and professional dialogue about technology." So I wrote a rebuttal and sent it to the authors via email. I thought making it public might make sense too, since there is nothing personal in it. So here it is, straight out of my outbox:

Hello guys,

I just read your paper "Evaluation No. 1: Enterprise Operating Systems" and immediately had the urge to reply and correct several mistakes.

First of all, however, let me state that I am not officially impartial: I have my own Linux distribution and provide commercial Linux services, and I have only limited experience with Microsoft Windows. I will nevertheless try to stay objective.

I would also like to say that not all of the Linux problems you mention were assessed wrongly; some of the criticism is very valid. However, I won't elaborate on those points, nor on areas where I don't have enough experience to make an objective analysis (such as LDAP). Please also keep in mind that I don't get paid for this, so I can't spend much time crafting this response; some parts may therefore not be completely clear or stylistically polished.

I would like to begin with the quotation:

"In addition to being a Microsoft Certified Systems Engineer, our Linux tester was also certified an LPI by the Linux Professional Institute. Most recently, he has spent the past five years as a Linux consultant..."

Several passages of the paper cast serious doubt on this claim (for example, both graphical remote administration and automated software distribution and maintenance are provided in both of the Linux distributions you mention and are easy to use; see below). As I have no reason to suspect the mistakes were intentional, the only conclusion left is that the person evaluating Linux in fact had very little practical experience with it.

Most of the problems were, in my opinion, caused by what I'd call "handling Linux as if it were Windows," i.e., using the same methodology, processes and infrastructure. Every experienced Linux administrator will tell you that this leads to failure; I have seen it happen many times. Many of the perceived problems were therefore not Linux's fault, but the fault of an organisation that thinks in Microsoft terms. To use Linux effectively, you have to think in Linux. The same holds in the opposite direction: if I wanted to set up and operate a Windows network, I would almost certainly fail, because I would want to treat Windows as if it were Linux. I have realized several times that this simply doesn't work, but only after wasting many hours of my time.

For starters, in many company-wide Linux desktop deployments it makes much more sense to use thin clients. Both of the Linux distributions you mention provide tools to do so (X, VNC), and there are additional products available (for example NoMachine NX). Using thin clients instead of "normal" desktops leads to dramatic cost reductions in both downtime and administration. I have been using a thin client as my primary "computer" for about 2.5 years, for both work and private use. The machine I usually use is a real thin client (specialized hardware), and this response was also written "on" it. Additionally, wherever in the world I am and regardless of what OS is installed locally, if I want to work I just SSH into my server and reattach my VNC session over the SSH tunnel. On Linux that means vncviewer and OpenSSH; on Windows it is also vncviewer, but with PuTTY providing the tunnel. So I am confident that thin clients should work in almost all typical situations.
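As a minimal sketch (assuming a standard vncserver/vncviewer and OpenSSH; the host name and display number are only placeholders), reattaching a session over the tunnel looks roughly like this:

    # on the server: start a persistent VNC session once (display :1 here)
    vncserver :1

    # from any Linux client: forward VNC's port (5901 for display :1)
    # through SSH, then attach to the session over the encrypted tunnel
    ssh -f -N -L 5901:localhost:5901 user@server.example.com
    vncviewer localhost:1

    # on Windows the same idea works with PuTTY's tunnel settings
    # (local port 5901 forwarded to localhost:5901) plus a Windows
    # vncviewer pointed at localhost:1

The session keeps running on the server between connections, which is exactly what makes the local machine interchangeable.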

Furthermore, you correctly point out that setting up and administering Linux requires more experience. I find it strange, however, that in light of this you didn't contact a professional. A professional can set up typical configurations much faster.

Another aspect of "Linux thinking" is automation. Experienced Linux administrators automate typical and recurring tasks, so that setting up new machines or administering larger numbers of them becomes cheaper. Linux has all the necessary tools to do so. These economies of scale only show up when you actually have such large numbers and know how to automate; they are not reproducible on a small scale. I know this from personal experience: I administer several dozen machines and co-administer many networks, the largest consisting of about 1,400 computers.
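To give an idea of what I mean, here is a rough sketch of such automation; the host list, the root login and the yum command are assumptions for illustration, not a recipe for any particular site:

    #!/bin/sh
    # apply the same maintenance step to every machine listed in hosts.txt
    while read host; do
        echo "updating $host"
        ssh -o BatchMode=yes root@"$host" "yum -y update" \
            || echo "FAILED: $host" >> failed-hosts.txt
    done < hosts.txt

The same pattern (a host list plus ssh) works for pushing configuration files, adding users, collecting logs and so on.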

In general, you can take advantage of this by outsourcing certain parts of your IT to a professional company that specializes in Linux and has the tools for automation. That way you save money and don't have to worry about understanding how LDAP or pptpd works.

Now to the specific points.

"Non-vendor support"
This seems to suffer from the same "Microsoft thinking" problem. The support is there, and in my experience it is far superior to anything else. I often communicate directly with the developers and we work together on fixing problems at a low level, usually within hours of noticing the problem. This isn't anything rare or new; it works with almost any open source project and has been working for years. Can you provide me an example where you found a bug in Microsoft's products and were able to talk about it directly with the responsible people (and, god forbid, they actually fixed it)? However, a lot of inexperienced Linux users don't know how to participate in this process properly, and hence don't reach solutions. They are either flooded with information they can't sort (as your paper mentions), or they ask the wrong people, in the wrong places, the wrong questions. I have already been thinking about this and am planning to create a document presenting the process to beginners in a way that helps them get the most out of it. It would consist of two parts: an analysis/description, and a checklist for those who wish to participate. This would help promote the community and clear up misunderstandings.

"Automated Software Distribution & Maintenance"
As I mentioned, both distributions provide tools for this. I have little experience with SuSE, but the tool you're looking for is YaST. I also haven't used a recent Red Hat product, but Fedora (on which Red Hat Enterprise Linux is based) provides no fewer than three tools: Red Hat Network, yum and apt. I have experience with all of these and didn't notice any problems. Other distributions have these or similar tools too; Trustix, for example, uses "swup."
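For illustration, typical invocations look something like the following; the exact module and package names vary per distribution, so treat these as examples rather than exact recipes:

    # Fedora / Red Hat style systems
    yum update                          # fetch and apply all pending updates
    apt-get update && apt-get upgrade   # apt as packaged for RPM systems
    up2date -u                          # Red Hat Network update client

    # SuSE
    yast2 online_update                 # or run the update module from the YaST menus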

"Installing & Configuring Printers"
I must agree to some extent that Linux printing is a mess. However, there is an easy workaround for typical cases: configure every client to print to a PostScript-compatible queue and let the server's spooler deal with the conversion. That way the client doesn't even have to know what kind of printer it is and doesn't need any drivers (PostScript is the default output of Linux applications). I have tested this several times and it works.
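A rough sketch of how this can be wired up with CUPS, assuming the server's queue is called "officeprinter" and the printer speaks raw socket printing (all names, URIs and the PPD are examples):

    # on the print server: define the real printer with its proper driver
    lpadmin -p officeprinter -E -v socket://192.168.1.50:9100 -m laserjet.ppd

    # on each client: point a queue at the server; no local driver needed,
    # the job is passed through as PostScript and converted on the server
    lpadmin -p officeprinter -E -v ipp://printserver.example.com/printers/officeprinter

Sharing also has to be allowed in the server's /etc/cups/cupsd.conf, but that is a one-time change.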

"Enterprise Administration"
As I said before, this whole area can be solved by using thin clients.

"Administration tools"
Apparently you missed Webmin, which should do exactly what you need and is web-based. I don't know whether it is included in the distributions you mention, but installing it isn't difficult. I'd also like to point out that an experienced Linux administrator has other options too; I personally don't use Webmin, for example.
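For what it's worth, a typical installation (assuming the RPM package from the Webmin site; the file name is only an example) is about two steps:

    rpm -U webmin-*.noarch.rpm
    # then browse to https://yourserver:10000/ and log in as root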

"Backup"
While I concede that the automatic configuration backup found in Windows has some merit, I personally don't have any problems backing up the Linux machines I administer (there are several dozen of them, scattered over two countries). As for "client backup," it isn't necessary with thin clients.
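As an illustration of the kind of unattended backup I mean (the paths, host name and destination are placeholders), a nightly cron job on the backup host can be as simple as:

    #!/bin/sh
    # pull configuration and user data from a remote machine over SSH
    rsync -az --delete -e ssh root@server.example.com:/etc  /backup/server.example.com/
    rsync -az --delete -e ssh root@server.example.com:/home /backup/server.example.com/

Add a tape or offsite copy on top of that and most of the "backup problem" disappears.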

"Desktop Security Management"
Besides the by-now-repetitive "use thin clients," in this area I sense "Microsoft thinking" again. If you want to disable certain client functions, simply don't install the software; easy as pie (in Windows you often don't have this option). A firewall on the client isn't required, because a properly set up Linux client doesn't run any services (besides SSH, which should be restricted to administrator access at install time, and X, which should be configured so that it isn't accessible from the network), and it isn't infected with spyware and worms. As for automated updates, as I already mentioned, there are several alternatives and you don't need any paid subscriptions for them. Activating them is easy; in fact Fedora does this already, so I assume Red Hat solves it in a similar way.
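To make this concrete, a locked-down client might look like the following; the account name and the service are examples, not a complete hardening guide:

    # /etc/ssh/sshd_config: only the administrator may log in remotely
    AllowUsers admin
    PermitRootLogin no

    # switch off services the desktop doesn't need (Red Hat style tools)
    chkconfig --list | grep ':on'     # see what is currently enabled
    chkconfig cups off                # example: printing goes via the server
    service cups stop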

"Configuring VPNs"
I admit that setting up pptp/pptpd isn't easy. But as I said before, when it is used on a large scale you can automate both the client and the server side and save time (my Linux distribution "Route Hat," for example, does this). Furthermore, there are other options that are easier to maintain, for example CIPE or OpenVPN, and there is the universal tool SSH, which can tunnel any TCP connection. None of these alternatives needs root privileges to use (as far as I remember, CIPE and OpenVPN require them for the initial setup, but that can be automated at install time). Activating and deactivating such a VPN can be bound to a script on the user's desktop, so that it is just a matter of clicking.
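A sketch of what such a desktop script might contain when SSH is used as the tunnel (the host names and ports are invented for the example):

    #!/bin/sh
    # "VPN up": forward the intranet web server and the mail server
    # over SSH; runs entirely without root privileges
    ssh -f -N \
        -L 8080:intranet.example.com:80 \
        -L 1143:mail.example.com:143 \
        gateway.example.com
    # the browser then uses http://localhost:8080/
    # and the mail client localhost:1143

A matching "VPN down" script only has to kill that ssh process.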

"Remote Server Management"
Repeat after me: there is graphical remote administration for Linux, and it has been around for many, many years. X is network-aware by design; the application doesn't care where you are sitting. VNC is more recent. Each has its advantages and disadvantages, and I personally use both. A security-aware administrator would of course tunnel either one over SSH or another VPN. The bottom line is that it is easy and typically works automatically ("ssh -X user@server application-name"). And of course there is the already-mentioned Webmin. I'd like to point out, however, that I personally very seldom administer servers through a graphical interface, so whether one is actually required depends on the administrator.

"...it was difficult to delegate server administration to non-root users..."
The tool for this is called "sudo," and it is present in basically all Linux distributions. Setting it up may require some expertise, but as I said before, think economies of scale. The Webmin I have already mentioned twice should be able to do this too.
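For example (the group name and the allowed commands are invented; the real file is edited with visudo), delegation entries in /etc/sudoers look like this:

    # members of group "helpdesk" may restart the web server and read its
    # logs as root, but nothing else
    %helpdesk  ALL = /sbin/service httpd restart, /usr/bin/less /var/log/httpd/*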

I have omitted several parts, as stated before, either because I agree with them or because I don't have enough experience in those areas.

In summary, I consider your paper usable to some extent, but it should be relabeled "If you want to use Linux but still think in Microsoft, this is how you will fail, so don't do it." Should you want to do another Linux evaluation in the future, I recommend hiring a real Linux professional, not just someone who holds a certificate.

Yours sincerely,
Peter

Related Story:
RedNova: WinServer Vs. Linux Study Generates Fire (Jun 13, 2005)