Community Column: Microsoft Backdoors (a response to ESR's 'Reliance on closed source...')

May 17, 2001, 16:45

Opinions expressed by contributors to Linux Today are not necessarily those of LinuxToday's staff or management.


By Ian M. Johnstone-Bryden
Editor, Firetrench Monthly

I read with interest the posting by Eric Raymond and have much sympathy for his position, but I believe there is a danger in oversimplifying the situation in our enthusiasm for open sourcing.

When I started out over thirty years ago, the world was firmly proprietary and closed source. Over the years I have been involved in the ARPANet/Internet, the UNIX movement and the development of published standards (including ISO 9000, X.400/X.500, ITSEC and the Common Criteria), and I came to believe that Open Source was an important evolutionary step towards better, safer, more cost-effective information systems.

Over the years, I have observed the capacity of vested interests in governments and corporations to subvert open programmes in an attempt to control freedom and to secure marketing locks on customers. One of our major challenges in the open programme movement is to achieve a level of cohesiveness that can counter the efforts of these subversive organisations. So far the most effective weapon in our armoury is making technology available free of charge. Linux is a prime example, but the weakness is that even here commercial promotion is necessary to secure a strong place in the market for the technology (purists may question the relevance of a strong market position, but without it the likes of Microsoft would eventually drive Linux off the face of the planet).

We all need to generate some form of income from our work. Some of our community may be able to survive on token amounts of income, perhaps paid indirectly, but most of us have to buy food, clothes, housing and all the other necessities of life, paid for by the day job. That inevitably introduces increasing commercialism as our favoured programme gains strength and visibility, and that commercialism starts to introduce closed components.

Microsoft is an unashamed mega-corporation that craves total market domination and is the natural antichrist to the open programme movement. As a result, we are never surprised to find some new undesirable element in its products or its actions.

The group of which I am a member has for nearly thirty years provided skills to a wide range of organisations to identify vulnerabilities in systems and procedures, both through practical ethical attack and through evaluation against published security criteria.

The 'back door' is a very common feature in a huge range of products, including those built with open systems. I cannot see any way in which a system can be built without back doors. The question is not whether they should exist, or whether open or closed systems are better, but who knows about the back door - and, just as important, how you secure it and when you should remove it.

Not long ago, an impetuous young former associate was quite rightly condemned for publishing details of a back door in a shopping trolley product, including the default password. The vendor was outraged, and many thousands of its customers became immediately vulnerable to attack. The vendor offered the traditional justification that the back door had been inserted to let it assist users who had, through error, made their systems inaccessible. My own view at the time was that widely publishing exploits, particularly with passwords, before first contacting the vendor was inexcusable and unacceptable. The exploit had been performed on a legitimately purchased copy of the product, being tested for vulnerabilities before the client made the system operational on the Internet - so none of the usual cries about illegal actions applied. However, the vendor had, IMHO, also committed a common crime. That crime was not adding the back door, but failing to give customers the opportunity to remove it, or at least to change the password and access controls.
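By way of illustration only, the sketch below (in Python, with invented names such as DEFAULT_MAINTENANCE_HASH - nothing here comes from the product in question) shows one way a vendor could avoid that crime: refuse to honour the maintenance back door at all while the factory-default credential is still in place, so the customer is forced to set their own before the account can be used.

    # Minimal sketch, purely hypothetical: a maintenance (back door) account that
    # is locked out until the customer replaces the shipped default credential.
    # A real product would use a proper password hash (bcrypt or similar), not
    # bare SHA-256; this only illustrates the policy.
    import hashlib
    import secrets

    # The credential the product hypothetically shipped with.
    DEFAULT_MAINTENANCE_HASH = hashlib.sha256(b"changeme").hexdigest()

    def maintenance_login_allowed(stored_hash: str, supplied_password: str) -> bool:
        """Allow the maintenance account only if the default has been replaced."""
        if stored_hash == DEFAULT_MAINTENANCE_HASH:
            # Still the factory default: refuse the login outright.
            return False
        supplied_hash = hashlib.sha256(supplied_password.encode()).hexdigest()
        return secrets.compare_digest(stored_hash, supplied_hash)

    if __name__ == "__main__":
        # With the default still in place, even the correct password is refused.
        print(maintenance_login_allowed(DEFAULT_MAINTENANCE_HASH, "changeme"))  # False
        # Once the customer sets a site-specific password, normal checking applies.
        customer_hash = hashlib.sha256(b"a-long-site-specific-secret").hexdigest()
        print(maintenance_login_allowed(customer_hash, "a-long-site-specific-secret"))  # True

The point of the sketch is not the hashing but the policy: the customer, not the vendor, decides when the maintenance entry point becomes usable.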

What we really do not know in any such case is why the back door was added. Whatever the vendor says, some will suspect that the back door is there to allow the vendor to do something the user might never agree to. Given Microsoft's reputation (richly deserved or not), many will immediately see conspiracy, in addition to the obvious security vulnerability.

It is very easy to sit back with a satisfied smile and think that this is something that only happens with closed systems - but even open systems are not always open.

No one can deny that Open Source potentially provides the very best assurance that deliberately hidden vulnerabilities do not exist. When Linux was largely confined to a small, enthusiastic and technically experienced group of users, the hidden vulnerabilities were confined to those parts of the product that had never been completely documented and tested to death.

Having been involved in a project to produce a maths development environment as the answer to the meaning of information-system life, and having been an enthusiastic advocate of metaCASE development, I doubt that any system can ever be built without vulnerabilities. I would postulate, though, that a system built through metaCASE technology, using proven formal and structured methods and carefully tested security clauses, is as close as we will ever get, and light years ahead of traditional 'fag-packet' code design. Of course, that approach can be extremely costly in the initial stages, even if the life-cycle costs normally prove significantly lower. Most application and systems integration work has to accept a number of deliberate short cuts, and those short cuts create a wealth of vulnerabilities, both operational and open to deliberate assault.

If you accept that view of development assurance (as more fully described in a paper I co-authored with Leroy Lacy for the IIW at Gaithersburg in 1994), you have to accept that even Open Source will produce vulnerabilities that are hidden by accident. As Linux becomes a mainstream commercial product, marketed in various leading distributions by ever larger corporations, back doors will become more numerous and less visible. It is only a small step to the stage where we will find Linux distributions that routinely contain code that no one will ever look at. Open Source applications built on top will probably reach that stage first, and be less widely reported. The saving grace with Linux, in particular, is that there is a growing army of folk who will spend thousands of hours looking through the various distributions for vulnerabilities, and they will publish their findings in an environment where the window of opportunity opened by publication will be extremely small - BUT ONLY if all affected users are made aware of the discoveries at the same time as the crackers.

That brings us neatly to the area of security that defies all solutions. Many decry 'security by obscurity' but, in reality, it is impossible to avoid depending on this approach to some extent. The most common access control mechanism is still the password, and that depends entirely on obscurity. Any vendor covertly installing a back door can claim with some justification that they are protecting their users, because it can be discovered only by deliberate act. Of course, if they protect the back door with the password 'password', or leave it blank, that argument is very weak; if they protect it with a 30-character password that is changed daily, the argument is strengthened. That still leaves the ethical question of a vendor adding a back door without the user's knowledge and without any form of authorisation.
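To put rough numbers on that contrast, here is a back-of-the-envelope sketch. The figures and the assumed guessing rate are my own, not drawn from any real product: a default of 'password' or a blank falls to a single dictionary guess, while a random 30-character password over the printable ASCII set, rotated daily, leaves a search space that cannot remotely be exhausted inside the password's 24-hour lifetime.

    # Rough illustration of why the strength of the credential guarding a back
    # door changes the vendor's argument. Guessing rate is an assumption.
    def guesses_needed(alphabet_size: int, length: int) -> float:
        """Worst-case number of guesses for a random password of the given shape."""
        return float(alphabet_size ** length)

    def years_to_exhaust(total_guesses: float, guesses_per_second: float = 1e9) -> float:
        """Time to try every candidate at the assumed guessing rate."""
        return total_guesses / guesses_per_second / (60 * 60 * 24 * 365)

    # A default of 'password' (or blank) is effectively one dictionary guess.
    print("dictionary default: ~1 guess")

    # A 30-character password over ~94 printable ASCII characters, changed daily.
    space = guesses_needed(94, 30)
    print(f"94^30 candidates ~ {space:.2e}, "
          f"~ {years_to_exhaust(space):.2e} years at a billion guesses per second")

The arithmetic is crude, but it captures why the two cases deserve to be judged differently, even though both remain obscurity.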

From direct experience over many years, I know that most systems are vulnerable to attack through unknown entry points. I remember one financial system where I was a member of a team testing for the banking corporation that owned it. Our first unobserved entry to the target system was via very old, slow-speed modems originally installed by the mainframe supplier for its remote diagnostic purposes during the implementation phase, many years before, and long since forgotten. It took less than two hours to find this point and exploit it, subverting a very costly new banking service. The attack went unobserved because the client had disabled all audit and alarm facilities on the mainframe to increase performance cheaply. We went on to identify a host of other vulnerabilities and, yes, Microsoft featured strongly in the target system - but many of the early vulnerabilities could just as easily have existed in open systems.

Although I largely agree with Eric and understand the basis of his posting, I consider the greater danger in proprietary closed systems to be the failure to implement fixes quickly, or at all. Over the years I have been involved in many testing situations where product vulnerabilities have been discovered and reported to the vendor, who then does nothing. Someone like Microsoft assumes that every customer who fails to pay for the latest upgrade deserves all the grief he gets. With Open Source, a user can decide if and when to implement enhancements, changes and new versions.

When it is a closed system, the tester is in a very difficult position. You usually have to sign NDAs, and that prevents you from blowing the whistle if you feel you have to, even when you know that the vendor accepts the vulnerability but has deliberately decided not to correct the fault at all. Even under the best circumstances, a vendor may be unable to produce a fix for months. As we all know, the fact that one person has found a vulnerability means that many other, less ethical, folk probably also know about the weakness and are working out how best to exploit it. Any delay in producing a fix increases the window of vulnerability.

I would therefore argue that the greatest risk with back doors is inadequate protection, and that the person responsible for the security of a system should know the back door is there, should be able to remove it if he feels the risk justifies removal, and should be able to implement a protective regime that matches his risk policy if he decides to keep the back door.
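As a purely illustrative sketch of that recommendation, the short audit below assumes a Unix-like host and a hypothetical APPROVED_ACCOUNTS baseline maintained by the system owner, not the vendor. It flags any login account with a usable shell that the owner never signed off on, so an undocumented back door account is at least visible to the person who owns the risk policy and can then be removed or protected as that policy dictates.

    # Minimal audit sketch for a Unix-like host: list login accounts the owner
    # has not approved. The baseline below is invented for illustration.
    import pwd

    APPROVED_ACCOUNTS = {"root", "daemon", "bin", "sys", "ian"}
    DISABLED_SHELLS = ("/usr/sbin/nologin", "/sbin/nologin", "/bin/false", "")

    def unexpected_login_accounts():
        """Return accounts with a usable shell that are not on the approved baseline."""
        suspicious = []
        for entry in pwd.getpwall():
            has_shell = entry.pw_shell not in DISABLED_SHELLS
            if has_shell and entry.pw_name not in APPROVED_ACCOUNTS:
                suspicious.append(entry.pw_name)
        return suspicious

    if __name__ == "__main__":
        for name in unexpected_login_accounts():
            print(f"review account: {name}")

A password-file scan is only one of many places a back door can hide, but the principle generalises: the owner keeps the baseline, and anything not on it gets reviewed.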

Ian


Contact points: Postal: Monks Farm, Saint James Road, All Saints South Elmham, Halesworth, Suffolk, IP19 0HG, United Kingdom.

Telephone: +44 (0)1986 782 547
Telefax: +44 (0)1986 782 525
Email: ianj-b@firetrench.com
www.firetrench.com
www.firetrench.net
www.broadlyboats.com


****Newly released**** "Risk Manager's Handbook No. 5 - Information & Communications Risk", ISBN 1-84280-000-0, Nighthawk Publishing, available from the firetrench.net On-line Shopping Mall

***Recently reprinted*** "Managing Risk" by Ian M. Johnstone-Bryden; Avebury Imprint, ISBN 1 85972 255 5, Library of Congress CICS No. 95-79002


No is not an answer.


