Linux Distributions – Security through Unity?

Quite often there is discussion about what operating system to use and the pros and cons of each. One aspect that always comes up is security, which is certainly a worthwhile goal. However, the discussion is usually based on technical points alone: operating system A has this feature, while B has another that tries to do the same thing but doesn't do it quite as well, while C doesn't have the feature at all.

Technical points are important, but when you get down to the various flavours of Unix, it all rapidly becomes academic. Pretty much any Unix is more secure than any Microsoft Windows, because Unix has a proper concept of user/uid and process separation, not to mention a clean boundary between the applications and the OS. This layering helps with security, but it also makes updating a lot easier.

Sure, this flavour of Unix may have a certain feature, but does it really do anything worthwhile? What is the chance of some event happening where the absence of this feature means the server is hacked, while other similar servers with the feature are fine? I've used Sun Solaris as the example here because, let's face it, there is pretty much no ongoing support for any other commercial Unix, and no future either.

As a network engineer, security to me is just another aspect of network
management. It is important, but so is keeping the service running free of
faults and up to a certain level of performance. Perhaps some principles of
network management could be applied to server security.

An important lesson of network management is that a large proportion of
faults (some studies have said 50%, some 70%; we'll never know the true
number) can be attributed to a person or process failure, as opposed to
a software or hardware problem as such. Whatever the percentage, a large
share of security breaches are due to the administrators, for whatever
reason, not running their servers correctly. Therefore, anything that makes
the administrator's job easier or the processes simpler makes security better.

An example

Perhaps an example will help. You're in charge of setting up some servers
and you can choose what goes on them. You've narrowed it down to Solaris
or Debian GNU/Linux; which do you choose?

The first answer should be: if the current operators are far more comfortable
with one over the other, and you intend to use the same operators for the new
systems without any additional staff, go with whatever they are used to.

However, if there is no strong preference, you then have to look at other things.
How about security patches and testing? Is the setup you're running going
to be maintained, and is it tested properly?

Running Software – Solaris Style

Sun has included a lot more free software in its more recent versions, but
it is still not much, and they just have this habit of screwing it up. I'm
not sure why, but they don't seem capable of compiling something off, say,
SourceForge without making a mess of it. top and bash used to crash; I had
never seen bash crash before I saw it on Solaris. And I won't even
mention the horror of their apache build.

What happens if you want to run an MTA like postfix, which is certainly a
lot easier to run and has a lot more features than the standard sendmail?
Or you want some sort of web application that needs certain perl modules?
If you're running Solaris, you download all the sources, compile, and repeat
all the way through the dependencies. You can get pre-compiled programs from
places scattered around the Internet, but quite often there are library
version conflicts.

That hasn’t even got into the problems when package A wants version 4 of
library Z but package B wants version 5 of library Z. Or what happens
if they both want version 4, but then you need to upgrade one of the
packages which needs the newer library?
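
To make that concrete, here is a rough sketch of the treadmill; the package
and library names are made up for illustration, not real Solaris software:

    # a hypothetical build-it-yourself session on Solaris
    wget http://example.org/libfoo-4.2.tar.gz   # a dependency found mid-build
    gzip -dc libfoo-4.2.tar.gz | tar xf -       # Solaris tar has no -z flag
    cd libfoo-4.2
    ./configure --prefix=/usr/local
    make && make install
    cd ..
    # ...now back to the package you actually wanted, which may
    # complain about libbar as well, so repeat the whole dance

Multiply that by every dependency, and by every server, and you can see why
upgrades quietly stop happening.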

Running Software – Debian Style

For the Debian user, it is usually a matter of apt-get install <packagename>.
There are nearly 9,000 packages in the distribution, so whatever you want is
probably there. Library conflicts are rare: the library versions are standard
across each release and everyone runs the same ones. The only problems are
the occasional transitional glitches when one packager has moved to the new
libraries while another is still on the old ones. Still, the occurrence of
this sort of thing is greatly reduced.
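
As a sketch, the postfix example from earlier done the Debian way (package
names as found in the Debian archive of the time):

    apt-get update                     # refresh the package lists
    apt-get install postfix            # the MTA and its libraries, in one step
    apt-get install libnet-dns-perl    # perl modules are packaged too

The dependency resolution, library versioning and compilation have all been
done once, by the packager, instead of by every administrator separately.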

All of those nearly 9,000 packages go through the same QA process and have
their bugs tracked by the same system in the same place. If a packager cannot
get a problem fixed, they have the help of at least 800 fellow Debian
developers. If you're having problems with your own build of a program on
Solaris, you're on your own.

Upgrading is a hassle, so it doesn't happen

Now the problem is that upgrading on most systems is a real pain. The trouble
surrounding the Slammer and Blaster worms on Microsoft servers is a good
example. When the worms came out, people were saying their propagation was
solely due to poor system maintenance: lazy administrators had not bothered
to patch their servers properly.

Even the OS itself can play up, causing strange and amusing problems to
appear. My wife's Windows XP computer switches off when it goes into
powersave mode. This started happening after a security patch was installed.
I'm not sure what the power-saving code has to do with security; maybe evil
hackers across the internet cannot turn my lights on and off anymore.

While there certainly was a subset of administrators who fitted into the
lazy and inept category, there were also plenty who could not upgrade or fix
their systems. The problem was that applying a service pack would break a
great many things in some random way, so some people could not be bothered,
or were too scared, to upgrade.

On those systems it is generally expected that when you upgrade you will get
problems, and that these problems need to be risk-managed. That shouldn't be
the usual expectation for a simple upgrade.

While it is sometimes a good idea, but rarely essential, to reboot a Debian
system, for most upgrades you just install the upgraded package and the task
is finished. There is no need to stop and start the affected services,
because this is generally done for you.
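
A routine security upgrade is therefore little more than the following (a
sketch; the daemon restarts come from each package's maintainer scripts,
not from apt itself):

    apt-get update     # fetch the latest lists, including the security archive
    apt-get upgrade    # install pending fixes; affected daemons are restarted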

The clear layering between application and OS, and the reasonably clear
layering between application and library code, means that if there is a
problem with one of the layers, upgrading it will not affect the other
layers. This is why, when an application crashes on a Unix server, you
restart the application, while on a Windows server you reboot the whole
machine.
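
For instance, bouncing a single service on a Debian box is done through its
init script, leaving everything else on the machine untouched (the script
name here is as shipped by the Debian apache package of the time):

    /etc/init.d/apache restart   # restart just the web server; no reboot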
