Subject: Linux "Bloat": Re: [IB-Architect] Open Question to IB Developers
I'm generally annoyed by the pathetic attempts of the various
distributors to produce separate "workstation" and "server" versions of
the operating system software. This inevitably tempts the
installing person (in the corporate world, this is typically a
sysadmin, not the end user) to install one of those trimmed-down versions.

(note: this critique applies to NT workstation and server as well -
my standard line about NT Server is that
"NT Server makes a great workstation")

In that minimalistic situation, some piece of code is inevitably missing,
and the end user has to A) find the original installation media, B) figure
out what the heck RPM the needed file is in, C) get root on that particular
system so he/she can install it, and D) install it... and only then is
he/she able to get to the configuration step, which I'll get to in a minute.

In a day and age where 30GB of hard disk space costs 300 bucks, a full
install of the OS AND Powertools is about 1.8GB, and (on modern hardware)
takes about 35 minutes to install - my vote is - always install
EVERYTHING. Upgrades are then easy - NFS mount the server with the
upgrades and rpm -Uvh *.rpm.
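For the record, that upgrade procedure is just a couple of commands. The
server name and export path below are made up for illustration - substitute
your own:

```shell
# Mount the NFS share that holds the updated packages.
# "updates" and /exports/redhat-updates are hypothetical names.
mount -t nfs updates:/exports/redhat-updates /mnt/updates

# -U upgrades each package (or installs it if it isn't present yet),
# -v is verbose, -h prints hash marks as a progress bar.
rpm -Uvh /mnt/updates/*.rpm

umount /mnt/updates
```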

I have built a product around a single-floppy Linux distribution. As much
as I, too, feel nostalgic for the days of DOS, and QEMM, and the like, the
fact of the matter is that Linux is not just an OS but a set of very
expressive applications and utilities that can be put to many uses. You
can, with considerable pain, build a version intended entirely to be a
database server. It may make you esthetically happy to have just a few
hundred absolutely needed files on that server. But as soon as your
perfect, minimalist server hits the market, your users will start
clamoring for a C++ library, or Java, or samba, or perl, etc... and out
the window go your esthetics in order to make a system that can actually
be used for something.

The "ease of use, size of install, upgrade, and system security debate"
has raged for years. The security guys want a computer with a minimal set
of fully audited tools, locked in a secure concrete wrapped facility in a
basement, unplugged from the network and, indeed, electricity... the
admins want something that never breaks and is easy to maintain, and the
end-user wants full root access, and every application known to man - and
they expect the security folk and the admins to put up with this.

It's creative tension at its finest!

So as the debate continues, some positive things fall out of it. I think
you could find an existing distribution with similar goals to yours and
adapt it to be an "InterBase Server". I can see a role for InterBase in a
dedicated box with 40MB of flash disk - perhaps a box that monitors hwy 17
traffic flow patterns from a telephone pole....

We need to evolve towards building computer systems that think about their
resources being centrally managed first, and individually managed second.
Plan 9 had some good things in it. I still love X terminals and remote
access solutions like the VGA stations from Maxspeed. Big honking NFS
fileservers and diskless clients... etc. I would be happier in a world
that had quad processor desktop computers that served 4 displays, and I
honestly thought, 5 years ago, that the shape of the future internet would
be that every household would have its own firewall/email/web server
physically on the property - a design that evades a ton of legal issues
and establishes basic property rights for things like your personal email,
and defends your private internet property as well.

History has not been on my side!! But I keep plugging away at it... people
need to co-operate better... on everything... that's why I like open
source; it's a steppingstone towards socialization....

As for the configuration step, I, like Jim, hate redhat's default
installation of apache and the lack of documentation thereof. It's easy to
uninstall RPMs and do things your own way - I generally install apache
directly from the upstream sources.
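Concretely, "doing things your own way" looks something like this. The
package name and the tarball version are just examples - check what your
distro actually calls the package and what the current release is:

```shell
# Drop the distribution's packaged copy first.
# "apache" is the typical Red Hat package name, but verify with: rpm -qa
rpm -e apache

# Build and install from the upstream source tarball instead
# (version number here is illustrative).
tar xzf apache_1.3.12.tar.gz
cd apache_1.3.12
./configure --prefix=/usr/local/apache
make
make install
```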

Out of the box the web server (or any application) needs to be secure -
and minimally useful. I'm sorry Jim ran into trouble configuring a CGI,
that spike in the learning curve has stopped many a would-be apache
user.... Similarly, Samba is pretty easy to configure, except that the
useful configuration tool "swat" is disabled by default because it isn't
secure. I have gotten NT admins configuring samba in less than
15 minutes, once I got swat turned on and secured properly.
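For anyone who wants to try it, turning swat on amounts to a couple of
one-line edits plus locking it down with TCP wrappers. The paths and the
service entry below are typical for a Red Hat-style install, not guaranteed
for every distro - check where your packages put things:

```
# /etc/services -- make sure swat has a port (901 is the conventional choice)
swat            901/tcp

# /etc/inetd.conf -- uncomment or add the swat line
swat  stream  tcp  nowait.400  root  /usr/sbin/swat  swat

# /etc/hosts.allow -- restrict who can reach it (localhost only, here)
swat: 127.0.0.1

# then tell inetd to reread its config:  killall -HUP inetd
```

The hosts.allow entry is the "secured properly" part - without it, anyone
who can reach port 901 gets a root-driven web form for your smb.conf.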

Another point about the learning curve - there is such a thing as "too
easy" - some education about good practices for operating a computer in a
public place should be required for everyone - I call it "practicing safe
hex". Jim, did you ultimately grok the security reasons for why you had to
set up the CGI that way?

In another message someone griped about it taking 18 months to get up to
speed on Linux. Try to factor in that it probably took you quite a few
months, way back when, to learn DOS, or to get comfortable with NT 3.5 or
Mac OS 7 - just because Linux is different does not mean the learning
curve comparison is 0 vs. 18 months. I can and have successfully
introduced rank beginners to Linux + StarOffice, for example, and never
needed to provide additional help for them.


"The Automobile is the opiate of the masses" - Chip Mefford

On Fri, 28 Apr 2000, Paul Reeves wrote:

> Dalton,
> For the most part I am in agreement with you. Linux distros are not built with
> InterBase deployment in mind (yet). And I, too, spend an inordinate amount of
> time trying to work out which bits really are crap and which bits are just plain
> redundant. The installers are getting better but the granularity of installs is
> not quite there (for me at least.) It is difficult to decide just what is needed
> for a server install (no gui) and wading through the options when trying to do a
> developer install (non-kernel) is a real headache.
> I think an InterBase distro would be useful but maintaining it may be a bit of a
> problem. Of course if the demand is there (ie people paying for pressed CDs)
> then it would be worthwhile. However it may just be too much effort to keep it
> up to date. (Although that work ought to be fairly limited - security patches
> would probably see the biggest turnover of code.)
> An alternative may be a simple script that is run after a standard Red Hat or
> SuSE install. It just rips out the fluff and leaves the essentials.
> And I must add one note of dissension. I believe Linux is popular because it is
> complicated. Developers are leaving windows because it has become too easy and
> developers never took to the Macintosh for the same reason. (I speak fairly
> generally - there are lots of exceptions.) Ultimately there is just not enough
> to twiddle with in Windows and Linux represents the new challenge. Coming at the
> same time as the demonisation of Microsoft it is probably a truism to say that
> if Linux didn't exist someone would have invented it. Ultimately Linux is a
> developer thing and when that stops being true (ie it becomes really mainstream)
> then developers will move to something else.
> Paul
> --
> Paul Reeves
> Fleet River Software