Subject General Multi-User Architecture Question
Author Gary
Hi All,

Thanks in advance for the thoughts on this one...

We are having an application developed out-of-house. It replaces an
older application that was poorly designed for LAN (and WAN)
environments. The developers have hailed this replacement app as
following a client-server architecture, but when I take a close look
at how it is installed, I'm concerned that isn't really the case.

Installing goes like this:
1. Install InterBase on the Server (set up certificates, etc.)
2. Install app server-side components on the same Server (Compaq
ProLiant 3000)
3. Install client-side components on the clients.

So far that looks correct to me, as a general rule.

A closer look at what counts as a server-side component and what
counts as a client-side component in this case has me puzzled. It
looks like this:
1. InterBase and the GDB file are on the Server
2. The application executable (a Delphi 6 app) is also installed on
the Server
3. User config files, such as grid config, workstation config, etc.,
are stored on the clients.

Concerns:
Having the 6.5 MB executable on the Server doesn't make sense to me.
With that configuration, each client must pull the executable across
the network as well as the data. For one, that seems like poor
client-server design (at the basic app level), and it also seems like
a generally inefficient model for network performance, especially
where customers are in dial-up WAN situations.
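
To put a rough number on it (assuming a ~56 kbps dial-up link and
ignoring protocol overhead): 6.5 MB is about 52 Mbit, and 52,000 kbit
divided by 56 kbps is roughly 930 seconds, so on the order of 15
minutes just to load the EXE before any data moves.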

My Design Idea:
1. Install InterBase (and the GDB) on the Server
2. Install any absolutely required server-side components for the app
on the Server
3. Install all binaries (the EXE, DLLs, etc.) on the clients
4. Use the BDE and ODBC on each client to point at the database on
the Server (rough connection sketch below)
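
For item 4, here's a minimal sketch of the kind of client-side
connection I have in mind. This is not our app's actual code; it
assumes the BDE SQL Links InterBase driver connecting over TCP/IP,
and the alias name, host name, database path, and credentials are
just placeholders:

program ClientConnectSketch;
{$APPTYPE CONSOLE}

uses
  DBTables;  // BDE components (TDatabase)

var
  db: TDatabase;
begin
  db := TDatabase.Create(nil);
  try
    db.DatabaseName := 'AppDB';       // placeholder local alias name
    db.DriverName   := 'INTRBASE';    // BDE SQL Links driver for InterBase
    db.LoginPrompt  := False;
    // host:path syntax so the connection goes through the InterBase
    // server process over TCP/IP; host, path, and login are placeholders
    db.Params.Values['SERVER NAME'] := 'proliant3000:C:\Data\app.gdb';
    db.Params.Values['USER NAME']   := 'SYSDBA';
    db.Params.Values['PASSWORD']    := 'masterkey';
    db.Connected := True;
    Writeln('Connected to the database on the Server.');
  finally
    db.Free;
  end;
end.

The point being that only query traffic crosses the wire; the EXE and
DLLs stay on the client.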

Can anyone comment on this, and correct my design idea if it's
completely hosed?

Again, very much appreciated.

Best to all,
-Gary