Subject: Re: [ib-support] External tables and null values
Author: Milan Babuskov
Jason Chapman (JAC2) wrote:
> I would use a delta database, i.e. a database with the same structure
> without indices, triggers and procedures. Then use replication (we use our
> own), to populate the delta database, then zip this up, put it on CD, unzip
> it at the other end and replicate into the main database.

This sounds like a good idea.

> If this sounds like a plan, you will need to add some tables and triggers to
> work out which rows have changed. How frequently do you do the update, how
> large is the whole DB and what % of it changes (by size) from one update to
> the next?
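One common way to implement the "tables and triggers to work out which rows have changed" part of this suggestion is a change-log table populated by triggers. A minimal sketch in Firebird SQL follows; the table names `change_log` and `customers` and the column `id` are hypothetical, and the multi-event `AFTER INSERT OR UPDATE` syntax assumes Firebird 1.5 or later:

```sql
/* Hypothetical change-log table: records which rows changed and when. */
CREATE TABLE change_log (
    table_name  VARCHAR(31),
    record_id   INTEGER,
    changed_at  TIMESTAMP DEFAULT CURRENT_TIMESTAMP
);

/* Trigger on a hypothetical CUSTOMERS table: log every insert or update,
   so the replicator can later select only the changed rows. */
SET TERM !! ;
CREATE TRIGGER customers_log FOR customers
AFTER INSERT OR UPDATE
AS
BEGIN
    INSERT INTO change_log (table_name, record_id)
    VALUES ('CUSTOMERS', NEW.id);
END !!
SET TERM ; !!
```

The replicator would then read `change_log`, copy the referenced rows into the delta database, and clear the log entries it has processed.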

Updates happen every week now, but they could happen every day if we use
this overnight idea. The remote location is vacant overnight, so the
computer will have to do everything automatically. The updates are not
very big (the whole DB is about 200MB now), and each update should cover
between 2 and 10 MB.

> I've done this before, + or - complexity for lots of clients. We have even
> run overnight replication over telephone lines before now, it is surprising
> how much data you can synchronize overnight.

Thanx for the idea, but I already solved the problem:

I added one char(1) field named record_status to all tables whose data
has to be synchronized. When a user inserts a new record or updates an
existing one, this field is set to NULL. Then I wrote a small
export/import tool that takes all rows where record_status is NULL and
exports them to a file. All those files are zipped and sent over the
(slow) network. After that I set record_status to 'S' (sent), and that's
about it.
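The flag-column scheme can be sketched in SQL like this; the table name `customers` is hypothetical (the actual tool works generically over all flagged tables):

```sql
/* Add the status flag to a table that should be synchronized
   (hypothetical table name). */
ALTER TABLE customers ADD record_status CHAR(1);

/* Export step: new and changed rows have record_status = NULL,
   so this picks up everything since the last run. */
SELECT * FROM customers WHERE record_status IS NULL;

/* After a successful export, mark those rows as sent. */
UPDATE customers SET record_status = 'S'
WHERE record_status IS NULL;
```

One design consequence of using NULL as the "dirty" marker is that newly inserted rows are dirty by default, with no trigger needed.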

I also wrote a small script that exports all tables, so I can easily add
more tables, just by adding another row to the script and a
record_status column to the table.

On the central server, I scheduled a job to unpack, once a day, all
packages sent overnight and import them into the main database.

I have made my import/export tool open source; you can find it at:

Thanx again,

Milan Babuskov