Subject: Re: [ib-support] External tables and null values
Author: Milan Babuskov
Post date: 2002-12-26T12:21:40Z
Jason Chapman (JAC2) wrote:
> I would use a delta database, i.e. a database with the same structure
> without indices, triggers and procedures. Then use replication (we use our
> own), to populate the delta database, then zip this up, put it on CD, unzip
> it at the other end and replicate into the main database.

This sounds like a good idea.
> If this sounds like a plan, you will need to add some tables and triggers to
> work out which rows have changed. How frequently do you do the update, how
> large is the whole DB and what % of it changes (by size) from one update to
> the next?

Updates happen every week now, but it could be every day if we use this
overnight idea. The remote location is vacant overnight, so the computer
will have to do everything automatically. Updates are not that big (the
whole DB is about 200 MB now), and each update should be between 2 and
10 MB (if weekly).
> I've done this before, + or - complexity for lots of clients. We have even
> run overnight replication over telephone lines before now, it is surprising
> how much data you can synchronize overnight.

Thanx for the idea, but I already solved the problem:
I added one char(1) field named record_status to all tables whose data
has to be synchronized. When a user inserts a new record or updates an
existing one, this field is set to NULL. Then I wrote a small export/import
tool that takes all rows where record_status IS NULL and exports them to a
file. All those files are zipped and sent over the (slow) network.
After that I set record_status to 'S' (sent), and that's about it.
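A minimal sketch of what this can look like in Firebird/InterBase SQL. The
customers table and its columns are made up for illustration, and the post
doesn't say whether the flag is maintained by the application or by triggers,
so the triggers below are just one possible arrangement:

ALTER TABLE customers ADD record_status CHAR(1);

SET TERM !! ;

/* New rows start out as "not yet sent". */
CREATE TRIGGER customers_bi FOR customers
BEFORE INSERT AS
BEGIN
  NEW.record_status = NULL;
END !!

/* Re-flag a row whenever its data changes, so it gets exported again.
   The comparison does not hold when the export tool itself sets the
   flag from NULL to 'S', so that update is left alone. */
CREATE TRIGGER customers_bu FOR customers
BEFORE UPDATE AS
BEGIN
  IF (NEW.record_status = OLD.record_status) THEN
    NEW.record_status = NULL;
END !!

SET TERM ; !!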
I also wrote a small script that exports all tables, so I can easily add
more tables, just by adding another line to the script and a record_status
column to the table.
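The script itself isn't shown in the post; per table it presumably boils down
to a pair of statements like these (customers is again a placeholder name):

/* 1. Export everything that hasn't been sent yet. */
SELECT * FROM customers WHERE record_status IS NULL;

/* 2. Once the file is safely written, mark those rows as sent. */
UPDATE customers SET record_status = 'S' WHERE record_status IS NULL;

One thing to watch: if the two statements run in separate transactions, a row
inserted between them would be marked sent without ever being exported, so
both steps should run inside a single snapshot transaction.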
On the central server, I scheduled a once-a-day job that unpacks all packages
sent overnight and imports them into the main database.
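The import itself is handled by the same tool, whose internals the post
doesn't describe. For illustration only, here is one way the merge into the
main database could be written as a stored procedure, assuming the incoming
rows land in a hypothetical staging_customers table first:

SET TERM !! ;

CREATE PROCEDURE merge_customers
AS
  DECLARE VARIABLE cust_id INTEGER;
  DECLARE VARIABLE cust_name VARCHAR(60);
BEGIN
  FOR SELECT id, name FROM staging_customers
      INTO :cust_id, :cust_name
  DO BEGIN
    /* Update the row if it already exists, insert it otherwise. */
    IF (EXISTS(SELECT 1 FROM customers WHERE id = :cust_id)) THEN
      UPDATE customers SET name = :cust_name WHERE id = :cust_id;
    ELSE
      INSERT INTO customers (id, name) VALUES (:cust_id, :cust_name);
  END

  /* Empty the staging table once everything has been merged. */
  DELETE FROM staging_customers;
END !!

SET TERM ; !!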
I have made my import/export tool open source; you can find it at:
http://fbexport.sourceforge.net

Thanx again,
Milan Babuskov