Subject Re: [IBO] Moving data from one database to another using Tib_import/export. Surrogate key?
Author Jason Wharton
I wrote a system for the State of Arizona to synchronize their county voter
registration databases with the centralized state database on a daily basis.
The counties didn't want any complex process that their systems would be
required to maintain. Thus, each day we got a snapshot of whatever their
current database was, not some kind of incremental file. We did use
incremental files with two counties because they were so massive, but we
still kept the capability to handle a full file in order to make certain
nothing was missed. You will need two columns in your master table:

Entity_ID (who each unique client is)
Batch_ID (identify each synchronization event)

You will also need a unique record indicator so that each record can be
matched up.

As you go through the entire file from each entity, you either insert the
records that are not there or update the ones that are matched. If a record
had no changes, you still stamp it with the Batch_ID to indicate it was
validated. That way, when the file has been processed, you can delete all of
that entity's records that were not matched in that batch.
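The pass above can be sketched in a few lines. This is a minimal Python/SQLite
sketch, not the actual Firebird/IBO implementation; the table and column names
(master, entity_id, record_key, batch_id, payload) are illustrative assumptions:

```python
import sqlite3

def make_db():
    # Stand-in master table: Entity_ID + unique record indicator + Batch_ID.
    conn = sqlite3.connect(":memory:")
    conn.execute("""
        CREATE TABLE master (
            entity_id  TEXT    NOT NULL,  -- which entity (county) owns the row
            record_key TEXT    NOT NULL,  -- unique record indicator for matching
            batch_id   INTEGER NOT NULL,  -- last synchronization event that validated it
            payload    TEXT,
            PRIMARY KEY (entity_id, record_key)
        )""")
    return conn

def sync_batch(conn, entity_id, batch_id, snapshot):
    """Apply one full-file snapshot for one entity: upsert every row,
    stamping it with batch_id, then delete the entity's rows that this
    batch did not re-validate."""
    for record_key, payload in snapshot:
        # Insert if missing, update (and re-stamp) if matched -- even when
        # the payload is unchanged, so the row counts as validated.
        conn.execute(
            """INSERT INTO master (entity_id, record_key, batch_id, payload)
               VALUES (?, ?, ?, ?)
               ON CONFLICT (entity_id, record_key)
               DO UPDATE SET batch_id = excluded.batch_id,
                             payload  = excluded.payload""",
            (entity_id, record_key, batch_id, payload))
    # Anything of this entity's not stamped by this batch vanished from
    # its file, so delete it.
    conn.execute(
        "DELETE FROM master WHERE entity_id = ? AND batch_id <> ?",
        (entity_id, batch_id))
    conn.commit()
```

For example, if batch 2 drops one record, changes one, and adds one, the
delete step removes only the dropped record for that entity.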

This process is rather simple to implement, and especially simple for your
entities. We had many more involved processes that kicked off in addition to
this, but those aren't likely to help you any. If you have any more
questions, let me know.

Oh, we also created file system folders corresponding to each batch, so we
had a history of which files were run in which batches. It was common for
some counties to miss their daily upload; in that case they simply deferred
some of their workload.

Hope this helps,
Jason Wharton

----- Original Message -----
From: "sshowl09" <sshowl09@...>
To: <IBObjects@yahoogroups.com>
Sent: Saturday, January 16, 2010 2:35 AM
Subject: Re: [IBO] Moving data from one database to another using
Tib_import/export. Surrogate key?


> Hi Helen,
>
> The systems that will be uploading their data are on different PCs, on
> different networks, and possibly in different countries than the
> master system that requires their data.
>
> I was imagining scheduling all the systems to upload their data (a csv
> export) to the central system, via ftp or http or whatever, every night,
> and scheduling the central system to import all the data every morning,
> something like that.
>
> I am using embedded firebird on both the central system and the uploading
> systems.
>
> I have considered having the uploading systems upload a firebird database
> file containing only the new data that they need to pass, or uploading a
> CSV or some other export of the data.
>
> My thinking at the moment is that the CSV or other sort of export is
> better, as it defines a cleaner interface for the data transmission, in
> case there is a future requirement for some other system to examine the
> uploaded data.
>
> Thanks for the advice. Any suggestions appreciated.
>
> Cheers
> Sam