| Subject | Re: [firebird-support] Bulk Loading for FB? |
|---|---|
| Author | Ann W. Harrison |
| Post date | 2005-05-27T19:03:58Z |
Hi Bill -
> I am looking for some guidance on bulk loading data into Firebird. What
> I want to do is turn off integrity constraints and just load the data
> as fast as possible.

Well, one approach is to turn off integrity checks. If this is
something you do a lot, I'd create two DDL scripts for the database -
one that defines only tables and columns and one that adds constraints,
triggers, indexes, etc. Run the first, load the data, then run the
second.
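
A minimal sketch of that split, with a hypothetical table (the names and column types are made up; Firebird SQL assumed):

```sql
/* Script 1 - tables and columns only; run this before the load. */
CREATE TABLE customers (
    id    INTEGER NOT NULL,
    name  VARCHAR(60),
    city  VARCHAR(40)
);

/* Script 2 - constraints, triggers, indexes; run this after the load. */
ALTER TABLE customers ADD CONSTRAINT pk_customers PRIMARY KEY (id);
CREATE INDEX idx_customers_city ON customers (city);
```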
In terms of actually loading the data, the fastest approach is to format
the data in files with fixed-length columns, map those files to external
tables, and load the internal tables with

    insert into <internal table> select * from <external table>

Of course, that gets messy if you've got nulls, but few non-database
data sources represent nulls well.
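
Sketched in Firebird SQL (the file path, record layout, and table names are illustrative; the server must also be allowed to read the file, e.g. via the ExternalFileAccess setting in firebird.conf):

```sql
/* External table mapped onto the fixed-length data file.
   Each record is just the concatenated CHAR fields; if the file is
   line-oriented, a trailing CHAR column absorbs the line terminator. */
CREATE TABLE customers_ext EXTERNAL FILE '/data/customers.dat' (
    id    CHAR(10),
    name  CHAR(60),
    city  CHAR(40),
    crlf  CHAR(2)
);

/* Load the internal table, converting the text fields as needed. */
INSERT INTO customers (id, name, city)
SELECT CAST(id AS INTEGER), name, city
FROM customers_ext;
```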
Failing that, write a preprocessed application that reads your data
source and stores the data. It should run on the server - Classic on
Unix-type systems, Embedded for Windows.
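
The record-splitting core of such a loader can be sketched in Python (the field layout and the customers table are hypothetical, and a generic DB-API cursor stands in for the gpre-preprocessed embedded-SQL program described above):

```python
# Hypothetical fixed-length record layout: id 10 chars, name 60, city 40.
FIELDS = [("id", 10), ("name", 60), ("city", 40)]

def parse_record(line):
    """Slice one fixed-width line into a tuple of stripped field values."""
    row, pos = [], 0
    for _name, width in FIELDS:
        row.append(line[pos:pos + width].strip())
        pos += width
    return tuple(row)

def load(lines, cursor):
    """Batch-insert parsed rows; cursor is any DB-API cursor (e.g. fdb's)."""
    rows = [parse_record(line) for line in lines]
    cursor.executemany(
        "insert into customers (id, name, city) values (?, ?, ?)", rows)
```

Run against the server rather than over the wire, this keeps the insert path short, though it will still be slower than the external-table route.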
Regards,
Ann