Subject: RE: [firebird-support] bulk insert
Author: Alan McDonald
> I'm looking for direction, ideally in the form of existing documentation
> or an existing solution.
> I have some databases (MSSQL & Sybase, several flavors of each) that
> I would like to run some queries on from ADO.NET in either C# or
> VB.NET. Because the queries are based on combinations of these databases,
> I'm looking to use an embedded Firebird engine to temporarily
> store the relevant data and then re-query it from within Firebird.
> I'm running into a time bottleneck getting the data into the Firebird
> db. What is the quickest way to get a million or two records into a
> Firebird database from ADO.NET?
>
> Currently I select the relevant data from the source databases into
> an ADO.NET dataset, open a connection to the Firebird db, create a
> table in the local Firebird database based on the schema of the
> dataset, then step through each of the records in that dataset and
> generate an INSERT statement to insert the data into the Firebird
> database, and finally close the connection.
>
> If the dataset contained only 100,000 records the time would be
> acceptable at about a second per 1,000 records, but at a million or two
> records this just takes too long. I'm hoping there is a
> solution that can insert the data into Firebird in close to the
> same time it takes to get the data from the source databases;
> for example, pulling a million or so records from the source databases
> takes about three minutes.
>
> Is it possible to select the data from the source databases directly
> into the Firebird database?
>
> I'm using the 1.6.3 .NET provider and the 1.5.2.4731 embedded
> Firebird engine, and am not at all opposed to changing either if it
> will help.
>
> Help with either of these questions is much appreciated.
> Gabe
>

If you aren't fussed about doing it as a combination of approaches, I would
suggest you still create the table the way you do now, but write the data to
an external file and then issue a single command to Firebird to insert from
that external table into your newly created table. I'm sure that's the
fastest way to get 2 million records in.
External files must use fixed-length fields, however, so you may need to pad
your data as you write it to disk.
Alan