Subject | Re: [firebird-support] Gigabytes of Inserts
---|---
Author | David Johnson
Post date | 2005-03-02T01:56:05Z
Another option that I am using right now extracts the data from the
source database, transforms it to the new model, and inserts it directly
into the target.
The high-level view is that we run an extraction thread and post results
in sets of 1000 to 5000 to a queue. A transformer thread consumes
record sets from the first queue, transforms them into their new form,
and posts the result to another queue. Finally, another thread reads
the result sets from this queue and posts the datasets directly to the
destination database. No intermediate datasets are ever saved. JDBC
handles all of the type conversions, including nulls and empty strings.
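
In rough Java, the pipeline looks something like the sketch below. This
is a minimal illustration, not the production code: the connection URLs,
table names, and the trivial transform are all placeholders, and the
real job runs four of these pipelines side by side.

import java.sql.*;
import java.util.ArrayList;
import java.util.List;
import java.util.concurrent.ArrayBlockingQueue;
import java.util.concurrent.BlockingQueue;

public class MigrationPipeline {
    static final int BATCH = 1000;                       // 1000-5000 rows per set
    static final List<Object[]> EOF = new ArrayList<>(); // end-of-stream marker

    public static void main(String[] args) throws Exception {
        BlockingQueue<List<Object[]>> extractQ = new ArrayBlockingQueue<>(4);
        BlockingQueue<List<Object[]>> loadQ = new ArrayBlockingQueue<>(4);

        // Stage 1: pull rows from the source and post them in sets of BATCH.
        Thread extractor = new Thread(() -> {
            try (Connection src = DriverManager.getConnection(
                     "jdbc:sqlserver://source;databaseName=olddb", "user", "pw");
                 Statement st = src.createStatement();
                 ResultSet rs = st.executeQuery("SELECT id, name FROM old_table")) {
                List<Object[]> set = new ArrayList<>(BATCH);
                while (rs.next()) {
                    set.add(new Object[] { rs.getObject(1), rs.getObject(2) });
                    if (set.size() == BATCH) {
                        extractQ.put(set);
                        set = new ArrayList<>(BATCH);
                    }
                }
                if (!set.isEmpty()) extractQ.put(set);
                extractQ.put(EOF);
            } catch (Exception e) { throw new RuntimeException(e); }
        });

        // Stage 2: map each record set to the new model and pass it on.
        Thread transformer = new Thread(() -> {
            try {
                for (List<Object[]> set; (set = extractQ.take()) != EOF; ) {
                    List<Object[]> out = new ArrayList<>(set.size());
                    for (Object[] row : set)
                        out.add(new Object[] { row[0], transform(row[1]) });
                    loadQ.put(out);
                }
                loadQ.put(EOF);
            } catch (InterruptedException e) { throw new RuntimeException(e); }
        });

        // Stage 3: batch-insert each set and commit once per set, not per row.
        Thread loader = new Thread(() -> {
            try (Connection dst = DriverManager.getConnection(
                     "jdbc:firebirdsql://target:3050/newdb.fdb", "user", "pw");
                 PreparedStatement ps = dst.prepareStatement(
                     "INSERT INTO new_table (id, name) VALUES (?, ?)")) {
                dst.setAutoCommit(false);
                for (List<Object[]> set; (set = loadQ.take()) != EOF; ) {
                    for (Object[] row : set) {
                        ps.setObject(1, row[0]); // JDBC does the type conversion
                        ps.setObject(2, row[1]);
                        ps.addBatch();
                    }
                    ps.executeBatch();
                    dst.commit();                // one commit per record set
                }
            } catch (Exception e) { throw new RuntimeException(e); }
        });

        extractor.start(); transformer.start(); loader.start();
        extractor.join(); transformer.join(); loader.join();
    }

    static Object transform(Object v) {          // placeholder transform
        return v == null ? null : v.toString().trim();
    }
}

The bounded queues give you backpressure for free: if the loader falls
behind, the extractor blocks instead of ballooning memory.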
On a 100 Mbit line I can move roughly 8,000,000 rows per hour, complete
with transforms (in their own threads), running four sets of parallel
threads concurrently. The biggest limit I have run into is that the
source server can't keep up with either the target database server or
the app server that is running the migration process.
On Tue, 2005-03-01 at 16:40, Alexandre Benson Smith wrote:
>
>
> dbambo2000 wrote:
>
> >I have an SQL Server database that is around 10 gigabytes in size
> >which I am transferring to firebird. The database is currently on my
> >home system which is connected to the internet via a cable modem.
> >Since I'm not really confident about transferring the data directly from
> >SQL Server to the firebird server (again because the connection is
> >only a cable modem), I was planning to create a series of text files
> >which contain many insert commands.
> >
> >In order to help speed it along I was planning to do a commit every
> >1000 inserts or so. Is this what is meant by a "bulk insert" - just
> >not doing a commit too frequently?
> >
> >I was planning to zip the completed txt files up and ftp them to the
> >firebird server and then run them as SQL script files in EMS IB
> >Manager.
> >
> >Is that the quickest way to do this?
> >
> >Also, I noticed when running the SQL script file in EMS IB Manager
> >that the inserts happen very quickly at the start of the file, but
> >that they slow considerably near the end of a long script (5 or 10 MB
> >in size). Would it be quicker overall if I worked with smaller script
> >files (say a megabyte or so)?
> >
> >Can you run a script file via isql or some other similar program that
> >comes with firebird?
> >
> >Thank you for your input.
> >
> >
> >
> Search for External Tables; this is the fastest way to import a lot of data.
>
> The external table is a text file with fixed-length columns. You declare
> it inside FB and can then use plain SQL to pump the data from the text
> files into your FB tables.
>
> The problem, I think, is with nulls and with blobs. :-(
>
> HTH
>
> see you !
>
> --
>
> Alexandre Benson Smith
> Development
> THOR Software e Comercial Ltda.
> Santo Andre - Sao Paulo - Brazil
> www.thorsoftware.com.br
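
To illustrate Alexandre's suggestion, here is a minimal sketch of the
external-table route over JDBC. The file path, table, and column names
are invented for the example, and the server must be configured to allow
external file access (e.g. via ExternalFileAccess in firebird.conf). As
he notes, NULLs and blobs don't survive this route, since a fixed-width
text file can't represent them.

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

public class ExternalTableLoad {
    public static void main(String[] args) throws Exception {
        // URL, credentials, and all names below are placeholders.
        try (Connection con = DriverManager.getConnection(
                 "jdbc:firebirdsql://localhost:3050/target.fdb",
                 "SYSDBA", "masterkey");
             Statement st = con.createStatement()) {
            // Map a fixed-width text file as a table. Every CHAR width must
            // match the file layout exactly; the last column absorbs the CRLF.
            st.execute("CREATE TABLE ext_customers EXTERNAL FILE"
                     + " '/data/customers.txt'"
                     + " (id CHAR(10), name CHAR(40), crlf CHAR(2))");
            // With default autocommit, the DDL above is committed before the
            // next statement runs. One INSERT..SELECT pumps the whole file
            // across; CHAR values come back space-padded, so trim as needed.
            st.execute("INSERT INTO customers (id, name)"
                     + " SELECT CAST(id AS INTEGER), name FROM ext_customers");
        }
    }
}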