Subject: Re: [ib-support] Importing Records
> Does anybody know of a way to import a very large number of records into
> I am adding records by the SQL command INSERT
> and committing a transaction about every 1,000,000 records.
> I'm only getting about 8 million records / hour, and I have 110 million
> records to import.

Which kind of INSERT do you use?
INSERT INTO ... VALUES (...)
INSERT INTO ... SELECT ... FROM <external table>
My guess is that you use an external table, and so you are inserting a million
rows with a single command.
Each command in IB is "atomic": it either inserts all 1,000,000 records,
or it does not insert anything; if e.g. row 999,999 fails due to a check
constraint, the 999,998 previously inserted rows must be removed.
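For example (a sketch; the table names and the check constraint are made up):

```sql
CREATE TABLE T (N INTEGER CHECK (N >= 0));

/* This single statement is atomic: if even one row in SRC has a
   negative N, none of the rows end up in T. */
INSERT INTO T (N) SELECT N FROM SRC;
```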
To support this functionality, IB needs to hold (in memory!) a
list of all changes made by the current command.
So, to speed things up, the point is not to commit after every X inserts,
but rather to reduce the number of rows changed by a single command
(and thus reduce the memory demands).
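For instance, if the external table has a numeric column you can filter on, the load can be split into several smaller statements (a sketch; EXT_DATA, TARGET, ID and the range boundaries are made up, and note that each chunk rescans the external file, since external tables have no indexes):

```sql
INSERT INTO TARGET (ID, VAL)
  SELECT ID, VAL FROM EXT_DATA WHERE ID BETWEEN 1 AND 100000;
COMMIT;

INSERT INTO TARGET (ID, VAL)
  SELECT ID, VAL FROM EXT_DATA WHERE ID BETWEEN 100001 AND 200000;
COMMIT;

/* ... and so on for each range ... */
```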
(You can also use a selectable stored procedure with a FOR SELECT loop,
and execute SUSPEND after some number of INSERTs, because the work done
by an SP can be undone only back to the last SUSPEND command.)
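A minimal sketch of such a procedure (all names are made up; the SUSPEND every 10,000 rows marks a point the engine does not have to undo past if a later row fails):

```sql
SET TERM ^ ;

CREATE PROCEDURE LOAD_IN_CHUNKS
RETURNS (ROWS_DONE INTEGER)
AS
DECLARE VARIABLE V_ID INTEGER;
DECLARE VARIABLE V_VAL VARCHAR(100);
DECLARE VARIABLE V_COUNT INTEGER;
BEGIN
  ROWS_DONE = 0;
  V_COUNT = 0;
  FOR SELECT ID, VAL FROM EXT_DATA INTO :V_ID, :V_VAL DO
  BEGIN
    INSERT INTO TARGET (ID, VAL) VALUES (:V_ID, :V_VAL);
    ROWS_DONE = ROWS_DONE + 1;
    V_COUNT = V_COUNT + 1;
    IF (V_COUNT = 10000) THEN
    BEGIN
      SUSPEND;  /* return a row to the client; limits undo work */
      V_COUNT = 0;
    END
  END
  SUSPEND;
END ^

SET TERM ; ^
```

The client then runs SELECT ROWS_DONE FROM LOAD_IN_CHUNKS and must keep fetching rows until the procedure finishes; each fetched row corresponds to one SUSPEND.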