Subject: Re: [ib-support] %100 CPU usage with interbase, is it usual?
Author: Kaputnik
Post date: 2001-12-28T18:13:23Z
"KURSAT TASKIN" <kursatt@...> schrieb im Newsbeitrag
news:05664FAC761BD411B66D00D0B73EC3FE1DA59B@ETIEXCHANGE...
Please, please, please set up your mail client or newsreader to quote
messages correctly, with indentation, prefixes or something similar, so that
others can better follow what is quoted and what you have written.
So, to your problem:
Are you committing after every insert? Are you committing at all? Inserting
1,000,000 records without committing at all will make the engine very, very
slow, especially if other transactions are open at the same time.
A good measure is to commit after roughly 10k records, give or take,
depending on your system.
Indexes (or indices) are used to speed up sorting and searching in tables by
building a B-tree on the specified column(s), and that B-tree has to be
maintained for every insert.
If you don't have any, then there is nothing to do here anyway :-)
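If there were indexes on the table, a common trick would be to switch them
off for the duration of the load and rebuild them afterwards, so the B-tree
is not maintained row by row. Below is a rough Delphi sketch of that idea,
not code from this thread: the ALTER INDEX ... INACTIVE/ACTIVE statements are
standard InterBase DDL, but the IBO unit and member names (IB_Components,
TIB_Connection, TIB_DSQL, Execute) and the index name are assumptions to
check against your IBO version, and it only works for indexes that don't
back a constraint.

uses
  SysUtils, IB_Components; // IB_Components is assumed to be the core IBO unit

// Hedged sketch: deactivate a user-defined index before a bulk load and
// reactivate (rebuild) it afterwards. MYTABLE_IDX1 in the usage note below
// is an invented index name.
procedure SetIndexActive(Cn: TIB_Connection; Tr: TIB_Transaction;
  const IndexName: string; Active: Boolean);
var
  DDL: TIB_DSQL;
begin
  DDL := TIB_DSQL.Create(nil);
  try
    DDL.IB_Connection := Cn;
    DDL.IB_Transaction := Tr;
    if Active then
      // ACTIVE rebuilds the whole B-tree in one go after the load
      DDL.SQL.Text := 'ALTER INDEX ' + IndexName + ' ACTIVE'
    else
      // INACTIVE stops the index from being updated for every insert
      DDL.SQL.Text := 'ALTER INDEX ' + IndexName + ' INACTIVE';
    DDL.Execute;
    Tr.Commit; // make the DDL change effective before/after the load
  finally
    DDL.Free;
  end;
end;

// Usage: SetIndexActive(Cn, Tr, 'MYTABLE_IDX1', False) before the import,
//        SetIndexActive(Cn, Tr, 'MYTABLE_IDX1', True) after it.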
I am pretty curious why you need an IB_Script in parallel with your query.
Generating a script at runtime and then executing it would be a very
cumbersome approach.
With IBO, the best way to loop one-way through a table is a TIB_Cursor.
To insert your records, take an IB_DSQL with a parameterised insert statement
and prepare it. Loop through the data, fill the parameters and execute once
for every record. You can commit manually (prepared statements don't need to
be re-prepared; they are prepared outside of transactions... [anybody correct
me here]).
This should be lightning fast, and your database won't be blocked by other
users.
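To make that concrete, here is a rough Delphi sketch of the whole pattern,
combining the prepared, parameterised IB_DSQL with the commit-every-10k-records
advice from above. It is not code from this thread: the IBO unit and member
names (IB_Components, ParamByName, CommitRetaining, StartTransaction, Execute)
are written from memory and should be checked against your IBO version, and
the table name, the three columns and the ';'-separated file format are
invented for the example.

uses
  SysUtils, IB_Components; // IB_Components is assumed to be the core IBO unit

// Pull the next ';'-separated value off the front of S (plain Delphi helper,
// invented for this sketch).
function NextField(var S: string): string;
var
  P: Integer;
begin
  P := Pos(';', S);
  if P = 0 then
  begin
    Result := S;
    S := '';
  end
  else
  begin
    Result := Copy(S, 1, P - 1);
    Delete(S, 1, P);
  end;
end;

// Hedged sketch of the prepared-insert loop described above. MYTABLE and the
// F1..F3 columns are invented; adjust them to your own schema and file format.
procedure ImportTextFile(const FileName: string;
  Cn: TIB_Connection; Tr: TIB_Transaction);
var
  DSQL: TIB_DSQL;
  F: TextFile;
  Line: string;
  Count: Integer;
begin
  DSQL := TIB_DSQL.Create(nil);
  try
    DSQL.IB_Connection := Cn;
    DSQL.IB_Transaction := Tr;
    DSQL.SQL.Text :=
      'INSERT INTO MYTABLE (F1, F2, F3) VALUES (:F1, :F2, :F3)';

    AssignFile(F, FileName);
    Reset(F);
    try
      Tr.StartTransaction;    // explicit start; IBO may also start it implicitly
      DSQL.Prepare;           // prepare once, execute many times
      Count := 0;
      while not Eof(F) do
      begin
        ReadLn(F, Line);
        DSQL.ParamByName('F1').AsString := NextField(Line);
        DSQL.ParamByName('F2').AsString := NextField(Line);
        DSQL.ParamByName('F3').AsString := NextField(Line);
        DSQL.Execute;         // one execute per record, no re-prepare
        Inc(Count);
        if Count mod 10000 = 0 then
          Tr.CommitRetaining; // commit every ~10k rows, keep the statement usable
      end;
      Tr.Commit;              // final hard commit
    finally
      CloseFile(F);
    end;
  finally
    DSQL.Free;
  end;
end;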
My test-data generator can insert 10k+ records per second this way on my
home machine (disks in RAID), and the CPU is used more by the data-generating
algorithm than by the DB. On our dual-CPU machine, FB takes up to 30% of one
CPU while the other CPU runs at roughly 95% with the data-generation
algorithm.
Cu, Nick
> RAM,
> Is this a program that you wrote ?
> If so, what are you using for database access ?
> How many tables does the import data occupy ?
>
> Kursat: I have only one table with 3 fields and a 1,000,000-record txt file.
> I read the data from the txt file in a while loop using IBODatabase, IBOQuery
> and IB_Script: first I delete all the records, then I read the data one by
> one, insert into the GDB and post. That's all. My computer is a PIII 650 with
> 192 MB RAM, running WinNT.