Subject: Re: [Firebird-Java] How do you handle HUGE amounts of data?
Author: Helen Borrie
At 09:52 AM 5/06/2003 +0200, you wrote:
>Hei :)
>
>I must say that InterBase/Firebird is driving me crazy...
>I have to use FB/SS and I hate it.
>What I programmed is some kind of datapump:
>DB2 (AS/400) -> FB/SS (Linux)
>and FB/SS eats up ALL the memory and at one point it gets stuck, because
>Linux has to swap and I can only do 5 transactions per minute... pretty poor,
>especially when you keep in mind that I have to pump 150,000 rows of data and it
>starts getting stuck at 50,000...
>
>How do you handle such things? I already use disconnect/reconnect, but
>the SS has a single always-running process and it is not freeing the
>memory... it's even a known bug...
>
>http://www.ibphoenix.com/main.nfs?a=ibphoenix&l=;IBPHOENIX.KNOWLEDGEBASE;ID='45'

Is it? Can you describe the bug?

When you say "5 transactions per minute" - are you saying that you are
starting a new transaction for each row? If so, that will make for a ve-e-e-ry
slow datapump.

The main thing with data-pumping (reading from an internal or external
source and inserting into a Firebird database) is to batch the rows into
transaction groups of around 10,000 rows (fewer, if the row structure is
huge).
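A minimal JDBC sketch of that batching idea, since this is the Jaybird list. The table, column names, and connection are placeholders (assumptions, not from the original post); the point is the shape: autocommit off, `addBatch()`, and a commit every 10,000 rows so the transaction never grows unbounded.

```java
import java.sql.*;

public class DataPump {
    static final int BATCH_SIZE = 10_000; // commit every N rows, per the advice above

    // Pure helper: true when a commit is due after inserting 1-based row number `row`.
    static boolean commitDue(long row) {
        return row % BATCH_SIZE == 0;
    }

    // Sketch: pump rows from a source ResultSet into Firebird in batched transactions.
    // `target_table` and its columns are hypothetical.
    static void pump(ResultSet source, Connection fb) throws SQLException {
        fb.setAutoCommit(false); // group inserts into explicit transactions
        try (PreparedStatement ins = fb.prepareStatement(
                "INSERT INTO target_table (id, payload) VALUES (?, ?)")) {
            long row = 0;
            while (source.next()) {
                ins.setLong(1, source.getLong(1));
                ins.setString(2, source.getString(2));
                ins.addBatch();
                row++;
                if (commitDue(row)) {
                    ins.executeBatch();
                    fb.commit(); // end the transaction, releasing its resources
                }
            }
            ins.executeBatch();
            fb.commit(); // final partial batch
        }
    }
}
```

One transaction per row means 150,000 commits; one transaction per 10,000 rows means 15 - that alone usually transforms the throughput.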

As with any DB, large batch inserts can cause the indexes to go out of
balance, which tends to slow down any operations on that table. Use ALTER
INDEX to deactivate an index before a big batch insert, then use it
again afterwards to rebuild the index. A 150,000-row pump isn't
enormous. If you are getting an untoward amount of memory swapping, your
db cache might need looking at, too.
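The deactivate/rebuild step above can be wrapped like this; the index name is a made-up example, and the helper that runs the bulk insert is whatever you already have. Firebird's `ALTER INDEX ... INACTIVE` stops index maintenance, and `ALTER INDEX ... ACTIVE` rebuilds the index from scratch, freshly balanced.

```java
import java.sql.*;

public class IndexToggle {
    // Build the Firebird ALTER INDEX statement: false deactivates, true rebuilds.
    static String alterIndexSql(String indexName, boolean active) {
        return "ALTER INDEX " + indexName + (active ? " ACTIVE" : " INACTIVE");
    }

    // Sketch: run a bulk-insert step with the index switched off, then rebuild it.
    static void withIndexDisabled(Connection fb, String indexName, Runnable bulkInsert)
            throws SQLException {
        try (Statement st = fb.createStatement()) {
            st.execute(alterIndexSql(indexName, false)); // stop maintaining the index
            fb.commit();                                 // DDL takes effect on commit
            bulkInsert.run();                            // the big batched insert
            st.execute(alterIndexSql(indexName, true));  // rebuild, freshly balanced
            fb.commit();
        }
    }
}
```

Note that only regular indexes can be deactivated this way; indexes backing primary key and unique constraints stay active.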

This isn't a Jaybird problem, by the way. IB-Support is the proper place
for help with non-Jaybird questions.
<mailto:ib-support-subscribe@yahoogroups.com>

^heLen
(resident Moderator-witch)