Subject Re: [Firebird-Java] How do you handle HUGE amounts of data?
Author Lester Caine
> I must say that InterBase/Firebird is driving me crazy...
> I have to use FB/SS and I hate it.
> What I programmed is some kind of data pump:
> DB2 (AS/400) -> FB/SS (Linux)
> and FB/SS eats up ALL the memory, and at one point it gets stuck, because
> Linux has to swap and I can only do 5 transactions per minute... pretty poor,
> especially when you bear in mind that I have to pump 150,000 rows of data and it
> starts getting stuck at 50,000...
>
> How do you handle such things? I already use the disconnect/reconnect trick, but
> the SS has a single always-running process and it does not free the
> memory... it's even a known bug...

Is this a one-time transfer, or something that has to be
done regularly?

I dump raw data quite often (but not using the Java
interface, and not DB2), and I do it using more direct tools.

That said, if things run fine up to 50,000 rows, can you
simply segment the dump? Dump 40,000 rows, commit, drop out,
and then go back in for the next block. Data that has already
been committed should not be touched when the next block is
written. In fact, a commit should 'flush' the data and allow
you to continue anyway.
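Something along these lines, a rough JDBC sketch of the
segmented approach, assuming the JTOpen driver on the DB2 side
and Jaybird on the Firebird side; the URLs, credentials, and
table/column names (SOURCE_TABLE, TARGET_TABLE, ID, NAME) are
placeholders, not anything from your actual setup:

import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.Statement;

public class SegmentedPump {
    // Commit every BLOCK rows so the server can release the
    // resources of each finished transaction, instead of
    // accumulating 150,000 rows in a single huge one.
    private static final int BLOCK = 10_000;

    public static void main(String[] args) throws Exception {
        // Placeholder URLs and credentials -- substitute your own.
        try (Connection src = DriverManager.getConnection(
                 "jdbc:as400://as400host/MYLIB", "user", "pass");
             Connection dst = DriverManager.getConnection(
                 "jdbc:firebirdsql://fbhost/employee", "SYSDBA", "masterkey")) {

            dst.setAutoCommit(false); // we commit manually per block

            try (Statement read = src.createStatement();
                 ResultSet rs = read.executeQuery(
                     "SELECT ID, NAME FROM SOURCE_TABLE");
                 PreparedStatement write = dst.prepareStatement(
                     "INSERT INTO TARGET_TABLE (ID, NAME) VALUES (?, ?)")) {

                int rows = 0;
                while (rs.next()) {
                    write.setInt(1, rs.getInt(1));
                    write.setString(2, rs.getString(2));
                    write.addBatch();

                    if (++rows % BLOCK == 0) {
                        write.executeBatch();
                        dst.commit(); // end the transaction for this block
                    }
                }
                write.executeBatch(); // flush the final partial block
                dst.commit();
            }
        }
    }
}

The point is simply that each commit ends the transaction, so
the server has the chance to clean up after every block rather
than carrying the whole transfer in one go.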

--
Lester Caine
-----------------------------
L.S.Caine Electronic Services