Subject | Memory problem |
---|---|
Author | Thomas Besand |
Post date | 2005-05-10T12:24:46Z |
Hello NG,
right now I'm writing a program that is used to administer addresses. I'm
using FB 1.5.2 as the DB. The program itself is written in Delphi 5
with the ZEOS Data Access Components.
I've written to this group before about memory problems I encountered
while importing large text files.
By now I've dug into the problem a little further; here's what
I've found out so far:
Because of frequent crashes while importing text files containing more
than 1 million records, I implemented a mechanism that allows the user
to resume a crashed import.
Since I commit writes to the DB tables every 10000 records, I also write
status information to a text file at each commit (i.e. the number of
lines successfully written to the DB).
This text file is deleted after the import completes successfully.
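In case it helps, here's roughly what my import loop looks like (heavily
simplified; conDB, the field name RAWLINE and the status file name are
placeholders for what my real program uses, tblWriteData is my actual ZEOS
table component, and AutoCommit is assumed to be off on the TZConnection):

```pascal
procedure RunImport(const FileName: string);
var
  Src, Status: TextFile;
  Line: string;
  LineNo: Integer;
begin
  AssignFile(Src, FileName);
  Reset(Src);
  try
    LineNo := 0;
    conDB.StartTransaction;
    while not Eof(Src) do
    begin
      ReadLn(Src, Line);
      Inc(LineNo);
      tblWriteData.Append;
      tblWriteData.FieldByName('RAWLINE').AsString := Line;
      tblWriteData.Post;
      if LineNo mod 10000 = 0 then
      begin
        conDB.Commit;               { commit every 10000 records... }
        conDB.StartTransaction;
        AssignFile(Status, 'import.status');
        Rewrite(Status);
        WriteLn(Status, LineNo);    { ...and checkpoint the progress }
        CloseFile(Status);
      end;
    end;
    conDB.Commit;
    DeleteFile('import.status');    { checkpoint removed on success }
  finally
    CloseFile(Src);
  end;
end;
```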
If an import is requested and the program finds this text file, it
offers the user the option to resume the crashed import. It reads the
number of lines successfully imported and starts saving lines to the DB
only after that point.
That way I hoped to get around the memory problem with large files.
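The resume check itself looks about like this (again simplified, with a
placeholder dialog text and file name; on resume, the import loop above
then just does ReadLn without Append/Post until it is past the returned
line number):

```pascal
function GetResumeLine: Integer;
var
  Status: TextFile;
begin
  Result := 0;  { 0 = start from the beginning }
  if FileExists('import.status') and
     (MessageDlg('A previous import did not finish. Resume it?',
                 mtConfirmation, [mbYes, mbNo], 0) = mrYes) then
  begin
    AssignFile(Status, 'import.status');
    Reset(Status);
    ReadLn(Status, Result);  { number of lines already written to the DB }
    CloseFile(Status);
  end;
end;
```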
But nope: with the first call to tblWriteData.Post I can watch the
memory usage in the Task Manager go up like a rocket, until it is around
the amount that previously caused the program to crash. Eventually it
does write a few more records, but that's not what I intended.
Can someone please enlighten me as to what is happening here, and maybe
point me in a direction to avoid this behaviour?
Thanks a lot for your efforts
Thomas Besand
Berlin, Germany