Subject | Re: [IBO] [] Execution of prepared statements consumes a lot of memory
---|---
Author | Helen Borrie |
Post date | 2004-12-14T22:48:22Z |
At 04:30 PM 14/12/2004 +0100, you wrote:
>As this problem is rather complicated, I found it very difficult to
>decide which information should be provided and which not. Problem
>reports for complicated problems have a tendency to get too large so
>that nobody reads them ...

The length, per se, is not so much a problem for me as messy
requoting. It's easier to follow through on complex problems if you take
the little extra time to re-paste source listings when they are referred to
again, enabling the troubleshooter to paste them directly into IB_SQL.
> > You were going to try using ib_query instead of
> > ib_cursor -- did you?
>
>I did today. And it seems that you had a good nose. Without any
>tweaks, just with IB_Cursors changed to IB_Queries, the server memory
>usage could be reduced significantly. With IB_Cursors, the server used
>1250 MB of memory, with IB_Queries it was only 430 MB.
>It seems that Fastreport does have some problems when it is used with
>cursors. I will have a deeper look into the issue later.
>
>What do you mean by "blindly using ib_cursors"? If this should mean
>that it was wrong to use them, you are right. I just did not think
>that FastReport would screw up when working with cursors, and as
>cursors are the fastest way to retrieve rows using IBO, I thought it
>was the right decision to use them.

IB_Cursor is great for things that don't need the overhead of
buffering. As it transpired (eventually) your report involved many nested
levels. IB_cursors would be OK for this if the report didn't involve
grouping - but yours clearly did.
Because no buffers were available for each iteration of each level of the
outer loop, I could envisage that FastReport was probably doing something
internally to retain (or restore) access to the data controlling the
loops. As Delphi reporting tools, in the general sense, work with buffered
datasets, I could visualise 1000 or so separate cursors accumulating there
on the server, waiting for Fetch calls that would never happen. If
FastReport didn't know how to clean up these cursors, that would account
for the unusual memory growth.
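
To make the cursor point concrete, here is a bare-bones sketch. The table,
the loop and the routine name are invented, not taken from your report; the
classes are the stock IBO ones from the IB_Components unit. A TIB_Cursor is
an unbuffered, server-side cursor: it stays alive on the server until you
either fetch through to Eof or close it, so a reporting loop that opens one
per group and never finishes fetching leaves it hanging there.

procedure WalkDetailRows(Cn: TIB_Connection; Tr: TIB_Transaction);
var
  Cur: TIB_Cursor;
begin
  Cur := TIB_Cursor.Create(nil);
  try
    Cur.IB_Connection := Cn;
    Cur.IB_Transaction := Tr;
    Cur.SQL.Text := 'select * from DETAIL';   // invented statement
    Cur.Open;
    while not Cur.Eof do
    begin
      // ... hand the current row to the report engine ...
      Cur.Next;
    end;
  finally
    Cur.Close;   // without this (or fetching to Eof) the server keeps the cursor open
    Cur.Free;
  end;
end;

Swapping TIB_Cursor for TIB_Query changes the picture because TIB_Query
buffers the fetched rows on the client side, which is what banded report
engines like FastReport expect to work with.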
> > and you didn't answer my question about whether the
> > application and the database server were running on the same machine.
>
>You did not ask this as direct question, and enthusiastic about your
>answer, it seems I did not recognize your indirect question ;-)

It was important at that point in the troubleshooting, since I was
focussing on what FastReport was doing with those loops.
>BTW, the main reason for my last posting was because nobody answered my
>question about the page cache. You indicated that the page cache could
>use a lot of server memory, and I replied:
>"The default page cache size per database is 2048 pages, and my page
>size is 8 KB. AFAIU the page cache should only consume 16 MB then?"

Yes. I was trying to eliminate the possibility that an oversized cache was
partly accountable for the memory growth. For example, a 20,000-page cache
of 16K page size would keep growing until it was ~328 MB.
If there were sorts involved (which I couldn't tell, because you didn't
provide any SQL) -- possibly at all three levels -- involving perhaps 1000
* 1000 "redundant" open cursors, then the big numbers you reported would be
unsurprising.
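
For anyone checking those two cache figures, the back-of-the-envelope
arithmetic is just pages times page size; the server adds some per-buffer
overhead on top, so treat it as a lower bound. A sketch only:

{ Rough lower bound; per-buffer overhead is ignored. }
function ApproxCacheBytes(Pages, PageSizeBytes: Integer): Int64;
begin
  Result := Int64(Pages) * PageSizeBytes;
end;

// ApproxCacheBytes(2048, 8 * 1024)   = 16,777,216 bytes   -> the ~16 MB default you quoted
// ApproxCacheBytes(20000, 16 * 1024) = 327,680,000 bytes  -> the ~328 MB example above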
As to curing the problem, your tests with ib_query seem to eliminate the
overhead of these redundant cursors being held open on the server.
A big report cycle with a lot of sorts is going to consume a lot of RAM in
Fb 1.5, too, because, unlike previous servers, Fb 1.5 uses RAM for storing
the intermediate sets. As I suggested, you could lower the ceiling by
splitting the run into separate transactions, but you said you needed the
whole run to be done in one single transaction. OK -- so, to make an
omelette you have to break some eggs. :-)
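
For the record, splitting the run would look something like the sketch
below, if the single-transaction requirement could ever be relaxed. The
transaction component is the stock IBO TIB_Transaction; the section
callback is a hypothetical stand-in for "one chunk of the report", not
anything from your application.

type
  TSectionProc = procedure(SectionNo: Integer);

procedure RunInSeparateTransactions(Tr: TIB_Transaction;
  SectionCount: Integer; RunSection: TSectionProc);
var
  i: Integer;
begin
  for i := 1 to SectionCount do
  begin
    Tr.StartTransaction;
    try
      RunSection(i);   // one chunk of the report per transaction
      Tr.Commit;       // each commit lets the server release what that chunk held
    except
      Tr.Rollback;
      raise;
    end;
  end;
end;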
As someone else suggested, if sorting performance isn't important, and low
RAM is causing problems, you could crank down the amount being used for
in-memory sorting. You would have to stop the server to do this, of
course, and any other users also connected would lose out on the benefit of
RAM sorts, too. If the server has plenty of RAM, I'd just leave the sort
memory alone and accept it as a boon that Fb 1.5 gives you a way to use
it. Database servers love RAM.
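
For reference, the knobs involved live in firebird.conf. The values below
are purely illustrative, not recommendations, and you should check the
comments in your own copy of the file before touching anything; a restart
is needed for the changes to take effect.

# Cap on RAM used for in-memory sorting, in bytes; sorts bigger than this
# spill to temporary sort files on disk (slower, but lighter on RAM).
SortMemUpperLimit = 16777216

# Size of each memory block allocated for sorting, in bytes.
SortMemBlockSize = 1048576

# The per-database page-cache default discussed earlier in the thread.
DefaultDbCachePages = 2048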
> > Sorry, I'm not Wonder Woman. :-)
>
>Are you sure? ;-)

Yup. I can't see through brick walls or dense mud and I can't move
mountains. I can't fly. And I'd *never* pass the Swimsuit Test. :-)
cheers,
Helen