Subject: RE: Prepared Statements and Memory Usage (Now With Subject!!!)
Author: Ryan Thomas
Ryan Thomas
TransActive Systems

P: (02) 4322 3302
F: (02) 4325 1141
E: ryan@...
W: http://www.transactive.com.au/



_____

From: Ryan Thomas [mailto:ryan@...]
Sent: Tuesday, 12 April 2005 3:32 PM
To: 'IBObjects@yahoogroups.com'
Subject:


Hi Helen, thanks for the quick response.

Helen Borrie wrote:
> Hello Ryan,
>

*snip*

>
> That's exactly what Prepare does. It literally *prepares* the various
> structures that are needed to instantiate the statement and prepare the
> XSQLDA structure for transporting the parameter data across to the
> server. Amongst the many pieces of this task is to prepare one XSQLVAR for
> *each* parameter. Each XSQLVAR is made of several pieces, concerning data
> type, size, etc., as well as a buffer large enough to carry data of the
> maximum defined size. 82 parameters is one heck of a lot of XSQLVARs!! If
> a lot of these are character types, as one supposes they might be if you
> are processing XML, then that's simply one heck of a lot of memory that has
> to be allocated.

Most of the data from the XML doc for this table is numeric; there is also
one blob and one char(20).

Adding all of this up, I don't think it should jump 20Mb.
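To put a rough number on it (back-of-the-envelope only, assuming the classic
XSQLDA/XSQLVAR structures and worst-case buffer sizes, so treat these as
approximate):

  82 params x (~150 bytes of XSQLVAR descriptor + at most 8 bytes of numeric buffer)
  + ~20 bytes for the char(20) + 8 bytes for the blob ID
  = a few tens of KB per prepared statement, nowhere near 20Mb.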

> If you are iterating through a loop, it is way far more economical to keep
> the statement prepared and recycle the allocated memory, than it is to
> unprepare it all (freeing that memory) and then re-do it in the next cycle
> of the loop.
>

The problem here is that the table the data is to be inserted into is
defined in the XML, and there are about 200 tables, so on each iteration we have
to re-create the insert SQL (or we could dynamically create the IB_DSQL objects
for each new table and store references to them in a lookup table for future
use; sorry, just thought of that as I was typing...).
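Something like this is what I had in mind (a rough sketch only; FStmtCache,
FConnection, FTransaction and BuildInsertSQLFor are names I've just made up,
and the TIB_DSQL properties would need checking against the IBO docs):

  function TXmlLoader.StatementFor(const TableName: string): TIB_DSQL;
  var
    idx: Integer;
  begin
    // FStmtCache is a sorted TStringList; Objects[] holds the prepared statements
    idx := FStmtCache.IndexOf(TableName);
    if idx >= 0 then
      Result := TIB_DSQL(FStmtCache.Objects[idx])
    else
    begin
      Result := TIB_DSQL.Create(nil);
      Result.IB_Connection := FConnection;
      Result.IB_Transaction := FTransaction;
      Result.SQL.Text := BuildInsertSQLFor(TableName); // INSERT ... VALUES (:p1, :p2, ...)
      Result.Prepare;                                  // pay the prepare cost once per table
      FStmtCache.AddObject(TableName, Result);
    end;
  end;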

>
>>In doing some debugging, I have traced the memory usage to the
>>IB_Components.pas file (starting line 22038) where the parameters are
>>iterated through (I have included the loop below). I don't know if this is a
>>problem with the IBO code (I hope that if it was then it would have been
>>discovered and dealt with by now) or the way we are using the queries.
>
>
> It really gets down to the way you are doing this processing. To begin
> with, IB_Query is a poor choice for iteratively processing a DML
> statement. IB_Cursor would use less resources and IB_DSQL the least of
> all. Is there some particular reason you have to use an IB_Query?

No, no particular reason. I'll switch to IB_DSQL, thanks for the insight.
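If I've understood it right, with a single prepared TIB_DSQL the per-row work
should shrink to roughly this (sketch only; the parameter names are invented,
and the exact blob-loading call depends on the IBO version):

  // statement prepared once (e.g. via the lookup idea above); then, per row:
  stmt.ParamByName('AMOUNT').AsFloat := amount;
  stmt.ParamByName('CODE').AsString  := code;   // the char(20) column
  // (blob column set via whatever blob-loading method your IBO version provides)
  stmt.Execute;                                 // statement stays prepared between rows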

> Now, whilst you can save resources by choosing the correct component for
> submitting the DML request, you can't reduce the amount of memory that is
> allocated for executing the statement. A parameter has to be large enough
> to take the largest possible parameter value presented to it; so, if this
> is the way you have to perform this task, you are very much stuck with the
> consequences.
>

I've been looking further into my code for this and have tried a few things.
Commenting out all of the code *after* the prepare stops the memory jumping
when Prepare() is called (I can't see why, though). With that code left in,
when I step over the Prepare() call the memory jumps up by ~20Mb. Looking
further, I called Unprepare() after the Prepare() call and the memory was not
freed (this *has* to be something in my app).
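One thing I'm going to try is isolating Prepare/Unprepare in a tight loop with
everything else stripped out, roughly like this (sketch; BuildInsertSQLFor is
the made-up helper from above):

  for i := 1 to 100 do
  begin
    dsql.SQL.Text := BuildInsertSQLFor(TableName);
    dsql.Prepare;
    dsql.Unprepare;
  end;
  // If memory still climbs across this loop, the growth is in the prepare path;
  // if it stays flat, the culprit is the code that runs after Prepare().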

*snip*

> One thing that's not clear in your description is whether the client is
> local or remote, i.e. you don't say whether the client is sharing resources
> with the server.

It's a local client and all of the memory usage is in the client app; the
server is rock solid and stays at ~26Mb of VM used.

After processing about 150 records through the app, my machine is up at about
1.2Gb of memory usage (according to Task Manager, and to how slow my machine is
in this state!).

> Another question I'm inclined to ask is why you need to do this processing
> as a "client-side" task at all. I'm asking that, since you didn't say
> whether the XML source text was coming from the output of a database query,
> or by parsing an input file...
>
> -- if the former, then a stored procedure with a couple of nested loops
> would be far more efficient than what you're doing now
>
> -- if the latter, then the question arises of where the "hit" of file i/o
> is being taken.

The XML comes from a remote host and is processed by the client (to update the
data in their db).


> regards,
> Helen
>

Thanks for all the help, Helen. I'm going to find whatever is causing this
crazy memory usage, and I'll let you know (as long as it's not *too*
embarrassing ;) ).

Regards,

Ryan Thomas
TransActive Systems

