Subject RE: [Firebird-Java] Maximum number of items in batch
Author Rick DeBay
Of course, the most efficient approach would just create an int[] of the
proper size and resize it before throwing the BatchUpdateException.
:-)
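
Roughly, as a sketch against the method Roman posted below (internalExecute,
getUpdateCount, isExecuteProcedureStatement and batchList are his names; the
trim() helper is something I'm making up here for the "resize" step):

    int[] counts = new int[batchList.size()];
    int filled = 0;

    Iterator iter = batchList.iterator();
    while (iter.hasNext()) {
        XSQLVAR[] data = (XSQLVAR[]) iter.next();
        // ... copy data into the statement parameters as before ...

        try {
            // on failure, hand back only the counts collected so far
            if (internalExecute(isExecuteProcedureStatement))
                throw new BatchUpdateException(trim(counts, filled));

            counts[filled++] = getUpdateCount();
        } catch (SQLException ex) {
            throw new BatchUpdateException(ex.getMessage(), ex.getSQLState(),
                ex.getErrorCode(), trim(counts, filled));
        }
    }

    return counts; // fully filled on success, nothing to trim

    // and the helper doing the "resize" only when the batch fails:
    private static int[] trim(int[] counts, int filled) {
        int[] partial = new int[filled];
        System.arraycopy(counts, 0, partial, 0, filled);
        return partial;
    }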

-----Original Message-----
From: Rick DeBay
Sent: Tuesday, March 15, 2005 5:35 PM
To: Firebird-Java@yahoogroups.com
Subject: RE: [Firebird-Java] Maximum number of items in batch


I'll see if I can link in a free profiler when I get a chance. I'm
guessing the problem is case b).

One free improvement would be the result list (and the batch list):
http://soft.killingar.net/documents/LinkedList+vs+ArrayList
(the relevant line in the following graph is ArrayList best-case, as we
know the size to preallocate)
http://soft.killingar.net/docs/LLvsAL/graph.pdf

Try:
List results = new ArrayList(batchList.size());

There is less memory overhead (ArrayList doesn't store pointers to the
previous and next nodes), memory allocation is more efficient (it's all done
at one time), and it's fail-fast (if there isn't enough memory, it fails at
the beginning before doing any more work). Of course, this ignores the memory
allocated for each new Integer.
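
In executeBatch() itself that's a one-line change (sketch only, assuming the
toArray() helper works off the List interface; the rest of the method stays as
posted):

    // one backing Object[] allocated up front instead of one node object per
    // update count; the allocation also fails immediately if memory is short
    List results = new ArrayList(batchList.size());

    // results.add(new Integer(updateCount)) and toArray(results) stay the same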

BTW, is backupVars being used?

-----Original Message-----
From: Roman Rokytskyy [mailto:rrokytskyy@...]
Sent: Tuesday, March 15, 2005 4:47 PM
To: Firebird-Java@yahoogroups.com
Subject: Re: [Firebird-Java] Maximum number of items in batch


> Does it allocate memory anywhere else? The OutOfMemory error happens
> several seconds after entering executeBatch(). So the parameter List
> has already been created, and Jaybird is only iterating through it.

A bit, but not so much. Below is the code of that method:

public int[] executeBatch() throws SQLException {

    Object syncObject = getSynchronizationObject();

    LinkedList results = new LinkedList();
    Iterator iter = batchList.iterator();

    boolean commit = false;
    synchronized(syncObject) {
        c.ensureInTransaction();
        XSQLVAR[] backupVars = null;
        try {
            while(iter.hasNext()) {
                XSQLVAR[] data = (XSQLVAR[])iter.next();

                XSQLVAR[] vars = fixedStmt.getInSqlda().sqlvar;
                for (int i = 0; i < vars.length; i++) {
                    vars[i].copyFrom(data[i]);
                    isParamSet[i] = true;
                }

                try {
                    if (internalExecute(isExecuteProcedureStatement))
                        throw new BatchUpdateException(
                            toArray(results));

                    int updateCount = getUpdateCount();

                    results.add(new Integer(updateCount));

                } catch(SQLException ex) {
                    throw new BatchUpdateException(
                        ex.getMessage(),
                        ex.getSQLState(),
                        ex.getErrorCode(),
                        toArray(results));
                }
            }

            commit = true;

            return toArray(results);

        } finally {
            clearBatch();
            c.checkEndTransaction(commit);
        }
    }
}

So, there are two possibilities:

a) vars[i].copyFrom(data[i]) - copies the content of the stored parameter to
the parameter associated with the statement. It makes a deep copy, including
the byte[] with the data.

b) results.add(new Integer(updateCount));

I could probably re-code both cases to minimize memory usage, but the data in
case a) will be garbage collected on the next loop iteration, and case b) will
be garbage collected after the method call (if the application ignores the
values returned there).
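
For case a) only one row's copies are strongly reachable at any time. A generic
sketch of the pattern (plain byte[][] standing in for the XSQLVAR[]s, nothing
Jaybird-specific):

    // the statement's parameter slots are overwritten on every pass, so the
    // previous row's deep copies become unreachable and can be collected while
    // the batch is still running; only the results list from case b) keeps
    // growing for the whole call
    static void copyRows(List batchList, byte[][] paramSlots) {
        for (Iterator iter = batchList.iterator(); iter.hasNext();) {
            byte[][] row = (byte[][]) iter.next(); // one batched parameter row
            for (int i = 0; i < paramSlots.length; i++) {
                // deep copy, like XSQLVAR.copyFrom()
                paramSlots[i] = (byte[]) row[i].clone();
            }
            // ... execute the statement for this row ...
        }
    }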

But I haven't run it through a profiler, so maybe there is something I have
missed. Can you check this with a memory profiler?

Roman



