Subject | Re: [firebird-support] Mysterious timeout of exactly 60 seconds
---|---
Author | Daniel Albuschat
Post date | 2008-09-30T13:46:47Z
Hi Milan,
2008/9/29 Milan Babuskov <milanb@...>:
> Daniel Albuschat wrote:
>> Every now and then, but at what seems to be a specific interval, the
>> execution of the update statement takes exactly 60 seconds.
>> I suspect there's some timeout occurring here, but the application has
>> exclusive access to the database, so there are no concurrent
>> transactions besides the two this application creates.
>>
>> Do you have any idea what is happening here?
>
> Garbage collection? To test you can, for example, try disabling GC
> completely and see if the problem goes away.

I still don't know what caused this 60-second timeout, but I made
numerous optimizations to the application and to the database access,
and after finishing them all and seeing the 60-second issue go away, I
found the main reason for the slowdown.
I did something like "insert into table(id, name, sortid) values
(gen_id(table_gen, 1), :name, (select max(sortid) from table) + 1)".
The subselect grew ever slower as more records were inserted. The
table was only supposed to hold on the order of 100 to 1,000 records,
but a subtle bug in the application caused it to be filled with
100,000+ records instead.
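For illustration, here is roughly what that pattern looks like, next to
two common ways of avoiding the per-insert aggregate subselect in
Firebird. The table, generator and index names below are made up for
the example, and the alternatives are general suggestions rather than
what the application actually ended up using:

  -- Slow pattern: the aggregate subselect gets more expensive
  -- with every record added to the table
  insert into items (id, name, sortid)
  values (gen_id(items_gen, 1), :name,
          (select max(sortid) from items) + 1);

  -- Alternative 1: take sortid values from a second generator,
  -- so no subselect is needed at all
  insert into items (id, name, sortid)
  values (gen_id(items_gen, 1), :name, gen_id(items_sort_gen, 1));

  -- Alternative 2: a descending index lets Firebird resolve
  -- max(sortid) from the index instead of reading the whole table
  create descending index idx_items_sortid on items (sortid);

A generator can leave gaps in the sortid values, but for pure ordering
that usually does not matter.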
Stupid, I know, but that part of the source was a bit hidden and it
looked so innocent when scrolling through it. ;-)
--
eat(this); // delicious suicide