Subject: Re: [ib-support] more deadlocks
Author: Pavel Cisar
Post date: 2002-12-20T17:13:05Z
Hi,
On 20 Dec 2002 at 14:55, Lauri Zoova wrote:
> I have a table with products(id, name, count) and another table with
> sold_products(id, name, count, invoiceno.. etc). sold_products has a
> trigger that fires on table insertions. It updates the count of products
> in products table.
Well, you have shot yourself in the foot :) Because insert is the most
frequently performed action (in an ordinary application, about 80% of all
data manipulation, with 15% for updates and 5% for deletes), it's not a
good idea to couple it with an update in the same transaction. While an
insert never blocks, an update does. The chance that blocking occurs
grows with the number of rows updated per transaction, with a smaller
number of rows in the updated tables, and with the number of concurrent
users making inserts/updates. Sometimes you can end up with a 99% chance
of clashing with others.
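To illustrate, a minimal sketch of the kind of trigger described above, in Firebird PSQL. Column and table names follow your post, except that the "count" column is renamed to qty here, since COUNT is a reserved word:

```sql
/* Hypothetical sketch of the problematic trigger: every insert into
   sold_products also updates a row in products. */
CREATE TRIGGER sold_products_ai FOR sold_products
AFTER INSERT AS
BEGIN
  /* This UPDATE takes a write lock on the products row; two concurrent
     transactions selling the same product will conflict here even though
     their inserts alone would never block each other. */
  UPDATE products
  SET qty = qty - NEW.qty
  WHERE id = NEW.id;
END
```

The fewer distinct products you have, the more likely two concurrent transactions hit the same products row and one of them gets a lock conflict.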
A better approach is to never couple an update with an insert operation.
In your case, you can sum() the count of products sold from
sold_products on demand.

If the number of rows is too high to get decent response times, you can
try a mixed approach: compute the sold products during daily
maintenance, then use an on-demand computation that sums up the newer
rows from sold_products and adds the stored value from products.

For even better performance, you can create a utility table with just
the product code and amount sold, into which a row is inserted together
with each row inserted into sold_products. During daily maintenance you
then sum up the rows from this utility table, update the product
records, and clear the utility table for a new batch. The on-demand
computation then only has to sum a few small new records from it
together with the value stored in the product record. It should work
like a charm even for millions of records in the sold_products table.
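A sketch of that utility-table approach in Firebird SQL. The names (sold_log, qty_sold, :product_id) are illustrative assumptions, not anything from your schema, and COALESCE may need replacing on older engine versions:

```sql
/* Utility table: only product code and amount sold. */
CREATE TABLE sold_log (
  product_id INTEGER NOT NULL,
  qty        INTEGER NOT NULL
);

/* Insert-only trigger: an insert never blocks, so concurrent sales of
   the same product no longer conflict. */
CREATE TRIGGER sold_products_ai FOR sold_products
AFTER INSERT AS
BEGIN
  INSERT INTO sold_log (product_id, qty)
  VALUES (NEW.id, NEW.qty);
END

/* On-demand total = stored value from products + new rows in sold_log. */
SELECT p.qty_sold + COALESCE(SUM(l.qty), 0)
FROM products p
LEFT JOIN sold_log l ON l.product_id = p.id
WHERE p.id = :product_id
GROUP BY p.qty_sold;

/* Daily maintenance, run off-peak by a single writer:
   fold sold_log into products, then clear it for the next batch. */
UPDATE products p
SET qty_sold = qty_sold +
  (SELECT COALESCE(SUM(qty), 0) FROM sold_log WHERE product_id = p.id);
DELETE FROM sold_log;
```

The only transaction that updates products is the maintenance job, so ordinary user transactions do inserts exclusively and cannot deadlock on the product rows.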
Best regards
Pavel Cisar
http://www.ibphoenix.com
For all your up-to-date Firebird and
InterBase information