Subject Re: Triggers suitability
Author Adam
--- In firebird-support@yahoogroups.com, "yeohray" <yeohray@h...> wrote:
> Are triggers a reliable way of maintaining summary tables? I have an
> orders table, where with every insert/delete/update, I need to update
> a summary table, possibly via a trigger e.g. UPDATE summary_table SET
> quantity = quantity + :NEW.quantity WHERE item_id = x
>
> Are there any possible locking issues, or trigger failure scenarios?
> Thanks for any help.
>
> Regards
> Ray Mond

Hello Ray,

It depends on your business logic whether that approach is suitable.
The main issue is that once a transaction has added or modified an
order, the trigger has also updated the matching summary row, so no
other transaction can touch that summary row (for example, by adding
another order for the same item) until the first transaction commits
or rolls back.
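
To make that concrete, here is roughly what the trigger you describe
would look like (table and trigger names are illustrative; this is the
variant that serialises writers on the summary row):

```sql
SET TERM ^ ;

CREATE TRIGGER ORDERS_AI_NAIVE FOR ORDERS
ACTIVE AFTER INSERT
AS
BEGIN
  /* Every transaction inserting an order for the same item must
     update this one summary row, so they queue up behind each other */
  UPDATE SUMMARY_TABLE
  SET QUANTITY = QUANTITY + NEW.QUANTITY
  WHERE ITEM_ID = NEW.ITEM_ID;
END
^

SET TERM ; ^
```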

Triggers are atomic operations. If there is a "trigger failure", an
exception is passed back to the client application, and the work done
by the trigger and by the statement that fired it (for example an
insert statement) is undone.

In the client application, you choose whether to roll back the entire
transaction, roll back to a savepoint, or take some other action.

There is a nice way of achieving what you want without blocking other
transactions from modifying the same item.

Create your summary table like this

item_id, quantity
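
In DDL that might be something like this (column types assumed to
match your orders table):

```sql
CREATE TABLE SUMMARYTABLE (
  ITEM_ID  INTEGER NOT NULL,
  QUANTITY INTEGER NOT NULL
);
```

Note there is deliberately no unique key on ITEM_ID: the design relies
on being able to store several rows per item and consolidate them
later.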

Now create a before insert trigger on the orders table that looks like
this (trigger names here are just examples; the ^ terminator assumes
you have issued SET TERM ^ ; first)

create trigger orders_bi for orders
active before insert
as
begin
  insert into summarytable (item_id, quantity)
  values (NEW.ITEM_ID, NEW.QUANTITY);
end
^

Create a before update trigger on the orders table that looks like this

create trigger orders_bu for orders
active before update
as
begin
  insert into summarytable (item_id, quantity)
  values (OLD.ITEM_ID, OLD.QUANTITY * -1);
  insert into summarytable (item_id, quantity)
  values (NEW.ITEM_ID, NEW.QUANTITY);
end
^

Create a before delete trigger that looks like this

create trigger orders_bd for orders
active before delete
as
begin
  insert into summarytable (item_id, quantity)
  values (OLD.ITEM_ID, OLD.QUANTITY * -1);
end
^

On its own this does nothing except place a copy of each order's
quantity into a table that is meant to be a summary. But this is where
we take advantage of MGA (Firebird's multi-generational architecture):
concurrent transactions can insert rows without ever blocking each
other.

We are going to create a stored procedure that goes through the summary
table and consolidates the records, by summing them.

So the original data might look like this

1,50
2,100
1,-30
2,20

And we sum those records and replace with this

1,20
2,120


CREATE PROCEDURE SP_UPDATESUMMARY
AS
DECLARE VARIABLE ITEM_ID INTEGER;
DECLARE VARIABLE QUANTITY INTEGER;
BEGIN
  FOR SELECT ITEM_ID, SUM(QUANTITY)
      FROM SUMMARYTABLE
      GROUP BY ITEM_ID
      INTO :ITEM_ID, :QUANTITY
  DO
  BEGIN
    DELETE FROM SUMMARYTABLE
    WHERE ITEM_ID = :ITEM_ID;

    INSERT INTO SUMMARYTABLE (ITEM_ID, QUANTITY)
    VALUES (:ITEM_ID, :QUANTITY);
  END
END
^

Now you can run this stored procedure on a schedule that matches the
rate of fill (it might be every 15 minutes, it might be once a month).
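
Running it is just a one-liner from isql or any client (sketch):

```sql
EXECUTE PROCEDURE SP_UPDATESUMMARY;
COMMIT;
```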

Some notes:

I have been lazy; there are a lot more optimisations possible. The
update trigger should really check whether new.quantity <>
old.quantity (and, strictly, whether new.item_id <> old.item_id)
before inserting anything.
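
That check might look like this (a sketch; CREATE OR ALTER replaces
the earlier version of the trigger):

```sql
create or alter trigger orders_bu for orders
active before update
as
begin
  if (OLD.ITEM_ID <> NEW.ITEM_ID or OLD.QUANTITY <> NEW.QUANTITY) then
  begin
    /* Only write delta rows when something relevant changed */
    insert into summarytable (item_id, quantity)
    values (OLD.ITEM_ID, OLD.QUANTITY * -1);
    insert into summarytable (item_id, quantity)
    values (NEW.ITEM_ID, NEW.QUANTITY);
  end
end
^
```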

The stored procedure should use a HAVING COUNT(*) > 1 clause, or
something like that, so that it only consolidates items that actually
have multiple rows.
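
With that refinement, the FOR SELECT in the procedure would read
something like:

```sql
FOR SELECT ITEM_ID, SUM(QUANTITY)
    FROM SUMMARYTABLE
    GROUP BY ITEM_ID
    HAVING COUNT(*) > 1
    INTO :ITEM_ID, :QUANTITY
```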

Use an appropriate index on the summary table (for example on
item_id); without one, it may be quicker to visit every order record
than to use the summary table at all.
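
For example (index name illustrative):

```sql
CREATE INDEX IDX_SUMMARYTABLE_ITEM ON SUMMARYTABLE (ITEM_ID);
```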

When you want the quantity, you can use the following view.

CREATE VIEW V_SUMMARYTABLE (ITEM_ID, QUANTITY) AS
SELECT ITEM_ID, SUM(QUANTITY)
FROM SUMMARYTABLE
GROUP BY ITEM_ID;
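
Querying it for one item is then simply (sketch; the item id value is
made up):

```sql
SELECT QUANTITY FROM V_SUMMARYTABLE WHERE ITEM_ID = 1;
```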

This will always return the correct answer regardless of when the
stored procedure is run. Obviously, the fewer records it encounters,
the quicker it will be.

Adam