Subject: Re: [IB-Architect] Trigger Templates
Author: Dalton Calford
> A couple of questions and comments.
> First, do you think this stuff needs to be in the engine, or would
> an extension to GDEF be sufficient? Is there really much gain
> over simple cut and paste?

I personally do not see a need for it to be in the engine IF a client-side
tool were extended to support repetitive code blocks. I would like the
trigger to be able to understand what table called it, so that you can
point a single trigger at multiple tables (this keeps maintenance low: update
one standard trigger and have it affect all the related tables). But, of
course, I have always been a big proponent of more global variables.
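
As a sketch of what I mean, assuming a hypothetical TABLE_NAME global
variable (it does not exist today), the same trigger body could be attached
to every audited table:

    /* Identical body for every table; only the FOR clause changes.
       TABLE_NAME is a hypothetical new global variable naming the
       table that fired the trigger. */
    CREATE TRIGGER AUDIT_FOO FOR FOO ACTIVE AFTER UPDATE AS
    BEGIN
      INSERT INTO CHANGE_LOG (TAB_NAME, CHANGED_AT)
        VALUES (TABLE_NAME, 'NOW');
    END

Maintenance then comes down to updating one standard body and re-applying
it, rather than hand-editing a separate trigger on each table.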

> Second, C++ templates are kinda funny beasts, added very late in the
> development cycle to paper over a specific design problem concerning
> container classes. In essence, templates are compile/link time macros.
> To the best of my knowledge, no other language has adopted the idea.
> Wouldn't a simple macro facility answer your needs and be a great
> deal more flexible to boot? I'm generally not a proponent of macros
> (macros in a language is a confession by the designer that he screwed
> up), but they seem a better match for your problem.

I can see macros in the client-side tools, not in the server-side language,
unless we have more global variables and perhaps a new command like
SQLEXECUTE. For example:

statement = 'SELECT A, B FROM FOO';
SQLEXECUTE (statement)
INTO varA, varB;

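To make the idea concrete, here is a rough sketch of how the two pieces
might combine inside a trigger. Both SQLEXECUTE and the TABLE_NAME variable
are proposals, not existing features, so the syntax is only illustrative:

    CREATE TRIGGER LOG_DELETE FOR FOO ACTIVE AFTER DELETE AS
    DECLARE VARIABLE statement VARCHAR(255);
    BEGIN
      /* Build the statement from the (hypothetical) name of the
         table that fired the trigger, then execute it dynamically. */
      statement = 'INSERT INTO DELETED_ROWS (TAB_NAME) VALUES (''' ||
                  TABLE_NAME || ''')';
      SQLEXECUTE (statement);
    END

The same body, pointed at several tables, would log deletes for all of them
without any per-table code.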
This combination would allow a single trigger to service multiple tables and
properly handle the operation of each table. This would reduce repetitive
code/repetitive routines and make debugging/design a whole lot simpler.
Whether it is practical or not to implement this sort of thing, I do not know
until I see the source.

> Lastly, and this may determine the choice between a template and a
> macro model, what update semantics do you want. I presume that
> a change to the template should force a change to all instances.
> But what if one fails -- what happens?

In a case like this, I would treat the whole command as a single transaction,
with an error that states which instance caused the failure. The whole
transaction would roll back. This is better than searching through trigger
after trigger to find which ones were updated and which ones are still in the
old state.

> If the answers are a) engine, b) template, and c) consistency, there
> are major changes to the engine, the system tables, and the various
> DDL processors, most of which will be difficult to make backwards
> compatible. Do you think this is worth the cost of a little cut
> and paste activity in a text editor? It seems to me likely that the
> number of keystrokes expended in the implementation would exceed the
> number of keystrokes expended over the lifetime of the feature...

Most of the changes I could see being performed by a well-written client tool.
The addition of new variables and new features would be nice, but it is not
currently necessary.
On our last Metadata extract, the script generated was over 230 MB in size.
That is just DDL and does not include the documentation that is in the
database. We would be a prime candidate to benefit from such a change, but I
do not see such a need being a high priority at this time.

best regards