Subject: Re: [ib-support] Re: Constraints
Author: Paul Reeves
Post date: 2001-06-20T12:42:55Z
jvanderreest@... wrote:
> I totally agree that the data needs to be cleaned up, but if possible
> I would like to leave that to after the migration process. Thanks for
> the 'before insert' trigger idea to clean things up. Would it be
> possible to use a trigger to trap offending records and have them
> inserted into an 'Orphan table' instead of the 'Real' table? If so,
> how would I go about that?
>
In essence you would need to test each column for the correct range of values.
Where a column is outside the range you can either try to correct it (or
perhaps set a default 'error' value) or you can raise an exception. Then, in
your exception handler, you write the row out to the 'error' table.
Let's assume the table ATABLE has one column - ACOL - and the value must be an
integer > 0.
We will create two exceptions first:
create exception ACOL_OUT_OF_RANGE 'Value out of range';
create exception RAISE_EXCEPTION_FOR_INSERT 'Failed insert logged.';
And now a before insert trigger to check the data:
create trigger tib_ATABLE for ATABLE
before insert
as
begin
  /* convert to integer and test value */
  new.ACOL = cast(new.ACOL as integer);
  if (new.ACOL <= 0) then
    exception ACOL_OUT_OF_RANGE;

  /* ... more tests for more columns here */

  /* We could test for specific errors and exceptions here and handle
   * each differently if we wanted. But in this case we will handle
   * all errors the same way - write the row out to the error table.
   */
  when any do
  begin
    insert into ATABLE_ERRORS (ACOL) values (new.ACOL);
    /* raise an exception to block the actual insert */
    exception RAISE_EXCEPTION_FOR_INSERT;
  end
end
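A practical aside: isql treats ';' as the end of a statement, so the semicolons inside the trigger body will confuse it. If you define the trigger interactively you need to switch the statement terminator first, roughly like so:

```sql
set term ^ ;

create trigger tib_ATABLE for ATABLE
before insert
as
begin
  /* trigger body as above */
end ^

set term ; ^
```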
Now, in your client code you need to catch the exception and throw it away;
otherwise execution of the subsequent inserts will be blocked.
Don't take the above code without testing it - it is just off the top of my head
- but it should be enough to give you the basic idea.
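The client-side loop might look something like the following sketch. It is language-neutral pseudo-client code in Python: db_insert and InsertLoggedError are stand-ins for whatever driver call and error class your environment actually uses, and db_insert simulates the trigger raising RAISE_EXCEPTION_FOR_INSERT.

```python
# Sketch of the client-side migration loop. db_insert stands in for the
# driver call that performs the INSERT; here it simulates the trigger by
# raising an error for out-of-range values, the way the
# RAISE_EXCEPTION_FOR_INSERT exception would surface to the client.

class InsertLoggedError(Exception):
    """Stand-in for the RAISE_EXCEPTION_FOR_INSERT exception."""

inserted = []   # simulates rows that made it into ATABLE

def db_insert(acol):
    # Simulated trigger: the real validation happens server-side.
    if acol <= 0:
        # The trigger has already written the row to ATABLE_ERRORS;
        # it raises this exception to block the insert into ATABLE.
        raise InsertLoggedError("Failed insert logged.")
    inserted.append(acol)

failed = 0
for value in [5, -1, 12, 0, 3]:
    try:
        db_insert(value)
    except InsertLoggedError:
        # Throw the exception away and carry on with the next row;
        # otherwise the remaining inserts would never run.
        failed += 1

print(inserted, failed)   # prints [5, 12, 3] 2
```

The key point is the except branch: the exception is consumed and discarded so that one logged failure does not stop the rest of the migration.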
Another alternative is to export the data to the Firebird/InterBase external
file format and do some of the processing at that stage. This is especially
worthwhile if you have large volumes of data and only a limited time window to
switch over from the old production system to the new one. Importing via
external tables is the fastest import method, but if one row fails the whole
batch will roll back.
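Because one bad row rolls back the whole external-table import, it pays to pre-filter the export file first. The sketch below assumes, for simplicity, a one-column newline-delimited export; real InterBase external tables use fixed-length records, so the parsing would need adapting.

```python
# Sketch: split an exported data file into good and bad rows before
# loading it through an external table, so one bad row cannot roll
# back the whole batch. Assumes one integer column per line; a real
# external-table file would be fixed-length records instead.

def split_rows(lines):
    """Return (good, bad) lists: a good row holds an integer > 0."""
    good, bad = [], []
    for line in lines:
        text = line.strip()
        try:
            value = int(text)
        except ValueError:
            bad.append(text)       # not an integer at all
            continue
        (good if value > 0 else bad).append(text)
    return good, bad

good, bad = split_rows(["5", "-1", "abc", "12", "0"])
print(good)  # ['5', '12']
print(bad)   # ['-1', 'abc', '0']
```

The good rows go into the external-table file for the fast bulk import; the bad rows go straight into your 'Orphan table' (or a separate file) for later repair.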
Whether it is worth the effort or not is up to you to decide. There is always a
trade-off between the complexity of the programming and the amount of data to be
migrated. But if the project is of any size either in terms of number of users
or volume of data, it is almost always worth doing properly. Trying to fix the
migration issues after migration will likely be more expensive than doing it
now.
Paul
--
Paul Reeves
http://www.ibphoenix.com
taking InterBase further