Subject | Re: [firebird-support] Dealing with inserts from multiple transactions |
---|---|
Author | Michael Ludwig |
Post date | 2012-06-07T19:47:51Z |
Bob Murdoch wrote on 07.06.2012 at 13:55 (-0400):
> I like your idea of moving those records with a problem to a separate
> table and processing them later, but it breaks the rule of success or
> failure at the dataset level. I can't process 49k records and
> postpone 50 of them and still call that a successful ETL.
One record in a thousand postponed doesn't sound too bad to me, but then
I don't know much about ETL and its rules. Makes me wonder, though,
whether the dataset is complete … I think it has to be if such rigorous
rules are applied to it.
So maybe a procedural SQL approach will do: postpone problematic records
by pushing them to the final clean-up step of a stored procedure that
handles the entire ETL dataset (whatever that is). That way the problem
records get postponed *but* stay within the transactional bracket *and*
succeed - provided the dataset is complete. (Which I am not sure it
really is, given my lack of ETL expertise.)
Michael