Subject | RE: [firebird-support] Dealing with inserts from multiple transactions
---|---
Author | Bob Murdoch |
Post date | 2012-06-07T17:55:20Z |
Michael -
Both of those are good ideas. However, I should have made clear in my
initial post that this was just an example. The real situation is far
more complex - we may have 10 different ETL processes running at
different times with overlapping schedules and no way to serialize
those processes.

I like your idea of moving those records with a problem to a separate
table and processing them later, but it breaks the rule of success or
failure at the dataset level. I can't process 49k records and
postpone 50 of them and still call that a successful ETL.

Thank you.
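A minimal sketch of that dataset-level contract, assuming a Python loader using the fdb Firebird driver (the sale table and connection details are hypothetical stand-ins): every row is inserted under one transaction, so a single failure rolls back the whole batch rather than quietly postponing part of it.

```python
# Sketch only: the "sale" table and connection details are hypothetical;
# the point is the single-transaction boundary around the whole dataset.
import fdb

def load_batch(rows):
    con = fdb.connect(dsn="localhost:/data/erp.fdb",
                      user="SYSDBA", password="masterkey")
    try:
        cur = con.cursor()
        # One transaction for the whole batch: 49k rows land together
        # or not at all.
        cur.executemany(
            "INSERT INTO sale (emp_id, amount) VALUES (?, ?)", rows)
        con.commit()
    except fdb.DatabaseError:
        con.rollback()   # any single failure fails the entire ETL run
        raise
    finally:
        con.close()
```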
-----Original Message-----
From: firebird-support@yahoogroups.com
[mailto:firebird-support@yahoogroups.com] On Behalf Of Michael Ludwig
Sent: Thursday, June 07, 2012 1:12 PM
To: firebird-support@yahoogroups.com
Subject: Re: [firebird-support] Dealing with inserts from multiple
transactions
> One solution would be to brutally serialize the problem out of the way:
> First do Sales, then do Timeclocks. Get rid of the concurrency, get rid
> of the race.
>
> Another would be to re-schedule Sale records without matching Employee
> record for later processing. Timeclock records may create Employees,
> but Sale records may not. Back them up to a separate file or
> table and process them again in due time. Employee records will have
> been created and the problem will have been avoided.
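A minimal sketch of the parking idea quoted above, under the same assumptions (Python with the fdb driver; the sale_stage, sale_retry, and employee tables are hypothetical): staged Sale rows with a matching Employee are loaded, and the rest are moved to a separate table to be re-attempted on a later run.

```python
# Sketch of the "park and retry" idea: sale_stage holds the incoming
# batch, sale_retry holds rows deferred to a later run. All table and
# column names are hypothetical; fdb is assumed as the Firebird driver.
import fdb

MATCHED = """
    INSERT INTO sale (emp_id, amount)
    SELECT s.emp_id, s.amount FROM sale_stage s
    WHERE EXISTS (SELECT 1 FROM employee e WHERE e.emp_id = s.emp_id)
"""
UNMATCHED = """
    INSERT INTO sale_retry (emp_id, amount)
    SELECT s.emp_id, s.amount FROM sale_stage s
    WHERE NOT EXISTS (SELECT 1 FROM employee e WHERE e.emp_id = s.emp_id)
"""

def process_stage(con):
    cur = con.cursor()
    cur.execute(MATCHED)    # load sales whose Employee already exists
    cur.execute(UNMATCHED)  # park the rest instead of failing the batch
    cur.execute("DELETE FROM sale_stage")
    con.commit()            # a later run re-reads sale_retry the same way
```

As Bob notes above, this trades dataset-level atomicity for progress: the run "succeeds" even though some rows are deferred, which is exactly the rule he cannot break.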