Subject: Absolute Mystery
Author: Edwin A. Epstein, III
Well, I am stumped on this one.

I had a script with 4.2 million lines, each line executing a stored
procedure. I broke it up into several batches so it would not all run in one
transaction.

An example of a line:

EXECUTE PROCEDURE PROC_A('12341234','AB','DJSDADKJJJSJJ');
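Each batch file is basically a long run of those lines with a commit at the
end, roughly like this (an illustrative sketch only; the second line's values
are made up):

EXECUTE PROCEDURE PROC_A('12341234','AB','DJSDADKJJJSJJ');
EXECUTE PROCEDURE PROC_A('12341235','AB','XXXXXXXXXXXXX');
/* ...several hundred thousand more lines per batch... */
COMMIT;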

PROC_A does some lookups and inserts, etc. It also has this exception handler:

WHEN ANY DO
BEGIN
  INSERT INTO DROP_FILE (FIELD_1, FIELD_2, FIELD_3)
  VALUES (:PASSED_VALUE_A, :PASSED_VALUE_B, :PASSED_VALUE_C);
END

So any exception that the procedure throws, deliberate or otherwise, creates
a drop record in the DROP_FILE table.
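For context, the overall shape of the procedure is roughly this (a sketch
only: the parameter types and the TABLE_A column names are stand-ins, but the
WHEN ANY DO handler sits at the end of the body exactly as above):

SET TERM ^ ;
CREATE PROCEDURE PROC_A (
  PASSED_VALUE_A VARCHAR(20),
  PASSED_VALUE_B VARCHAR(10),
  PASSED_VALUE_C VARCHAR(50))
AS
BEGIN
  /* lookups, then the normal insert */
  INSERT INTO TABLE_A (FIELD_1, FIELD_2, FIELD_3)
  VALUES (:PASSED_VALUE_A, :PASSED_VALUE_B, :PASSED_VALUE_C);

  /* any exception raised above lands the row in DROP_FILE instead */
  WHEN ANY DO
  BEGIN
    INSERT INTO DROP_FILE (FIELD_1, FIELD_2, FIELD_3)
    VALUES (:PASSED_VALUE_A, :PASSED_VALUE_B, :PASSED_VALUE_C);
  END
END^
SET TERM ; ^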

After all the batches complete, every result should end up in either TABLE_A
or DROP_FILE, so a COUNT(*) on both tables added together should equal 4.2
million records.
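The check amounts to something like this (RDB$DATABASE is just a convenient
one-row table to select from):

SELECT (SELECT COUNT(*) FROM TABLE_A) +
       (SELECT COUNT(*) FROM DROP_FILE) AS TOTAL_ROWS
FROM RDB$DATABASE;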

It comes up 450K short. Hmmmmmm

Okay. I checked 20 random lines from each script file. Every file has the
majority of its lines processed. I assume this because the data passed into
the procedure is unique, so for each line I checked I can find the
corresponding record in either TABLE_A or DROP_FILE.
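The spot check for a given line is just a lookup on its unique first value
(the DROP_FILE columns are as above; the TABLE_A column name here is a
guess):

SELECT COUNT(*) FROM TABLE_A WHERE FIELD_1 = '12341234';
SELECT COUNT(*) FROM DROP_FILE WHERE FIELD_1 = '12341234';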

Why are some of the lines in a script file not being processed? Each script
file runs inside one transaction, so if a whole file had rolled back I would
see that entire file's records missing. Where are the missing records?

Has anybody had anything like this happen to them?