Subject: Re: Inserting 100's of thousands from a SP
Author: buppcpp
> > SQL Server supports it, and after you do a commit or rollback,
> > it automatically starts you with a new (implicit) transaction.
>
> And what, just because MS does it that makes it correct without
> question?
>
>
> If I issue a statement like:
>
> INSERT INTO ATABLE (AFIELD, BFIELD)
> VALUES ('A', 'B');
>
> And get an exception, what should be the state of the database?
>
> If we follow your advice and allow stored procedures to issue
> commits (or even commit retaining) then the state is
> indeterminate
The state is not indeterminate; the rest of the process is still
running under a transaction.

- some things may or may not have changed as a
> result of internal commits.
???

You're acting as if I'm trying to come up with a new way to handle
databases. I'm not; it's already being done, and being done quite
well.



> Had you forgotten that triggers can call stored procedures?
> Can you imagine the mess a database could get into if commits
> could occur right in the middle of other DML?
>
Yeah, you could also cause a mess by putting a loaded gun in your
mouth and pulling the trigger... what's your point?

If you are not capable of handling what the database has to
offer... I don't see that as the database's fault.

Again, I'm not coming up with something new; you can already do these
things, and the database handles them fine.


Also, what about those situations where it's OK to save part of the
transaction even if the other part fails?

For example:
A new customer purchases something from your store.

You would send the new customer and purchasing document information
to your stored procedure/middle tier... You would go ahead and commit
the customer record first, because if the committing of the document
fails, you don't want to lose the customer information.
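A minimal sketch of that pattern, using Python's sqlite3 purely for
illustration (the table names and the register_sale helper are invented,
not from the original post): the customer insert is committed on its own,
so a later failure in the purchase insert rolls back only the purchase.

```python
import sqlite3

# Hypothetical schema mirroring the scenario: a customer record plus a
# purchase document, where the purchase has a constraint that can fail.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE customer (id INTEGER PRIMARY KEY, name TEXT NOT NULL)")
conn.execute("CREATE TABLE purchase (id INTEGER PRIMARY KEY,"
             " customer_id INTEGER NOT NULL,"
             " amount REAL NOT NULL CHECK (amount > 0))")
conn.commit()

def register_sale(conn, name, amount):
    # Commit the customer record first, so that a failure while saving
    # the purchase document does not lose the customer information.
    cur = conn.execute("INSERT INTO customer (name) VALUES (?)", (name,))
    conn.commit()
    try:
        conn.execute("INSERT INTO purchase (customer_id, amount) VALUES (?, ?)",
                     (cur.lastrowid, amount))
        conn.commit()
    except sqlite3.IntegrityError:
        # Only the purchase is rolled back; the customer commit stands.
        conn.rollback()

# A negative amount violates the CHECK constraint, so the purchase fails
# but the customer row survives.
register_sale(conn, "Alice", -5.0)
print(conn.execute("SELECT count(*) FROM customer").fetchone()[0])  # 1
print(conn.execute("SELECT count(*) FROM purchase").fetchone()[0])  # 0
```

Whether this partial-save behavior is desirable is exactly the point of
contention in this thread; the sketch only shows the mechanics.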


Just because you guys may not see a need for it doesn't mean it's not
needed.
If you don't need it, don't use it, but other databases have it
because there is a need for it and it's very useful.

Thanks
Bupp