Subject | Re: OLEDB VERY SLOW. Need > 1000 transactions /s |
---|---|
Author | chasesmith2002 |
Post date | 2004-02-06T17:20:05Z |
I sincerely appreciate all your advice. I see three things being
suggested:
1) Upgrade my hardware
2) Go directly to the API
3) Construct an external table and then just reference it
1) Well... I don't make the purchasing decisions. If I did, you can
rest assured that the backend would be residing on something a LOT
faster than what it is on right now: dual PIII 1 GHz, 1.5 GB RAM,
SCSI-3 RAID with a 32 MB cache. My database project does not get the
priority around here; the one that does gets the co-located quad
Xeon with some pretty good bandwidth. I am working on it, though.
2) Go directly to the API. Quite frankly, I don't seem to have the
skill to do this. I am using VB6 and I cannot seem to get a handle
on the DLL to look at the API functions and/or reference them
properly from VB. If anybody has these written and is willing to
share, I would be insanely grateful. I KNOW that if I were in C++
Builder I could probably get 6000-8000 tps, since that is what the
frontend for the other database gets (to a local network server, not
the co-located quad). However, I don't have the time to relearn C++
completely. I got stuck in VB. It's not all bad :)
3) Construct an external table and then just reference it. I think
this sounds like a plan. Right now I have actually abandoned ADO for
large transactions and wrote a client-server piece that constructs a
SQL script and sends it to the server to be executed with the ISQL
command-line tool. That got me around 1200-2500 tps initially, but
then it slowed down severely for some reason. I really need error
trapping anyway, which could be handled in a stored procedure that
processes that external table.
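For what it's worth, the script-batching approach above can be sketched
like this (in Python rather than my VB6 code, purely for illustration;
the table name, columns, database path, and isql flags are all made up
and would need adjusting):

```python
import subprocess

def build_script(rows):
    """Wrap a batch of INSERTs in one transaction so the server
    commits once per batch instead of once per row."""
    lines = ["SET TRANSACTION;"]
    for item_id, qty in rows:
        # A real version would need proper quoting/escaping of values.
        lines.append(f"INSERT INTO orders (item_id, qty) VALUES ({item_id}, {qty});")
    lines.append("COMMIT;")
    return "\n".join(lines)

def run_script(script, db="server:/data/mydb.gdb"):
    """Hand the generated script to the isql command-line tool.
    (Hypothetical invocation; exact flags differ between
    Interbase and Firebird versions.)"""
    with open("batch.sql", "w") as f:
        f.write(script)
    subprocess.run(["isql", db, "-i", "batch.sql"], check=True)

script = build_script([(1, 10), (2, 20)])
```

Batching the commits is the whole point here: per-row commits are
usually what caps the tps, not the inserts themselves.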
How do you construct the external table and then tell the backend
about it?
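As far as I understand it so far (please correct me): in
Interbase/Firebird the external table is declared with
CREATE TABLE ... EXTERNAL FILE, and the file has to contain
fixed-length records that line up exactly with the declared CHAR
columns. A sketch of producing such a file (column names, widths, and
the path are invented for illustration):

```python
# Sketch of writing a file for an Interbase/Firebird external table.
# Each record must be exactly as wide as the declared CHAR columns.
# The matching (hypothetical) DDL would look something like:
#
#   CREATE TABLE ext_orders EXTERNAL FILE '/data/ext_orders.txt' (
#       item_id  CHAR(10),
#       qty      CHAR(10),
#       eol      CHAR(1)   -- holds a newline so the file stays readable
#   );
#
# after which a stored procedure can read FROM ext_orders and insert
# into the real table, trapping bad rows as it goes.

WIDTHS = [10, 10]  # must match the CHAR column sizes in the DDL

def make_record(fields, widths=WIDTHS):
    """Pad each field to its column width and append a newline
    as the final CHAR(1) column."""
    if len(fields) != len(widths):
        raise ValueError("field count must match column count")
    return "".join(f.ljust(w)[:w] for f, w in zip(fields, widths)) + "\n"

records = [make_record([str(i), str(i * 2)]) for i in range(3)]
# Every record is the same fixed length: sum of widths plus the newline.
```

One caveat I've read about: newer Firebird versions restrict which
directories the server may read external files from, so the server-side
configuration is worth checking before relying on this.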