Subject | Re: [Firebird-Architect] Batch/Block operations
---|---
Author | Alexandre Benson Smith
Post date | 2005-02-21T22:15:02Z
Jim Starkey wrote:
>>It sounds like your application is retrieving a great deal more data
>>than is being presented to the user. If this is the case, the answer
>>may be in restructuring the application to give the database engine a
>>larger part of the court. A trivial example is counting records. If
>>your application counts them, a million records is a million messages,
>>compressed or not, blocked or not. If you let the engine do it, it's
>>one record. If there's aggregation to be done, let the engine do it.

In some places you are correct.
But my main problem, I think, is this:
When I open a Purchase Order Entry form, for example, a bunch of queries
is sent while the user fills it in, like:

(I will keep some names in Portuguese because I don't know the English
translation.)

select CustomerID, Name, SalesRep, CNPJ, CreditLimit, CFOP, (plus some
more) from Customers where CustomerName = 'Thor'

Then I get some response from the server (maybe there is no customer
named Thor, so I try another query with like 'Thor%').

After I get the customer data, I need the contact name for that
customer, so I send something like this:

select Name, Phone, Email from Contacts where CustomerID = 123 (the
CustomerID)

Then I need the sales representative data, so another query is:

select Name, Commission from SalesRep where SalesRepID = 124 (the sales
representative ID, taken from the first query)

And so on; I send a dozen small queries like this that bring the
textual descriptions of values that are stored as codes in the
customer data.
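The round-trip pattern above can be sketched like this (a minimal sketch only; it uses Python with an in-memory SQLite database as a stand-in for the real server, and the table contents are invented to match the examples above):

```python
import sqlite3

# In-memory stand-in for the real database server, with invented rows.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE Customers (CustomerID INTEGER, Name TEXT, SalesRep INTEGER,
                            CNPJ TEXT, CreditLimit REAL, CFOP TEXT);
    CREATE TABLE Contacts  (CustomerID INTEGER, Name TEXT, Phone TEXT, Email TEXT);
    CREATE TABLE SalesRep  (SalesRepID INTEGER, Name TEXT, Commission REAL);
    INSERT INTO Customers VALUES (123, 'Thor', 124, '71613525', 1000.0, '5102');
    INSERT INTO Contacts  VALUES (123, 'Maria', '555-1234', 'maria@example.com');
    INSERT INTO SalesRep  VALUES (124, 'Joao', 0.05);
""")

# Round trip 1: exact match on the name, then fall back to a prefix search.
customer = con.execute(
    "SELECT CustomerID, Name, SalesRep FROM Customers WHERE Name = ?",
    ("Thor",)).fetchone()
if customer is None:
    customer = con.execute(
        "SELECT CustomerID, Name, SalesRep FROM Customers WHERE Name LIKE ?",
        ("Thor%",)).fetchone()
customer_id, _, sales_rep_id = customer

# Round trip 2: the contact for that customer.
contact = con.execute(
    "SELECT Name, Phone, Email FROM Contacts WHERE CustomerID = ?",
    (customer_id,)).fetchone()

# Round trip 3: the sales representative data, using the ID from round trip 1.
rep = con.execute(
    "SELECT Name, Commission FROM SalesRep WHERE SalesRepID = ?",
    (sales_rep_id,)).fetchone()
```

Each `execute`/`fetchone` pair here stands for one client/server round trip; a dozen of these per form is where the latency adds up.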
Each of these values has a specialized control that offers search
options to make the user's life easier when filling in the order form.
Some of these values differ based on other field data. For example:
the customer data brings the default delivery location, but the user
can modify the delivery address, so the tax could be different (then I
need another query to ask for the taxes of that specific state). This
is one of the reasons I can't just send a single big query with a lot
of joins over all the involved tables; I need to send a lot of small
ones.
One thing that could improve the speed significantly is to hold a lot
of prepared statements, and I plan to do it. Today I don't reuse
prepared statements (because I would need a big list of them). The
other reason I don't hold prepared statements is that the statement can
change for every table that could be searched.
For example: the default search criterion for the Customer table is by
Name. If I type Thor in the Customer Name edit box, I search for
= 'Thor'; if nothing is found, it tries like 'Thor%'. But I could type
1:71613525, where the leading "1:" means the search should use the
second (0-based) search field available (in this case the query will be
select bla bla bla from Customers where CNPJ = '71613525'; again, if
nothing is found, the search is done again using like '71613525%').
Just to clarify, a CNPJ is a kind of SSN for companies here in Brazil.
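That prefix convention can be sketched as a small helper. This is my own illustration, not the real code; the field list and the SELECT column list are assumptions:

```python
def build_search_queries(text, search_fields):
    """Return (exact_sql, like_sql, value) for a search-box entry.

    A leading "N:" selects the Nth (0-based) field in search_fields;
    otherwise the first (default) field is used. The caller tries the
    exact query first and falls back to the LIKE query with value + '%'.
    """
    field = search_fields[0]
    if ":" in text:
        prefix, rest = text.split(":", 1)
        if prefix.isdigit():
            field = search_fields[int(prefix)]
            text = rest
    exact = f"SELECT CustomerID, Name FROM Customers WHERE {field} = ?"
    like = f"SELECT CustomerID, Name FROM Customers WHERE {field} LIKE ?"
    return exact, like, text

# "1:71613525" -> search by the second field (CNPJ in this example)
exact, like, value = build_search_queries("1:71613525", ["Name", "CNPJ"])
```

Because the statement text depends on which field the user picks, every variant is a distinct statement to prepare, which is exactly why the count grows so fast.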
So if I have a purchase order table with 20 FKs to other tables, I will
have at least 40 prepared statements for this form (one for the
surrogate key, another for the primary search field), and a lot more if
the user starts to search by alternative fields. And this is just for
one table (entry form). I need to test how my application will behave
with some hundreds of prepared statements floating around in memory,
and how I will manage them. (This is why my eyes glow when I read about
a compiled statement cache.)
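On the client side, one way to keep those hundreds of statements under control is a bounded cache that evicts the least recently used one. A minimal sketch, assuming hypothetical `prepare`/`close` callables standing in for the real database API:

```python
from collections import OrderedDict

class StatementCache:
    """LRU cache of prepared statements, keyed by the SQL text."""

    def __init__(self, prepare, close, capacity=100):
        self.prepare = prepare   # callable: sql text -> statement handle
        self.close = close       # callable: statement handle -> None
        self.capacity = capacity
        self.cache = OrderedDict()

    def get(self, sql):
        if sql in self.cache:
            self.cache.move_to_end(sql)  # mark as most recently used
            return self.cache[sql]
        stmt = self.prepare(sql)         # miss: prepare and remember it
        self.cache[sql] = stmt
        if len(self.cache) > self.capacity:
            _, old = self.cache.popitem(last=False)  # evict the LRU entry
            self.close(old)
        return stmt

# Toy usage: count how many times prepare actually runs, to show reuse.
prepared = []
cache = StatementCache(prepare=lambda s: prepared.append(s) or s,
                       close=lambda s: None, capacity=100)
cache.get("SELECT 1")
cache.get("SELECT 1")   # second request reuses the cached statement
cache.get("SELECT 2")
```

With a cap like this, memory usage stays bounded no matter how many alternative search fields the users exercise.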
One of the differentiators of my application is ease of use. When I go
to the bank to deposit money into someone's account, they just ask me:
what is the account number? If I don't know it, or noted it down wrong,
there is no way to go further. In a system like a bank, where the load
is HUGE, I can understand that everything must be kept to a minimum of
resource usage to handle the load, so I understand the "give me the
correct code or go away" approach. But in my case, I don't need my
customers remembering codes; they just type what they know about
someone and the system brings up the choices (a customer can be
searched by name, by sales rep, by city, by phone number, etc.). So an
entry form is a very interactive situation where a bunch of small
selects is sent to the server, bringing data to the user at almost
every filled field.
So I think I am on the side where there is a lot of client/server
conversation during the data entry process, and no huge amount of data
per se crossing the wire; the round trips needed for each statement
execution are where my problem lies.
Thanks for your interest, attention, and patience in reading my bad
English.

See you!
--
Alexandre Benson Smith
Development
THOR Software e Comercial Ltda.
Santo Andre - Sao Paulo - Brazil
www.thorsoftware.com.br