Subject: Re: [IBO] Conceptual problems
Author: Helen Borrie
Post date: 2005-05-25T15:53Z
At 03:02 PM 25/05/2005 +0000, you wrote:
>Hello everybody:

Certainly, you will be helped by maintaining a schema cache at the client.
>
>I've bought IBObjects because I think they are by far the best
>Interbase components available. So now that I'm ready to use them I
>have found some conceptual problems.
>
>I need to develop a Client/Server application used via DSL networks so
>speed is my main goal.
>
>My idea is to put a single IB_QUERY with request live=true on my
>Datamodule for each table I need to maintain (Let's say for this
>example I have Products and Type of Products).
>
>In the create event of the datamodule, after checking the connection is
>OK, I open the IB_QUERYs.
>
>The sql in the IB_QUERYS is "select * from tabla order by fields".
>
>I know that by opening the IB_QUERY it will fetch all the records, but
>I think it is not a bad thing because if you try, for example, to add
>a new type of product you will be presented with a grid showing all the
>records.
>
>My main doubt is that when I enter the Products form I have a
>IB_LOOKUPCOMBO where you can select the Type of Product so in the
>create event of the form I close the Type of Products' IB_QUERY and
>assign values to keysource and keylinks to create the reference
>between products and type of products. In the close event of the
>form I clear the keylinks and assign keysource to nil.
>
>Everything is working fine, but since speed is a must for me I wonder
>if there is a better way of doing this.
As far as the "concept" or "flow" issue is concerned, I see three seriously
slow things in your model:
1. "select * from table" queries do not belong in client/server, even on
fast networks. When you are limited to DSL, you must be strict with
yourself on this principle and "think minimum". For a maintenance program,
you want to supply keys and request back single rows for updates and
deletes - if you actually need them at all. For an insert, you don't need
any full rows at all until after the new row is posted.
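As a minimal sketch of that keyed single-row fetch - component, table and column names here (qryProduct, PRODUCT, PRODUCT_ID and so on) are illustrative, not from your application:

```pascal
// Sketch only: fetch exactly one row for editing, keyed by its primary key.
// qryProduct is assumed to be a TIB_Query on the datamodule.
qryProduct.SQL.Text :=
  'SELECT PRODUCT_ID, PRODUCT_CODE, DESCRIPTION, PRICE ' +
  'FROM PRODUCT WHERE PRODUCT_ID = :PRODUCT_ID';
qryProduct.Prepare;
qryProduct.ParamByName('PRODUCT_ID').AsInteger := SelectedID;
qryProduct.Open;  // one row crosses the wire, not the whole table
```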
2. Don't open any sets until you actually want to use them. Think in
"tasks" and talk to the users about what they actually *need* for each
task. You (and they) might be surprised at how easily they can do without
a huge scrolling grid of 35 fields and thousands of rows, just to update
the prices on half a dozen products. Modularise each of the tasks so that
no user has to have stuff sitting around in datasets that she isn't going
to use.
3. For stable lookups - like your product type lookup - don't use a live
dataset. Write a function to pull the salient fields of the lookup records
across into a TIB_Cursor, read them all into a stringlist - such as the
Items list of a TIB_ComboBox - then close and unprepare the cursor. Call
this function occasionally, i.e. no more frequently than you need to,
maybe only once, at FormCreate.
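Such a function might look like this - a sketch only, assuming the usual First/Eof/Next traversal on the cursor and made-up names (LoadProductTypes, curTypes, cboType, PRODUCT_TYPE):

```pascal
// Sketch: one-shot read of a stable lookup table through a TIB_Cursor
// into the Items list of a TIB_ComboBox, then release the statement.
procedure TdmMain.LoadProductTypes;
begin
  cboType.Items.Clear;
  curTypes.SQL.Text :=
    'SELECT TYPE_ID, TYPE_NAME FROM PRODUCT_TYPE ORDER BY TYPE_NAME';
  curTypes.First;
  while not curTypes.Eof do
  begin
    // Display text in the list; the key travels alongside it as an object
    cboType.Items.AddObject(
      curTypes.FieldByName('TYPE_NAME').AsString,
      TObject(curTypes.FieldByName('TYPE_ID').AsInteger));
    curTypes.Next;
  end;
  curTypes.Close;
  curTypes.Unprepare;  // free client and server resources until next refresh
end;
```

Because the combo now owns a plain stringlist, nothing stays open on the server between refreshes.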
The same strategy applies to selecting records. Do these one-shot reads to
populate selection lists - e.g. unique ID, product description and product
code. Let the user drill down and pick the records she wants. Behind the
scenes, each of your "live" queries will be a parameterised statement that
tightly limits (in the parameters of the WHERE clause) the actual records
that are fetched.
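For example, the "live" query behind a price-maintenance task could be scoped like this - again a sketch with assumed names (qryPrices, ChosenTypeID, the PRODUCT columns):

```pascal
// Sketch: a RequestLive query whose WHERE clause is driven by a parameter,
// so only the handful of rows the user drilled down to is ever fetched.
qryPrices.RequestLive := True;
qryPrices.SQL.Text :=
  'SELECT PRODUCT_ID, PRODUCT_CODE, DESCRIPTION, PRICE ' +
  'FROM PRODUCT WHERE TYPE_ID = :TYPE_ID';
qryPrices.Prepare;
qryPrices.ParamByName('TYPE_ID').AsInteger := ChosenTypeID;
qryPrices.Open;  // a small, editable set scoped to the user's current task
```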
Make good use of executable DML - quick updates, inserts, deletes and
executes. Write stored procedures to automate things: don't pull rows
over to the client for operations that iterate through sets. Use
parameterised statements and procedure calls and do this kind of stuff on
the server.
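Calling such a procedure from the client is then a single executable statement - a sketch, assuming a TIB_DSQL named dsqlRaise and a hypothetical server procedure RAISE_PRICES(TYPE_ID, PCT) that does the set-iterating update on the server:

```pascal
// Sketch: run a server-side bulk operation with one round trip.
// RAISE_PRICES is an assumed stored procedure, not part of the original post.
dsqlRaise.SQL.Text := 'EXECUTE PROCEDURE RAISE_PRICES(:TYPE_ID, :PCT)';
dsqlRaise.Prepare;
dsqlRaise.ParamByName('TYPE_ID').AsInteger := ChosenTypeID;
dsqlRaise.ParamByName('PCT').AsCurrency := 5.0;
dsqlRaise.Execute;  // no product rows ever travel to the client
```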
You'll know from your own tests already that even a minor query across a
phone line can seem to take ages, even on a good day. With DSL you can't
rely on having good line quality at the times of day when people are at
work. So make sure that you soak-test the users' real-life conditions
thoroughly and EARLY.
The positive side of all this is that it will make you into a seriously
good client/server developer if you actually get it working
satisfactorily. It's a big ask. You'd get a heck of a lot better results by
abandoning 2-tier C/S entirely and going for a multi-tier web-based solution.
Helen