| Subject | Re: [firebird-php] set cpu or memory max limit |
| --- | --- |
| Author | Lester Caine |
| Post date | 2013-02-26T14:54:21Z |
Santiago Laparra wrote:
> We have a web site with PHP 5.3 and a Firebird 2.5 database. Some queries
> that search heavy tables, about 1.8 million records, with joins to other
> tables, push the CPU to 100%. We have about 50 concurrent users
> (hits per second), so users have to wait for the last query to finish.
>
> I was wondering whether we can set a max limit per thread or process
> (we tried with SuperServer, SuperClassic and Classic server).

Santiago - initial posts are always moderated, so they can take a little time
to appear. I've killed the duplicate ;)

The trick is not to try and limit the database. That simply can't be done. The
trick is to create queries in PHP that limit the number of records returned to
what can be sensibly displayed in a web page. Even adding a 'count' to tell you
how many are not being displayed can be a problem as this can take time to
calculate, so we tend to store useful totals in a data table.
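
Something along these lines is usually enough - this is only a sketch, and the
connection details, 'customer' table and 'table_totals' table are invented for
the example rather than taken from any real schema:

<?php
// Rough sketch only - database path, credentials, 'customer' and
// 'table_totals' are hypothetical names used purely for illustration.
$db = ibase_connect('localhost:/data/sales.fdb', 'SYSDBA', 'masterkey');

// Fetch only what the page will actually show (Firebird 2.x ROWS clause).
$res = ibase_query($db, 'SELECT id, name FROM customer ORDER BY name ROWS 1 TO 10');
while ($row = ibase_fetch_assoc($res)) {
    echo htmlspecialchars($row['NAME']) . "<br />\n";
}
ibase_free_result($res);

// Read a maintained total rather than running a slow SELECT COUNT(*) over the big table.
$res = ibase_query($db, "SELECT row_count FROM table_totals WHERE table_name = 'CUSTOMER'");
$tot = ibase_fetch_assoc($res);
echo 'of ' . (int) $tot['ROW_COUNT'] . " records\n";
ibase_free_result($res);

ibase_close($db);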
This is more a hardware problem than a Firebird one. With a multi-core processor
you can have several connections to Firebird running (I still tend to use
'Classic' for this) and several instances of httpd creating pages, so if one
lookup is slow it will not hold up the rest. If you find that you ARE using 100%
of the processing time a lot of the time, then it may be that you need a second
machine to handle the database. We tend to run extra machines when the load gets
too great, with one or two generating pages while a third provides the database engine.
Queries that take a long time can often be broken down into smaller chunks, or
reworked to run quicker by adding extra indexes or expressing the query
differently, but often a FIRST x or the more modern ROWS clause on the query is
all that is needed. All of my own PHP pages tend to have a 'paginate' function,
so you step through blocks of 10 or so records at a time ...
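
A minimal version of such a helper might look like this - again only a sketch,
with the query, connection details and function name made up for the example:

<?php
// Hedged sketch of a pagination helper: fetch one block of rows using the
// Firebird ROWS m TO n clause. $db is an open ibase connection; the base
// query, page size and names are illustrative only.
function fetch_page($db, $baseQuery, $page, $perPage = 10)
{
    $first = ($page - 1) * $perPage + 1;   // ROWS is 1-based
    $last  = $page * $perPage;
    $sql   = $baseQuery . " ROWS $first TO $last";

    $rows = array();
    $res  = ibase_query($db, $sql);
    while ($row = ibase_fetch_assoc($res)) {
        $rows[] = $row;
    }
    ibase_free_result($res);
    return $rows;
}

// Usage: page 3 of an ordered query, 10 records at a time.
$db   = ibase_connect('localhost:/data/sales.fdb', 'SYSDBA', 'masterkey');
$page = fetch_page($db, 'SELECT id, name FROM customer ORDER BY name', 3);

The ROWS clause is 1-based and inclusive, so page 3 at 10 per page fetches
rows 21 to 30.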
--
Lester Caine - G8HFL
-----------------------------
Contact - http://lsces.co.uk/wiki/?page=contact
L.S.Caine Electronic Services - http://lsces.co.uk
EnquirySolve - http://enquirysolve.com/
Model Engineers Digital Workshop - http://medw.co.uk
Rainbow Digital Media - http://rainbowdigitalmedia.co.uk