Subject | Re: [firebird-support] incremental search ?
---|---
Author | Kjell Rilbe
Post date | 2011-04-08T07:23:04Z
On 2011-04-08 05:26, Sergio H. Gonzalez wrote:
> Hello! Probably this is a silly question, but I'd like to know others'
> experiences in this matter. I'd like to implement an incremental search
> facility in some forms. I think I've read somewhere that incremental
> search is not a good idea on client/server databases... so my first
> question: is that true?
>
> I'm using Firebird 2.1 with Delphi / IBX components and I'm testing with
> a very simple method: on the OnChange event of an Edit box I close the
> query (select * from my table where name containing :somename), pass
> Edit1.Text as the parameter, and open the query again. So far (and with a few
> thousand records) it seems to be ok. So the next question: Is that
> method OK? Should I be aware of something in the future? I didn't test it
> with A LOT of records yet so I'd like to hear any advice. Thank you very
> much!! -Sergio
Depending on the record count, you could pull the entire table to the
client and do everything in JavaScript. With just a few thousand records
I think that would be OK with today's bandwidth availability, especially
if the table is not updated often and you can add some kind of caching
scheme, e.g. have the table exported into a .js file that the browser
will cache, and regenerate that file as often as needed depending on
your update frequency and business requirements.
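Roughly what I have in mind, as a sketch only: the names.js/names.json
export, the NameRow shape and the filterNames/attachIncrementalSearch
functions below are all made-up names for illustration, nothing that
comes from Firebird or IBX.

```typescript
// Sketch: assume the server periodically exports the table to a static,
// cacheable file (names.js / names.json) that populates this array.
interface NameRow {
  id: number;
  name: string;
}

declare const NAMES: NameRow[]; // loaded from the cached export

// Case-insensitive "contains" filter, run entirely in the browser.
function filterNames(rows: NameRow[], text: string): NameRow[] {
  const needle = text.toLowerCase();
  if (needle === "") {
    return rows;
  }
  return rows.filter(r => r.name.toLowerCase().includes(needle));
}

// Re-filter on every keystroke; no round trip to the server at all.
function attachIncrementalSearch(
  input: HTMLInputElement,
  render: (rows: NameRow[]) => void
): void {
  input.addEventListener("input", () => {
    render(filterNames(NAMES, input.value));
  });
}
```

Once the export is cached by the browser, each keystroke costs nothing
on the wire, which is what makes this approach tolerant of latency.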
If that is not an option, I'd suggest going for STARTING WITH rather
than CONTAINING, because STARTING WITH can use an index on the column.
But it won't list as many hits, since it only matches the beginning of
the name. Again, it depends on your business requirements.
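To make the difference concrete, a sketch only: the my_table and name
identifiers just stand in for Sergio's placeholders, and the index name
is made up.

```typescript
// The two parameterized queries, shown here as plain strings.
// Firebird can resolve STARTING WITH against an index on NAME,
// while CONTAINING cannot use an index (it is also case-insensitive).

// Prefix match: index-friendly, but fewer hits.
const startingWithSql =
  "select * from my_table where name starting with :somename";

// Substring match: more hits, no index use.
const containingSql =
  "select * from my_table where name containing :somename";

// The index that makes the prefix version cheap would be something like:
//   create index idx_my_table_name on my_table (name);
```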
Also, you should consider latency between client and server. Are they on
a LAN with high bandwidth and very few hops? Then no problem. Is the
server in Korea and the client in South Africa? Then perhaps you've got
a problem.
Kjell
--
--------------------------------------
Kjell Rilbe
DataDIA AB
E-mail: kjell@...
Phone: 08-761 06 55
Mobile: 0733-44 24 64