Subject: Re: Table with more than 100 indexes
Author: svanderclock
Because the user can freely choose the parameters of the query, it is very hard to know in advance how many rows a given filter will return.

For example, these fields are mandatory:

ID_place = xxx and
Status = 1 and
kind = 65 and
subcategory = 12

In that case, yes, the result can be < 10 000 rows, but simply changing kind = 65 to kind = 64 will return 2 000 000 rows!
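As a rough illustration (a sketch only, against the SEARCH_1 table quoted below, with the placeholder value 'xxx' and the row counts mentioned above):

select count(*) from SEARCH_1
where ID_place = 'xxx' and Status = 1 and kind = 65 and subcategory = 12;
-- fewer than 10 000 rows

select count(*) from SEARCH_1
where ID_place = 'xxx' and Status = 1 and kind = 64 and subcategory = 12;
-- same filter, only kind changed: around 2 000 000 rows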

We have a huge amount of data (the system must be ready for 80 000 000 rows!), and the most cost-intensive part is the ORDER BY applied after the filter. For that ORDER BY we see no other choice than to create every possible index combination, because even when a filter returns only 10 000 rows, the sort kills the performance.
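For instance, for the filter + sort of the query quoted below, the kind of compound index we create looks like this (the index name is just an example; it is the (ID_place, Status, kind, subcategory, price) index from the quoted mail, extended with category so that it matches the full ORDER BY):

create index IDX_SEARCH1_PLACE_PRICE
  on SEARCH_1 (ID_place, Status, kind, subcategory, category, price);

With such an index Firebird should be able to walk the index in sorted order (the plan then shows ORDER on that index) instead of sorting the filtered set, and one index like this is needed per sortable/filterable column, which is how we end up with more than 100 of them.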

But on the other hand we cannot have a system where it costs 1 second to add a row! Have we crossed a limit of Firebird, or have we done something wrong with our indexes (too many fields per index, or too many indexes, for example)?
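(For reference, the deactivate/reactivate cycle mentioned in the quoted mail looks roughly like this in DDL; IDX_SEARCH1_PLACE_PRICE is again just an example name, and constraint indexes such as the primary key cannot be switched off this way.)

alter index IDX_SEARCH1_PLACE_PRICE inactive;
/* ... bulk insert the rows ... */
alter index IDX_SEARCH1_PLACE_PRICE active;
-- rebuilding via ACTIVE also refreshes the index statistics; for indexes
-- that stay active, the selectivity can be refreshed separately with:
set statistics index IDX_SEARCH1_PLACE_PRICE;

That helps for batch loading, of course, but not for the ongoing single-row inserts, which is the real question.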

stephane




--- In firebird-support@yahoogroups.com, "Martijn Tonies" <m.tonies@...> wrote:
>
> Hello Stephane,
>
> > We have a database with 8 000 000 objects and a form where the user can
> > filter the data. All the data must be returned in an ordered way.
> >
> > The result must be delivered to the user in < 100 ms. For that we created
> > a dedicated "search" table:
> >
> > CREATE TABLE SEARCH_1
> > (
> > ID INTEGER NOT NULL,
> > ID_place VARCHAR(100),
> > Status SMALLINT NOT NULL,
> > kind SMALLINT NOT NULL,
> > subcategory SMALLINT NOT NULL,
> > category SMALLINT NOT NULL,
> > image D_BOOLEAN NOT NULL,
> > weight INTEGER NOT NULL,
> > Creation_date TIMESTAMP NOT NULL,
> > price NUMERIC(15,5),
> > size NUMERIC(12,4),
> > height SMALLINT,
> > ID_object VARCHAR(15) NOT NULL,
> > PRIMARY KEY (ID)
> > );
> >
> > So the typical request from a user is:
> > select .... from SEARCH
> > Where
> > ID_place = xxx and
> > Status = 1 and
> > kind = 65 and
> > subcategory = 12 and
> > category = 1 and
> > price > 25000 and
> > price <= 30000
> > order by ID_place,
> > Status,
> > kind,
> > subcategory,
> > category,
> > price
> >
> > So I found no better way than to put an index
> > on (ID_place, Status, kind, subcategory, price) so that this request
> > can return in < 100 ms (8 000 000 records, and the requirement is that
> > the software works on 80 000 000 records).
> >
> > But the user can use different combinations, for example replacing price >
> > 25000 and
> > price <= 30000
> >
> > by
> >
> > size > 35 and
> > size < 50
> >
> > and in that case I need an index on
> > (ID_place, Status, kind, subcategory, size)
> >
> > etc...
> >
> > After checking all the possible combinations, it turns out that I need
> > 112 indexes (not really too many, in fact, for such a scenario).
> >
> > But now the big problem is that it is very slow to populate these 3
> > databases (it costs more than 1 second to add one row!), whereas when I
> > deactivate all the indexes it costs only 5 ms!
> >
> > I don't know how to improve it :(
>
> You don't need all indices in order to search ...
>
> Are there any criteria that are -always- used in searches? e.g. "ID_Place"?
>
> What is the number of different values in each column? This is relevant
> to index selectivity: the better the selectivity, the better the index.
>
> If, for example, you have 10 000 items left after a search on a specific
> column, you hardly need an index on a second column.
>
> With regards,
>
> Martijn Tonies
> Upscene Productions
> http://www.upscene.com
>
> Download Database Workbench for Oracle, MS SQL Server, Sybase SQL
> Anywhere, MySQL, InterBase, NexusDB and Firebird!
>
> Database questions? Check the forum:
> http://www.databasedevelopmentforum.com
>