Subject Re: How to improve update performance with millions of records?
Author firebird_jimmy
--- In firebird-support@yahoogroups.com, Svein Erling Tysvær <svein.erling.tysvaer@...> wrote:
>
> > >Hi, I'm jimmy
> >
> > Hi Jimmy!
> >
> > > I have a problem, please help me!
> > > I have a table with 3,000,000 rows, and every row is updated 1 to 5 times a day.
> > > Update statements have become slow, about 30 records/second, while inserts run at above 1000 records/second.
> > > Is my usage bad?
> > > What should I do?
> >
> > I don't know whether your usage is bad or not. What kind of indexes do
> > you have and how do you update? What about transactions, do you have a
> > noticeable gap between oldest (active) transaction and next
> > transaction? I don't know whether it is still relevant (it is a very
> > old article), but in some cases I think rdb$db_key can be useful for
> > updates: http://ibexpert.net/ibe/index.php?n=Doc.TheMysteryOfRDBDBKEY
> >
> > HTH,
> > Set
> >
>
> > Thanks for your help!
> > My table has an integer primary key, and I commit the transaction after every 200 updated rows.
>
> Sounds OK, but what's more important is the gap mentioned above - it doesn't help that the updater commits every 200 rows if there are one or more other concurrent transactions that run for a long time without committing (well, transactions that are read only AND read committed are OK, but other combinations are not).
>
> And take up Thomas' offer - he will notice if there's something wrong in the output of gstat.
>
> Set
>
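
To make the quoted advice about the transaction gap concrete, here is a minimal sketch of how the gap can be checked from SQL and how a long-running reader can be started so it does not hold up garbage collection. It assumes Firebird 2.1 or later, where the monitoring tables are available; gstat -h shows the same counters from the database header page.

-- Check the gap between the oldest active transaction and the next
-- transaction; a large and growing difference means some transaction has
-- stayed open and old record versions cannot be garbage collected.
select mon$oldest_active, mon$next_transaction
from mon$database;

-- A long-running reader that is both READ ONLY and READ COMMITTED does not
-- block garbage collection, so it will not widen the gap (isql syntax).
set transaction read only isolation level read committed;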

> I read http://ibexpert.net/ibe/index.php?n=Doc.TheMysteryOfRDBDBKEY and now use rdb$db_key as the condition of the update statement; update speed has improved to 600 to 2000 rows per second.
> In my database only one user can update data; the other users only connect and read records.
> My table is defined as:
CREATE TABLE MEASURE_VALUE_DEFINE(
MEASURE_VALUE_ID INTEGER NOT NULL,
MEASURE_POINT_ID INTEGER,
DATA_ITEM_ID INTEGER,
HIS_TB_NAME VARCHAR(255),
DATA_CLASS INTEGER,
PHASE INTEGER,
USER_DEFINE INTEGER,
DATA DOUBLE PRECISION,
DATA_TP TIMESTAMP,
COLLECT_TP TIMESTAMP,
IS_STORE INTEGER,
STORE_PERIOD INTEGER,
IS_CALC INTEGER,
IS_GRAPHICS INTEGER,
CODON FLOAT,
MULTI FLOAT,
ZERO_RANGE FLOAT,
THRESHOLD FLOAT,
UPPER_LIMIT DOUBLE PRECISION,
LOWER_LIMIT DOUBLE PRECISION,
ULTIMATE_UPPER_LIMIT DOUBLE PRECISION,
ULTIMATE_LOWER_LIMIT DOUBLE PRECISION,
STATISTIC_PERIOD INTEGER,
IS_STATISTIC_MIN INTEGER,
IS_STATISTIC_AVG INTEGER,
IS_STATISTIC_MAX INTEGER,
MIN_VALUE DOUBLE PRECISION,
MIN_VALUE_TP TIMESTAMP,
MAX_VALUE DOUBLE PRECISION,
MAX_VALUE_TP TIMESTAMP,
AVG_VALUE DOUBLE PRECISION,
AVG_COLLECT_COUNT INTEGER,
WHO_FREEZE INTEGER,
UPDATE_TIME TIMESTAMP,
MEASURE_VALUE_GUID VARCHAR(50) NOT NULL,
IS_DELETE INTEGER
);

Alter table measure_value_define add constraint pk_Measure_Value_Define Primary Key (Measure_value_id);

Create Index idx_Measure_Value_Define on MEASURE_VALUE_DEFINE(Measure_Point_ID, Measure_value_id);

>The table measure_value_define does not depend on other tables or procedures, and vice versa.

>The update statement is:
update Measure_value_define
set data = :data, data_tp = :data_tp, ...
where measure_value_id = :id;
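
For comparison, a rough sketch of the rdb$db_key variant described above; :id, :data and :data_tp are the existing host parameters, and :dbkey is just a placeholder name for the value fetched by the first statement. Both statements are assumed to run inside the same transaction, because a row's rdb$db_key is only guaranteed to stay valid for the duration of the transaction that read it (unless the dbkey_scope setting is changed).

-- 1. Fetch the physical row identifier once, using the primary key.
select rdb$db_key
from measure_value_define
where measure_value_id = :id;

-- 2. Update the row directly through that identifier, avoiding the index lookup.
update measure_value_define
set data = :data, data_tp = :data_tp
where rdb$db_key = :dbkey;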

>Now I have a question:
When I use a normal update statement with the primary key to update records, performance is very slow. Besides the rdb$db_key method, are there other ways to improve it?