Subject | Optimization questions
---|---
Author | svanderclock
Post date | 2009-04-09T06:50:04Z
Hello,
In my database I have some tables that are very (very!) write-intensive, but the data in those tables is not important and could be lost without harm. One of them could even be a pure memory table, if Firebird supports memory tables at all?
The configuration:
* Firebird 2 SuperServer (dedicated);
* more than 1,000,000 queries every day;
* SAS hard drive.
What is the best optimization advice you can give me?
Also, I just added a new very write-intensive table, and I notice that the processor is now always at 80-100% and cannot handle all the requests correctly. The difference between this new table and the others is that its primary key is a BIGINT instead of a VARCHAR (could that be responsible for the slowdown?).
I use this stored procedure to update the table:
```sql
SET TERM !! ;
CREATE OR ALTER PROCEDURE ADD_EDENWAY_ADS (
  ID BIGINT,
  Transac_kind SMALLINT,
  Property_category SMALLINT,
  ID_geo_place VARCHAR(100))
AS
  DECLARE VARIABLE aRECCOUNT INTEGER;
BEGIN
  SELECT COUNT(*) FROM EDENWAY_ADS WHERE ID = :ID INTO :aRECCOUNT;
  IF (aRECCOUNT = 0) THEN
    INSERT INTO EDENWAY_ADS (
      ID,
      Creation_date,
      Last_verification_date,
      Transac_kind,
      Property_category,
      ID_geo_place)
    VALUES (
      :ID,
      'NOW',
      'NOW',
      :Transac_kind,
      :Property_category,
      :ID_geo_place);
  ELSE
    UPDATE EDENWAY_ADS
    SET Last_verification_date = 'NOW',
        Transac_kind = :Transac_kind,
        Property_category = :Property_category,
        ID_geo_place = :ID_geo_place
    WHERE ID = :ID;
END !!
SET TERM ; !!
```
There are only 100,000 rows in this table. Do you think this stored procedure can be optimized?
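One idea I had, but have not tested: the `SELECT COUNT(*)` does a lookup on every call even when the row exists, so instead I could attempt the UPDATE first and fall back to INSERT only when the `ROW_COUNT` context variable (standard in Firebird PSQL) reports that no row was touched. A sketch, assuming the same table and parameters as above:

```sql
SET TERM !! ;
CREATE OR ALTER PROCEDURE ADD_EDENWAY_ADS (
  ID BIGINT,
  Transac_kind SMALLINT,
  Property_category SMALLINT,
  ID_geo_place VARCHAR(100))
AS
BEGIN
  /* Try the common case first: the row already exists. */
  UPDATE EDENWAY_ADS
  SET Last_verification_date = 'NOW',
      Transac_kind = :Transac_kind,
      Property_category = :Property_category,
      ID_geo_place = :ID_geo_place
  WHERE ID = :ID;

  /* ROW_COUNT = 0 means the UPDATE matched nothing, so insert. */
  IF (ROW_COUNT = 0) THEN
    INSERT INTO EDENWAY_ADS (
      ID, Creation_date, Last_verification_date,
      Transac_kind, Property_category, ID_geo_place)
    VALUES (
      :ID, 'NOW', 'NOW',
      :Transac_kind, :Property_category, :ID_geo_place);
END !!
SET TERM ; !!
```

Two concurrent calls for the same new ID could still race on the INSERT; I understand Firebird 2.1 adds an `UPDATE OR INSERT` statement that would handle this in a single statement, but I am on Firebird 2.0.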
Thanks in advance for your advice,
Stephane