Subject | Re: Hardware for firebird
---|---
Author | Adam
Post date | 2005-11-24T23:32:57Z
Angel,

This will help if the process is hard-disk or memory bound, but it
won't do a whole lot if your fbserver process is pegged at 100% CPU.
What is the nature of the problem? If it is a batch
import-then-process-then-delete type of job, and no one uses the data
during the import, then you can deactivate all indices, foreign keys,
etc. until the import has finished.
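For example (index, table, and constraint names here are invented;
note that ALTER INDEX ... INACTIVE does not work on the indices that
back primary key, unique, or foreign key constraints, which is why the
constraint itself has to be dropped and recreated):

-- Deactivate a regular index before the load; reactivating it
-- afterwards rebuilds it in one pass instead of row by row.
ALTER INDEX IDX_IMPORT_DATE INACTIVE;

-- Constraint-backing indices cannot be deactivated, so drop the
-- foreign key and recreate it once the import has finished.
ALTER TABLE IMPORT_DATA DROP CONSTRAINT FK_IMPORT_CUSTOMER;

-- ... run the import ...

ALTER INDEX IDX_IMPORT_DATE ACTIVE;
ALTER TABLE IMPORT_DATA ADD CONSTRAINT FK_IMPORT_CUSTOMER
  FOREIGN KEY (CUSTOMER_ID) REFERENCES CUSTOMERS (ID);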
What do you mean by "1 row per time"? Do you mean you commit on every
insert? If so, you probably don't need to: run everything in the same
transaction, and your other tasks within that transaction can still
see the uncommitted writes.
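In isql terms, something like this (the table is made up; from a
client library you would do the same thing through a single
transaction handle):

INSERT INTO IMPORT_DATA (ID, AMOUNT) VALUES (1, 10.00);
INSERT INTO IMPORT_DATA (ID, AMOUNT) VALUES (2, 12.50);
-- ... many more rows in the same transaction; a SELECT run here
-- sees the rows above even though they are not yet committed ...
COMMIT;  -- one commit per batch, not one per row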
Consider using stored procedures, and make sure your queries are
prepared. If you are the only user at that time, then use Firebird
Embedded, which is a lot quicker.
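A rough sketch of the stored procedure idea (table, columns, and the
calculation are placeholders for whatever your real logic is):

SET TERM ^ ;
CREATE PROCEDURE INSERT_RECORD (A_ID INTEGER, A_AMOUNT NUMERIC(15,2))
AS
  DECLARE VARIABLE RUNNING_TOTAL NUMERIC(15,2);
BEGIN
  -- the "additional queries and complex calc" live here, server
  -- side, instead of round-tripping to the client for every row
  SELECT COALESCE(SUM(AMOUNT), 0) FROM IMPORT_DATA
    WHERE ID = :A_ID INTO :RUNNING_TOTAL;

  INSERT INTO IMPORT_DATA (ID, AMOUNT)
    VALUES (:A_ID, :A_AMOUNT + :RUNNING_TOTAL);
END^
SET TERM ; ^

Your client then prepares EXECUTE PROCEDURE INSERT_RECORD (?, ?) once
and executes it a million times with different parameter values.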
Another approach is to use external tables: have whatever tool
generated the million records write them in the fixed-length format
Firebird expects, then select from the external table into a real
table.
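Along these lines (file path and layout are hypothetical;
ExternalFileAccess must be enabled in firebird.conf, and the file must
use Firebird's fixed-length record format):

CREATE TABLE EXT_IMPORT EXTERNAL FILE '/data/weekly_import.dat' (
  ID      CHAR(10),
  AMOUNT  CHAR(12),
  EOL     CHAR(1)   -- holds the line terminator, if the file has one
);

INSERT INTO IMPORT_DATA (ID, AMOUNT)
  SELECT CAST(ID AS INTEGER), CAST(AMOUNT AS NUMERIC(15,2))
    FROM EXT_IMPORT;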
Adam

--- In firebird-support@yahoogroups.com, "as1616z" <angelszc@h...> wrote:
> Hello,
>
> Every week I need to do a massive "INSERT INTO" of 1 million records.
> I need to insert one row at a time (because of additional queries and
> complex calculations). This is very hard, long-running work, and it
> is important to keep the processing time to a minimum.
>
> Is a RAID 5 with 4 SCSI HDs at 15,000 rpm a good option?
> Can increasing RAM up to 4 GB significantly improve this work?
> Any ideas?
>
> Thanks,
> Angel
>