Subject | Re: Performance with indices
---|---
Author | Svein Erling Tysvær
Post date | 2006-04-04T06:57:20Z
What about transactions? I mean, if you have one transaction running
while starting and committing 100000 other transactions, then things
will inevitably slow down.
Also, do many of these 100000 rows update one particular row? I
would imagine 100000 updates to the same row would add to the potential
problem above.
As for deactivating triggers, another option would be to have a field
that holds a particular value whenever you execute your import. That
way, the trigger itself would know that it is an import and could
simply do nothing when the field contains that particular value.
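A minimal sketch of how such a trigger might look, assuming a
hypothetical table IMPORT_DATA with a flag column IMPORT_FLAG (both
names invented for illustration):

```sql
SET TERM ^ ;

CREATE TRIGGER IMPORT_DATA_BI FOR IMPORT_DATA
ACTIVE BEFORE INSERT POSITION 0
AS
BEGIN
  /* During a bulk import the loader sets IMPORT_FLAG = 1,
     so the trigger skips its normal work for those rows. */
  IF (NEW.IMPORT_FLAG = 1) THEN
    EXIT;

  /* ... normal trigger logic for ordinary inserts goes here ... */
END^

SET TERM ; ^
```

The import would set IMPORT_FLAG = 1 on the rows it inserts, while
ordinary inserts keep full trigger behaviour, so nothing has to be
deactivated and reactivated around the load.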
HTH,
Set
--- In firebird-support@yahoogroups.com, "amoradell" wrote:
> Thanks Rick, but the index that I use is activated so I can search
> for duplicates.
>
> As the loading goes on, searching gets slower.
> My question: is the slowness due to index updates, to statistics not
> being up to date, or both?
>
> Finally, as you suggest, I can deactivate and reactivate these
> indices (primary keys) between files or groups of records (every
> 100000 for example).
>
> Alexandre
>
> --- In firebird-support@yahoogroups.com, "Rick Debay" wrote:
> >
> > When you reactivate them, their statistics will be recomputed. Be
> > sure to recompile your stored procedures to take advantage of the
> > new information if the stats have changed.
> >
> > -----Original Message On Behalf Of amoradell-----
> >
> > Hi,
> >
> > I have to load several thousand records into a database from
> > text files of about 100000 records each.
> >
> > The records are put into several tables (about 20, with one
> > exceeding 10 million records).
> >
> > For performance reasons, I must deactivate almost every index and
> > also the triggers.
> >
> > I have only primary keys activated to avoid duplicates.
> >
> > I want to know whether these primary key indices keep their
> > statistics up to date, or do I have to recompute them between each
> > file to maintain good performance?
> >
> > Thanks
> >
> > Alexandre
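
For reference, the statements being discussed in the quoted thread
might look roughly like this in isql; the index names
(IDX_DETAIL_DATE, RDB$PRIMARY22) are made up for illustration:

```sql
-- A non-constraint index can be taken out of maintenance during the
-- load and reactivated afterwards; reactivation rebuilds the index
-- and recomputes its selectivity statistics.
ALTER INDEX IDX_DETAIL_DATE INACTIVE;
-- ... load one file / batch of records here ...
ALTER INDEX IDX_DETAIL_DATE ACTIVE;

-- The automatically created primary key index enforces a constraint
-- and stays active throughout, so its statistics can instead be
-- refreshed explicitly between files:
SET STATISTICS INDEX RDB$PRIMARY22;
```

As Rick notes above, statements and stored procedures that are already
prepared would need to be recompiled (re-prepared) to take advantage
of the refreshed statistics.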