Subject | Re: [firebird-support] Handling large imports while system is in production |
---|---|
Author | Mark Rotteveel |
Post date | 2013-02-05T13:43:02Z |
On Tue, 05 Feb 2013 13:21:01 -0000, "rudibrazil78" <rudibr@...>
wrote:
> Hi all-
>
> We've been having to deal with a bad situation for quite a few years now:
> how to handle large imports without taking the server down (in fact that
> data is meant to be accessible as soon as it's ready in production).
>
> This is a call center software which routinely imports lists of clients
> and possible clients to contact.

How often do you do this, and isn't it possible to do it in off-hours (eg
at night, during lunchbreak)?
> The import happens on 2 main tables, clients and calls, which are exactly
> the same tables the operators will use/update while working.

Are you only adding, or also updating? Do you do everything in a single
transaction, or a transaction per individual change, or something in
between (eg a commit for every x updates)?
> These are very large tables (over 200 fields each), with many fk's (and a
> lot of the fk indexes are reportedly bad ones).
>
> Not only is importing taking very long, but while importing is happening,
> performance in production takes a severe hit.

What is 'very long' and how many updates are we talking about?
> I know there's no magic wand, but I seriously need to start making a
> checklist of practices I should be trying out to work around this situation.
>
> Thanks in advance... please let me know if I can clarify the issue further.
You could consider adding a separate staging table: first load the data
into that separate table, and only then (eg using a stored procedure, or
MERGE or UPDATE OR INSERT, etc) start loading it into the actual table(s).
You might even massage the data before loading it into the final
destination (eg remove records that don't need to be updated). This won't
necessarily improve things overall, but I expect it will decrease the time
to load the data, and also reduce the performance hit for your operators.
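
For example, a minimal sketch of that approach in Firebird SQL (the
CLIENTS_STAGING table, the column names and the CLIENT_ID matching key
are made up here for illustration; your real tables obviously have far
more columns):

    -- Optional massaging: drop staging rows that are identical to what is
    -- already in production, so the merge has less work to do
    DELETE FROM clients_staging s
    WHERE EXISTS (
      SELECT 1 FROM clients c
      WHERE c.client_id = s.client_id
        AND c.name = s.name
        AND c.phone = s.phone);

    -- Then move the remaining rows into the production table
    MERGE INTO clients c
    USING clients_staging s
    ON c.client_id = s.client_id
    WHEN MATCHED THEN
      UPDATE SET c.name = s.name, c.phone = s.phone
    WHEN NOT MATCHED THEN
      INSERT (client_id, name, phone)
      VALUES (s.client_id, s.name, s.phone);

One advantage of doing it this way is that the staging table can be kept
free of indexes and constraints, so the initial bulk load is cheap; the
production indexes and foreign keys only come into play during the final
merge.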
Mark