Subject: Large DB after batch import: index rebuild/set stats, backup/restore?
Author: Kjell Rilbe
Hi,

We're preparing a database for an online system and have imported a
large amount of data. The DB file is currently 42 gigabytes. There are
two tables with over 110 million records and then some with about 1-2
million records. Most other tables only have a few hundred or a few
thousand records. No records are deleted.

Page size is 4096.

Now, after the batch import finishes, I assume the indexes could do with
some "conditioning". But what's my best option? (Rough command sketches of
options 1-3 follow the list.)

1. Just set statistics on all of them?

2. Rebuild all indexes with deactivate/reactivate?

3. Do a backup/restore cycle with gbak?

4. Other options?
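To be concrete, and assuming I've got the standard syntax right, I mean
something along these lines for options 1-3 (index and file names are just
placeholders, run from isql resp. the command line):

Option 1, per index (I'd presumably generate one statement for each
user index listed in RDB$INDICES):

  SET STATISTICS INDEX IX_BIGTABLE_SOMECOL;

Option 2, per index (as far as I understand, this won't work for
constraint-backed indexes such as the primary key):

  ALTER INDEX IX_BIGTABLE_SOMECOL INACTIVE;
  ALTER INDEX IX_BIGTABLE_SOMECOL ACTIVE;

Option 3, whole database:

  gbak -b -v bigdb.fdb bigdb.fbk -user SYSDBA -password masterkey
  gbak -c -v bigdb.fbk bigdb_new.fdb -user SYSDBA -password masterkey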

Would you handle indexes differently depending on selectivity? For
example, the "largest" table, with over 110 million records, has a
primary key where key values are incremental (not always +1, but always
+something). Is it better to leave that index without a rebuild?

Also, what should I consider regarding page size? Maybe I should bump it
up from 4096?
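If a bigger page size turns out to be worthwhile, I assume the change would
simply ride along with the option 3 restore, e.g. (the size is just an
example):

  gbak -c -v -p 8192 bigdb.fbk bigdb_new.fdb -user SYSDBA -password masterkey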

Downtime is no problem. What matters more is maximizing index and query
performance before the system goes online.

I'm happy to be pointed to relevant docs in case these are FAQ/RTFM questions.

Thanks,
Kjell
--
--------------------------------------
Kjell Rilbe
DataDIA AB
E-mail: kjell@...
Phone: 08-761 06 55
Mobile: 0733-44 24 64