Subject Big DB
Author Chad Z. Hower aka Kudzu
We have an application that used to use big indexed flat files. Then we
moved the indexes and some other things into DBISAM but kept all the "blob"
data external, with ready-made indexes kept externally as well. It all worked
very well, except that even the DB part was getting big and DBISAM would
occasionally corrupt things.

We've moved it to FB and decided to move ALL data into the DB. The old
system was VERY fast, but pruning items out of the blobs was very difficult
because you had to repack the file and update the ready-made indexes, etc.

So we've converted, and my resulting GDB is 1.2 GB without any indexes. I'm
adding indexes now...

1) Any tips? Will FB have trouble with this DB, or with a potentially larger
one? The DB structure is pretty small, just 6 tables or so. Most of the data
is in two tables. There are very few joins; mostly simple lookups, in fact.

2) One of the common lookups is by a unique integer field. No problem. But
another of the common lookups is on a varchar(100). In our old system we took
a 32-bit hash of this field and put it in an "index". When we wanted to look
something up, we scanned for matching hashes; if more than one matched, we
then compared the actual text of those records. Very fast...

Now I assume FB does something similar for #2? Should we bother with this,
or just index the varchar(100) and let FB take care of it? Right now it's a
straight port, so it's still doing it. Also, how will FB store this as an
index?
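For comparison, here is a rough sketch of the two options as they would look
once everything lives in the DB. I'm using sqlite3 only so the snippet is
self-contained and runnable; the DDL and queries would look much the same in
FB, and the table and column names are made up for illustration:

import sqlite3
import zlib

def crc(s):
    return zlib.crc32(s.encode("utf-8")) & 0xFFFFFFFF

con = sqlite3.connect(":memory:")
con.execute(
    "CREATE TABLE items (id INTEGER PRIMARY KEY, "
    "name VARCHAR(100), name_hash INTEGER)")

# Option A: just index the varchar and let the engine handle it.
con.execute("CREATE INDEX ix_items_name ON items (name)")

# Option B: keep the old scheme, indexing a 32-bit hash column instead.
con.execute("CREATE INDEX ix_items_name_hash ON items (name_hash)")

con.execute("INSERT INTO items (name, name_hash) VALUES (?, ?)",
            ("example", crc("example")))

# A: straightforward lookup on the indexed varchar.
row = con.execute("SELECT id FROM items WHERE name = ?",
                  ("example",)).fetchone()

# B: narrow by the integer hash index, then confirm the exact text
# to weed out collisions.
row = con.execute(
    "SELECT id FROM items WHERE name_hash = ? AND name = ?",
    (crc("example"), "example")).fetchone()
print(row)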