Subject | Re: [firebird-support] BLOB |
---|---|
Author | Ann Harrison |
Post date | 2014-10-10T21:22:47Z |
On Fri, Oct 10, 2014 at 9:58 AM, tiberiu_horvath@... [firebird-support] <firebird-support@yahoogroups.com> wrote:
> my BLOB records are plain text phrases, somewhere between 1 and 400 characters, containing setup information (in Delphi I save a TStringList to a file and I back up that file in my database).

Hmmm. What is your page size? How big are the records (excluding blobs)? Those are very small blobs and are probably being stored on the same page as the data. That's interesting but probably not significant.

If you know that your blobs will always be text and never exceed 500 characters or so, you might be better off using a varchar field, assuming that the change doesn't cause your record to exceed 64K bytes. There's a slight overhead for reading a blob, even a small one on the same page, because the application has to read the record first, get the blob id, then read the blob using the id. Two reads to the same page in cache is not significantly expensive, but it is more expensive than reading once. Firebird's record compression will eliminate the extra space - declare your field to be varchar(800) - you'll waste a few bytes of compressed nothing, but less than the blob overhead.

> My question was about saving these tables with gbak -v (verbose), where I can see the time spent to back up / restore each table. Can I speed up this process somehow?
>
> 1. maybe gbak backup without some tables

Nope. Gbak creates a new database on restore. A new database with only the volatile tables just won't be the same.

> 2. maybe some magic gbak switch that "knows" that my BLOBs are text only

Text, binary, it's all the same to gbak. Try the -g suggestion - if gbak is cleaning out garbage, it's slow.

Good luck,

Ann
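(A minimal sketch of the blob-to-varchar conversion Ann suggests, in Firebird SQL. The table SETTINGS and column SETUP_TEXT are made-up names for illustration; Firebird can rename a column with ALTER COLUMN ... TO, and CAST from a text blob to varchar works as long as no value exceeds the declared length.)

```sql
-- Assumes a hypothetical table SETTINGS with a text blob column SETUP_TEXT.
-- 1. Add the replacement varchar column (800 > the observed 400-char maximum).
ALTER TABLE SETTINGS ADD SETUP_TEXT_NEW VARCHAR(800);

-- 2. Copy the blob contents; fails if any value is longer than 800 chars.
UPDATE SETTINGS SET SETUP_TEXT_NEW = CAST(SETUP_TEXT AS VARCHAR(800));

-- 3. Drop the blob and rename the varchar to take its place.
ALTER TABLE SETTINGS DROP SETUP_TEXT;
ALTER TABLE SETTINGS ALTER COLUMN SETUP_TEXT_NEW TO SETUP_TEXT;
```

Record compression then squeezes the unused tail of the varchar down to a few bytes, which is what makes the generous 800-char declaration cheap.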
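(The -g suggestion refers to gbak's switch that disables garbage collection during backup. A sketch of the invocation; file names and credentials are illustrative.)

```
# Backup (-b) with verbose output (-v) and garbage collection disabled (-g):
# if the slow backup was really gbak cleaning out old record versions,
# skipping that work should show up immediately in the per-table timings.
gbak -b -v -g -user SYSDBA -password masterkey mydb.fdb mydb.fbk

# Restore (-c) into a fresh database file, also with per-table timings.
gbak -c -v -user SYSDBA -password masterkey mydb.fbk restored.fdb
```

Note that skipping garbage collection only defers the cleanup: the old record versions stay in the source database until a sweep or cooperative garbage collection removes them.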