Subject: Re: [firebird-support] gbak or service call / fixed size backup chunk files . . .
Author: Helen Borrie
At 01:40 AM 1/08/2008, you wrote:
>Hello everybody,
>I use Delphi 6 and IBX components.

The IBX components that shipped with Delphi 6 are too buggy to use reliably. I think you can still get improved binaries for Delphi 6 from the CodeGear website, though...

>I am trying to back up a database as several 650 MB chunk files.
>What I want to see is enough 650 MB chunk files, plus one
>chunk file smaller than 650 MB if necessary.
>* First, I supplied 99 unique names, each with a size of 650 MB. On a 4
>MB test db, what I saw was one 4 MB chunk and ninety-eight 1 KB chunks.

The last chunk must have no file size specified. If your service component is implementing gbak -b multi-file properly, then it should remove the file size for the last chunk.

But this looks OK. All of the backup data from a 4 MB database would fit into the first (primary) backup file with about 649 MB of spare capacity. Each of the 1 KB files will contain only the path to the next file in the sequence and the limit specified as maximum file size, but no user data.
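For comparison, this is roughly what the equivalent raw gbak call looks like; the database path, credentials and chunk names here are illustrative, not taken from your setup:

```shell
# Multi-file backup: every file except the last gets a maximum size.
# The last file must have NO size, so it can absorb whatever remains.
gbak -b -user SYSDBA -password masterkey employee.fdb \
     chunk01.fbk 650m chunk02.fbk 650m chunk03.fbk
```

A service component that implements multi-file backup properly should build an equivalent file/size list, dropping the size from the final entry.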

>* Second, I supplied 1 unique name with a size of 1 MB. On a 4 MB test
>db, what I saw was one 1 MB chunk and one 3 MB chunk.

Numerically, this is consistent with the first test. The backup data is slightly larger than 1 MB so a secondary file was created. However, using gbak "raw", this situation would cause an exception. It is not valid to supply a file size for the "last" file (the primary file, in the single-file case). So what this seems to tell us is that the service component is probably wrapping a handler for that exception and is automatically requesting a secondary file. How was the secondary file named?
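With raw gbak, the equivalent of your second test would be rejected rather than quietly extended, because the only file is also the last one and therefore must not carry a size (file names illustrative):

```shell
# Invalid: the sole backup file is the "last" file, so no size is allowed.
gbak -b employee.fdb backup.fbk 1m
# Raw gbak raises an error here instead of creating a second file;
# the service component apparently traps this and asks for another name.
```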

>So, is there another method to do what I want?

You need to do more testing with *actual* databases of typical size.

>If not, is there a method to calculate the expected backup size?

Other than testing typical databases that have been working under conditions that are typical for your user applications, no. The file size of a database is almost always larger than the total size of the active data, because there will be pages that contain garbage. Sometimes a database could be two or three times larger than the active data, when housekeeping is neglected for long periods.

Backups don't back up garbage, so a freshly restored database is the best indicator of the actual size of the database. You could do some statistics comparing the backup file size with the size of the database restored from it, to get a rough ratio.
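As a sketch of that ratio calculation (the byte counts below are made-up examples, not measurements from any real database):

```shell
# Hypothetical sizes: a 120 MB backup file and the 400 MB database
# restored from it. The ratio gives a multiplier for future estimates:
# expected backup size ~= current database size * ratio.
backup_bytes=120000000
restored_bytes=400000000
ratio=$(awk "BEGIN { printf \"%.2f\", $backup_bytes / $restored_bytes }")
echo "backup/restored ratio: $ratio"
```

Collect this ratio over several backup/restore cycles; any single sample can be skewed by how much garbage the database was carrying at the time.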

>Or can I delete the small chunks safely in my first method (I mean in this and future versions)?

You must not delete ANY chunks in the sequence. A restore from a multi-file backup must be able to find all of the files that were specified in the backup call: each of those "empty" files contains the link to the next file.
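A multi-file restore with raw gbak has to name every file in the sequence, which is exactly why none of them can be deleted (file names illustrative):

```shell
# Restore: every backup file must be listed, in order, including the
# tiny "link-only" chunks; gbak follows the chain from file to file.
gbak -c chunk01.fbk chunk02.fbk chunk03.fbk restored.fdb
```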

For your testing, I recommend you use gbak and file-copies of real databases taken from production conditions. But for your IBX testing, get hold of the most recent version of the service components and test the theory that the TIBBackupService component is wrapping a "fallback" for situations where the primary file was specified alone with a file size.