Subject | gbak & BLOB (isc_bpb_type_stream) problem
---|---
Author | c_barheine
Post date | 2004-11-26T13:17:59Z
Hi,
The database for my test case consists of one table with a single blob
column:
CREATE DATABASE 'd:\gbakblob.fdb' USER 'sysdba' PASSWORD 'masterkey'
PAGE_SIZE 4096 DEFAULT CHARACTER SET NONE;
CREATE TABLE testtable (streamblob BLOB);
There is one row in it containing a _stream_ blob of size 2049 (2k +
1) bytes, created like this:
char bpb[] = { isc_bpb_version1, isc_bpb_type, 1, isc_bpb_type_stream };
isc_create_blob2(..., sizeof(bpb), bpb);
and then populated by calling isc_put_segment() twice - with
seg_buffer_length 2048 and 1, respectively. (The blob id was stored in
the parent row.)
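For completeness, here is a stripped-down sketch of what the creation code boils down to (not my literal code; it assumes an already attached database handle 'db' and an active transaction 'trn', and the resulting blob id is then bound to the STREAMBLOB column in a separate INSERT):

#include <string.h>
#include <ibase.h>

/* Sketch only: 'db' and 'trn' come from isc_attach_database() /
   isc_start_transaction() elsewhere. */
static ISC_STATUS write_stream_blob(isc_db_handle *db, isc_tr_handle *trn,
                                    ISC_QUAD *blob_id)
{
    ISC_STATUS status[20];
    isc_blob_handle blob = 0;
    static char data[2049];   /* 2k + 1 bytes of payload */
    /* BPB requesting a stream blob instead of the default segmented type */
    char bpb[] = { isc_bpb_version1, isc_bpb_type, 1, isc_bpb_type_stream };

    if (isc_create_blob2(status, db, trn, &blob, blob_id,
                         (short) sizeof(bpb), bpb))
        return status[1];

    memset(data, 'x', sizeof(data));

    /* written as two segments of 2048 and 1 bytes */
    if (isc_put_segment(status, &blob, 2048, data) ||
        isc_put_segment(status, &blob, 1, data + 2048))
        return status[1];

    return isc_close_blob(status, &blob) ? status[1] : 0;
}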
The database behaves as expected: I can read from the blob via
isc_get_segment(), navigate within it via isc_seek_blob(), and so on.
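The read loop is the usual one (again only a sketch with the same assumed handles; because a stream blob does not preserve segment boundaries, isc_get_segment() may return isc_segment, meaning "buffer full, more data follows", which has to be treated as a continuation rather than an error):

#include <stdio.h>
#include <ibase.h>

/* Sketch only: 'db' and 'trn' are assumed to be attached/started elsewhere. */
static ISC_STATUS read_stream_blob(isc_db_handle *db, isc_tr_handle *trn,
                                   ISC_QUAD *blob_id)
{
    ISC_STATUS status[20];
    ISC_STATUS rc;
    isc_blob_handle blob = 0;
    char buf[512];
    unsigned short actual = 0;
    long total = 0;

    if (isc_open_blob2(status, db, trn, &blob, blob_id, 0, NULL))
        return status[1];

    for (;;) {
        rc = isc_get_segment(status, &blob, &actual,
                             (unsigned short) sizeof(buf), buf);
        if (rc == 0 || rc == isc_segment) {
            total += actual;          /* consume 'actual' bytes of buf */
            continue;
        }
        if (rc == isc_segstr_eof)     /* normal end of blob */
            break;
        return status[1];             /* genuine error */
    }

    printf("read %ld bytes\n", total);
    return isc_close_blob(status, &blob) ? status[1] : 0;
}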
Everything is still fine when I do a local database backup with
gbak.exe as follows:
gbak -b -v -user sysdba -pas masterkey d:\gbakblob.fdb d:\gbakblob.fbk
After copying gbakblob.fdb to d:\ on "remotehost", I can use the
service manager to create a remote backup file:
gbak -b -v -user sysdba -pas masterkey -se remotehost:service_mgr
d:\gbakblob.fdb d:\gbakblob.fbk
But I would also like to be able to create a _local_ backup of a
remote database:
gbak -b -v -user sysdba -pas masterkey remotehost:d:\gbakblob.fdb
d:\gbakblob.fbk
This works when the blob field contains a regular binary blob
(isc_bpb_type_segmented) or when the stream blob is written in one
segment only (seg_buffer_length == 2049).
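For reference, the working segmented case only differs in the BPB (or in passing no BPB at all, since segmented is the default blob type):

char bpb_seg[] = { isc_bpb_version1, isc_bpb_type, 1, isc_bpb_type_segmented };
isc_create_blob2(..., sizeof(bpb_seg), bpb_seg);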
In the current situation, however, I get the following output:
gbak: readied database localhost:d:\gbakblob.fdb for backup
gbak: creating file d:\gbakblob.fbk
gbak: starting transaction
gbak: database localhost:d:\gbakblob.fdb has a page size of 4096 bytes.
gbak: writing domains
gbak: writing domain RDB$1
gbak: writing shadow files
gbak: writing tables
gbak: writing table TESTTABLE
gbak: writing column STREAMBLOB
gbak: writing functions
gbak: writing types
gbak: writing filters
gbak: writing id generators
gbak: writing stored procedures
gbak: writing exceptions
gbak: writing Character Sets
gbak: writing Collations
gbak: writing data for table TESTTABLE
gbak: ERROR: segment buffer length shorter than expected
gbak: ERROR: gds_$get_segment failed
gbak: Exiting before completion due to errors
Needless to say, the error persists when I switch to a different
database page size. Moreover, it is version-insensitive: I tried
Firebird-1.5.1.4481, -1.5.2.4671-0_RC2, and -1.5.2.4719-0_RC3 on WinXP.
My question is: How can I gbak a remote database containing stream blobs?
Since I am afraid attachments are not allowed, please contact me
privately for a copy of the database (3kB fbk file).
It would be very nice if anyone could take a look.
Thanks,
Christian