Subject | RE: [firebird-support] Error during database backup. Any help very much appreciated!
---|---
Author | Ryan Baldwin
Post date | 2003-10-08T14:32:24Z
Hi,
I have a problem trying to back up a database and would greatly appreciate
any help or ideas. Basically, GBAK fails with this message:
gbak: writing Collations
gbak: writing data for table O132438IDMAP
gbak: 0 records written
gbak: writing index RDB$PRIMARY17
gbak: writing data for table O1DATA
gbak: ERROR: segment buffer length shorter than expected
gbak: ERROR: gds_$get_segment failed
gbak: Exiting before completion due to errors
The metadata for table O1DATA as shown by IBConsole is
/* Table: O1DATA, Owner: SYSDBA */
CREATE TABLE "O1DATA"
(
"ID" INT64 NOT NULL,
"DATA" BLOB SUB_TYPE 0 SEGMENT SIZE 80,
PRIMARY KEY ("ID")
);
I can see in backup.epp that the snippet of code at the bottom of this post
is where the error is reported. So it would seem that the value returned
by isc_blob_info for max_segment is not sufficiently large to read all the
segments of the blob.
How could someone get their database into this state?
What could be done to restore the database to a state where a backup can
be done?
Thanks
Ryan
segments = max_segment = 0;
p = blob_info;
while ((item = *p++) != isc_info_end)
{
    l = (USHORT) isc_vax_integer((char*) p, 2);
    p += 2;
    n = (USHORT) isc_vax_integer((char*) p, l);
    p += l;
    switch (item)
    {
    case isc_info_blob_max_segment:
        PUT_NUMERIC(att_blob_max_segment, (int) n);
        max_segment = n;
        break;
    case isc_info_blob_type:
        PUT_NUMERIC(att_blob_type, (int) n);
        break;
    case isc_info_blob_num_segments:
        PUT_NUMERIC(att_blob_number_segments, (int) n);
        segments = n;
        break;
    default:
        BURP_error_redirect((ISC_STATUS*) NULL_PTR, 21, (void*) (ULONG) item,
                            NULL);
        /* msg 21 don't understand blob info item %ld */
    }
}

/* Allocate a buffer large enough for the largest segment and start
   grinding. */

if (!max_segment || max_segment <= sizeof(static_buffer))
    buffer = static_buffer;
else
    buffer = BURP_ALLOC(max_segment);

PUT(att_blob_data);

while (segments > 0)
{
    if (isc_get_segment(status_vector,
                        GDS_REF(blob),
                        GDS_REF(l),
                        max_segment,
                        (char*) GDS_VAL(buffer)))
    {
        BURP_error_redirect(status_vector, 22, NULL, NULL); /* <<< error reported from here */
    }
    /* msg 22 gds__get_segment failed */
    PUT(l);
    PUT(l >> 8);
    if (l)
    {
        (void) PUT_BLOCK(buffer, l);
    }
    --segments;
}