Subject: Re: [firebird-support] backup on amd64 restore on raspberry pi 3
Author:
Post date: 2018-08-25T18:46:48Z
Mark, checking on the max row size I find:
on https://www.ibphoenix.com/resources/documents/general/doc_323
"Rows are restricted to 32767 bytes."
on https://www.firebirdsql.org/en/firebird-technical-specifications/
"Maximum row size 64 KB"
Going by the first reference, I'm out of bounds big time...
CREATE TABLE TABLALOG
(
NID ENTERO NOT NULL,
NUSUARIO ENTERO NOT NULL,
TSINGRESO T_STAMP,
DALCANCE CHAR_255CI,
DAPP CHAR_50CI,
OEVENTO CLA_SHORT,
DIPV4 CHAR_15,
DIPV6 CHAR_50,
DHOST CHAR_255CI,
DENTIDAD CHAR_50CI,
OACCION CHAR_50CI,
DCONTENIDO VARCHAR(32765),
CONSTRAINT PK_TABLALOG PRIMARY KEY (NID)
);
Where the domains are:
ENTERO integer
T_STAMP timestamp
CHAR_255CI varchar(255) AI_CI
CHAR_50 varchar(50) AI_CI
CHAR_15 varchar(15) AI_CI
CLA_SHORT short
The database is created with character set ISO-8859-1, so counting 2 bytes per char I think I'm past all the bounds.
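A rough tally of what the table declares (just the field sizes; ignoring the record header, null flags and the 2-byte length word of each VARCHAR, so it may be off by a few bytes):
fixed fields: NID 4 + NUSUARIO 4 + TSINGRESO 8 + OEVENTO 2 = 18 bytes
varchars: 255 + 50 + 15 + 50 + 255 + 50 + 50 + 32765 = 33490 characters
So even at 1 byte per character that is already 33508 bytes, and at 2 bytes per character about 67000, either way well over 32767.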
I changed the varchar(32765) column to varchar(255), then backed up and restored. The error now shows up on another table that declares more than 32767 bytes (another log table, almost identical):
gbak: restoring index PK_ERRORLOG
gbak:restoring data for table ERRORLOG
gbak: ERROR:message length error (encountered -32512, expected 33024)
gbak: ERROR:gds_$send failed
gbak:Exiting before completion due to errors
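Something like the following against the system tables should give a rough list of the offending tables before I try the restore again (an estimate only: it assumes that summing RDB$FIELD_LENGTH approximates the declared row size, and it ignores null flags, the record header and each VARCHAR's 2-byte length word):
/* Approximate declared row size per user table; flag anything over 32767 bytes.
   RDB$FIELD_LENGTH should already be in bytes for the column's character set. */
SELECT rf.RDB$RELATION_NAME,
       SUM(f.RDB$FIELD_LENGTH) AS APPROX_ROW_BYTES
  FROM RDB$RELATION_FIELDS rf
  JOIN RDB$FIELDS f ON f.RDB$FIELD_NAME = rf.RDB$FIELD_SOURCE
  JOIN RDB$RELATIONS r ON r.RDB$RELATION_NAME = rf.RDB$RELATION_NAME
 WHERE COALESCE(r.RDB$SYSTEM_FLAG, 0) = 0   /* user tables only */
   AND r.RDB$VIEW_BLR IS NULL               /* skip views */
 GROUP BY rf.RDB$RELATION_NAME
HAVING SUM(f.RDB$FIELD_LENGTH) > 32767
 ORDER BY 2 DESC;
That should at least list every log table I need to shrink before running gbak again.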
I guess I must stay under a 32 KB maximum instead of 64 KB, right?
How do the 2 bytes per char count here?
Is the limit on the declared record size or on the actual record size?
I'm also worried that the server never yelled at me when I created/altered the database DDL.
Pablo