Subject Re: [firebird-support] Digest Number 9396
Author Ann Harrison

On Nov 19, 2016, at 11:20 PM, Dalton Calford dcalford@... [firebird-support] wrote:

Every time you do not define a domain when creating a column in a table, Firebird creates a new domain. So when you perform a backup and restore, it recreates all the prior domains first, then proceeds to create new domains.

I think that's not correct prior to V3 and probably not for V3 either.  Gbak maintains the relationship between fields and the domains on which they're based. The problem with extremely high domain numbers is probably related to temporary tables. 
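(As a side note: system-created domains show up in the RDB$FIELDS system table under names of the form RDB$<n>, so a quick query can show how many have accumulated. This is a sketch, assuming you can run isql against the database:)

```sql
-- Count the auto-created domains (named RDB$<n>).
-- User-defined domains keep the name given in CREATE DOMAIN,
-- so they are excluded by the pattern.
SELECT COUNT(*) AS auto_domain_count
FROM RDB$FIELDS
WHERE RDB$FIELD_NAME SIMILAR TO 'RDB$[0-9]+';
```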

Good luck,


Either way, you need to get your data out of the database and into one that is restorable. You also need to change your CREATE TABLE routines to use pre-defined domains. What you did not realize is that, regardless of whether you want to use domains, you will be using them. The difference is whether you use a domain you define, or let the system create domains on the fly.

At least with domains you create, you can reuse them and you do not encounter the issue you are seeing.
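As an illustration (the domain and table names here are made up, not taken from your schema), the pattern looks like this:

```sql
-- Define the domains once ...
CREATE DOMAIN D_NAME AS VARCHAR(60);
CREATE DOMAIN D_QTY  AS INTEGER NOT NULL CHECK (VALUE >= 0);

-- ... then reuse them in every table you create at runtime.
CREATE TABLE STOCK_2016 (
  ITEM_NAME D_NAME,
  QTY       D_QTY
);

-- Dropping the table later does not drop D_NAME or D_QTY,
-- so repeated CREATE/DROP cycles do not accumulate new domains.
```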

1 GB is not very much; the smallest database I maintain is 20 GB.

With FB 3, you can do a few things to speed up the process, but at this point you may need more help than a support list can give.

Where on the planet are you located?   Perhaps we can get you in touch with a FB specialist in your area.

best regards


From: tiberiu_horvath@... [firebird-support]
Sent: November 19, 2016 9:12:59 AM
Subject: Re: [firebird-support] Digest Number 9396

I use only

CREATE TABLE <TN> and DROP TABLE <TN> metadata commands at runtime. I populate the tables and that's all. I don't create any domains at all.

I use FB 3.0 on a Linux server (I don't maintain that server).

The database file is very big, 1+ GB in size; my customers append around 3000 records each day.

To try to fix my domain name problem I did as you suggested: I restored a new database file with the correct structure and datapumped the records into it. I did that yesterday (I have some 6 million records across a few tables), but the process is very time-consuming: around 4 hours of work.

I have one possible explanation: when I last upgraded from FB 2.55 to FB 3.0, I did the process on a Windows machine: backup under FB 2.55 and restore under FB 3.0. I then copied the FDB file to the Linux server (over Samba) and didn't perform a normal restore on Linux (because it is very time-consuming and my customer couldn't wait). Maybe something bad happened to those domain names then. Yesterday I did this:
1. Exported the records (into some XML files)
2. Restored an empty database file (a full restore, not a copy of the FDB file)
3. Datapumped the records (from the XML files)
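For reference, the restore in step 2 was a real gbak restore rather than a file copy, roughly like this (paths and credentials are placeholders):

```shell
# Back up the old database to a transportable file ...
gbak -b -v -user SYSDBA -password masterkey /data/old.fdb /data/old.fbk

# ... then restore it into a fresh database file (a real restore,
# not a file copy, so the database is rebuilt for the target server).
gbak -c -v -user SYSDBA -password masterkey /data/old.fbk /data/new.fdb
```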

I will see the result and keep you informed. 

Thank you,