Subject: Re: slow backup (fb 1.5 classic on linux)
Author: ggroper

Thanks for the advice.

I did do a complete backup and restore to a blank database before my
pump, so I assume there was no garbage to clean up. I did this
three times, all running from a local server.

Anyway, I recreated the clean database again this morning and used the
data pump in DW (Database Workbench) from an MS SQL Server. The transfer
was slow but it got there. This time I can do a backup in less than 5
minutes and a restore in about 15.

So, I suspect that either the process of pumping data from MS Access
or using IBPump may be the problem, but I am in no position to point a
finger. The two MS Access databases were also the source for the MS
SQL Server data, so I think they are OK.

Thanks again for the help,


--- In, Aage Johansen <aagjohan@o...> wrote:
> On Sat, 6 Mar 2004 23:41:26 +0000 (UTC), ggroper <ggroper@y...> wrote:
> > I backed up the structure of my database and restored it with no
> > problems.
> > I used the IBPump.exe utility that I downloaded. It has a date of
> > 11/1/2002. Maybe this is a problem and it is not compatible with FB
> > 1.5???
> > I pumped a small amount of data - about 300,000 records, rebuilt a
> > summary table and updated fields in the large table. This seemed to
> > work OK. I then did a database check and a sweep with Database
> > Workbench. All OK. I then used a DOS batch file to do a complete
> > backup and restore using gbak; this went OK and pretty fast. I then
> > deleted the rows in my two test tables, the big table and a small
> > summary table created by an SQL procedure. I re-ran the backup and
> > restore for the empty database; all OK, ready to go!!
> Deleting (all) the rows leaves (a lot of) garbage - which will eventually
> be cleaned out (GC).
> Doing a backup (with or without GC) plus a restore will accomplish this.
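> For example, from the command line (adjust user/password and paths to
> your own setup; add -g to the backup if you don't want it collecting
> garbage while backing up):
>   gbak -b -user sysdba -password masterkey D:\DB\db.gdb D:\DB\db.fbk
>   gbak -c -user sysdba -password masterkey D:\DB\db.fbk D:\DB\db_new.gdb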
> > The IBPump program ran the load of the full data set, 3.5 mill
> > records, in about 2 hours. This seemed faster than the MS SQL Server
> > utility. I next ran an update stored procedure to update two fields
> > in the large table; it took about an hour and a half. I next ran a
> > summary procedure to create a new summary table, again slow but it
> > finished.
> Did you run the process on the server (are there network times included in
> the timings)?
> I haven't used IBPump, but I once transferred data from one database to
> another (both running on the same server) on a 2x800MHz Xeon server at
> about 2000 records (select+insert) per second. Declared record size was
> 512B (probably about 50% filled with data). This was a Delphi program
> using IBO without any special optimization for those databases. No
> indexes defined, just the primary key.
> Updates also leave some garbage, which the system will eventually clean up.
> When I do massive updates or deletes I usually try to get rid of the
> garbage there and then (this used to be easy, but it seems we are now
> slaves of the GC thread).
> GC can be particularly painful with low-selectivity indexes on large tables.
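> One way to take out the garbage right away, after committing the big
> delete/update, is to force a full read of the table, e.g. (BIG_TABLE
> standing in for whatever table you just purged):
>   SELECT COUNT(*) FROM BIG_TABLE;
> Reading every record gives the server a chance to collect the old
> record versions as it goes.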
> > I looked at the data, ran a few queries; slow, but the data seemed to
> > be OK.
> >
> > I next ran the DOS batch file to back up the database. It started
> > out and then stalled. I looked at the file as it was being created -
> > very, very slow, almost a hang - maybe 1 KB a second. After two hours I
> > killed the job, rebooted, and tried a backup from Database Workbench;
> > again it started off OK, but then slowed down at about 260,000 records
> > and seemed to hang there.
> Maybe the system was also taking out the garbage while doing the backup
> (any low-selectivity indexes?)
> Since it started off fine, maybe some garbage was already cleaned out.
> > I looked at the data with a Delphi test program and Jason Wharton's
> > controls and all the data seemed to be there. The first query with
> > three parameters on indexed fields took over a minute to search
> > through the 3.5 million records, but changing the parameters, closing
> > and reopening the query was sub-second??? (The same query with ADO to MS
> > SQL Server was initially 4 secs and subsequent queries were sub-second,
> > slightly faster than FB.)
> Take a look at the plan for the query - this may give a hint.
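> In isql you can do something like this (table and field names here are
> just placeholders for your own):
>   SET PLAN ON;
>   SELECT * FROM BIG_TABLE WHERE F1 = 1 AND F2 = 2 AND F3 = 3;
> The plan is printed before the rows, showing which indexes (if any) are
> used. Most GUI tools can show the plan as well.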
> > So, I have data that looks good, runs queries slowly, and may be corrupted.
> > It seems I cannot do a sweep or a backup????
> Check the consistency of the data with the command line tools (gfix).
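> For example:
>   gfix -v -full -user sysdba -password masterkey D:\DB\db.gdb
> The -v option validates the database and -full also checks record
> fragments; problems are reported, and gfix -mend can mark damaged
> records so they are skipped on the next backup.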
> > How long do you estimate it should take to sweep, and to back up, a
> > database with 3.5 million new records, and I guess 3.5 million new
> > updates??? Does 4 or 5 hours seem right, or more like 30 minutes???
> This depends. With a lot of garbage (and low-selectivity indexes) it may
> really take a loooong time. If I want to delete all records and there
> are no dependencies, I consider dropping the table and recreating it.
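> For a table with no dependencies that is just something like (the column
> list here is made up, use your own definition):
>   DROP TABLE BIG_TABLE;
>   COMMIT;
>   CREATE TABLE BIG_TABLE (
>     ID INTEGER NOT NULL PRIMARY KEY,
>     VAL1 NUMERIC(15,2),
>     VAL2 NUMERIC(15,2));
>   COMMIT;
> and there is no garbage left to collect afterwards.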
> I do a sweep every night (at work), but haven't timed it. My guess is
> about an hour or so - maybe some more. The database has about 4 mill.
> records (1.3GB), not much garbage (800MHz proc., 512MB RAM and 15krpm
> disk. Win2k/SP2).
> Hmmmmm - I just ran a sweep on a local database (at home). 900MB
> (no garbage), 1.7GHz cpu, 7.2krpm IDE disk: less than a couple of
> minutes. I did:
> gfix -sweep -user sysdba -password masterkey D:\DB\db.gdb
> Ran it once more: 35 seconds - amazing. This is FB1.5/RC6.
> > I assume that FB 1.5 is compatible with Database Workbench, and with
> > IBPump. Maybe not, and there is some weird corruption.
> > Again, the Win XP restore is turned off, the database extension is
> > I have a 3GHz PC with 1 gig of memory, and two disk drives with over 10
> > gig available on each. The FB database is about 2 gig
> > before a backup and restore.
> > Any ideas?? Maybe try again and pump from Database Workbench??
> I wouldn't suspect DBW or IBPump. However, if you find corruptions you
> should contact the authors.
> Touch wood - I haven't seen a corrupted IB or FB database since I
> swept an IB/5.0 database (where we had deleted just over 6 mill. records
> from a single table).
> --
> Aage J.