Subject: Re: [firebird-support] Re: nbackup strategy advice
Author: Kjell Rilbe
Post date: 2014-03-16T09:01:18Z
On 2014-03-16 07:27, hugo.larson@... wrote:
N=0 backs up everything.
N=1 backs up every page that's changed since last N=0.
N=2 backs up every page that's changed since last N=1.
...
N=k backs up every page that's changed since last N=k-1.
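In command form, the levels map to nbackup's -B switch. A minimal sketch with hypothetical paths; the `run` wrapper only prints each command, so the sequence is visible without touching a real database:

```shell
# Hypothetical database/file paths; 'run' only prints the command
# (replace the printf with real execution in production).
run() { printf '%s\n' "$*"; }

run nbackup -B 0 /data/mydb.fdb /backup/mydb.L0.nbk  # everything
run nbackup -B 1 /data/mydb.fdb /backup/mydb.L1.nbk  # pages changed since last N=0
run nbackup -B 2 /data/mydb.fdb /backup/mydb.L2.nbk  # pages changed since last N=1
```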
So, with your suggested scheme, first month would go like this:
Day 1: backup entire database (N=0).
Day 2: backup pages changed since day 1 (N=1).
Day 3: backup pages changed since day 1 (N=1). Will include all pages
copied day 2 + additional pages changed since day 2.
Day 4: backup pages changed since day 1 (N=1). Will include all pages
copied day 3 + additional pages changed since day 3.
...
Day 28/30/31: backup pages changed since day 1 (N=1). Will include all
pages copied the day before + additional pages changed the last day.
If in month 2 you increment N to 2, you will get this:
Day 1: backup pages changed since last day of month 1 (N=2).
Day 2: backup pages changed since last day of month 1 (N=2).
...
Day 28/30/31: backup pages changed since last day of month 1 (N=2).
At the end of the year, your actual "final" backup sequence will be:
N=0: Initial backup first day of the year.
N=1: Last backup of month 1.
N=2: Last backup of month 2.
...
N=12: Last backup of month 12.
It would probably make more sense to do it like this:
First day of year: N=0, initial complete backup.
First day of each month: N=1, will contain all pages changed since first
day of year.
First day of each week: N=2, will contain all pages changed since first
day of month.
Each day: N=3, will contain all pages changed since first day of week.
If two such days coincide, you still need to run both "colliding" levels
(lower N first, then the higher N directly afterwards), or the sequence
will be broken the next day.
This way, you will have a daily backup that's complete, consisting of
four parts (N=0, 1, 2 and 3).
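Restoring such a four-part chain is done with nbackup -R, giving the level files lowest level first. A sketch with assumed file names, again print-only:

```shell
# Assumed file names; -R merges the level files into a restored database,
# lowest level first. 'run' only prints the command.
run() { printf '%s\n' "$*"; }

run nbackup -R /data/restored.fdb \
    /backup/mydb.L0.nbk /backup/mydb.L1.nbk \
    /backup/mydb.L2.nbk /backup/mydb.L3.nbk
```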
In general, Nbackup should be run with each value of N at a regular
interval, with tighter intervals for higher values of N. Incrementing N
over time as you suggested is not suitable.
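The fixed-interval schedule above can be sketched as a small date-to-levels helper. On a "colliding" day it returns every applicable level, lowest first, matching the run-lower-N-first rule. The argument conventions (day of year, day of month, ISO weekday) are my assumptions:

```shell
# Returns the nbackup levels to run today, lowest first.
# Arguments: day-of-year, day-of-month, day-of-week (1 = Monday),
# e.g. taken from date +%j, +%d, +%u. The scheme is the one described above.
levels_for_date() {
  local doy=$1 dom=$2 dow=$3 levels=""
  [ "$doy" -eq 1 ] && levels="$levels 0"  # first day of year: full backup
  [ "$dom" -eq 1 ] && levels="$levels 1"  # first day of a month
  [ "$dow" -eq 1 ] && levels="$levels 2"  # first day of a week
  levels="$levels 3"                      # every day
  printf '%s\n' "${levels# }"
}
```

A cron job could then loop over the returned levels and run `nbackup -B` once per level, in order.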
Note that Nbackup has no way of detecting database corruption, so if
that happens it will go completely undetected. It might be a good idea
to combine it with a local gbak backup, or gfix -v or gfix -v -full
validation, as often as is viable.
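For example, a validation pass might look like this (paths and credentials are placeholders; gfix -v -full and gbak -b are the standard Firebird tools, shown print-only here):

```shell
# Placeholder paths/credentials; 'run' only prints each command.
run() { printf '%s\n' "$*"; }

run gfix -v -full -user SYSDBA -password masterkey /data/mydb.fdb  # page validation
run gbak -b /data/mydb.fdb /backup/mydb.fbk                        # full logical backup
```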
I might also mention the possibility of simply locking the database with
Nbackup and copying the database file with rsync. This will probably have
similar or better performance over a slow connection. That's what I do
for our 200 Gbyte database, although that's on a single server with a
backup volume attached via a high-bandwidth network. :-)
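That lock-and-copy approach can be sketched as follows. The -L (lock) and -N (unlock) switches are real nbackup options; the paths and rsync flags are assumptions, and the commands are print-only:

```shell
# Assumed paths and rsync options; 'run' only prints each command.
run() { printf '%s\n' "$*"; }

run nbackup -L /data/mydb.fdb   # lock: changes go to a .delta file meanwhile
run rsync -av --inplace /data/mydb.fdb backuphost:/backup/mydb.fdb
run nbackup -N /data/mydb.fdb   # unlock: merge the delta back into the database
```

rsync's delta-transfer algorithm is what makes this attractive over a slow link: only changed parts of the file are sent.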
Regards,
Kjell
That wouldn't work. I don't think you understand how Nbackup works.
> Hello Paul.
> Thanks for your advice.
> My strategy was based on "backup data once" approach but this would
> produce too many files I now realize.
>
> But I still want to avoid backing up the entire database (N=0) on a
> regular basis.
> What's your opinion about this approach?
> First backup N=0
> Every day N=1 for a month (replace file each time)
> Increment N next month.
>
> This would produce 12 files every year.
>