Subject Re: [firebird-support] Linux Classic 2.x
Author Alexandre Benson Smith
Tom Miller wrote:
> Thanks, so each connection is a server and can use up to 2GB.

Yes

> and if 200
> people connected and each was using 128k, then some 20GB of memory would
> get used? Just to make sure I understand.

128KB of cache? If so, assuming you use 8KB pages, that would be only 16 pages.

CS uses by default (if not overridden by database or connection settings
and not changed in firebird.conf) 75 data pages for cache; if you use 8KB
pages that is 600KB per connection. There are other structures that each
server instance handles which also consume memory.
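
For example, the per-database override can be set with gfix (the
database path and credentials below are just placeholders):

  gfix -buffers 200 /data/mydb.fdb -user SYSDBA -password masterkey

This writes the buffer count into the database header, so it applies to
every connection to that database that does not set its own value.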

By the way, 128KB per connection * 200 connections would be about 25MB, not 20GB.
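
To put numbers on it:

  75 pages * 8KB = 600KB of cache per connection
  600KB * 200 connections = ~117MB of cache in total
  128KB * 200 connections = 25,600KB = ~25MB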

> There doesn't seem to be any
> commented out default configuration params in the firebird.conf file.
> Is that on purpose as we shouldn't be messing with the parameters or an
> oversight?
>

In a default firebird.conf file you will find something like this:
# ----------------------------
# Number of cached database pages
#
# This sets the number of pages from any one database that can be held
# in cache at once. If you increase this value, the engine will
# allocate more pages to the cache for every database. By default, the
# SuperServer allocates 2048 pages for each database and the classic
# allocates 75 pages per client connection per database.
#
# Type: integer
#
#DefaultDbCachePages = 2048

If you do not uncomment it and set a different number, CS will use 75
pages per connection as the database cache.
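
If you do want to change it, remove the '#' and set the value there, for
example for the 200 pages you mention:

DefaultDbCachePages = 200

firebird.conf is read when a server process starts, so under CS new
connections should pick the new value up.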

> I think the cached pages for CS defaults to 75 and I was going to bump
> it to 200 (if that makes sense).
>

I think you should experiment to see what best fits your needs. I have
read reports recommending leaving it at 75 per connection and letting
the OS use the spare memory as file system cache, which is shared by
all CS instances.
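
While experimenting, you can check what a given database will actually
use with gstat (the path is a placeholder):

  gstat -h /data/mydb.fdb

and look at the "Page buffers" line in the header; 0 there means the
firebird.conf default applies.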

> In addition, the gbak file I am restoring is from IB 7.51. Is there a
> FB (1.1 or 1.5) GBAK that is compatible with IB 7.51? If not I can
> always extract the DDL and datapump the data.
>

I don't know.

If I were you, I would extract the metadata, create a new database with
FB and pump the data over. I think that is the safest approach.
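
Something along these lines (database names are placeholders; run the
extract with an isql that can still connect to the IB database):

  isql -x olddb.gdb -output metadata.sql

Then review metadata.sql, add a CREATE DATABASE statement at the top (or
create the empty FB database first and CONNECT to it), run the script
with isql -input, and pump the data over with your tool of choice.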

> Thanks
>
> Tom Miller
>
>

see you !

--
Alexandre Benson Smith
Development
THOR Software e Comercial Ltda
Santo Andre - Sao Paulo - Brazil
www.thorsoftware.com.br