Subject Re: [IBO] IB_SQL Script limit 64K
Author Lele Gaifax
>>>>> On Wed, 22 Jan 2003 12:42:29 -0000, "Marco Menardi <mmenaz@...>" <mmenaz@...> said:

>> Script1: declare tables and primary keys.
>> Script2: triggers
>> Script3: foreign keys, indexes
>> Script4: stored procedures. I often do two scripts for SPs - one
>> for the lower-level procs, the next for the top-level ones.
>> Script5: DML script for populating control tables.
>> Script7: Drops, alters and recreates.

"M> THIS solution is very hard to maintain!

It's not! You surely don't want to depend on having an up-to-date gdb,
with a working server, around just to get at your beloved meta-data!

I keep ALL my meta-data in external, revision-controlled SQL
sources. Whenever I make a change, I also create a new script under
patches/ that, applied to existing DBs, brings them up-to-date.

Unlike the above, for each set of SPs I have two distinct sources:
one with empty "CREATE" stmts (and soon "RECREATE" ;), the other with
the real code in ALTER stmts. That way, simply re-executing the
latter guarantees all my code is up-to-date.
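
The two-file pattern can be sketched like this (a minimal illustration; the procedure name, parameters and the exact dummy body are made up, not taken from the original post):

```python
# Sketch of the two-file pattern: for each stored procedure, emit an empty
# CREATE (declaration only, run once to register the name) and an ALTER
# carrying the real body (safe to re-execute at any time).

def make_sp_scripts(name, params, returns, body):
    """Return (create_sql, alter_sql) for one stored procedure."""
    header = f"PROCEDURE {name} ({params}) RETURNS ({returns})"
    # The CREATE gets a dummy body, just enough to declare the procedure.
    create = f"CREATE {header}\nAS BEGIN EXIT; END"
    # The ALTER holds the real code; re-running it refreshes the procedure.
    alter = f"ALTER {header}\nAS\n{body}"
    return create, alter

create_sql, alter_sql = make_sp_scripts(
    "GET_TOTAL", "ID INTEGER", "TOTAL NUMERIC(18,2)",
    "BEGIN\n  SELECT SUM(AMOUNT) FROM ORDERS\n"
    "  WHERE CUST_ID = :ID INTO :TOTAL;\n  SUSPEND;\nEND")
```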

Then I have a home-grown app that basically keeps track of which
"patches" are already applied, and is able to load a new set of
patches and apply them, one after the other.
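
The core of such a tracker fits in a few lines; this is a guess at the shape of it (in the real tool the applied-set would presumably live in a table inside the DB, and `execute` would hand the script to the server):

```python
# Minimal sketch of a patch tracker: remember which patches were already
# applied, and run the pending ones in order, each exactly once.

applied = set()   # hypothetical stand-in for a bookkeeping table in the DB

def apply_pending(patches, execute):
    """Run every not-yet-applied patch, in filename order, via execute(sql)."""
    for patch_id, sql in sorted(patches.items()):
        if patch_id in applied:
            continue                # already done on an earlier run
        execute(sql)                # e.g. feed the script to the server
        applied.add(patch_id)       # record it so it never runs twice

log = []
patches = {
    "patch01.sql": "ALTER TABLE CUSTOMERS ADD DISCOUNT NUMERIC(5,2);",
    "patch02.sql": "UPDATE CUSTOMERS SET DISCOUNT = 0;",
}
apply_pending(patches, log.append)
apply_pending(patches, log.append)   # second run is a no-op
```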

"M> And you will get crazy
"M> for dependencies when your table declaration or SP or Triggers
"M> or both exceed 64K limits.

Yup! I'm planning to rewrite my tool (in Python, this time !-) and
implement a "require/provide" concept, so it can auto-compute the
relationships between the individual chunks and apply them in the
right order. But that's still vaporware :)
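
For what it's worth, the require/provide idea boils down to a topological sort; here is one possible sketch (chunk names and the provides/requires labels are invented for illustration):

```python
# Sketch of "require/provide": each chunk declares what it provides and what
# it requires; a depth-first topological sort yields a safe apply order.

def order_chunks(chunks):
    """chunks: {name: (provides, requires)} -> chunk names in apply order."""
    # Map each provided label back to the chunk that provides it.
    provider = {p: name for name, (provs, _) in chunks.items() for p in provs}
    done, order = set(), []

    def visit(name, stack=()):
        if name in done:
            return
        if name in stack:
            raise ValueError(f"circular dependency at {name}")
        for req in chunks[name][1]:          # satisfy requirements first
            visit(provider[req], stack + (name,))
        done.add(name)
        order.append(name)

    for name in chunks:
        visit(name)
    return order

chunks = {
    "tables.sql":   ({"TABLE:ORDERS"}, set()),
    "triggers.sql": ({"TRIGGER:ORDERS_BI"}, {"TABLE:ORDERS"}),
    "procs.sql":    ({"PROC:GET_TOTAL"}, {"TABLE:ORDERS"}),
}
order = order_chunks(chunks)   # tables.sql always comes before the others
```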

"M> If I have to add a default to a
"M> field, I simply export metadata, edit the script, then rebuild
"M> the database, and finally import the data (IBPump). That's
"M> very easy and error free.

Uhm, not at all for me. I would proceed by changing my
"ThisFeature/Tables.sql", adding the field to the right table's
"creation" stmt, then I'd write a new
"ThisFeature/patches/patchXX.sql" with "ALTER TABLE that ...", import
that patch into the tool I mentioned, and from there export a
compact, portable container of all those little patches to be
distributed to all customers. They have the same tool: reload the
archive, click "Apply", et voilà!

Well, it's not always that smooth, but you get the idea.

>> If you want to end up with this running as if it were one script,
>> include an INPUT statement at the end of each script (except the
>> last, of course). This sequence will take good care of most
>> dependency situations but you can vary your content and your
>> includes to suit your specific database.
>> Helen

"M> INPUT? I will check FB documentation, thanks :) Marco Menardi

Yes, that's a way I considered. Luckily I took the "Python
Everywhere" approach :): with the interpreter aboard, my patches can
do that and other astonishing things, being able to express a DB
change as a Python algorithm (say, "cycle through all the tables that
begin with 'BlaBla', and if they contain a field 'NotUsed', then
'ALTER TABLE xx DROP it'"!)
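
That "patch as algorithm" could look something like this (the catalogue dict is a hypothetical stand-in for whatever the tool uses to enumerate tables and fields):

```python
# Sketch of a DB change expressed as a Python algorithm: walk the tables,
# and for every "BlaBla*" table carrying a NotUsed field, emit the ALTER.

def drop_unused(catalogue):
    """catalogue: {table_name: [field, ...]} -> list of ALTER statements."""
    return [
        f"ALTER TABLE {table} DROP NotUsed"
        for table, fields in sorted(catalogue.items())
        if table.startswith("BlaBla") and "NotUsed" in fields
    ]

catalogue = {
    "BlaBlaOne": ["ID", "NotUsed"],
    "BlaBlaTwo": ["ID"],
    "Other":     ["ID", "NotUsed"],
}
stmts = drop_unused(catalogue)   # only BlaBlaOne matches both conditions
```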

Use the power, Luke! 8-)

ciao, lele.
nickname: Lele Gaifax | "When I live off what I thought yesterday,
real: Emanuele Gaifas | I'll start fearing those who copy me."
email: lele@... | -- Fortunato Depero, 1929.