Subject: RE: [ib-support] Re: Out of memory
Author: Alan McDonald
If all you select in the first instance is the 5000 primary keys, and you
then use these 5000 integers to select the variants whose foreign key is the
primary key you have selected, this will certainly speed things up. I would
have to test whether committing after each 50 variant updates/inserts would
be faster; it would be an easy thing to try both ways.
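The pattern Alan describes (fetch only the keys, then process one article's variants at a time, committing in batches) can be sketched as follows. The thread's actual code is Delphi/IBX against Firebird; this is a minimal runnable stand-in using Python's stdlib sqlite3 module, and the table and column names (`article`, `variant`, `qty`) are made up for illustration.

```python
import sqlite3

# Stand-in schema: 100 articles, 40 variants each (hypothetical names).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE article (id INTEGER PRIMARY KEY)")
cur.execute("CREATE TABLE variant (id INTEGER PRIMARY KEY,"
            " article_id INTEGER, qty INTEGER)")
cur.executemany("INSERT INTO article (id) VALUES (?)",
                [(i,) for i in range(1, 101)])
cur.executemany("INSERT INTO variant (id, article_id, qty) VALUES (?, ?, 0)",
                [(i, (i % 100) + 1) for i in range(1, 4001)])
conn.commit()

BATCH = 50  # commit after every 50 articles, as suggested above

# Step 1: fetch only the primary keys, not whole rows.
article_ids = [row[0] for row in cur.execute("SELECT id FROM article")]

# Step 2: touch one article's variants at a time, committing in batches
# so the transaction never accumulates the whole job.
for n, art_id in enumerate(article_ids, start=1):
    cur.execute("UPDATE variant SET qty = qty + 1 WHERE article_id = ?",
                (art_id,))
    if n % BATCH == 0:
        conn.commit()
conn.commit()  # flush the final partial batch

print(cur.execute("SELECT COUNT(*) FROM variant WHERE qty = 1").fetchone()[0])
# prints 4000
```

Whether the batched commits themselves win anything would, as Alan says, need measuring; the key-first select is the part that bounds memory per step.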
Alan

-----Original Message-----
From: Michael Vilhelmsen <Michael.Vilhelmsen@...>
[mailto:Michael.Vilhelmsen@...]
Sent: Monday, 10 February 2003 11:28 PM
To: ib-support@yahoogroups.com
Subject: [ib-support] Re: Out of memory


My problem lies in the fact that my customer has created somewhere around
5000 articles.
Each of those has on average 40 - 50 variants.

And it's for each variant that I have to do the update / insert.

I could select all articles first, and then for each article select
all the appropriate variants.

Then after every 50 articles or so I could do a commit.

Would that increase speed significantly?

Michael

--- In ib-support@yahoogroups.com, "Alan McDonald" <alan@m...> wrote:
> The unidirectional setting certainly reduced memory requirements.
> That's probably what made the difference for you.
> Hopefully you have a hierarchy in your record schema, so that for so many
> child records (228,000) you have a structure of parent records, e.g. rooms
> inside buildings, inside complexes, inside precincts.
> You could start by selecting distinct precincts, then looping thru the
> precincts, select all the complexes (for that precinct), then looping thru
> the complexes, select all the buildings (for the complex), then looping thru
> the buildings, select all the rooms (for that building)... that way a select
> will only have the rooms of one building at a time. You'd be very surprised
> what a speed increase you will get by processing in this way. You would be
> even better off doing this processing on the server under SP control.
>
> Alan
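The drill-down approach quoted above (loop over parents, and select only one parent's children at a time) looks roughly like this. Again a Python/sqlite3 sketch standing in for Delphi/Firebird, with a hypothetical two-level `building`/`room` schema instead of the full four-level hierarchy:

```python
import sqlite3

# Hypothetical schema: rooms inside buildings (two of Alan's four levels).
conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.executescript("""
CREATE TABLE building (id INTEGER PRIMARY KEY, complex_id INTEGER);
CREATE TABLE room (id INTEGER PRIMARY KEY, building_id INTEGER);
INSERT INTO building VALUES (1, 1), (2, 1), (3, 2);
INSERT INTO room VALUES (1, 1), (2, 1), (3, 2), (4, 3);
""")

processed = 0
# Outer loop: one row per building.
for (building_id,) in cur.execute("SELECT id FROM building").fetchall():
    # The inner select only ever holds one building's rooms,
    # never the whole room table.
    rooms = conn.execute(
        "SELECT id FROM room WHERE building_id = ?", (building_id,)
    ).fetchall()
    processed += len(rooms)

print(processed)  # prints 4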
>
> -----Original Message-----
> From: Michael Vilhelmsen <Michael.Vilhelmsen@M...>
> [mailto:Michael.Vilhelmsen@M...]
> Sent: Monday, 10 February 2003 10:46 PM
> To: ib-support@yahoogroups.com
> Subject: [ib-support] Re: Out of memory
>
>
> That means that I should do my selects in portions instead?
>
> Michael
>
> --- In ib-support@yahoogroups.com, "Alan McDonald" <alan@m...> wrote:
> > Sounds to me like it was telling you something very sensible.
> > Let's imagine for a moment that your database table grows over time
> > until it has 50 million records. Does your current design in any way
> > cater for this?
> > To select 50 million records (or even 228,000 records) in one hit is
> > not a good idea. There are many ways to design your processing such
> > that only small numbers of records are snatched off the server at any
> > one time. I would seriously think of these ways and ways in which you
> > can normalise your data such that there is little or no need to see
> > all records in a table in one select. Your performance will improve a
> > hundred or more fold too. You might be very surprised.
> >
> > Alan
> >
> > -----Original Message-----
> > From: Michael Vilhelmsen <Michael.Vilhelmsen@M...>
> > [mailto:Michael.Vilhelmsen@M...]
> > Sent: Monday, 10 February 2003 9:12 PM
> > To: ib-support@yahoogroups.com
> > Subject: [ib-support] Re: Out of memory
> >
> >
> > To answer my own question.
> >
> > I set UniDirectional to true, and the problem was gone.
> >
> > But I still would like to know why I got the error in the first place,
> > and whether I could do something to avoid it!
> >
> > Michael
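Setting UniDirectional to true helps because the client dataset stops buffering every row it has fetched for backward scrolling. A rough analogue in a Python/sqlite3 sketch is streaming rows in chunks with `fetchmany()` instead of materialising the whole 228,000-row result with `fetchall()` (table name hypothetical):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER PRIMARY KEY)")
conn.executemany("INSERT INTO t (id) VALUES (?)",
                 [(i,) for i in range(10000)])

cur = conn.execute("SELECT id FROM t")
seen = 0
while True:
    chunk = cur.fetchmany(500)  # keep at most 500 rows in memory at once
    if not chunk:
        break
    seen += len(chunk)

print(seen)  # prints 10000
```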
> >
> > --- In ib-support@yahoogroups.com, "Michael Vilhelmsen
> > <Michael.Vilhelmsen@M...>" <Michael.Vilhelmsen@M...> wrote:
> > > Hi
> > >
> > > I have an appl. that connects to a Firebird DB (1.0).
> > > In my appl. I have made a routine that selects 228,000 records and
> > > parses through each of them.
> > > For each record I call a stored procedure that either updates
> > > another record or inserts a record.
> > >
> > > That is to say, it will update / insert up to 228,000 records.
> > >
> > > Those two queries both use the same transaction.
> > >
> > > Before starting I do a
> > >
> > > if not MyTrans.Active then
> > >   MyTrans.StartTransaction;
> > >
> > > After the routine has run through I do
> > >
> > > MyTrans.Commit;
> > >
> > > On a smaller dataset it completes with no error.
> > > But on this bigger dataset (which I don't find that big) I get
> > > this error:
> > >
> > > Project xxx raised exception class EOutOfMemory with message
> > > "Out of memory". Process stopped.
> > >
> > > I get it running both from inside D5 and running my appl. alone.
> > >
> > > I kept an eye on my Task Manager, and all the time I had at least
> > > 96 MB of physical memory, and my swap file wasn't used that much.
> > >
> > > I have even tried running the program without any other programs
> > > running on my machine at the same time.
> > > Same result.
> > >
> > > Is this caused by my program, because it doesn't allocate enough
> > > memory for itself?
> > > Can I change that?
> > >
> > > I know in Turbo Pascal 7 I could set some compiler directives to
> > > increase the amount of memory available to my program.
> > >
> > >
> > > Regards
> > > Michael
> >
> >
> >
> > To unsubscribe from this group, send an email to:
> > ib-support-unsubscribe@egroups.com
> >
> >
> >
> > Your use of Yahoo! Groups is subject to http://docs.yahoo.com/info/terms/
>
>
>


