Subject | Re: [IBO] DML updating using tib_cursor |
---|---|
Author | Steve Fields |
Post date | 2002-09-27T17:54:29Z |
Ok, here is what I was doing and here is the result:
In an app I would have a master-detail record with a grid
below it. On the base record I would change a field that
was reflected in the detail records - for example, the
vendor id changed. The vendor id was not an indexed field,
for other reasons. When it changes, however, I would need to
change it in each of the related details. If I left the detail
dataset linked (not using DisableControls) I would get
a lot of scrolling in the grid of items as I stepped through
the details, changing the field. To get around this
I would use a TIB_Cursor to make the change in the
background. Doing this, however, (again) would not reflect
in the grid. To try to make the changes visible I tried
to refresh the master record: no change. Tried to refresh
the detail TIB_Query, still no change. If there is a simple
fix to this I am all for it, I just haven't found it yet,
and I have tried quite a few work-arounds on this....
(BTW I am using DML caching with a primary key in both
detail and master, using defined MasterSource and MasterLinks.)
Thanks,
Steve Fields
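(To make the scenario concrete: a minimal sketch of the background-update pattern described above, in Delphi with IBO. The component, table and column names - IB_DSQL1, IB_Transaction1, qryMaster, qryDetail, LINE_ITEMS, VENDOR_ID - are invented for illustration, and the method names should be checked against your IBO version:)

```pascal
// Hypothetical sketch: update all detail rows outside the grid's dataset,
// then commit and refresh so the grid shows the new values.
procedure TInvoiceForm.ChangeVendorId(const NewVendorId: string);
begin
  // A TIB_DSQL executes DML without fetching rows (a TIB_Cursor works too)
  IB_DSQL1.SQL.Text :=
    'UPDATE LINE_ITEMS SET VENDOR_ID = :VID WHERE INVOICE_ID = :INV';
  IB_DSQL1.Prepare;
  IB_DSQL1.ParamByName('VID').AsString := NewVendorId;
  IB_DSQL1.ParamByName('INV').AsInteger :=
    qryMaster.FieldByName('INVOICE_ID').AsInteger;
  IB_DSQL1.Execute;

  IB_Transaction1.Commit;  // make the change visible outside this statement
  qryDetail.Refresh;       // re-fetch the detail rows behind the grid
end;
```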
Helen Borrie wrote:
> Erm...no.
> In documentation we loosely use the term "user" when we mean "connection"
> and, sometimes, "transaction" (or both). A user is only meaningful in
> terms of a connection - so to do any work, your phantom user would (a) have
> to be connected and (b) have some way to know that an application wanted it
> to update something. In a sense, DMLCaching gives you what you describe,
> without a "different" user being involved, since it involves database events.
>
> > > I was under the impression that to commit it
> > > would close the datasets relating to the database and
> > > would therefore _lose_ the record I was letting the user
> > > view.
>
> No. You have RefreshAction and CommitAction to control the position of the
> cursor after closing and reopening datasets, and after committing work,
> respectively.
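(As a sketch of what that looks like in code - the enum value names below follow common IBO usage but should be verified against your version's IB_Components declarations:)

```pascal
// Control where the cursor lands after a refresh and after a commit.
// (Value names are assumptions; check your IBO version.)
qryDetail.RefreshAction := raKeepDataPos;  // stay on the same row after Refresh
qryDetail.CommitAction  := caRefresh;      // refresh the dataset after a commit
```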
>
> > > This is for a basic invoice/lineitem type of system,
> > > with twists relevant to our situation. I would make the
> > > changes in the background to the line items and the base
> > > totals, etc with a tib_cursor or a tib_dsql, (whatever it
> > > takes) but needed the values to update immediately on posting
> > > the lineitem changes. I am using all of the DML settings as
> > > true for each dataset involved (TIB_Queries and a few
> > > tib_cursors).
>
> I've kinda lost the original description but I get the impression you are
> making a self-perpetuating problem for yourself, by denying the dataset its
> native behaviour of updating itself.
>
> Basically, inserting, deleting and editing the current row are "things that
> a dataset knows how to do" in our OO model. IBO hides the whole process of
> the dataset constructing the SQL required for it to achieve these DML
> operations.
>
> With a plain-jane dataset, there is virtually nothing for you as the
> programmer to do except to ensure that the dataset knows how to locate the
> underlying row in the dataset (by supplying KeyLinks) and selecting the
> BufferSynchroFlags to set up how the dataset should respond to
> changes. With a more complex dataset, such as one involving joins, you
> help the dataset to achieve the DML operations by supplying custom SQL in
> the InsertSQL, EditSQL and DeleteSQL properties.
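(A short sketch of that setup for a joined dataset - the table, column and component names here are invented for illustration, not taken from the thread:)

```pascal
// A joined SELECT cannot be updated automatically, so KeyLinks tells the
// dataset how to locate the underlying row and EditSQL supplies the UPDATE.
qryDetail.SQL.Text :=
  'SELECT LI.ITEM_ID, LI.QTY, LI.VENDOR_ID, V.VENDOR_NAME ' +
  'FROM LINE_ITEMS LI JOIN VENDORS V ON V.VENDOR_ID = LI.VENDOR_ID';
qryDetail.KeyLinks.Text := 'LI.ITEM_ID';  // unique key of the updatable table
qryDetail.EditSQL.Text :=
  'UPDATE LINE_ITEMS SET QTY = :QTY, VENDOR_ID = :VENDOR_ID ' +
  'WHERE ITEM_ID = :ITEM_ID';
```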
>
> Whether your dataset uses the default SQL statements or custom ones you
> have supplied in the xxxxSQL properties, refreshing the dataset after a
> commit is sufficient to give your application the updated view of the
> database state. An appropriate combination of BufferSynchroFlags,
> RefreshAction and CommitAction will allow you to set up the exact behaviour
> you want.
>
> What DML caching does is to add another level of synchronisation (or three
> more levels, to be more exact). It makes your datasets aware of changes
> committed by other transactions as soon as those commits occur, rather
> than waiting until the next time the dataset's work is committed. If you don't
> code in what you want the dataset to do in response to a DMLCache message
> (using the OnDMLCacheAnnounceItem and OnDMLCacheReceiveItem events) the
> behaviour with DMLCaching will be no different to the standard behaviour of
> IBO datasets...
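(The shape of those two handlers, roughly - the parameter lists below are a guess at the general form, not the exact IBO declarations, so check IB_Components for the real event signatures:)

```pascal
// Announce: fired when this dataset commits a change; decide whether to
// broadcast it to other interested datasets. (Signature is approximate.)
procedure TInvoiceForm.qryDetailAnnounceItem(Sender: TObject;
  var Announce: Boolean);
begin
  Announce := True;  // let other datasets hear about this change
end;

// Receive: fired when another transaction's committed change arrives;
// decide how this dataset should react. (Signature is approximate.)
procedure TInvoiceForm.qryDetailReceiveItem(Sender: TObject;
  var Handled: Boolean);
begin
  qryDetail.Refresh;  // one possible reaction: re-fetch the rows
  Handled := True;
end;
```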
>
> Helen