Subject: Re: [IB-Architect] Fw: Mischievous SYSDBA
Author: Charlie Caro
Ann Harrison wrote:
>
> At 12:22 PM 5/25/00 -0400, Jim Starkey wrote:
>
> >Given the code, it would be next to trivial to pass in an encryption
> >key in the database parameter block used in the block read/write
> >routines to encrypt the database. Perhaps a VAR might want to do
> >that, even given the cpu hit that it's going to cost him. If the
> >VAR makes the change to a private code base and distributes only
> >the binaries, he's got a fighting chance that obscurity will prevail
> >and his database will be secure from prying eyes (and third party
> >tools, of course).
>
> That would violate the IPL. Anyone who changes the code must
> make the changes available for a reasonable copying cost.
>
> Ann
>

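To make Jim's point concrete: on the client side this would amount to little
more than one extra entry in the database parameter block. A rough sketch
against the existing C attach API follows; the dpb_encrypt_key_tag name and
value are invented for illustration, and the engine-side changes to the block
read/write routines are not shown.

    /* Rough sketch only: "dpb_encrypt_key_tag" is an invented DPB tag;
     * everything else is the standard InterBase C API. */
    #include <string.h>
    #include <ibase.h>

    #define dpb_encrypt_key_tag  200        /* hypothetical tag value */

    ISC_STATUS attach_encrypted(const char *db_name, const char *key,
                                isc_db_handle *db)
    {
        ISC_STATUS status[20];
        char dpb[256], *p = dpb;
        size_t key_len = strlen(key);       /* assumed to fit the buffer */

        *p++ = isc_dpb_version1;            /* version byte, then        */
        *p++ = dpb_encrypt_key_tag;         /* <tag> <length> <data>     */
        *p++ = (char) key_len;              /* triples, as in any DPB    */
        memcpy(p, key, key_len);
        p += key_len;

        *db = 0L;                           /* 0 name length => use the  */
        return isc_attach_database(status,  /* null-terminated db_name   */
                                   0, (char *) db_name, db,
                                   (short) (p - dpb), dpb);
    }
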
An InterBase VAR/OEM should pay the annual $15 fee to obtain a digital signature
from their favorite certificate authority. The VAR embeds that digital ID into
their application and builds an InterBase kit from open source with that same
digital ID. Or, for a service charge, InterBase Inc. would build, test, and
certify it for them. This might be desirable if there were a clause in InterBase
support contracts that limited support services for rebuilt servers. (The
foregoing is a hypothetical scenario and in no way is meant to reflect the
business policies of InterBase Inc.)
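
For illustration only, the embedded ID could be as simple as the certificate
bytes emitted by the VAR's build script into a header that both the application
and the rebuilt server compile in (the file and symbol names below are
invented):

    /* var_identity.h -- generated by the VAR's build script from the
     * certificate; compiled into both the VAR application and the VAR's
     * rebuilt InterBase server, so the same ID is baked into each binary. */
    static const unsigned char VAR_CERTIFICATE[] = {
        0x30, 0x82, 0x02, 0x5a,             /* DER-encoded X.509 bytes... */
    };
    static const unsigned int VAR_CERTIFICATE_LEN = sizeof(VAR_CERTIFICATE);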

Keeping these digital certificates in a database or file at the end user's site
would defeat the whole purpose of the exercise. They need to be embedded in the
binary executables. Ownership and usage of databases remain under SQL control;
ownership and usage of the software executables are controlled by the digital
certificates.

The VAR applications and InterBase server would always go through a mutual
authentication session (yet to be designed, but still open source) so that the
VAR app knows it is talking to the VAR's InterBase server and vice versa. Only
then does the VAR app pass the database encryption string on the database attach
call. A cracker engaged in corporate espionage would be prevented from spoofing
either side of the client/server interaction (unless he could obtain the VAR's
digital ID). Even if the cracker substituted both sides of the client/server
pair, he wouldn't have access to the database encryption key, since the VAR
compiled it into the authentic application.
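
The handshake itself is yet to be designed, so the following is only one
possible shape: each side challenges the other and proves possession of the
embedded VAR identity before the encryption string is ever sent. The keyed
digest below is a toy stand-in for a real certificate-based signature.

    /* Sketch of the yet-to-be-designed mutual authentication. */
    #include <stdio.h>

    static const unsigned char VAR_IDENTITY[] = "embedded-var-digital-id";

    /* Toy digest standing in for signing a challenge with the VAR's key. */
    static unsigned long respond(const char *challenge)
    {
        unsigned long h = 5381;
        unsigned int i;
        for (i = 0; i < sizeof VAR_IDENTITY - 1; i++)
            h = h * 33 + VAR_IDENTITY[i];
        for (i = 0; challenge[i]; i++)
            h = h * 33 + (unsigned char) challenge[i];
        return h;
    }

    /* Client side: verify the server's answer to our challenge, then
     * answer the server's challenge.  Only when this returns 1 does the
     * application send the encryption string on the attach call. */
    static int mutual_authenticate(const char *my_challenge,
                                   unsigned long server_answer,
                                   const char *server_challenge,
                                   unsigned long *my_answer)
    {
        if (server_answer != respond(my_challenge))
            return 0;                   /* not the VAR's server build */
        *my_answer = respond(server_challenge);
        return 1;                       /* server verifies my_answer  */
    }

    int main(void)
    {
        unsigned long answer;
        /* Only a genuine VAR build can compute the correct response. */
        if (mutual_authenticate("client-nonce", respond("client-nonce"),
                                "server-nonce", &answer))
            printf("authenticated; safe to pass the encryption key\n");
        return 0;
    }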

This technique might apply to other issues, such as VARs who want to enforce
application licensing by piggybacking on InterBase's prior licensing scheme. If
the customer replaced the VAR's InterBase server with one downloaded off the
net, the mutual authentication session would fail and prevent the licensing from
being circumvented.

This is not an open source software issue. It's about trust and the ability to
authenticate both parties to a transaction. Last month's email viruses and worms
were propagated by a proprietary, closed source mail client, but the problem had
less to do with the default settings for executing .exe, .vbs, and .bat mail
attachments than with the absence of authentication. If the mail client had been
registered to its user via a digital certificate and bound to that user's email
address, it wouldn't have been possible to forward forged, digitally unsigned
mail to other parties without their knowledge.

In any walk of life, an individual can do a great deal of damage if they don't
have to prove their identity or, worse, if they forge the identity of an
innocent party. This problem isn't novel or unique to software, open or closed.

Regards,
Charlie