
Global Software Registry: Ensuring the Validity of Downloaded Code

Articles and Tips: tip

Rich Lee
Senior Research Engineer

Roger Schell
Security Development Manager

01 Jun 1998


To be commercially attractive, electronic commerce (e-commerce) requires inexpensive and commercially available code segments for applications. In fact, the success of electronically mediated communications in e-commerce depends on the integrity and trustworthiness of transportable code segments, such as Dynamic Link Libraries (DLLs). Yet transportable code is not limited to DLLs or code segments; entire software programs are also transportable or electronically distributable to end-user consumers.

Code integrity and trust problems arise when you do not know whether electronically delivered software has been altered before it arrives for processing. While this is true of most software, it is especially evident in the browsers used by Internet applications, where the features and functions produced by code segments are exactly what consumers want.

The problem extends deeply into today's consumer-oriented environment, where traditional commercial packaging (diskettes or CDs) for code distribution is rapidly becoming inefficient. In more and more cases, commercial application updates and specific version releases are being distributed electronically. Yet there is no way to verify the validity of code segments that are distributed electronically over the Internet as there is through off-the-shelf distribution. Often consumers simply assume the validity of the code as well as the reliability of its source.

Today, Internet software users are making technology and liability assumptions regarding the origin of the software they are using. These are integrity assumptions for which users may bear the liability, and they encompass even those people who use direct dial-up services such as electronic checking. While there is no need for undue concern, people should recognize that these assumptions extend to software packages they buy off the shelf.

The often unrecognized assumptions come in two forms. The first assumption is that some type of "real" integrity check is being performed. For example, if you buy NetWare, you can be relatively certain that the installation NCF files do meet an integrity check; in fact, the installation program validates the installation NCFs against a known checksum. However, it would be incorrect to assume that a checksum is as effective as other technologies, such as public-key signatures (which are added to identify the origin of the software). The second assumption--that such checks or signatures are by themselves reliable--is a significant barrier to performing safe e-commerce.
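
To make the distinction concrete, the following minimal sketch (written in Python purely for illustration, with a hypothetical file name and published digest) shows what a checksum-style integrity check actually establishes: a matching digest only detects alteration of the file, and says nothing about who produced it, which is what a public-key signature adds.

    import hashlib

    # Hypothetical installation file and published checksum, for illustration only.
    INSTALL_FILE = "install.ncf"
    PUBLISHED_SHA256 = "9f86d081884c7d659a2feaa0c55ad015a3bf4f1b2b0b822cd15d6c15b0f00a08"

    def checksum_ok(path: str, expected_hex: str) -> bool:
        """Integrity check only: detects alteration, but proves nothing about origin."""
        with open(path, "rb") as f:
            digest = hashlib.sha256(f.read()).hexdigest()
        return digest == expected_hex

    if checksum_ok(INSTALL_FILE, PUBLISHED_SHA256):
        print("checksum matches the published value")
    else:
        print("file was altered, or the checksum list itself was tampered with")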

The Problem with Distributing Code

The real security nightmare for administrators and consumers comes when executable code moves to a user's machine. Most major companies need automatic, scalable, and secure code distribution techniques to ensure code integrity. However, where real money is involved, as in interbusiness transactions, current safeguards provide no protection in this "open" connectivity environment. It is unfortunate that today's code integrity checks are insufficient: the vulnerability of unauthenticated code at runtime increases with the exposure of electronic commerce transactions in an "open" connectivity environment. (See point #1 of the BFG 15 points dealing with authentication credentials, found in the January 1998 issue of Novell AppNotes.)

Taking up the challenge, one might wonder what technologies are available for companies to obtain and certify code segments. There are two underlying requirements.

  1. Suitable Distribution--a known ability to compute the authenticated source of the software

  2. Trusted Path--a persistent means to ensure code validity when the computer uses it

To meet these requirements, you need reliable processing platforms with secure end-points. This mandates "reliable" platforms, which must exist if commercial transactions are to be tamper-proof. The "reliable" requirement implies "trusted" technology. This is point #3 of the BFG 15-point security requirements (see the March 1998 AppNotes). Several Chief Information Officers (CIOs) whose companies participate in the Black Forest Group also recognize this as a fundamental information processing requirement.

Suitable Code Distribution

There must be a confirmation method surrounding code and application distribution over open networks. When software is distributed electronically in an open environment, you must establish and maintain a reliable connection with the manufacturer as the real source of the software. Accountability is a business necessity: it must trace back to the software supplier and translate into real assurance for the customer. This is a serious issue, as there is no room for error in electronic commerce transactions over open networks.

Accountability proceeds from the customer's liability for using the commercial software and extends to the supplier, who must take precautions in creating and distributing the software code. This computer-calculated operation--determining the authenticity of the code--must produce a known (computable) conclusion, so that an applet or a DLL can be refused for loading while sensitive operations are running. The same applets or DLLs must still be allowed to run during nonsensitive operations.
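
One way to picture such a computable conclusion is a load-time policy check. The sketch below is purely illustrative: the grade names and the "sensitive operation" flag are hypothetical stand-ins for whatever metric and verification mechanism the platform actually provides.

    from enum import IntEnum

    class Grade(IntEnum):
        # Hypothetical assurance grades, ordered from least to most trusted.
        UNKNOWN = 0
        REGISTERED = 1
        COMMERCIAL = 2
        HIGH_ASSURANCE = 3

    def may_load(code_grade: Grade, sensitive_operation: bool) -> bool:
        """Refuse low-grade or unverified code while sensitive operations run,
        but allow the same code segment during nonsensitive operations."""
        if not sensitive_operation:
            return True
        return code_grade >= Grade.COMMERCIAL

    # The same DLL is refused during a payment transaction
    # but allowed while the user merely browses a catalog.
    print(may_load(Grade.REGISTERED, sensitive_operation=True))   # False
    print(may_load(Grade.REGISTERED, sensitive_operation=False))  # True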

This is a concept of dual assurance. It might be termed "balanced confidence": the user seeks validation not only of the source (origin) of the code, but also that the code is reliable for commercial use. Thus, customers need a thorough, safe, and reliable (that is, "trusted") software distribution mechanism.

Here, the word "trust" denotes international standards for security evaluation, such as the Trusted Network Interpretation (TNI) of the National Computer Security Center (NCSC). There is a problem in ensuring safe and reliable distribution of transportable software (such as application DLLs) to each end-user platform. The "safe" factor must encompass an acceptable level of integrity, and the "reliable" factor must encompass some mechanism for rejecting unworthy code segments.

One problem with today's client operating systems is that if any transportable code segment can be successfully used, it will be, even in sensitive operations. For e-commerce, it is important not to use some code segments during certain operations. Applets used in a browser or DLLs in an application are good examples, and even the applications themselves need to be scrutinized. With adequate code management and platform support, companies can determine the validity of code segments; beyond such code management, you move into corporate policy.

Validating Transportable Software Distribution

For customers to obtain a workable commercial distribution method, users and distributors must first recognize the need for a trusted software distribution model. This model must provide a unique association between the customer's software (or hardware) and the supplier, and must balance the supplier's confidence that the software will be used for its intended purpose with the customer's assurance that software delivered through an unforgiving electronically mediated environment will be intact and unaltered. Otherwise, consumers are forced to rely on a single attribute of a cryptographic implementation, which may not be scalable, automatic, or secure enough for their needs.

The trusted distribution environment needs to contain levels or grades of assurance that a computer can discern for an application's use. For example, an application's or applet's code can carry out the processing of critical information, and its use may increase or decrease depending on the value of some specific data or process. The key factors for success are communications and the quality of assurance.

Communications Factor. Successful communication begins with providing reliable transmission of the transportable code segments and software. Given the large number of commercial users and end-users, it is impractical for each potential customer to communicate directly with the supplier of the software for validation. Still, it must be possible for end-users to know they have received a trusted distribution of the software without having to go back to the source. This scheme requires a bilateral communication between software suppliers and their customers.

Yet for software validation, the supplier must be known to the buyer regardless of the software distribution model (CDs, diskettes, or the Internet). If electronic commerce is to be successful, strong validation must exist at each end regardless of how the software is received, and the end-points must validate the reliability of the computing environment.

Quality of Assurance. One criterion for customer confidence is the quality of assurance surrounding the distribution of commercial software. In other words, the computer must be able to discern some mathematical order to the quality of the software and code segments used in the commercial application.

In a model which encompasses widespread use and scalability, customers expect to find various uses for software. Low-value applications written for the low-end market may be picked up by high-end (valuable) applications. If someone wants to use low-end software for a high-end application, the validation process should be able to tell whether it can be used. If the high-end application code is suitable for low-end purposes, its integration should be transparent.

Grades of quality should become a mechanism for validating distributed software. There are, however, two key factors needed to provide a global standard in recognizing qualified code:

  1. Establish a common metric for quality. That is, the common metric needs an agreed-upon definition of software quality that produces a calculable result every time.

  2. A process must exist by which a computer can arrive at the calculable result from only the available information, without continuous communication with the software supplier (a minimal sketch of such an offline check follows this list).
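
As a rough illustration of both factors, the Python sketch below assumes a registry-signed manifest that binds a code segment's hash to an agreed-upon quality grade. The manifest fields, the grade name, and the use of Ed25519 signatures are assumptions made only for the sketch; the point is that the customer's computer arrives at a calculable result from local information alone, without contacting the software supplier.

    import hashlib, json
    from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
    from cryptography.exceptions import InvalidSignature

    # --- Registry side (performed once, before distribution) -------------------
    registry_key = Ed25519PrivateKey.generate()     # stands in for the registry's key pair
    registry_public = registry_key.public_key()

    code_segment = b"...contents of a transportable DLL..."
    manifest = json.dumps({
        "name": "acctlib.dll",                      # hypothetical code segment
        "sha256": hashlib.sha256(code_segment).hexdigest(),
        "grade": "COMMERCIAL",                      # the agreed-upon quality metric
    }, sort_keys=True).encode()
    signature = registry_key.sign(manifest)

    # --- Customer side: a calculable result from local information only --------
    def verify_offline(public_key, code, manifest, signature):
        """Return the registry-certified grade, or raise if anything was altered."""
        public_key.verify(signature, manifest)      # raises InvalidSignature on tampering
        record = json.loads(manifest)
        if hashlib.sha256(code).hexdigest() != record["sha256"]:
            raise InvalidSignature("code segment does not match the signed manifest")
        return record["grade"]

    print(verify_offline(registry_public, code_segment, manifest, signature))  # COMMERCIAL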

With these two key elements, quality and method, it is possible to assert a level of accountability back to the origin of the software. In short, accountability levels provide knowledge of the software's source. Beyond that, if a manufacturer makes an explicit claim that a particular piece of software meets some standard, then the manufacturer is responsible for the claim.

Registry Service and the Directory

A registry mechanism provides a straightforward way to supply this needed information, becoming a facility for storing information about the software. The registry should be an audited facility which employs a publicly known and validated code of practice with strict operating procedures. The registry provides customers with a common process (a certificate validation mechanism) of known value.

The principal technology underlying this type of registry is the use of public key technology with associated security and Directory technologies, such as X.500 or Novell Directory Services (NDS). These become the two technologies that provide a scalable and automatic solution. The Public Key Infrastructure (PKI) not only allows the creation of certificates, it also provides for integrity and authentication.
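
As a sketch of how certificate creation and validation might look in code (using the Python cryptography package; the registry name is hypothetical and the certificate is self-issued only to keep the example self-contained), the registry's certificate binds its identity to its public key, and anyone holding the issuer's public key can check that binding:

    from datetime import datetime, timedelta
    from cryptography import x509
    from cryptography.x509.oid import NameOID
    from cryptography.hazmat.primitives import hashes
    from cryptography.hazmat.primitives.asymmetric import rsa, padding

    # Key pair standing in for the audited registry's signing key.
    key = rsa.generate_private_key(public_exponent=65537, key_size=2048)
    name = x509.Name([x509.NameAttribute(NameOID.COMMON_NAME, "Example Software Registry")])

    cert = (
        x509.CertificateBuilder()
        .subject_name(name)
        .issuer_name(name)                          # self-issued, for the sketch only
        .public_key(key.public_key())
        .serial_number(x509.random_serial_number())
        .not_valid_before(datetime.utcnow())
        .not_valid_after(datetime.utcnow() + timedelta(days=365))
        .sign(key, hashes.SHA256())
    )

    # Verification: confirm the certificate was signed by the issuer's key.
    key.public_key().verify(
        cert.signature,
        cert.tbs_certificate_bytes,
        padding.PKCS1v15(),
        cert.signature_hash_algorithm,
    )
    print("certificate signature verified")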

The Directory becomes key to eliminating the pair-wise and group-wise (many-to-one) communications otherwise needed for validating software; while such practices have been proposed, they do not scale as a solution. With the Lightweight Directory Access Protocol (LDAP), it becomes a relatively straightforward exercise to go to the Directory and look up the representation of a grade. Since a Directory can be replicated or copied, one can reliably confirm the software's validity without direct communication. Having the public key of the registry available to the end-points through the Directory makes the software validation process scalable. And since Directory entries can be signed, their validity, as well as that of their attributes, can be tested.
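
To show how little machinery such a lookup requires, here is a hedged sketch using the Python ldap3 package. The server name, search base, and attribute names are hypothetical; the assumption is that each entry carries the registry-signed hash and grade, so the end-point can verify them against the registry's public key exactly as in the offline-verification sketch above, with no direct communication with the software supplier.

    from ldap3 import Server, Connection, ALL

    # Hypothetical directory replica and schema; any copy of the Directory will do.
    server = Server("directory.example.com", get_info=ALL)
    conn = Connection(server, auto_bind=True)

    conn.search(
        search_base="ou=softwareRegistry,o=example",
        search_filter="(cn=acctlib.dll)",
        attributes=["softwareHash", "qualityGrade", "registrySignature"],
    )

    for entry in conn.entries:
        # The signed attributes can then be checked with the registry's public key,
        # so the validity of the entry itself is testable.
        print(entry.qualityGrade, entry.softwareHash)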

Summary

It is important to recognize that end-users cannot wait on a 3- to 5-year timetable for standards organizations to agree upon procedures, and the market will not tolerate a protracted delay once the criteria for providing the solutions are understood. As an alternative course, it might be better to use the standards we know, such as X.509, and use them in a trusted fashion.

Today's solutions for electronic commerce software distribution do not scale, nor do they encompass the sorely needed business-to-business electronic transaction paradigm. This is why the current credit card transaction is so popular. But credit card transactions use a predominantly proprietary transaction scheme and cannot be used for interbusiness transactions. They therefore cannot support real mass electronic commerce, and the paradigm cannot expand into "interbusiness" or "transborder" electronic transaction models, as this would allow all sorts of tampering without the ability to capture culprits. Still at risk are the end-users, the applications, transportable DLLs, and updates to the operating systems--generally the entire computer system.

Today there are many bogus solutions, some with a very appealing look. For instance, some may believe that complete security for enterprise systems already exists in the form of Unix or OSF Class B3 clients. It is time to dispel those myths: There are no commercial OSF Class B3 clients, nor can Unix be used economically in the commercial environment with any degree of trust or scalability.

If it were not for the "real" money involved in e-commerce, one might think it possible to distribute code segments without strong validation and without dire consequences. Yet, the ability to introduce malicious logic (viruses and Trojan Horses) into the processing environment continues to grow as a real threat. In the future, these threats can grow with even greater opportunities for electronic fraud unless actions are taken to protect customers and suppliers with a trusted means of communication and validation.

At Novell, we have put together a highly flexible architectural infrastructure and a working model that will support a registry. Novell has built registry services with cross-platform capabilities, and the cryptographic software within our system has internal controls against criminal tampering. Our intent is to serve our customers by handling the issues of key quality along with the types of software used.

* Originally published in Novell AppNotes


Disclaimer

The origin of this information may be internal or external to Novell. While Novell makes all reasonable efforts to verify this information, Novell does not make explicit or implied claims to its validity.
