TESTIMONY OF DR. MATTHEW BLAZE
BEFORE THE SENATE COMMITTEE ON COMMERCE, SCIENCE, AND TRANSPORTATION,
SUBCOMMITTEE ON SCIENCE, TECHNOLOGY, AND SPACE
JUNE 26, 1996
Thank you for the opportunity to speak with you about the technical impact
of encryption policy. It is a privilege to be here, and I hope my perspective
will be useful to you.
Let me begin by describing my own background and biases. I am a Principal
Research Scientist in the area of computer security and cryptology at AT&T
Research in Murray Hill, New Jersey. I also hold a number of ancillary
appointments related to computer security; among others, I teach an occasional
graduate course in the subject at Columbia University, and I serve as co-chair
of the Federal Networking Council Advisory Committee subcommittee on security
and privacy (which advises Federal agencies on computer networking issues).
However, the views I am presenting here today are my own, and should not
be taken to represent those of any organization with which I happen to be
affiliated.
I am a computer scientist by training; my Ph.D. is from the Princeton University
Computer Science department, and my primary research areas are cryptology,
computer security, and large-scale distributed systems. Much of my research
focuses on the management of encryption keys in networked computing systems
and understanding the risks of using cryptographic techniques to accomplish
security objectives. Recent government initiatives in encryption, such as
the "Clipper Chip," have naturally been of great interest to me,
in no small part because of the policy impact they have on the field in
which I work, but also because they present a number of very interesting
technical and scientific challenges in their own right.
My testimony today focuses on three areas. First, I will discuss the role
and risks of cryptographic techniques for securing the current and future
electronic world. Next, I will examine in more detail the security implications
of the limitations imposed on US-based cryptographic systems through the
government's export policies. Finally, I will discuss the technical aspects
of the Administration's current approach to cryptography policy, which promotes
"key escrow" systems.
I. THE INCREASING IMPORTANCE OF ENCRYPTION
The importance of cryptographic techniques for securing modern computer
and communications systems is widely recognized today. Evidence of the scope
of this recognition can be found in the increasing number of hardware, software,
and system vendors that offer encryption in their products, the increasing
demand for high-quality encryption by users in a widening array of applications,
and the growing, thriving community of cryptologic researchers of which
I am a part. It is vital that those who formulate our nation's policies
and official attitude toward encryption understand the nature of the underlying
technology and the reasons for its growing importance to our society.
The basic function of cryptography is to separate the security of a message's
content from the security of the medium over which it is carried. For example,
we might encrypt a cellular telephone conversation to guard against eavesdroppers
(allowing the call to be transmitted safely over easily-intercepted radio
frequencies), or we might use encryption to verify that documents, such
as contracts, have not been tampered with (removing the need to safeguard
a copy of the original). The idea that this might be possible is not a
new one; history suggests that the desire to protect information is almost
as old as the written word itself. Perhaps as a consequence of the invention
of the digital computer, our understanding of the theory and practice of
cryptography has accelerated in recent years, with a number of new techniques
developed and many new applications emerging. Among the most important of
the recent techniques is "public key cryptography." It allows
secure messages to be exchanged without the need for specific advance arrangements
between parties. A related notion is the "digital signature,"
which allows messages to be "signed" in a way that verifiably
associates the signer of a message with its content.
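To make the idea of a digital signature concrete, the sketch below uses textbook RSA with deliberately tiny numbers chosen only so the arithmetic is easy to follow; it illustrates the mathematical principle only and is not drawn from any particular product or standard discussed in this testimony.

    # Toy illustration of a digital signature (textbook RSA with tiny,
    # insecure numbers).  Real systems use far larger keys and hash and
    # pad messages before signing.

    p, q = 61, 53                 # secret primes
    n = p * q                     # 3233, part of the public key
    phi = (p - 1) * (q - 1)       # 3120
    e = 17                        # public exponent
    d = 2753                      # private exponent: (e * d) % phi == 1

    def sign(message: int) -> int:
        # Only the holder of the private exponent d can produce this value.
        return pow(message, d, n)

    def verify(message: int, signature: int) -> bool:
        # Anyone who knows the public values (n, e) can check the signature.
        return pow(signature, e, n) == message

    m = 42                        # a numerically encoded message, m < n
    s = sign(m)
    assert verify(m, s)           # a genuine signature verifies
    assert not verify(m + 1, s)   # tampering with the message is detected

The point of the example is that verification requires no prior secret arrangement between the parties: the public values can be published openly, while only the signer holds the private exponent.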
Modern cryptographic techniques are based on the application of simple,
if repetitive, mathematical functions, and as such lend themselves nicely
to implementation by computer programs. Any information that can be represented
digitally can be protected by encryption, including computer files, electronic
mail messages, and even audio and video signals such as telephone calls,
radio, and television. Encryption can be performed by means of software
on general-purpose computers, through special-purpose hardware, or by special
programming of microprocessor-based electronic products such as the next
generation of cellular telephones. The basic cost of encryption in terms
of computational power required is quite low, and the marginal cost of including
encryption in a software-based computer program or a programmable electronic
product is essentially zero.
Why, then, has encryption recently enjoyed so much attention? The reasons
can be found from two perspectives: the technology of modern communication
systems, and the new purposes for which we are relying on digital information.
First, the technology and economics of modern communications and computing
systems strongly favor media that have little inherent security. For example,
wireless telephones have great advantages in convenience and functionality
compared with their familiar wired counterparts and make up an increasing
proportion of the telephone network. But the same radio links that make them
convenient also make eavesdropping much easier for curious neighbors, for
burglars identifying potential targets, and for industrial spies
seeking to misappropriate trade secrets. Similarly, decentralized computer
networks such as the Internet have lower barriers to entry, are much less
expensive, are more robust and can be used to accomplish a far greater variety
of tasks than the proprietary networks of the past, but, again, at the expense
of intrinsic security. The Internet makes it virtually impossible to restrict,
or even predict, the path that a particular message will traverse, and there
is no way to be certain where a message really originated or whether its
content has been altered along the way. It is possible, even common, for
electronic mail messages to pass through the computers of competitors.
This is not a result of sloppy design or poor planning on the part of the
Internet's architects; on the contrary, these properties are a direct consequence
of the technological advances that make the Internet efficient and useful
in the first place.
Second, electronic communication is becoming increasingly critical to the
smooth functioning of our society and our economy, and even to the safety
of human life. Communication networks and computer media are rapidly
replacing less efficient, traditional modes of interaction whose security
properties are far better understood. As
teleconferencing replaces face-to-face meetings, electronic mail replaces
letters, electronic payment systems replace cash transactions, and on-line
information services replace written reference materials, we gain a great
deal in efficiency, but our assumptions about the reliability of very ordinary
transactions are often dangerously out-of-date.
Put another way, the trend in communication and computing networks has been
away from closed systems in favor of more open ones, and the trend in our
society is to rely on these new systems for increasingly serious purposes.
There is every reason to believe that these trends will continue, and even
accelerate, for the foreseeable future. Cryptography plays an important
and clear role in helping to provide security assurances that at least mirror
what we have come to expect from the older, more familiar communications
methods of the not-so-distant past.
II. KEY LENGTH AND SECURITY
The "strength" of an encryption system depends on a number of
variables, including the mathematical properties of the underlying encryption
function, the quality of the implementation, and the number of different
"keys" from which the user is able to choose. It is very important
that a cryptosystem and its implementation be of high
quality, since an error or bug in either can expose the data it protects
to unexpected vulnerabilities. Although the mathematics of cryptography
is not completely understood and cipher design is an exceptionally difficult
discipline (there is as yet no general "theory" for designing
cipher functions), there are a number of common
cipher systems that have been extensively studied and that are widely trusted
as building blocks for secure systems. The implementation of practical
systems out of these building blocks, too, is a subtle and difficult art,
but commercial experience in this area is beginning to lead to good practices
for adding high-quality encryption systems to
software and hardware. Users and developers of secure systems can protect
against weaknesses in these areas by choosing only cipher functions that
have been carefully studied and by ensuring that their implementation follows
good engineering practices.
The most easily quantified variable that contributes to the strength of
an encryption system is the size of the pool of potential values from which
the cryptographic keys are chosen. Modern ciphers depend on the secrecy
of the users' keys, and a system is considered well-designed only if the
easiest "attack" involves trying every possible key, one after
the other, until the correct one is found. The system is secure only if
the number of keys is large enough to make such an attack infeasible. Keys
are usually specified as a string of "bits," and adding one bit
to the key length doubles the number of possible keys. An important question,
then, is the minimum key length sufficient to resist a key search attack
in practice.
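The arithmetic behind that doubling is simple; the short sketch below merely illustrates it and assumes nothing beyond the definition of a bit.

    # Each additional key bit doubles the number of possible keys an
    # attacker must try in an exhaustive search.
    for bits in (40, 41, 56, 57):
        print(f"{bits}-bit key: {2 ** bits:,} possible keys")

    # A 41-bit keyspace is exactly twice a 40-bit keyspace:
    assert 2 ** 41 == 2 * 2 ** 40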
Last November, I participated in a study, organized by the Business Software
Alliance, aimed at examining the computer technology that might be used
by an "attacker" in order to determine the minimum length keys
that should be used in commercial applications. We followed an unusually
conservative methodology in that we assumed the attacker would have available
only standard "off-the-shelf" technology and would be constrained
to purchase it in single-unit quantities, with no economies of scale. That
is, our methodology would tend to produce a recommendation for shorter keys
than would an analysis using the more conventional approach of giving the
potential attacker every benefit of the doubt in terms of technological
advantages he might enjoy. Nonetheless, we concluded that the key lengths
recommended in existing U.S. government standards for domestic use (e.g.,
the Data Encryption Standard, with its 56-bit key) are far too short; data
protected under them will soon be vulnerable to attack with only modest
resources. We found that keys today should be a bare minimum of 75
bits long, and that systems being fielded today to secure data over the
next twenty years must employ keys of at least 90 bits. I have included
a copy of our report as an appendix to my testimony.
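By way of illustration only, the sketch below shows how the average time for an exhaustive key search grows with the key lengths discussed above; the trial rate is an assumed figure chosen for arithmetic convenience, not one taken from our report, and a well-funded attacker may be far faster.

    # Rough time-to-search estimates for the key lengths discussed above.
    TRIALS_PER_SECOND = 1_000_000_000      # assumed: one billion keys per second
    SECONDS_PER_YEAR = 60 * 60 * 24 * 365

    for bits in (40, 56, 75, 90):
        avg_trials = 2 ** bits / 2         # on average, half the keys are tried
        seconds = avg_trials / TRIALS_PER_SECOND
        print(f"{bits:2d}-bit key: about {seconds / SECONDS_PER_YEAR:.2e} years "
              f"at the assumed rate")

Under this assumed rate, a 40-bit search finishes in minutes and a 56-bit search in about a year, while a 90-bit search remains far out of reach, which is why a margin of additional bits is inexpensive insurance.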
Attempting to design systems "at the margins" by using the minimum
key length needed is a dubious enterprise at best. Because even a slight
miscalculation as to the technology and resources available to the potential
attacker can make the difference between a secure system and an insecure
one, prudent designers specify keys that are longer than the minimum they
estimate is needed to resist attack, to provide a margin for error.
Current U.S. policy encourages the designers of encryption systems to take
exactly the opposite approach. Encryption systems designed for export from
the United States at present generally must use keys no more than 40 bits
long. Such systems provide essentially no cryptographic security, except
against the most casual "hacker." Examples of 40-bit systems being
"broken" using spare computer time on university computer networks
are commonplace. Unfortunately, it is not only users outside
the U.S. who must make do with the inferior security provided by such short
keys. Because of the difficulty of maintaining multiple versions of software,
one for domestic sale and one for export, and the need for common interoperability
standards, many US-based products are available only with export-length
keys.
There is no technical, performance, or economic benefit to employing keys
shorter than needed. Unlike, for example, the locks used to protect our
homes, very secure cryptographic systems with long keys are no more expensive
to produce or any harder to design or use than weaker systems with shorter
keys. The only reason vendors design systems with short keys is to comply
with export requirements.
The key length figures and analysis in this section are based on so-called
"secret key" cryptosystems. For technical reasons, current public
key cryptosystems employ much longer keys than secret key systems to achieve
equivalent security (public keys are measured in hundreds or thousands of
bits). However, virtually all systems that use public key cryptography
also rely on secret key cryptography, and so the overall strength of any
system is limited by the weakest encryption function and key length in it.
III. THE RISKS OF KEY ESCROW
A number of recent Administration initiatives have proposed that future
cryptosystems include special "key escrow" provisions to facilitate
access to encrypted data by law enforcement and intelligence agencies.
In such systems, copies of keys are automatically deposited, in advance,
with third parties who can use them to arrange for law enforcement access
if required in the future. Several key escrow systems have been proposed
by the Administration, differing in the details of how keys are escrowed,
and who the third-party key holders are. In the first proposal, called
the "Clipper chip," the escrow mechanism is embedded in a special
tamper-resistant hardware cryptosystem, and copies of keys are held by federal agencies.
In the more recent "public key infrastructure" proposal, keys
are escrowed at the time a new public key is generated and are held by the
organization (public or private) responsible for certification of the public
key.
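The general idea of depositing a key with third parties can be illustrated with a simple splitting scheme. The sketch below is a generic illustration only, splitting a key into two shares so that neither escrow agent alone learns anything about the key but the two together can reconstruct it; it does not describe the actual mechanisms used in the Clipper chip or the public key infrastructure proposal.

    # Generic illustration of splitting a key into two escrow shares.
    import os

    def split_key(key: bytes) -> tuple[bytes, bytes]:
        share1 = os.urandom(len(key))                        # random share for agent 1
        share2 = bytes(k ^ s for k, s in zip(key, share1))   # complementary share for agent 2
        return share1, share2

    def recover_key(share1: bytes, share2: bytes) -> bytes:
        # XOR of the two shares reconstructs the original key.
        return bytes(a ^ b for a, b in zip(share1, share2))

    key = os.urandom(10)                 # an 80-bit key, for illustration
    s1, s2 = split_key(key)
    assert recover_key(s1, s2) == key    # both agents must cooperate to recover it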
Although the various key escrow proposals differ in the details of how they
accomplish their objective, there are a number of very serious fundamental
problems and risks associated with all of them.
There are some appropriate commercial applications of key escrow techniques.
A properly designed cryptosystem makes it essentially impossible to recover
encrypted data without the correct key. This can be a double-edged sword;
the cost of keeping unauthorized parties out is that if keys are lost or
unavailable at the time they are needed, the owner of encrypted data will
be unable to make use of his own information. This problem, of balancing
secrecy with assurances of continued availability, remains an area of active
research, and commercial solutions are starting to emerge. The Administration's
initiatives do not address this problem especially well, however.
The first problem with key escrow is the great increase in engineering complexity
that such systems entail. The design and implementation of even the simplest
encryption systems is an extraordinarily difficult and delicate process.
Very small changes can introduce fatal security flaws that often can be
exploited by an attacker. Ordinary (non-escrowed) encryption systems have
conceptually rather simple requirements (for example, the secure transmission
of data between two parties) and yet, because there is no general theory
for designing them, we still often discover exploitable flaws in fielded
systems. Key escrow renders even the specification of the problem itself
far more complex, making it virtually impossible to assure that such systems
work as they are intended to. It is possible, even likely, that lurking
in any key escrow system are one or more design weaknesses that allow recovery
of data by unauthorized parties. The commercial and academic world simply
does not have the tools to analyze or design the complex systems that arise
from escrow.
Key escrow is so difficult that even systems designed by the classified
world can have subtle problems that are only discovered later. In 1994
I discovered a new type of "protocol failure" in the Escrowed
Encryption Standard, the system on which the Clipper chip is based. The
failure allows, contrary to the design objectives of the system, a rogue
user to circumvent the escrow system in a way that makes the data unrecoverable
by the government. Other weaknesses have been discovered since then that
make it possible, for example, to create incriminating messages that appear
to have originated from a particular user.
It should be noted that these weaknesses were discovered in spite of
the fact that most of the details of the standard are classified and thus
were not available to the analysis that uncovered the flaws. But
these problems did not come about because of incompetence on the part of
the system's designers. Indeed, the U.S. National Security Agency is likely
the most advanced cryptographic enterprise in the world, and is justifiably
entrusted with developing the cryptographic systems that safeguard the government's
most important military and state secrets. The reason the Escrowed Encryption
Standard has flaws that are still being discovered is that key escrow is
an extremely difficult technical problem, with requirements unlike anything
previously encountered.
A second problem with key escrow arises from the difficulty of operating
a key escrow center in a secure manner. According to the Administration
(for example, see the May 20, 1996 White House draft report "Enabling
Privacy, Commerce, Security and Public Safety in the Global Information
Infrastructure"), key escrow centers must be prepared to respond to
law enforcement requests for escrowed data 24 hours a day, completing transactions
within two hours of receiving each request. There are thousands of law
enforcement agencies in the United States authorized to perform electronic
surveillance, and the escrow center must be prepared to identify and respond
to any of them within this time frame. If the escrow center is also a commercial
operation providing data recovery services, it may also have tens of thousands
of additional private sector customers that it must be prepared to serve
and respond to. There are few, if any, secure systems that operate effectively
on such a scale and under such tightly constrained response times. The argument,
advanced by the Administration, that escrow centers can use the same procedures
that protect classified data is a curious one, since classified information
is by its nature available to a far smaller and more carefully-controlled
potential audience than are escrowed keys. It is simply inevitable that
escrow centers meeting the government's requirements will, from time to
time, give out the wrong keys or fall victim to fraudulent key requests.
Key escrow, by its nature, makes encrypted data
less secure because the escrow center introduces a new target for attack.
A third problem with the Administration's key escrow proposals is that they
fail to distinguish between cryptographic keys for which recovery might
be required and those for which recoverability is never needed. There are
many different kinds of encryption keys, but for the purposes of discussing
key escrow it is sufficient to divide keys into three categories. The first
includes keys used to encrypt stored information, which must be available
throughout the lifetime of the data. The owner of the data has an obvious
interest in ensuring the continued availability of such keys, and might
choose to rely on a commercial service to store "backup" copies
of them. A second category of key includes those used to encrypt real-time
communications such as telephone calls. Here, the key has no value to its
owner once the transaction for which it was used has completed. If a key
is lost or destroyed in the middle of a conversation, a new one can be established
in its place without permanent loss of information. For these keys, the
owner has no use for recoverability; it is of value only to law enforcement
and others who wish to obtain access to a conversation without the knowledge
or cooperation of the parties. Finally, there are the keys used not for
secrecy but for signature and authentication, to insure that messages indeed
originated from a particular party. There is never a need for anyone, law
enforcement or the key owner, to recover such keys, since their purpose
is not to obscure content but rather to establish authorship. If the owner
loses a signature key, a new one can be generated easily at any time.
Unfortunately, however, the current Administration proposal exposes all
three types of keys equally to the risks introduced by the escrow system,
even though recoverability is not required for all of them. Partly this
is because there is no intrinsic difference in the structure of the different
types of keys; they are usually indistinguishable from one another outside
of the application in which they are used.
Finally, there is the problem that criminals can circumvent almost any escrow
system to avoid exposure to law enforcement monitoring. All key escrow
systems are vulnerable to so-called "superencryption," in which
a user first encrypts data with an unescrowed key prior to processing it
with the escrowed system. Most escrow systems are also vulnerable to still
other techniques that make it especially easy to render escrowed keys useless
to law enforcement. The ease of avoiding law enforcement when convenient
raises an obvious question as to whether the reduced security and high cost
of setting up an escrow system will yield any appreciable public safety
benefit in practice.
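To make the superencryption point concrete, the sketch below shows that data enciphered first under an unescrowed key yields only ciphertext when the escrowed layer is removed. The "cipher" used here is a toy hash-based keystream chosen purely for illustration; it is not a secure cipher and does not model any particular escrow system.

    # Toy illustration of "superencryption": the user encrypts first with an
    # unescrowed key, then with the escrowed system.  Stripping the escrowed
    # layer (which is all an escrowed key permits) yields only ciphertext.
    import hashlib

    def toy_encrypt(key: bytes, data: bytes) -> bytes:
        # XOR the data with a keystream derived from the key (illustration only).
        keystream = b""
        counter = 0
        while len(keystream) < len(data):
            keystream += hashlib.sha256(key + counter.to_bytes(8, "big")).digest()
            counter += 1
        return bytes(d ^ k for d, k in zip(data, keystream))

    toy_decrypt = toy_encrypt            # XOR with the same keystream undoes it

    plaintext      = b"meet at the usual place"
    unescrowed_key = b"known only to the user"
    escrowed_key   = b"copy held by escrow agents"

    inner = toy_encrypt(unescrowed_key, plaintext)   # superencryption layer
    outer = toy_encrypt(escrowed_key, inner)         # escrowed layer

    recovered = toy_decrypt(escrowed_key, outer)     # what escrowed keys yield
    assert recovered == inner                        # still ciphertext
    assert recovered != plaintext                    # escrow access gains nothing
    assert toy_decrypt(unescrowed_key, recovered) == plaintext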
IV. CONCLUSIONS AND RECOMMENDATIONS
The wide availability of encryption is vitally important to the future growth
of our global information infrastructure. In many cases, encryption offers
the only viable option for securing the rapidly increasing range of human,
economic and social activities taking place over emerging communication
networks. It is no exaggeration to say that the availability of encryption
in the commercial marketplace is and will continue to be necessary to protect
national security. Unfortunately, current policy, through export controls
and ambiguous standards, discourages, rather than promotes, the use of encryption.
Current encryption policy is enormously frustrating to almost everyone working
in the field. Export controls make it difficult to deploy effective cryptography
even domestically, and we can do little more than watch as our foreign colleagues
and competitors, not constrained by these rules, match our expertise
and capture an ever-increasing share of the market. A large part of the
problem is that the current regulations were written as if to cover hardware
but are applied to software, including software in the public domain or
aimed at the mass market. The PRO-CODE bill goes a long way toward bringing
the regulations into line with the realities of the technology.