International Association for Cryptologic Research

CryptoDB

Nancy Lynch

Publications

2008 (EPRINT): Modeling Computational Security in Long-Lived Systems, Version 2
For many cryptographic protocols, security relies on the assumption that adversarial entities have limited computational power. This type of security degrades progressively over the lifetime of a protocol. However, some cryptographic services, such as timestamping services or digital archives, are long-lived in nature; they are expected to be secure and operational for a very long time (i.e., super-polynomial). In such cases, security cannot be guaranteed in the traditional sense: a computationally secure protocol may become insecure if the attacker has a super-polynomial number of interactions with the protocol. This paper proposes a new paradigm for the analysis of long-lived security protocols. We allow entities to be active for a potentially unbounded amount of real time, provided they perform only a polynomial amount of work per unit of real time. Moreover, the space used by these entities is allocated dynamically and must be polynomially bounded. We propose a new notion of long-term implementation, which is an adaptation of computational indistinguishability to the long-lived setting. We show that long-term implementation is preserved under polynomial parallel composition and exponential sequential composition. We illustrate the use of this new paradigm by analyzing some security properties of the long-lived timestamping protocol of Haber and Kamat.
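To make the resource model concrete, here is a minimal Python sketch, assuming invented bounds (k**2 work steps per unit of real time, k**3 cells of space); it illustrates the discipline described above and is not the paper's formalism.

    # Hypothetical sketch of the long-lived resource discipline: real time is
    # unbounded, but each unit of real time grants only poly(k) work, and
    # dynamically allocated space stays poly(k)-bounded. The concrete bounds
    # (k**2 work, k**3 space) are invented for illustration.

    class ResourceExceeded(Exception):
        """Raised when the per-round work bound or the space bound is overstepped."""

    class LongLivedEntity:
        def __init__(self, k):
            self.k = k                       # security parameter
            self.work_bound = k ** 2         # work allowed per unit of real time
            self.space_bound = k ** 3        # bound on dynamically allocated space
            self.work_left = self.work_bound
            self.state = {}                  # dynamically allocated state

        def charge(self, steps=1):
            """Account for computation performed in the current time unit."""
            self.work_left -= steps
            if self.work_left < 0:
                raise ResourceExceeded("work budget for this time unit exhausted")

        def store(self, key, value):
            """Allocate state; space remains polynomially bounded for all time."""
            self.charge()
            self.state[key] = value
            if len(self.state) > self.space_bound:
                raise ResourceExceeded("space bound exceeded")

        def advance_time(self):
            """A new unit of real time: work is replenished, state persists."""
            self.work_left = self.work_bound

Because advance_time can be called unboundedly often, an adversary modeled this way accumulates super-polynomially many interactions in total, which is precisely why security cannot simply be inherited from the standard polynomial-time setting.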

2007 (EPRINT): On the Role of Scheduling in Simulation-Based Security
In a series of papers, Küsters et al. investigated the relationships between various notions of simulation-based security. Two main factors, the placement of a "master process" and the existence of "forwarder processes", were found to affect the relationship between different definitions. In this extended abstract, we add a new dimension to the analysis of simulation-based security, namely, the scheduling of concurrent processes. We show that, when we move from sequential scheduling (as used in previous studies) to task-based nondeterministic scheduling, the same syntactic definition of security gives rise to incomparable semantic notions of security. Under task-based scheduling, the hierarchy based on placement of "master process" is no longer relevant, because no such designation is necessary to obtain meaningful runs of a system. On the other hand, the existence of "forwarder processes" remains an important factor.
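The contrast between the two regimes can be sketched in a few lines of Python; everything below (the Process class, the task functions, both scheduler loops) is invented for illustration and is not the formalism of the paper.

    # Invented toy model: a process exposes named tasks; applying a task either
    # performs a step (returns True) or is not enabled in the current state.

    class Process:
        def __init__(self, name, tasks):
            self.name = name
            self.tasks = tasks        # task name -> zero-argument step function

        def perform(self, task):
            step = self.tasks.get(task)
            return bool(step()) if step else False

    def run_sequential(processes, master, rounds=3):
        # Sequential scheduling: processes are activated in a fixed order, and
        # control falls to the designated master whenever no other process moved.
        for _ in range(rounds):
            moved = False
            for p in processes:
                for task in list(p.tasks):
                    moved = p.perform(task) or moved
            if not moved:
                for task in list(master.tasks):
                    master.perform(task)

    def run_task_based(processes, schedule):
        # Task-based scheduling: an external task sequence resolves the
        # nondeterminism; no master designation is needed for a meaningful run.
        for task in schedule:
            for p in processes:
                p.perform(task)

The second loop makes the abstract's point visible: the run is driven entirely by the task schedule, so the placement of a master process, which structured the hierarchy under sequential scheduling, plays no role.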

2007 (EPRINT): How to Model Bounded Computation in Long-Lived Systems
In most interesting cases, the security of cryptographic protocols relies on the assumption that adversarial entities have limited computational power, and it is generally accepted that security degrades progressively over time. However, some cryptographic services (e.g., time-stamping services or digital archives) are long-lived in nature; that is, their lifetime need not be bounded by a polynomial. In such cases, it is impossible to guarantee security in the traditional sense: even information-theoretically secure protocols can fail if the attacker is given sufficient run time. This work proposes a new paradigm for long-lived computation, where computational restrictions are stated in terms of space and processing rates. In this setting, entities may live for an unbounded amount of real time, subject to the condition that only a polynomial amount of work can be done per unit real time. Moreover, the space used by these entities is allocated dynamically and must be polynomially bounded. We propose a key notion of approximate implementation, which is an adaptation of computational indistinguishability to the long-lived setting. We show that approximate implementation is preserved under polynomial parallel composition, and under exponential sequential composition. This provides core foundations for an exciting new area, namely, the analysis of long-lived cryptographic systems.
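The abstract leaves the definition implicit, but an adaptation of computational indistinguishability to windows of real time plausibly takes the following shape; this is a hedged reconstruction, and the window polynomial q, the environment Env, and the quantifier structure are assumptions rather than the paper's exact statement.

    % Hedged sketch, not the paper's exact definition: Real approximately
    % implements Ideal if, in every polynomially bounded window of real time,
    % no admissible environment distinguishes them with noticeable advantage.
    \[
      \forall t \ge 0:\quad
      \Bigl|\Pr\bigl[\mathrm{Env}(\mathrm{Real})_{[t,\,t+q(k)]} = 1\bigr]
          - \Pr\bigl[\mathrm{Env}(\mathrm{Ideal})_{[t,\,t+q(k)]} = 1\bigr]\Bigr|
      \le \varepsilon(k),
    \]
    % where q is a polynomial, epsilon is negligible in the security
    % parameter k, and Env ranges over environments obeying the same
    % per-unit-time work and space bounds as the systems themselves.

Read this way, the composition results assert that this window-wise guarantee survives polynomially many parallel instances and exponentially many sequential ones.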

2005 (EPRINT): Using Probabilistic I/O Automata to Analyze an Oblivious Transfer Protocol
The Probabilistic I/O Automata framework of Lynch, Segala and Vaandrager provides tools for precisely specifying protocols and reasoning about their correctness using multiple levels of abstraction, based on implementation relationships between these levels. We enhance this framework to allow analyzing protocols that use cryptographic primitives. This requires resolving and reconciling issues such as nondeterministic behavior and scheduling, randomness, resource-bounded computation, and computational hardness assumptions. The enhanced framework allows for more rigorous and systematic analysis of cryptographic protocols. To demonstrate the use of this framework, we present an example analysis of an Oblivious Transfer protocol.
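As a rough picture of the kind of object the framework manipulates, here is a heavily simplified probabilistic automaton in Python; the class name, structure, and coin example are invented miniatures, not the PIOA definition of Lynch, Segala and Vaandrager.

    import random

    # Invented miniature of a probabilistic automaton: states, actions, and a
    # transition function mapping (state, action) to a distribution over states.

    class TinyPIOA:
        def __init__(self, start, delta):
            self.state = start
            self.delta = delta    # (state, action) -> {next_state: probability}

        def enabled_actions(self):
            return [a for (s, a) in self.delta if s == self.state]

        def step(self, action):
            dist = self.delta[(self.state, action)]
            states = list(dist)
            weights = [dist[s] for s in states]
            self.state = random.choices(states, weights=weights)[0]

    # Example: a one-shot fair coin that then announces its outcome.
    coin = TinyPIOA(
        start="init",
        delta={
            ("init", "flip"): {"heads": 0.5, "tails": 0.5},
            ("heads", "say"): {"done": 1.0},
            ("tails", "say"): {"done": 1.0},
        },
    )
    coin.step("flip")
    coin.step("say")

Choosing which enabled action fires next is the scheduler's job, and constraining that choice alongside randomness and resource bounds is exactly the reconciliation the abstract describes.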

1981 (CRYPTO)