International Association for Cryptologic Research

IACR News

Here you can see all recent updates to the IACR webpage. These updates are also available via RSS feed, Twitter, Weibo, and Facebook.

16 February 2017

Kexin Qiao, Ling Song, Meicheng Liu, Jian Guo
ePrint Report
In this paper, we focus on collision attacks against the Keccak hash function family and some of its variants. Following the framework developed by Dinur et al. at FSE 2012, where 4-round collisions were found by combining 3-round differential trails and 1-round connectors, we extend the connectors one round further and hence achieve collision attacks for up to 5 rounds. The extension is possible thanks to the large degree of freedom of the wide internal state. By linearizing all S-boxes of the first round, the problem of finding solutions of 2-round connectors is converted to that of solving a system of linear equations. However, due to the quick reduction of freedom caused by the linearization, the system has solutions only when the 3-round differential trails satisfy some additional conditions. We develop a dedicated differential trail search strategy and find that such special differentials indeed exist. As a result, the first practical collision attack against 5-round SHAKE128 and two 5-round instances of the Keccak collision challenges are found, with real examples. We also give the first results against 5-round Keccak-224 and 6-round Keccak collision challenges. It is remarked that the work here is still far from threatening the security of the full 24-round Keccak family.
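The final step of the connector construction is generic linear algebra: once the first-round S-boxes are linearized, finding a conforming message reduces to solving a linear system over GF(2). The sketch below is only an illustration of that step on a toy system (not the equations arising from Keccak's linearized S-boxes): Gaussian elimination over GF(2) either returns a solution satisfying all constraints or reports that none exists.

```python
# Minimal sketch: Gaussian elimination over GF(2), the kind of solver one would
# run on the linear constraints produced by linearizing the first-round S-boxes.
# The toy system below is illustrative only, not taken from the paper.

def gf2_solve(A, b):
    """Solve A*x = b over GF(2). A is a list of 0/1 rows, b a list of 0/1.
    Returns one solution (free variables set to 0), or None if inconsistent."""
    rows, cols = len(A), len(A[0])
    M = [A[i][:] + [b[i]] for i in range(rows)]   # augmented copy
    pivot_cols, r = [], 0
    for c in range(cols):
        pivot = next((i for i in range(r, rows) if M[i][c]), None)
        if pivot is None:
            continue
        M[r], M[pivot] = M[pivot], M[r]
        for i in range(rows):                     # eliminate column c everywhere else
            if i != r and M[i][c]:
                M[i] = [x ^ y for x, y in zip(M[i], M[r])]
        pivot_cols.append(c)
        r += 1
    # Inconsistent if a zero row has a non-zero right-hand side.
    if any(all(v == 0 for v in M[i][:cols]) and M[i][cols] for i in range(rows)):
        return None
    x = [0] * cols
    for i, c in enumerate(pivot_cols):
        x[c] = M[i][cols]
    return x

# Toy system: 3 equations in 4 unknowns, leaving degrees of freedom over.
A = [[1, 1, 0, 1],
     [0, 1, 1, 0],
     [1, 0, 1, 1]]
b = [1, 0, 1]
x = gf2_solve(A, b)
assert x is not None and all(sum(a * v for a, v in zip(row, x)) % 2 == rhs
                             for row, rhs in zip(A, b))
print("one solution:", x)
```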
Prabhanjan Ananth, Aayush Jain, Amit Sahai
ePrint Report
Indistinguishability Obfuscation (iO) has enabled an incredible number of new and exciting applications. However, our understanding of how to actually build secure iO remains in its infancy. While many candidate constructions have been published, some have been broken, and it is unclear which of the remaining candidates are secure.

This work deals with the following basic question: \emph{Can we hedge our bets when it comes to iO candidates?} In other words, if we have a collection of iO candidates, and we only know that at least one of them is secure, can we still make use of these candidates?

This topic was recently studied by Ananth, Jain, Naor, Sahai, and Yogev [CRYPTO 2016], who showed how to construct a robust iO combiner: Specifically, they showed that given the situation above, we can construct a single iO scheme that is secure as long as (1) at least one candidate iO scheme is a subexponentially secure iO, and (2) either the subexponential DDH or LWE assumptions hold.

In this work, we make three contributions:

* (Better robust iO combiners.) First, we work to improve the assumptions needed to obtain the same result as Ananth et al.: namely, we show how to replace the DDH/LWE assumption with the assumption that subexponentially secure one-way functions exist.

* (Transforming combiners from iO to FE and NIKE.) Second, we consider a broader question: what if we start with several iO candidates where only one works, but we don't care about achieving iO itself; rather, we want to achieve concrete applications of iO? In this case, we are able to work with the minimal assumption of just polynomially secure one-way functions, and where the working iO candidate only achieves polynomial security. We call such combiners transforming combiners. More generally, a transforming combiner from primitive A to primitive B is one that takes as input many candidates of primitive A, out of which we are guaranteed that at least one is secure, and outputs a secure candidate of primitive B. We can correspondingly define robust transforming combiners. We present transforming combiners from indistinguishability obfuscation to functional encryption and non-interactive multiparty key exchange (NIKE).

* (Correctness amplification for iO from polynomial security and one-way functions.) Finally, along the way, we obtain a result of independent interest: recently, Bitansky and Vaikuntanathan [TCC 2016] showed how to amplify the correctness of an iO scheme, but they needed subexponential security for the iO scheme and also required subexponentially secure DDH or LWE. We show how to achieve the same correctness amplification result, requiring only polynomial security from the iO scheme and assuming only polynomially secure one-way functions.
Seny Kamara, Tarik Moataz
ePrint Report
Recent work on searchable symmetric encryption (SSE) has focused on increasing its expressiveness. A notable example is the OXT construction (Cash et al., CRYPTO '13) which is the first SSE scheme to support conjunctive keyword queries with sub-linear search complexity. While OXT efficiently supports disjunctive and boolean queries that can be expressed in searchable normal form, it can only handle arbitrary disjunctive and boolean queries in linear time. This motivates the problem of designing expressive SSE schemes with worst-case sub-linear search; that is, schemes that remain highly efficient for any keyword query.

In this work, we address this problem and propose non-interactive highly efficient SSE schemes that handle arbitrary disjunctive and boolean queries with worst-case sub-linear search and optimal communication complexity. Our main construction, called IEX, makes black-box use of an underlying single keyword SSE scheme which we can instantiate in various ways. Our first instantiation, IEX-2Lev, makes use of the recent 2Lev construction (Cash et al., NDSS '14) and is optimized for search at the expense of storage overhead. Our second instantiation, IEX-ZMF, relies on a new single keyword SSE scheme we introduce called ZMF and is optimized for storage overhead at the expense of efficiency (while still achieving asymptotically sub-linear search). Our ZMF construction is the first adaptively-secure highly compact SSE scheme and may be of independent interest. At a very high level, it can be viewed as an encrypted version of a new Bloom filter variant we refer to as a Matryoshka filter. In addition, we show how to extend IEX to be dynamic and forward-secure.

To evaluate the practicality of our schemes, we designed and implemented a new encrypted search framework called Clusion. Our experimental results demonstrate the practicality of IEX and of its instantiations with respect to both search (for IEX-2Lev) and storage overhead (for IEX-ZMF).
Payman Mohassel, Mike Rosulek
ePrint Report
In cut-and-choose protocols for two-party secure computation (2PC) the main overhead is the number of garbled circuits that must be sent. Recent work (Lindell, Riva; Huang et al., Crypto 2014) has shown that in a batched setting, when the parties plan to evaluate the same function $N$ times, the number of garbled circuits per execution can be reduced by a $O(\log N)$ factor compared to the single-execution setting. This improvement is significant in practice: an order of magnitude for $N$ as low as one thousand. Besides the number of garbled circuits, communication round trips are another significant performance bottleneck. Afshar et al. (Eurocrypt 2014) proposed an efficient cut-and-choose 2PC that is round-optimal (one message from each party), but in the single-execution setting.

In this work we present new malicious-secure 2PC protocols that are round-optimal and also take advantage of batching to reduce cost. Our contributions include:

* A 2-message protocol for batch secure computation ($N$ instances of the same function). The number of garbled circuits is reduced by a $O(\log N)$ factor over the single-execution case. However, other aspects of the protocol that depend on the input/output size of the function do not benefit from the same $O(\log N)$-factor savings.

* A 2-message protocol for batch secure computation, in the random oracle model. All aspects of this protocol benefit from the $O(\log N)$-factor improvement, except for small terms that do not depend on the function being evaluated.

* A protocol in the offline/online setting. After an offline preprocessing phase that depends only on the function $f$ and $N$, the parties can securely evaluate $f$, $N$ times (not necessarily all at once). Our protocol's online phase is only 2 messages, and the total online communication is only $\ell + O(\kappa)$ bits, where $\ell$ is the input length of $f$ and $\kappa$ is a computational security parameter. This is only $O(\kappa)$ bits more than the information-theoretic lower bound for malicious 2PC.
Sanjam Garg, Susumu Kiyoshima, Omkant Pandey
ePrint Report
The round complexity of secure computation has been a fundamental problem in cryptography. Katz and Ostrovsky proved that 5 rounds are both necessary and sufficient for secure computation in the stand-alone setting, thus resolving the exact round complexity of stand-alone secure computation.

In contrast, the round complexity of secure computation in the concurrent setting, where several protocols may run simultaneously, is poorly understood. Since standard polynomial-time simulation is impossible in the concurrent setting, alternative security notions have been proposed, e.g., super-polynomial simulation (SPS). While SPS security can be achieved in constant rounds, the actual constant (> 20) is far from optimal.

In this work, we take the first steps towards studying the exact round complexity of concurrent secure computation. We focus on the two-party case and present a new secure computation protocol that achieves SPS security under concurrent self-composition. Our protocol has 5 rounds assuming quasi-polynomially-hard injective one-way functions (or 7 rounds assuming standard polynomially-hard collision-resistant hash functions). We also require other standard assumptions, specifically trapdoor OWPs and lossy TDFs. This matches the round complexity of stand-alone secure computation.

More specifically, our security proof presents a polynomial time reduction from SPS security to 3-round public-coin non-malleable commitments with appropriate extractability properties. Such commitments are known based on quasi-polynomially-hard injective OWFs. (The reduction also works with a special 6-round non-malleable commitment to yield the 7-round result under CRHFs.)
Rishab Goyal, Venkata Koppula, Brent Waters
ePrint Report
A public key encryption scheme is said to be n-circular secure if no PPT adversary can distinguish between encryptions of a length-n key cycle and n encryptions of zero.
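For concreteness, the sketch below spells out the two distributions in the n-circular security experiment. The instantiation with textbook ElGamal over a toy Mersenne-prime group, with secret keys encrypted "in the exponent", is an illustrative assumption of the sketch, not a scheme from this work.

```python
import random

# Runnable sketch of the n-circular security experiment. The ElGamal instantiation
# and toy parameters are assumptions made for illustration only (far too small for
# real use, and not a construction from this paper).

P = 2**61 - 1     # toy Mersenne prime modulus
G = 3

def keygen():
    sk = random.randrange(2, P - 1)
    return pow(G, sk, P), sk                    # (pk, sk)

def enc(pk, m):                                 # ElGamal, message in the exponent
    r = random.randrange(2, P - 1)
    return pow(G, r, P), (pow(G, m, P) * pow(pk, r, P)) % P

def circular_challenge(n):
    keys = [keygen() for _ in range(n)]
    b = random.randrange(2)
    if b == 0:    # world 0: a length-n key cycle, sk_{i+1 mod n} encrypted under pk_i
        cts = [enc(keys[i][0], keys[(i + 1) % n][1]) for i in range(n)]
    else:         # world 1: the same public keys, but every ciphertext encrypts zero
        cts = [enc(keys[i][0], 0) for i in range(n)]
    return b, [pk for pk, _ in keys], cts       # the adversary sees the pks and the cts

b, pks, cts = circular_challenge(4)
print("hidden bit:", b, "number of ciphertexts:", len(cts))
```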

One interesting question is whether circular security comes for free from IND-CPA security. Recent works have addressed this question, showing that for all integers n, there exists an IND-CPA scheme that is not n-circular secure. However, this leaves open the possibility that for every IND-CPA cryptosystem, there exists a cycle length l, dependent on the cryptosystem (and the security parameter), such that the scheme is l-circular secure. If this were true, it would directly lead to many applications; in particular, it would give us a fully homomorphic encryption scheme via Gentry’s bootstrapping.

In this work, we show that this is not true. Assuming indistinguishability obfuscation and leveled homomorphic encryption, we construct an IND-CPA scheme such that, for all cycle lengths l, the scheme is not l-circular secure.
Vadim Lyubashevsky, Gregory Neven
ePrint Report
Verifiable encryption allows one to prove properties about encrypted data and is an important building block in the design of cryptographic protocols, e.g., group signatures, key escrow, fair exchange protocols, etc. Existing lattice-based verifiable encryption schemes, and even just proofs of knowledge of the encrypted data, require parallel composition of proofs to reduce the soundness error, resulting in proof sizes that are only truly practical when amortized over a large number of ciphertexts.

In this paper, we present a new construction of a verifiable encryption scheme, based on the hardness of the Ring-LWE problem in the random-oracle model, for short solutions to linear equations over polynomial rings. Our scheme is "one-shot", in the sense that a single instance of the proof already has negligible soundness error, yielding compact proofs even for individual ciphertexts. Whereas verifiable encryption usually guarantees that decryption can recover a witness for the original language, we relax this requirement to decrypt a witness of a related but extended language. This relaxation is sufficient for many applications and we illustrate this with example usages of our scheme in key escrow and verifiably encrypted signatures.

One of the interesting aspects of our construction is that the decryption algorithm is probabilistic and uses the proof as input (rather than using only the ciphertext). The decryption time for honestly-generated ciphertexts only depends on the security parameter, while the expected running time for decrypting an adversarially-generated ciphertext is directly related to the number of random-oracle queries of the adversary who created it. This property suffices in most practical scenarios, especially in situations where the ciphertext proof is part of an interactive protocol, where the decryptor is substantially more powerful than the adversary, or where adversaries can otherwise be discouraged from submitting malformed ciphertexts.
David Kohel
ePrint Report
We introduce the twisted $\mu_4$-normal form for elliptic curves, deriving in particular addition algorithms with complexity $9M + 2S$ and doubling algorithms with complexity $2M + 5S + 2m$ over a binary field. Every ordinary elliptic curve over a finite field of characteristic 2 is isomorphic to one in this family. This improvement to the addition algorithm is comparable to the $7M + 2S$ achieved for the $\mu_4$-normal form, and replaces the previously best known complexity of $13M + 3S$ on L\'opez-Dahab models applicable to these twisted curves. The derived doubling algorithm is essentially optimal, without any assumption of special cases. We show moreover that the Montgomery scalar multiplication with point recovery carries over to the twisted models, giving symmetric scalar multiplication adapted to protect against side-channel attacks, with a cost of $4M + 4S + 1m_t + 2m_c$ per bit. In characteristic different from 2, we establish a linear isomorphism with the twisted Edwards model. This work, in complement to the introduction of $\mu_4$-normal form, fills the lacuna in the body of work on efficient arithmetic on elliptic curves over binary fields, explained by this common framework for elliptic curves in $\mu_4$-normal form in any characteristic. The improvements are analogous to those which the Edwards and twisted Edwards models achieved for elliptic curves over finite fields of odd characteristic and extend $\mu_4$-normal form to cover the binary NIST curves.
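To put the stated operation counts side by side, the snippet below simply tallies them under assumed relative weights for squarings and constant multiplications; the weights are illustrative assumptions of this sketch (in binary fields squarings and constant multiplications are typically much cheaper than general multiplications), not figures from the paper.

```python
# Rough cost comparison of the stated per-operation counts. M, S and m are a general
# field multiplication, a squaring, and a multiplication by a constant; the relative
# weights below are assumptions chosen only to make the comparison concrete.

M, S, m = 1.0, 0.2, 0.2          # assumed relative costs (illustrative only)

formulas = {
    "twisted mu_4 addition (this paper)":  9 * M + 2 * S,
    "mu_4-normal-form addition":           7 * M + 2 * S,
    "Lopez-Dahab addition (prev. best)":  13 * M + 3 * S,
    "twisted mu_4 doubling (this paper)":  2 * M + 5 * S + 2 * m,
}
for name, cost in formulas.items():
    print(f"{name:38s} ~ {cost:.1f} M-equivalents")

# Montgomery-ladder scalar multiplication with point recovery: 4M + 4S + 1m_t + 2m_c
# per scalar bit (m_t and m_c are constant multiplications, both weighted as m here).
per_bit = 4 * M + 4 * S + 1 * m + 2 * m
bits = 233                        # e.g. a 233-bit binary-field curve
print(f"ladder, {bits}-bit scalar: ~ {per_bit * bits:.0f} M-equivalents")
```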
Rishab Goyal, Venkata Koppula, Brent Waters
ePrint Report
In this work we separate private-key semantic security from circular security using the Learning with Error assumption. Prior works used the less standard assumptions of multilinear maps or indistinguishability obfuscation. To achieve our results we develop new techniques for obliviously evaluating branching programs.
Christopher Portmann
ePrint Report
We show that the family of quantum authentication protocols introduced in [Barnum et al., FOCS 2002] can be used to construct a secure quantum channel and additionally recycle all of the secret key if the message is successfully authenticated, and recycle part of the key if the message has been tampered with. We give a full security proof that constructs the secure channel given only insecure noisy channels and a shared secret key. We also prove that the number of recycled key bits is optimal for this family of protocols, i.e., there exists an adversarial strategy to obtain all non-recycled bits. Previous works recycled less key and only gave partial security proofs, since they did not consider all possible distinguishers (environments) that may be used to distinguish the real setting from the ideal secure quantum channel.

14 February 2017

Lorenzo Grassi, Christian Rechberger, and Sondre Rønjom
ePrint Report
AES is probably the most widely studied and used block cipher. Versions with a reduced number of rounds are also used as a building block in many cryptographic schemes; e.g., several candidates of the CAESAR competition are based on it.

So far, non-random properties which are independent of the secret key are known for up to 4 rounds of AES. These include differential, impossible differential, and integral properties.

In this paper we describe a \emph{new structural property for up to 5 rounds of AES}, differential in nature, which is independent of the secret key, of the details of the MixColumns matrix (with the exception that the branch number must be maximal), and of the SubBytes operation. It is very simple: by appropriate choices of difference for a number of input pairs, it is possible to make sure that the number of times the difference of the resulting output pairs lies in a particular subspace is \emph{always} a multiple of 8.

We not only observe this property experimentally (using a small-scale version of AES); we also give a detailed proof as to why it has to exist. As a first application of this property, we describe a way to distinguish the 5-round AES permutation (or its inverse) from a random permutation with only $2^{32}$ chosen texts, a computational cost of $2^{35.6}$ look-ups into a memory of size $2^{36}$ bytes, and a success probability greater than 99%.
Zhaohui Cheng
ePrint Report
SM9 is an official Chinese cryptography standard that defines a set of identity-based cryptographic schemes from pairings. This report describes the technical specification of SM9 as a reference for those practitioners who have difficulty accessing the Chinese version of the standard.
Vincent Grosso, François-Xavier Standaert
ePrint Report
Evaluating the security level of a leaking implementation against side-channel attacks is a challenging task. This is especially true when countermeasures such as masking are implemented, since in this case: (i) the amount of measurements to perform a key recovery may become prohibitive for certification laboratories, and (ii) applying optimal (multivariate) attacks may be computationally intensive and technically challenging. In this paper, we show that by taking advantage of the tightness of masking security proofs, we can significantly simplify this evaluation task in a very general manner. More precisely, we show that the evaluation of a masked implementation can essentially be reduced to that of an unprotected implementation. In addition, we show that although optimal attacks against masking schemes are computationally intensive for a large number of shares, heuristic (soft analytical side-channel) attacks can approach optimality very efficiently. As part of this second contribution, we also improve over the recent multivariate (aka horizontal) side-channel attacks proposed at CHES 2016 by Battistello et al.
Sietse Ringers, Eric Verheul, Jaap-Henk Hoepman
ePrint Report
An attribute-based credential scheme allows a user, given a set of attributes, to prove ownership of these attributes to a verifier, voluntarily disclosing some of them while keeping the others secret. A number of such schemes exist, of which some additionally provide unlinkability: that is, when the same attributes were disclosed in two transactions, it is not possible to tell if one and the same or two different credentials were involved. Recently full-fledged implementations of such schemes on smart cards have emerged; however, these need to compromise the security level to achieve reasonable transaction speeds. In this paper we present a new unlinkable attribute-based credential scheme with a full security proof, using a known hardness assumption in the standard model. Defined on elliptic curves, the scheme involves bilinear pairings but only on the verifier's side, making it very efficient both in terms of speed and size on the user's side.
Itay Berman, Ron D. Rothblum, Vinod Vaikuntanathan
ePrint Report
Interactive proofs of proximity (Ergun, Kumar and Rubinfeld, Information & Computation, 2004; Rothblum, Vadhan and Wigderson, STOC 2013), or IPPs, are interactive proofs in which the verifier runs in time sub-linear in the input's length. Since the verifier cannot even read the entire input, following the property testing literature, the requirement is that she accepts inputs that are in the language and rejects ones that are far from the language. However, these proofs could (and in many cases, do) betray considerable global information about the input to the verifier.

In this work, we initiate the study of zero-knowledge proofs of proximity (ZKPP). A ZKPP convinces a sub-linear time verifier while ensuring that she learns nothing more than a few locations of the input (and the fact that the input is ``close'' to the language).

Our main focus is the setting of statistical zero-knowledge where we show that the following hold unconditionally (where $N$ denotes the input size):

* Statistical ZKPPs can be sub-exponentially more efficient than property testers (or even non-interactive IPPs): We show a natural property which has a statistical ZKPP with a polylog(N) time verifier, but requires $\Omega(\sqrt{N})$ queries (and hence also runtime) for every property tester.

* Statistical ZKPPs can be sub-exponentially less efficient than IPPs: We show a property which has an IPP with a polylog(N) time verifier, but cannot have a statistical ZKPP with even an $N^{o(1)}$ time verifier.

* Statistical ZKPPs exist for some graph-based properties, such as promise versions of expansion and bipartiteness.

Lastly, we also consider the computational setting, where we show that:

1. Assuming the existence of one-way functions, every language computable either in (logspace uniform) NC or in SC has a computational ZKPP with a (roughly) $\sqrt{N}$ time verifier.

2. Assuming the existence of collision-resistant hash functions, every language in NP has a statistical zero-knowledge argument of proximity with a polylog(N) verifier.
Pei Luo, Konstantinos Athanasiou, Yunsi Fei, Thomas Wahl
ePrint Report
This paper presents an efficient algebraic fault analysis on all four modes of SHA-3 under relaxed fault models. This is the first work to apply algebraic techniques to fault analysis of SHA-3. Results show that algebraic fault analysis on SHA-3 is very efficient and effective due to the clear algebraic properties of the Keccak operations. Compared with previous work on differential fault analysis of SHA-3, algebraic fault analysis can identify the injected faults with much higher rates, and recover an entire internal state of the penultimate round with far fewer fault injections.
Xavier Bultel, Pascal Lafourcade
ePrint Report
Zero-knowledge proxy re-identification (ZK-PRI) was introduced by Blaze et al. in 1998, together with two other well-known primitives of recryptography, namely proxy re-encryption (PRE) and proxy re-signature (PRS). A ZK-PRI allows a proxy to transform an identification protocol for Alice into an identification protocol for Bob using a re-proof key. PRE and PRS have been largely studied in the last decade, but surprisingly, no results about ZK-PRI have been published since the pioneering paper of Blaze et al. We first show the insecurity of this scheme: just by observing the communications, Alice can deduce Bob’s secret key. Then we give (i) definitions of the different families of ZK-PRI (bidirectional/unidirectional and interactive/non-interactive), (ii) a formal security model for these primitives, and (iii) a concrete construction for each family. Moreover, we show that ZK-PRI can be used to manage the access policy to several services that require public key authentication.
Jonathan Burns, Daniel Moore, Katrina Ray, Ryan Speers, Brian Vohaska
ePrint Report
We introduce a secure elliptic curve oblivious pseudorandom function (EC-OPRF) which operates by hashing strings onto an elliptic curve to provide a simple and efficient mechanism for computing an oblivious pseudorandom function (OPRF). The EC-OPRF protocol enables a semi-trusted server to receive a set of cryptographically masked elliptic curve points from a client, secure those points with a private key, and return the resulting set to the client for unmasking. We also introduce extensions and generalizations to this scheme, including a novel mechanism that provides forward secrecy, and discuss the security and computation complexity for each variant. Benchmark tests for the implementations of the EC-OPRF protocol and one of its variants are provided, along with test vectors for the original protocol.
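The blinded-exponentiation round trip behind an OPRF of this kind can be sketched in a few lines. The version below is only an illustration of the mask/evaluate/unmask pattern: it substitutes a multiplicative group modulo a Mersenne prime and a toy hash-to-group map for the paper's elliptic-curve group and hash-to-curve, so it is an assumption-laden stand-in rather than the EC-OPRF itself.

```python
import hashlib
from math import gcd
from secrets import randbelow

# Sketch of the blind-exponentiation OPRF pattern. The group (mod 2^127 - 1) and the
# hash-to-group map are toy stand-ins assumed for illustration, not the EC construction.

P = 2**127 - 1          # prime modulus of the stand-in group (not an elliptic curve)
ORDER = P - 1           # exponents live modulo the group order

def hash_to_group(data: bytes) -> int:
    """Toy hash-to-group: map a string to an element of Z_P^* other than 0 or 1."""
    h = int.from_bytes(hashlib.sha256(data).digest(), "big")
    return h % (P - 2) + 2

def client_blind(data: bytes):
    """Client masks H(data) with a random invertible exponent r."""
    while True:
        r = randbelow(ORDER - 2) + 2
        if gcd(r, ORDER) == 1:
            break
    return r, pow(hash_to_group(data), r, P)       # the masked element is sent out

def server_evaluate(masked: int, k: int) -> int:
    """Semi-trusted server applies its secret key k to the masked element."""
    return pow(masked, k, P)

def client_unblind(evaluated: int, r: int) -> int:
    """Client strips the mask; the result is H(data)^k, the OPRF output."""
    return pow(evaluated, pow(r, -1, ORDER), P)

# One round trip: the server never sees `data`, the client never learns `k`.
k = randbelow(ORDER - 2) + 2
r, masked = client_blind(b"some input string")
output = client_unblind(server_evaluate(masked, k), r)
assert output == pow(hash_to_group(b"some input string"), k, P)
print("OPRF output:", hex(output))
```

The forward-secrecy extension and the benchmarked variants described in the abstract are not captured by this stand-in.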
Patrick McCorry, Siamak F. Shahandashti, Feng Hao
ePrint Report
We present the first implementation of a decentralised and self-tallying internet voting protocol with maximum voter privacy using the Blockchain. The Open Vote Network is suitable for boardroom elections and is written as a smart contract for Ethereum. Unlike previously proposed Blockchain e-voting protocols, this is the first implementation that does not rely on any trusted authority to compute the tally or to protect the voter’s privacy. Instead, the Open Vote Network is a self-tallying protocol, and each voter is in control of the privacy of their own vote such that it can only be breached by a full collusion involving all other voters. The execution of the protocol is enforced using the consensus mechanism that also secures the Ethereum blockchain. We tested the implementation on Ethereum’s official test network to demonstrate its feasibility. Also, we provide a financial and computational breakdown of its execution cost.
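The self-tallying property at the heart of the Open Vote Network can be illustrated without the smart-contract layer. The sketch below uses a toy group and omits the zero-knowledge proofs that make the real protocol robust, so it is purely illustrative and not the deployed Ethereum implementation; it shows how the round-two blinding factors cancel so that anyone can compute the tally from public values alone.

```python
from secrets import randbelow

# Sketch of the self-tallying core of a two-round Open-Vote-Network-style election.
# The toy group (mod 2^127 - 1, generator 3), the absence of zero-knowledge proofs,
# and the single-process simulation are assumptions of this sketch.

P, G = 2**127 - 1, 3
Q = P - 1                                   # exponents are reduced modulo the group order

def run_election(votes):                    # votes: one 0/1 value per voter
    n = len(votes)
    x = [randbelow(Q - 2) + 2 for _ in range(n)]        # round 1: secret voting keys ...
    gx = [pow(G, xi, P) for xi in x]                     # ... and their public parts g^x_i

    def g_y(i):
        # Voter i computes g^{y_i} = (prod_{j<i} g^{x_j}) / (prod_{j>i} g^{x_j})
        # from the published keys alone.
        num = den = 1
        for j in range(n):
            if j < i:
                num = (num * gx[j]) % P
            elif j > i:
                den = (den * gx[j]) % P
        return (num * pow(den, -1, P)) % P

    # Round 2: each voter publishes g^{x_i y_i} * g^{v_i}. The blinding factors cancel
    # in the product because sum_i x_i * y_i = 0.
    ballots = [(pow(g_y(i), x[i], P) * pow(G, votes[i], P)) % P for i in range(n)]
    tally_point = 1
    for ballot in ballots:
        tally_point = (tally_point * ballot) % P
    # Anyone can tally: the product equals g^{sum v_i}, and the exponent is at most n.
    return next(t for t in range(n + 1) if pow(G, t, P) == tally_point)

votes = [1, 0, 1, 1, 0]
assert run_election(votes) == sum(votes)
print("yes votes:", run_election(votes))
```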
Yevgeniy Dodis, Dario Fiore
ePrint Report
Key Exchange (KE), which enables two parties (e.g., a client and a server) to securely establish a common private key while communicating over an insecure channel, is one of the most fundamental cryptographic primitives. In this work, we address the setting of {\em unilaterally-authenticated key exchange} (UAKE), where an unauthenticated (unkeyed) client establishes a key with an authenticated (keyed) server. This setting is highly motivated by many practical uses of KE on the Internet, but has received relatively little attention so far.

Unlike prior work, which defines UAKE by downgrading a relatively complex definition of {\em mutually authenticated} key exchange (MAKE), our definition follows the opposite approach of upgrading existing definitions of public key encryption (PKE) and signatures towards UAKE. As a result, our new definition is short and easy to understand. Nevertheless, we show that it is {\em equivalent} to the UAKE definition of Bellare-Rogaway (when downgraded from MAKE), and thus captures a very strong and widely adopted security notion, while looking very similar to the simple ``one-oracle'' definition of traditional PKE/signature schemes. As a benefit of our intuitive framework, we show two {\em exactly-as-you-expect} (i.e., having none of the caveats so abundant in the KE literature!) UAKE protocols from (possibly interactive) signature and encryption. By plugging in various one- or two-round signature and encryption schemes, we derive provably-secure variants of various well-known UAKE protocols (such as a unilateral variant of SKEME with and without perfect forward secrecy, and Shoup's A-DHKE-1), as well as new protocols, such as the first $2$-round UAKE protocol which is both (passively) forward deniable and forward-secure.

To further clarify the intuitive connections between PKE/Signatures and UAKE, we define and construct stronger forms of (necessarily interactive) PKE/Signature schemes, called {\em confirmed encryption} and {\em confidential authentication}, which, respectively, allow the sender to obtain confirmation that the (keyed) receiver output the correct message, or to hide the content of the message being authenticated from anybody but the participating (unkeyed) receiver. Using confirmed PKE/confidential authentication, we obtain two concise UAKE protocols of the form: ``send confirmed encryption/confidential authentication of a random key $K$.''
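One natural way to read the ``send a confirmed encryption of a random key $K$'' template is sketched below. The hashed-Diffie-Hellman encryption of $K$, the toy group, and the MAC-based confirmation are assumptions of this sketch chosen for brevity; they are not the confirmed-encryption or confidential-authentication schemes constructed in the paper.

```python
import hashlib, hmac
from secrets import randbelow

# Sketch of a 2-message unilaterally-authenticated key exchange: only the server holds
# a long-term key, and it confirms receipt of the session key with a MAC over the
# transcript. The hashed-DH "encryption" and the toy group (mod 2^127 - 1) are
# assumptions of this sketch, not the paper's constructions.

P, G = 2**127 - 1, 3

def kdf(shared: int, transcript: bytes):
    material = hashlib.sha256(shared.to_bytes(16, "big") + transcript).digest()
    return material[:16], material[16:]             # (session key K, confirmation key)

# Server's long-term (authenticated) key pair; its public key is known to the client.
server_sk = randbelow(P - 3) + 2
server_pk = pow(G, server_sk, P)

# Message 1 (client -> server): an ephemeral "encryption" of a fresh key under server_pk.
client_eph = randbelow(P - 3) + 2
msg1 = pow(G, client_eph, P)
transcript = msg1.to_bytes(16, "big")
K_client, conf_client = kdf(pow(server_pk, client_eph, P), transcript)

# Message 2 (server -> client): confirmation that the keyed server derived the same key.
K_server, conf_server = kdf(pow(msg1, server_sk, P), transcript)
msg2 = hmac.new(conf_server, b"server-confirm" + transcript, hashlib.sha256).digest()

# The unkeyed client checks the confirmation; only a holder of server_sk can produce it.
expected = hmac.new(conf_client, b"server-confirm" + transcript, hashlib.sha256).digest()
assert hmac.compare_digest(msg2, expected)
assert K_client == K_server
print("shared session key:", K_client.hex())
```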