International Association for Cryptologic Research

IACR News


Here you can see all recent updates to the IACR webpage. These updates are also available:

via RSS feed
via Twitter
via Weibo
via Facebook

22 May 2017

Atul Luykx, Bart Mennink, Kenneth G. Paterson
ePrint Report
The multi-key, or multi-user, setting challenges cryptographic algorithms to maintain high levels of security when used with many different keys, by many different users. Its significance lies in the fact that in the real world, cryptography is rarely used with a single key in isolation. A folklore result, proved for public-key encryption by Bellare, Boldyreva, and Micali at EUROCRYPT 2000, states that the success probability in attacking any one of many independently keyed algorithms can be bounded by the success probability of attacking a single instance of the algorithm, multiplied by the number of keys present. Although sufficient for settings in which not many keys are used, once cryptographic algorithms are used on an internet-wide scale, as is the case with TLS, the effect of multiplying by the number of keys can drastically erode security claims.

We establish a sufficient condition on cryptographic schemes and security games under which multi-key degradation is avoided. As illustrative examples, we discuss how AES and GCM behave in the multi-key setting, and prove that GCM, as a mode, does not suffer multi-key degradation. Our analysis allows the limits on the amount of data that can be processed per key by GCM to be significantly increased. This leads directly to improved security for GCM as deployed in TLS on the Internet today.
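In symbols, the folklore bound reads $\mathbf{Adv}^{\mathrm{mu}}_{\Pi}(\mathcal{A}) \le \mu \cdot \mathbf{Adv}^{\mathrm{su}}_{\Pi}(\mathcal{B})$, where $\mu$ is the number of independent keys and $\mathcal{B}$ is a single-key adversary with roughly the resources of the multi-key adversary $\mathcal{A}$ (our notation, added for context). At internet scale the factor $\mu$ matters: with $\mu = 2^{30}$ keys, a single-key bound of $2^{-60}$ degrades to $2^{-30}$.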
Zhe Liu, Patrick Longa, Geovandro Pereira, Oscar Reparaz, Hwajeong Seo
ePrint Report
This work deals with the energy-efficient, high-speed, and high-security implementation of elliptic curve scalar multiplication, elliptic curve Diffie-Hellman (ECDH) key exchange, and elliptic curve digital signatures on embedded devices using FourQ, incorporating strong countermeasures to thwart a wide variety of side-channel attacks.

First, we set new speed records for constant-time curve-based scalar multiplication, DH key exchange, and digital signatures at the 128-bit security level with implementations targeting 8-, 16-, and 32-bit microcontrollers. For example, our software computes a static ECDH shared secret in 7.0 million cycles (or 0.9 seconds at 8 MHz) on a low-power 8-bit AVR microcontroller which, compared to the fastest Curve25519 and genus-2 Kummer implementations on the same platform, offers 2x and 1.4x speedups, respectively. Similarly, it computes the same operation in 559 thousand cycles on a 32-bit ARM Cortex-M4 microcontroller, a factor-2.5 speedup over the fastest Curve25519 implementation targeting the same platform. Similar speedups are observed for digital signatures.

Second, we engineer a set of side-channel countermeasures that take advantage of FourQ's rich arithmetic, and propose a secure implementation that offers protection against a wide range of sophisticated side-channel attacks, including differential power analysis (DPA). Despite the use of strong countermeasures, the experimental results show that our FourQ software is still efficient enough to outperform implementations of Curve25519 that only protect against timing attacks.

Finally, we perform a differential power analysis evaluation of our software running on an ARM Cortex-M4, and report that no leakage was detected with up to 10 million traces. These results demonstrate the potential of deploying FourQ in low-power applications such as protocols for the Internet of Things.
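As a quick sanity check (our arithmetic, not the paper's), the AVR figure converts as $7.0 \times 10^{6}\ \text{cycles} \,/\, 8 \times 10^{6}\ \text{cycles/s} = 0.875\ \text{s} \approx 0.9\ \text{s}$, matching the quoted latency at 8 MHz.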
Saikrishna Badrinarayanan, Sanjam Garg, Yuval Ishai, Amit Sahai, Akshay Wadia
ePrint Report
We study the feasibility of two-message protocols for secure two-party computation in the plain model, for functionalities that deliver output to one party, with security against malicious parties. Since known impossibility results rule out polynomial-time simulation in this setting, we consider the common relaxation of allowing super-polynomial simulation.

We first address the case of zero-knowledge functionalities. We present a new construction of two-message zero-knowledge protocols with super-polynomial simulation from any (sub-exponentially hard) game-based two-message oblivious transfer protocol, which we call Weak OT. As a corollary, we get the first two-message WI arguments for NP from (sub-exponential) DDH. Prior to our work, such protocols could only be constructed from assumptions that are known to imply non-interactive zero-knowledge protocols (NIZK), which do not include DDH.

We then extend the above result to the case of general single-output functionalities, showing how to construct two-message secure computation protocols with quasi-polynomial simulation from Weak OT. This implies protocols based on sub-exponential variants of several standard assumptions, including Decisional Diffie-Hellman (DDH), the Quadratic Residuosity assumption, and the Nth Residuosity assumption. Prior works on two-message protocols either relied on some trusted setup (such as a common reference string) or were restricted to special functionalities such as blind signatures. As a corollary, we get three-message protocols for two-output functionalities, which include coin-tossing as an interesting special case. For both types of functionalities, the number of messages (two or three) is optimal.

Finally, motivated by the above, we further study the Weak OT primitive. On the positive side, we show that Weak OT can be based on any semi-honest 2-message OT with a short second message. This simplifies a previous construction of Weak OT from the Nth Residuosity assumption. We also present a construction of Weak OT from Witness Encryption (WE) and injective one-way functions, implying the first construction of two-message WI arguments from WE. On the negative side, we show that previous constructions of Weak OT do not satisfy simulation-based security even if the simulator can be computationally unbounded.
Kaisa Nyberg
ePrint Report
Linear cryptanalysis makes use of statistical models that consider linear approximations of a block cipher and of a random permutation as binary random variables. In this note we show that linear independence and statistical independence are equivalent properties for linear approximations of random permutations and of block ciphers with independent pre- and post-whitening keys.
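For reference (standard notation, added for context): given input mask $a$ and output mask $b$, the linear approximation of a permutation $P$ is the binary random variable $\langle a, x \rangle \oplus \langle b, P(x) \rangle$ for uniformly random $x$, with correlation $c(a,b) = 2\Pr[\langle a, x \rangle \oplus \langle b, P(x) \rangle = 0] - 1$; the equivalence above relates statistical independence of two such variables to linear independence of the corresponding mask pairs $(a, b)$.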
Nilanjan Datta, Atul Luykx, Bart Mennink, Mridul Nandi
ePrint Report
The authenticated encryption scheme COLM is a third-round candidate in the CAESAR competition. Much like its antecedents COPA, ELmE, and ELmD, COLM consists of two parallelizable encryption layers connected by a linear mixing function. While COPA uses plain XOR mixing, ELmE, ELmD, and COLM use a more involved invertible mixing function. In this work, we investigate the integrity of the COLM structure when unverified plaintext is released, and demonstrate that its security depends heavily on the choice of mixing function. Our results are threefold. First, we discuss the practical nonce-respecting forgery by Andreeva et al. (ASIACRYPT 2014) against COPA's XOR mixing. Then we present a nonce-misusing forgery against arbitrary mixing functions with practical time complexity. Finally, by using significantly larger queries, we extend the previous forgery to be nonce-respecting.
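Schematically (our simplified rendering, added for context; the exact mixing functions are specified in the respective papers), block $M_i$ is processed as $X_i = E^{(1)}_K(M_i)$, $C_i = E^{(2)}_K(\rho(X_i, S_{i-1}))$, where the linear mixing function $\rho$ threads a chaining state $S_i$ between blocks: plain XOR in COPA, and an invertible mixing function in ELmE, ELmD, and COLM.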
Ilaria Chillotti, Nicolas Gama, Mariya Georgieva, Malika Izabachène
ePrint Report
In this paper, we present several methods to improve the evaluation of homomorphic functions, both for fully and for leveled homomorphic encryption. We propose two packing methods, in order to decrease the expansion factor and optimize the evaluation of look-up tables and random functions in TRGSW-based homomorphic schemes. We also extend the automata logic, introduced in [19, 12], to the efficient leveled evaluation of weighted automata, and present a new homomorphic counter called TBSR that supports all the elementary operations occurring in a multiplication. These improvements speed up the evaluation of most arithmetic functions in a packed leveled mode, with a noise overhead that remains additive. We then present a new circuit bootstrapping that converts TLWE into low-noise TRGSW ciphertexts in just 137 ms, which makes the leveled mode of TFHE composable and is fast enough to speed up arithmetic functions compared to the gate-by-gate bootstrapping given in [12]. Finally, we propose concrete parameter sets and timing comparisons for all our constructions.
Christian Badertscher, Christian Matt, Ueli Maurer
ePrint Report
Access control encryption (ACE) was proposed by Damgård et al. to enable the control of information flow between several parties according to a given policy specifying which parties are, or are not, allowed to communicate. By involving a special party, called the sanitizer, policy-compliant communication is enabled while policy-violating communication is prevented, even if sender and receiver are dishonest. To allow outsourcing of the sanitizer, the secrecy of the message contents and the anonymity of the involved communication partners are guaranteed.

This paper shows that in order to be resilient against realistic attacks, the security definition of ACE must be considerably strengthened in several ways. A new, substantially stronger security definition is proposed, and an ACE scheme is constructed which provably satisfies the strong definition under standard assumptions.

Three aspects in which the security of ACE is strengthened are as follows. First, CCA security (rather than only CPA security) is guaranteed, which is important since senders can be dishonest in the considered setting. Second, the revealing of an (unsanitized) ciphertext (e.g., by a faulty sanitizer) cannot be exploited to communicate more in a policy-violating manner than the information contained in the ciphertext. We illustrate that this is not only a definitional subtlety by showing how in known ACE schemes, a single leaked unsanitized ciphertext allows for an arbitrary amount of policy-violating communication. Third, it is enforced that parties specified to receive a message according to the policy cannot be excluded from receiving it, even by a dishonest sender.
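To make the communication model concrete, here is a minimal Python sketch of the intended functionality (names and structure are ours; all cryptographic operations are stubbed out, since the point is only the information-flow policy):

# Policy: may sender i communicate to receiver j?
POLICY = {("alice", "bob"): True, ("alice", "carol"): False}

def encrypt(sender, message):
    return (sender, message)   # stub for ACE encryption

def sanitize(ct):
    return ct                  # stub for the sanitizer's re-randomization

def decrypt(receiver, ct):
    sender, message = ct
    # A real scheme enforces this cryptographically; the check below only
    # models the intended policy-compliant decryption behaviour.
    return message if POLICY.get((sender, receiver), False) else None

ct = sanitize(encrypt("alice", "meeting at noon"))
assert decrypt("bob", ct) == "meeting at noon"   # policy-compliant
assert decrypt("carol", ct) is None              # policy-violating, blocked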
Douglas R. Stinson
ePrint Report
In 1996, Jackson and Martin proved that a strong ideal ramp scheme is equivalent to an orthogonal array. However, there was no good characterization of ideal ramp schemes that are not strong. Here we show the equivalence of ideal ramp schemes to a new variant of orthogonal arrays that we term augmented orthogonal arrays. We give some constructions for these new kinds of arrays and, as a consequence, we also provide parameter situations where ideal ramp schemes exist but strong ideal ramp schemes do not.
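For context (a standard definition, added here): an orthogonal array $\mathrm{OA}_{\lambda}(t, k, v)$ is a $\lambda v^{t} \times k$ array with entries from an alphabet of size $v$ such that, within any $t$ columns, every $t$-tuple of symbols appears in exactly $\lambda$ rows; the augmented variant introduced in the paper adapts this object to capture ideal ramp schemes that are not strong.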
Gregor Leander, Alexander May
ePrint Report
Using whitening keys is a well-understood means of increasing the key-length of any given cipher. In particular, since it has been known ever since Grover's seminal work that the effective key-length is halved when considering quantum adversaries, it seems tempting to use this simple and elegant way of extending the key-length of a given cipher to increase its resistance against quantum adversaries. However, as we show in this work, using whitening keys does not significantly increase security in the quantum-CPA setting. To this end, we present a quantum algorithm that breaks the construction with whitening keys in essentially the same time complexity as Grover's original algorithm breaks the underlying block cipher. Technically, this result is based on combining the quantum algorithms of Grover and Simon for the first time in a cryptographic setting, a combination which might well have other applications.
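Concretely, whitening refers to the FX-style construction $\mathrm{FX}_{k, k_1, k_2}(x) = E_{k}(x \oplus k_1) \oplus k_2$ (standard notation, added for context), which extends a cipher $E$ with a $\kappa$-bit key and $n$-bit blocks to a $(\kappa + 2n)$-bit key at negligible cost; the result above shows that against quantum-CPA adversaries those extra $2n$ bits add essentially no security.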
Masahiro Yagisawa
ePrint Report
Previously, I proposed a fully homomorphic public-key encryption scheme (FHPKE) based on the discrete logarithm problem, which is vulnerable to quantum computer attacks. In this paper, I propose an FHPKE scheme based on the multivariate discrete logarithm assumption and the multivariate computational Diffie–Hellman assumption, which is believed to withstand quantum computer attacks. Although the scheme can be constructed over many non-commutative rings, I adopt the FHPKE scheme based on the octonion ring as a typical example to show how it is constructed. The multivariate discrete logarithm problem (MDLP) is defined as follows: given $f(x)$, $g(x)$, $h(x)$ and a prime $q$, find integers $m$ and $n$ such that $h(x) = g^{-n}(f^{m}(g^{n}(x))) \bmod q$ over the octonion ring.
Akihiro Nishimura, Takuya Nishida, Yu-ichi Hayashi, Takaaki Mizuki, Hideaki Sone
ePrint Report
Card-based cryptographic protocols can perform secure computation of Boolean functions. Cheung et al. presented an elegant protocol that securely produces a hidden AND value using five cards; however, it fails with a probability of 1/2. The protocol uses an unconventional shuffle operation called unequal division shuffle; after a sequence of five cards is divided into a two-card portion and a three-card portion, these two portions are randomly switched. In this paper, we first show that the protocol proposed by Cheung et al. securely produces not only a hidden AND value but also a hidden OR value (with a probability of 1/2). We then modify their protocol such that, even when it fails, we can still evaluate the AND value. Furthermore, we present two five-card copy protocols using unequal division shuffle. Because the most efficient copy protocol currently known requires six cards, our new protocols improve upon the existing results. We also design a general copy protocol that produces multiple copies using unequal division shuffle.
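As a minimal illustration, the unequal division shuffle described above can be sketched in Python as follows (the card encodings and the surrounding AND/copy protocols are omitted):

import random

def unequal_division_shuffle(cards):
    # Divide a five-card sequence into a two-card portion and a
    # three-card portion, then switch the portions with probability 1/2.
    assert len(cards) == 5
    front, back = cards[:2], cards[2:]
    return back + front if random.random() < 0.5 else list(cards)

# Example: the outcome is either the original order or the switched one.
print(unequal_division_shuffle(["c1", "c2", "c3", "c4", "c5"]))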
Markku-Juhani O. Saarinen
ePrint Report
We describe a new reconciliation method for Ring-LWE that has a significantly smaller failure rate than previous proposals while reducing ciphertext size and the amount of randomness required. It is based on a simple, deterministic variant of Peikert's reconciliation that works with our new "safe bits" selection and constant-time error correction techniques. The new method does not need randomized smoothing to achieve non-biased secrets. When used with the very efficient "New Hope" Ring-LWE parametrization, we achieve a decryption failure rate well below $2^{-128}$ (compared to $2^{-60}$ for the original), making the scheme suitable for public key encryption in addition to key exchange protocols; the reconciliation approach saves about $40\%$ in ciphertext size compared to the common LP11 Ring-LWE encryption scheme. We perform a combinatorial failure analysis using full probability convolutions, leading to a precise understanding of decryption failure conditions at the bit level. Even with additional implementation security and safety measures, the new scheme is still essentially as fast as New Hope, with slightly shorter messages. The new techniques have been instantiated and implemented as a Key Encapsulation Mechanism (KEM) and public key encryption scheme designed to meet the requirements of NIST's Post-Quantum Cryptography effort at a very high security level.
Alexander Koch, Stefan Walzer
ePrint Report
Starting with the so-called “Five-Card Trick” of den Boer (EUROCRYPT 1989) for computing the AND of two players' bits, a version of secure multiparty computation in which a deck of playing cards replaces the computer has received increasing attention. Since then, there has been a quest to minimize the number of cards in these protocols for various security notions, resulting in a wide range of protocols (Mizuki and Sone, FAW 2009; Mizuki, Kumamoto, and Sone, ASIACRYPT 2012; Koch, Walzer, and Härtel, ASIACRYPT 2015).

Recently, Mizuki and Shizuya (Int. J. Inf. Secur., 2014) defined a model meant to encompass all card-based protocols. This permits rigorous proofs of lower bounds and impossibility results. However, up to now, no general implementation of the shuffling operations in the model has been proposed, and active security, e.g., ensuring that the players cannot deviate in the shuffling steps, has remained largely unaddressed.

In this paper we give a stricter characterization of card-based protocols, taking meticulous care that each involved operation can be implemented under a small set of plausible assumptions even if one of the players is dishonest. The resulting model is still powerful enough to admit an actively secure implementation of the model of Mizuki and Shizuya, restricted to shuffles with a uniform distribution on a permutation group. A linear number of helping cards is needed in the reduction.
Urbi Chatterjee, Vidya Govindan, Rajat Sadhukhan, Debdeep Mukhopadhyay, Rajat Subhra Chakraborty, Debashis Mahata, Mukesh Prabhu
ePrint Report
Physically Unclonable Functions (PUFs) promise to be a critical hardware primitive for providing unique identities to billions of connected devices in the Internet of Things (IoT). In traditional authentication protocols, a user presents a set of credentials with an accompanying proof such as a password or digital certificate. However, the IoT needs more evolved methods, as these classical techniques suffer from the pressing problems of password dependency and the inability to bind access requests to the “things” from which they originate. Additionally, the protocols need to be lightweight and heterogeneous. Although PUFs seem promising for developing such mechanisms, their unclonability property raises the open problem of how to do so without storing the challenge-response pairs (CRPs) explicitly at the verifier's end.

In this paper, we develop an authentication and key exchange protocol called PUF+IBE by combining Identity-Based Encryption (IBE) using cryptographic pairings with PUFs. We show that this combination can do away with the requirement of explicitly storing the secret CRPs. The proposed protocol is based on two security assumptions: (a) the physical and mathematical unclonability of the constituent PUFs, and (b) the computational intractability of the Elliptic Curve Discrete Logarithm Problem (ECDLP). The security of the protocol is proved formally under Session Key Security and the Universal Composability framework.

A prototype of the protocol has been implemented to realize a secured video surveillance camera, using a combination of an Intel Edison board and a Digilent Nexys-4 FPGA board comprising an Artix-7 FPGA, together serving as the IoT node. We show that, although the stand-alone video camera can be subjected to a man-in-the-middle attack via IP spoofing using standard network penetration tools, the camera augmented with the proposed protocol resists such attacks, making it apt for an IoT infrastructure and the protocol deployable in industry.
Éric Brier, Rémi Géraud, David Naccache
ePrint Report
The Naccache–Stern public-key cryptosystem (NS) relies on the conjectured hardness of the modular multiplicative knapsack problem: Given $p,\{v_i\},\prod v_i^{m_i} \bmod p$, find the $\{m_i\}$.

Given this scheme's algebraic structure, it is interesting to systematically explore its variants and generalizations. In particular, it might be useful to enhance NS with features such as semantic security, re-randomizability, or an extension to higher residues.

This paper addresses these questions and proposes several such variants.
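As background, the plain scheme can be sketched in Python with toy, insecure parameters (our variable names; for illustration only):

# Toy Naccache-Stern: message bits are exponents of small primes.
p = 211                    # prime with 2 * 3 * 5 * 7 = 210 < p
primes = [2, 3, 5, 7]
s = 11                     # secret exponent, gcd(s, p - 1) = 1
v = [pow(q, pow(s, -1, p - 1), p) for q in primes]   # public key

def encrypt(bits):
    # Multiplicative knapsack: c = prod_i v_i^{m_i} mod p.
    c = 1
    for vi, mi in zip(v, bits):
        if mi:
            c = c * vi % p
    return c

def decrypt(c):
    # c^s mod p equals prod_i p_i^{m_i}, which is < p, so each bit is
    # recovered by testing divisibility by the corresponding prime.
    x = pow(c, s, p)
    return [1 if x % q == 0 else 0 for q in primes]

assert decrypt(encrypt([1, 0, 1, 1])) == [1, 0, 1, 1]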
Shiyi Zhang, Yongjuan Wang, Yang Gao, Tao Wang
ePrint Report
4x4 MDS matrices over F2 are widely used in the design of block ciphers' linear diffusion layers. However, considering the implementation cost of a lightweight cipher, the number of XOR operations of an MDS matrix usually serves as the measure. In the course of research on constructing lightweight 4x4 MDS matrices, this paper presents the concept of a formalized MDS matrix: some of the entries that make up the matrix are known, their positions are determined, and the criteria for an MDS matrix are satisfied. Using the theory of periods and minimal polynomials of entries over finite fields, a new construction method for formalized MDS matrices is proposed. A large number of MDS matrices can be obtained efficiently by this method, and their number distribution has significant structural features; moreover, the algebraic structure of the lightest MDS matrices is also apparent. This paper first investigates the construction of 4x4 lightweight MDS matrices, analyzing their distribution characteristics and the feasibility of the construction method. Then, for the lightest MDS matrices obtained from the method above, the algebraic relations within and between them are studied, and an important application of the alternating group A4 and its subgroup, the Klein four-group, is found.
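For context (a standard fact, added here): a matrix $M$ over a finite field is MDS if and only if every square submatrix of $M$ is nonsingular, and the XOR count, i.e., the number of XOR operations needed to implement multiplication by $M$, is the usual cost measure that lightweight constructions such as the ones above seek to minimize.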

20 May 2017

Alessandro Budroni, Federico Pintore
ePrint Report
When a pairing $e: \mathbb{G}_1 \times \mathbb{G}_2 \rightarrow \mathbb{G}_{T}$, on an elliptic curve $E$ defined over $\mathbb{F}_q$, is exploited for an identity-based protocol, there is often the need to hash binary strings into $\mathbb{G}_1$ and $\mathbb{G}_2$. Traditionally, if $E$ admits a twist $\tilde{E}$ of order $d$, then $\mathbb{G}_1=E(\mathbb{F}_q) \cap E[r]$, where $r$ is a prime integer, and $\mathbb{G}_2=\tilde{E}(\mathbb{F}_{q^{k/d}}) \cap \tilde{E}[r]$, where $k$ is the embedding degree of $E$ w.r.t. $r$. The standard approach for hashing into $\mathbb{G}_2$ is to map to a general point $P \in \tilde{E}(\mathbb{F}_{q^{k/d}})$ and then multiply it by the cofactor $c=\#\tilde{E}(\mathbb{F}_{q^{k/d}})/r$. Usually, the multiplication by $c$ is computationally expensive. In order to speed up this computation, two different methods (by Scott et al. and by Fuentes et al.) have been proposed. In this paper we consider these two methods for BLS pairing-friendly curves having $k \in \{12,24,30,42,48\}$, providing efficiency comparisons. When $k=30,42,48$, the Fuentes et al. method requires an expensive one-off pre-computation which was infeasible for the computational power at our disposal. In these cases, we theoretically obtain hashing maps that follow the idea of Fuentes et al.

19 May 2017

Singapore University of Technology and Design, established in collaboration with MIT, USA
Job Posting
We are looking for a Post-Doc to join our group.

Candidates for the Post-Doc position should have completed (or be close to completing) a PhD in computer science, mathematics, or a related discipline, with a solid publication record. He/she should have solid experience in any of the following areas:

1. Public-Key/Private-Key Cryptography and Provable Security

2. Information and Network Security

3. Privacy and Authentication

Successful candidates are expected to contribute to one of the following topics:

- Security in decentralized applications

- IoT security

- Network security

The candidate must have a strong publication list (at least one top journal paper with a high impact factor in IEEE/ACM Transactions).

The post has a flexible starting date. The initial appointment will be for one year, with a strong possibility for further appointment.

Review of applications will start immediately and continue until the positions are filled.

Contact: Interested candidates should send their CV to Dr. Jemin Lee (email: jmnlee (at) ieee.org) and cc Dr. Prosanta Gope (email: gope_prosanta (at) sutd.edu.sg). Initial screening of applications will begin immediately and the position will remain open until filled. Only shortlisted candidates will be notified.

Closing date for applications: 31 July 2017


18 May 2017

Oslo, Norway, 14 September - 15 September 2017
Event Calendar
Event date: 14 September to 15 September 2017
Submission deadline: 17 June 2017
Notification: 27 July 2017

15 May 2017

Rongmao Chen, Yi Mu, Guomin Yang, Willy Susilo, Fuchun Guo
ePrint Report
Leakage attacks, including various kinds of side-channel attacks, allow an attacker to learn partial information about the internal secrets such as the secret key and the randomness of a cryptographic system. Designing a strong, meaningful, yet achievable security notion to capture practical leakage attacks is one of the primary goals of leakage-resilient cryptography.

In this work, we revisit the modelling and design of authenticated key exchange (AKE) protocols with leakage resilience. We show that the prior works on this topic are inadequate in capturing realistic leakage attacks. To close this research gap, we propose a new security notion named the leakage-resilient eCK model w.r.t. auxiliary inputs (AI-LR-eCK) for AKE protocols, which addresses the limitations of the previous models. Our model allows computationally hard-to-invert leakage of both the long-term secret key and the randomness, and also addresses a limitation of the previous models, in which the adversary is not allowed to make leakage queries during the challenge session. As another major contribution of this work, we present a generic framework for the construction of AKE protocols that are secure under the proposed AI-LR-eCK model. An instantiation based on the Decisional Diffie-Hellman (DDH) assumption in the standard model is also given to demonstrate the feasibility of our proposed framework.