International Association for Cryptologic Research

IACR News

Updates on the COVID-19 situation are on the Announcement channel.

Here you can see all recent updates to the IACR webpage. These updates are also available:

via RSS feed
via Twitter
via Weibo
via Facebook

03 November 2023

Surya Mathialagan
ePrint Report
When outsourcing a database to an untrusted remote server, one might want to verify the integrity of its contents while accessing them. To address this, Blum et al. [FOCS '91] proposed the notion of memory checking. Memory checking allows a user to run a RAM program on a remote server while verifying the integrity of the server's storage using only small local storage.

In this work, we define and initiate the formal study of memory checking for parallel RAMs (PRAMs). The parallel RAM model is very expressive and captures many modern architectures, such as multi-core processors and cloud clusters. When multiple clients run a PRAM algorithm on a shared remote server, concurrency issues can cause inconsistencies, so integrity verification is an even more desirable property in this setting.

Assuming only the existence of one-way functions, we construct an online memory checker (one that reports faults as soon as they occur) for PRAMs with $O(\log N)$ simulation overhead in both work and depth. In addition, we construct an offline memory checker (one that reports faults only after a long sequence of operations) with amortized $O(1)$ simulation overhead in both work and depth. Our constructions match the best known simulation overhead of memory checkers in the standard single-user RAM setting.

As an application of our parallel memory checking constructions, we additionally construct the first maliciously secure oblivious parallel RAM (OPRAM) with polylogarithmic overhead.
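As background for the single-user setting that this work parallelizes, the following is a minimal sketch of an online memory checker built from a Merkle hash tree, the classical hash-tree approach with $O(\log N)$ overhead per access. It is our own illustration of the interface, not the PRAM construction of the paper, and all names in it are ours.

# Illustrative sketch (ours, not the paper's construction): a single-user online
# memory checker from a Merkle hash tree. The user keeps only the root digest;
# everything else would live on the untrusted server.
import hashlib

def H(*parts: bytes) -> bytes:
    h = hashlib.sha256()
    for p in parts:
        h.update(len(p).to_bytes(4, "big") + p)
    return h.digest()

class MerkleMemoryChecker:
    def __init__(self, n_leaves: int):
        assert n_leaves > 0 and n_leaves & (n_leaves - 1) == 0, "power of two for simplicity"
        self.n = n_leaves
        # Everything below except self.root would live on the untrusted server.
        self.data = [b""] * self.n                # memory contents
        self.tree = [b""] * (2 * self.n)          # tree[1] is the root node
        for i in range(self.n, 2 * self.n):
            self.tree[i] = H(self.data[i - self.n])
        for i in range(self.n - 1, 0, -1):
            self.tree[i] = H(self.tree[2 * i], self.tree[2 * i + 1])
        self.root = self.tree[1]                  # the only trusted local state

    def read(self, addr: int) -> bytes:
        value = self.data[addr]
        h, i = H(value), addr + self.n            # recompute the root from the path
        while i > 1:
            sibling = self.tree[i ^ 1]
            h = H(h, sibling) if i % 2 == 0 else H(sibling, h)
            i //= 2
        if h != self.root:
            raise RuntimeError("integrity violation detected")   # online: report at once
        return value

    def write(self, addr: int, value: bytes) -> None:
        self.read(addr)                           # check the old path first
        self.data[addr] = value
        i = addr + self.n
        self.tree[i] = H(value)
        while i > 1:
            i //= 2
            self.tree[i] = H(self.tree[2 * i], self.tree[2 * i + 1])
        self.root = self.tree[1]                  # refresh the trusted digest

mem = MerkleMemoryChecker(8)
mem.write(3, b"hello")
assert mem.read(3) == b"hello"

Any tampering with the server-held data or tree is caught by read before a value is returned, at the cost of one authentication path per access.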
Behzad Abdolmaleki, Céline Chevalier, Ehsan Ebrahimi, Giulio Malavolta, Quoc-Huy Vu
ePrint Report
Non-interactive zero-knowledge (NIZK) proof systems are a cornerstone of modern cryptography, but their security has received little attention in the quantum setting. Motivated by improving our understanding of this fundamental primitive against quantum adversaries, we propose a new definition of security in the quantum setting. Specifically, we define the notion of quantum simulation soundness (SS-NIZK), which allows the adversary to access the simulator in superposition. We show a separation between post-quantum and quantum security of SS-NIZK, and prove that both Sahai’s construction for SS-NIZK (in the CRS model) and the Fiat-Shamir transformation (in the QROM) can be made quantumly simulation-sound.

As an immediate application of our new notion, we prove the security of the Naor-Yung paradigm in the quantum setting, with respect to a strong quantum IND-CCA security notion. This provides the quantum analogue of the classical dual-key approach for proving the security of encryption schemes. Along the way, we introduce a new notion of quantum-query advantage functions, which may be used as a general framework to show classical/quantum separations for other cryptographic primitives and may be of independent interest.
Hosein Hadipour, Simon Gerhalter, Sadegh Sadeghi, Maria Eichlseder
ePrint Report
Integral, zero-correlation (ZC), and impossible-differential (ID) attacks are three of the most important attacks on block ciphers. However, manually finding these attacks can be a daunting task, which is why automated methods are becoming increasingly important. Most automatic tools for integral, ZC, and ID attacks have focused only on finding distinguishers rather than complete attacks. At EUROCRYPT 2023, Hadipour et al. proposed a generic and efficient constraint programming (CP) model based on satisfiability for finding ID, ZC, and integral distinguishers. This model can be extended to a unified CP model for finding full key-recovery attacks. However, their method has some limitations: the location of the contradiction must be determined in advance, and the model is a cell-wise model unsuitable for weakly aligned ciphers, e.g., Ascon and PRESENT. In addition, they left developing a CP model for the partial-sum technique in key recovery as future work.

In this paper, we improve the method of Hadipour et al. in several ways. First, we remove the limitation of determining the contradiction location in advance. Second, we show how to extend the distinguisher model to a bit-wise model, considering the internal structure of S-boxes while keeping the model based on satisfiability. Third, we introduce a CP model for the partial-sum technique for the first time. To show the usefulness and versatility of our approach, we apply it to different designs, from strongly aligned ones such as ForkSKINNY and QARMAv2 to weakly aligned ones such as Ascon and PRESENT, obtaining significantly improved results. To mention a few of our results: we improve the integral distinguisher of QARMAv2-128 (resp. QARMAv2-64) by 7 (resp. 5) rounds, and the integral distinguisher of ForkSKINNY by 1 round, thanks only to our cell-wise distinguisher modeling. Using our new bit-wise modeling, our tool finds a group of $2^{155}$ 5-round ID and ZC distinguishers for Ascon in a single run, taking a few minutes on a regular laptop. Thanks to the new CP model for the partial-sum technique, we improve the integral attacks on all variants of SKINNY; in particular, we improve the best attack on SKINNY-$n$-$n$ in the single-key setting by 1 round. We also improve the ID attacks on ForkSKINNY and analyze this cipher in the limited reduced-round setting for the first time. Our methods are generic and applicable to other block ciphers.
Radhika Garg, Kang Yang, Jonathan Katz, Xiao Wang
ePrint Report
Protocols for secure multi-party computation (MPC) supporting mixed-mode computation have found many applications in recent years due to their flexibility in representing the function to be evaluated. However, existing mixed-mode MPC protocols are only practical for a small number of parties: they are either tailored to the case of two or three parties, or scale poorly with a larger number of parties. In this paper, we design and implement a new system for highly efficient and scalable mixed-mode MPC tolerating an arbitrary number of semi-honest corruptions. Our protocols allow secret data to be represented in Encrypted, Boolean, Arithmetic, or Yao form, and support efficient conversions between these representations.

1. We design a multi-party table-lookup protocol in which both the index and the table can be kept private. The protocol is scalable even with hundreds of parties.

2. Using the above protocol, we design efficient conversions between additive arithmetic secret sharings and Boolean secret sharings for a large number of parties. For 32 parties, our conversion protocols require 1184× to 8141× less communication than the state-of-the-art protocols MOTION and MP-SPDZ; this leads to up to a 1275× improvement in running time over a 1 Gbps network. The improvements are even larger with more parties.

3. We also use the new protocols to design an efficient multi-party distributed garbling protocol. The protocol can achieve asymptotically constant communication per party.

Our implementation will be made public.
Osman Biçer, Christian Tschudin
ePrint Report
In this paper, we introduce Oblivious Homomorphic Encryption (OHE), which provably separates the computation spaces of multiple clients of a fully homomorphic encryption (FHE) service while keeping the evaluator blind as to which client a result belongs to. We justify the importance of this strict isolation property of OHE by showing an attack on a recently proposed key-private cryptocurrency scheme. Our two OHE constructions are based on a puncturing function with which the evaluator can effectively mask ciphertexts from rogue and potentially colluding clients. We show that this can be implemented via an FHE scheme plus an anonymous commitment scheme. OHE can be used to provide provable anonymity to cloud applications, to single-server implementations of anonymous messaging, and to account-based cryptocurrencies.
Xiaolu Hou, Jakub Breier, Mladen Kovačević
ePrint Report
The idea of balancing side-channel leakage in software was proposed more than a decade ago. As with other hiding-based countermeasures, the goal is not to hide the leakage completely but to significantly increase the effort required for an attack. Previous approaches focused on two directions: either balancing the Hamming weight of the processed data or deriving the code by using stochastic leakage profiling.

In this brief, we build upon these results by proposing a novel approach that combines the two directions. We provide the theory behind our encoding scheme, backed by experimental results on a 32-bit ARM Cortex-M4 microcontroller. Our results show that such a combination gives better side-channel resistance than either of the two methods on its own.
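To give a concrete flavor of the first direction mentioned above (Hamming-weight balancing), here is a toy dual-rail encoding in which every bit is stored together with its complement, so every encoded byte has Hamming weight exactly 8. This is our own illustration, not the combined encoding scheme proposed in the brief, which is additionally guided by leakage profiling.

# Illustrative sketch (ours): constant-Hamming-weight ("dual-rail") encoding.
# Each bit b of a byte is stored as the pair (b, 1-b), so the weight of the
# encoded word no longer depends on the data value.

def encode_dual_rail(byte: int) -> int:
    """Encode an 8-bit value into a 16-bit word of Hamming weight exactly 8."""
    out = 0
    for i in range(8):
        b = (byte >> i) & 1
        out |= (b << (2 * i)) | ((1 - b) << (2 * i + 1))
    return out

def decode_dual_rail(word: int) -> int:
    byte = 0
    for i in range(8):
        byte |= ((word >> (2 * i)) & 1) << i
    return byte

# Every encoded byte has weight 8, and decoding inverts encoding.
assert all(bin(encode_dual_rail(v)).count("1") == 8 for v in range(256))
assert all(decode_dual_rail(encode_dual_rail(v)) == v for v in range(256))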
Zhuolong Zhang, Shiyao Chen, Wei Wang, Meiqin Wang
ePrint Report
This paper presents full-round distinguishing and key-recovery attacks on the lightweight block cipher SAND-2, which has a 64-bit block size and a 128-bit key size and appears to be a mixture of the AND-Rotation-XOR (AND-RX) based ciphers SAND and ANT. However, its security arguments against linear and some other attacks are not fully provided. In this paper, we find that the combination of a SAND-like nibble-based round function and ANT-like bit-based permutations causes dependencies and leads to iterative linear and differential trails with high probabilities. Exploiting these, full-round distinguishing attacks on SAND-2 work with $2^{46}$ queries for the linear case and $2^{58.60}$ queries for the differential case in the single-key setting. Full-round key-recovery attacks are also mounted, with time complexity $2^{48.23}$ for the linear attack and $2^{64.10}$ for the differential attack. It should be noted that the dependency observed in this paper applies only to SAND-2 and does not threaten SAND or ANT. From a designer's point of view, our attacks show the risk of mixing parts of different designs, even when each of them has been well studied and is believed to be secure.
Zhengjun Cao
ePrint Report
We show that Yang et al.'s key agreement scheme [Future Gener. Comput. Syst., 145, 415-428 (2023)] is flawed. (1) There are some inconsistent computations, which should be corrected. (2) The planned route of a target vehicle is almost entirely exposed: the scheme neglects the basic requirements for using bit-wise XOR and tries to encrypt the route with this operator alone, and this negligence results in some trivial equalities. (3) The scheme is insecure against an impersonation attack launched by the next roadside unit.
Andrei Lapets
ePrint Report
Many secure computation schemes and protocols (such as numerous variants of secure multi-party computation and homomorphic encryption) have favorable performance characteristics when they are used to evaluate addition and scalar multiplication operations on private values that can be represented as ring elements. A purely algebraic argument (with no references to any specific protocol or scheme) can be used to show that the ability to perform these operations is sufficient to implement any univariate map that operates on private values when that map's domain is finite. Such implementations of univariate maps can be composed in sequence any number of times. Other forms of composition for such implementations can be realized by using multiplication operations involving ring elements, but it is possible that these can be substituted with scalar multiplication operations within certain secure computation workflows.
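One way to make the algebraic claim concrete (our own illustration; the paper's argument is not reproduced here) is a one-hot representation of the private value: any univariate map on a finite domain then becomes a fixed public linear combination of the one-hot entries, so only additions and scalar multiplications by public constants are needed, and sequential composition corresponds to applying further public 0/1 matrices. In the sketch below the entries would be ciphertexts or secret shares in an actual secure computation; here they are plain residues for clarity, and all names and parameters are ours.

# Plain (non-secure) sketch of the algebraic idea via a one-hot encoding.
p = 2**61 - 1              # any ring modulus works; this choice is ours
DOMAIN = list(range(16))   # a finite domain

def one_hot(x: int) -> list[int]:
    return [1 if a == x else 0 for a in DOMAIN]

def apply_map(f, onehot: list[int]) -> int:
    # f(x) = sum_a f(a) * [x == a]: only scalar multiplications and additions.
    return sum(f(a) * e for a, e in zip(DOMAIN, onehot)) % p

def apply_map_onehot(f, onehot: list[int]) -> list[int]:
    # Same idea, but keep the result in one-hot form so maps can be composed.
    return [sum((1 if f(a) == b else 0) * e for a, e in zip(DOMAIN, onehot)) % p
            for b in DOMAIN]

enc = one_hot(5)
assert apply_map(lambda x: (x * x + 3) % 16, enc) == (5 * 5 + 3) % 16
enc2 = apply_map_onehot(lambda x: (x + 7) % 16, enc)       # compose: x -> x + 7 ...
assert apply_map(lambda x: x % 16, enc2) == (5 + 7) % 16   # ... then read the result out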
Tian Qiu, Qiang Tang
ePrint Report
Motivated by applications in anonymous reputation systems and blockchain governance, we initiate the study of predicate aggregate signatures (PAS), a new primitive that enables users to sign multiple messages and lets a combiner aggregate these individual signatures while preserving the anonymity of the signers. The resulting PAS discloses only a brief description of the signers for each message and provides assurance that both the signers and their description satisfy the specified public predicate. We formally define PAS and give a construction framework that yields signatures of logarithmic size and further reduces verification time to logarithmic as well. We also give instantiations for several concrete predicates that may be of independent interest. To showcase its power, we demonstrate its applications to multiple settings, including multi-signatures, aggregate signatures, threshold signatures, (threshold) ring signatures, and attribute-based signatures, and advance the state of the art in all of them.
George Teseleanu, Paul Cotan
ePrint Report
In the design of identity-based encryption (IBE) schemes, the primary security assumptions center around quadratic residues, bilinear mappings, and lattices. Among these approaches, one of the most intriguing was introduced by Clifford Cocks and is based on quadratic residues. However, this scheme has a significant drawback: a large ciphertext-to-plaintext ratio. A different approach is taken by Zhao et al., who design an IBE still based on quadratic residues, but with an encryption process reminiscent of the Goldwasser-Micali cryptosystem. In the following pages, we introduce an elementary method to accelerate Cocks' encryption process and adapt a space-efficient encryption technique to both Cocks' and Zhao et al.'s cryptosystems.
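For context, the sketch below shows textbook Goldwasser-Micali bit encryption, which Zhao et al.'s encryption process is described as resembling. It is our own illustration with toy, insecure parameters and does not reflect the space-efficient technique developed in the paper.

# Textbook Goldwasser-Micali over a toy modulus (insecure parameters, for
# illustration only). A bit b is encrypted as c = y^b * r^2 mod N, where y is a
# quadratic non-residue modulo both prime factors of N.
import random
from math import gcd

p, q = 10007, 10009              # toy primes; real parameters are ~1024+ bits each
N = p * q

def is_qr(a: int, prime: int) -> bool:
    return pow(a, (prime - 1) // 2, prime) == 1

# Find y that is a non-residue modulo both primes (decryption distinguishes residues).
y = next(a for a in range(2, N) if not is_qr(a, p) and not is_qr(a, q))

def encrypt(bit: int) -> int:
    r = random.randrange(2, N)
    while gcd(r, N) != 1:
        r = random.randrange(2, N)
    return (pow(y, bit, N) * pow(r, 2, N)) % N

def decrypt(c: int) -> int:
    # c is a quadratic residue mod p iff the plaintext bit was 0.
    return 0 if is_qr(c % p, p) else 1

assert all(decrypt(encrypt(b)) == b for b in (0, 1, 1, 0))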
Xu An Wang, Lunhai Pan, Hao Liu, Xiaoyuan Yang
ePrint Report
At Crypto '94, Chor, Fiat, and Naor introduced traitor tracing (TT) systems, which aim to help content distributors identify pirates. Since their introduction, many traitor tracing schemes have been proposed. However, we observe that almost all traitor tracing systems to date that use probabilistic public-key (or secret-key) encryption as the content distribution algorithm do not consider a basic fact: a malicious encrypter can plant a trapdoor in the randomness of the ciphertexts and later use this trapdoor, or a delegation of it, to construct decoding pirates, which he can sell on the black market for his own benefit.

At first sight, this new attack model may seem too strong to capture real attack scenarios, but we believe it is valuable for at least the following two reasons. (1) Many modern content distribution systems involve at least three different roles: the content provider, the content distributor, and the content consumer. In this framework, the encrypter is not necessarily the content provider (or content owner); it can be a malicious employee of the content provider, or the malicious content distributor or one of its malicious employees. In all these cases, the encrypter has its own interests and a potential incentive to plant a trapdoor in the randomness used to generate ciphertexts. (2) Related work concludes that traitor tracing and differential privacy directly influence each other, and differential privacy (DP) is at the heart of modern privacy-preserving systems. Once this new insider attacker (the encrypter) is taken into account, at least some of the arguments about the relationship between traitor tracing and differential privacy need reconsideration.

Therefore, in this paper we carefully describe this new insider attacker and thoroughly investigate its effect. Our main results are the following. (1) We show that many existing public-key traitor tracing systems with a probabilistic encryption algorithm fail to work correctly when facing such a malicious encrypter, including the BSW, BW, GKSW, LCZ, and BZ traitor tracing systems. Furthermore, we conclude that most existing traitor tracing systems using a probabilistic encryption algorithm cannot resist this attack. (2) When the insider attacker (the encrypter) is considered and traitor tracing schemes use probabilistic encryption algorithms, the conclusion on the tight relationship between traitor tracing and differential privacy may need reconsideration. (3) Using hash functions, we show how to design a TT+ system that resists this type of attack, based on existing traitor tracing systems. Compared with previous traitor tracing systems, our new proposal does not add much overhead and is thus also practical.
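The attack model can be made concrete with a small, self-contained example of our own (it does not reproduce any of the schemes analyzed in the paper): an encrypter who derives the "random" coins of a textbook ElGamal encryption from a secret trapdoor key can later recover every plaintext, and can delegate that ability, without ever holding the receiver's secret key.

# Illustration (ours) of planting a trapdoor in encryption randomness.
# Textbook ElGamal over a toy group; insecure parameters, for exposition only.
import hashlib, secrets

p, g = 1000003, 5                         # toy prime modulus and base
sk = secrets.randbelow(p - 2) + 1         # receiver's secret key
pk = pow(g, sk, p)                        # receiver's public key

k_trap = b"encrypter's private trapdoor"  # known to the encrypter (and its delegates)

def subverted_encrypt(m: int, header: bytes):
    # The "randomness" is a deterministic function of the trapdoor key and a
    # public per-ciphertext header, instead of being sampled at random.
    r = int.from_bytes(hashlib.sha256(k_trap + header).digest(), "big") % (p - 1)
    return pow(g, r, p), (m * pow(pk, r, p)) % p

def receiver_decrypt(c1: int, c2: int) -> int:
    return (c2 * pow(pow(c1, sk, p), -1, p)) % p

def trapdoor_decrypt(c2: int, header: bytes) -> int:
    # Anyone holding k_trap (e.g., a pirate decoder built by the encrypter) can
    # recompute r from the public header and strip the mask pk^r.
    r = int.from_bytes(hashlib.sha256(k_trap + header).digest(), "big") % (p - 1)
    return (c2 * pow(pow(pk, r, p), -1, p)) % p

m, header = 424242, b"ciphertext #17"
c1, c2 = subverted_encrypt(m, header)
assert receiver_decrypt(c1, c2) == m == trapdoor_decrypt(c2, header)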
George Teseleanu
ePrint Report
In this paper, we analyze the Espresso cipher from a related-key chosen-IV perspective. More precisely, we explain how one can obtain key-IV pairs such that Espresso's keystreams either have certain identical bits or are shifted versions of each other. For the first case, we show how to obtain such pairs after $2^{32}$ iterations, while for the second case, we present an algorithm that produces such pairs in $2^{28}$ iterations. Moreover, we show that a minor change in the padding used during the initialization phase leads to a more secure version of the cipher: specifically, changing the padding increases the complexity of our second attack from $2^{28}$ to $2^{34}$. Finally, we show how related IVs can accelerate brute-force attacks, resulting in faster key recovery. Although our work does not have any immediate implications for breaking the Espresso cipher, these observations are relevant in the related-key chosen-IV scenario.
Shuqing Zhang
ePrint Report
We present a new method for multi-party private set intersection against a malicious adversary, which reduces the total communication cost to $O(nl\kappa)$. Additionally, our method can also be used to build a multi-party Circuit-PSI without payload. Our protocol is based on vector oblivious linear evaluation (VOLE) and oblivious key-value stores (OKVS). To meet the requirements of the protocol, we first extend the definition of VOLE to a multi-party version. We then use the new primitive to construct our protocol and prove that it can tolerate all-but-two malicious corruptions.

Our protocol follows the idea of [RS21], where each party encodes its set as a vector, uses VOLE to encrypt the vector, and finally constructs an OPRF to obtain the result. In the multi-party setting, we have to encrypt several vectors at once. As a result, the VOLE used in [RS21] and follow-up papers is not sufficient, which leads to our notion of a multi-party VOLE.
Libo Wang, Ling Song, Baofeng Wu, Mostafizar Rahman, Takanori Isobe
ePrint Report
In this paper, inspired by the work of Beyne and Rijmen at CRYPTO 2022, we explore the exact probability of a $d$-differential in the fixed-key model. The theoretical foundations of our method are based on a special matrix, the quasi-$d$-differential transition matrix, which is a natural extension of the quasidifferential transition matrix. The role of quasi-$d$-differential transition matrices in polytopic cryptanalysis is analogous to that of correlation matrices in linear cryptanalysis. Therefore, the fixed-key probability of a $d$-differential can be exactly expressed as the sum of the correlations of its quasi-$d$-differential trails.
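Schematically, and in our own notation (a rendering of the preceding sentence rather than the paper's exact formalism), for a fixed key $k$ this reads
$$\Pr_k\big[(\delta_1,\dots,\delta_d) \to (\delta'_1,\dots,\delta'_d)\big] \;=\; \sum_{\theta} \mathrm{corr}_k(\theta),$$
where the sum runs over all quasi-$d$-differential trails $\theta$ compatible with the given input and output $d$-differences, in direct analogy with the decomposition of a linear correlation into a sum over the correlations of linear trails.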

We then revisit the boomerang attack from the perspective of 3-differentials. In contrast to previous works, in our work the probability of a boomerang distinguisher can be exactly expressed as the sum of the correlations of its quasi-$3$-differential trails, without any assumptions.

To illustrate our theory, we apply it to the lightweight block cipher GIFT. It is interesting to find that the probability of every optimal 3-differential characteristic of an existing 2-round boomerang is zero, which can be seen as evidence that the security of block ciphers adopting half-round key XOR against differential-like attacks might previously have been somewhat overestimated.
Thomas Pornin
ePrint Report
GLS254 is an elliptic curve defined over a finite field of characteristic 2; it contains a 253-bit prime-order subgroup and supports an endomorphism that can be computed efficiently and helps speed up some typical operations, such as multiplication of a curve element by a scalar. On x86 and ARMv8 platforms, this curve offers the best known performance for elliptic curves at the 128-bit security level.

In this paper we present a number of new results related to GLS254:

- We describe new efficient and complete point doubling formulas (2M+4S) applicable to all ordinary binary curves.

- We apply the previously described (x,s) coordinates to GLS254, enhanced with the new doubling formulas. We obtain formulas that are not only fast, but also complete, and thus allow generic constant-time usage in arbitrary cryptographic protocols.

- Our strictly constant-time implementation multiplies a point by a scalar in 31615 cycles on an x86 Coffee Lake, and 77435 cycles on an ARM Cortex-A55, improving previous records by 13% and 11.7% on these two platforms, respectively.

- We take advantage of the completeness of the formulas to define some extra operations, such as canonical encoding with (x, s) compression, constant-time hash-to-curve, and signatures. Our Schnorr signatures are only 48 bytes long and offer good performance: signature generation in 18374 cycles and verification in 27376 cycles on x86; this is about four times faster than the best reported Ed25519 implementations on the same platform. (A generic sketch of the Schnorr sign/verify flow is given after this list.)

- The very fast implementations leverage the carryless multiplication opcodes offered by the target platforms. We also investigate performance on CPUs that do not offer such an operation, namely a 64-bit RISC-V CPU (SiFive-U74 core) and a 32-bit ARM Cortex-M4 microcontroller. While the achieved performance is substantially poorer, it is not catastrophic; on both platforms, GLS254 signatures are only about 2x to 2.5x slower than Ed25519.
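As referenced in the signature item above, the following is a generic Schnorr sign/verify sketch of our own, written over a toy prime-order subgroup of $\mathbb{Z}_p^*$ rather than over GLS254, purely to illustrate the signature flow the paper benchmarks; the group, hash, and parameters are ours and are not secure.

# Generic Schnorr signature sketch (ours), over a toy prime-order subgroup.
import hashlib, secrets

q = 1019                 # toy subgroup order (prime); GLS254 uses a 253-bit order
p = 2 * q + 1            # toy safe prime, so Z_p^* has a subgroup of order q
g = 4                    # generator of that order-q subgroup

def Hc(R: int, msg: bytes) -> int:
    # Challenge hash, reduced modulo the subgroup order.
    return int.from_bytes(hashlib.sha256(R.to_bytes(8, "big") + msg).digest(), "big") % q

def keygen():
    x = secrets.randbelow(q - 1) + 1
    return x, pow(g, x, p)

def sign(x: int, msg: bytes):
    k = secrets.randbelow(q - 1) + 1
    R = pow(g, k, p)
    e = Hc(R, msg)
    return R, (k + e * x) % q

def verify(pk: int, msg: bytes, sig) -> bool:
    R, s = sig
    return pow(g, s, p) == (R * pow(pk, Hc(R, msg), p)) % p

x, pk = keygen()
sig = sign(x, b"GLS254 benchmark message")
assert verify(pk, b"GLS254 benchmark message", sig)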
Shuhei Nakamura
ePrint Report
The Crossbred algorithm, proposed by Joux and Vitse in 2017, is one of the algorithms for solving systems of polynomial equations. It has been implemented for the Fukuoka MQ challenge, which is related to the security of multivariate cryptography, and holds several records. A framework for estimating its complexity was provided by Chen et al. in 2017. However, it is generally unknown which parameters are actually available. This paper investigates how to select available parameters for the Crossbred algorithm. As a result, we provide formulae that give an available parameter set and estimate the complexity of the Crossbred algorithm.
André Chailloux, Jean-Pierre Tillich
ePrint Report
One of the founding results of lattice-based cryptography is Regev's quantum reduction from the Short Integer Solution problem to the Learning with Errors problem. It has recently been pointed out by Chen, Liu, and Zhandry that this reduction can be made more powerful by replacing the Learning with Errors problem with a quantum equivalent, where the errors are given in quantum superposition. In the context of codes, this can be adapted to a reduction from finding short codewords to a quantum decoding problem for random linear codes.

In this paper we therefore consider the quantum decoding problem, where we are given a superposition of noisy versions of a codeword and want to recover the corresponding codeword. If we measure the superposition, we get back the usual classical decoding problem, for which the best known algorithms, in the constant-rate and constant error-rate regime, run in time exponential in the code length. However, we show here that when the noise rate is small enough, the quantum decoding problem can be solved in quantum polynomial time. Moreover, we also show that the problem can in principle be solved quantumly (albeit not efficiently) for noise rates at which the associated classical decoding problem cannot be solved at all for information-theoretic reasons.

We then revisit Regev's reduction in the context of codes. We show that using our algorithms for the quantum decoding problem in Regev's reduction matches the best known quantum algorithms for the short codeword problem. This shows in some sense the tightness of Regev's reduction when considering the quantum decoding problem and also paves the way for new quantum algorithms for the short codeword problem.
Janik Huth, Antoine Joux
ePrint Report
In this paper, we introduce the subfield bilinear collision problem and use it to construct an identification protocol and a signature scheme. This construction is based on the MPC-in-the-head paradigm and uses the Fiat-Shamir transformation to obtain a signature.
Nan Cheng, Melek Önen, Aikaterini Mitrokotsa, Oubaïda Chouchane, Massimiliano Todisco, Alberto Ibarrondo
ePrint Report
Computing $\Delta(\mathbfit{x},\mathbfit{y}) \geq \tau$, the distance between two vectors $\mathbfit{x}$ and $\mathbfit{y}$ chained with a comparison to a predefined public threshold $\tau$, is an essential functionality that is extensively used in privacy-sensitive applications such as biometric authentication and identification, machine learning algorithms (e.g., linear regression, k-nearest neighbors, etc.), and typo-tolerant password-based authentication. Cosine similarity is one of the most popular distance metrics employed in these settings. In this paper, we investigate the privacy-preserving computation of cosine similarity in a two-party distributed setting, i.e., where a client outsources the distance calculation to two servers while revealing only the result of the comparison to the service provider. We propose two two-party computation (2PC) protocols for cosine similarity followed by comparison to a public threshold, one in the semi-honest and one in the malicious setting. Our protocols combine additive secret sharing with function secret sharing, saving one communication round by employing a new building block to compute the composition of a bit and a binary function $f$, and thus require only two communication rounds under a strong threat model. We evaluate our protocols in the setting of biometric authentication using voice biometrics. Our results show that the proposed protocols are not only efficient but also maintain the same accuracy as the plaintext systems.
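As background for the secret-sharing side of such protocols, the sketch below computes an inner product on additively secret-shared vectors using Beaver triples from a trusted dealer. It is a textbook building block of our own choosing, not the FSS-based protocol of this paper (which additionally handles normalization and the threshold comparison), and all names and parameters are ours.

# Background sketch (ours): inner product over additive secret shares with
# Beaver multiplication triples supplied by a trusted dealer (for the demo).
import secrets

P = 2**61 - 1                                    # arithmetic modulo a prime

def share(v):                                    # split a vector into two additive shares
    s0 = [secrets.randbelow(P) for _ in v]
    s1 = [(x - a) % P for x, a in zip(v, s0)]
    return s0, s1

def beaver_triples(n):                           # trusted dealer: shares of (a, b, a*b)
    a = [secrets.randbelow(P) for _ in range(n)]
    b = [secrets.randbelow(P) for _ in range(n)]
    c = [(ai * bi) % P for ai, bi in zip(a, b)]
    return share(a), share(b), share(c)

def inner_product_2pc(x, y):
    n = len(x)
    (a0, a1), (b0, b1), (c0, c1) = beaver_triples(n)
    x0, x1 = share(x)
    y0, y1 = share(y)
    # The masked values d = x - a and e = y - b are opened; here they are computed
    # directly for brevity, whereas in the protocol each server opens its own share.
    d = [(x0[i] + x1[i] - a0[i] - a1[i]) % P for i in range(n)]
    e = [(y0[i] + y1[i] - b0[i] - b1[i]) % P for i in range(n)]
    # Local share of each product x_i*y_i = c_i + d_i*b_i + e_i*a_i + d_i*e_i,
    # summed locally to obtain shares of the inner product.
    z0 = sum(c0[i] + d[i] * b0[i] + e[i] * a0[i] + d[i] * e[i] for i in range(n)) % P
    z1 = sum(c1[i] + d[i] * b1[i] + e[i] * a1[i] for i in range(n)) % P
    return (z0 + z1) % P                         # reconstruction (for the demo only)

x, y = [3, 1, 4, 1, 5], [2, 7, 1, 8, 2]
assert inner_product_2pc(x, y) == sum(xi * yi for xi, yi in zip(x, y)) % P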