International Association for Cryptologic Research


IACR News

If you have a news item you wish to distribute, please send it to the communications secretary. See also the events database for conference announcements.

Here you can see all recent updates to the IACR webpage. These updates are also available via email and via RSS feed.

11 July 2025

Manideep Thotakura
ePrint Report
Pairing functions uniquely encode pairs of natural numbers into single values, a fundamental operation in mathematics and computer science. This paper presents an alternative approach inspired by geometric visualization—viewing pairs as arrangements of square blocks with missing tiles. Our method achieves packing efficiency comparable to the classical Cantor pairing function and matches the time complexity of both Cantor and Szudzik functions. Encoding is performed in constant time using simple arithmetic operations, while decoding requires square root computations, resulting in efficient inversion. By combining algebraic rigor with intuitive geometric insight, this approach offers a practical and accessible alternative for applications involving data encoding, spatial structures, and combinatorial problems.
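The paper's geometric construction is not reproduced in the abstract, but the classical Cantor pairing function it benchmarks against illustrates the same pattern: constant-time arithmetic encoding and square-root-based decoding. A minimal sketch (illustrative, not the paper's method):

```python
import math

def cantor_pair(x: int, y: int) -> int:
    """Classical Cantor pairing: encode (x, y) in N x N as one natural number."""
    return (x + y) * (x + y + 1) // 2 + y

def cantor_unpair(z: int) -> tuple[int, int]:
    """Invert the Cantor pairing via an integer square root."""
    w = (math.isqrt(8 * z + 1) - 1) // 2  # diagonal index: largest w with w(w+1)/2 <= z
    t = w * (w + 1) // 2
    y = z - t
    x = w - y
    return x, y

# Round-trip check over a small grid
assert all(cantor_unpair(cantor_pair(x, y)) == (x, y)
           for x in range(50) for y in range(50))
```

Encoding uses only additions, a multiplication, and a halving; decoding needs one integer square root, matching the complexity profile the abstract describes.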
Steven Galbraith, Valerie Gilchrist, Damien Robert
ePrint Report
Given two elliptic curves over F_q, computing an isogeny mapping one to the other is conjectured to be classically and quantumly hard. This problem plays an important role in the security of elliptic curve cryptography. In 2024, Galbraith applied recently developed techniques for isogenies to improve the state-of-the-art for this problem.

In this work, we focus on computing ascending isogenies. We give a simplified framework for computing self-pairings, and show how they can be used to improve upon the approach from Galbraith to recover these ascending isogenies and eliminate a heuristic assumption from his work. We show that this new approach gives an improvement to the overall isogeny recovery when the curves have a small crater (super-polynomial in size). We also study how these self-pairings affect the security of the (PEARL)SCALLOP group action, gaining an improvement over the state-of-the-art for some very particular parameter choices. The current SCALLOP parameters remain unaffected.
Orr Dunkelman, Eran Lambooij, Gaëtan Leurent
ePrint Report
In this note we study the proposed cipher Synergy and describe a full-round differential with probability $2^{-21.29}$. The claims have been experimentally verified.
Evangelos Karatsiolis, Franziskus Kiefer, Juliane Krämer, Mirjam Loiero, Christian Tobias, Maximiliane Weishäupl
ePrint Report
With the advancing standardization of post-quantum cryptographic schemes, the need for preparing the IT security infrastructure for integrating post-quantum schemes increases. The focus of this work is a specific part of the IT security infrastructure, namely public key infrastructures. For public certification authorities, it is crucial to guarantee the quality of public keys certified by them. To this end, linting is deployed, which describes the process of analyzing the content of a certificate with respect to predefined rules, the so-called lints. In this work, we initiate the study of lints for post-quantum cryptography. As a starting point, we choose lattice-based schemes and analyze the public keys of the NIST standards ML-KEM and ML-DSA. We base our analyses on the NIST FIPS standards and IETF documents. We formally describe the identified lints and classify them with respect to the property of the public key that the lint checks. We implement the lints for a common X.509 certificate linter and provide an open-source tool.
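As an illustration of the kind of rule a lint can encode (a hypothetical toy check, not taken from the paper's tool), here is a key-length lint; the expected byte lengths are the encapsulation-key and public-key sizes specified in FIPS 203 (ML-KEM) and FIPS 204 (ML-DSA):

```python
# Hypothetical key-length lint, in the spirit of per-key checks described above.
# Expected sizes: encapsulation-key lengths from FIPS 203 and public-key
# lengths from FIPS 204.
EXPECTED_KEY_BYTES = {
    "ML-KEM-512": 800, "ML-KEM-768": 1184, "ML-KEM-1024": 1568,
    "ML-DSA-44": 1312, "ML-DSA-65": 1952, "ML-DSA-87": 2592,
}

def lint_key_length(alg: str, public_key: bytes) -> list[str]:
    """Return a list of lint findings (empty list means the check passed)."""
    findings = []
    expected = EXPECTED_KEY_BYTES.get(alg)
    if expected is None:
        findings.append(f"unknown algorithm identifier: {alg}")
    elif len(public_key) != expected:
        findings.append(
            f"{alg}: key is {len(public_key)} bytes, expected {expected}")
    return findings
```

A real linter would additionally check the semantic properties of the key material (e.g. that encoded polynomial coefficients are in range), which is the focus of the paper.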
Sven Argo, Marloes Venema, Adrian Ackermann, Tim Güneysu
ePrint Report
Attribute-based encryption (ABE) is a versatile primitive that has been considered in many applications to enforce access control cryptographically. To actually benefit from ABE in practice, we require implementations of schemes that satisfy all the properties that are needed. Many theoretical advancements have been made to attain such properties, ultimately resulting in powerful abstractions such as pair encodings. To build an ABE scheme, we use a compiler (in the theoretical sense), which transforms a provably secure pair encoding scheme into a provably secure ABE scheme. Although several such compilers have been introduced, they all abstract away many details that are relevant for engineers, which can hinder the implementation of schemes in practice. To address this problem, we propose pracy, which is a tool that automatically implements an ABE scheme from an input pair encoding scheme. To achieve this, we first note that we need to overcome a general issue affecting any automation effort – including automated optimization and security analysis – in the field of pairing-based cryptography. In particular, there exist no parsers that properly model the interaction between the predicate and the pair encodings. Therefore, we devise a new formal model and type system, which capture this interaction in a way that is compatible with automated implementation efforts. To illustrate the feasibility of our model and system, we construct pracy, which is a (practical) compiler in Python that can implement ABE schemes in multiple target programming languages such as Python and C/C++. With pracy, we not only make the implementation of ABE schemes from pair encodings more accessible to practitioners, but also realize the potential that pair encodings have to simplify implementation efforts.
Thomas Pornin
ePrint Report
In this short note, we describe some further improvements to the key pair generation process for the Falcon and Hawk lattice-based signature schemes, and for the BAT key encapsulation scheme, in a fully constant-time way and without any use of floating-point operations. Our new code is slightly faster than our previous implementation, and, more importantly for small embedded systems, uses less RAM space.
Pantelimon Stanica, Ranit Dutta, Bimal Mandal
ePrint Report
This paper introduces {\em truncated inner $c$-differential cryptanalysis}, a novel technique that for the first time enables the practical application of $c$-differential uniformity to block ciphers. While Ellingsen et al. (IEEE Trans. Inf. Theory, 2020) established the notion of $c$-differential uniformity using $(F(x\oplus a), cF(x))$, a key challenge remained: multiplication by $c$ disrupts the structural properties essential for block cipher analysis, particularly key addition.

We resolve this challenge by developing an \emph{inner} $c$-differential approach where multiplication by $c$ affects the input: $(F(cx\oplus a), F(x))$. We prove that the inner $c$-differential uniformity of a function $F$ equals the outer $c$-differential uniformity of $F^{-1}$, establishing a fundamental duality. This modification preserves cipher structure while enabling practical cryptanalytic applications.

Our main contribution is a comprehensive multi-faceted statistical-computational framework, implementing truncated $c$-differential analysis against the full 9-round Kuznyechik cipher (the inner $c$-differentials are immune to the key whitening at the backend). Through extensive computational analysis involving millions of differential pairs, we demonstrate statistically significant non-randomness across all tested round counts. For the full 9-round cipher, we identify multiple configurations triggering critical security alerts, with bias ratios reaching $1.7\times$ and corrected p-values as low as $1.85 \times 10^{-3}$, suggesting insufficient security margin against this new attack vector. This represents the first practical distinguisher against the full 9-round Kuznyechik.
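A small illustration of the counting underlying this notion (not the paper's framework): computing inner $c$-differential uniformity for a toy S-box over GF($2^4$), here the field-inversion map, restricted to nonzero $a$ for simplicity:

```python
def gf16_mul(a: int, b: int) -> int:
    """Multiply in GF(2^4) with reduction polynomial x^4 + x + 1."""
    r = 0
    while b:
        if b & 1:
            r ^= a
        a <<= 1
        if a & 0x10:
            a ^= 0x13  # reduce by x^4 + x + 1
        b >>= 1
    return r

def inner_c_differential_uniformity(F: list[int], c: int) -> int:
    """Max over (a, b) with a != 0 of #{x : F(c*x ^ a) ^ F(x) = b},
    following the inner-differential form (F(cx + a), F(x)) above."""
    n = len(F)
    best = 0
    for a in range(1, n):
        counts = [0] * n
        for x in range(n):
            counts[F[gf16_mul(c, x) ^ a] ^ F[x]] += 1
        best = max(best, max(counts))
    return best

# Toy S-box: inversion in GF(2^4), with 0 mapped to 0.
INV = [0] + [next(y for y in range(1, 16) if gf16_mul(x, y) == 1)
             for x in range(1, 16)]
```

For $c = 1$ this reduces to ordinary differential uniformity; for other $c$ the multiplication happens on the input side, which is what preserves compatibility with key addition.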
Peter Gutmann, Stephan Neuhaus
ePrint Report
This paper presents implementations that match and, where possible, exceed current quantum factorisation records using a VIC-20 8-bit home computer from 1981, an abacus, and a dog. We hope that this work will inspire future efforts to match any further quantum factorisation records, should they arise.
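In the same classical spirit, even plain trial division comfortably handles the small semiprimes (such as 15 and 21) used in well-known quantum factorisation demonstrations; a minimal sketch:

```python
def trial_division(n: int) -> list[int]:
    """Factor n by trial division -- no qubits (or dogs) required."""
    factors, d = [], 2
    while d * d <= n:
        while n % d == 0:
            factors.append(d)
            n //= d
        d += 1
    if n > 1:
        factors.append(n)  # remaining cofactor is prime
    return factors

assert trial_division(15) == [3, 5]
assert trial_division(21) == [3, 7]
```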

09 July 2025

Timo Glaser
ePrint Report
In 2000, Pliam showed that there does not exist an upper or lower bound in terms of Shannon entropy alone for the number of guesses required in order to guess some randomly sampled element $s$ with certainty $0
Han Chen, Tao Huang, Phuong Pham, Shuang Wu
ePrint Report
HiAE is a recently proposed high-throughput authenticated encryption algorithm that achieves exceptional performance on both x86 and ARM architectures. Following its publication, several cryptanalysis papers have claimed that HiAE’s 256-bit encryption security is broken under the nonce-respecting model. In this note, we clarify that the claimed attacks rely critically on submitting forged-tag decryption queries — a type of behavior explicitly excluded by HiAE’s original security model.

HiAE was designed under a standard nonce-based AEAD setting without decryption oracle access, offering 256-bit security against key and state recovery, and 128-bit security against forgery. This design approach follows the same principle as well-known schemes such as AEGIS and MORUS.

The conclusion that HiAE is broken is based on a misinterpretation of its security model, as the attacks rely on conditions that the design explicitly excludes.
Vivian Fang, Emma Dauterman, Akshay Ravoor, Akshit Dewan, Raluca Ada Popa
ePrint Report
Transparency logs are critical for a wide range of applications, from web certificates to end-to-end encrypted messaging. Today, many transparency log designs exist for various applications and workloads, and developers must fully understand the design space to find the best design for their needs. Worse, if a developer needs a transparency log for an application and workload without an existing transparency log, the developer (who might not be an expert) must design a new log. To address these challenges, we introduce the paradigm of a configurable transparency log, which takes as input a description of the application workload and constraints of different entities and automatically outputs a transparency log uniquely suited to the application. We present the first configurable transparency log design, LegoLog, which we implement and empirically evaluate end-to-end for three specialized transparency logs. We also show that LegoLog can express six different applications, and we compare the asymptotic complexity of LegoLog and existing transparency logs tailored to individual applications. We find that configurability does not come at the cost of performance: LegoLog can capture a variety of applications while performing comparably to existing, special-purpose transparency logs.
Shihui Fu
ePrint Report
Argument systems are a fundamental ingredient in many cryptographic constructions. The best-performing argument systems to date largely rely on a trusted setup, which is undesirable in trust-minimized applications. While transparent argument systems avoid this trust assumption, they have historically been inefficient, typically exhibiting polylogarithmic proof sizes compared to their trusted counterparts. In 2023, Arun et al. (PKC 2023) constructed the first transparent constant-sized polynomial commitment scheme (PCS), leading to transparent constant-sized arguments. However, the evaluation proof still comprises 66 group elements in a group of unknown order (GUO), rendering it rather impractical. In this work, we address this challenge by presenting a set of novel batching and aggregation techniques tailored for proofs of knowledge of ranges in GUOs. These techniques may also be of independent interest and are readily applicable to enhance and shorten other existing schemes in GUOs. Consequently, by applying these techniques, we immediately achieve an improved PCS with an evaluation proof consisting of only 10 group elements---an impressive 85% reduction. To our knowledge, this represents the shortest PCS in the transparent setting. Thus compiling known information-theoretic proof systems using our improved PCS yields highly compact transparent argument systems when instantiated in a class group, which is more practical than prior constant-sized schemes.
Sébastien Canard, Liam Medley, Duy Nguyen, Duong Hieu Phan
ePrint Report
In electronic voting systems, guaranteeing voter anonymity is essential. One primary method to ensure this is the use of a mix-net, in which a set of mix-servers sequentially shuffle a set of encrypted votes, and generate proofs that a correct permutation has been applied. Whilst mix-nets offer advantages over alternative approaches, their traditional use during the tallying phase introduces a significant robustness bottleneck: the process is inherently sequential and critically depends on trusted authorities to perform shuffling and decryption. Any disruption can prevent the final result from being revealed.

In this work, we propose offline mixing OMIX, the first voting framework to support a mix-net-based system in which trustees never handle encrypted votes, while also ensuring that each voter's cost is independent of the total number of voters. In particular, the contributions of permutations by mix-servers and decryption shares by trustees are completed and publicly verified before any vote is cast. This eliminates the need for their participation during tallying and enables the first scalable, mix-net-based, and self-tallying voting protocol in the sense of Kiayias and Yung (PKC'02).

At the core of OMIX is a distributed key-generation mechanism: each voter locally generates a private voting key and registers a constant-size set of basis public keys. These are permuted and partially decrypted in an offline phase, resulting in a final public decryption key that reveals votes in shuffled order. Our construction leverages the homomorphic and structure-preserving properties of function-hiding inner-product functional encryption, combined with standard primitives, to achieve self-tallying, client scalability, ballot privacy and other voting properties. To support the new mixing structure introduced by OMIX, we also develop a compact and verifiable offline mix-net, based on an enhanced linearly homomorphic signature scheme. This latter primitive may be of independent interest.
Jaisal Ahmadullah
ePrint Report
Steganography is the practice of concealing messages or information within other non-secret text or media to avoid detection. A central challenge in steganography is balancing payload size with detectability and media constraints—larger payloads increase the risk of detection and require proportionally larger or higher-capacity carriers. In this paper, we introduce a novel approach that combines Huffman coding, suitable dictionary identification, and large language model (LLM) rephrasing techniques to significantly reduce payload size. This enables more efficient use of limited-capacity carriers, such as images, while minimizing the visual or statistical footprint. Our method allows for the embedding of larger payloads into fixed-size media, addressing a key bottleneck in traditional steganographic systems. By optimizing payload compression prior to encoding, we improve both the stealth and scalability of steganographic communication.
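The paper's exact pipeline is not given in the abstract, but the Huffman-coding stage it builds on can be sketched as follows (an illustrative implementation, not the authors' code): frequent symbols get short codewords, shrinking the bitstring that must be hidden in the carrier.

```python
import heapq
from collections import Counter

def huffman_codes(text: str) -> dict[str, str]:
    """Build a Huffman code table for the symbols occurring in `text`."""
    freq = Counter(text)
    if len(freq) == 1:                       # degenerate single-symbol case
        return {next(iter(freq)): "0"}
    # Heap entries: (count, unique tiebreaker, partial code table).
    heap = [(n, i, {sym: ""}) for i, (sym, n) in enumerate(freq.items())]
    heapq.heapify(heap)
    tick = len(heap)
    while len(heap) > 1:
        n1, _, c1 = heapq.heappop(heap)
        n2, _, c2 = heapq.heappop(heap)
        merged = {s: "0" + c for s, c in c1.items()}
        merged.update({s: "1" + c for s, c in c2.items()})
        heapq.heappush(heap, (n1 + n2, tick, merged))
        tick += 1
    return heap[0][2]

def compress(text: str) -> str:
    """Encode `text` as a bitstring using its own Huffman table."""
    codes = huffman_codes(text)
    return "".join(codes[ch] for ch in text)
```

For example, `"aaaabbc"` compresses to 10 bits instead of 56 at one byte per character; the resulting code is prefix-free, so the payload can be decoded unambiguously given the table.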
Sven Argo, Marloes Venema, Doreen Riepel, Tim Güneysu, Diego F. Aranha
ePrint Report
Since attribute-based encryption (ABE) was proposed in 2005, it has established itself as a valuable tool in the enforcement of access control. For practice, it is important that ABE satisfies many desirable properties such as multi-authority and negations support. Nowadays, we can attain these properties simultaneously, but none of these schemes have been implemented. Furthermore, although simpler schemes have been optimized extensively on a structural level, there is still much room for improvement for these more advanced schemes. However, even if we had schemes with such structural improvements, we would not have a way to benchmark and compare them fairly to measure the effect of such improvements. The only framework that aims to achieve this goal, ABE Squared (TCHES '22), was designed with simpler schemes in mind.

In this work, we propose the ABE Cubed framework, which provides advanced benchmarking extensions for ABE Squared. To motivate our framework, we first apply structural improvements to the decentralized ciphertext-policy ABE scheme supporting negations presented by Riepel, Venema and Verma (ACM CCS '24), which results in five new schemes with the same properties. We use these schemes to uncover and bridge the gaps in the ABE Squared framework. In particular, we observe that advanced schemes depend on more "variables" that affect the schemes' efficiency in different dimensions. Whereas ABE Squared only considered one dimension (as was sufficient for the schemes considered there), we devise a benchmarking strategy that allows us to analyze the schemes in multiple dimensions. As a result, we obtain a more complete overview on the computational efficiency of the schemes, and ultimately, this allows us to make better-founded choices about which schemes provide the best efficiency trade-offs for practice.
Honglin Shao, Yuejun Liu, Mingyao Shao, Yongbin Zhou
ePrint Report
NTRU-based structured lattices underpin several standardized post-quantum cryptographic schemes, most notably the Falcon signature algorithms. While offering compactness and efficiency, the algebraic structure of NTRU lattices introduces new vulnerabilities under physical attacks, where partial secret key leakage may occur.

This work addresses the problem of full key recovery in NTRU-based schemes when adversaries obtain partial information through side-channel or fault attacks. Existing leakage-aware frameworks, including the DDGR estimator and the approach of May and Nowakowski, either lack scalability or are limited to structured, single-source leakage on one secret vector. These constraints make them ineffective against practical leakage patterns in NTRU settings.

We propose a unified and scalable framework for recovering NTRU secret keys under partial leakage. Our method supports diverse hint types, such as perfect hints, modular hints, and low-bit leakage, and enables joint integration of leakage across both secret polynomials $f$ and $g$. At its core, the framework uses a dimension-reduction strategy to eliminate known coefficients and reduce the problem to a lower-dimensional NTRU instance suitable for lattice reduction. Additionally, we introduce a transformation that converts hints on $g$ into modular constraints on $f$, allowing unified hint embedding.

We demonstrate practical attacks on Falcon using NIST reference implementations. Leaking 400 coefficients of $f$ in Falcon-512 reduces the required BKZ block size from over 350 to 38, enabling full key recovery within 6 hours. Compared to MN23, our method achieves significant speedups: $5.83\times$ for Falcon-512 with 400 leaked coefficients, and over $15\times$ for Falcon-1024 with 910 leaked coefficients. These results highlight the efficiency and scalability of our framework and the importance of leakage-resilient design for structured NTRU lattices.
Christopher Battarbee, Christoph Striecks, Ludovic Perret, Sebastian Ramacher, Kevin Verhaeghe
ePrint Report
Authenticated Key Exchange (AKE) between any two entities is one of the most important security protocols available for securing our digital networks and infrastructures. In PQCrypto 2023, Bruckner, Ramacher and Striecks proposed a novel hybrid AKE (HAKE) protocol dubbed Muckle+ that is particularly useful in large quantum-safe networks consisting of a large number of nodes. Their protocol is hybrid in the sense that it allows key material from conventional, post-quantum, and quantum cryptography primitives to be incorporated into a single end-to-end authenticated shared key.

To achieve the desired authentication properties, Muckle+ utilizes post-quantum digital signatures. However, available instantiations of such signature schemes are not yet efficient enough compared to their post-quantum key-encapsulation mechanism (KEM) counterparts, particularly in large networks with potentially several connections in a short period of time.

To mitigate this gap, we propose Muckle#, which pushes the efficiency boundaries of currently known HAKE constructions. Muckle# uses post-quantum key-encapsulation mechanisms for implicit authentication, inspired by recent work in the area of Transport Layer Security (TLS) protocols, particularly KEMTLS (CCS'20).

We port those ideas to the HAKE framework and develop novel proof techniques on the way. Due to our KEM-based approach, the resulting protocol has a slightly different message flow compared to prior work that we carefully align with the HAKE framework and which makes our changes to Muckle+ non-trivial. Lastly, we evaluate the approach by a prototypical implementation and a direct comparison with Muckle+ to highlight the efficiency gains.
Orr Dunkelman, Shibam Ghosh
ePrint Report
ARADI is a low-latency block cipher introduced by the U.S. National Security Agency (NSA), targeting secure and efficient memory encryption. However, unlike most academic cipher proposals, the design rationale behind ARADI has not been made public, leaving its security to be only assessed through independent analysis. In this work, we present improved key-recovery attacks on up to 12 out of 16 rounds of ARADI in the single-key setting — advancing the best known attacks by two rounds. Our techniques build upon the ZeroSum distinguisher framework and leverage the Fast Hadamard Transform (FHT). A central insight in our attacks is that the linear layer of ARADI exhibits weak diffusion. This structural property allows partial decryption with only a subset of the round keys, significantly reducing the key-guessing space.
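The Fast Hadamard Transform leveraged in these attacks is a standard tool for evaluating all key-guess correlations at once; a minimal sketch of the in-place butterfly (illustrative, not the paper's implementation):

```python
def fht(v: list[int]) -> list[int]:
    """Fast Hadamard (Walsh) transform of a length-2^n vector of counters.
    Runs in O(n * 2^n) instead of the naive O(4^n)."""
    v = list(v)                       # work on a copy
    n = len(v)
    assert n & (n - 1) == 0, "length must be a power of two"
    h = 1
    while h < n:
        for i in range(0, n, 2 * h):
            for j in range(i, i + h):
                a, b = v[j], v[j + h]
                v[j], v[j + h] = a + b, a - b   # butterfly step
        h *= 2
    return v
```

Applying the transform twice recovers the input scaled by the vector length, reflecting that the Hadamard matrix is (up to scaling) its own inverse; in key-recovery attacks this lets all $2^n$ partial-key hypotheses be scored in one pass over the counter table.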
Michelle Yeo, Haoqian Zhang
ePrint Report
Censorship resilience is a fundamental assumption underlying the security of blockchain protocols. Additionally, the analysis of blockchain security from an economic and game theoretic perspective has been growing in popularity in recent years. In this work, we present a surprising rational censorship attack on blockchain censorship resilience when we adopt the analysis of blockchain security from a game theoretic lens and assume all users are rational. In our attack, a colluding group with sufficient voting power censors the remaining nodes such that the group alone can gain all the rewards from maintaining the blockchain. We show that if nodes are rational, coordinating this attack requires only a public read-write blackboard, and we formally model the attack using a game theoretic framework. Furthermore, we note that to ensure the success of the attack, nodes need to know the total true voting power held by the colluding group. We prove that the strategy of joining the rational censorship attack and honestly declaring one's power is a subgame perfect equilibrium in the corresponding extensive form game induced by our attack. Finally, we discuss the implications of the attack on blockchain users and protocol designers as well as some potential countermeasures.

07 July 2025

Seoul, South Korea, 19 November - 21 November 2025
Event Calendar
Event date: 19 November to 21 November 2025
Submission deadline: 5 September 2025
Notification: 29 October 2025