## IACR News

Updates on the COVID-19 situation are on the Announcement channel.

Here you can see all recent updates to the IACR webpage.

#### 03 December 2021

###### Nilanjan Datta, Avijit Dutta, Kushankur Dutta
ePrint Report
In CRYPTO'16, Cogliati and Seurin proposed a block-cipher-based nonce-based MAC, called {\em Encrypted Wegman-Carter with Davies-Meyer} (\textsf{EWCDM}), that gives $2n/3$-bit MAC security in the nonce-respecting setting and $n/2$-bit security in the nonce-misuse setting, where $n$ is the block size of the underlying block cipher. However, this construction requires two independent block cipher keys. In CRYPTO'18, Datta et al. proposed a single-keyed block-cipher-based nonce-based MAC, called {\em Decrypted Wegman-Carter with Davies-Meyer} (\textsf{DWCDM}), that also provides $2n/3$-bit MAC security in the nonce-respecting setting and $n/2$-bit security in the nonce-misuse setting. However, the drawback of \textsf{DWCDM} is that it accepts only $2n/3$-bit nonces; in fact, the authors showed that \textsf{DWCDM} cannot achieve beyond-birthday-bound security with $n$-bit nonces. In this paper, we prove that \textsf{DWCDM} with $3n/4$-bit nonces provides MAC security up to $O(2^{3n/4})$ MAC queries against all nonce-respecting adversaries. We also improve the MAC bound of \textsf{EWCDM} from $2n/3$ bits to $3n/4$ bits. The backbone of these two results is a refined treatment of extended mirror theory, which systematically estimates the number of solutions to a system of bivariate affine equations and non-equations; we apply it in the security proofs of the constructions to achieve $3n/4$-bit security.
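The EWCDM structure described above can be sketched with toy stand-ins: the block cipher is modeled as a seeded random permutation and the universal hash as a simple polynomial hash. Both stand-ins are illustrative only and have no security value, and all names here are hypothetical:

```python
import random

N_BITS = 16
SIZE = 1 << N_BITS  # toy block size: n = 16 bits

def toy_prp(key):
    # stand-in for a block cipher E_K: a seeded random permutation (NOT secure)
    rng = random.Random(key)
    perm = list(range(SIZE))
    rng.shuffle(perm)
    return perm

def toy_hash(kh, msg):
    # stand-in for the universal hash H_Kh: a polynomial hash mod 2^n (illustrative only)
    h = 0
    for b in msg:
        h = (h * kh + b) % SIZE
    return h

def ewcdm_tag(k1, k2, kh, nonce, msg):
    # EWCDM: T = E_K2( E_K1(N) xor N xor H_Kh(M) ) -- note the two independent cipher keys
    e1, e2 = toy_prp(k1), toy_prp(k2)
    return e2[e1[nonce] ^ nonce ^ toy_hash(kh, msg)]
```

The need for two independent keys `k1` and `k2` is exactly the cost that DWCDM removes, by replacing the outer encryption with a decryption under the same key.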
###### Stefano Tessaro, Xihu Zhang
ePrint Report
A substantial effort has been devoted to proving optimal bounds for the security of key-alternating ciphers with independent sub-keys in the random permutation model (e.g., Chen and Steinberger, EUROCRYPT '14; Hoang and Tessaro, CRYPTO '16). While common in the study of multi-round constructions, the assumption that sub-keys are truly independent is not realistic, as these are generally highly correlated and generated from shorter keys.

In this paper, we show the existence of non-trivial distributions of limited independence for which a t-round key-alternating cipher achieves optimal security. Our work is a natural continuation of the work of Chen et al. (CRYPTO '14), which considered the case of t = 2 when all sub-keys are identical. Here, we show that key-alternating ciphers remain secure for a large class of (t-1)-wise and (t-2)-wise independent distributions of sub-keys.

Our proofs proceed by generalizations of the so-called Sum-Capture Theorem, which we prove using Fourier-analytic techniques.
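For readers unfamiliar with the object under study: a t-round key-alternating cipher simply interleaves XORs of sub-keys with public permutations. A tiny sketch over an 8-bit domain, where seeded random permutations model the random permutation model (all names are hypothetical):

```python
import random

N_BITS = 8
SIZE = 1 << N_BITS  # toy 8-bit block

def random_perm(seed):
    # a public random permutation (models the random permutation model, tiny domain)
    rng = random.Random(seed)
    p = list(range(SIZE))
    rng.shuffle(p)
    return p

def kac_encrypt(x, subkeys, perms):
    # t-round key-alternating cipher: x -> P_t(... P_1(x ^ k_0) ...) ^ k_t
    # (t+1 sub-keys, t public permutations)
    for k, p in zip(subkeys[:-1], perms):
        x = p[x ^ k]
    return x ^ subkeys[-1]
```

The question the paper studies is how much the t+1 sub-keys may be correlated (e.g., derived from one short key) while the construction retains optimal security; the sketch above takes them as arbitrary inputs.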
###### Alexander Bienstock, Yevgeniy Dodis, Yi Tang
ePrint Report
Multicast Key Agreement (MKA) is a long-overlooked natural primitive of large practical interest. In traditional MKA, an omniscient group manager privately distributes secrets over an untrusted network to a dynamically-changing set of group members. The group members are thus able to derive shared group secrets across time, with the main security requirement being that only current group members can derive the current group secret. There indeed exist very efficient MKA schemes in the literature that utilize symmetric-key cryptography. However, they lack formal security analyses, efficiency analyses regarding dynamically changing groups, and more modern, robust security guarantees regarding user state leakages: forward secrecy (FS) and post-compromise security (PCS). The former ensures that group secrets prior to state leakage remain secure, while the latter ensures that after such leakages, users can quickly recover security of group secrets via normal protocol operations.

More modern Secure Group Messaging (SGM) protocols allow a group of users to asynchronously and securely communicate with each other, as well as add and remove each other from the group. SGM has received significant attention recently, including in an effort by the IETF Messaging Layer Security (MLS) working group to standardize an eponymous protocol. However, the group key agreement primitive at the core of SGM protocols, Continuous Group Key Agreement (CGKA), achieved by the TreeKEM protocol in MLS, suffers from bad worst-case efficiency and heavily relies on less efficient (than symmetric-key cryptography) public-key cryptography. We thus propose that in the special case of a group membership change policy which allows a single member to perform all group additions and removals, an upgraded version of classical Multicast Key Agreement (MKA) may serve as a more efficient substitute for CGKA in SGM.

We therefore present rigorous, stronger MKA security definitions that provide increasing levels of security in the case of both user and group manager state leakage, and that are suitable for modern applications, such as SGM. We then construct a formally secure MKA protocol with strong efficiency guarantees for dynamic groups. Finally, we run experiments which show that the left-balanced binary tree structure used in TreeKEM can be replaced with red-black trees in MKA for better efficiency.
###### Omid Bazangani, Alexandre Iooss, Ileana Buhan, Lejla Batina
ePrint Report
Side-channel leakage simulators allow testing the resilience of cryptographic implementations to power side-channel attacks without a dedicated setup. The main challenge in their large-scale deployment is the limited support for target devices, a direct consequence of the effort required for reverse engineering microarchitecture implementations. We introduce ABBY, the first solution for the automated creation of fine-grained leakage models. The main innovation of ABBY is the training framework, which can automatically characterize the microarchitecture of the target device and is portable to other platforms. Evaluation of ABBY on real-world crypto implementations exhibits comparable performance to other state-of-the-art leakage simulators.
ePrint Report
As a recent fault-injection attack, SIFA defeats most known countermeasures. Although error-correcting codes have been shown to be effective against SIFA, they generally require large redundancy to correct even a few bits. In this work, we propose a hybrid construction with the ability to detect and correct injected faults at the same time. We provide a general implementation methodology which guarantees the correction of up to $t_c$-bit faults and the detection of at most $t_d$ faulty bits. Exhaustive evaluation of our constructions with the open-source fault diagnostic tool VerFI indicates that our designs achieve the desired goals.

#### 01 December 2021

###### Tomer Ashur, Mohsin Khan, Kaisa Nyberg
ePrint Report
The goal of this paper is to investigate linear approximations of random functions and permutations. Our motivation is twofold. First, before the distinguishability of a practical cipher from an ideal one can be analysed, the cryptanalyst must have an accurate understanding of the statistical behaviour of the ideal cipher. Second, this issue has been neglected in both older and more recent studies, particularly when multiple linear approximations are used simultaneously. Traditional models have been based on average behaviour and simplified using further assumptions such as independence of the linear approximations. Multidimensional cryptanalysis was introduced to avoid making artificial assumptions about the statistical independence of linear approximations. On the other hand, it has the drawback of including many trivial approximations that do not contribute to the attack and merely waste time and memory. We show for the first time in this paper that the trivial approximations reduce the degrees of freedom of the related χ² distribution. Previously, affine linear cryptanalysis was proposed to allow removing trivial approximations while, at the same time, admitting a solid statistical model. In this paper, we identify another type of multidimensional linear approximation, called the Davies-Meyer approximation, which has similar advantages, and present full statistical models for both the affine and the Davies-Meyer type of multidimensional linear approximations. The new models given in this paper are realistic, accurate and easy to use. They are backed up by standard statistical tools such as Pearson's χ² test and the finite population correction, and are demonstrated to work accurately in practical examples.
###### Qiang Tang
ePrint Report
After its debut with Bitcoin in 2009, Blockchain has attracted enormous attention and been used in many different applications as a trusted black box. Many applications focus on exploiting the Blockchain-native features (e.g. trust from consensus, and smart contracts) while paying less attention to the application-specific requirements. In this paper, we initiate a systematic study of applications in the education and training sector, where Blockchain is leveraged to combat diploma fraud. We present a general system structure for digitized diploma management systems and identify both functional and non-functional requirements. Our analysis shows that all existing Blockchain-based systems fall short of meeting these requirements. Inspired by the analysis, we propose a Blockchain-facilitated solution by leveraging some basic cryptographic primitives and data structures. Follow-up analysis shows that our solution respects all the identified requirements very well.
###### Shweta Agrawal, Elena Kirshanova, Damien Stehle, Anshu Yadav
ePrint Report
Blind signatures have numerous applications in privacy-preserving technologies. While there exist many practical blind signatures from number-theoretic assumptions, the situation is far less satisfactory from post-quantum assumptions. In this work, we make advances towards making lattice-based blind signatures practical. We introduce two round-optimal constructions in the random oracle model, and provide guidance towards their concrete realization as well as efficiency estimates.

The first scheme relies on the homomorphic evaluation of a lattice-based signature scheme. This requires an ${\sf HE}$-compatible lattice-based signature. For this purpose, we show that the rejection step in Lyubashevsky's signature is unnecessary if the working modulus grows linearly in $\sqrt{Q}$, where $Q$ is an a priori bound on the number of signature queries. Compared to the state-of-the-art scheme of Hauck et al. [CRYPTO'20], this blind signature compares very favorably in all aspects except for signer cost. Compared to a lattice-based instantiation of Fischlin's generic construction, it is much less demanding on the user and verifier sides.

The second scheme relies on the Gentry, Peikert and Vaikuntanathan signature [STOC'08] and non-interactive zero-knowledge proofs for linear relations with small unknowns, which are significantly more efficient than their general-purpose counterparts. Its security stems from a new and arguably natural assumption which we introduce: ${\sf one}$-${\sf more}$-${\sf ISIS}$. This assumption can be seen as a lattice analogue of the one-more-RSA assumption by Bellare et al. [JoC'03]. To gain confidence, we provide a detailed overview of diverse attack strategies. The resulting blind signature outperforms all the aforementioned schemes from most angles and achieves practical overall performance.
###### Karim Eldefrawy, Tancrède Lepoint, Antonin Leroux
ePrint Report
Secure multiparty computation (MPC) has recently been increasingly adopted to secure cryptographic keys in enterprises, cloud infrastructure, and cryptocurrency and blockchain-related settings such as wallets and exchanges. Using MPC in blockchains and other distributed systems highlights the need to consider dynamic settings. In such dynamic settings, parties, and potentially even parameters of underlying secret sharing and corruption tolerance thresholds of sub-protocols, may change over the lifetime of the protocol. In particular, stronger threat models -- in which \emph{mobile} adversaries control a changing set of parties (up to $t$ out of $n$ involved parties at any instant), and may eventually corrupt \emph{all $n$ parties} over the course of a protocol's execution -- are becoming increasingly important for such real world deployments; secure protocols designed for such models are known as Proactive MPC (PMPC).

In this work, we construct the first efficient PMPC protocol for \emph{dynamic} groups (where the set of parties changes over time) secure against a \emph{dishonest majority} of parties. Our PMPC protocol only requires $O(n^2)$ (amortized) communication per secret, compared to existing PMPC protocols that require $O(n^4)$ and only consider static groups with dishonest majorities. At the core of our PMPC protocol is a new efficient technique to perform multiplication of secret shared data (shared using a bivariate scheme) with $O(n \sqrt{n})$ communication with security against a dishonest majority without requiring pre-computation. We also develop a new efficient bivariate batched proactive secret sharing (PSS) protocol for dishonest majorities, which may be of independent interest. This protocol enables multiple dealers to contribute different secrets that are efficiently shared together in one batch; previous batched PSS schemes required all secrets to come from a single dealer.

#### 30 November 2021

###### École polytechnique fédérale de Lausanne (EPFL)
Job Posting
EPFL invites applications for postdoctoral positions in the Theory Group. Applications will be reviewed by theory faculty (as listed on EPFL's theory website at https://theory.epfl.ch/). Faculty may reach out to applicants on a rolling basis, though we encourage inquiries to be made as soon as possible.

The application must include the following materials:
• Cover letter (identify one or more faculty of interest and provide availability).
• CV (including a ranked list of at least three writers for letters of reference).
• Research statement (covering research interests, past research focus, and future research directions).
Applications must be submitted through the application system in the embedded link, and recommendation letters can be submitted by your writers to tcs-postdoc@groupes.epfl.ch.

EPFL is located in Lausanne (Switzerland) and ranks among the world’s top scientific universities. In French-speaking Lausanne, English is generally well spoken and is the main language at EPFL. Postdoctoral positions come with a competitive salary for 1 year (83'600 CHF with yearly increments), renewable up to a maximum of 4 years.

Closing date for applications:

Contact: tcs-postdoc@groupes.epfl.ch

###### University of Stuttgart, Institute of Information Security
Job Posting
The Institute of Information Security at University of Stuttgart offers

fully-funded Postdoc and PhD positions in formal verification.

Successful candidates are expected to carry out research on tool-supported formal verification methods for security-critical systems and security protocols in our new REPROSEC initiative (https://reprosec.org/). See, e.g., our work at ACM CCS 2021 and EuroS&P 2021 on DY*.

The positions are available immediately with an internationally competitive salary (German public salary scale TV-L E13 or TV-L E14, depending on the candidate's qualification, ranging from about 4,000 to 6,200 Euro monthly gross salary). The employment periods are between one and six years, following the German Wissenschaftszeitvertragsgesetz (WissZeitVG).

The Institute of Information Security offers a creative international environment for top-level international research in Germany's high-tech region.

You should have a Master's degree or a Ph.D. (or should be very close to completion thereof) in Computer Science, Mathematics, Cyber Security, or a related field. We value excellent analytical skills and

• solid knowledge of logic, proofs and/or formal verification techniques (Theorem Proving, Type Checking, etc.), and
• solid programming experience.
Knowledge in cryptography/security is not required, but a plus. Knowledge of German is not required.

The deadline for applications is December 12th, 2021.

Late applications will be considered until the positions are filled.

See https://www.sec.uni-stuttgart.de/institute/job-openings/ for the official job announcement and details of how to apply.

Closing date for applications:

Contact: Prof. Dr. Ralf Küsters, Institute of Information Security, University of Stuttgart, Germany

#### 29 November 2021

###### Sebastian Paul, Patrik Scheible, Friedrich Wiemer
ePrint Report
The threat of a cryptographically relevant quantum computer contributes to an increasing interest in the field of post-quantum cryptography (PQC). Compared to existing research efforts regarding the integration of PQC into the Transport Layer Security (TLS) protocol, industrial communication protocols have so far been neglected. Since industrial cyber-physical systems (CPS) are typically deployed for decades, protection against such long-term threats is needed.

In this work, we propose two novel solutions for the integration of post-quantum (PQ) primitives (digital signatures and key establishment) into the industrial protocol Open Platform Communications Unified Architecture (OPC UA): a hybrid solution combining conventional cryptography with PQC and a solution solely based on PQC. Both approaches provide mutual authentication between client and server and are realized with certificates fully compliant to the X.509 standard. We implement the two solutions and measure and evaluate their performance across three different security levels. All selected algorithms (Kyber, Dilithium, and Falcon) are candidates for standardization by the National Institute of Standards and Technology (NIST). We show that Falcon is a suitable option, especially when using the floating-point hardware provided by our ARM-based evaluation platform. Our proposed hybrid solution provides PQ security for early adopters but comes with additional performance and communication requirements. Our solution solely based on PQC shows superior performance across all evaluated security levels in terms of handshake duration compared to conventional OPC UA but comes at the cost of increased handshake sizes.

In addition to our performance evaluation, we provide a proof of security in the symbolic model for our two PQC-based variants of OPC UA. For this proof, we use the cryptographic protocol verifier ProVerif and formally verify confidentiality and authentication properties of our quantum-resistant variants.
###### Andrew Morgan, Rafael Pass
ePrint Report
We consider the feasibility of non-interactive secure two-party computation (NISC) in the plain model satisfying the notion of superpolynomial-time simulation (SPS). While stand-alone secure SPS-NISC protocols are known from standard assumptions (Badrinarayanan et al., Asiacrypt 2017), it has remained an open problem to construct a concurrently composable SPS-NISC. Prior to our work, the best protocols require 5 rounds (Garg et al., Eurocrypt 2017), or 3 simultaneous-message rounds (Badrinarayanan et al., TCC 2017).

In this work, we demonstrate the first concurrently composable SPS-NISC. Our construction assumes the existence of:

• a non-interactive (weakly) CCA-secure commitment, and
• a stand-alone secure SPS-NISC with subexponential security,

and satisfies the notion of "angel-based" UC security (i.e., UC with a superpolynomial-time helper) with perfect correctness.

We additionally demonstrate that both of the primitives we use (albeit only with polynomial security) are necessary for such concurrently composable SPS-NISC with perfect correctness. As such, our work identifies essentially necessary and sufficient primitives for concurrently composable SPS-NISC with perfect correctness in the plain model.
###### Orr Dunkelman, Nathan Keller, Eyal Ronen, Adi Shamir
ePrint Report
One of the most celebrated and useful cryptanalytic algorithms is Hellman's time/memory tradeoff (and its Rainbow Table variant), which can be used to invert random-looking functions on $N$ possible values with time and space complexities satisfying $TM^2=N^2$. In this paper we develop new upper bounds on their performance in the quantum setting. As a search problem, one can always apply the standard Grover's algorithm to it, but this algorithm does not benefit from the possible availability of a large memory in which one can store auxiliary advice obtained during a free preprocessing stage. In fact, at FOCS'20 it was rigorously shown that for memory size bounded by $M \leq O(\sqrt{N})$, even quantum advice cannot yield an attack better than Grover's algorithm.

Our main result complements this lower bound by showing that in the standard Quantum Accessible Classical Memory (QACM) model of computation, we can improve Hellman's tradeoff curve to $T^{4/3}M^2=N^2$. When we generalize the cryptanalytic problem to a time/memory/data tradeoff attack (in which one has to invert $f$ for at least one of $D$ given values), we get the generalized curve $T^{4/3}M^2D^2=N^2$. A typical point on this curve is $D=N^{0.2}$, $M=N^{0.6}$, and $T=N^{0.3}$, whose time is strictly lower than both Grover's algorithm (which requires $T=N^{0.4}$ in this generalized search variant) and the classical Hellman algorithm (which requires $T=N^{0.4}$ for these $D$ and $M$).
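The exponents of the example point quoted above can be checked directly against the three curves. Working with exponents base $N$ (a quick numeric sanity check, not part of the paper's analysis):

```python
# Example point from the abstract: D = N^0.2, M = N^0.6, T = N^0.3
d, m, t = 0.2, 0.6, 0.3

# New quantum curve T^{4/3} M^2 D^2 = N^2, i.e. (4/3)t + 2m + 2d = 2
new_curve = (4 / 3) * t + 2 * m + 2 * d

# Grover on the D-target search variant costs about sqrt(N/D), i.e. exponent (1 - d)/2
grover_t = (1 - d) / 2

# Classical Hellman time/memory/data curve T M^2 D^2 = N^2, i.e. t = 2 - 2m - 2d
hellman_t = 2 - 2 * m - 2 * d
```

Both comparison exponents come out to $0.4$, confirming that the new curve's $T=N^{0.3}$ is strictly below Grover's and Hellman's cost at this $(D, M)$ point, as the abstract states.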
###### Shiyao Chen, Yanhong Fan, Ling Sun, Yong Fu, Haibo Zhou, Yongqing Li, Meiqin Wang, Weijia Wang, Chun Guo
ePrint Report
We revisit designing AND-RX block ciphers, that is, designs assembled from the most fundamental binary operations---AND, rotation, and XOR---without relying on existing design units. Arguably, the most popular representative is the NSA cipher \texttt{SIMON}, which remains one of the most efficient designs but suffers from difficulty in security evaluation.

As our main contribution, we propose \texttt{SAND}, a new family of lightweight AND-RX block ciphers. To overcome the difficulty regarding security evaluation, \texttt{SAND} follows a novel design approach, the core idea of which is to restrain the AND-RX operations to be within nibbles. By this, \texttt{SAND} admits an equivalent representation based on a $4\times8$ \textit{synthetic S-box} ($SSb$). This enables the use of classical S-box-based security evaluation approaches. Consequently, for all versions of \texttt{SAND}, (a) we evaluated security bounds with respect to differential and linear attacks, and in both single-key and related-key scenarios; (b) we also evaluated security against impossible differential and zero-correlation linear attacks.

This better understanding of the security enables the use of a relatively simple key schedule, which makes the ASIC round-based hardware implementation of \texttt{SAND} one of the state-of-the-art Feistel lightweight ciphers. As to software performance, due to the natural bitslice structure, \texttt{SAND} reaches the same level of performance as \texttt{SIMON} and is among the most software-efficient block ciphers.
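As a concrete illustration of the AND-RX class, here is a Feistel round built solely from AND, rotation and XOR, using SIMON's publicly documented round function (SAND's own round differs by restricting the AND-RX operations to nibbles, which is what enables its synthetic S-box view):

```python
def rotl(x, r, n=32):
    # left rotation of an n-bit word
    return ((x << r) | (x >> (n - r))) & ((1 << n) - 1)

def f(x):
    # SIMON's AND-RX non-linear function: only AND, rotations and XOR
    return (rotl(x, 1) & rotl(x, 8)) ^ rotl(x, 2)

def round_enc(x, y, k):
    # one Feistel round on a (left, right) word pair with round key k
    return y ^ f(x) ^ k, x

def round_dec(x, y, k):
    # the Feistel structure is invertible without ever inverting f
    return y, x ^ f(y) ^ k
```

Invertibility comes for free from the Feistel structure, so `f` never needs an inverse; this is why non-invertible AND-based functions are usable at all in such designs.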
###### Kaiyi Zhang, Hongrui Cui, Yu Yu
ePrint Report
With the growing adoption of facial recognition worldwide as a popular authentication method, there is increasing concern about the invasion of personal privacy due to the lifetime irrevocability of facial features. In principle, {\it Fuzzy Extractors} enable biometric-based authentication while preserving the privacy of biometric templates. Nevertheless, to the best of our knowledge, most existing fuzzy extractors handle binary vectors under Hamming distance, and no explicit construction is known for facial recognition applications, where the $\ell_2$-distance of real vectors is considered. In this paper, we utilize the dense packing feature of certain lattices (e.g., $\rm E_8$ and Leech) to design a family of {\it lattice-based} fuzzy extractors that docks well with existing neural network-based biometric identification schemes. We instantiate and implement the generic construction and conduct experiments on publicly available datasets. Our result confirms the feasibility of facial template protection via fuzzy extractors.
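To make the quantization idea concrete, here is a toy fuzzy extractor over the integer lattice $\mathbb{Z}^n$ rather than $\rm E_8$ or Leech (which pack far more densely). The helper-data handling is deliberately simplified and leaks more than a real scheme would; all names are hypothetical:

```python
import hashlib

SCALE = 1.0  # quantization step of the toy Z^n lattice; corrects noise below SCALE/2 per coordinate

def gen(w):
    # w: real-valued template. Snap it to the nearest scaled lattice point q.
    q = [round(x / SCALE) for x in w]
    # Public helper data: the offset from the lattice point (toy; real schemes bound its leakage).
    helper = [x - SCALE * qi for x, qi in zip(w, q)]
    key = hashlib.sha256(repr(q).encode()).hexdigest()
    return key, helper

def rep(w2, helper):
    # A noisy reading w2 of the same template lands in the same lattice cell
    # once the public offset is removed, reproducing the identical key.
    q = [round((x - h) / SCALE) for x, h in zip(w2, helper)]
    return hashlib.sha256(repr(q).encode()).hexdigest()
```

Per-coordinate noise below SCALE/2 is corrected exactly; denser lattices such as $\rm E_8$ buy a larger correction radius for the same quantization granularity, which is why the paper's choice of lattice matters.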
###### Chitchanok Chuengsatiansup, Andrew Feutrill, Rui Qi Sim, Yuval Yarom
ePrint Report
The seminal work of Heninger and Shacham (Crypto 2009) demonstrated a method for reconstructing secret RSA keys from partial information about the key components. In this paper we further investigate this approach but apply it to a different context that arises in some side-channel attacks. We assume a fixed-window exponentiation algorithm that leaks the equivalence between digits, without leaking the value of the digits themselves.

We explain how to exploit the side-channel information with the Heninger-Shacham algorithm. To analyse the complexity of the approach, we model the attack as a Markov process and experimentally validate the accuracy of the model. Our model shows that the attack is feasible in the commonly used case where the window size is 5.
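The leakage model is easy to state concretely: for fixed-window exponentiation with window size $w$, the attacker learns only which exponent windows hold equal digits, never the digit values. A minimal sketch of that model (function names are hypothetical):

```python
def window_digits(exponent, w=5):
    # split the exponent into w-bit windows, most significant first
    digits = []
    e = exponent
    while e:
        digits.append(e & ((1 << w) - 1))
        e >>= w
    return digits[::-1]

def equivalence_pattern(digits):
    # the side channel reveals only which window digits collide, not their values:
    # relabel each digit by the order of its first appearance
    first_seen = {}
    pattern = []
    for d in digits:
        if d not in first_seen:
            first_seen[d] = len(first_seen)
        pattern.append(first_seen[d])
    return pattern
```

The attack in the paper feeds this collision pattern into a Heninger-Shacham-style branch-and-prune reconstruction, and the Markov-process analysis estimates how quickly wrong branches die off, notably for the common window size $w=5$.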
###### Marco Baldi, Alessandro Barenghi, Franco Chiaraluce, Gerardo Pelosi, Paolo Santini
ePrint Report
Quasi-Cyclic Moderate-Density Parity-Check (QC-MDPC) codes are receiving increasing attention for their advantages in the context of post-quantum asymmetric cryptography based on codes. However, a fundamentally open question concerns modeling the performance of their decoders in the region of a low decoding failure rate (DFR). We provide two approaches for bounding the performance of these decoders, and study their asymptotic behavior. We first consider the well-known Maximum Likelihood (ML) decoder, which achieves optimal performance and thus provides a lower bound on the performance of any sub-optimal decoder. We provide lower and upper bounds on the performance of ML decoding of QC-MDPC codes and show that the DFR of the ML decoder decays polynomially in the QC-MDPC code length when all other parameters are fixed. Secondly, we analyze some hard to decode error patterns for Bit-Flipping (BF) decoding algorithms, from which we derive some lower bounds on the DFR of BF decoders applied to QC-MDPC codes.
###### Raghvendra Rohit, Santanu Sarkar
ePrint Report
At ToSC 2021, Rohit \textit{et al.} presented the first distinguishing and key recovery attacks on 7-round Ascon without violating the designers' security claims of the nonce-respecting setting and the data limit of $2^{64}$ blocks per key. So far, these are the best attacks on 7-round Ascon. However, the distinguishers require an (impractical) $2^{60}$ data complexity, while the data complexity of the key recovery attacks exactly equals $2^{64}$. Whether there are any practical distinguishers and key recovery attacks (with data less than $2^{64}$) on 7-round Ascon is still an open problem.

In this work, we give positive answers to these questions by providing a comprehensive security analysis of Ascon in the weak key setting. Our first major result is the 7-round cube distinguishers with complexities $2^{46}$ and $2^{33}$ which work for $2^{82}$ and $2^{63}$ keys, respectively. Notably, we show that such weak keys exist for any choice (out of 64) of 46 and 33 specifically chosen nonce variables. In addition, we improve the data complexities of existing distinguishers for 5, 6 and 7 rounds by a factor of $2^{8}, 2^{16}$ and $2^{27}$, respectively. Our second contribution is a new theoretical framework for weak keys of Ascon which is solely based on the algebraic degree. Based on our construction, we identify $2^{127.99}$, $2^{127.97}$ and $2^{116.34}$ weak keys (out of $2^{128}$) for 5, 6 and 7 rounds, respectively. Next, we present two key recovery attacks on 7 rounds with different attack complexities. The best attack can recover the secret key with $2^{63}$ data, $2^{69}$ bits of memory and $2^{115.2}$ time. Our attacks are far from threatening the security of full 12 rounds Ascon, but we expect that they provide new insights into Ascon's security.