International Association for Cryptologic Research


CryptoDB

Paolo Santini

Publications

Year  Venue  Title
2024  PKC  Zero Knowledge Protocols and Signatures from the Restricted Syndrome Decoding Problem
The Restricted Syndrome Decoding Problem (R-SDP) corresponds to the Syndrome Decoding Problem (SDP) with the additional constraint that all entries of the solution error vector must lie in a fixed subset of the finite field. In this paper, we study how this problem can be applied to the construction of signatures derived from Zero-Knowledge (ZK) protocols. First, we show that R-SDP appears to be well suited for this type of application: ZK protocols relying on SDP can easily be modified to use R-SDP, resulting in significant reductions in communication cost. We then introduce and analyze a variant of R-SDP, which we call R-SDP(G), with the property that solution vectors can be represented with a number of bits that is slightly larger than the security parameter (which is an obvious lower bound). This enables the design of competitive ZK protocols. We show that existing ZK protocols can greatly benefit from the use of R-SDP, achieving signature sizes on the order of 7 kB, which are smaller than those of several other schemes submitted to NIST's additional call for signatures.
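
To make the restriction concrete, here is a minimal Python sketch of an R-SDP instance: every entry of the error vector is drawn from a fixed subset E of the nonzero field elements. The field size, code parameters, and choice of E are illustrative toy values, not parameters from the paper.

    # Toy R-SDP instance: Syndrome Decoding over F_p where every entry
    # of the error vector lies in a restricted subset E of F_p*.
    # All parameters (p, n, k, E) are illustrative, not from the paper.
    import random

    p = 127                      # small prime field F_p (toy choice)
    n, k = 20, 10                # code length and dimension
    g = 2                        # E = {g^0, ..., g^6}: an illustrative restricted set
    E = sorted({pow(g, i, p) for i in range(7)})

    random.seed(0)
    # Random parity-check matrix H of size (n-k) x n over F_p
    H = [[random.randrange(p) for _ in range(n)] for _ in range(n - k)]
    # Secret error vector with all entries drawn from E
    e = [random.choice(E) for _ in range(n)]
    # Public syndrome s = H e (mod p)
    s = [sum(h * ej for h, ej in zip(row, e)) % p for row in H]

    def is_rsdp_solution(H, s, x, E, p):
        """Check that x has all entries in E and satisfies H x = s (mod p)."""
        if any(xj not in E for xj in x):
            return False
        return all(sum(h * xj for h, xj in zip(row, x)) % p == si
                   for row, si in zip(H, s))

    assert is_rsdp_solution(H, s, e, set(E), p)
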
2024  CRYPTO  Not Just Regular Decoding: Asymptotics and Improvements of Regular Syndrome Decoding Attacks
Andre Esser, Paolo Santini
Cryptographic constructions often base security on structured problem variants to enhance efficiency or to enable advanced functionalities. This led to the introduction of the Regular Syndrome Decoding (RSD) problem, which guarantees that a solution to the Syndrome Decoding (SD) problem follows a particular block-wise structure. Despite recent attacks exploiting that structure by Briaud and Øygarden (Eurocrypt ’23) and Carozza, Couteau and Joux (CCJ, Eurocrypt ’23), many questions about the impact of the regular structure on the problem hardness remain open. In this work, we initiate a systematic study of the hardness of the RSD problem, starting from its asymptotics. We classify different parameter regimes, revealing large regimes in which RSD instances are solvable in polynomial time and, conversely, regimes that lead to particularly hard instances. Contrary to previous perceptions, we show that a classification based solely on the uniqueness of the solution is not sufficient for isolating the worst-case parameters. Further, we provide an in-depth comparison between SD and RSD in terms of reducibility and computational complexity, identifying regimes in which RSD instances are actually *harder* to solve. We provide the first asymptotic analyses of the algorithms presented by CCJ, establishing their worst-case decoding complexities as $2^{0.141n}$ and $2^{0.135n}$, respectively. We then introduce *regular-ISD* algorithms by showing how to tailor the whole machinery of advanced Information Set Decoding (ISD) techniques from attacking SD to the RSD setting. The fastest regular-ISD algorithm significantly improves the worst-case decoding complexity to $2^{0.113n}$. Finally, we show that regular-ISD also outperforms previous approaches for suggested parameters in most cases, reducing security levels by up to 30 bits.
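
For illustration, the following toy Python sketch samples and checks an error vector with the regular block-wise structure the abstract refers to: the length-n vector is split into w blocks and each block contains exactly one nonzero entry. The parameters are arbitrary small values, not from the paper.

    # Toy illustration of the regular structure in RSD: a length-n binary
    # error vector split into w blocks of size n/w, with exactly one 1
    # per block. Parameters are illustrative.
    import random

    n, w = 24, 6                 # toy choice; n must be divisible by w
    random.seed(1)

    def sample_regular_error(n, w):
        b = n // w               # block size
        e = [0] * n
        for i in range(w):
            e[i * b + random.randrange(b)] = 1   # one 1 in each block
        return e

    def is_regular(e, w):
        b = len(e) // w
        return all(sum(e[i * b:(i + 1) * b]) == 1 for i in range(w))

    e = sample_regular_error(n, w)
    assert is_regular(e, w) and sum(e) == w
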
2023  ASIACRYPT  A New Formulation of the Linear Equivalence Problem and Shorter LESS Signatures
Edoardo Persichetti, Paolo Santini
The Linear Equivalence Problem (LEP) asks to find a linear isometry between a given pair of linear codes; in the Hamming metric, such an isometry is known as a monomial map. LEP has been used in cryptography to design the family of LESS signatures, which also includes some advanced schemes, such as ring and identity-based signatures. All of these schemes are obtained by applying the Fiat-Shamir transformation to a Sigma protocol, in which the prover's responses contain a description of how the monomial map acts on all code coordinates; such a description constitutes the vast majority of the signature size. In this paper, we propose a new formulation of LEP, which we refer to as Information-Set (IS)-LEP. Exploiting IS-LEP, it is enough for the prover to describe the monomial action only on an information set, instead of on all coordinates. Thanks to this new formulation, we are able to drastically reduce signature sizes for all LESS signature schemes, without any relevant computational overhead. We prove that IS-LEP and LEP are completely equivalent (indeed, the same problem), which means that the improvement comes with no additional security assumption.
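
As a concrete illustration of the objects involved, here is a toy Python sketch of a monomial map acting on a generator matrix over a small field: a permutation of the columns combined with a nonzero scaling of each column. All parameters are illustrative, and this naive dense representation is not how LESS implements the operation.

    # Toy monomial map (Hamming-metric linear isometry): permute the n
    # columns of a k x n generator matrix G over F_q and scale each
    # column by a nonzero field element. Parameters are illustrative.
    import random

    q = 7                        # small prime field F_q (toy choice)
    k, n = 3, 8                  # code dimension and length

    random.seed(2)
    G = [[random.randrange(q) for _ in range(n)] for _ in range(k)]
    perm = list(range(n)); random.shuffle(perm)          # column permutation
    scales = [random.randrange(1, q) for _ in range(n)]  # nonzero scalings

    def apply_monomial(G, perm, scales, q):
        """Return G' whose column j is scales[j] * (column perm[j] of G) mod q."""
        return [[(scales[j] * G[i][perm[j]]) % q for j in range(len(perm))]
                for i in range(len(G))]

    G2 = apply_monomial(G, perm, scales, q)
    # LEP asks to recover (perm, scales) given only the codes spanned by
    # G and G2; IS-LEP lets the prover describe the action on just k
    # "information set" coordinates rather than on all n.
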
2020  CRYPTO  Cryptanalysis of LEDAcrypt
We report on the concrete cryptanalysis of LEDAcrypt, a 2nd Round candidate in NIST's Post-Quantum Cryptography standardization process and one of 17 encryption schemes that remain as candidates for near-term standardization. LEDAcrypt consists of a public-key encryption scheme built from the McEliece paradigm and a key-encapsulation mechanism (KEM) built from the Niederreiter paradigm, both using a quasi-cyclic low-density parity-check (QC-LDPC) code. In this work, we identify a large class of extremely weak keys and provide an algorithm to recover them. For example, we demonstrate how to recover $1$ in $2^{47.79}$ of LEDAcrypt's keys using only $2^{18.72}$ guesses at the 256-bit security level. This is a major, practical break of LEDAcrypt. Further, we show a continuum of progressively less weak keys (from extremely weak keys up to all keys) that can be recovered in substantially less work than previously known. This demonstrates that the imperfection is fundamental to LEDAcrypt's design.
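
As a rough sanity check of these figures, the following sketch combines the two exponents quoted above under a simple multi-target reading (our assumption, not the paper's exact cost accounting): if 1 in 2^47.79 keys is weak and a weak key falls to 2^18.72 guesses, then breaking some key among many targets costs roughly 2^(47.79 + 18.72) guesses in expectation, far below the 256-bit target.

    # Back-of-the-envelope reading of the weak-key figures above, under a
    # simple multi-target model (an assumption, not the paper's exact
    # cost accounting).
    log2_weak_fraction = 47.79    # Pr[a key is weak] = 2^-47.79
    log2_guesses_per_key = 18.72  # work to recover one weak key

    # Expect ~2^47.79 candidate keys until a weak one appears, at
    # 2^18.72 guesses each.
    total_log2_work = log2_weak_fraction + log2_guesses_per_key
    print(f"~2^{total_log2_work:.2f} guesses to break one key among many targets")
    print(f"gap below the 256-bit security target: {256 - total_log2_work:.1f} bits")
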