International Association for Cryptologic Research

IACR News
Here you can see all recent updates to the IACR webpage. These updates are also available via email and via RSS feed.

08 June 2020

Sumanta Sarkar, Yu Sasaki, Siang Meng Sim
ePrint Report
Bit permutation based block ciphers, like PRESENT and GIFT, are well known for their extreme lightweightness in hardware implementation. However, designing such ciphers comes with one major challenge: ensuring strong cryptographic properties using only the combination of three components, namely an S-box, a bit permutation and a key addition function. A wrong combination of components could lead to weaknesses. In this article, we studied the interaction between these components, improved the theoretical security bound of GIFT and highlighted the potential pitfalls associated with a bit permutation based primitive design. We also conducted an analysis of TRIFLE, a first-round candidate for the NIST lightweight cryptography competition, where our findings influenced the elimination of TRIFLE from the second round of the NIST competition. In particular, we showed that internal state bits of TRIFLE can be partially decrypted for a few rounds even without any knowledge of the key.
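As an illustration of the three-component structure described above, here is a minimal Python sketch of one SPN round built from a 4-bit S-box, a bit permutation, and a round-key XOR. The constants follow the published PRESENT specification (S-box and pLayer), but this toy code is illustrative only, not a vetted implementation of PRESENT or GIFT.

```python
# Toy SPN round in the style of PRESENT/GIFT: S-box layer, bit permutation,
# round-key addition. Constants follow the published PRESENT spec; treat
# this as an illustrative sketch, not production crypto code.

SBOX = [0xC, 0x5, 0x6, 0xB, 0x9, 0x0, 0xA, 0xD,
        0x3, 0xE, 0xF, 0x8, 0x4, 0x7, 0x1, 0x2]  # PRESENT 4-bit S-box

# PRESENT pLayer: bit i of the 64-bit state moves to position 16*i mod 63
# (bit 63 is a fixed point).
P = [(16 * i) % 63 if i != 63 else 63 for i in range(64)]

def sbox_layer(state: int) -> int:
    out = 0
    for nib in range(16):
        out |= SBOX[(state >> (4 * nib)) & 0xF] << (4 * nib)
    return out

def perm_layer(state: int) -> int:
    out = 0
    for i in range(64):
        out |= ((state >> i) & 1) << P[i]
    return out

def round_fn(state: int, round_key: int) -> int:
    # one round: substitution, then bit permutation, then key addition
    return perm_layer(sbox_layer(state)) ^ round_key
```

The security of such a design hinges precisely on how the S-box outputs and the permuted bit positions interact across rounds, which is the interaction this article studies.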
Shashank Agrawal, Saikrishna Badrinarayanan, Payman Mohassel, Pratyay Mukherjee, Sikhar Patranabis
ePrint Report
In the past decades, user authentication has been dominated by server-side password-based solutions that rely on "what users know". This approach is susceptible to breaches and phishing attacks, and poses usability challenges. As a result, the industry is gradually moving to biometric-based client-side solutions that do not store any secret information on servers. This shift necessitates the safe storage of biometric templates and private keys, which are used to generate tokens, on user devices.

We propose a new generic framework called Biometric Enabled Threshold Authentication (BETA) to protect sensitive client-side information like biometric templates and cryptographic keys. Towards this, we formally introduce the notion of Fuzzy Threshold Tokenizer (FTS) where an initiator can use a "close" biometric measurement to generate an authentication token if at least $t$ (the threshold) devices participate. We require that the devices only talk to the initiator, and not to each other, to capture the way user devices are connected in the real world. We use the universal composability (UC) framework to model the security properties of FTS, including the unforgeability of tokens and the privacy of the biometric values (template and measurement), under a malicious adversary. We construct three protocols that meet our definition.

Our first two protocols are general feasibility results that work for any distance function, any threshold $t$ and tolerate the maximal (i.e. $t-1$) amount of corruption. They are based on any two round UC-secure multi-party computation protocol in the standard model (with a CRS) and threshold fully homomorphic encryption, respectively. We show how to effectively use these primitives to build protocols in a constrained communication model with just four rounds of communication.

For the third protocol, we consider inner-product based distance metrics (cosine similarity, Euclidean distance, etc.) specifically, motivated by the recent interest in its use for face recognition. We use Paillier encryption, efficient NIZKs for specific languages, and a simple garbled circuit to build an efficient protocol for the common case of $n=3$ devices with one compromised.
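To make the inner-product distance setting concrete, here is a hedged Python sketch of cosine-similarity matching between a stored biometric template and a fresh measurement. The function names and the threshold `tau` are illustrative assumptions, not part of the paper's protocol, which computes this kind of comparison under encryption.

```python
import math

def cosine_similarity(u, v):
    # inner product of u and v divided by the product of their norms
    dot = sum(a * b for a, b in zip(u, v))
    norm = math.sqrt(sum(a * a for a in u)) * math.sqrt(sum(b * b for b in v))
    return dot / norm

def accepts(template, measurement, tau=0.9):
    # a "close" measurement is one whose similarity clears the threshold;
    # tau is an illustrative value, tuned per biometric modality in practice
    return cosine_similarity(template, measurement) >= tau
```

In the FTS protocols this comparison is never done in the clear: the devices jointly evaluate it on secret-shared or encrypted values so that neither the template nor the measurement is revealed.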
Lübeck, Germany, 18 November - 20 November 2020
Event Calendar
Event date: 18 November to 20 November 2020
Submission deadline: 3 July 2020
Notification: 4 September 2020
Technology Innovation Institute - Abu Dhabi, UAE
Job Posting

Technology Innovation Institute (TII) is a publicly funded research institute, based in Abu Dhabi, United Arab Emirates. It is home to a diverse community of leading scientists, engineers, mathematicians, and researchers from across the globe, transforming problems and roadblocks into pioneering research and technology prototypes that help move society ahead. At TII, we help society to overcome its biggest hurdles through a rigorous approach to scientific discovery and inquiry, using state-of-the-art facilities and collaboration with leading international institutions.

As a Symmetric Cryptography Researcher, you will:

  • Design and implement symmetric cryptographic algorithms in software.
  • Conduct research in the field of symmetric cryptography and lightweight cryptography.
  • Perform security assessments of cryptographic primitives at theoretical and implementation level.
  • Work closely with the other R&D teams to build secure systems using state-of-the-art cryptographic algorithms and protocols.

    Minimum qualifications:

  • PhD degree in one of the following: Cryptography, Applied Cryptography, Information theory, Mathematics, Computer Science or any relevant Engineering degree.
  • Strong publication record in the field of symmetric cryptography.
  • Knowledge of relevant theoretical and practical cryptanalysis techniques.
  • Extensive experience developing in C, C++, x86 or ARM assembly, Rust or Go.
  • Understanding of security attacks, including: timing, cache-based and microarchitectural attacks, and the corresponding countermeasures.

    Closing date for applications:

    Contact: Mehdi Messaoudi - Talent Acquisition Partner

    Technology Innovation Institute - Abu Dhabi, UAE
    Job Posting
    As a Lead Cryptographic Protocols Researcher, you will:

  • Lead the scientific watch specialized in cryptographic protocols
  • Guide the team members in their research and development work
  • Analyze project requirements and provide technical and functional recommendations
  • Design and implement classical, quantum-resistant and hybrid cryptographic protocols and algorithms
  • Conduct research and development on state-of-the-art modern protocols
  • Perform security assessments of cryptographic protocols at the theoretical and implementation level

    Minimum Qualifications:

  • MSc or PhD degree in Cryptography, Applied Cryptography, Information Theory, Mathematics or Computer Science
  • 6+ years of work experience. Previous experience heading teams is a plus
  • Knowledge of widely-deployed cryptographic protocols and primitives
  • Experience in C desired; C++, Rust or Go relevant as well
  • Solid engineering practices and processes, such as development and testing methodology and documentation
  • Prior contributions to crypto protocols and open source software collaboration preferred
  • Quick learner, geared towards implementation
  • Eager to develop new skills and willing to take ownership of projects

    Closing date for applications:

    Contact: Mehdi Messaoudi - Talent Acquisition Partner

    More information: https://www.linkedin.com/company/tiiuae/about/

    ISAE SUPAERO, Toulouse, France
    Job Posting

    In late 2016, NIST issued a call for proposals for the standardization of cryptographic systems resistant to a quantum computer for encryption, key exchange and signature primitives (hereafter NIST PQC). The standardization process is currently ending its second phase, with only 26 candidates remaining.

    Each submission comes with a software implementation, targeting standard security levels for widespread applications, such as e-commerce.

    Topic 1: General improvement of post-quantum primitives

    During this thesis, the successful candidate will study the NIST PQC submissions still running for standardization and will propose modifications that improve the submissions in general (e.g. tighter reductions, improved theoretical error rate analysis, etc.), or that provide specific advantages in constrained settings (e.g. soft/hard implementation simplicity, reduced bandwidth, reduced latency, etc.).

    Topic 2: Hardware implementations of cryptographic schemes based on error-correcting codes and/or lattices

    The software performance of NIST PQC submissions has been thoroughly studied. On the other hand, the hardware performance (e.g. energy cost or gate cost on broadly available FPGAs) of many submissions is still not very well understood. During this thesis, the successful candidate will study code-based and/or lattice-based NIST PQC submissions, and propose hardware implementations of both the original submissions and variations designed by the candidate to improve hardware performance.

    Closing date for applications:

    Contact: Carlos Aguilar Melchor

    Technology Innovation Institute - Abu Dhabi, UAE
    Job Posting

    As a Post-Quantum Crypto Researcher, you will:

  • Design and implement quantum-safe PKE/KEM and digital signature schemes.
  • Conduct research and development in one of the following: lattice-based cryptography, code-based cryptography, hash-based cryptography, isogeny-based cryptography
  • Perform security assessments of crypto-primitives and cryptosystems at the theoretical and implementation level

    Minimum qualifications:

  • MSc or PhD degree in Cryptography, Applied Cryptography, Information Theory, Mathematics or Computer Science
  • Extensive experience developing in C, C++, Rust or Go
  • Deep understanding of cryptographic algorithms and protocols
  • Understanding of security attacks, including: side-channel analysis, fault injection, microarchitectural attacks, and the corresponding countermeasures
  • Familiarity with hardware languages is a plus

    Closing date for applications:

    Contact: Mehdi Messaoudi - Talent Acquisition Partner

    More information: https://www.linkedin.com/company/tiiuae/about/


    07 June 2020

    Alexander Munch-Hansen, Claudio Orlandi, Sophia Yakoubov
    ePrint Report
    A ring signature (introduced by Rivest et al., Asiacrypt 2001) allows a signer to sign a message without revealing their identity by anonymizing themselves within a group of users (chosen by the signer in an ad-hoc fashion at signing time). The signature proves that one member of the group is the signer, but does not reveal which one. We consider threshold ring signatures (introduced by Bresson et al., Crypto 2002), where any $t$ signers can sign a message together while anonymizing themselves within a larger (size-$n$) group. The signature proves that $t$ members of the group signed, without revealing anything else about their identities.

    Our contributions in this paper are two-fold. First, we strengthen existing definitions of threshold ring signatures in a natural way; we demand that a signer cannot be de-anonymized even by their fellow signers. This is crucial, since in applications where a signer's anonymity is important, we do not want that anonymity to be compromised by a single insider.

    Second, we give the first rigorous construction of a threshold ring signature with size independent of $n$, the number of users in the larger group. Instead, our signatures have size linear in $t$, the number of signers. This is also a very important contribution; signers should not have to choose between achieving their desired degree of anonymity (possibly very large $n$) and their need for communication efficiency.
    T-H. Hubert Chan, Naomi Ephraim, Antonio Marcedone, Andrew Morgan, Rafael Pass, Elaine Shi
    ePrint Report
    Nakamoto's famous blockchain protocol enables achieving consensus in a so-called permissionless setting--anyone can join (or leave) the protocol execution, and the protocol instructions do not depend on the identities of the players. His ingenious protocol prevents ``sybil attacks'' (where an adversary spawns any number of new players) by relying on computational puzzles (a.k.a. ``moderately hard functions'') introduced by Dwork and Naor (Crypto'92). Recent works by Garay et al. (EuroCrypt'15) and Pass et al. (EuroCrypt'17) demonstrate that this protocol provably achieves consistency and liveness assuming a) honest players control a majority of the computational power in the network, b) the puzzle-difficulty is appropriately set as a function of the maximum network message delay and the total computational power of the network, and c) the computational puzzle is modeled as a random oracle.

    These works, however, leave open the question of how to set the puzzle difficulty in a setting where the computational power in the network is changing. Nakamoto's protocol indeed also includes a description of a difficulty update procedure. A recent work by Garay et al. (Crypto'17) shows that a variant of this difficulty adjustment procedure can be used to get a sound protocol as long as the computational power does not change too fast --- however, under two restrictions: 1) their analysis assumes that the attacker cannot delay network messages, and 2) the changes in computational power in the network are statically set (i.e., cannot be adaptively selected by the adversary). In this work, we show the same result but without these two restrictions, demonstrating the soundness of a (slightly different) difficulty update procedure, assuming only that the computational power in the network does not change too fast (as a function of the maximum network message delays); as an additional contribution, our analysis yields a tight bound on the ``chain quality'' of the protocol.
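For concreteness, a Bitcoin-style difficulty retarget is one instance of the kind of update rule whose analysis is at stake here. The sketch below is illustrative: the clamping factor and function shape follow the rule deployed in Bitcoin, not the (slightly different) procedure analyzed in the paper.

```python
def retarget(old_target: int, actual_span: int, expected_span: int,
             max_factor: int = 4) -> int:
    # A larger target means an easier puzzle. If blocks arrived faster than
    # expected (actual_span < expected_span), shrink the target (raise the
    # difficulty), and vice versa. The adjustment is clamped to a factor of
    # max_factor per epoch so difficulty cannot change too fast.
    span = max(expected_span // max_factor,
               min(actual_span, expected_span * max_factor))
    return old_target * span // expected_span
```

The clamp is exactly the kind of "computational power does not change too fast" guard the soundness argument relies on.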
    Riad S. Wahby, Dan Boneh, Christopher Jeffrey, Joseph Poon
    ePrint Report
    A common approach to bootstrapping a new cryptocurrency is an airdrop, an arrangement in which existing users give away currency to entice new users to join. But current airdrops offer no recipient privacy: they leak which recipients have claimed the funds, and this information is easily linked to off-chain identities.

    In this work, we address this issue by defining a private airdrop and describing concrete schemes for widely-used user credentials, such as those based on ECDSA and RSA. Our private airdrop for RSA builds upon a new zero-knowledge argument of knowledge of the factorization of a committed secret integer, which may be of independent interest. We also design a private genesis airdrop that efficiently sends private airdrops to millions of users at once. Finally, we implement and evaluate. Our fastest implementation takes 40--180 ms to generate and 3.7--10 ms to verify an RSA private airdrop signature. Signatures are 1.8--3.3 kiB depending on the security parameter.

    05 June 2020

    Matthias Fitzi, Peter Gazi, Aggelos Kiayias, Alexander Russell
    ePrint Report
    Blockchain protocols based on variations of the longest-chain rule--whether following the proof-of-work paradigm or one of its alternatives--suffer from a fundamental latency barrier. This arises from the need to collect a sufficient number of blocks on top of a transaction-bearing block to guarantee the transaction's stability while limiting the rate at which blocks can be created in order to prevent security-threatening forks.

    Our main result is a black-box security-amplifying combiner based on parallel composition of $m$ blockchains that achieves $\Theta(m)$-fold security amplification or, equivalently, $\Theta(m)$-fold reduction in latency. Our construction breaks the latency barrier to achieve, for the first time, a worst-case constant-time-settlement ledger based purely on Nakamoto longest-chain consensus: Transaction settlement can be accelerated to a constant multiple of block propagation time with negligible error.

    Operationally, our construction shows how to view any family of blockchains as a unified, virtual ledger without requiring any coordination among the chains or any new protocol metadata. Users of the system have the option to inject a transaction into a single constituent blockchain or--if they desire accelerated settlement--all of the constituent blockchains. Our presentation and proofs introduce a new formalism for reasoning about blockchains, the dynamic ledger, and articulate our constructions as transformations of dynamic ledgers that amplify security. We additionally illustrate the versatility of this formalism by presenting a class of robust-combiner constructions for blockchains that can protect against complete adversarial control of a minority of a family of blockchains.
    Chiara Spadafora, Riccardo Longo, Massimiliano Sala
    ePrint Report
    Coercion resistance is one of the most important features of a secure voting procedure. Because of properties such as transparency, decentralization, and non-repudiation, blockchain is a fundamental technology of great interest in its own right, with large potential when integrated into many other areas. Here we propose a decentralized e-voting protocol that is coercion-resistant and vote-selling resistant, while also being completely transparent and not receipt-free. We prove the security of the protocol under the standard DDH assumption.
    Wenbo MAO, Wenxiang WANG
    ePrint Report
    We present LotMint, a permissionless blockchain with a purposely low bar for Proof-of-Work (PoW) difficulty. Our objective is for personal computers, cloud virtual machines or containers, even mobile devices, and hopefully future IoT devices, to become the main, widely distributed, collectively more secure, fairer, more reliable and economically sustainable mining workforce for blockchains. An immediate question arises: how to prevent the permissionless network from being flooded with block dissemination traffic by a massive number of profit-driven miners? We propose a novel notion of {\em Decentralized Clock/Time} (DC/DT) as a global, logical clock/time which can be agreed upon by consensus. Our construction of DC/DT uses the distributed private clocks of the participating nodes. With DC/DT, a node, upon creating or hearing a block, can know how luckily short or unluckily long a time the block has taken to be mined and/or to travel through the network. Nodes can ``time throttle'' a potentially large number of unluckily mined/travelled blocks. The luckier blocks passing through the time throttle are treated as time-tie forks, with a volume (the ``throttle diameter'') adjustably controlled so as not to congest the network. With the number of time-tie forks kept manageable, it is then easy to tie-break a winner, or even to queue an ordered list of winners for ``multi-core'' utilization of resources. We provide succinct analyses of the necessary properties of the LotMint blockchain, including: decentralization, energy saving, safety, liveness, robustness, fairness, anti-denial-of-service, anti-sybil, anti-censorship, scaled-up transaction processing throughput and sped-up payment confirmation time.
    Leonie Reichert, Samuel Brack, Björn Scheuermann
    ePrint Report
    To combat the ongoing Covid-19 pandemic, many new ways have been proposed to automate the process of finding people who have been in contact with infected persons, also called contact tracing. A special focus was put on preserving the privacy of users. In this survey we define multiple classes of automated contact tracing techniques into which most of the approaches fall. We identify two major groups: systems that rely on a server for finding new infections and systems that distribute this process. They are systematically classified with regard to security and privacy criteria.
    Sebastien Carre, Sylvain Guilley, Olivier Rioul
    ePrint Report
    Persistent fault analysis (PFA) consists in guessing block cipher secret keys by biasing their substitution box. This paper improves the original attack of Zhang et al. on AES-128 presented at CHES 2018. By a thorough analysis, the exact probability distribution of the ciphertext (under a uniformly distributed plaintext) is derived, and the maximum likelihood key recovery estimator is computed exactly. Its expression is turned into an attack algorithm, which is shown to be twice as efficient, in terms of the number of required encryptions, as the original attack of Zhang et al. This algorithm is also optimized from a computational complexity standpoint. In addition, our optimal attack is naturally amenable to key enumeration, which expedites full 16-byte key extraction. Various tradeoffs between data and computational complexities are investigated.
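The core observation behind persistent fault analysis can be sketched in a few lines: a fault that removes one value from the S-box's image makes one ciphertext value impossible, and the missing value leaks the last-round key byte. The single-byte model, identity S-box, and names below are illustrative assumptions; this is not the paper's exact maximum-likelihood attack.

```python
def faulty_sbox(missing=0x00):
    # Toy 8-bit S-box (identity) with a persistent fault: the entry that
    # produced `missing` is redirected, so `missing` never appears in the image.
    sbox = list(range(256))
    sbox[sbox.index(missing)] = (missing + 1) % 256
    return sbox

def last_round(x, key_byte, sbox):
    # toy last-round model: ciphertext byte = S'[state] xor key byte
    return sbox[x] ^ key_byte

def recover_key_byte(ciphertexts, missing=0x00):
    # The one ciphertext value that never occurs equals missing ^ key,
    # so (with enough samples) the absent value reveals the key byte.
    seen = set(ciphertexts)
    absent = [c for c in range(256) if c not in seen]
    return absent[0] ^ missing if len(absent) == 1 else None
```

The paper's contribution is the exact distribution of these biased ciphertexts and the optimal (maximum likelihood) way to exploit it with fewer encryptions.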
    Crypto Group at IST Austria
    ePrint Report
    Automated contact tracing aims at helping with the current COVID-19 pandemic by alerting users of encounters with infected people. There are currently many proposals for protocols (like the ``decentralized'' DP-3T and PACT or the ``centralized'' ROBERT and DESIRE) to be run on mobile phones, where the basic idea is to regularly broadcast (using low energy Bluetooth) some values, and at the same time store (a function of) incoming messages broadcast by users in their proximity. Should a user get diagnosed, they can upload some data to a backend server, which is then used to alert users that were in proximity with the diagnosed user.

    There are many important aspects one wants those protocols to achieve, in particular simplicity/efficiency, privacy and robustness, the latter including some security against false positives, that is, users getting alerts despite not having been in proximity with a diagnosed user.

    In the existing proposals one can trigger false positives on a massive scale by an ``inverse-Sybil" attack, where a large number of devices (malicious users or hacked phones) pretend to be the same user, such that later, just a single person needs to be diagnosed (and allowed to upload) to trigger an alert for all users that were in proximity to any of this large group of devices.

    We propose the first protocols that do not succumb to such attacks assuming the devices involved in the attack do not constantly communicate, which we observe is a necessary assumption. Our first protocol requires devices to non-interactively exchange values (like e.g. in DESIRE), while the second requires that the devices have access to some location dependent coordinate (like coarse GPS coordinates or cell tower IDs).

    The high-level idea of the protocols is to derive the values to be broadcast via a hash chain, so that two (or more) devices who want to launch an inverse-Sybil attack will not be able to connect their respective chains, and thus only one of them will be able to upload. Apart from achieving strong privacy and good efficiency, a main challenge is to force the chains on different devices to diverge, which we do by infusing unpredictable data (randomness from encounters in the first protocol, location data in the second). Our protocols also achieve security against replay attacks, and the second even against relay attacks.
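The hash-chain idea above can be sketched as follows, assuming SHA-256 as the chain function (the paper's protocols fix their own primitives and encodings; the names and domain-separation tag here are illustrative): each broadcast value is derived from a chain state that absorbs unpredictable local data, so chains on different devices diverge.

```python
import hashlib

def next_state(state: bytes, infusion: bytes) -> bytes:
    # Absorb unpredictable local data (encounter randomness or coarse
    # location) into the chain, so colluding devices that do not constantly
    # communicate cannot keep identical chains.
    return hashlib.sha256(state + infusion).digest()

def broadcast_value(state: bytes) -> bytes:
    # Domain-separated derivation of the value sent over Bluetooth.
    return hashlib.sha256(b"broadcast|" + state).digest()[:16]
```

Two devices starting from the same seed but absorbing different encounter data immediately end up with diverging, unlinkable chains, which is exactly what blocks the inverse-Sybil strategy.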
    Avijit Dutta, Mridul Nandi, Abishanka Saha
    ePrint Report
    In \textsf{ICISC-05}, \textsf{ePrint-10} and \textsf{patarin-book}, Patarin proved that the number of solutions of $(P_1, \ldots, P_{2q})$ with distinct $P_1, P_2, \ldots$, $P_{2q}$ from $\{0,1\}^n$ satisfying $P_{2i - 1} \oplus P_{2i} = \lambda_i$ ($\neq 0$), $1 \leq i \leq q$ is at least \begin{center} $\frac{(2^n)_{2q}}{2^{nq}}$ for all $q \leq \frac{2^n}{134}$ \end{center} where $(2^n)_{2q} := 2^n(2^n-1) \cdots (2^n - 2q + 1)$. This result is known as \textit{Mirror Theory}. Mirror theory stands out as a powerful tool for providing a high security guarantee for many block cipher (or even ideal permutation) based designs. Unfortunately, the proof of the mirror theory contains some unverifiable gaps and several mistakes. In this paper, we revisit the proof strategy of \textsf{ePrint-10} and \textit{provide a detailed proof of the mirror theory by correcting the mistakes and filling up gaps}. In particular, we prove the mirror theory for all $q \leq 2^n/33.1$ (a wider range than what was originally claimed by Patarin). As an application, we show that the maximum PRF-advantage of the sum of domain separated random permutations is \textbf{exactly} $1- (1 - 2^{-n})^q$, $\forall q \leq 2^n/33.1$. Using a similar proof strategy, we also prove the following mirror theory for the sum of two independent permutations: the number of solutions of $(P_1, Q_1, \ldots, P_{q}, Q_q)$ with distinct $P_1, P_2, \ldots$, $P_{q}$ and distinct $Q_1, \ldots, Q_q$ satisfying $P_{i} \oplus Q_i = \lambda_i$ for any fixed $\lambda_i \in \{0,1\}^n$, $1 \leq i \leq q$ is at least \begin{center} $\frac{(2^n)_{q} \times (2^n)_q}{2^{nq}} \times (1 - \frac{1.2q^2}{2^{2n}}-\frac{108n^3}{2^{2n}})$,\ \ for all $q \leq \frac{2^n}{13}$. \end{center}
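The mirror-theory lower bound $(2^n)_{2q}/2^{nq}$ can be checked by brute force for toy parameters. The sketch below exhaustively counts tuples of distinct values with the prescribed XOR pattern and compares against the bound; exhaustive search limits it to tiny $n$ and $q$, so this is a sanity check, not part of the proof.

```python
from itertools import permutations

def falling(N, k):
    # falling factorial N * (N-1) * ... * (N-k+1)
    out = 1
    for i in range(k):
        out *= N - i
    return out

def count_solutions(n, lambdas):
    # number of tuples (P_1, ..., P_{2q}) of pairwise-distinct n-bit values
    # with P_{2i-1} xor P_{2i} = lambdas[i] (each lambda nonzero)
    N, q = 2 ** n, len(lambdas)
    total = 0
    for tup in permutations(range(N), 2 * q):
        if all(tup[2 * i] ^ tup[2 * i + 1] == lambdas[i] for i in range(q)):
            total += 1
    return total

def lower_bound(n, q):
    # the mirror-theory bound (2^n)_{2q} / 2^{nq}
    N = 2 ** n
    return falling(N, 2 * q) / N ** q
```

For example, for $n = 3$ and $\lambda = (1, 2)$ the exact count is 32 while the bound evaluates to 26.25, consistent with the theorem.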
    Behzad Abdolmaleki, Helger Lipmaa, Janno Siim, Michał Zając
    ePrint Report
    While NIZK arguments in the CRS model are widely studied, the question of what happens when the CRS is subverted has received little attention. In ASIACRYPT 2016, Bellare, Fuchsbauer, and Scafuro showed the first negative and positive results in the case of NIZK, proving also that it is impossible to achieve subversion soundness and (even non-subversion) zero-knowledge at the same time. On the positive side, they constructed an involved sound and subversion-zero-knowledge (Sub-ZK) non-succinct NIZK argument for NP. We consider the practically very relevant case of zk-SNARKs. We make Groth's zk-SNARK for \textsc{Circuit-SAT} from EUROCRYPT 2016 computationally knowledge-sound and perfectly composable Sub-ZK with minimal changes. We only require the CRS trapdoor to be extractable and the CRS to be publicly verifiable. To achieve the latter, we add some new elements to the CRS and construct an efficient CRS verification algorithm. We also provide a definitional framework for knowledge-sound and Sub-ZK SNARKs.
    Sahiba Suryawanshi, Dhiman Saha, Satyam Sachan
    ePrint Report
    In ToSC 2017, Saha et al. demonstrated an interesting property of SHA3 based on higher-order vectorial derivatives which led to self-symmetry based distinguishers, referred to as SymSum, and bettered the complexity w.r.t. the well-studied ZeroSum distinguisher by a factor of 4. This work attempts to take a fresh look at this distinguisher in the light of the linearization technique developed by Guo et al. in Asiacrypt 2016. It is observed that the advantage of SymSum over ZeroSum drops from 4 to 2 for any number of rounds linearized. This is supported by theoretical proofs. SymSum augmented with linearization can penetrate up to two more rounds than the classical version. In addition, one more round is extended by an inversion technique on the final hash values. The combined approach leads to distinguishers on up to 9 rounds of SHA3 variants with a complexity of only $2^{64}$, which is better than the equivalent ZeroSum distinguisher by a factor of 2. To the best of our knowledge, this is the best distinguisher available on this many rounds of SHA3.
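The ZeroSum property that both distinguishers build on can be demonstrated on a toy Boolean function: a function of algebraic degree $d$ XOR-sums to zero over any affine subspace of dimension greater than $d$. The function and basis below are illustrative and unrelated to Keccak's actual round function.

```python
def f(x):
    # toy degree-2 Boolean function on 4 bits: x0*x1 ^ x2 ^ x3
    b = [(x >> i) & 1 for i in range(4)]
    return (b[0] & b[1]) ^ b[2] ^ b[3]

def subspace_sum(basis, offset=0):
    # XOR-sum of f over the affine subspace offset + span(basis)
    total = 0
    for mask in range(1 << len(basis)):
        x = offset
        for i, v in enumerate(basis):
            if (mask >> i) & 1:
                x ^= v
        total ^= f(x)
    return total
```

Summing over a 3-dimensional subspace (dimension > degree 2) always gives 0, whereas a 2-dimensional subspace need not; SymSum refines this by additionally exploiting self-symmetry in the derivatives.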
    Chao Sun, Mehdi Tibouchi, Masayuki Abe
    ePrint Report
    Binary error LWE is the particular case of the learning with errors (LWE) problem in which errors are chosen in $\{0,1\}$. It has various cryptographic applications, and in particular, has been used to construct efficient encryption schemes for use in constrained devices. Arora and Ge showed that the problem can be solved in polynomial time given a number of samples quadratic in the dimension $n$. On the other hand, the problem is known to be as hard as standard LWE given only slightly more than $n$ samples.

    In this paper, we first examine more generally how the hardness of the problem varies with the number of available samples. Under standard heuristics on the Arora--Ge polynomial system, we show that, for any $\epsilon >0$, binary error LWE can be solved in polynomial time $n^{O(1/\epsilon)}$ given $\epsilon\cdot n^{2}$ samples. Similarly, it can be solved in subexponential time $2^{\tilde O(n^{1-\alpha})}$ given $n^{1+\alpha}$ samples, for $0<\alpha<1$.

    As a second contribution, we also generalize binary error LWE to the case of a non-uniform error probability, and analyze the hardness of non-uniform binary error LWE with respect to the error rate and the number of available samples. We show that, for any error rate $0 < p < 1$, non-uniform binary error LWE is also as hard as worst-case lattice problems provided that the number of samples is suitably restricted. This is a generalization of Micciancio and Peikert's hardness proof for uniform binary error LWE. Furthermore, we also discuss attacks on the problem when the number of available samples is linear but significantly larger than $n$, and show that for sufficiently low error rates, subexponential or even polynomial time attacks are possible.
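To fix notation, here is a hedged Python sketch of drawing (non-uniform) binary-error LWE samples $(a, \langle a, s\rangle + e \bmod q)$ with $\Pr[e = 1] = p$. The parameter sizes and function name are toy illustrations, not the regimes the hardness statements require.

```python
import random

def binary_lwe_samples(s, q, m, p=0.5, rng=None):
    # m samples (a, <a,s> + e mod q) with binary error e, Pr[e = 1] = p;
    # p = 0.5 is the uniform case, other p gives the non-uniform variant
    rng = rng or random.Random()
    samples = []
    for _ in range(m):
        a = [rng.randrange(q) for _ in range(len(s))]
        e = 1 if rng.random() < p else 0
        b = (sum(ai * si for ai, si in zip(a, s)) + e) % q
        samples.append((a, b))
    return samples
```

The Arora--Ge attack mentioned above exploits that each such sample yields the polynomial constraint $(b - \langle a, s\rangle)(b - \langle a, s\rangle - 1) \equiv 0 \pmod q$, which is why roughly quadratically many samples suffice to linearize and solve.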