International Association for Cryptologic Research

CryptoDB

Takeshi Koshiba

Affiliation: Saitama University

Publications

Year / Venue / Title

2008 / EPRINT / Reducing Complexity Assumptions for Oblivious Transfer
K.Y. Cheong and Takeshi Koshiba
Reducing the minimum assumptions needed to construct various cryptographic primitives is an important and interesting task in theoretical cryptography. Oblivious Transfer, one of the most basic cryptographic building blocks, is also studied under this scenario. Reducing the minimum assumptions for Oblivious Transfer does not seem to be an easy task, as there are several impossibility results under black-box reductions. Until recently, it was widely believed that Oblivious Transfer can be constructed from trapdoor permutations but not from trapdoor functions in general. In this paper, we enhance previous results and present an Oblivious Transfer protocol based on a collection of trapdoor functions with some extra properties. We also give reasons for adding the extra properties and argue that the assumptions of the protocol are nearly minimal.

2007 / EPRINT / Direct Reduction of String (1,2)-OT to Rabin's OT
Kaoru Kurosawa and Takeshi Koshiba
It is known that string (1,2)-OT and Rabin's OT are equivalent. However, two steps are required to construct a string (1,2)-OT from Rabin's OT: the first step is a construction of a bit (1,2)-OT from Rabin's OT, and the second step is a construction of a string (1,2)-OT from the bit (1,2)-OT. No direct reduction is known. In this paper, we show a direct reduction of string (1,2)-OT to Rabin's OT by using a deterministic randomness extractor. Our reduction is much more efficient than the previous two-step reduction.
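
(For context: the first step of the two-step reduction mentioned above is the classical construction of a bit (1,2)-OT from Rabin's OT. The Python simulation below is only a rough sketch of that well-known step, not of this paper's direct reduction; the function names, the parameter n=64, and the modelling of Rabin's OT as a 1/2-erasure channel are illustrative assumptions.)

    import secrets

    def rabin_ot(bit):
        # Toy Rabin OT: the receiver learns `bit` with probability 1/2,
        # otherwise it learns nothing (None); the sender learns neither outcome.
        return bit if secrets.randbelow(2) == 0 else None

    def bit_ot_from_rabin(b0, b1, choice, n=64):
        # Toy bit (1,2)-OT built from n Rabin OT calls (sketch of the classical step).
        # Sender picks n random bits and sends each one through Rabin OT.
        r = [secrets.randbelow(2) for _ in range(n)]
        received = [rabin_ot(ri) for ri in r]
        known = [i for i, v in enumerate(received) if v is not None]
        unknown = [i for i, v in enumerate(received) if v is None]
        k = min(len(known), len(unknown))
        if k == 0:
            raise RuntimeError("retry with fresh randomness")
        # Receiver announces two disjoint index sets; the set labelled `choice`
        # contains only indices whose bits the receiver actually received.
        sets = [None, None]
        sets[choice] = known[:k]
        sets[1 - choice] = unknown[:k]
        # Sender masks each input bit with the XOR of the random bits in its set.
        c = [b0, b1]
        for j in (0, 1):
            for i in sets[j]:
                c[j] ^= r[i]
        # Receiver can strip the mask only for the chosen bit.
        out = c[choice]
        for i in sets[choice]:
            out ^= received[i]
        return out

    assert bit_ot_from_rabin(1, 0, choice=0) == 1
    assert bit_ot_from_rabin(1, 0, choice=1) == 0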

2007 / EPRINT / How to Derive Lower Bound on Oblivious Transfer Reduction
Suppose that we are given an ideal oblivious transfer protocol (OT). We wish to construct a larger OT by using the given ideal OT as a black box. How many instances of the ideal OT must then be invoked? For this problem, some lower bounds were derived using entropy. In this paper, we show tighter lower bounds by using combinatorial techniques. Roughly speaking, our lower bounds are twice as large as the previous ones.

2007 / EPRINT / Low-Density Attack Revisited
The low-density attack proposed by Lagarias and Odlyzko is a powerful algorithm against the subset sum problem. The improved algorithm due to Coster et al. solves almost all problems of density < 0.9408... in the asymptotic sense. On the other hand, the subset sum problem itself is NP-hard, and much effort has been devoted to establishing public-key cryptosystems based on it. In these cryptosystems, the densities of the subset sum problems should be higher than 0.9408... in order to avoid the low-density attack. For example, the Chor-Rivest cryptosystem adopted subset sum problems with relatively high densities. In this paper, we further improve the low-density attack by incorporating the idea that integral lattice points can be covered with polynomially many spheres of shorter radius and lower dimension. As a result, the success probability of our attack can be higher than that of Coster et al.'s attack for fixed dimensions. The density bound is also improved for fixed dimensions. Moreover, we show numerically that our improved low-density attack achieves a higher success probability in the case of low Hamming weight solutions, such as in the Chor-Rivest cryptosystem, if we assume SVP oracle calls.
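
(For context: a Lagarias-Odlyzko style low-density attack reduces a subset sum instance to finding a short vector in a lattice. The Python sketch below only constructs the standard integral basis and checks, on a toy instance, that the hidden solution corresponds to a short lattice vector; the numbers are made up, and a real attack would pass the basis to an LLL routine or an SVP oracle rather than use the known solution.)

    # Toy subset sum instance (all numbers here are made up for illustration).
    a = [3, 7, 12, 23, 45, 61]                       # public weights
    x_true = [1, 0, 1, 1, 0, 1]                      # hidden 0/1 solution (for checking only)
    s = sum(ai for ai, xi in zip(a, x_true) if xi)   # public target sum
    n = len(a)
    N = n + 1                                        # scaling factor > sqrt(n)

    # CJLOSS-style basis, scaled by 2 so that all entries are integers:
    #   row i (0 <= i < n):  2*e_i with N*a[i] in the last column
    #   row n             :  (1, ..., 1) with N*s in the last column
    basis = [[0] * (n + 1) for _ in range(n + 1)]
    for i in range(n):
        basis[i][i] = 2
        basis[i][n] = N * a[i]
    basis[n] = [1] * n + [N * s]

    # The solution x corresponds to the short lattice vector
    #   sum_i x_i * b_i - b_n = (2*x_1 - 1, ..., 2*x_n - 1, 0),
    # all of whose entries are +1 or -1.  Verify this on the toy instance;
    # a real attack would instead run LLL (or query an SVP oracle) on `basis`.
    short = [2 * xi - 1 for xi in x_true] + [0]
    combo = [sum(x_true[i] * basis[i][j] for i in range(n)) - basis[n][j]
             for j in range(n + 1)]
    assert combo == short
    print("embedded short vector:", short)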

2006 / EPRINT / Computational Indistinguishability between Quantum States and Its Cryptographic Application
We introduce a computational problem of distinguishing between two specific quantum states as a new cryptographic problem to design a quantum cryptographic scheme that is ``secure'' against any polynomial-time quantum adversary. Our problem QSCDff is to distinguish between two types of random coset states with a hidden permutation over the symmetric group of finite degree. This naturally generalizes the commonly-used distinction problem between two probability distributions in computational cryptography. As our major contribution, we show three cryptographic properties: (i) QSCDff has the trapdoor property; (ii) the average-case hardness of QSCDff coincides with its worst-case hardness; and (iii) QSCDff is computationally at least as hard in the worst case as the graph automorphism problem. These cryptographic properties enable us to construct a quantum public-key cryptosystem, which is likely to withstand any chosen plaintext attack of a polynomial-time quantum adversary. We further discuss a generalization of QSCDff, called QSCDcyc, and introduce a multi-bit encryption scheme relying on the cryptographic properties of QSCDcyc.
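
(For context: the "distinction problem between two probability distributions" that QSCDff generalizes can be illustrated classically. The Python sketch below is a toy distinguishing game between two made-up distributions; it is only an analogue of the notion of computational indistinguishability and does not involve quantum states or the actual QSCDff problem.)

    import random

    def sample(which):
        # Two toy distributions on {0, ..., 7}: D0 is uniform, while D1 is
        # slightly biased toward even values (stand-ins for the two objects
        # that a distinguisher must tell apart).
        v = random.randrange(8)
        if which == 1 and random.random() < 0.2:
            v &= ~1                      # force the value to be even
        return v

    def advantage(distinguisher, trials=50_000):
        # Estimate |Pr[D(x)=1 : x ~ D1] - Pr[D(x)=1 : x ~ D0]|.
        hits = [0, 0]
        for which in (0, 1):
            for _ in range(trials):
                hits[which] += distinguisher(sample(which))
        return abs(hits[1] - hits[0]) / trials

    # Guessing "D1" on even samples already has a noticeable (~0.1) advantage,
    # so D0 and D1 are NOT indistinguishable; a hardness assumption such as
    # QSCDff asserts that no efficient distinguisher achieves such an advantage.
    print(advantage(lambda x: 1 if x % 2 == 0 else 0))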

2005 / EUROCRYPT

2004 / PKC

2002 / FSE

2002 / PKC

2002 / EPRINT / Theoretical Analysis of "Correlations in RC6"
In this paper, we give a theoretical analysis of the chi-square attack proposed by Knudsen and Meier on the RC6 block cipher. To this end, we propose a novel method of security evaluation against the chi-square attack that precisely takes key dependency into account, by introducing a technique we call "Transition Matrix Computing." Previously, no way of evaluating security against the chi-square attack was known other than computer experiments; to our knowledge, this is the first result in which such an evaluation is carried out theoretically. Using this method, we can obtain the "weakest keys" against the attack.
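
(For context: a chi-square attack measures how far the observed distribution of some target output bits deviates from uniform, using the chi-square statistic; the transition-matrix method of this paper evaluates that deviation theoretically instead of by experiment. The Python sketch below computes only the generic statistic on made-up data and is not the RC6-specific attack.)

    import random

    def chi_square_statistic(samples, num_buckets):
        # Chi-square statistic of the observed samples against the uniform
        # distribution over num_buckets values (larger means less uniform).
        counts = [0] * num_buckets
        for v in samples:
            counts[v] += 1
        expected = len(samples) / num_buckets
        return sum((c - expected) ** 2 / expected for c in counts)

    # Uniform data gives a statistic around num_buckets - 1 ...
    uniform = [random.randrange(16) for _ in range(100_000)]
    print("uniform:", round(chi_square_statistic(uniform, 16), 1))

    # ... while biased data (as target bits of a weak cipher/key would produce)
    # gives a much larger value, which is what the distinguisher detects.
    biased = [0 if random.random() < 0.1 else random.randrange(16)
              for _ in range(100_000)]
    print("biased :", round(chi_square_statistic(biased, 16), 1))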

2001 / PKC