Adapting Density Attacks to Low-Weight Knapsacks

Phong Q. Nguyễn, Jacques Stern
2005 Lecture Notes in Computer Science  
Cryptosystems based on the knapsack problem were among the first public-key systems to be invented. Their high encryption/decryption rate attracted considerable interest until it was noticed that the underlying knapsacks often had a low density, which made them vulnerable to lattice attacks, both in theory and in practice. To prevent low-density attacks, several designers found a subtle way to increase the density beyond the critical density: they decreased the weight of the knapsack, possibly allowing non-binary coefficients. This approach is actually a bit misleading: we show that low-weight knapsacks do not prevent efficient reductions to lattice problems such as the shortest vector problem; in fact, they make such reductions more likely. To measure the resistance of low-weight knapsacks, we introduce the novel notion of pseudo-density, and we apply the new notion to the Okamoto-Tanaka-Uchiyama (OTU) cryptosystem from Crypto '00. We do not claim to break OTU, and we actually believe that this system may be secure with an appropriate choice of the parameters. However, our research indicates that, in its current form, OTU cannot be supported by an argument based on density. Our results also explain why Schnorr and Hörner were able to solve, at Eurocrypt '95, certain high-density knapsacks related to the Chor-Rivest cryptosystem using lattice reduction.

The knapsack problem (also known as the subset sum problem) is the following: given positive integers $a_1, \dots, a_n$ and a sum $s = \sum_{i=1}^{n} m_i a_i$, where each $m_i \in \{0, 1\}$, recover the $m_i$'s. On the one hand, it is well known that this problem is NP-hard, and accordingly it is considered to be hard in the worst case. On the other hand, some knapsacks are very easy to solve, such as when the $a_i$'s are the successive powers of two, in which case the problem is simply to find the binary decomposition of $s$. This inspired many public-key cryptosystems in the eighties, following the seminal work of Merkle and Hellman [10].
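As a concrete illustration of the easy case above, here is a minimal Python sketch (our own illustration, not taken from the paper) that solves a superincreasing knapsack, of which the powers-of-two knapsack is the canonical example, by the standard greedy method; for powers of two, this amounts to reading off the binary decomposition of $s$.

```python
def solve_superincreasing(a, s):
    """Greedy solver for a superincreasing knapsack: each a[i] exceeds
    the sum of all earlier coefficients a[0..i-1], so scanning from the
    largest coefficient down, a[i] must be selected whenever it still
    fits in the remaining sum. Returns the 0/1 vector m with
    sum(m[i] * a[i]) == s, or None if s is not representable."""
    m = [0] * len(a)
    for i in reversed(range(len(a))):
        if a[i] <= s:
            m[i] = 1
            s -= a[i]
    return m if s == 0 else None

# Powers of two: solving the knapsack is exactly binary decomposition.
a = [1, 2, 4, 8, 16, 32]            # a_i = 2^(i-1)
s = 45                              # 45 = 0b101101 = 32 + 8 + 4 + 1
print(solve_superincreasing(a, s))  # [1, 0, 1, 1, 0, 1]
```

Merkle and Hellman's idea was precisely to disguise such an easy knapsack as a hard-looking one; the greedy step above is what the trapdoor owner recovers after undoing the disguise.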
doi:10.1007/11593447_3