The Kernel Matrix Diffie-Hellman Assumption
We put forward a new family of computational assumptions, the Kernel Matrix Diffie-Hellman Assumption. Given a matrix A sampled from some distribution D, the kernel assumption says that it is hard to find "in the exponent" a nonzero vector in the kernel of Aᵀ. This family is a natural computational analogue of the Matrix Decisional Diffie-Hellman Assumption (MDDH) proposed by Escala et al., and as such it extends the advantages of their algebraic framework to computational assumptions. The k-Decisional Linear Assumption is an example of a family of decisional assumptions of strictly increasing hardness as k grows. We show that for any such family of MDDH assumptions, the corresponding Kernel assumptions are also strictly increasingly weaker. Proving this requires ruling out the existence of certain black-box reductions between flexible problems (i.e., computational problems with a non-unique solution).
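As a toy illustration (with assumed small parameters, not taken from the paper), the following sketch shows what a solution to the kernel problem looks like and how it is verified "in the exponent": a solver must exhibit a nonzero vector x with xᵀA = 0, and a verifier who knows A can check the claim using only the group encoding (gˣⁱ) of x. Here the sketch computes x directly from A, merely to produce a valid solution; under the actual assumption the solver would see only the encoding of A.

```python
# Toy illustration (assumed parameters, not from the paper): the kernel
# problem "in the exponent" in the order-q subgroup of Z_p^*.
p, q, g = 23, 11, 2          # g = 2 generates the order-11 subgroup of Z_23^*

# A sample matrix A in Z_q^{3x2} (fixed here for reproducibility).
A = [[3, 5],
     [7, 2],
     [1, 9]]

# For l = 3, k = 2 a nonzero kernel vector of A^T is the "cross product"
# of the two columns of A (all arithmetic mod q).
c0 = [row[0] for row in A]
c1 = [row[1] for row in A]
x = [(c0[1] * c1[2] - c0[2] * c1[1]) % q,
     (c0[2] * c1[0] - c0[0] * c1[2]) % q,
     (c0[0] * c1[1] - c0[1] * c1[0]) % q]
assert any(x), "degenerate sample: kernel vector is zero"

# The solver outputs the encoding [x] = (g^{x_i}); the verifier, who
# knows A, checks x^T A = 0 entirely in the exponent: for each column j,
# prod_i (g^{x_i})^{A[i][j]} = g^{sum_i x_i A[i][j]} must equal g^0 = 1.
enc_x = [pow(g, xi, p) for xi in x]
for j in range(2):
    prod = 1
    for i in range(3):
        prod = (prod * pow(enc_x[i], A[i][j], p)) % p
    assert prod == 1
print("nonzero kernel vector verified in the exponent:", x)
```

The verification step is the point of the exercise: membership of x in the kernel of Aᵀ can be checked publicly from the encoding of x alone, which is what makes the assumption a *computational* (search) analogue of the decisional MDDH family.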
The final publication is available at link.springer.com
Citation: Morillo, M., Ràfols, C., Villar, J. The Kernel Matrix Diffie-Hellman Assumption. "Lecture Notes in Computer Science", December 2016, vol. 10031, p. 729-758.