The Kernel Matrix Diffie-Hellman Assumption

Cite as: hdl:2117/102936
Document type: Article
Defense date: 2016-12
Rights access: Open Access
Abstract
We put forward a new family of computational assumptions, the Kernel Matrix Diffie-Hellman Assumption. Given a matrix A sampled from some distribution D, the kernel assumption says that it is hard to find “in the exponent” a nonzero vector in the kernel of Aᵀ. This family is a natural computational analogue of the Matrix Decisional Diffie-Hellman Assumption (MDDH) proposed by Escala et al., and as such it allows the advantages of their algebraic framework to be extended to computational assumptions. The k-Decisional Linear Assumption is an example of a family of decisional assumptions of strictly increasing hardness as k grows. We show that for any such family of MDDH assumptions, the corresponding Kernel assumptions are also strictly increasingly weaker. This requires ruling out the existence of certain black-box reductions between flexible problems (i.e., computational problems with a non-unique solution).
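For orientation, the assumption described in the abstract can be written out roughly as follows. This is only a sketch: the implicit notation [a] = gᵃ, the group generator Gen, and the matrix distribution D_{ℓ,k} are the (assumed) conventions of Escala et al.'s MDDH framework; the precise formulation is the one given in the paper itself.

```latex
\documentclass{article}
\usepackage{amsmath,amssymb,amsthm}
\newtheorem{definition}{Definition}
\begin{document}
% Sketch of the D_{l,k}-KerMDH assumption. The implicit notation [a] = g^a,
% the group generator Gen and the matrix distribution D_{l,k} are assumed
% from the framework of Escala et al., not quoted from this paper.
\begin{definition}[$\mathcal{D}_{\ell,k}$-KerMDH, sketch]
Let $\mathcal{D}_{\ell,k}$ be a matrix distribution over
$\mathbb{Z}_q^{\ell\times k}$ (typically with $\ell>k$, so that the kernel of
$\mathbf{A}^{\top}$ is nontrivial). The $\mathcal{D}_{\ell,k}$-Kernel Matrix
Diffie--Hellman assumption holds relative to $\mathsf{Gen}$ if, for every PPT
adversary $\mathcal{A}$,
\[
  \Pr\!\left[
      \mathbf{x}\neq\mathbf{0} \;\wedge\; \mathbf{A}^{\top}\mathbf{x}=\mathbf{0}
      \;:\;
      gk\leftarrow\mathsf{Gen}(1^{\lambda}),\;
      \mathbf{A}\leftarrow\mathcal{D}_{\ell,k},\;
      [\mathbf{x}]\leftarrow\mathcal{A}\bigl(gk,[\mathbf{A}]\bigr)
  \right]
\]
is negligible in $\lambda$; that is, given $[\mathbf{A}]$ it is hard to output,
in the exponent, a nonzero vector in the kernel of $\mathbf{A}^{\top}$.
\end{definition}
\end{document}
```

Note that, unlike its decisional counterpart MDDH, this is a search (computational) problem, and its solutions are not unique, which is exactly the "flexible problem" feature the abstract refers to.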
Description
The final publication is available at link.springer.com
Citation: Morillo, M., Rafols, C., Villar, J. The Kernel Matrix Diffie-Hellman Assumption. "Lecture notes in computer science", December 2016, vol. 10031, p. 729-758.
ISSN: 0302-9743
Publisher version: http://link.springer.com/chapter/10.1007/978-3-662-53887-6_27
Files | Description | Size | Format |
---|---|---|---|
main-mcdh_asiacrypt16.pdf | | 541.9 KB | PDF |
All rights reserved. This work is protected by the corresponding intellectual and industrial property rights. Without prejudice to any existing legal exemptions, reproduction, distribution, public communication or transformation of this work are prohibited without permission of the copyright holder.