A lower bound for learning distributions generated by probabilistic automata

Cite as: hdl:2117/10556
Document type: Conference report
Defense date: 2010
Publisher: Springer
Rights access: Open Access
Except where otherwise noted, content on this work is licensed under a Creative Commons license: Attribution-NonCommercial-NoDerivs 3.0 Spain.
Abstract
Known algorithms for learning PDFA can only be shown to run in time polynomial in the so-called distinguishability μ of the target machine, besides the number of states and the usual accuracy and confidence parameters. We show that the dependence on μ is necessary for every algorithm whose structure resembles existing ones. As a technical tool, a new variant of Statistical Queries termed L∞-queries is defined. We show how these queries can be simulated from samples and observe that known PAC algorithms for learning PDFA can be rewritten to access their target using L∞-queries and standard Statistical Queries. Finally, we show a lower bound: every algorithm to learn PDFA using queries with a reasonable tolerance needs a number of queries larger than (1/μ)^c for every c < 1.
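The abstract notes that L∞-queries can be simulated from samples. As a minimal illustration (not the paper's construction), one can estimate the L∞ distance between two distributions by taking the maximum difference of empirical frequencies over all observed outcomes; the function name and interface below are assumptions for the sketch:

```python
from collections import Counter

def linf_query(sample1, sample2):
    """Empirical estimate of the L-infinity distance between the two
    distributions that generated the i.i.d. samples: the maximum
    absolute difference of empirical frequencies over the observed
    support. Hypothetical helper, not the paper's simulation."""
    f1, f2 = Counter(sample1), Counter(sample2)
    n1, n2 = len(sample1), len(sample2)
    support = set(f1) | set(f2)
    # Counter returns 0 for unseen outcomes, so outcomes seen in only
    # one sample contribute their full empirical frequency.
    return max(abs(f1[x] / n1 - f2[x] / n2) for x in support)
```

For example, `linf_query(["a", "a", "b", "b"], ["a", "b", "b", "b"])` compares empirical frequencies (1/2, 1/2) against (1/4, 3/4) and returns 0.25. With enough samples, this estimate falls within any fixed tolerance of the true L∞ distance with high probability, which is the sense in which such a query can be answered from data.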
Citation: Balle, B.; Castro, J.; Gavaldà, R. A lower bound for learning distributions generated by probabilistic automata. In: "21st International Conference on Algorithmic Learning Theory". Canberra: Springer, 2010, p. 179-193.
ISBN: 978-3-642-16107-0
| Files | Description | Size | Format | View |
|---|---|---|---|---|
| alt2010final.pdf | | 181.6 KB | | View/Open |