Some structural complexity aspects of neural computation
Cite as: hdl:2117/370366
Document type: Research report
Publication date: 1992
Access conditions: Open access
Unless otherwise indicated, the contents of this work are subject to the Creative Commons license: Attribution-NonCommercial-NoDerivatives 4.0 International.
Abstract
Recent work by Siegelmann and Sontag has demonstrated that polynomial time on linear saturated recurrent neural networks equals polynomial time on standard computational models: Turing machines if the weights of the net are rationals, and nonuniform circuits if the weights are reals. Here we develop further connections between the languages recognized by such neural nets and other complexity classes. We present connections to space-bounded classes, simulations of parallel computational models such as Vector Machines, and a discussion of the characterizations of various nonuniform classes in terms of Kolmogorov complexity.
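For readers unfamiliar with the model the abstract refers to, the Siegelmann–Sontag networks update their state with a saturated-linear (piecewise-linear) activation. A minimal sketch of one such update step, with hypothetical weight values chosen only for illustration:

```python
import numpy as np

def sigma(x):
    # Saturated-linear activation: identity on [0, 1], clamped outside.
    return np.clip(x, 0.0, 1.0)

def step(x, u, A, B, c):
    # One recurrent update: x(t+1) = sigma(A x(t) + B u(t) + c).
    return sigma(A @ x + B @ u + c)

# Tiny 2-neuron net with rational weights (values are illustrative only).
A = np.array([[0.5, 0.0], [0.25, 0.5]])
B = np.array([[1.0], [0.0]])
c = np.array([0.0, 0.0])

x = np.zeros(2)
for t in range(3):
    u = np.array([1.0 if t == 0 else 0.0])  # feed a single input pulse
    x = step(x, u, A, B, c)
```

With rational weights such a net can be simulated by a Turing machine with polynomial overhead, which is the equivalence the abstract builds on; real weights give the nonuniform-circuit correspondence instead.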
Citation: Balcazar, J.L. [et al.]. Some structural complexity aspects of neural computation. 1992.
Part of: LSI-92-32-R
File | Size | Format
---|---|---
1400013446.pdf | 1,352Mb | PDF