
Details

Author(s) / Contributors
Michael Kohler; Sophie Langer
Title
On the rate of convergence of fully connected deep neural network regression estimates
Is part of
  • The Annals of statistics, 2021-08, Vol.49 (4), p.2231
Place / Publisher
Hayward: Institute of Mathematical Statistics
Year of publication
2021
Source
Project Euclid Complete
Descriptions/Notes
  • Recent results in nonparametric regression show that deep learning, that is, neural network estimates with many hidden layers, can circumvent the so-called curse of dimensionality, provided suitable restrictions on the structure of the regression function hold. One key feature of the neural networks used in these results is that their architecture carries an additional constraint, namely network sparsity. In this paper, we show that similar results can also be obtained for least squares estimates based on simple fully connected neural networks with ReLU activation functions. Here, either the number of neurons per hidden layer is fixed and the number of hidden layers tends to infinity suitably fast as the sample size tends to infinity, or the number of hidden layers is bounded by some logarithmic factor in the sample size and the number of neurons per hidden layer tends to infinity suitably fast as the sample size tends to infinity. The proof is based on new approximation results concerning deep neural networks.
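  • The abstract describes least squares regression with plain fully connected ReLU networks whose depth (or width) grows with the sample size n. As an illustration only, the sketch below builds such an estimate in PyTorch; the helper `fully_connected_relu`, the depth schedule `depth = ceil(log2(n))`, the toy data, and all optimizer settings are hypothetical choices for exposition, not the architectures or rates derived in the paper.

```python
import math
import torch
import torch.nn as nn

def fully_connected_relu(d_in, width, depth):
    """Plain fully connected ReLU network: `depth` hidden layers of fixed `width`."""
    layers = [nn.Linear(d_in, width), nn.ReLU()]
    for _ in range(depth - 1):
        layers += [nn.Linear(width, width), nn.ReLU()]
    layers.append(nn.Linear(width, 1))  # scalar regression output
    return nn.Sequential(*layers)

# Toy data: y = f(x) + noise for an unknown regression function f (illustrative only).
torch.manual_seed(0)
n, d = 512, 4
X = torch.rand(n, d)
y = torch.sin(2 * math.pi * X[:, :1]) + 0.1 * torch.randn(n, 1)

# First regime from the abstract: fixed width per hidden layer, number of
# hidden layers growing with the sample size n. This logarithmic schedule
# is a hypothetical stand-in, not the growth rate proved in the paper.
width = 16
depth = max(2, math.ceil(math.log2(n)))
net = fully_connected_relu(d, width, depth)

# Least squares estimate: minimize the empirical L2 risk over the network class.
opt = torch.optim.Adam(net.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
for _ in range(2000):
    opt.zero_grad()
    loss = loss_fn(net(X), y)
    loss.backward()
    opt.step()

print(f"depth={depth}, final empirical L2 risk: {loss.item():.4f}")
```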
Language
English
Identifiers
ISSN: 0090-5364
eISSN: 2168-8966
DOI: 10.1214/20-AOS2034
Title ID: cdi_proquest_journals_2578871387
