
Details

Author(s) / Contributors
Title
Recurrent Residual Networks Contain Stronger Lottery Tickets
Is Part Of
  • IEEE access, 2023-01, Vol.11, p.1-1
Place / Publisher
Piscataway: IEEE
Year of Publication
2023
Source
EZB Electronic Journals Library
Descriptions / Notes
  • Accurate neural networks can be found simply by pruning a randomly initialized, overparameterized model, eliminating the need for any weight optimization. The resulting subnetworks are small, sparse, and ternary, making them excellent candidates for efficient hardware implementation. However, finding optimal connectivity patterns remains an open challenge. Based on evidence that residual networks may approximate unrolled shallow recurrent neural networks, we conjecture that they contain better candidate subnetworks at inference time when explicitly transformed into recurrent architectures. We test this hypothesis on image classification tasks, where we find subnetworks within the recurrent models that are more accurate and parameter-efficient than both those found within feedforward models and the full models with learned weights. Furthermore, random recurrent subnetworks are tiny: under a simple compression scheme, ResNet-50 is compressed to 48.55× less memory, fitting in under 2 megabytes, without a drastic loss in performance. Code available at: https://github.com/Lopez-Angel/hidden-fold-networks.
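  • The core idea in the abstract — obtaining a sparse, ternary subnetwork from a randomly initialized layer by pruning alone, with no weight training — can be sketched as follows. This is a minimal illustration, not the paper's actual method: the function name, the magnitude-based selection criterion, and the keep ratio are all illustrative assumptions.

```python
import numpy as np

def prune_to_ternary(weights: np.ndarray, keep_ratio: float) -> np.ndarray:
    """Keep the top `keep_ratio` fraction of weights by magnitude,
    then replace each surviving weight with its sign.

    The result is a sparse, ternary {-1, 0, +1} layer derived purely
    by masking the original random weights (no training).
    """
    k = int(weights.size * keep_ratio)
    # Threshold at the k-th largest absolute value across the layer.
    threshold = np.sort(np.abs(weights), axis=None)[-k]
    mask = np.abs(weights) >= threshold
    return np.sign(weights) * mask

# Random, untrained weights standing in for one layer of the model.
rng = np.random.default_rng(0)
w = rng.standard_normal((64, 64))
ternary = prune_to_ternary(w, keep_ratio=0.3)
print(np.unique(ternary))           # values drawn from {-1, 0, +1}
print(float(np.mean(ternary == 0))) # sparsity, roughly 1 - keep_ratio
```

    Because each kept weight carries only its sign, the subnetwork can be stored as a sparse mask plus one sign bit per survivor, which is the kind of representation that makes the aggressive compression figures quoted above plausible.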
Language
English
Identifiers
ISSN: 2169-3536
eISSN: 2169-3536
DOI: 10.1109/ACCESS.2023.3245808
Title ID: cdi_ieee_primary_10045676
