
Details

Author(s) / Contributors
Title
Excitation Dropout: Encouraging Plasticity in Deep Neural Networks
Is part of
  • International journal of computer vision, 2021-04, Vol.129 (4), p.1139-1152
Place / Publisher
New York: Springer US
Year of publication
2021
Link to full text
Source
SpringerLink (Online service)
Descriptions/Notes
  • We propose a guided dropout regularizer for deep networks based on the evidence of a network prediction, defined as the firing of neurons along specific paths. In this work, we utilize the evidence at each neuron to determine the probability of dropout, rather than dropping out neurons uniformly at random as in standard dropout. In essence, we drop out with higher probability those neurons which contribute more to decision making at training time. This approach penalizes high-saliency neurons that are most relevant for model prediction, i.e. those having stronger evidence. By dropping such high-saliency neurons, the network is forced to learn alternative paths in order to maintain loss minimization, resulting in a plasticity-like behavior, also a characteristic of human brains. We demonstrate better generalization ability, an increased utilization of network neurons, and a higher resilience to network compression using several metrics over four image/video recognition benchmarks.
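  • The idea described in the abstract can be sketched in a few lines: compute a per-neuron "evidence" score, map it to a dropout probability so that high-evidence neurons are dropped more often, then sample a mask. The sketch below is a minimal illustration, not the paper's method: the paper derives evidence via Excitation Backprop, whereas here the activation magnitude is used as a hypothetical proxy, and the evidence-to-probability mapping (scaling evidence to hit a mean drop rate `base_rate`) is an assumption for illustration only.

    ```python
    import numpy as np

    def excitation_dropout_mask(activations, base_rate=0.5, rng=None):
        """Sample a retain mask where higher-evidence neurons are dropped
        with higher probability (illustrative sketch, not the paper's exact
        formula; evidence is approximated by activation magnitude)."""
        if rng is None:
            rng = np.random.default_rng()
        a = np.abs(np.asarray(activations, dtype=float))
        evidence = a / (a.sum() + 1e-12)          # normalized per-neuron evidence
        n = a.size
        # Scale evidence so the average dropout probability equals base_rate;
        # clip to keep each probability in [0, 1].
        p_drop = np.clip(evidence * n * base_rate, 0.0, 1.0)
        keep = (rng.random(a.shape) >= p_drop).astype(float)
        return keep
    ```

    During training, the mask would multiply the layer's activations in place of a standard dropout mask, so that the network is pushed to route loss minimization through alternative, lower-evidence paths.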
