
Details

Author(s) / Contributors
Title
Learn to Adapt for Self-Supervised Monocular Depth Estimation
Is part of
  • IEEE transactions on neural networks and learning systems, 2023-07, Vol.PP, p.1-13
Place / Publisher
United States: IEEE
Year of publication
2023
Source
IEEE Electronic Library (IEL)
Descriptions / Notes
  • Monocular depth estimation is a fundamental task in environmental perception and has made tremendous progress by virtue of deep learning. However, the performance of a trained model tends to degrade when it is deployed on new datasets because of the gap between datasets. Although some methods use domain adaptation techniques to train on several domains jointly and narrow the gap between them, the resulting models cannot generalize to new domains that were not involved in training. To boost the transferability of self-supervised monocular depth estimation models and to mitigate meta-overfitting, we train the model in a meta-learning pipeline and propose an adversarial depth estimation task. We adopt model-agnostic meta-learning (MAML) to obtain universal initial parameters for further adaptation, and we train the network in an adversarial manner to extract domain-invariant representations that ease meta-overfitting. In addition, we propose a cross-task depth-consistency constraint that compels the depth estimates to be identical across different adversarial tasks, which improves the performance of our method and smooths the training process. Experiments on four new datasets demonstrate that our method adapts quickly to new domains: trained for only 0.5 epochs, it achieves results comparable to state-of-the-art methods trained for at least 20 epochs.
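  • The bi-level optimization the abstract refers to (MAML-style meta-learning: an inner adaptation step per domain, an outer update of the shared initialization) can be sketched as follows. This is a minimal first-order MAML toy example on hypothetical one-parameter linear-regression "domains" using only NumPy; the task setup, learning rates, and model are illustrative assumptions, not the paper's depth network or its adversarial tasks.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_task(rng):
    # Hypothetical "domain": y = a * x with a domain-specific slope a.
    a = rng.uniform(0.5, 2.0)
    x = rng.uniform(-1.0, 1.0, size=20)
    return x, a * x

def loss_and_grad(w, x, y):
    # Mean-squared error of the one-parameter model pred = w * x,
    # and its gradient with respect to w.
    err = w * x - y
    return np.mean(err ** 2), 2.0 * np.mean(err * x)

w = 0.0                      # meta-initialization (the "universal initial parameters")
inner_lr, outer_lr = 0.1, 0.05

for step in range(500):
    meta_grad = 0.0
    for _ in range(4):       # small batch of domains per meta-step
        x, y = make_task(rng)
        # Inner loop: one gradient step of adaptation on support samples.
        _, g = loss_and_grad(w, x[:10], y[:10])
        w_adapted = w - inner_lr * g
        # Outer loss on held-out samples of the same domain; first-order
        # approximation: treat the gradient w.r.t. w_adapted as the
        # gradient w.r.t. the meta-initialization w.
        _, g_outer = loss_and_grad(w_adapted, x[10:], y[10:])
        meta_grad += g_outer / 4
    w -= outer_lr * meta_grad
```

    After meta-training, a single inner step on an unseen domain already lowers the held-out loss compared with using the shared initialization directly, which is the "fast adaptation" property the abstract measures in epochs.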
Language
English
Identifiers
ISSN: 2162-237X
eISSN: 2162-2388
DOI: 10.1109/TNNLS.2023.3289051
Titel-ID: cdi_pubmed_primary_37410648
