
Details

Author(s) / Contributors
Title
Revisiting Weakly Supervised Pre-Training of Visual Perception Models
Is part of
  • 2022 IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022, pp. 794-804
Place / Publisher
IEEE
Year of publication
2022
Link to full text
Source
IEEE Electronic Library (IEL)
Descriptions / Notes
  • Model pre-training is a cornerstone of modern visual recognition systems. Although fully supervised pre-training on datasets like ImageNet is still the de facto standard, recent studies suggest that large-scale weakly supervised pre-training can outperform fully supervised approaches. This paper revisits weakly supervised pre-training of models using hashtag supervision with modern versions of residual networks and the largest-ever dataset of images and corresponding hashtags. We study the performance of the resulting models in various transfer-learning settings including zero-shot transfer. We also compare our models with those obtained via large-scale self-supervised learning. We find our weakly supervised models to be very competitive across all settings, and find they substantially outperform their self-supervised counterparts. We also include an investigation into whether our models learned potentially troubling associations or stereotypes. Overall, our results provide a compelling argument for the use of weakly supervised learning in the development of visual recognition systems. Our models, Supervised Weakly through hashtAGs (SWAG), are available publicly.
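
The abstract states that the SWAG models are publicly released. The sketch below illustrates how such a pre-trained backbone could be loaded for feature extraction in a transfer-learning setting; the repository path "facebookresearch/swag", the entry-point name "vit_b16", and the preprocessing resolution are assumptions about the public release and are not stated in this record.

    # Minimal sketch (assumptions noted in comments): load a released SWAG
    # backbone via torch.hub and extract image features for transfer learning.
    import torch
    from PIL import Image
    from torchvision import transforms

    # Assumption: the public SWAG release exposes torch.hub entry points such
    # as "vit_b16"; other checkpoints (e.g. RegNetY variants) may also exist.
    model = torch.hub.load("facebookresearch/swag", model="vit_b16")
    model.eval()

    # Standard ImageNet preprocessing; the exact input resolution expected by
    # a given checkpoint is an assumption and may differ per model.
    preprocess = transforms.Compose([
        transforms.Resize(224, interpolation=transforms.InterpolationMode.BICUBIC),
        transforms.CenterCrop(224),
        transforms.ToTensor(),
        transforms.Normalize(mean=[0.485, 0.456, 0.406],
                             std=[0.229, 0.224, 0.225]),
    ])

    image = preprocess(Image.open("example.jpg").convert("RGB")).unsqueeze(0)  # [1, 3, 224, 224]
    with torch.no_grad():
        features = model(image)  # pooled feature embedding usable for downstream transfer
    print(features.shape)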
