
Details

Author(s) / Contributors
Title
RGB2Hands: real-time tracking of 3D hand interactions from monocular RGB video
Is part of
  • ACM transactions on graphics, 2020-11, Vol.39 (6), p.1-16, Article 218
Place / Publisher
New York, NY, USA: ACM
Year of publication
2020
Source
ACM Digital Library
Descriptions/Notes
  • Tracking and reconstructing the 3D pose and geometry of two hands in interaction is a challenging problem that has a high relevance for several human-computer interaction applications, including AR/VR, robotics, and sign language recognition. Existing works are either limited to simpler tracking settings (e.g., considering only a single hand or two spatially separated hands), or rely on less ubiquitous sensors, such as depth cameras. In contrast, in this work we present the first real-time method for motion capture of skeletal pose and 3D surface geometry of hands from a single RGB camera that explicitly considers close interactions. In order to address the inherent depth ambiguities in RGB data, we propose a novel multi-task CNN that regresses multiple complementary pieces of information, including segmentation, dense matchings to a 3D hand model, and 2D keypoint positions, together with newly proposed intra-hand relative depth and inter-hand distance maps. These predictions are subsequently used in a generative model fitting framework in order to estimate pose and shape parameters of a 3D hand model for both hands. We experimentally verify the individual components of our RGB two-hand tracking and 3D reconstruction pipeline through an extensive ablation study. Moreover, we demonstrate that our approach offers previously unseen two-hand tracking performance from RGB, and quantitatively and qualitatively outperforms existing RGB-based methods that were not explicitly designed for two-hand interactions. Furthermore, our method even performs on-par with depth-based real-time methods.
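
The abstract above describes a two-stage pipeline: a multi-task CNN regresses several complementary per-pixel quantities (segmentation, dense correspondences to a 3D hand model, 2D keypoints, intra-hand relative depth, and an inter-hand distance map), and a generative model-fitting stage then uses these predictions to recover pose and shape parameters for both hands. The Python sketch below illustrates what such a shared-encoder, multi-head network could look like; the class name RGB2HandsSketch, the backbone, the head definitions, and all channel counts are assumptions made for illustration and do not reproduce the authors' architecture or the fitting stage.

import torch
import torch.nn as nn

class RGB2HandsSketch(nn.Module):
    """Illustrative multi-task CNN: one shared encoder, several prediction heads.

    NOTE: this is a hypothetical sketch, not the authors' implementation.
    The backbone, head names, and channel counts are chosen only to mirror
    the outputs listed in the abstract.
    """

    def __init__(self, num_joints: int = 21):
        super().__init__()
        # Shared convolutional encoder (placeholder for any image backbone).
        self.encoder = nn.Sequential(
            nn.Conv2d(3, 64, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(64, 128, 3, stride=2, padding=1), nn.ReLU(inplace=True),
            nn.Conv2d(128, 256, 3, stride=2, padding=1), nn.ReLU(inplace=True),
        )

        def head(out_channels: int) -> nn.Sequential:
            # Each task gets a lightweight per-pixel prediction head.
            return nn.Sequential(
                nn.Conv2d(256, 128, 3, padding=1), nn.ReLU(inplace=True),
                nn.Conv2d(128, out_channels, 1),
            )

        self.segmentation = head(3)               # background / left hand / right hand
        self.dense_matching = head(3)             # per-pixel correspondence to a 3D hand model
        self.keypoints_2d = head(2 * num_joints)  # 2D keypoint heatmaps for both hands
        self.intra_depth = head(2)                # relative depth within each hand
        self.inter_distance = head(1)             # distance map between the two hands

    def forward(self, image: torch.Tensor) -> dict:
        features = self.encoder(image)
        return {
            "segmentation": self.segmentation(features),
            "dense_matching": self.dense_matching(features),
            "keypoints_2d": self.keypoints_2d(features),
            "intra_depth": self.intra_depth(features),
            "inter_distance": self.inter_distance(features),
        }

if __name__ == "__main__":
    net = RGB2HandsSketch()
    outputs = net(torch.randn(1, 3, 256, 256))  # dummy RGB frame
    print({k: tuple(v.shape) for k, v in outputs.items()})

In the paper's pipeline, these per-pixel predictions would then drive an optimization that fits the pose and shape parameters of a parametric 3D hand model to both hands; that model-fitting step is intentionally omitted from this sketch.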
Language
English
Identifiers
ISSN: 0730-0301
eISSN: 1557-7368
DOI: 10.1145/3414685.3417852
Title ID: cdi_crossref_primary_10_1145_3414685_3417852
