Whose Hands Are These? Hand Detection and Hand-Body Association in the Wild
Supreeth Narasimhaswamy1, Thanh Nguyen2, Mingzhen Huang3, Minh Hoai1,2
1Stony Brook University, 2VinAI, 3University at Buffalo
CVPR 2022
Qualitative Results
Figure 1: We develop a method to detect hands and their corresponding body locations. Hands and bodies belonging to the same person have bounding boxes in the same color and identification numbers.
Abstract
We study a new problem of detecting hands and finding the location of the corresponding person for each detected hand. This task is useful for many downstream applications such as hand tracking and hand contact estimation. Associating hands with people is challenging in unconstrained conditions since multiple people can be present in the scene with varying overlaps and occlusions.
We propose a novel end-to-end trainable convolutional network that can jointly detect hands and the body location for the corresponding person. Our method first detects a set of hands and bodies and uses a novel Hand-Body Association Network to predict association scores between them. We use these association scores to find the body location for each detected hand. We also introduce a new challenging dataset called BodyHands, containing unconstrained images with annotations for hands and their corresponding body locations. We conduct extensive experiments on BodyHands and another public dataset to show the effectiveness of our method. Finally, we demonstrate the benefits of hand-body association in two critical applications: hand tracking and hand contact estimation. Our experiments show that hand tracking and hand contact estimation methods can be improved significantly by reasoning about the hand-body association.
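For illustration only, the sketch below is a simplified, hypothetical view (not the released implementation) of how pairwise hand-body association scores can be turned into hand-to-body assignments: each detected hand is matched to the body with the highest predicted score. The function name and the toy score matrix are our own assumptions.

```python
# Hypothetical sketch: assign each detected hand to a body using
# predicted association scores (not the paper's actual code).
import numpy as np

def assign_hands_to_bodies(association_scores: np.ndarray) -> np.ndarray:
    """association_scores: (num_hands, num_bodies) matrix of predicted scores.

    Returns, for each hand, the index of the body with the highest score.
    """
    return association_scores.argmax(axis=1)

# Toy example with 3 detected hands and 2 detected bodies.
scores = np.array([
    [0.9, 0.1],   # hand 0 is most likely associated with body 0
    [0.2, 0.8],   # hand 1 is most likely associated with body 1
    [0.7, 0.6],   # hand 2 is most likely associated with body 0
])
print(assign_hands_to_bodies(scores))  # [0 1 0]
```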
BodyHands Dataset
As a part of this project, we release the BodyHands dataset. BodyHands is a large-scale dataset based on ContactHands, and has images with annotations for hand and body locations and their correspondences. The dataset can be downloaded at BodyHands (4.6GB).
We also release YoutubeHands-20, a dataset used in our hand tracking experiments. YoutubeHands-20 contains 20 videos with annotations for hand locations and trajectories. The dataset can be downloaded at YoutubeHands-20 (700MB). YoutubeHands-20 has now been expanded to a larger dataset called YoutubeHands containing 200 videos.
Figure 2: Representative images from the BodyHands dataset. Hands and bodies belonging to the same person have bounding boxes in the same color and identification numbers.
Code
Please see the code for the proposed method's implementation.
Paper
Whose Hands Are These? Hand Detection and Hand-Body Association in the Wild. Supreeth Narasimhaswamy, Thanh Nguyen, Mingzhen Huang, and Minh Hoai, IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR), 2022.
If you find this work useful in your research, please cite our work:
@inproceedings{bodyhands_2022,
title={Whose Hands Are These? Hand Detection and Hand-Body Association in the Wild},
author={Supreeth Narasimhaswamy and Thanh Nguyen and Mingzhen Huang and Minh Hoai},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2022},
}
@inproceedings{handler_2022,
title={Forward Propagation, Backward Regression and Pose Association for Hand Tracking in the Wild},
author={Mingzhen Huang and Supreeth Narasimhaswamy and Saif Vazir and Haibin Ling and Minh Hoai},
booktitle={Proceedings of the IEEE/CVF Conference on Computer Vision and Pattern Recognition (CVPR)},
year={2022},
}
@inproceedings{contacthands_2020,
title={Detecting Hands and Recognizing Physical Contact in the Wild},
author={Supreeth Narasimhaswamy and Trung Nguyen and Minh Hoai},
booktitle={Advances in Neural Information Processing Systems (NeurIPS)},
year={2020},
}
Copyright and Disclaimer
Copyright:
Notwithstanding the publication and/or disclosure of this document, the data, and any material pertaining to this document (collectively the “Works”), all copyright and all rights therein are maintained by and remain the proprietary property of the authors and/or the copyright holders.
Disclaimer:
The Works are provided “AS IS” without warranties or conditions of any kind, either expressed or implied. To the fullest extent possible under applicable law, the authors and/or the copyright holders disclaim any and all warranties and conditions, expressed or implied, including but not limited to, implied warranties or conditions of merchantability and fitness for a particular purpose, non-infringement or other violation of rights or breach of contract. The authors and/or the copyright holders of the Works do not warrant or make any representations of any kind in relation to, or in connection with, the use, accuracy, timeliness, completeness, efficacy, fitness, applicability, performance, security, availability, or reliability of the Works, or the results from the use of the Works. In no event and under no circumstance shall the authors and/or the copyright holders of the Works be liable for any claim, damages, or other liability, whether in an action of contract, tort, or otherwise, arising out of or in connection with the Works, or the use of the Works.