Enhanced Recognition for Finger Gesture-Based Control in Humanoid Robots Using Inertial Sensors

Research Article  ·  Published: 30 October 2024
ICCK Transactions on Sensing, Communication, and Control
Volume 1, Issue 2, 2024: 89-100

Jingyi Xie, Na Xiang, and Shenglun Yi

1 School of Computer and Information Engineering, Beijing Technology and Business University, Beijing 100048, China
2 Department of Information Engineering, University of Padua, Italy
Corresponding Author: Shenglun Yi, [email protected]

Abstract

Humanoid robots play a significant role in numerous fields, where efficient and intuitive control inputs are essential, particularly in applications requiring remote operation. In this paper, we investigate the potential advantages of inertial sensors as a key component for generating command signals in humanoid robot control systems. The objective is to accurately detect user motion through inertial sensing, thereby enabling precise control commands. Finger gestures are first captured as signals from the inertial sensor, and movement commands are extracted through filtering and recognition processes. These commands are then translated into corresponding robot actions based on the sensor’s attitude angles. Experimental results demonstrate the accuracy and effectiveness of this method in recognizing finger movements and converting them into reliable robot operations. The use of inertial sensors for gesture recognition simplifies the transmission of control inputs, promoting more user-friendly and efficient interfaces for humanoid robot operation. This approach not only improves control precision but also enhances the practicality of deploying humanoid robots in real-world environments.
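As a concrete illustration of the pipeline described above, the sketch below converts a unit quaternion reported by an inertial sensor into roll, pitch, and yaw attitude angles and maps a sustained finger tilt to a coarse movement command. This is a minimal sketch under stated assumptions, not the authors' implementation: the quaternion input, the 30-degree threshold, and the command names are illustrative choices.

# Minimal sketch (not the authors' implementation): map IMU attitude to
# discrete robot commands. Assumes the IMU driver supplies a unit
# quaternion (w, x, y, z); threshold and command names are illustrative.
import math

def quaternion_to_euler(w, x, y, z):
    """Convert a unit quaternion to roll, pitch, yaw (radians, ZYX convention)."""
    roll = math.atan2(2.0 * (w * x + y * z), 1.0 - 2.0 * (x * x + y * y))
    # Clamp to avoid domain errors from numerical noise near +/-90 deg pitch.
    sinp = max(-1.0, min(1.0, 2.0 * (w * y - z * x)))
    pitch = math.asin(sinp)
    yaw = math.atan2(2.0 * (w * z + x * y), 1.0 - 2.0 * (y * y + z * z))
    return roll, pitch, yaw

def classify_gesture(roll, pitch, threshold=math.radians(30)):
    """Map a sustained finger tilt to a coarse movement command."""
    if pitch > threshold:
        return "MOVE_FORWARD"
    if pitch < -threshold:
        return "MOVE_BACKWARD"
    if roll > threshold:
        return "TURN_RIGHT"
    if roll < -threshold:
        return "TURN_LEFT"
    return "STOP"

# Example: a finger pitched up about 45 degrees maps to MOVE_FORWARD.
w, x, y, z = 0.9239, 0.0, 0.3827, 0.0   # ~45 deg rotation about the y-axis
roll, pitch, yaw = quaternion_to_euler(w, x, y, z)
print(classify_gesture(roll, pitch))     # -> MOVE_FORWARD

In practice, the raw inertial signal would first be filtered (as described in the abstract) before the attitude angles are thresholded, so that transient jitter does not trigger spurious commands.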

Keywords

inertial sensor; finger gesture; NAO humanoid robot; quaternions; motion capture

Data Availability Statement

Data will be made available on request.

Funding

This work received no external funding.

Conflicts of Interest

The authors declare no conflicts of interest.

Ethical Approval and Consent to Participate

Not applicable.


Cite This Article

APA Style
Xie, J., Xiang, N., & Yi, S. (2024). Enhanced Recognition for Finger Gesture-Based Control in Humanoid Robots Using Inertial Sensors. ICCK Transactions on Sensing, Communication, and Control, 1(2), 89–100. https://doi.org/10.62762/TSCC.2024.805710

Article Metrics

Citations: Google Scholar 2 · Crossref 1 · Scopus 1 · Web of Science 1
Views: 2832 · PDF Downloads: 382

Publisher's Note

ICCK stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and Permissions

Institute of Central Computation and Knowledge (ICCK) or its licensor holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ICCK Transactions on Sensing, Communication, and Control
ISSN: 3068-9287 (Online) | ISSN: 3068-9279 (Print)
Preserved at Portico