Academic Editor: Weimin Zhang, Beijing Institute of Technology, China
ICCK Transactions on Intelligent Systematics, Volume 1, Issue 1, 2024: 3-9

Free to Read | Research Article | 15 May 2024
Visual Feature Extraction and Tracking Method Based on Corner Flow Detection
1 National Engineering Laboratory for Agri-product Quality Traceability, BTBU, Beijing, China
* Corresponding Author: Huijun Ma, [email protected]
Received: 09 January 2024, Accepted: 10 May 2024, Published: 15 May 2024  
Cited by: 3 (Web of Science), 5 (Google Scholar)
Abstract
Vision-based front-end feature tracking is the process by which a moving robot captures images of its surrounding environment with a camera, extracts feature points from each frame, and matches them between consecutive frames to estimate the robot's pose change from the motion of those points. Descriptor-based feature matching performs well under significant lighting and texture variations, but computing descriptors increases computational cost and introduces instability. This paper therefore proposes a novel approach that combines sparse optical flow tracking with Shi-Tomasi corner detection in place of descriptor matching. The method offers improved stability under challenging lighting and texture variations while maintaining lower computational cost. Experimental results, obtained with the OpenCV library on the Ubuntu operating system, demonstrate the algorithm's effectiveness and efficiency.
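
To make the pipeline concrete, the following is a minimal sketch (not the authors' implementation) of the descriptor-free front end described in the abstract: Shi-Tomasi corners are detected in one frame and tracked into the next with pyramidal Lucas-Kanade sparse optical flow via OpenCV. File names and parameter values are illustrative assumptions.

```python
# Sketch of a descriptor-free tracking front end: Shi-Tomasi corner detection
# followed by pyramidal Lucas-Kanade sparse optical flow (OpenCV).
# Frame paths and tuning parameters below are placeholders, not the paper's settings.
import cv2

# Read two consecutive grayscale frames (paths are hypothetical)
prev_frame = cv2.imread("frame_0.png", cv2.IMREAD_GRAYSCALE)
next_frame = cv2.imread("frame_1.png", cv2.IMREAD_GRAYSCALE)

# Shi-Tomasi corner detection (minimum-eigenvalue corner score)
prev_pts = cv2.goodFeaturesToTrack(
    prev_frame, maxCorners=500, qualityLevel=0.01, minDistance=10
)

# Pyramidal Lucas-Kanade sparse optical flow tracks the corners into the
# next frame, replacing descriptor matching
next_pts, status, err = cv2.calcOpticalFlowPyrLK(
    prev_frame, next_frame, prev_pts, None,
    winSize=(21, 21), maxLevel=3,
    criteria=(cv2.TERM_CRITERIA_EPS | cv2.TERM_CRITERIA_COUNT, 30, 0.01),
)

# Keep only successfully tracked point pairs; these correspondences would feed
# the subsequent pose-estimation step (e.g. essential-matrix recovery)
good_prev = prev_pts[status.flatten() == 1]
good_next = next_pts[status.flatten() == 1]
print(f"Tracked {len(good_next)} of {len(prev_pts)} corners")
```

The pose-estimation step itself (solving for the camera motion from the tracked correspondences) is outside the scope of this sketch.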

Graphical Abstract
Visual Feature Extraction and Tracking Method Based on Corner Flow Detection

Keywords
computer vision
feature tracking
optical flow method
visual features
visual tracking

Data Availability Statement
Data will be made available on request.

Funding
This work was supported in part by the National Natural Science Foundation of China under Grant 62173007, Grant 62006008, and Grant 62203020; in part by the Project of Humanities and Social Sciences (Ministry of Education in China, MOC) under Grant 22YJCZH006.

Conflicts of Interest
The authors declare no conflicts of interest.

Ethical Approval and Consent to Participate
Not applicable.


Cite This Article
APA Style
Li, J., Wang, B., Ma, H., Gao, L., & Fu, H. (2024). Visual Feature Extraction and Tracking Method Based on Corner Flow Detection. ICCK Transactions on Intelligent Systematics, 1(1), 3–9. https://doi.org/10.62762/TIS.2024.136895

Article Metrics
Citations: Crossref 9 | Scopus 4 | Web of Science 3
Article Access Statistics: Views 3229 | PDF Downloads 948

Publisher's Note
ICCK stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Rights and Permissions
Institute of Central Computation and Knowledge (ICCK) or its licensor (e.g. a society or other partner) holds exclusive rights to this article under a publishing agreement with the author(s) or other rightsholder(s); author self-archiving of the accepted manuscript version of this article is solely governed by the terms of such publishing agreement and applicable law.
ICCK Transactions on Intelligent Systematics

ISSN: 3068-5079 (Online) | ISSN: 3069-003X (Print)

Email: [email protected]

All published articles are permanently preserved in Portico:
https://www.portico.org/publishers/icck/