The Effect of Textural Information and Keypoint Extraction on Visual Object Tracking Based on Similarity Transformation

Article type: Research article

Authors

1 Ph.D. Candidate, Department of Computer Engineering, Yazd University, Artificial Intelligence Division

2 Assistant Professor, Department of Computer Engineering, Yazd University, Artificial Intelligence Division

Abstract

In recent years, tracking objects of diverse types in diverse environments has gained considerable importance. A highly valuable property is fast tracking that requires neither special hardware nor pre-training. Trackers based on discriminative correlation filters have delivered positive results in terms of both speed and accuracy. Although many of these trackers estimate the object's position in each frame from translation and a pyramid of scales, the similarity-transformation algorithm estimates translation, scale, and rotation to locate the object. In that algorithm, Histogram of Oriented Gradients features are extracted. In the present paper, two different feature-extraction approaches are adopted within this algorithm. The first approach uses images quantized by minimum variance quantization and then maps the features to another level by rescaling the co-occurrence matrices. The second approach provides a combination of opposite-color local binary pattern features and Speeded-Up Robust Features. The evaluation dataset, OTB-2015, contains 100 video sequences. Both approaches improve the overall results of the baseline paper by about 3 percent. The first method improves results by up to 7 percent on the low-resolution challenge, and the second by up to 4 percent on the rotation challenge.
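As an illustration of the first approach described above (color quantization followed by co-occurrence-based texture features), the following Python sketch quantizes an image patch into a few levels and summarizes it with gray-level co-occurrence statistics. It is a minimal sketch under stated assumptions: k-means is used here as a stand-in for minimum variance quantization, scikit-image (>= 0.19) and scikit-learn are assumed to be installed, and the level count, distances, and angles are illustrative choices rather than the paper's exact settings.

import numpy as np
from sklearn.cluster import KMeans
from skimage.feature import graycomatrix, graycoprops

def quantize_colors(image_rgb, n_levels=16):
    """Map an RGB patch to n_levels indices via variance-minimizing clustering
    (a stand-in for minimum variance quantization)."""
    h, w, _ = image_rgb.shape
    pixels = image_rgb.reshape(-1, 3).astype(np.float32)
    labels = KMeans(n_clusters=n_levels, n_init=4, random_state=0).fit_predict(pixels)
    return labels.reshape(h, w).astype(np.uint8)

def glcm_features(indexed, n_levels=16):
    """Co-occurrence statistics of the quantized patch, usable as a compact feature vector."""
    glcm = graycomatrix(indexed, distances=[1], angles=[0, np.pi / 2],
                        levels=n_levels, symmetric=True, normed=True)
    props = ["contrast", "homogeneity", "energy", "correlation"]
    return np.concatenate([graycoprops(glcm, p).ravel() for p in props])

# Usage on a synthetic 64x64 patch: 4 properties x 2 angles -> 8 values.
patch = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
print(glcm_features(quantize_colors(patch)).shape)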

Keywords


Article Title [English]

The Effects of Textural Information and Keypoint Extraction on Visual Object Tracking Based on Similarity Transformation

Authors [English]

  • Solmaz Abbasi 1
  • Mehdi Rezaeian 2
1 Ph.D. Candidate, Department of Computer Engineering, Yazd University, Yazd, Iran
2 Assistant Professor, Department of Computer Engineering, Yazd University, Yazd, Iran
Abstract [English]

Visual object tracking in arbitrary environments with arbitrary objects has gained considerable importance in recent years. A very significant property that makes a tracker practical is real-time operation without the need for a GPU or pre-trained models. Over the last decade, trackers based on discriminative correlation filters have shown promising results in terms of both speed and accuracy. Although most such methods estimate the position of the object in each frame from translation and a pyramid of scales, the Large Displacement Estimation of Similarity transformation algorithm estimates translation, scale, and rotation in every frame. In that algorithm, the Histogram of Oriented Gradients is used for feature extraction. Here, we adopt two different approaches. The first uses quantized images, obtained by applying minimum variance quantization, as the feature matrix. The second uses a combination of opposite-color local binary patterns and Speeded-Up Robust Features. With these two methods we extract helpful features quickly and therefore improve the tracking results under challenging attributes. The OTB-2015 dataset is used to evaluate the trackers. The results show that the precision of the trackers improves by about 3%. In addition, the first tracker improves the results by about 7% on the low-resolution challenge, and the second is helpful by about 4% on the rotation challenge.
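A minimal sketch of the second approach (opponent-color local binary patterns combined with a keypoint descriptor) is given below, assuming OpenCV and scikit-image. ORB is used here as a freely available substitute, since SURF requires the OpenCV contrib (non-free) build; the channel definitions, parameters, and pooling of descriptors are illustrative assumptions rather than the paper's exact configuration.

import numpy as np
import cv2
from skimage.feature import local_binary_pattern

def opponent_channels(bgr):
    """Opponent color channels O1 and O2 of a BGR patch (O3 omitted for brevity)."""
    b, g, r = [c.astype(np.float32) for c in cv2.split(bgr)]
    return [(r - g) / np.sqrt(2.0), (r + g - 2.0 * b) / np.sqrt(6.0)]

def oc_lbp_histogram(bgr, p=8, radius=1):
    """Concatenated uniform-LBP histograms computed on the opponent channels."""
    hists = []
    for ch in opponent_channels(bgr):
        ch_u8 = cv2.normalize(ch, None, 0, 255, cv2.NORM_MINMAX).astype(np.uint8)
        codes = local_binary_pattern(ch_u8, P=p, R=radius, method="uniform")
        hist, _ = np.histogram(codes, bins=p + 2, range=(0, p + 2), density=True)
        hists.append(hist)
    return np.concatenate(hists)

def keypoint_descriptor(bgr):
    """Mean keypoint descriptor of the gray patch (ORB here; SURF in the paper)."""
    gray = cv2.cvtColor(bgr, cv2.COLOR_BGR2GRAY)
    _, desc = cv2.ORB_create(nfeatures=50).detectAndCompute(gray, None)
    return desc.mean(axis=0) if desc is not None else np.zeros(32)

# Usage: combine texture and keypoint information into a single feature vector.
patch = (np.random.rand(64, 64, 3) * 255).astype(np.uint8)
print(np.concatenate([oc_lbp_histogram(patch), keypoint_descriptor(patch)]).shape)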

Keywords [English]

  • Visual Object Tracking
  • Similarity Transformation
  • Minimum Variance Quantization
  • Opposite Color Local Binary Patterns
  • Speeded-Up Robust Features