Disparity Map Computation from Stereo Images Using Hill-Climbing Segmentation

  • Tin Tin San, Image Processing Lab, University of Computer Studies, Mandalay, Myanmar
  • Kay Thi Win, Faculty of Information Science, University of Computer Studies, Mandalay, Myanmar

Abstract

Stereo matching has been one of the most active research areas in computer vision for decades. Its task is to find point correspondences between two images of the same scene taken from different viewpoints. This paper presents a segment-based stereo matching algorithm. First, the reference image is segmented using the hill-climbing algorithm, and local stereo matching is performed using Scale Invariant Feature Transform (SIFT) feature points with Sum of Absolute Differences (SAD) block matching. Second, a set of reliable pixels is constructed by comparing the matching costs and mutually cross-checking the consistency between the left and right initial disparity maps, which leads to the actual disparity planes. Third, the set of all possible disparity planes is extracted, and plane fitting and neighboring-segment merging are performed. Finally, a disparity plane is assigned to each region using graph cuts to obtain the final disparity map. Evaluation on the Middlebury data set shows that the proposed algorithm is competitive with state-of-the-art stereo matching algorithms.
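Two of the steps named in the abstract, SAD block matching and the left-right cross-check that selects reliable pixels, can be sketched in a few lines of NumPy. This is a minimal illustration under stated assumptions, not the authors' implementation: the function names, the window and disparity-range parameters, and the trick of computing the right-reference disparity map from horizontally flipped images are all choices made here for the sketch.

```python
import numpy as np

def sad_disparity(ref, tgt, max_disp=16, win=5):
    """Winner-take-all SAD block matching with ref as the left-style
    reference image (pixel x in ref matches pixel x - d in tgt)."""
    h, w = ref.shape
    r = win // 2
    cost = np.full((max_disp + 1, h, w), np.inf)
    for d in range(max_disp + 1):
        # per-pixel absolute difference at disparity d
        diff = np.abs(ref[:, d:] - tgt[:, : w - d]).astype(float)
        # box-filter the differences to get windowed SAD
        pad = np.pad(diff, r, mode="edge")
        sad = np.zeros_like(diff)
        for dy in range(-r, r + 1):
            for dx in range(-r, r + 1):
                sad += pad[r + dy : r + dy + h,
                           r + dx : r + dx + diff.shape[1]]
        cost[d, :, d:] = sad
    # pick the disparity with minimum aggregated cost at each pixel
    return np.argmin(cost, axis=0)

def cross_check(disp_l, disp_r, tol=1):
    """Mark a pixel reliable when the left disparity agrees with the
    right disparity at the corresponding right-image location."""
    h, w = disp_l.shape
    ys = np.arange(h)[:, None].repeat(w, axis=1)
    xs = np.arange(w)[None, :].repeat(h, axis=0)
    xr = np.clip(xs - disp_l, 0, w - 1)
    return np.abs(disp_l - disp_r[ys, xr]) <= tol
```

The right-reference disparity map can be obtained from the same function by flipping both images horizontally and flipping the result back, e.g. `np.fliplr(sad_disparity(np.fliplr(right), np.fliplr(left)))`; pixels passing `cross_check` would then form the reliable set used for plane fitting.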


Published
2019-01-04
How to Cite
SAN, Tin Tin; WIN, Kay Thi. Disparity Map Computation from Stereo Images Using Hill-Climbing Segmentation. International Journal of Research and Engineering, [S.l.], v. 6, n. 1, p. 547-555, jan. 2019. ISSN 2348-7860. Available at: <https://digital.ijre.org/index.php/int_j_res_eng/article/view/365>. Date accessed: 24 aug. 2019. doi: https://doi.org/10.21276/ijre.2019.6.1.1.