On the Use of Optical Flow for Scene Change Detection and Description

Journal of Intelligent & Robotic Systems

Abstract

We propose the use of optical flow information as a method for detecting and describing changes in the environment from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used to detect depth discontinuities and appearance changes at key locations. To achieve this, we present a full discussion of camera positioning, distortion compensation, noise filtering, and parameter estimation. We then extract statistical attributes from the flow signal to describe the location of each scene change, and additionally employ clustering and the dominant shape of the flow vectors to increase descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features has been created, matching can be performed whenever nodes are encountered, so that topological localization is achieved. We retrieve the most likely node according to the Mahalanobis and Chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes under diverse lighting conditions, in indoor and outdoor environments, and on different robot platforms.
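The retrieval step described in the abstract can be sketched as follows. This is a minimal, hypothetical illustration (the feature vectors, node identifiers, and database layout are assumptions, not the authors' implementation): each node stores the mean and covariance of its flow-feature distribution, and the current frame's features are matched to the node with the smallest Mahalanobis distance.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    """Mahalanobis distance between a flow-feature vector x and a
    node's feature distribution (mean, cov)."""
    diff = x - mean
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def match_node(features, database):
    """Return the (node_id, distance) pair for the database node whose
    flow-feature distribution is closest to the current frame.
    `database` maps a node id to a (mean, covariance) tuple."""
    best_id, best_d = None, np.inf
    for node_id, (mean, cov) in database.items():
        d = mahalanobis(features, mean, cov)
        if d < best_d:
            best_id, best_d = node_id, d
    return best_id, best_d
```

A Chi-square test against the same per-node distributions could be substituted for (or combined with) the Mahalanobis score, as the abstract indicates; the nearest-node loop itself is unchanged.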



Corresponding author

Correspondence to Navid Nourani-Vatani.

Additional information

This work is funded jointly by the Australian government through the SIEF Postdoctoral Fellowship and the Endeavour IPRS scheme, the University of Queensland, the CSIRO ICT Centre, the ARC Centre of Excellence in Vision Science (grant CE0561903), and a Queensland Smart State Premier’s Fellowship.

Cite this article

Nourani-Vatani, N., Borges, P.V.K., Roberts, J.M. et al. On the Use of Optical Flow for Scene Change Detection and Description. J Intell Robot Syst 74, 817–846 (2014). https://doi.org/10.1007/s10846-013-9840-8

Keywords

Navigation