Abstract
We propose the use of optical flow information as a method for detecting and describing changes in the environment from the perspective of a mobile camera. We analyze the characteristics of the optical flow signal and demonstrate how robust flow vectors can be generated and used to detect depth discontinuities and appearance changes at key locations. To achieve this, we present a full discussion of camera positioning, distortion compensation, noise filtering, and parameter estimation. We then extract statistical attributes from the flow signal to describe the location of the scene changes, and employ clustering and the dominant shape of the flow vectors to increase descriptiveness. Once a database of nodes (where a node is a detected scene change) and their corresponding flow features is created, matching can be performed whenever nodes are encountered, enabling topological localization. We retrieve the most likely node according to the Mahalanobis and chi-square distances between the current frame and the database. The results illustrate the applicability of the technique for detecting and describing scene changes under diverse lighting conditions, in indoor and outdoor environments, and on different robot platforms.
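The retrieval step described above can be sketched in a few lines. This is a minimal illustration, not the authors' implementation: it assumes each node stores the mean and covariance of its flow descriptors, and the function and variable names (`most_likely_node`, `database`, etc.) are hypothetical. The Mahalanobis distance compares a frame's feature vector against a node's distribution, while the chi-square distance is a common choice for comparing histogram-style descriptors.

```python
import numpy as np

def mahalanobis(x, mean, cov):
    # Distance of feature vector x from a node's descriptor distribution.
    diff = np.asarray(x, dtype=float) - np.asarray(mean, dtype=float)
    return float(np.sqrt(diff @ np.linalg.inv(cov) @ diff))

def chi_square(h, h_ref, eps=1e-12):
    # Symmetric chi-square distance between two feature histograms.
    h = np.asarray(h, dtype=float)
    h_ref = np.asarray(h_ref, dtype=float)
    return float(0.5 * np.sum((h - h_ref) ** 2 / (h + h_ref + eps)))

def most_likely_node(frame_features, database):
    # database maps node id -> (mean, covariance) of that node's flow features.
    # Returns the node id with the smallest Mahalanobis distance to the frame.
    return min(database,
               key=lambda nid: mahalanobis(frame_features, *database[nid]))
```

In practice the two distances could be combined or thresholded to reject frames that match no stored node; the sketch above simply returns the nearest node.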
Additional information
This work is funded jointly by the Australian government through the SIEF Postdoctoral Fellowship and the Endeavour IPRS scheme, the University of Queensland, the CSIRO ICT Centre, the ARC Centre of Excellence in Vision Science (grant CE0561903), and a Queensland Smart State Premier’s Fellowship.
Cite this article
Nourani-Vatani, N., Borges, P.V.K., Roberts, J.M. et al. On the Use of Optical Flow for Scene Change Detection and Description. J Intell Robot Syst 74, 817–846 (2014). https://doi.org/10.1007/s10846-013-9840-8