
Constructing task visibility intervals for video surveillance

Regular paper · Published in Multimedia Systems

Abstract

Vision systems are increasingly being deployed to perform complex surveillance tasks. While improved algorithms are being developed to perform these tasks, it is also important that data suitable for these algorithms be acquired, which is a non-trivial task in a dynamic and crowded scene viewed by multiple pan-tilt-zoom (PTZ) cameras. In this paper, we describe a real-time multi-camera system that collects images and videos of moving objects in such scenes, subject to task constraints. The system constructs “task visibility intervals” that contain information about what can be sensed in future time intervals. Constructing these intervals requires predicting future object motion and accounting for factors such as object occlusion and camera control parameters. Such intervals can also be combined to form multi-task intervals, during which a single camera can collect videos suitable for multiple tasks simultaneously. Experimental results illustrate the system's ability to construct such task visibility intervals and to schedule them using a greedy algorithm.
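To make the scheduling step described above concrete, here is a minimal sketch of greedy scheduling over task visibility intervals. It is not the paper's implementation: the TVI record (start, end, camera, tasks), the one-interval-per-camera-at-a-time model, and the policy of preferring intervals that cover more tasks and finish earlier are simplifying assumptions introduced purely for illustration.

```python
from collections import defaultdict
from dataclasses import dataclass


@dataclass(frozen=True)
class TVI:
    """Hypothetical task visibility interval: during [start, end) seconds,
    `camera` can acquire data suitable for every task in `tasks`."""
    start: float
    end: float
    camera: str
    tasks: frozenset


def overlaps(a: TVI, b: TVI) -> bool:
    """True if the two intervals compete for the same camera at the same time."""
    return a.camera == b.camera and a.start < b.end and b.start < a.end


def greedy_schedule(candidates):
    """Greedily pick a conflict-free subset of TVIs for each camera.

    Policy (an assumption, not the paper's exact rule): prefer multi-task
    intervals, break ties by earliest finishing time, and accept an interval
    only if it does not overlap anything already assigned to its camera.
    """
    ordered = sorted(candidates, key=lambda t: (-len(t.tasks), t.end))
    assigned = defaultdict(list)  # camera -> accepted intervals
    for tvi in ordered:
        if not any(overlaps(tvi, other) for other in assigned[tvi.camera]):
            assigned[tvi.camera].append(tvi)
    # Flatten into a single schedule ordered by camera and start time.
    return sorted((t for per_cam in assigned.values() for t in per_cam),
                  key=lambda t: (t.camera, t.start))


if __name__ == "__main__":
    candidates = [
        TVI(0.0, 4.0, "cam1", frozenset({"face"})),
        TVI(1.0, 5.0, "cam1", frozenset({"face", "gait"})),  # multi-task interval
        TVI(6.0, 9.0, "cam1", frozenset({"gait"})),
        TVI(0.5, 3.0, "cam2", frozenset({"plate"})),
    ]
    for tvi in greedy_schedule(candidates):
        print(f"{tvi.camera}: {sorted(tvi.tasks)} during [{tvi.start}, {tvi.end})")
```

In the paper's setting, the candidate intervals themselves would come from the visibility analysis summarized in the abstract (predicted object motion, occlusion reasoning, and camera control constraints); the sketch only illustrates how a greedy pass can assign such intervals to cameras once they exist.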

Author information

Correspondence to Ser-Nam Lim.

About this article

Cite this article

Lim, S.N., Davis, L.S. & Mittal, A. Constructing task visibility intervals for video surveillance. Multimedia Systems 12, 211–226 (2006). https://doi.org/10.1007/s00530-006-0062-9
