Automated vision tracking of project related entities

https://doi.org/10.1016/j.aei.2011.01.003

Abstract

Tracking of project-related entities such as construction equipment, materials, and personnel is used to calculate productivity, detect travel-path conflicts, enhance safety on the site, and monitor the project. Radio frequency tracking technologies (Wi-Fi, RFID, UWB) and GPS are commonly used for this purpose. However, on large-scale sites, deploying, maintaining, and removing such systems can be costly and time-consuming. In addition, privacy issues with personnel tracking often limit the usability of these technologies on construction sites. This paper presents a vision-based tracking framework that holds promise to address these limitations. The framework uses videos from a set of two or more static cameras placed on construction sites. In each camera view, the framework identifies and tracks construction entities, providing 2D image coordinates across frames. By combining the 2D coordinates according to the geometry of the installed camera system (the distance between the cameras and their view angles), 3D coordinates are calculated at each frame. The results of each step are presented to illustrate the feasibility of the framework.

Introduction

The National Academy of Engineering recently listed “Restoring and Improving Urban Infrastructure” as one of the Grand Challenges of Engineering in the 21st century [1]. One of the greatest challenges noted by the report is the need for more automation in construction through advances in computer science and robotics. For instance, many researchers have endeavored to automate the acquisition of real-time site information, which provides an additional layer of control over a project. Automated tracking of project-related entities on construction sites is one of the topics in this area. RFID, GPS, UWB, 3D range cameras, and other technologies have been introduced and tested for automated tracking. However, the drawbacks of each technology, such as short range or the need to install sensors, limit its application to construction sites, where a large number of entities exist.

This paper presents a framework that promises to determine the spatial location of project-related entities on a large-scale, congested construction site, such as construction equipment, personnel, and materials of standard sizes and shapes, across time, without the installation of any sensors. Under this framework, video streams are first collected from a set of fixed video cameras placed at a project site. The camera views should share a large common area so that entities appear in both views. In each view, project-related entities are automatically identified when they appear. The identified entities are then tracked in subsequent frames. This process provides 2D pixel coordinates across frames for each view. To obtain 3D coordinates of an entity from the two sets of 2D pixel coordinates at each frame, the geometric relation between the camera views must be recovered [2]. Once the distance between the camera positions and the angle between the camera views are known, the 3D coordinates are calculated by triangulation between the two cameras and the entity.
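
To make the triangulation step concrete, the following is a minimal sketch using OpenCV's generic triangulation routine rather than the authors' implementation; the intrinsics, baseline, and pixel coordinates below are assumed example values.

    import numpy as np
    import cv2

    # Assumed example intrinsics (focal length and principal point, in pixels).
    K = np.array([[800.0,   0.0, 320.0],
                  [  0.0, 800.0, 240.0],
                  [  0.0,   0.0,   1.0]])

    # Camera 1 at the origin; camera 2 rotated slightly and offset by an
    # assumed 5 m baseline. P = K [R | t] is the 3x4 projection matrix.
    P1 = K @ np.hstack([np.eye(3), np.zeros((3, 1))])
    R, _ = cv2.Rodrigues(np.array([0.0, 0.1, 0.0]))
    t = np.array([[-5.0], [0.0], [0.0]])
    P2 = K @ np.hstack([R, t])

    # Matched 2D pixel coordinates of one tracked entity in each view,
    # shaped 2xN as cv2.triangulatePoints expects.
    pts1 = np.array([[310.0], [242.0]])
    pts2 = np.array([[355.0], [241.0]])

    # Triangulate; the result is 4xN homogeneous coordinates.
    X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
    X = (X_h[:3] / X_h[3]).ravel()  # convert to Euclidean 3D coordinates
    print("3D coordinates (m):", X)

In practice, the projection matrices come from the calibration step described later, and the 2D inputs come from the per-view trackers.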

This paper focuses on presenting the big picture of the framework as a whole, which can be divided into several independent research steps. Summarized results from each of these steps are presented here. More details about the performance of each individual step have been presented [3], [4] or will be presented separately, given the size limitations of journal publications.

Background

Tracking of project-related entities, such as materials, equipment, and personnel, has been a significant research topic for the last decade. Trackers, and especially automated trackers, are useful for progress monitoring, inventory control, materials management, collision/accident prevention, and security applications on construction sites. Tracking records can also be used in activity-sequence analysis for optimal path determination and process redesign.

Proposed solution

The overall objective of this paper is to show the feasibility of a novel automated vision-based tracking framework that aims to report the 4D coordinates (three spatial coordinates plus time) of distinctly shaped, project-related entities, such as construction equipment, personnel, and materials of standard sizes and shapes. The following steps summarize the mechanics of the proposed vision-based tracking methodology (Fig. 2); a sketch of the resulting per-frame loop follows the list:

  • a. Video streams are collected from two or more cameras at the site that have
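
The per-frame flow that these steps describe can be sketched as follows; detect_entities and track_2d stand in for the framework's identification and 2D tracking methods, and every name in this sketch is hypothetical.

    import cv2

    # Hypothetical sketch of the per-frame loop, assuming two synchronized
    # cameras and projection matrices P1, P2 from a prior calibration step.
    def track_3d(video1, video2, P1, P2, detect_entities, track_2d):
        cap1, cap2 = cv2.VideoCapture(video1), cv2.VideoCapture(video2)
        trajectory = []
        while True:
            ok1, frame1 = cap1.read()
            ok2, frame2 = cap2.read()
            if not (ok1 and ok2):
                break
            # Identify entities when they appear, then track them in each
            # view; both placeholders return 2xN arrays of pixel coordinates.
            pts1 = track_2d(frame1, detect_entities(frame1))
            pts2 = track_2d(frame2, detect_entities(frame2))
            # Combine the two 2D observations into 3D by triangulation.
            X_h = cv2.triangulatePoints(P1, P2, pts1, pts2)
            trajectory.append((X_h[:3] / X_h[3]).T)  # Euclidean 3D per entity
        cap1.release()
        cap2.release()
        return trajectory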

Experiments and results

This section presents the experiments performed for the steps explained in the previous sections, to validate the effectiveness of the algorithms employed in each step and the applicability of the suggested framework. It should be noted that this paper proposes a novel tracking framework for construction applications, and this section exhibits examples of each step's results to show the flow of the whole process. Camera calibration is the first step of the
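
Camera calibration, named here as the first step, can be illustrated with the standard checkerboard procedure in OpenCV; this is a generic sketch under assumed pattern dimensions and file paths, not the calibration method used in the paper's experiments.

    import glob
    import numpy as np
    import cv2

    # Assumed checkerboard with 9x6 inner corners and 25 mm squares;
    # the calibration image paths are hypothetical.
    pattern = (9, 6)
    objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
    objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2) * 0.025

    obj_points, img_points, image_size = [], [], None
    for path in glob.glob("calibration/*.jpg"):
        gray = cv2.cvtColor(cv2.imread(path), cv2.COLOR_BGR2GRAY)
        image_size = gray.shape[::-1]
        found, corners = cv2.findChessboardCorners(gray, pattern)
        if found:
            obj_points.append(objp)
            img_points.append(corners)

    # Recover the intrinsics (camera matrix, distortion coefficients) and
    # per-image extrinsics; the return value is the RMS reprojection error.
    rms, K, dist, rvecs, tvecs = cv2.calibrateCamera(
        obj_points, img_points, image_size, None, None)
    print("RMS reprojection error (px):", rms)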

Conclusions and future work

Tracking of project-related entities on construction sites can support various construction tasks. The available tracking technologies – RFID, GPS, laser scanners, and LADAR – have been applied to tracking prefabricated materials, equipment, inventory, and personnel. However, these technologies have drawbacks when applied on open construction sites, which can be overcome by an alternative method, vision-based tracking. Vision-based tracking does not require tagging sensors on the

Acknowledgements

This material is based upon work supported by the National Science Foundation under Grant Nos. 0933931 and 0904109. Any opinions, findings, and conclusions or recommendations expressed in this material are those of the authors and do not necessarily reflect the views of the National Science Foundation.

References (53)

  • D. Forsyth et al., Computer Vision – A Modern Approach (2002).
  • M.-W. Park, A. Makhmalbaf, I. Brilakis, Comparative study of vision tracking methods for tracking of construction site...
  • M.-W. Park et al., Correlating Multiple 2D Vision Trackers for 3D Object Tracking on Construction Sites, PM-05 – Advancing Project Management for the 21st Century (2010).
  • J. Song, C.H. Caldas, E. Ergen, C. Haas, B. Akinci, Field trials of RFID technology for tracking pre-fabricated pipe...
  • C. Williams, Y.K. Cho, J.-H. Youn, Wireless sensor-driven intelligent navigation method for mobile robot applications...
  • J. Teizer, D. Lao, M. Sofer, Rapid automated monitoring of construction site activities using ultra-wideband, in: Proc....
  • J. Teizer et al., Real-time three-dimensional occupancy grid modeling for the detection and tracking of construction resources, Journal of Construction Engineering and Management (2007).
  • C.H. Caldas, D.G. Torrent, C.T. Haas, Integration of automated data collection technologies for real-time field...
  • E. Ergen, B. Akinci, R. Sacks, Formalization and automation of effective tracking and locating of precast components in...
  • R.J. Fontana, E. Richley, J. Barney, Commercialization of an ultra wideband precision asset location system, in: Proc....
  • R.J. Fontana, Recent system applications of short-pulse ultra-wideband (UWB) technology, IEEE Transactions on Microwave Theory and Techniques (2004).
  • I. Brilakis, F. Cordova, P. Clark, Automated 3D vision tracking for project control support, in: Proc. the Joint...
  • S. Gächter, Results on range image segmentation for service robots (2005).
  • Wikipedia, Time-of-flight camera, <http://en.wikipedia.org/wiki/Time-of-flight_camera>, 2010 (accessed...
  • D.G. Lowe, Distinctive image features from scale-invariant keypoints, International Journal of Computer Vision (2004).
  • J. Bauer, N. Sünderhauf, P. Protzel, Comparing several implementations of two recently published feature detectors, in:...