
Capturing Subjective First-Person View Shots with Drones for Automated Cinematography

Published: 10 August 2020

Abstract

We propose an approach to capture subjective first-person view (FPV) videos by drones for automated cinematography. FPV shots are intentionally not smooth to increase the level of immersion for the audience, and are usually captured by a walking camera operator holding traditional camera equipment. Our goal is to automatically control a drone in such a way that it imitates the motion dynamics of a walking camera operator, and, in turn, capture FPV videos. For this, given a user-defined camera path, orientation, and velocity, we first present a method to automatically generate the operator’s motion pattern and the associated motion of the camera, considering the damping mechanism of the camera equipment. Second, we propose a general computational approach that generates the drone commands to imitate the desired motion pattern. We express this task as a constrained optimization problem, where we aim to fulfill high-level user-defined goals, while imitating the dynamics of the walking camera operator and taking the drone’s physical constraints into account. Our approach is fully automatic, runs in real time, and is interactive, which provides artistic freedom in designing shots. It does not require a motion capture system, and works both indoors and outdoors. The validity of our approach has been confirmed via quantitative and qualitative evaluations.
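To make the first step concrete, the gait-induced camera motion and the equipment's damping can be sketched as follows. This is a minimal illustration, not the paper's actual model: the sinusoidal bob/sway gait, the second-order spring-damper filter, and every parameter value (step length, amplitudes, natural frequency `omega`, damping ratio `zeta`) are simplifying assumptions chosen for the sketch.

```python
import numpy as np

def operator_motion(velocity, duration, dt=0.01,
                    step_length=0.7, bob_amp=0.025, sway_amp=0.02):
    """Hypothetical gait model: vertical bob once per step, lateral
    sway once per stride (two steps). All parameters are illustrative
    assumptions, not values from the paper."""
    t = np.arange(0.0, duration, dt)
    step_freq = velocity / step_length                    # steps per second
    bob = bob_amp * np.sin(2.0 * np.pi * step_freq * t)   # vertical offset [m]
    sway = sway_amp * np.sin(np.pi * step_freq * t)       # lateral offset [m]
    return t, bob, sway

def damp(signal, dt=0.01, omega=8.0, zeta=0.7):
    """Second-order spring-damper filter standing in for the camera
    equipment's damping mechanism (assumed natural frequency omega
    [rad/s] and damping ratio zeta); attenuates the gait oscillation."""
    y, yd = 0.0, 0.0
    out = np.empty_like(signal)
    for i, u in enumerate(signal):
        ydd = omega**2 * (u - y) - 2.0 * zeta * omega * yd  # track the input
        yd += ydd * dt
        y += yd * dt
        out[i] = y
    return out

# Walk at 1.4 m/s for 4 s; the damped bob is the kind of reference
# motion a drone controller would then be asked to imitate.
t, bob, sway = operator_motion(velocity=1.4, duration=4.0)
camera_bob = damp(bob)
```

In the paper's pipeline, a reference motion like `camera_bob` would then feed the constrained optimization that generates drone commands subject to the vehicle's physical limits; here it is only a stand-in signal.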




      Published In

ACM Transactions on Graphics, Volume 39, Issue 5
October 2020, 184 pages
ISSN: 0730-0301
EISSN: 1557-7368
DOI: 10.1145/3403637

      Publisher

      Association for Computing Machinery

      New York, NY, United States

      Publication History

      Published: 10 August 2020
      Online AM: 16 May 2020
      Accepted: 01 May 2020
      Revised: 01 March 2020
      Received: 01 October 2019
      Published in TOG Volume 39, Issue 5

      Author Tags

      1. Cinematography
      2. aerial videography
      3. filmmaking
      4. human motion model
      5. quadrotor camera

      Qualifiers

      • Research-article
      • Research
      • Refereed

      Funding Sources

• Institute of Information & Communications Technology Planning & Evaluation (IITP)
      • National Research Foundation of Korea


Cited By

      • (2025) Intelligent Cinematography: a review of AI research for cinematographic production. Artificial Intelligence Review 58, 4. DOI: 10.1007/s10462-024-11089-3. Online publication date: 25-Jan-2025.
      • (2024) The Evolution of Intelligent Transportation Systems: Analyzing the Differences and Similarities between IoV and IoFV. Drones 8, 2 (34). DOI: 10.3390/drones8020034. Online publication date: 24-Jan-2024.
      • (2024) CineMPC: A Fully Autonomous Drone Cinematography System Incorporating Zoom, Focus, Pose, and Scene Composition. IEEE Transactions on Robotics 40 (1740-1757). DOI: 10.1109/TRO.2024.3353550. Online publication date: 1-Jan-2024.
      • (2024) Human Orientation Estimation Under Partial Observation. 2024 IEEE/RSJ International Conference on Intelligent Robots and Systems (IROS) (11544-11551). DOI: 10.1109/IROS58592.2024.10802390. Online publication date: 14-Oct-2024.
      • (2024) Exploring the Science and Art of UAV Light Painting: From Equations and Pixels to Long-Exposure Photography. 2024 International Conference on Unmanned Aircraft Systems (ICUAS) (755-762). DOI: 10.1109/ICUAS60882.2024.10556858. Online publication date: 4-Jun-2024.
      • (2023) Head-mounted display system based on Metaverse VR Technology. Third International Conference on Computer Graphics, Image, and Virtualization (ICCGIV 2023) (50). DOI: 10.1117/12.3008184. Online publication date: 14-Nov-2023.
      • (2023) A Drone Video Clip Dataset and its Applications in Automated Cinematography. Computer Graphics Forum 41, 7 (189-203). DOI: 10.1111/cgf.14668. Online publication date: 20-Mar-2023.
      • (2023) Training in “First Person View” Systems for Racing Drones. 2023 5th International Conference on Artificial Intelligence and Computer Applications (ICAICA) (256-261). DOI: 10.1109/ICAICA58456.2023.10405437. Online publication date: 28-Nov-2023.
      • (2022) LookOut! Interactive Camera Gimbal Controller for Filming Long Takes. ACM Transactions on Graphics 41, 3 (1-16). DOI: 10.1145/3506693. Online publication date: 7-Mar-2022.
      • (2022) Visibility-Aware Navigation With Batch Projection Augmented Cross-Entropy Method Over a Learned Occlusion Cost. IEEE Robotics and Automation Letters 7, 4 (9366-9373). DOI: 10.1109/LRA.2022.3190087. Online publication date: Oct-2022.
