Robotic Cinematography: The Construction and Testing of an Autonomous Robotic Videographer and Race Simulator

Date

2021-05

Abstract

In this body of work, we lay out in detail how to construct a wheeled robot capable of interacting with an uncertain environment in order to film events that can be pieced together into a structured story narrative. The scope of our project is capturing short-distance racing events in an obstacle-free environment. Our robot, equipped with a custom-built navigation system and camera, is tasked with filming predictable events, such as the race start and end, and unpredictable events, such as one runner overtaking another or runners colliding. We focus not only on the events captured, which we categorise as the substance, but also on the video shot types used to capture them, the style. We start by building a 2D simulator of a race on a track that models the runners and the robot. We use this simulator to explore two planning algorithms. Each algorithm predicts likely upcoming events that will advance the story narrative and computes the appropriate style with which to capture the predicted events. We analyse both algorithms by running a thousand simulations to compare their efficacy and time efficiency. We then detail the steps needed to build a system that implements the algorithms in the real world and in real time. Lastly, we showcase the results from real-life implementations of the more time-efficient algorithm.
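To make the simulator concrete, the sketch below illustrates the kind of 2D race model the abstract describes: runners progressing along a track, a robot positioned beside it, and simple detection of predictable (finish) and unpredictable (overtaking) story events. This is a minimal illustrative sketch only; the class and parameter names (Runner, RaceSim, track_length, and so on) are assumptions, not the authors' actual simulator.

```python
# Minimal sketch of a 2D race simulator with runners and a robot videographer.
# All names and parameters are illustrative assumptions.
import random
from dataclasses import dataclass, field

@dataclass
class Runner:
    name: str
    x: float = 0.0        # distance along the track (m)
    speed: float = 4.0    # nominal running speed (m/s)

@dataclass
class RaceSim:
    track_length: float
    runners: list = field(default_factory=list)
    robot_xy: tuple = (0.0, -2.0)   # robot stationed beside the lane
    t: float = 0.0

    def step(self, dt: float = 0.1):
        """Advance the simulation by dt seconds and return any story events."""
        events = []
        order_before = [r.name for r in sorted(self.runners, key=lambda r: -r.x)]
        for r in self.runners:
            r.x += (r.speed + random.gauss(0.0, 0.3)) * dt  # noisy progress
            if r.x >= self.track_length:
                events.append(("finish", r.name))
        order_after = [r.name for r in sorted(self.runners, key=lambda r: -r.x)]
        if order_before != order_after:
            events.append(("overtake", order_after))
        self.t += dt
        return events

if __name__ == "__main__":
    sim = RaceSim(track_length=60.0,
                  runners=[Runner("A", speed=4.2), Runner("B", speed=4.0)])
    # Run until the first runner crosses the finish line, printing events.
    while all(r.x < sim.track_length for r in sim.runners):
        for ev in sim.step():
            print(f"t={sim.t:.1f}s  event: {ev}")
```

In a setup like this, a planner of the kind the abstract mentions would consume the event stream (and predictions of upcoming events) to choose both a camera target and a shot type.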

Keywords

Robotics, Planning, Field robotics, Cinematography, Tracking, Sports videography, Human tracking, Robotic videographer

Citation

Portions of this document appear in: D. Chaudhuri, R. Ike, H. Rahmani, A. T. Becker, D. A. Shell, and J. M. O’Kane, “Conditioning style on substance: Plans for narrative observation,” in International Conference on Robotics and Automation (ICRA), 2021.