Authors: Becker, Aaron T.; Taylor, Conlan; Ike, Rhema
Dates: 2021-02-11; 2021-02-11; 2020-09-29
URI: https://hdl.handle.net/10657/7494

Abstract: This research poster showcases the progress made in developing and testing sensor software and hardware for a robot capable of producing a cohesive story from visual and other sensory input. An algorithm that instructs a robot to record events and thread them into a structured narrative is a novel concept that may provide revolutionary insights into automation. To supply the algorithm with useful information, working sensors must collect data relevant to the desired narrative. The robot is built on the body of an RC car and uses a Raspberry Pi computer as its brain. Hardware units compatible with the robot's computer were tested to develop the software needed to realize the robot's operation. A prototype emergency-brake program was developed using an ultrasonic sonar distance sensor and a micro servo motor. Code for the robot's inertial measurement unit (IMU) and GPS operations was also improved. Extensive field testing with the robot's GPS indicated promising accuracy and precision from the unit. Future tests are in development to build on these accomplishments.

Language: en-US
Rights: The author of this work is the copyright owner. UH Libraries and the Texas Digital Library have their permission to store and provide access to this work. Further transmission, reproduction, or presentation of this work is prohibited except with permission of the author(s).
Title: Sensor Implementation in Autonomous Narrative-Capturing Robot
Type: Poster
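
The abstract describes a prototype emergency-brake program that pairs an ultrasonic distance sensor with a micro servo motor on a Raspberry Pi. The poster does not include the code itself; the following is a minimal sketch of that idea, assuming an HC-SR04-style ultrasonic sensor and a standard hobby micro servo wired to Raspberry Pi GPIO pins. The pin numbers, duty cycles, and distance threshold are illustrative assumptions, not the authors' implementation.

```python
# Illustrative emergency-brake sketch: read distance from an ultrasonic sensor
# and swing a micro servo to a "brake" position when an obstacle is too close.
# Pin assignments and thresholds are assumptions, not the poster's actual code.
import time
import RPi.GPIO as GPIO

TRIG_PIN = 23              # ultrasonic trigger pin (assumed wiring)
ECHO_PIN = 24              # ultrasonic echo pin (assumed wiring)
SERVO_PIN = 18             # micro servo signal pin (assumed wiring)
BRAKE_DISTANCE_CM = 30.0   # illustrative stopping threshold

GPIO.setmode(GPIO.BCM)
GPIO.setup(TRIG_PIN, GPIO.OUT)
GPIO.setup(ECHO_PIN, GPIO.IN)
GPIO.setup(SERVO_PIN, GPIO.OUT)

servo = GPIO.PWM(SERVO_PIN, 50)   # standard 50 Hz hobby-servo signal
servo.start(7.5)                  # roughly neutral position

def read_distance_cm():
    """Send a 10 µs trigger pulse and convert the echo pulse width to cm."""
    GPIO.output(TRIG_PIN, True)
    time.sleep(10e-6)
    GPIO.output(TRIG_PIN, False)

    start = time.time()
    while GPIO.input(ECHO_PIN) == 0:   # wait for echo to go high
        start = time.time()
    stop = start
    while GPIO.input(ECHO_PIN) == 1:   # measure how long echo stays high
        stop = time.time()

    # Speed of sound is ~34300 cm/s; the pulse covers the round trip.
    return (stop - start) * 34300 / 2

try:
    while True:
        if read_distance_cm() < BRAKE_DISTANCE_CM:
            servo.ChangeDutyCycle(12.5)   # move servo to brake position
        else:
            servo.ChangeDutyCycle(7.5)    # release brake
        time.sleep(0.05)
finally:
    servo.stop()
    GPIO.cleanup()
```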