Publication: Yerf-Dog: An Autonomous Buggy
Abstract
This thesis presents the design and implementation of a fully autonomous, electric go-kart, developed from a refurbished Yerf-Dog frame. The primary objective was to demonstrate vision-based autonomous navigation using low-cost hardware and open-source software. Major subsystems include a 72V electric drivetrain, a custom gear-reduction assembly, a steer-by-wire mechanism actuated by a high-torque motor, and a perception pipeline driven by real-time computer vision. A laptop running Python processes front-facing camera input using YOLOv8 for object detection and SegFormer for semantic segmentation. These outputs are encoded and transmitted to a Teensy 4.1 microcontroller, which actuates steering and throttle commands. The final system reliably performed lane following and object recognition (specifically for pedestrians and stop signs), validated through over 20 hours of autonomous testing on campus roads. Peak velocity reached 14 mph, with torque headroom remaining. The system operates for approximately 5 hours on a single charge and was built under a $2100 budget, with a total expenditure of $2040.17. Limitations in model inference speed and decision granularity were encountered, suggesting opportunities for optimization in both perception latency and control smoothing. This work serves as a proof of concept for low-cost, modular autonomous vehicles and highlights the practical integration of mechanical, electrical, and software subsystems under real-world constraints. Future development may focus on improving perception capabilities and reducing latency throughout the pipeline for higher-speed operation.
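The abstract does not specify the wire format used between the laptop and the Teensy 4.1. As a purely illustrative sketch, the encoding step could look like the following: a small fixed-size frame carrying a steering angle and a throttle value, guarded by a start byte and a checksum (the frame layout, start byte, and function names here are assumptions, not the thesis's actual protocol).

```python
import struct

START_BYTE = 0xAA  # hypothetical frame delimiter, not from the thesis


def encode_command(steer_deg: float, throttle: float) -> bytes:
    """Pack a steering angle (degrees) and throttle (0.0-1.0) into an
    8-byte frame: start byte, float32 steering, int16 throttle in
    thousandths, then a 1-byte additive checksum over the payload."""
    payload = struct.pack("<fh", steer_deg, int(round(throttle * 1000)))
    checksum = sum(payload) & 0xFF
    return bytes([START_BYTE]) + payload + bytes([checksum])


def decode_command(frame: bytes) -> tuple[float, float]:
    """Inverse of encode_command; raises ValueError on a corrupt frame,
    as firmware on the microcontroller side might when resynchronizing."""
    if frame[0] != START_BYTE:
        raise ValueError("bad start byte")
    payload, checksum = frame[1:-1], frame[-1]
    if sum(payload) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    steer_deg, throttle_milli = struct.unpack("<fh", payload)
    return steer_deg, throttle_milli / 1000.0
```

On the laptop side, such frames would be written to the Teensy over a USB serial port after each perception cycle; the checksum lets the microcontroller drop any frame corrupted in transit rather than act on it.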