| Abstract: | Reliable and accurate state estimation for autonomous tractor-trailer systems (e.g., Class 8 vehicles) is required for effective control and motion planning schemes. A key challenge for Class 8 autonomy is estimating the trailer states. Trailer units can be interchanged among tractors, and their parameters often vary widely between operations. As such, robust trailer state estimation algorithms must not rely on sensors mounted on the trailer or on a vehicle model. This work proposes equipping only the tractor with mirror-mounted monocular cameras and leveraging deep learning techniques to extract the trailer articulation angle from raw RGB images. A simulation study is performed comparing state-of-the-art convolutional neural networks (CNNs), and accuracy and efficiency trade-offs are investigated. The proposed method successfully estimates the articulation angle, with the most accurate CNNs achieving under 1° RMSE across the testing dataset while operating well over 35 FPS. |
| Index Terms: | deep learning, machine vision, state estimation |
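The abstract reports accuracy as RMSE in degrees between predicted and ground-truth articulation angles. As a minimal sketch of how that metric is computed (the angle values below are illustrative placeholders, not data from the paper):

```python
import math

def rmse_deg(true_deg, pred_deg):
    """Root-mean-square error between true and predicted articulation angles, in degrees."""
    n = len(true_deg)
    return math.sqrt(sum((t - p) ** 2 for t, p in zip(true_deg, pred_deg)) / n)

# Illustrative values only (hypothetical, not from the paper's dataset).
true_angles = [0.0, 5.0, -3.0, 10.0]
pred_angles = [0.4, 4.6, -3.5, 10.3]
print(f"RMSE: {rmse_deg(true_angles, pred_angles):.2f} deg")
```

A sub-1° RMSE over the test set, as the paper reports for its best CNNs, would correspond to this quantity staying below 1.0.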
| Published in: |
2025 IEEE/ION Position, Location and Navigation Symposium (PLANS), April 28 – May 1, 2025, Salt Lake Marriott Downtown at City Creek, Salt Lake City, UT |
| Pages: | 654 - 663 |
| Cite this article: | Thawainin, Tahn, Flegel, Tyler, Bevly, David, "Surveying CNNs for Estimating Trailer Articulation Angle Using Monocular Cameras," 2025 IEEE/ION Position, Location and Navigation Symposium (PLANS), Salt Lake City, UT, April 2025, pp. 654-663. |