Smart Features for Dynamic Vision Sensors

Zachary P. Friedel and Robert C. Leishman

Abstract: This paper presents a semi-supervised procedure for training a fully convolutional neural network for interest point detection and description. Unlike previously trained networks used for the same purpose, our network was tailored to work with event-based imagery from a downward-facing camera aboard a fixed-wing unmanned aerial vehicle. Event-based cameras are a novel type of visual sensor that operate under a unique paradigm, providing asynchronous data on log-level changes in light intensity at individual pixels. This hardware-level approach to change detection allows these cameras to achieve ultra-wide dynamic range and high temporal resolution. The final system produces state-of-the-art repeatability and homography estimation results on an aerial event-based image dataset when compared to traditional interest point detector and descriptor algorithms.
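The event-generation paradigm described in the abstract can be illustrated with a minimal sketch: an event camera emits an event at a pixel whenever the change in log intensity there crosses a contrast threshold, with polarity indicating the direction of change. The function name, threshold value, and event tuple layout below are illustrative assumptions, not details from the paper:

```python
import numpy as np

def generate_events(prev_frame, curr_frame, threshold=0.2, eps=1e-6):
    """Sketch of event-camera change detection: emit an event wherever
    the per-pixel change in log intensity exceeds a contrast threshold.
    (Names and the threshold value are illustrative, not from the paper.)"""
    log_prev = np.log(prev_frame.astype(np.float64) + eps)
    log_curr = np.log(curr_frame.astype(np.float64) + eps)
    delta = log_curr - log_prev
    ys, xs = np.nonzero(np.abs(delta) >= threshold)
    polarities = np.sign(delta[ys, xs]).astype(int)  # +1 brighter, -1 darker
    # Each event is (x, y, polarity); a real sensor also timestamps each
    # event asynchronously, which is what gives the high temporal resolution.
    return list(zip(xs.tolist(), ys.tolist(), polarities.tolist()))

# Example: a single pixel brightens enough to trigger one positive event.
prev = np.full((4, 4), 100, dtype=np.uint8)
curr = prev.copy()
curr[1, 2] = 160
events = generate_events(prev, curr)
```

Frame-based networks consume dense images, which is why the authors tailor their detector/descriptor network to imagery reconstructed from this sparse, asynchronous event stream.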
Published in: 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS)
April 20 - 23, 2020
Hilton Portland Downtown
Portland, Oregon
Pages: 973 - 978
Cite this article: Friedel, Zachary P., Leishman, Robert C., "Smart Features for Dynamic Vision Sensors," 2020 IEEE/ION Position, Location and Navigation Symposium (PLANS), Portland, Oregon, April 2020, pp. 973-978.