Imaging Sensors for Optical Wireless Location Technology

A. Arafa, X. Jin, D. Guerrero, R. Klukas, J. F. Holzman

Abstract:

Objective

The objective of this work is to characterize the performance of a new imaging sensor for optical wireless location (OWL) technology. The OWL system consists of overhead light-emitting diodes (LEDs) in a fixed optical beacon grid. A mobile imaging sensor measures Angle-of-Arrival (AOA) bearings to the observable optical beacons and estimates its position via triangulation. The imaging sensor consists of an integrated microlens and imaging array, with performance specifications tuned for the especially wide Field-of-View (FOV) characteristics needed for indoor positioning applications. The architecture is capable of operating with an FOV that exceeds 90 degrees, and this is done with device dimensions on the order of 100 micrometers. This enhanced imaging capability greatly improves overall positioning accuracy. The indoor positioning performance of this OWL imaging sensor is characterized in this work.

Motivation

There is growing interest in OWL technology, owing to the emergence of optical wireless communication links [1], which offer especially high data rates, immunity to electromagnetic interference, increased security, and small transmitter/receiver packages. OWL systems can be divided into time-based, amplitude-based, and vector-based categories. Time-based optical positioning techniques use synchronized signals from three or more optical transmitters to measure the Time-of-Arrival (TOA) or Time-Difference-of-Arrival (TDOA) and compute a position estimate using trilateration [2]. The main drawback of this technique is that it requires clock synchronization, which adds significant hardware cost to the system. Amplitude-based optical positioning techniques measure the Received Signal Strength (RSS) of signals transmitted by optical beacons and use a weighted average of the amplitudes to compute a position estimate [3]. Although this technique is simple, as it requires no clock synchronization, it suffers from the fundamental requirement that the optical transmitter grid have balanced power levels. The vector-based optical positioning technique operates through the detection of AOAs. It is a new concept for OWL systems but has great potential. Vector bearings give the direction to observable optical beacons, and position estimates can be generated by triangulating with two or more of these bearings. Such a technique has been successfully employed with photoreceivers having multiple photodiodes with differing orientations, and centimeter-level accuracies have been noted [4], [5]. The main drawback of such photodiode systems is their inability to differentiate the multiple wavelengths of a broadband LED source (for isolating separate beacons into distinct wavelength channels). An imaging sensor [6-9] can offer this multi-wavelength capability for vector-based positioning by imaging the optical beacon grid at separate wavelengths. Centimeter-level positioning accuracies for such an imaging technique have been reported [6-9], although the performance of these imaging sensors is limited by a narrow FOV of 45 degrees and device dimensions on millimeter and centimeter scales [6-9]. In this work, a new OWL imaging sensor is proposed and demonstrated. This new imaging sensor has an especially wide FOV, being greater than 90 degrees, which allows more LED optical beacons to be imaged. This provides greater redundancy for the AOA triangulation process and ultimately offers enhanced positioning performance. The imaging sensor consists of an integrated microlens atop an imaging CMOS array. A new fabrication technique is used to create the wide-FOV microlenses with dimensions on the order of 100 micrometers. This technology allows for pedestrian and robot navigation with centimeter-level accuracy.

Methodology and Results

Positioning using an imaging sensor involves the use of a lens that is placed above an imaging array (e.g., CCD or CMOS). When the imaging sensor (receiver) points toward the LED optical beacon grid, the position of the imaging sensor can be estimated by noting the focal beam spot for each observable LED optical beacon. The focal beam spots can then be used to determine the respective AOA bearings and ultimately estimate the imaging sensor position. The collinearity principle is used to solve for the position of the imaging sensor. The collinearity principle is expressed as a set of equations relating an image point on the two-dimensional imaging array to its LED optical beacon in a three-dimensional (3-D) reference coordinate system. These equations state that the image point, the object point (LED optical beacon), and the perspective centre (defined as the centre of the lens) are collinear. By knowing the orientation of the imaging sensor coordinate system and the coordinates of three image points and their respective LED optical beacon reference coordinates, one can compute the 3-D position of the imaging sensor.
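For reference, the collinearity condition can be written in its standard photogrammetric form (the notation below is generic and chosen here for illustration, not taken from the paper). For an image point (x, y), principal point (x_0, y_0), principal distance c, rotation matrix elements r_ij describing the imaging sensor orientation, perspective centre (X_c, Y_c, Z_c), and LED optical beacon coordinates (X, Y, Z),

\[
x = x_0 - c\,\frac{r_{11}(X - X_c) + r_{12}(Y - Y_c) + r_{13}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)},
\qquad
y = y_0 - c\,\frac{r_{21}(X - X_c) + r_{22}(Y - Y_c) + r_{23}(Z - Z_c)}{r_{31}(X - X_c) + r_{32}(Y - Y_c) + r_{33}(Z - Z_c)}.
\]

Each observed beacon contributes two such equations; with known orientation, three beacons give six equations in the three unknowns (X_c, Y_c, Z_c), which can be solved by least squares.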
The accuracy of an OWL system is determined by the imaging sensor and its observations of the overhead LED optical beacon grid. Imaging with a wide FOV results in an increased number of observed LED optical beacons and AOA bearings for the position estimation process. It is typically difficult to establish wide FOVs with compact imaging architectures, due to the need for high microlens curvatures. The presented microlens fabrication technique is able to create these wide-FOV microlenses and optimize them for OWL applications. A full performance characterization is carried out for this micron-scale imaging sensor, and the optimal architecture for optical positioning is designed, built, and tested. It is expected that these integrated OWL sensors will become important elements in future indoor positioning systems.
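As a complement to the description above, the following is a minimal sketch in Python, not the authors' implementation, of how AOA bearings could be extracted from focal beam spots and combined into a least-squares position fix. It assumes a pinhole-style lens model and a receiver whose axes are aligned with the reference frame (identity orientation); the focal length, beacon coordinates, and receiver position are illustrative assumptions only.

import numpy as np

f = 300e-6  # assumed effective lens-to-array distance (m), for 100-um-scale optics

def aoa_from_spot(spot_xy, f):
    # A beacon along unit direction d produces a spot at (-f*dx/dz, -f*dy/dz),
    # so the unit bearing toward the beacon is recovered by inverting this.
    x, y = spot_xy
    v = np.array([-x, -y, f])
    return v / np.linalg.norm(v)

def triangulate(beacons, bearings):
    # The receiver position p lies on each line through beacon b_i with
    # direction d_i; minimize the summed squared distance to these lines,
    # i.e., solve sum_i (I - d_i d_i^T) (p - b_i) = 0.
    A = np.zeros((3, 3))
    rhs = np.zeros(3)
    for b, d in zip(beacons, bearings):
        P = np.eye(3) - np.outer(d, d)  # projector orthogonal to d
        A += P
        rhs += P @ b
    return np.linalg.solve(A, rhs)

# Usage: three overhead beacons on a 3 m ceiling and the focal spots they
# would produce for a receiver at (1, 1, 0).
beacons = [np.array([0.0, 0.0, 3.0]),
           np.array([2.0, 0.0, 3.0]),
           np.array([0.0, 2.0, 3.0])]
p_true = np.array([1.0, 1.0, 0.0])
spots = []
for b in beacons:
    d = (b - p_true) / np.linalg.norm(b - p_true)
    spots.append((-f * d[0] / d[2], -f * d[1] / d[2]))
bearings = [aoa_from_spot(s, f) for s in spots]
print(triangulate(beacons, bearings))  # recovers approximately [1, 1, 0]

With noisy spot measurements the same least-squares step still applies; a wider FOV simply adds more bearing lines to the sum, which is why the observed beacon count drives the positioning accuracy.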
References

[1] K. Panta and J. Armstrong, “Indoor localization using white LEDs,” Electron. Lett., vol. 48, pp. 228–230, Feb. 2012.
[2] S. Y. Jung, S. Hann, and C. S. Park, “TDOA-based optical wireless indoor localization using LED ceiling lamps,” IEEE Trans. Consum. Electron., vol. 57, no. 4, pp. 1592–1597, 2011.
[3] A. Hiyama, J. Yamashita, H. Kuzuoka, K. Hirota, and M. Hirose, “Position tracking using infra-red signals for museum guiding system,” in Proc. 2nd Int. Conf. Ubiquitous Comput. Syst., 2004, pp. 49–61.
[4] A. Arafa, R. Klukas, J. F. Holzman, and X. Jin, “Towards a practical indoor lighting positioning system,” in Proc. ION GNSS, Sept. 2012.
[5] X. Liu, H. Makino, and Y. Maeda, “Basic study on indoor location estimation using visible light communication platform,” in Proc. 30th Annu. Int. Conf. IEEE Eng. Med. Biol. Soc., 2008, pp. 2377–2380.
[6] S. Horikawa, T. Komine, S. Haruyama, and M. Nakagawa, “Pervasive visible light positioning system using white LED lighting,” IEICE Tech. Rep., vol. 103, no. 721, pp. 93–99, 2004.
[7] M. Yoshino, S. Haruyama, and M. Nakagawa, “High-accuracy positioning system using visible LED lights and image sensor,” in Proc. IEEE Radio and Wireless Symp., Orlando, FL, USA, Jan. 2008, pp. 439–442.
[8] M. S. Rahman, Md. M. Haque, and K.-D. Kim, “Indoor positioning by LED visible light communication and image sensors,” IJECE, vol. 1, no. 2, pp. 161–170, Dec. 2011.
[9] B. Y. Kim, J.-S. Cho, Y. Park, and K.-D. Kim, “Implementation of indoor positioning using LED and dual PC cameras,” in Proc. Int. Conf. Ubiquitous and Future Networks (ICUFN), 2012, pp. 476–477.
Published in: Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013)
September 16 - 20, 2013
Nashville Convention Center, Nashville, Tennessee
Pages: 1020 - 1023
Cite this article: Arafa, A., Jin, X., Guerrero, D., Klukas, R., Holzman, J. F., "Imaging Sensors for Optical Wireless Location Technology," Proceedings of the 26th International Technical Meeting of the Satellite Division of The Institute of Navigation (ION GNSS+ 2013), Nashville, TN, September 2013, pp. 1020-1023.