Article

Multilateration Approach for Wide Range Visible Light Indoor Positioning System Using Mobile CMOS Image Sensor

by Md Habibur Rahman, Mohammad Abrar Shakil Sejan and Wan-Young Chung *
Department of Electronic Engineering, Pukyong National University, Busan 48513, Korea
*
Author to whom correspondence should be addressed.
Appl. Sci. 2021, 11(16), 7308; https://doi.org/10.3390/app11167308
Submission received: 1 July 2021 / Revised: 3 August 2021 / Accepted: 7 August 2021 / Published: 9 August 2021
(This article belongs to the Special Issue Intelligent Processing on Image and Optical Information, Volume III)

Abstract
Visible light positioning (VLP) is a cost-effective solution to the increasing demand for real-time indoor positioning. However, owing to high computational costs and complicated image processing procedures, most existing VLP systems fail to deliver real-time positioning and high accuracy in image sensor-based large-area indoor environments. In this study, an effective method is proposed to receive coordinate information from multiple light-emitting diode (LED) lights simultaneously. It provides high accuracy in large experimental areas with many LEDs by using a smartphone-embedded image sensor as the terminal device together with the existing LED lighting infrastructure. A flicker-free frequency shift on–off keying line coding modulation scheme was designed for the positioning system to ensure a constant modulation frequency. We tested the decoding accuracy with respect to vertical and horizontal distance, exploiting the rolling shutter mechanism of a complementary metal-oxide-semiconductor (CMOS) image sensor. The experimental results show that the proposed positioning system provides centimeter-level accuracy with low computational time, rendering it a promising solution for future large-area indoor positioning systems.

1. Introduction

Indoor positioning has become a popular research topic in the past few decades owing to the demand for location-based services. Positioning systems have useful applications in tracking patients in medical centers, asset tracking, personal tracking, and locating people inside airports. The most popular positioning technology is the global positioning system (GPS), which receives signals from several satellites [1]. However, GPS relies on line-of-sight (LOS) signal transmission and performs poorly indoors because the satellite signals are blocked by buildings and walls. Therefore, many alternative techniques exist for indoor positioning, such as Wi-Fi [2], radio-frequency identification [3], ultra-wideband [4], and ZigBee [5]. Among these, radio-frequency (RF)-based positioning is low cost and offers good coverage [6]; however, because of multipath fading and signal interference, its positioning accuracy remains uncertain. In recent years, visible light communication (VLC)-based indoor positioning systems have garnered significant attention. Owing to its high data rate, eco-friendliness, and high security, VLC is a reliable technology [7]. Visible light allows LOS communication between a transmitter and receiver [8]. Furthermore, VLC technology can minimize hardware cost by using the existing lighting infrastructure [9]. Several smartphone image sensor-based visible light positioning approaches have recently been proposed and implemented [10,11,12,13]. A single-LED localization system based on received signal strength (RSS) and angle of arrival (AoA), with an image sensor as the receiver, was demonstrated in [14]. In [15], a positioning system using two LEDs as transmitters and an image sensor as a receiver was implemented, achieving positioning accuracy of several centimeters in three dimensions (3-D). The authors of [16] implemented 3-D VLC positioning based on the angle difference of arrival using an image sensor as a receiver; the system provided 3.20–14.66 cm positioning accuracy but used small color LEDs, which poses a practical challenge for rolling shutter-based image sensor positioning. A VLC and mobile image sensor-based indoor positioning system using a single LED was implemented in [17]; it achieved a positioning accuracy of 4.47 cm, although the experimental testbed was not clearly described. In [18], the proposed indoor positioning system using VLC and an image sensor provided an accuracy of 6.96 cm.
Geometry-based positioning primarily uses the concept of signal trilateration or triangulation: the target photosensor receives signals from multiple transmitter luminaires and calculates the distance to each luminaire. A geometry-based positioning system using RSS/trilateration and a smartphone (light sensor and inertial measurement unit, i.e., IMU) was reported in [19]. Time-difference-of-arrival and multilateration-based visible light positioning systems have been proposed in [20,21]. An AoA/trilateration-based indoor positioning system using four LEDs and a camera image sensor was reported in [22]. It provided 15.6 cm positioning accuracy in a 1.8 m × 1.8 m × 3.5 m testbed; however, the positioning results were obtained by simulation, and the system required an auxiliary device, namely an extra camera. A commercial camera (Model: Canon EOS 7D Mark II) image sensor-based VLP system using the trilateration method was reported in [23]; its accuracy was 5 cm and 6.6 cm at heights of 120 cm and 180 cm, respectively.
In this study, we utilized a visible light infrastructure with many LEDs and a multilateration-based localization algorithm to locate an image sensor with high precision over a wide indoor area. The proposed system does not require any additional device at the receiver terminal. Data transmission from the ceiling LEDs was performed with flicker-free frequency shift on–off keying (FSOOK) line coding modulation to minimize the bit error rate. The rolling shutter mechanism of the CMOS mobile image sensor was used to enhance the data rate of the proposed system. Several experiments were conducted, and the results show that the proposed system can achieve centimeter-level positioning accuracy in large-area indoor environments.
In summary, the significant contributions of this study are as follows:
  • Practical System: Designed and implemented an efficient VLP system using existing LED light infrastructure. The LED drivers were modified slightly to transmit the location information to the image sensor.
  • Test Area Accuracy: Implemented a multilateration positioning approach for this system, which utilized VLC and mobile CMOS image sensors for providing location-based service in a large indoor environment with many LEDs. The system ensures good accuracy and high precision within an area measuring 2.5 m × 4.5 m.
  • Sensor: A smartphone-embedded low-cost CMOS image sensor with a rolling shutter mechanism was used to scan every image pixel, increasing the data rate. Moreover, no additional device was required at the receiver, affording a simple system.
The remainder of this paper is organized as follows. Section 2 describes the system design. The proposed positioning method is provided in Section 3. The experimental environment and the outcome of this study are presented in Section 4. Finally, Section 5 presents the conclusions of the paper.

2. System Design

2.1. Transmitter

Figure 1 shows a schematic of the transmitter system, which consists primarily of three parts: (1) an LED bulb, (2) a metal-oxide-semiconductor field-effect transistor (MOSFET) chip, and (3) a microcontroller (MCU) chip. A circular white LED with a diameter of 15 cm and a power of 15 W was used as the transmitter LED. To control the current of the LED bulb, we constructed a driver circuit comprising a high-speed switching MOSFET with two parallel resistances, R1 and R2. Resistance R1 was connected to the data pin of the MCU, and R2 was connected to pin three of the MOSFET. An ATmega328p MCU was used to encode the data onto the LED lighting. Table 1 shows the transmitter parameter configuration used for transmitting data. Phase-shift keying (PSK) is mainly applied in LAN, Bluetooth, and RFID communication; its problem is that the receiver cannot know the exact phase of the transmitted signal to determine whether it is in a mark or space condition. In contrast, frequency-shift keying (FSK) uses a pair of discrete frequencies to transmit binary 0 s and 1 s and is used for caller ID and remote metering [24]. In this work, frequency shift on–off keying (FSOOK) was adopted because its constant modulation frequency makes the widths of the dark and bright strips produced by the LED light easy to separate at the receiver.
The camera field of view (FOV) is the main factor determining the number of dark and bright strips captured from the LED. The number of captured strips is inversely related to the length of the communication link; that is, fewer dark and bright strips are covered at longer communication distances. With plain OOK modulation, the loss of dark and bright strips grows as the distance varies, because the strip widths can no longer be separated within the captured FOV of the image sensor; this cannot be solved within the protocol control rules and is a significant limitation of OOK modulation for communication between LEDs and a camera image sensor. Hence, we adopted the FSOOK modulation technique, which represents each modulated bit by multiple cycles so that distance variation between the transmitter and receiver does not affect the decoding process. In FSOOK, the binary data bits are determined from groups of dark and bright stripes in the received image on the basis of the subcarrier frequency: as long as the strip width of the modulated frequency is constant, the subcarrier frequency remains the same.
FSOOK modulation uses a higher frequency shift for a binary 1 and a lower frequency shift for a binary 0 of the LED light. The modulated bits 1 and 0 are known as the Mark and Space frequencies, respectively, as shown in Figure 2a. The transmitter LED flicker is controlled by the modulation frequency during transmission. Below 200 Hz, the human eye can perceive the flicker [25]; because visible flicker is undesirable, the modulation frequency should be higher than 200 Hz, and modulation frequencies are therefore generally between 200 Hz and 8 kHz. The FSOOK scheme uses different flicker-free square-wave frequencies to represent the different symbols of the binary data signal, and the number of bits per symbol depends on the number of frequency shifts. The image received by the image sensor can be demodulated if the widths of its dark and bright strips can be calculated. In our system, we used 8-FSOOK, so each frequency carries three bits. The packet format of the modulated light is shown in Figure 2b. In the proposed scheme, by counting the frequency changes between two consecutive frequencies, the receiver extracts the data symbol from a total of four frequencies or performs error correction when a mixed symbol frame occurs, as shown in Figure 3. To resolve the error of two mixed frequencies, the state of the first bit in every three bits was replaced during the design of the modulation scheme; this design helps identify a frame with a mixed symbol and rectify the data signal. For synchronization, a splitter frequency was inserted to mark the beginning and end of each frame; a shutter-speed threshold related to the FSOOK technique was set manually so that the splitter frequency lies above what the camera image sensor resolves. The splitter frequency was 10 kHz, which maintained the average luminance equilibrium when the image was captured by the image sensor.
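As a concrete illustration of the symbol mapping just described, the following Python sketch encodes a bit stream with 8-FSOOK, three bits per symbol, and a 10 kHz splitter between symbols. The specific subcarrier frequencies are assumptions for illustration (the paper only fixes the flicker-free 200 Hz–8 kHz range), and the first-bit replacement used for mixed-frame error correction is omitted here.

```python
# Illustrative 8-FSOOK encoder: 3 bits -> one of 8 flicker-free subcarrier
# frequencies, with a 10 kHz splitter between symbols for synchronization.
# The frequency set below is an assumption, not the experimental values.

SPLITTER_HZ = 10_000                         # splitter/synchronization frequency
SYMBOL_FREQS_HZ = [1000, 1250, 1500, 1750,
                   2000, 2250, 2500, 2750]   # assumed 8-FSOOK frequency set

def encode_fsook(bits: str) -> list[int]:
    """Map a bit string to a frequency sequence, splitter between symbols."""
    if len(bits) % 3 != 0:
        raise ValueError("8-FSOOK carries 3 bits per symbol")
    freqs = [SPLITTER_HZ]                    # frame starts with the splitter
    for i in range(0, len(bits), 3):
        symbol = int(bits[i:i + 3], 2)       # 3 bits -> symbol index 0..7
        freqs.append(SYMBOL_FREQS_HZ[symbol])
        freqs.append(SPLITTER_HZ)            # splitter marks symbol boundary
    return freqs

print(encode_fsook("101001"))  # 0b101 -> 2250 Hz, 0b001 -> 1250 Hz
```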

2.2. Receiver

The designed positioning receiver is based on the rolling shutter mechanism of the CMOS camera image sensor. The modulated signal from the ceiling LED light was captured by the rolling shutter camera sensor, which converts it from pixels into data; the decoding process at the receiver converts the light signal, accesses memory, and performs image processing. A charge-coupled device (CCD) sensor exposes all the pixels on the sensor at one time, and at the end of each exposure, all pixel data are read out simultaneously; consequently, the frame rate of a CCD image sensor is relatively low. This is known as the global shutter mechanism of the CCD image sensor, as shown in Figure 4a.
Conversely, in a CMOS image sensor, the pixel rows are exposed one after another, each row being exposed and read out in turn so that the read-outs of adjacent rows do not overlap. This operation is known as the rolling shutter mechanism of a CMOS image sensor, as shown in Figure 4b, and has been described in [26,27]. The sensor can therefore record the flicker pattern within one image [28], forming a signal layer that is superposed on the image background and produces alternating dark and bright strip bands: as the LED light switches on and off, the rolling shutter mechanism of the CMOS image sensor captures bright and dark strips.
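A minimal simulation may make this concrete: because each row samples the LED state slightly later than the previous one, a square-wave flicker freezes into row bands whose width follows directly from the flicker frequency and the per-row read-out time. The frequency and read-out duration below are assumed values, not the parameters of the experimental camera.

```python
import numpy as np

# Rolling-shutter sketch: each pixel row samples the LED state R_s seconds
# after the previous row, so a square wave at frequency f freezes into
# alternating bright/dark row bands. f and R_s are illustrative assumptions.
f = 2000.0      # LED modulation frequency (Hz), assumed
R_s = 20e-6     # per-row read-out duration (s), assumed
rows = 100

t = np.arange(rows) * R_s                            # sampling time of each row
led_on = (t * f) % 1.0 < 0.5                         # square-wave LED state per row
bands = np.flatnonzero(np.diff(led_on.astype(int)))  # indices of band boundaries
print(np.diff(bands))  # strip widths alternate 12/13 rows, i.e. ~1/(2*f*R_s) = 12.5
```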
To demodulate the data from the transmitter LED using the rolling shutter mechanism of the CMOS image sensor, the receiving camera measures the widths of the dark and bright strips in the received image and converts them back to the original transmitted frequency. The width is estimated by counting the number of pixels between a pair of dark and bright strips, as shown in Figure 5. If the transmitted square wave has frequency f, one complete cycle lasts 1/f seconds; therefore, to read out a pair of bright and dark strips in the received image, the camera must be exposed every 1/f seconds. During this time, the camera reads out each row of pixels in a read-out duration Rs. Therefore, the width W′ of the consecutive dark and bright strips can be calculated as follows:
$$W' = \frac{1}{2 f R_s} \qquad (1)$$
The strip width given by the theoretical expression (1) is a real number, whereas in the experiment the measured strip width is an integer number of rows. Therefore, the receiver can only measure the number of rows occupied by a strip, W″, as an integer estimate of W′, from which it can demodulate the symbol:
$$W'' = \left[ \frac{1}{2 f R_s} \right] \qquad (2)$$

where [·] denotes rounding to the nearest integer, so that the subcarrier frequency is recovered as

$$f = \frac{1}{2 W'' R_s} \qquad (3)$$
In practice, the read-out duration of a camera image sensor is an unknown parameter. For a given frequency, different cameras have distinct read-out durations, and hence different strip widths are observed. The strip-width difference is large in the low-frequency region and small in the high-frequency region. As indicated in [29], a smartphone cannot decode frequency-modulated signals without knowing the read-out duration, and channel estimation becomes necessary to realize high-order frequency modulation.
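The following sketch puts numbers to Equations (1)–(3): the theoretical strip width is quantized to an integer row count, and inverting that count recovers the subcarrier frequency with a small quantization error. The read-out duration R_s is an assumed value since, as noted above, it is camera-specific.

```python
# Strip-width demodulation sketch for Equations (1)-(3).
# R_s is an assumed per-row read-out duration; in practice it must be
# estimated per camera before high-order FSOOK decoding.
R_s = 20e-6                        # assumed read-out duration per row (s)

def strip_width(f_hz: float) -> float:
    """Eq. (1): theoretical width W' (in rows) of one bright or dark strip."""
    return 1.0 / (2.0 * f_hz * R_s)

def demodulate(w_rows: int) -> float:
    """Eq. (3): recover the subcarrier frequency from an integer strip width."""
    return 1.0 / (2.0 * w_rows * R_s)

f_tx = 2000.0                      # transmitted subcarrier (Hz)
w_meas = round(strip_width(f_tx))  # Eq. (2): receiver sees integer rows (12 here)
print(w_meas, demodulate(w_meas))  # 12 rows -> ~2083 Hz (quantization error)
```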

2.3. Multilateration Principle

Multilateration is a positioning method that determines the position of an object in space by measuring the distances from three or more transmitters to a receiver acting as the terminal device. If exactly three transmitters are used, the system is known as trilateration. To increase the accuracy of the positioning system, the number of transmitters should be increased, and their positions should be stationary and known. The fundamental principle of multilateration is to measure the distances between the receiver and the transmitters with which it simultaneously communicates. Figure 6 shows the basic multilateration geometry, where T1, T2, T3, and T4 represent the transmitter LEDs, R represents the receiver, and L1, L2, L3, and L4 denote the distances between each transmitter and the receiver. When this localization technique is combined with VLC technology for large indoor coverage areas, LED lights serve as transmitters, and a smartphone camera image sensor serves as the receiver.

3. Positioning Method

3.1. CMOS Image Sensor and LED Light Distance

The general operation of the camera image sensor and the projection of a detected LED light infrastructure with a transmitter and receiver are illustrated in Figure 7, where multiple detected LED light signals pass through the camera lens and are incident on the image sensor (IS) surface. The detected image on the sensor surface is inverted relative to the LED light signal. Let f be the camera focal length, an intrinsic parameter of every CMOS image sensor; let L be the distance between the target i-th LED light (xi, yj) and the camera lens; and let l be the distance from the lens to the projected image on the image sensor. The fundamental lens equation can then be written in the following form:
$$\frac{l}{L} = \frac{f}{L - f} \qquad (4)$$
The magnification parameter H of the camera lens is defined as the ratio of the projected image size to the geometric size of the LED. Therefore, we can express the magnification as:
$$H = \sqrt{\frac{k_{ij}}{K_{ij}}} = \frac{l}{L} \qquad (5)$$
where kij is the area of the detected image on the image sensor, and Kij is the actual area of the LED light (xi, yj). Because f << L, Equation (4) gives l/L ≈ f/L. Therefore, by combining Equations (4) and (5), we obtain:
$$k_{ij} = H^2 K_{ij} \qquad (6)$$
The number of pixels covered on the image sensor is the ratio of the projected image area on the image sensor to the unit pixel area. If αp is the number of pixels on the sensor, µ is the unit pixel size of the IS, and K is the area of the detected LED ceiling light, then their relationship can be expressed as follows:
$$\alpha_p = \frac{f^2 K}{\mu^2 L^2} \qquad (7)$$
Currently, LEDs of different shapes and configurations are available in the marketplace; they are typically manufactured in rectangular, square, or circular shapes to satisfy customer demand. In our experiment, we used a circular LED: if r is the radius of the circle, the total LED area is K = πr², as shown in Figure 8a. The distance between each ceiling LED within the camera FOV and the camera image sensor is unique during communication with multiple light signals, while the focal length and unit pixel size of a given smartphone camera are fixed. Therefore, if the real physical area of the ceiling LEDs is known, we can calculate the distance from the corresponding pixel count of the LED image on the image sensor. We can rewrite Equation (7) as follows:
$$L = \frac{v}{\sqrt{\alpha_p}} \qquad (8)$$
where $v = f\sqrt{K}/\mu$ is a constant for each camera image sensor and LED light. As indicated in (8), the distance from the LED to the camera lens is inversely proportional to the square root of the number of pixels covered on the camera image sensor.
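To illustrate Equations (7) and (8) numerically, the sketch below projects the 15 cm circular LED of Table 1 into a pixel count and then inverts that count to recover the LED-to-lens distance. The focal length and pixel size are assumed values for illustration, not the calibrated parameters of the experimental smartphone.

```python
import math

# Distance from pixel area, Eqs. (7)-(8): L = v / sqrt(alpha_p),
# with v = f*sqrt(K)/mu. Focal length and pixel size are assumed values;
# the LED area uses the 15 cm circular LED from Table 1.
f_m = 4.2e-3                   # assumed focal length (m)
mu = 1.4e-6                    # assumed unit pixel size (m)
r = 0.075                      # LED radius: 15 cm diameter (Table 1)
K = math.pi * r ** 2           # circular LED area, K = pi*r^2

v = f_m * math.sqrt(K) / mu    # camera/LED constant of Eq. (8)

def pixel_count(L: float) -> float:
    """Eq. (7): pixels covered by the LED image at distance L."""
    return (f_m ** 2 * K) / (mu ** 2 * L ** 2)

alpha_p = pixel_count(2.0)                       # LED seen from 2 m
L_est = v / math.sqrt(alpha_p)                   # Eq. (8) inverts the projection
print(f"{alpha_p:.0f} px -> L = {L_est:.2f} m")  # recovers 2.00 m
```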

3.2. Calculation of Image Area of Image Sensor and Location Change with Camera FOV

The image area on the image sensor depends on the size of the LED luminaires, the distance between the LED light and the camera image sensor (described in Section 3.1), the intensity of the LED light signal during transmission, and the number of LED lights in the camera FOV. The number of LED lights within the camera FOV changes as the camera moves through the indoor environment. Our proposed system estimates the position by capturing image data from more than three LEDs within the camera FOV, and the number of LEDs available for distance measurement can change from a single LED up to four LEDs as the location changes. Figure 8b shows the location estimation process as the pixel values change with the camera FOV within the positioning environment, where the camera image sensor moves from location 1 to location 5. We assumed that the LED luminaires were distributed in a square grid on the ceiling; the number of lights within the camera FOV depends on the transmission height, the inter-LED distances, and the FOV angle. The camera FOV is assumed circular, as shown in Figure 8b: at locations 1, 3, and 5, the camera images two LED lights on the image sensor, whereas at locations 2 and 4 it images four LED lights, and so on. As the number of LED lights within the camera FOV increases, the positioning accuracy increases.
The LED light spreads from the ceiling to the ground in a full circular pattern; consequently, at a given distance from the LED, the light intensity is the same all around each LED luminaire. Therefore, if the image sensor detects the image area of only a single LED luminaire, its location is ambiguous, which causes large location errors. Adding a second LED as a reference transmitter alongside the single LED still cannot provide the exact location of the image sensor; hence, a third LED luminaire is needed to deliver accurate positioning information. Even so, location estimation problems and errors occur when the three LED transmitters lie on, or nearly on, a straight line as the image sensor moves through the lighting environment. To compensate for these location errors, our proposed system places more than three LED transmitters on the ceiling: we measured the coordinate distances from four LED luminaires simultaneously using the image sensor and located its position accurately, which enhanced the positioning accuracy.

3.3. Coordinate Distance Estimation

The proposed system measures the coordinate distances between the ceiling LED lights and the smartphone image sensor. The LED lights were attached to the ceiling, and the distance between the ceiling LEDs and the ground was constant for every positioning point in the indoor environment. The smartphone image sensor moved under the LED lighting infrastructure. In our system, at least three LED lights were within the camera FOV, each continuously transmitting its coordinate information to the smartphone image sensor using visible light technology. We denote the coordinates of the n-th ceiling LED transmitter along the x, y, and z axes as (xn, yn, zn) and the image sensor coordinates on the image plane as (x′, y′, z′). Therefore, after the smartphone CMOS image sensor receives the coordinate information from the ceiling LED lights, the distance Ln from the n-th LED to the image sensor is expressed as:
$$L_n^2 = (x' - x_n)^2 + (y' - y_n)^2 + (z' - z_n)^2 \qquad (9)$$
If coordinate information is received from n = 4 ceiling LED light transmitters, then Equation (9) expands as follows:
$$\begin{cases} L_1^2 = (x' - x_1)^2 + (y' - y_1)^2 + (z' - z_1)^2 \\ L_2^2 = (x' - x_2)^2 + (y' - y_2)^2 + (z' - z_2)^2 \\ L_3^2 = (x' - x_3)^2 + (y' - y_3)^2 + (z' - z_3)^2 \\ L_4^2 = (x' - x_4)^2 + (y' - y_4)^2 + (z' - z_4)^2 \end{cases} \qquad (10)$$
By simplifying Equation (10), we obtain the matrix form in Equation (11), which relates the calculated position to the measured distances between the LED transmitters and the image sensor receiver:
$$\begin{bmatrix} 1 & -2x_1 & -2y_1 & -2z_1 \\ 1 & -2x_2 & -2y_2 & -2z_2 \\ 1 & -2x_3 & -2y_3 & -2z_3 \\ 1 & -2x_4 & -2y_4 & -2z_4 \end{bmatrix} \begin{bmatrix} x'^2 + y'^2 + z'^2 \\ x' \\ y' \\ z' \end{bmatrix} = \begin{bmatrix} L_1^2 - x_1^2 - y_1^2 - z_1^2 \\ L_2^2 - x_2^2 - y_2^2 - z_2^2 \\ L_3^2 - x_3^2 - y_3^2 - z_3^2 \\ L_4^2 - x_4^2 - y_4^2 - z_4^2 \end{bmatrix} \qquad (11)$$
Hence, from Equation (11), (x′, y′, z′) can be acquired from the following form:
$$M \cdot x = S \qquad (12)$$
where
$$M = 2 \begin{pmatrix} x_1 - x_2 & y_1 - y_2 & z_1 - z_2 \\ x_1 - x_3 & y_1 - y_3 & z_1 - z_3 \\ x_1 - x_4 & y_1 - y_4 & z_1 - z_4 \end{pmatrix}, \quad x = \begin{pmatrix} x' \\ y' \\ z' \end{pmatrix}, \quad S = \begin{pmatrix} L_2^2 - L_1^2 + x_1^2 - x_2^2 + y_1^2 - y_2^2 + z_1^2 - z_2^2 \\ L_3^2 - L_1^2 + x_1^2 - x_3^2 + y_1^2 - y_3^2 + z_1^2 - z_3^2 \\ L_4^2 - L_1^2 + x_1^2 - x_4^2 + y_1^2 - y_4^2 + z_1^2 - z_4^2 \end{pmatrix} \qquad (13)$$
The solution for x in Equation (12), obtained by the least-squares estimation method, is expressed as:
$$x = (M^T M)^{-1} M^T S \qquad (14)$$
This equation indicates that all elements of the matrix $\tilde{N} = (M^T M)^{-1} M^T$ are derived from the transmitter coordinate values only, whereas the vector S comprises the measured distances between the unknown coordinates of the image sensor receiver and the coordinates of the LED transmitters.
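A minimal numerical sketch of Equations (12)–(14) follows, assuming four ceiling LEDs at known coordinates and noiseless distance measurements; it builds M and S exactly as defined above and solves for the receiver position by least squares. The coordinates are illustrative, not those of the experimental testbed.

```python
import numpy as np

# Least-squares multilateration, Eqs. (12)-(14): M x = S, x = (M^T M)^-1 M^T S.
# LED coordinates and the true receiver position are illustrative assumptions.
leds = np.array([[0.0, 0.0, 2.0],     # (x_n, y_n, z_n) of four ceiling LEDs
                 [2.0, 0.0, 2.0],
                 [0.0, 2.0, 2.0],
                 [2.0, 2.0, 2.0]])
true_rx = np.array([0.7, 1.2, 0.0])   # assumed receiver position on the floor
L = np.linalg.norm(leds - true_rx, axis=1)   # "measured" distances L_1..L_4

x1 = leds[0]
M = 2.0 * (x1 - leds[1:])                    # Eq. (13): 3x3 difference matrix
S = (L[1:] ** 2 - L[0] ** 2
     + np.sum(x1 ** 2) - np.sum(leds[1:] ** 2, axis=1))

# With coplanar ceiling LEDs the z-column of M vanishes, so lstsq returns the
# minimum-norm solution (z' = 0), consistent with a receiver on the floor plane.
x = np.linalg.lstsq(M, S, rcond=None)[0]     # Eq. (14) via least squares
print(np.round(x, 3))                        # -> [0.7, 1.2, 0.0]
```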

4. Experimental Results

4.1. Experiment Setup

The developed positioning system was tested in an experimental work area with a floor space of 2.5 m × 4.5 m, with the LED luminaires mounted at a height of 2 m above the ground. The setup comprised six LEDs mounted on the ceiling, spaced 2 m from each other and 0.25 m from the edge of the experimental region. Each LED luminaire transmitted its individual coordinate information to the smartphone camera image sensor, which continuously received it. Figure 9 shows the experimental area, where Figure 9a,b presents the measurement area and the real-time experimental environment, respectively.

4.2. Experimental Result Evaluation

4.2.1. Light Intensity Measurement Performance

The incident light intensity depends primarily on the transmission distance between the LED luminaires on the ceiling and the smartphone image sensor: if the transmission distance increases, the light intensity decreases, i.e., the light intensity is inversely related to the transmission distance. Furthermore, the data rate increases if the intensity of light from the LEDs increases. We observed the light intensity under the LED luminaires using two different approaches. In the first, we used a luxmeter to measure the light intensity from each LED light with respect to the transmission distance, as shown in Figure 10a. In the second, we used our experimental smartphone image sensor to measure the light intensity; for this, we implemented a simple Android application based on [30], which can measure the light intensity reliably. The flow chart of the implemented Android application is shown in Figure 10b. To analyze the light intensity measurement capability of the image sensor, we compared its measurements with those of the traditional luxmeter, as shown in Figure 10c. The two sets of measurements were similar, and the light intensity decreased with increasing transmission distance.

4.2.2. Image Sensor Communication Performance

Distance estimation performance depends on successful communication between the transmitter's multiple ceiling LEDs and the receiver CMOS image sensor. The receiver used the camera rolling shutter mechanism and an Android application that allowed the camera shutter speed to be adjusted manually before the experiment. To assess the communication performance between the LED transmitters and the receiver image sensor, we measured the data rate of the proposed FSOOK modulation technique at different distances and camera frame rate configurations, as shown in Figure 11a. The data rate clearly increases with the camera frame rate, i.e., the data rate is directly proportional to the camera frame rate. However, in a practical image sensor implementation, high frame rates complicate the decoding process on the receiver side. In our system, the experiment was performed using a current Android smartphone (Model: Samsung Galaxy S8) with a camera frame rate of 20 fps; owing to cache memory limitations of the smartphone, we could not configure the camera frame rate above 20 fps. Table 2 shows the camera parameter configuration used to decode data successfully. We also examined the image computation time in terms of the image format and image resolution, as shown in Figure 11b, where two image formats, JPEG and YUV, were tested. As shown in Figure 11b, image processing in the YUV format requires less time than in the JPEG format across image resolutions. In our system, we set the image resolution to 600 × 800, for which the total computation time in YUV format was 27.19 ms, whereas the JPEG format consumed 40 ms, comparatively higher than the YUV image format.

4.2.3. Decoding Accuracy Rate Performance

The positioning performance depends on the decoding success rate at the receiver. Decoding errors at the receiver may produce wrong positioning information and hence require re-positioning, which causes high latency and unstable positioning. The decoding success rate depends on the vertical and horizontal distances between the transmitter LEDs and the receiver image sensor. In the typical case, the distance from the ceiling to the user is approximately 1.5 m. The decoding rate as a function of the vertical distance from the LEDs to the image sensor is shown in Figure 12a: it reached 100% when the vertical distance between transmitter and receiver was less than 1.5 m and remained at 80% until the distance reached 1.78 m. To observe the decoding rate with respect to the horizontal distance from the LED luminaires, we tested the decoding success rate as the image sensor moved horizontally away from the LED, as shown in Figure 12b; the covered region was sufficient to span the area under a single LED luminaire. The decoding accuracy directly below the LED was 100% and decreased as the image sensor moved away from the point of direct incidence toward the edge of the LED luminaire. The decoding accuracy was calculated by counting the successfully received data frames from which the camera image sensor could extract the ID messages, i.e., decoding accuracy = ((received frames)/(total frames)) × 100%.
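The decoding-accuracy statistic above reduces to a one-line computation; a trivial sketch with made-up frame counts is shown below.

```python
def decoding_accuracy(received_frames: int, total_frames: int) -> float:
    """Decoding accuracy = (received frames / total frames) * 100%."""
    return 100.0 * received_frames / total_frames

print(decoding_accuracy(80, 100))  # illustrative counts -> 80.0 (%)
```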

4.2.4. Positioning Performance Analysis

To evaluate the positioning accuracy of the proposed method, we conducted experiments with both evenly distributed and randomly distributed test points. For the evenly distributed case, 32 real test positions were created at a height of 2 m from the ceiling to the ground; each position was tested twice, giving 64 estimated positions. The evenly distributed positioning results are shown in Figure 13a: the estimated positions matched the real positions well, demonstrating that the proposed positioning system provides high accuracy. To observe the performance under random user movement, we also conducted the experiment with 11 randomly distributed test points at the same height; the results, shown in Figure 13b, again matched the real positions well. The average positioning accuracy was 2.44 cm with a maximum error of 4.60 cm for the evenly distributed test points, while for the randomly distributed test points the average accuracy was 2.41 cm with a maximum error of 4.50 cm, almost the same as the evenly distributed results. For better visualization, Figure 14a,b shows the cumulative distribution function (CDF) of the positioning error for the evenly and randomly distributed test points, respectively. The CDF is defined as the probability that a random positioning error ε is less than or equal to the positioning accuracy Pa, i.e., CDF(Pa) = P(ε ≤ Pa). As shown by the histograms in Figure 14c,d, the average errors are 2.44 and 2.41 cm, and all positioning errors are within 4.60 and 4.50 cm for the evenly and randomly distributed test points, respectively. Therefore, the proposed positioning method yields highly accurate positioning performance. Finally, a comparative analysis of the proposed system against other practically implemented VLC- and image sensor-based indoor positioning systems is shown in Table 3. The major drawback of the other systems is their small testbeds together with inadequate accuracy and high cost; by applying the proposed localization method combining VLC with an image sensor, a high-accuracy, low-cost indoor positioning system was obtained.
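The empirical CDF of Figure 14 can be reproduced from a list of per-point errors as sketched below; the error samples are synthetic placeholders, not the measured data.

```python
import numpy as np

# Empirical CDF of positioning error: CDF(P_a) = P(error <= P_a).
# The error samples are synthetic placeholders for illustration only.
rng = np.random.default_rng(0)
errors_cm = np.clip(np.abs(rng.normal(2.4, 1.0, 64)), 0.0, 4.6)

samples = np.sort(errors_cm)                        # sorted error samples
cdf = np.arange(1, samples.size + 1) / samples.size # P(error <= sample)
print(f"median error: {samples[samples.size // 2]:.2f} cm")
print(f"90th percentile: {samples[int(0.9 * samples.size) - 1]:.2f} cm")
```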

5. Conclusions

In this paper, a novel wide-range and precise indoor positioning system was proposed, which utilizes LED lights as transmitters and a mobile image sensor as a receiver to identify lighting information. Utilizing the rolling shutter mechanism of a smartphone-embedded CMOS image sensor enhanced the data rate, thereby improving the decoding accuracy and minimizing the positioning error. The existing LED lighting infrastructure was used effectively in the proposed positioning system to transmit coordinate information at a high data rate with the FSOOK modulation scheme, resulting in a system that is cost-effective and easy to implement. To cover a large positioning area, a multilateration positioning method was proposed that can locate a static mobile camera image sensor with centimeter-level accuracy by receiving multiple LED light signals simultaneously. The accuracy of the proposed system is sufficient for localization applications in future large-scale indoor environments. Additionally, the proposed system achieved a 100% decoding rate for vertical distances up to 1.5 m, and the rate remained at 80% until the distance reached 1.78 m; the decoding accuracy directly below the LED was 100% and gradually decreased as the image sensor moved away from the point of direct incidence toward the edge of the LED light. The performance of the proposed system was also compared with other emerging indoor positioning technologies.

Author Contributions

Conceptualization, W.-Y.C.; system methodology, M.H.R. and M.A.S.S.; software, M.H.R.; validation, W.-Y.C. and M.H.R.; formal analysis, M.H.R. and M.A.S.S.; investigation, W.-Y.C., M.H.R. and M.A.S.S.; data curation, M.H.R.; writing—original draft preparation, M.H.R.; writing—review and editing, W.-Y.C., M.H.R. and M.A.S.S.; supervision, W.-Y.C. All authors have read and agreed to the published version of the manuscript.

Funding

The study was supported by a National Research Foundation of Korea (NRF) grant funded by the Korea Government (No. 2020R1A4A1019463).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Ozsoy, K.; Bozkurt, A.; Tekin, I. Indoor positioning based on global positioning system signals. Microw. Opt. Technol. Lett. 2013, 55, 1091–1097. [Google Scholar] [CrossRef] [Green Version]
  2. He, S.; Chan, S.H.G. Wi-Fi fingerprint-based indoor positioning: Recent advance and comparisons. IEEE Commun. Surv. Tuts. 2015, 18, 466–490. [Google Scholar] [CrossRef]
  3. Errington, A.F.C.; Daku, B.L.F.; Prugger, A.F. Initial position estimation using RFID tags: A least-squares approach. IEEE Trans. Instrum. Meas. 2010, 59, 2863–2869. [Google Scholar] [CrossRef]
  4. Cazzorla, A.; De Angelis, G.; Moschitta, A.; Dionigi, M.; Alimenti, F.; Carbone, P. A 5.6-GHz UWB position measurement system. IEEE Trans. Instrum. Meas. 2013, 62, 675–683. [Google Scholar] [CrossRef]
  5. Konings, D.; Bubel, A.; Alam, F.; Noble, F. Entity tracking within a Zigbee based smart home. In Proceedings of the 23rd International Conference on Mechatronics and Machine Vision in Practice (M2VIP), Nanjing, China, 28–30 November 2016; pp. 1–6. [Google Scholar]
  6. Ciftler, B.S.; Kadri, A.; Guvenc, I. Fundamental bounds on RSS-based wireless localization in passive UHF RFID systems. In Proceedings of the IEEE Wireless Communications and Networking Conference (WCNC), New Orleans, LA, USA, 9–12 March 2015; pp. 1356–1361. [Google Scholar]
  7. Pham, N.Q.; Rachim, V.P.; Chung, W.-Y. High-accuracy VLC-based indoor positioning system using multi-level modulation. Opt. Express 2019, 27, 7568–7584. [Google Scholar] [CrossRef]
  8. Sejan, M.A.S.; Chung, W.-Y. Indoor fine particulate matter monitoring in a large area using bidirectional multihop VLC. IEEE Internet Things J. 2021, 8, 7214–7228. [Google Scholar] [CrossRef]
  9. Gu, W.; Zhang, W.; Kavehrad, M.; Feng, I. Three-dimensional light positioning algorithm with filtering techniques for indoor environments. Opt. Eng. 2014, 53, 107107. [Google Scholar] [CrossRef]
  10. Rajagopal, N.; Lazik, P.; Rowe, A. Visible light landmarks for mobile devices. In Proceedings of the 13th IEEE/ACM International Conference on Information Processing in Sensor Networks (IPSN 2014), Berlin, Germany, 15–17 April 2014; pp. 249–260. [Google Scholar]
  11. Kuo, Y.-S.; Pannuto, P.; Hsiao, K.-J.; Dutta, P. Luxapose: Indoor positioning with mobile phones and visible light. In Proceedings of the 20th Annual International Conference on Mobile Computing and Networking, Maui, HI, USA, 7–11 September 2014; pp. 447–458. [Google Scholar]
  12. Yang, Z.; Wang, Z.; Zhang, J.; Huang, C.; Zhang, Q. Wearables can afford: Light-weight indoor positioning with visible light. In Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, Florence, Italy, 18–22 May 2015; pp. 317–330. [Google Scholar]
  13. Huang, H.; Feng, L.; Ni, G.; Yang, A. Indoor imaging visible light positioning with sampled sparse light source and mobile device. Chin. Opt. Lett. 2016, 14, 090602. [Google Scholar] [CrossRef]
  14. Hou, Y.; Xiao, S.; Bi, M.; Xue, Y.; Pan, W.; Hu, W. Single LED beacon-based 3-D indoor positioning using off-the-shelf device. IEEE Photonics J. 2016, 8, 1–11. [Google Scholar] [CrossRef]
  15. Kim, J.-Y.; Yang, S.-H.; Son, Y.-H.; Han, S.-K. High-resolution indoor positioning using light emitting diode visible light and camera image sensor. IET Optoelectron. 2016, 10, 184–192. [Google Scholar] [CrossRef]
  16. Zhu, B.; Cheng, J.; Wang, Y.; Yan, J.; Wang, J. Three-dimensional VLC positioning based on angle difference of arrival with arbitrary tilting angle of receiver. IEEE J. Sel. Areas Commun. 2018, 36, 8–22. [Google Scholar] [CrossRef]
  17. Ji, Y.; Xiao, C.; Gao, J.; Ni, J.; Cheng, H.; Zhang, P.; Sun, G. A single LED lamp positioning system based on CMOS camera and visible light communication. Opt. Commun. 2019, 443, 48–54. [Google Scholar] [CrossRef]
  18. Nakazawa, Y.; Makino, H.; Nishimori, K.; Wakatsuki, D.; Komagata, H. LED-tracking and ID-estimation for indoor positioning using visible light communication. In Proceedings of the 2014 International Conference on Indoor Positioning and Indoor Navigation (IPIN), Busan, Korea, 27–30 October 2014; pp. 87–94. [Google Scholar]
  19. Li, L.; Hu, P.; Peng, C.; Shen, G.; Zhao, F. Epsilon: A visible light based positioning system. In Proceedings of the 11th USENIX Conference on Networked Systems Design and Implementation, Seattle, WA, USA, 2–4 April 2014; pp. 331–343. [Google Scholar]
  20. Do, T.-H.; Yoo, M. TDOA-based indoor positioning using visible light. Photonic Netw. Commun. 2014, 27, 80–88. [Google Scholar] [CrossRef]
  21. Taparugssanagorn, A.; Siwamogsatham, S.; Pomalaza-Raez, C. A hexagonal coverage LED-ID indoor positioning based on TDOA with extended Kalman filter. In Proceedings of the 2013 IEEE 37th Annual Computer Software and Applications Conference, Kyoto, Japan, 22–26 July 2013; pp. 742–747. [Google Scholar]
  22. Rahman, M.S.; Haque, M.M.; Kim, K.-D. Indoor positioning by LED visible light communication and image sensors. Int. J. Elect. Comput. Eng. 2011, 1, 161. [Google Scholar] [CrossRef]
  23. Lin, B.; Ghassemlooy, Z.; Lin, C.; Tang, X.; Li, Y.; Zhang, S. An indoor visible light positioning system based on optical camera communications. IEEE Photon. Technol. Lett. 2017, 29, 579–582. [Google Scholar] [CrossRef]
  24. Bharati, S.; Rahman, M.A.; Podder, P. Implementation of ASK, FSK and PSK with BER vs. SNR comparison over AWGN channel. arXiv 2020, arXiv:2002.03601. [Google Scholar]
  25. Berman, S.M.; Greenhouse, D.S.; Bailey, I.L.; Clear, R.D.; Raasch, T.W. Human electroretinogram responses to video displays, fluorescent lighting, and other high frequency sources. Optom. Vis. Sci. 1991, 68, 645–662. [Google Scholar] [CrossRef]
  26. Liu, Y. Decoding mobile-phone image sensor rolling shutter effect for visible light communications. Opt. Eng. 2016, 55, 016103. [Google Scholar] [CrossRef]
  27. Chow, C.-W.; Shiu, R.-J.; Liu, Y.-C.; Yeh, C.-H. Non-flickering 100 m RGB visible light communication transmission based on a CMOS image sensor. Opt. Express 2018, 26, 7079–7084. [Google Scholar] [CrossRef]
  28. Yang, Y.; Nie, J.; Luo, J. ReflexCode: Coding with Superposed Reflection Light for LED-Camera Communication. In Proceedings of the 23rd Annual International Conference on Mobile Computing and Networking, Snowbird, UT, USA, 16–20 October 2017; pp. 193–205. [Google Scholar]
  29. Lee, H.Y.; Lin, H.M.; Wei, Y.L.; Wu, H.I.; Tsai, H.M.; Lin, K.C.J. RollingLight: Enabling line-of-sight light-to-camera communications. In Proceedings of the 13th Annual International Conference on Mobile Systems, Applications, and Services, Florence, Italy, 18–22 May 2015; pp. 167–180. [Google Scholar]
  30. Gutierrez-Martinez, J.-M.; Castillo-Martinez, A.; Medina-Merodio, J.-A.; Aguado-Delgado, J.; Martinez-Herraiz, J.-J. Smartphone as a light measurement tool: Case of study. Appl. Sci. 2017, 7, 616. [Google Scholar] [CrossRef]
  31. Zhang, R.; Zhong, W.-D.; Qian, K.; Wu, D. Image sensor based visible light positioning system with improved positioning algorithm. IEEE Access 2017, 5, 6087–6094. [Google Scholar] [CrossRef]
  32. Lee, J.-W.; Kim, S.-J.; Han, S.-K. 3D Visible light positioning by bokeh based optical intensity measurement in smartphone camera. IEEE Access 2019, 7, 91399–91406. [Google Scholar] [CrossRef]
  33. Quan, J.; Bai, B.; Jin, S.; Zhang, Y. Indoor positioning modeling by visible light communication and imaging. Chin. Opt. Lett. 2014, 12, 052201. [Google Scholar] [CrossRef]
Figure 1. (a) Simple transmitter circuit including its parameter; (b) the electrical schematic of transmitter circuit with its connection design.
Figure 2. (a) FSOOK modulation signal; (b) the packet format of modulation light.
Figure 3. Rolling shutter captured image with mixed symbol data.
Figure 4. (a) Global shutter effect of CCD sensor; (b) rolling shutter effect of CMOS sensor.
Figure 5. Estimation of strip widths during the decoding process.
Figure 6. The multilateration operation diagram, where T1, T2, T3, and T4 are the LED transmitters, R is the receiver, and L1, L2, L3, and L4 are the calculated distances between the transmitters and the receiver.
Figure 7. Positioning system infrastructure installed with ceiling LED lights as a transmitter and smartphone image sensor as a receiver.
Figure 8. (a) Determination of area of the LED light; (b) location estimate and change of LEDs within FOV of camera from one location to another.
Figure 9. Experimental setup: (a) VLP installation area; (b) test field environment.
Figure 10. (a) Light intensity vs. communication distance; (b) flow chart of Android application for luminance measurement; (c) luxmeter vs. image sensor performance.
Figure 11. (a) Data rate vs. camera frame rate; (b) image processing performance vs. image sensor resolution.
Figure 12. (a) Decoding accuracy vs. vertical distance; (b) decoding accuracy vs. horizontal distance.
Figure 13. Positioning results for evenly and randomly distributed test points: (a) for evenly distributed test points, positioning result at height 2 m; (b) for randomly distributed test points, positioning result at height 2 m.
Figure 14. (a) CDF vs. error for evenly distributed test points; (b) CDF vs. error for randomly distributed test points; (c) histogram of error for evenly distributed test points; (d) histogram of error for randomly distributed test points.
Table 1. Transmitter parameter configuration.

| Parameter | Value |
|---|---|
| LED model | BSDW-010, color temp. 5300~6000 K |
| LED size | 15 cm |
| LED power | 15 W |
| Number of LEDs | 6 |
| MCU | ATmega328p |
| MOSFET chip | P24N65E |
| Resistance R1 | 10 kΩ |
| Resistance R2 | 55 Ω |
| Number of frequencies | 8 |
| Synchronization frequency | 10 kHz |
| Error correction | Run-length encoding |
Table 2. Receiver parameter configuration.

| Parameter | Value |
|---|---|
| Image sensor | Rolling shutter CMOS sensor |
| Shutter speed | 16 kHz |
| ISO | 100 |
| Frame rate | 20 fps |
| Image processing library | OpenCV |
| Smartphone model | Samsung Galaxy S8 |
| Camera | Front camera with 8 megapixels |
| Focal length | 24 mm |
| Aperture | 1.7 |
| Camera API | Camera 2 with API Level 23 |
| Camera image resolution | 600 × 800 pixels |
Table 3. Comparison of the proposed system with other VLC- and image sensor-based positioning systems.

| Ref. | Positioning System | Experiment/Simulation | Experiment Testbed Size | Accuracy | Number of LEDs |
|---|---|---|---|---|---|
| [31] | Image | Experiment | 1.4 × 1.4 × 1.6 m | X: 3 cm, Y: 7 cm | 3 LEDs |
| [11] | AoA ¹ + triangulation | Experiment | 0.71 × 0.74 × 2.26 m | 10 cm | 5 LEDs |
| [32] | AoA + trilateration | Experiment | 1.0 × 1.0 × 2.4 m | <10 cm | 1 LED |
| [22] | RSS ¹ + AoA | Simulation | 1.8 × 1.8 × 3.5 m | 15.6 cm | 4 LEDs |
| [33] | AoA + image | Experiment | 1.8 × 1.8 × 3.5 m | <40 cm | 9 LEDs |
| Proposed | Image + multilateration | Experiment | 2.5 × 4.5 × 2 m | 2.41 cm | 6 LEDs |

¹ Angle-of-arrival (AoA), received-signal-strength (RSS).
