Sensors 2010, 10(9), 8363-8374; doi:10.3390/s100908363

Article
Color Regeneration from Reflective Color Sensor Using an Artificial Intelligent Technique
Ömer Galip Saracoglu * and Hayriye Altural
Department of Electrical and Electronic Engineering, Erciyes University, 38039, Kayseri, Turkey; E-Mail: hayriye.altural@gmail.com
* Author to whom correspondence should be addressed; E-Mail: saracog@erciyes.edu.tr; Tel.: +90-352-437-4901; Fax: +90-352-437-5784.
Received: 20 July 2010; in revised form: 10 August 2010 / Accepted: 20 August 2010 /
Published: 3 September 2010

Abstract: A low-cost optical sensor based on reflective color sensing is presented. Artificial neural network models are used to improve the color regeneration from the sensor signals. The analog voltages of the sensor are successfully converted to RGB colors. The artificial intelligence models presented in this work enable color regeneration from the analog outputs of the color sensor. In addition, inverse modeling supported by an intelligent technique enables the sensor probe to be used as a colorimetric sensor that relates color changes to analog voltages.
Keywords:
color sensing; intelligent sensor; artificial neural network

1. Introduction

Color sensing is one of the important subjects of optical sensing. Color sensors have a variety of applications, including the detection of environmental, biological, and chemical parameters [1–7]. Color-detection-based chemical sensing places particular emphasis on colorimetric sensors, because many parameters, such as pH [6], concentration [3], and chemical gases [4,8,9], can cause direct or indirect color changes in biological and chemical species.

Optical sensors usually have a non-linear relationship between the sensor’s response and the effect to be sensed, i.e., the measurand. Because optical sensors are highly sensitive and non-linear in nature, an unexpected change in the measurand may cause considerable changes and measurement errors in the sensor’s responses. Furthermore, a modern sensor is expected to adapt itself to changing or unexpected conditions. In order to meet these expectations and to predict the sensor’s response more accurately, artificial intelligence techniques have become a useful tool for the design of intelligent sensors [10–12].

Artificial Neural Networks (ANNs) are inspired by the brain’s complex, nonlinear, and parallel computing ability, and they therefore have exceptional properties for data processing, such as adaptation, learning, and generalization [13]. Intelligent optical sensors combine the abilities of ANNs with optical sensors’ inherently high accuracy, long-term stability, and immunity to electromagnetic interference. ANN-based intelligent techniques are most significant when the sensor is highly nonlinear and/or a precise mathematical relationship cannot be established between the sensor’s response and the measurand. For example, when optical color sensors are used to classify emerged colors, ANNs exhibit good performance and add versatility to the measurement system [14,15].

The study presented in this paper is a low-cost reflective color sensor whose detection principle is similar to that of the designs in [16–18] but which overcomes the drawbacks reported therein with the aid of artificial intelligence. All parts of the sensor consist of cheap and easily available components. For example, the signal reflected from the colored surface is produced by an RGB LED driven by a microcontroller and is detected by a photodiode. The analog signal from the photodiode is then amplified by an op-amp and applied to the microcontroller, where it is converted to digital form and processed. As is well known, dark-colored surfaces degrade the performance of reflective color sensors. In order to overcome this degradation and to improve the sensor’s responses, we used ANN models based on the multilayer perceptron (MLP) algorithm.

2. Brief Description of the Neural Networks

Neural networks are perhaps the most popular intelligent technique in the design of intelligent sensors [19] and intelligent optical sensors [11,12], especially because of their ability to model highly non-linear functions and to generalize to unseen data.

There are different types of neural networks, such as the multilayer perceptron (MLP), radial basis function (RBF) networks, and generalized regression neural networks (GRNN), for modeling non-linear functions and for data estimation/prediction problems [20]. The three network types share many similarities but also have significant differences. Figure 1 [13] shows the general structure of the three types of neural networks, consisting of an input layer, one or more hidden layers, and an output layer. The processing units of an MLP compute a linear combination of their inputs followed by a (typically non-linear) activation function, whereas the RBF and GRNN use radial processing units. Moreover, MLPs can exhibit better performance when used for data estimation/prediction problems (see Table 3 of Foody et al. [20]), while other network types such as the RBF, GRNN, and probabilistic neural network (PNN) can successfully solve classification problems [21]. Since our problem is a typical data estimation/prediction problem, we preferred the MLP neural network, a brief description of which is given below. Performance comparison of the network types may be a subject of further studies.

An MLP neural network consists of neurons (also called nodes) connected to each other by weights. Each input entering a processing unit is multiplied by an individual weight; the processing unit then sums the weighted inputs and produces the neuron’s output by applying the sum to an activation function. The activation function must be continuously differentiable (and preferably non-linear, so that the network behaves non-linearly). The functions most frequently used in MLP neural networks have a sigmoidal nonlinearity, namely the logistic and hyperbolic tangent functions [13].
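The computation of a single processing unit described above (weighted sum plus bias, then a hyperbolic tangent activation) can be sketched as follows; the function name and the example numbers are illustrative, not from the paper:

```python
import math

def neuron_output(inputs, weights, bias):
    """One MLP processing unit: the weighted sum of the inputs plus a bias,
    passed through a hyperbolic-tangent activation function."""
    s = sum(w * x for w, x in zip(weights, inputs)) + bias
    return math.tanh(s)

# With all-zero weights and bias the weighted sum is 0 and tanh(0) = 0.
print(neuron_output([0.88, 1.59, 0.48], [0.0, 0.0, 0.0], 0.0))  # 0.0
```

Because tanh saturates at ±1, the output of every hidden neuron stays in (−1, 1) regardless of the input magnitude, which is why the output layer of a regression network is usually left linear.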

The input variables are transferred via the neurons at the input of the network. This group of neurons is called the input layer, while the neurons producing the network’s responses at the output are called the output layer. An MLP network also includes one or more hidden layers between these two layers. In most nonlinear problems, two or more hidden layers improve the generalization ability of the network [13].

In order to train an MLP neural network, a learning algorithm is used to adjust the weights of the connections between the neurons in the layers. The performance of a learning algorithm generally changes from one problem to another; consequently, trial and error is mostly used to determine the most efficient algorithm for a given problem. The learning algorithms with satisfactory performance used in this work are gradient descent with momentum and adaptive learning rate backpropagation (GDX), Bayesian regularization backpropagation (BR), Levenberg-Marquardt backpropagation (LM), resilient backpropagation (RP), and Broyden-Fletcher-Goldfarb-Shanno quasi-Newton backpropagation (BFG).

GDX updates weight and bias values according to gradient descent with momentum and an adaptive learning rate. The LM algorithm updates weight and bias values according to Levenberg-Marquardt optimization, and it is the fastest (at the expense of more memory usage) backpropagation algorithm in the MATLAB Neural Network Toolbox. In the BR algorithm, the weight and bias values are updated according to LM optimization that minimizes a combination of squared errors and weights. The RP algorithm updates weight and bias values according to the resilient backpropagation scheme, modifying each weight by a learning parameter in order to minimize the overall error. In BFG, weight and bias values are updated according to the BFGS quasi-Newton method, where the new weight is computed as a function of the gradient and the current weight. More details about ANNs and learning algorithms can be found in [13,22].
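All of the listed algorithms refine the same basic rule: the weight moves against the gradient of a squared-error cost. The following toy sketch (our own illustration, not the paper's training code or data) shows that rule for a single weight fitted to y = 2x:

```python
# Basic gradient-descent weight update: w <- w - lr * dE/dw,
# where E = 0.5 * (w*x - y)^2, so dE/dw = (w*x - y) * x.
def train_one_weight(samples, lr=0.1, epochs=100):
    w = 0.0
    for _ in range(epochs):
        for x, y in samples:
            error = w * x - y      # prediction minus target
            w -= lr * error * x    # move against the error gradient
    return w

w = train_one_weight([(1.0, 2.0), (2.0, 4.0)])
print(round(w, 3))  # converges close to 2.0
```

GDX adds momentum and an adaptive learning rate on top of this rule, while LM, BR, and BFG replace the plain gradient step with second-order (curvature-aware) updates, which is why they typically need far fewer epochs.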

3. Reflective Color Sensing

A typical reflective color sensor consists of three parts: a target, a source illuminating the target, and a detector capturing the light reflected from the target. The most important parts of the design are the source and the detector. Reflective color sensors use either a broadband white-light source with three photo-detectors mounted behind individual color filters, or three narrow-band light sources generating the RGB colors with a single photo-detector. The sensor described in this section is of the second type, and its general structure is given in Figure 2.

In a reflective color sensor, three phenomena take place when the light beams impinge on the target: reflection, absorption, and transmission. A portion of the incident light is reflected, while the remainder is transmitted after partial absorption, depending on the properties of the target. Two types of reflected signal, i.e., diffuse and specular reflections, contribute to the detector’s response. The diffuse reflection carries the useful information about the surface color; however, varying specular reflection and fluctuating transparency can degrade the performance of the sensor. Fortunately, the specular reflection and the transparency remain almost constant for a given target, and small fluctuations in these parameters can be tolerated by intelligent approaches.

The measurement cycle starts by checking whether the device is connected to the PC. Then, the minimum and maximum analog voltages produced by the probe are determined in order to calibrate the probe according to the properties of the surface to be measured. The minimum and maximum voltages are obtained from black and white colored surfaces, respectively. After calibration, the automatic measurement intervals are selected for each LED to be driven and for the photodiode to be read; the time intervals are set to 100 ms in this work. The microcontroller first drives the R-LED during a duty cycle of 100 ms and reads the photodiode’s output at the same time (Figure 3). This step is repeated about 10 times on average, until the readouts remain stable. After completion of the R-LED period, the microcontroller sends the readouts to the PC and initiates the G-LED period, and so on. When the three periods are complete, (i) a new period can be started on the same surface, (ii) a new measurement can be started on another surface, or (iii) the procedure can be ended. A typical readout in terms of analog voltages and the corresponding RGB contents is given in Table 1.
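The cycle above can be sketched in simulation as follows. The probe's firmware is not published, so `read_photodiode` is a hypothetical stand-in for the real ADC read, its values simply resemble Table 1's "(Any)" surface, and the 100 ms duty-cycle timing is implied rather than actually waited on:

```python
def read_photodiode(led):
    # Placeholder ADC readout per LED channel (volts), not real hardware access.
    return {"R": 0.800, "G": 1.25, "B": 1.77}[led]

def measurement_cycle(n_reads=10):
    """Drive the R-, G-, and B-LEDs in turn, reading the photodiode during
    each LED's duty cycle and averaging about 10 readouts per channel."""
    readouts = {}
    for led in ("R", "G", "B"):
        samples = [read_photodiode(led) for _ in range(n_reads)]
        readouts[led] = sum(samples) / len(samples)  # averaged analog voltage
    return readouts

print(measurement_cycle())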

The sensor uses RGB LEDs controlled by a microcontroller as the light source. A photodiode converts the optical signal reflected from the target into analog voltages. The microcontroller then converts the analog voltages into digital data and sends them to the computer after completion of a measurement cycle. These digital data are used to determine the R, G, and B values, since the reflected intensity (i.e., the diffuse reflection) is proportional to the RGB content of the target. Figure 4 describes how the measurement cycle completes.
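A naive, non-ANN baseline for this voltage-to-RGB conversion would linearly rescale each channel between the black and white calibration readouts of Table 1. We show it only for contrast; the ANN models are needed precisely because the true relationship is non-linear:

```python
# (Vmin, Vmax) calibration per channel, taken from Table 1's black/white rows.
CAL = {"R": (0.159, 3.67), "G": (0.253, 3.66), "B": (0.163, 3.66)}

def linear_rgb(volts):
    """Map analog voltages to 0-255 RGB by linear scaling between the
    black (Vmin) and white (Vmax) calibration voltages of each channel."""
    rgb = []
    for ch in ("R", "G", "B"):
        vmin, vmax = CAL[ch]
        frac = (volts[ch] - vmin) / (vmax - vmin)
        rgb.append(max(0, min(255, round(255 * frac))))
    return tuple(rgb)

print(linear_rgb({"R": 3.67, "G": 3.66, "B": 3.66}))   # white -> (255, 255, 255)
print(linear_rgb({"R": 0.159, "G": 0.253, "B": 0.163}))  # black -> (0, 0, 0)
```

By construction this mapping is exact only at the two calibration points; the intermediate colors of Table 4 deviate from it, which is the gap the MLP models close.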

4. Results and Discussion

An MLP neural network having an input layer with three inputs, two hidden layers, and an output layer with three outputs is used. The activation function of the output layer is linear, while that of the hidden layers is the hyperbolic tangent function. The network was trained with five different training algorithms, and the number of hidden neurons in each layer was adjusted for the best performance. The inputs of the network are the analog voltages obtained from the photonic circuit, and the outputs are the RGB contents of the surface. In order to obtain optimal network structures and neuron numbers in the hidden layers, we used a trial-and-error approach. Although we started training the networks with a minimum number of hidden layers and neurons, the best performances were obtained with different structures. The network structures of the proposed models are summarized in Table 2.
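The forward pass of such a network, with the 3-10-5-3 structure used for the BR algorithm in Table 2, can be sketched as below. The weights here are random placeholders purely to make the sketch runnable; in the sensor they come from training on the measured dataset:

```python
import math, random

def layer(inputs, weights, biases, activation):
    """Apply one fully connected layer: weighted sums plus bias, then activation."""
    return [activation(sum(w * x for w, x in zip(w_row, inputs)) + b)
            for w_row, b in zip(weights, biases)]

def mlp_3_10_5_3(volts, params):
    """3 analog voltages -> 10 tanh units -> 5 tanh units -> 3 linear outputs."""
    (w1, b1), (w2, b2), (w3, b3) = params
    h1 = layer(volts, w1, b1, math.tanh)
    h2 = layer(h1, w2, b2, math.tanh)
    return layer(h2, w3, b3, lambda s: s)  # linear output: predicted R, G, B

random.seed(0)
def rand_layer(n_out, n_in):
    return ([[random.uniform(-1, 1) for _ in range(n_in)] for _ in range(n_out)],
            [0.0] * n_out)

params = [rand_layer(10, 3), rand_layer(5, 10), rand_layer(3, 5)]
print(mlp_3_10_5_3([0.80, 1.25, 1.77], params))  # three untrained output values
```

With trained weights, the three linear outputs are interpreted directly as the un-normalized R, G, and B contents reported in Table 4.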

Although the RGB content of any surface can be one of about 16 million possible combinations, we used a dataset consisting of only 246 samples for training the networks given in Table 2. The test dataset, which was randomly selected and completely different from the training dataset, consists of 33 samples. In order to constitute the dataset, we prepared a color palette, a small part of which is given in Figure 5, and printed it on glossy paper using a color laser printer. Then, for each color cell in the palette, we obtained the corresponding analog voltages by measuring with the probe shown in Figure 2.

The computational performances of the networks are summarized in Table 3. Note that in the table, the absolute error between the real values and the networks’ values should be considered rather than the usual error measures such as the root mean squared (RMS) and mean squared errors (MSEs), because the absolute deviation from the real RGB content is more instructive for this problem. Nevertheless, the MSEs of the models, calculated with the un-normalized values of the RGB contents, are also given in the table. As can be seen from Table 3, the BR and RP algorithms exhibit the best results in terms of maximum absolute error and time consumption.
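The two error figures of Table 3 can be sketched as follows; the function names are ours, and the example row is real value vs. BR prediction from row 1 of Table 4:

```python
def max_abs_error(real, predicted):
    """Largest absolute deviation over all RGB components of all samples."""
    return max(abs(r - p) for row_r, row_p in zip(real, predicted)
               for r, p in zip(row_r, row_p))

def mse(real, predicted):
    """Mean squared error computed on the un-normalized (0-255) RGB values."""
    diffs = [(r - p) ** 2 for row_r, row_p in zip(real, predicted)
             for r, p in zip(row_r, row_p)]
    return sum(diffs) / len(diffs)

real = [(90, 135, 45)]   # row 1 of Table 4, real RGB content
br = [(91, 146, 33)]     # row 1 of Table 4, BR prediction
print(max_abs_error(real, br))  # 12
print(round(mse(real, br), 2))
```

Over the full 33-sample test set these two functions reproduce the per-algorithm figures reported in Table 3.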

For a better comparison of the performances, the best results (BR and RP) and the worst result (GDX) of the proposed models are given in Table 4. In terms of RGB content, an absolute error of more than 30 is distinguishable by the human eye. In this context, the RP and GDX results contain two and four values, respectively, whose absolute errors exceed 30, while none of the absolute errors of the BR results exceeds 30. Visual results of the networks are given in Table 5 as another performance comparison.

As can be seen from Table 5, all proposed networks can successfully model the RGB content of a given surface from the analog voltages. In other words, the artificial intelligence models presented in this manner enable color regeneration from the analog outputs of the color sensor (or from any voltage information). Moreover, with the aid of inverse modeling supported by an intelligent technique, the sensor probe can be used as a colorimetric sensor that relates color changes to analog voltages.

5. Conclusions

A low-cost optical sensor based on reflective color sensing has been presented. Neural network models were used as an artificial intelligence technique to improve the color regeneration from the sensor signals; that is, the analog voltages of the sensor can be successfully converted to RGB colors. The artificial intelligence models presented in this work enable color regeneration from the analog outputs of the color sensor. In addition, inverse modeling supported by an intelligent technique enables the sensor probe to be used as a colorimetric sensor that relates color changes to analog voltages.

Acknowledgments

This work was supported by the Research Fund of Erciyes University (Project Number FBD09-1079).

References

  1. Endo, T; Yanagida, Y; Hatsuzawa, T. Colorimetric detection of volatile organic compounds using a colloidal crystal-based chemical sensor for environmental applications. Sens. Actuat. B 2007, 125, 589–595, doi:10.1016/j.snb.2007.03.003.
  2. Naydenova, I; Jallapuram, R; Toal, V; Martin, S. A visual indication of environmental humidity using a colour changing hologram recorded in a self developing photopolymer. Appl. Phys. Lett 2008, 92, 031109, doi:10.1063/1.2837454.
  3. Wang, XD; Chen, HX; Zhou, TY; Lin, ZJ; Zeng, JB; Xie, ZX; Xi, C; Wong, KY; Chen, GN; Wang, XR. Optical colorimetric sensor strip for direct readout glucose measurement. Biosens. Bioelectron 2009, 24, 3702–3705, doi:10.1016/j.bios.2009.05.018.
  4. Ricketts, SR; Douglas, P. A simple colorimetric luminescent oxygen sensor using a green LED with Ptoctaethylporphyrin in ethyl cellulose as the oxygen-responsive element. Sens. Actuat. B 2008, 135, 46–51, doi:10.1016/j.snb.2008.07.017.
  5. Ercag, E; Üzer, A; Apak, A. Selective spectrophotometric determination of TNT using a dicyclohexylamine-based colorimetric sensor. Talanta 2009, 78, 772–780, doi:10.1016/j.talanta.2008.12.042.
  6. O’ Toole, M; Shepherd, R; Wallace, GG; Diamond, D. Inkjet printed LED based pH chemical sensor for gas sensing. Anal. Chim. Acta 2009, 652, 308–314, doi:10.1016/j.aca.2009.07.019.
  7. Rastegarzadeh, R; Pourreza, N; Saeedi, I. An optical redox chemical sensor for determination of iodide. Talanta 2008, 77, 1032–1036.
  8. Sen, A; Albarella, JD; Carey, JR; Kim, P; McNamara, WB, III. Low-cost colorimetric sensor for the quantitative detection of gaseous hydrogen sulfide. Sens. Actuat. B 2008, 134, 234–237, doi:10.1016/j.snb.2008.04.046.
  9. Courbat, J; Briand, D; Wöllenstein, J; De Rooij, NF. Colorimetric gas sensors based on optical waveguides made on plastic foil. Procedia Chem 2009, 1, 576–579, doi:10.1016/j.proche.2009.07.144.
  10. Borecki, M. Intelligent fiber optic sensor for estimating the concentration of a mixture-design and working principle. Sensors 2007, 7, 384–399, doi:10.3390/s7030384.
  11. Saracoglu, ÖG. An artificial neural network approach for the prediction of absorption measurements of an evanescent field fiber sensor. Sensors 2008, 8, 1585–1594, doi:10.3390/s8031585.
  12. Efendioglu, HS; Yildirim, T; Fidanboylu, K. Prediction of force measurements of a microbend sensor based on an artificial neural network. Sensors 2009, 9, 7167–7176, doi:10.3390/s90907167.
  13. Haykin, S. Neural Networks: A Comprehensive Foundation, 2nd ed.; Prentice-Hall: Englewood Cliffs, NJ, USA, 1999.
  14. O’ Farrell, M; Lewis, E; Flanagan, C; Lyons, WB; Jackman, N. Design of a system that uses optical-fiber sensors and neural networks to control a large-scale industrial oven by monitoring the food quality online. IEEE Sensor J 2005, 5, 1407–1420, doi:10.1109/JSEN.2005.858963.
  15. O’ Farrell, M; Lewis, E; Flanagan, C; Lyons, W; Jackman, N. Comparison of k-NN and neural network methods in the classification of spectral data from an optical fibre-based sensor system used for quality control in the food industry. Sens Actuat B 2005, 111–112, 354–362.
  16. Afromowitz, MA; Van Liew, GS; Heimbach, DM. Clinical evaluation of burn injuries using an optical reflectance technique. IEEE Trans. Biomed. Eng. 1987, BME-34, 114–127.
  17. Laming, JE; Martino, A. PC color recognition using led and software techniques. IEEE Photonic. Technol. Lett 1993, 5, 583–586, doi:10.1109/68.215289.
  18. Yang, PK; Chen, JC; Chuang, YH. Improvement on reflective color measurement using a tri-color LED by multi-point calibration. Opt. Commun 2007, 272, 320–324, doi:10.1016/j.optcom.2006.11.051.
  19. Wefky, AM; Espinosa, F; Jiménez, JA; Santiso, E; Rodríguez, JM; Fernández, AJ. Alternative sensor system and MLP neural network for vehicle pedal activity estimation. Sensors 2010, 10, 3798–3814, doi:10.3390/s100403798.
  20. Foody, GM; Cutler, ME; McMorrow, J; Pelz, D; Tangki, H; Boyd, DS; Douglas, I. Mapping the biomass of bornean tropical rain forest from remotely sensed data. Global Ecol. Biogeogr 2001, 10, 379–387, doi:10.1046/j.1466-822X.2001.00248.x.
  21. Kiyan, T; Yildirim, T. Breast cancer diagnosis using statistical neural networks. Istanbul Univ.–J. Elect. Elect. Eng 2004, 4, 1149–1153.
  22. Neural Network Toolbox, Available online at http://www.mathworks.com/access/helpdesk/help/toolbox/nnet/ (accessed on June 25, 2010).
Figure 1. General structure of a multilayer perceptron (MLP).

Figure 2. (a) Schematic illustration of the RGB LED based color sensor. (b) Photo of the sensor tip. (c) Sensor probe connected to the PC.

Figure 3. Switching of the LEDs and the photodiode.

Figure 4. Flowchart of the measurement procedure.

Figure 5. A part of the color palette used to constitute the training/test dataset.
Table 1. Typical photodetector readout in terms of analog voltages depending on the RGB contents of the surface.

                 Photodetector readouts (volts)          RGB contents
Surface color    VR, R-LED   VG, G-LED   VB, B-LED     R     G     B
Black              0.159       0.253       0.163       0     0     0
(Any)              0.800       1.25        1.77      135    90   225
White              3.67        3.66        3.66      255   255   255
Table 2. Network structures of the proposed models.

Training algorithm                                                     Neuron numbers in the layers
                                                                       Input  1st hidden  2nd hidden  Output
Gradient descent with momentum and adaptive learning rate (GDX)          3        8           9          3
Bayesian regularization backpropagation (BR)                             3       10           5          3
Levenberg-Marquardt backpropagation (LM)                                 3        5           9          3
Resilient backpropagation (RP)                                           3        5           9          3
Broyden-Fletcher-Goldfarb-Shanno quasi-Newton backpropagation (BFG)      3       12           9          3
Table 3. Computational performances of the networks.

Algorithm   Maximum absolute error   Un-normalized MSE   Epoch number   Time consumption (s)
GDX                  38                     268              75,000             416
BR                   30                     259                 410              13
LM                   33                     270               2,200              62
RP                   32                     232               4,000              22
BFG                  34                     265               2,200             110
Table 4. The best results (BR and RP) and the worst result (GDX) of the proposed networks.

      Inputs (analog voltages) |            Outputs (RGB contents)
No     VR     VG     VB        | Real R G B    | BR R G B      | RP R G B      | GDX R G B
 1    0.88   1.59   0.48       |  90 135  45   |  91 146  33   | 102 142  19   |  98 150  30
 2    2.14   1.62   0.46       | 180 135  45   | 190 145  24   | 182 143  13   | 184 151  21
 3    0.35   0.73   0.91       |  45  45 135   |  42  39 165   |  48  33 164   |  40  37 172
 4    1.00   0.57   0.69       | 135  45 135   | 144  37 130   | 144  16 143   | 138  13 153
 5    3.55   0.71   0.87       | 225  45 135   | 248  17 157   | 240  16 145   | 243  15 165
 6    1.30   0.93   0.38       | 135  90  45   | 147  97  20   | 135 103  18   | 131  95  19
 7    1.91   3.64   0.73       | 135 225  45   | 110 244  15   | 104 244  20   |  97 249  11
 8    0.54   1.83   1.46       |  45 135 135   |  18 143 137   |  22 144 134   |  33 140 132
 9    0.94   3.25   1.27       |  45 225 135   |  54 225 152   |  30 226 137   |  66 222 145
10    0.58   0.64   0.93       | 135  45 180   | 126  25 200   | 133  17 200   | 123  18 205
11    0.60   1.77   0.82       |  45 135  90   |  29 142  78   |  31 147  81   |  46 140  87
12    0.48   1.96   2.58       |  45 135 225   |  20 146 251   |  29 135 240   |  21 143 233
13    1.36   1.93   0.90       | 135 135  90   | 123 136  95   | 137 150  75   | 130 145  93
14    3.66   1.89   0.97       | 225 135  90   | 251 134  84   | 242 141  96   | 255 130  92
15    1.28   1.06   1.11       | 135  90 135   | 145  93 149   | 147  93 157   | 137  87 139
16    3.66   1.00   1.01       | 225  90 135   | 253  67 139   | 252  68 148   | 253  60 133
17    1.80   3.00   1.01       | 135 180  90   | 154 170 115   | 149 205 104   | 135 196  95
18    0.88   2.03   1.60       |  90 135 135   |  88 147 147   | 100 149 129   | 107 148 144
19    1.23   3.64   1.48       |  90 225 135   |  76 253 153   |  92 249 154   |  91 253 151
20    0.80   1.25   1.77       | 135  90 225   | 134  97 227   | 139  90 239   | 147  89 238
21    0.63   1.97   2.59       |  90 135 225   |  98 142 251   |  89 137 242   |  93 141 252
22    2.36   1.73   1.19       | 180 135 135   | 182 125 131   | 196 140 121   | 177 141 127
23    1.61   2.96   1.55       | 135 180 135   | 141 169 164   | 137 198 139   | 131 187 150
24    1.17   1.98   2.07       | 135 135 180   | 137 134 186   | 133 143 181   | 146 152 174
25    1.94   1.62   1.64       | 180 135 180   | 179 120 180   | 182 131 179   | 167 138 162
26    3.65   1.88   1.85       | 225 135 180   | 251 128 178   | 254 132 188   | 254 139 190
27    2.16   2.93   1.40       | 180 180 135   | 170 157 154   | 174 197 127   | 150 184 139
28    1.45   3.01   2.36       | 135 180 180   | 143 183 204   | 133 200 182   | 140 193 200
29    1.53   3.65   2.17       | 135 225 180   | 123 254 197   | 123 249 186   | 118 249 184
30    3.09   3.66   2.06       | 180 225 135   | 199 250 152   | 205 241 144   | 200 251 145
31    1.65   1.75   2.16       | 180 135 225   | 183 123 227   | 175 132 227   | 180 140 213
32    2.95   1.74   2.11       | 225 135 225   | 226 123 242   | 243 129 225   | 232 139 229
33    1.67   3.65   3.56       | 135 225 225   | 162 238 254   | 143 247 255   | 154 236 253
Table 5. Visual comparisons of the real RGB contents and the proposed neural network results.

(Rows 1–33 compare the real surface color with the colors regenerated by the BR, RP, LM, BFG, and GDX networks; the entries are color swatches shown as images in the original publication.)