An Identity Recognition Model Based on RF-RFE: Utilizing Eye-Movement Data
Abstract
1. Introduction
2. Literature Review
2.1. Suspect Identification Research
2.2. Eye-Movement Technology in Identifying Criminal Suspects
2.3. Application of Machine Learning in Identity Recognition
3. Experiment
3.1. Experimental Design
3.1.1. Experiment Preparation
3.1.2. Experimental Procedure
3.2. Data Analysis
Analysis of the above Indicators
4. An RF-RFE Model Based on Interpretable Machine Learning
4.1. Model Construction
Algorithm 1: Random Forest construction process
Input: training set D = {(x1, y1), (x2, y2), …, (xm, ym)}; attribute set A = {a1, a2, …, ad}.
Process: Function TreeGenerate(D, A)
1: Generate node;
2: if the samples in D all belong to the same category C then
3: Mark node as a class C leaf node; return
4: end if
5: if A = ∅ OR the samples in D take the same value on A then
6: Mark node as a leaf node, with its category marked as the class with the largest number of samples in D; return
7: end if
8: Select the optimal partition attribute a* from A;
9: for each value a*v of a* do
10: Generate a branch for node; let Dv denote the subset of samples in D that take value a*v on a*;
11: if Dv is empty then
12: Mark the branch node as a leaf node and its class as the class with the most samples in D; return
13: else
14: Take TreeGenerate(Dv, A \ {a*}) as the branch node
15: end if
16: end for
Output: a decision tree with node as the root node
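Algorithm 1 is the standard recursive TreeGenerate routine run for each tree in the forest. The following is a minimal runnable sketch, assuming information gain as the partition criterion and dict-based samples; both are illustrative choices, not details of the paper's implementation:

```python
import math
from collections import Counter

def entropy(labels):
    # Shannon entropy of a label list
    n = len(labels)
    return -sum((c / n) * math.log2(c / n) for c in Counter(labels).values())

def info_gain(D, a):
    # Information gain of attribute a over dataset D = [(features_dict, label)]
    before = entropy([y for _, y in D])
    after = 0.0
    for v in {x[a] for x, _ in D}:
        sub = [y for x, y in D if x[a] == v]
        after += len(sub) / len(D) * entropy(sub)
    return before - after

def tree_generate(D, A):
    """Mirror of Algorithm 1: D is a list of (features_dict, label), A a set of attribute names."""
    labels = [y for _, y in D]
    majority = Counter(labels).most_common(1)[0][0]
    if len(set(labels)) == 1:                      # lines 2-4: single-class leaf
        return labels[0]
    if not A or all(x[a] == D[0][0][a] for x, _ in D for a in A):  # lines 5-7
        return majority
    best = max(A, key=lambda a: info_gain(D, a))   # line 8: optimal attribute
    node = {"attr": best, "branches": {}, "default": majority}
    for v in {x[best] for x, _ in D}:              # lines 9-16: one branch per value
        Dv = [(x, y) for x, y in D if x[best] == v]
        node["branches"][v] = tree_generate(Dv, A - {best})
    return node

def predict(node, x):
    # Unseen attribute values fall back to the majority class (lines 11-12)
    while isinstance(node, dict):
        node = node["branches"].get(x[node["attr"]], node["default"])
    return node
```

A random forest then trains many such trees on bootstrap resamples of D (with random attribute subsets) and aggregates their predictions by majority vote.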
Algorithm 2: RFE algorithm process
1: for each resampling result do
2: Divide the data into a training set and a test set by resampling;
3: Train the model on the training set using the feature variables;
4: Evaluate the model on the test set;
5: Calculate and rank the importance of each feature variable;
6: Remove the least important features;
7: end for
8: Decide on the proper number of feature variables;
9: Estimate the set of feature variables ultimately used to build the model.
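The elimination loop above can be sketched as a generic routine with the model-specific parts plugged in. The `fit`, `evaluate`, and `importance` callables and the 70/30 split are illustrative assumptions, not details from the paper:

```python
import random

def rfe(X, y, feature_names, fit, evaluate, importance, n_rounds=10, seed=0):
    """Recursive feature elimination over resampled splits (Algorithm 2).
    fit(X, y, feats) -> model; evaluate(model, X, y, feats) -> score;
    importance(model, feats) -> {feat: weight}. All are caller-supplied."""
    rng = random.Random(seed)
    history = []                       # (n features remaining, mean test score)
    feats = list(feature_names)
    while len(feats) > 1:
        scores = []
        ranks = {f: 0.0 for f in feats}
        for _ in range(n_rounds):                       # line 1: each resampling
            idx = list(range(len(y)))
            rng.shuffle(idx)                            # line 2: random split
            cut = int(0.7 * len(idx))
            tr, te = idx[:cut], idx[cut:]
            model = fit([X[i] for i in tr], [y[i] for i in tr], feats)  # line 3
            scores.append(evaluate(model, [X[i] for i in te],
                                   [y[i] for i in te], feats))          # line 4
            for f, w in importance(model, feats).items():               # line 5
                ranks[f] += w
        history.append((len(feats), sum(scores) / len(scores)))
        feats.remove(min(feats, key=lambda f: ranks[f]))  # line 6: drop weakest
    return history   # line 8: inspect to choose the final feature count
```

Inspecting `history` (feature count vs. mean test score) corresponds to steps 8–9: pick the count where the score peaks, then refit on the surviving features.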
4.2. Analysis of Results
5. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
Appendix A
Portrait

Category | AOI | Fixation Time (ms) | Fixation Count | Pupil Diameter (mm) | Saccade Frequency | Blink Frequency
---|---|---|---|---|---|---
Innocent | Involved | 1201.64 ± 916.05 | 3.77 ± 2.68 | 3.98 ± 0.59 | 2.98 ± 0.77 | 0.59 ± 0.29
Innocent | Uninvolved | 1128.47 ± 740.79 | 4.05 ± 2.71 | 3.91 ± 0.53 | 3.01 ± 0.34 | 0.59 ± 0.22
Informed | Involved | 1328.44 ± 881.53 | 5.03 ± 3.27 | 4.31 ± 0.45 | 3.16 ± 0.79 | 0.43 ± 0.37
Informed | Uninvolved | 1075.04 ± 769.88 | 4.21 ± 2.86 | 4.06 ± 0.42 | 3.07 ± 0.68 | 0.47 ± 0.30
Crime | Involved | 2117.03 ± 956.41 | 7.00 ± 2.88 | 4.73 ± 0.47 | 2.90 ± 0.94 | 0.36 ± 0.30
Crime | Uninvolved | 1231.25 ± 877.16 | 4.70 ± 3.16 | 4.28 ± 0.38 | 3.00 ± 0.69 | 0.35 ± 0.21
Object

Category | AOI | Fixation Time (ms) | Fixation Count | Pupil Diameter (mm) | Saccade Frequency | Blink Frequency
---|---|---|---|---|---|---
Innocent | Involved | 1183.18 ± 648.78 | 4.15 ± 2.44 | 3.86 ± 0.50 | 4.42 ± 0.31 | 0.35 ± 0.34
Innocent | Uninvolved | 824.53 ± 336.71 | 3.44 ± 1.24 | 3.90 ± 0.49 | 3.38 ± 0.32 | 0.45 ± 0.18
Informed | Involved | 1586.08 ± 840.08 | 6.46 ± 3.22 | 4.36 ± 0.48 | 3.42 ± 0.92 | 0.25 ± 0.28
Informed | Uninvolved | 624.63 ± 280.03 | 2.79 ± 1.20 | 4.50 ± 2.13 | 3.32 ± 0.64 | 0.34 ± 0.25
Crime | Involved | 1214.04 ± 679.00 | 5.39 ± 2.92 | 4.54 ± 0.42 | 3.31 ± 0.96 | 0.30 ± 0.27
Crime | Uninvolved | 646.20 ± 274.59 | 2.97 ± 1.15 | 4.42 ± 0.31 | 3.39 ± 0.70 | 0.27 ± 0.19
Scene

Category | AOI | Fixation Time (ms) | Fixation Count | Pupil Diameter (mm) | Saccade Frequency | Blink Frequency
---|---|---|---|---|---|---
Innocent | Involved | 368.03 ± 363.73 | 1.54 ± 1.16 | 4.03 ± 0.60 | 3.52 ± 0.66 | 0.46 ± 0.24
Innocent | Uninvolved | 426.12 ± 150.17 | 1.70 ± 0.59 | 3.90 ± 0.47 | 3.36 ± 0.50 | 0.49 ± 0.23
Informed | Involved | 754.68 ± 328.78 | 2.92 ± 1.10 | 4.49 ± 0.55 | 3.46 ± 0.79 | 0.33 ± 0.28
Informed | Uninvolved | 256.37 ± 112.58 | 1.15 ± 0.49 | 4.21 ± 0.43 | 3.42 ± 0.77 | 0.33 ± 0.24
Crime | Involved | 1363.66 ± 394.62 | 5.54 ± 1.64 | 4.86 ± 0.42 | 3.42 ± 0.60 | 0.30 ± 0.21
Crime | Uninvolved | 289.74 ± 141.96 | 1.30 ± 0.52 | 4.35 ± 0.25 | 3.39 ± 0.54 | 0.32 ± 0.23
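Mean ± SD cells like those in the three tables above can be produced with a short group-by over per-trial records. The record keys (`category`, `aoi`, and the feature name) are hypothetical stand-ins for however the raw export is keyed:

```python
import statistics
from collections import defaultdict

def summarize(records, feature):
    """Mean and sample SD of one eye-movement feature per (category, AOI) cell.
    records: iterable of dicts with 'category', 'aoi', and the feature key."""
    cells = defaultdict(list)
    for r in records:
        cells[(r["category"], r["aoi"])].append(r[feature])
    # statistics.stdev is the sample SD, matching the usual mean ± SD reporting
    return {k: (statistics.mean(v), statistics.stdev(v)) for k, v in cells.items()}
```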
(a)

Number | ACC | Kappa | Number | ACC | Kappa | Number | ACC | Kappa
---|---|---|---|---|---|---|---|---
1 | 63.0% | 0.11 | 11 | 73.9% | 0.33 | 21 | 76.1% | 0.37 |
2 | 52.2% | −0.18 | 12 | 73.9% | 0.33 | 22 | 80.4% | 0.45 |
3 | 73.9% | 0.36 | 13 | 84.8% | 0.60 | 23 | 71.7% | 0.00 |
4 | 73.9% | 0.33 | 14 | 69.6% | 0.28 | 24 | 73.9% | 0.25 |
5 | 71.7% | 0.21 | 15 | 71.7% | 0.25 | 25 | 80.4% | 0.42 |
6 | 67.4% | 0.18 | 16 | 73.9% | 0.25 | 26 | 87.0% | 0.63 |
7 | 71.7% | 0.25 | 17 | 78.3% | 0.38 | 27 | 80.4% | 0.45 |
8 | 73.9% | 0.29 | 18 | 76.1% | 0.25 | 28 | 78.3% | 0.38 |
9 | 71.7% | 0.25 | 19 | 82.6% | 0.47 | 29 | 84.8% | 0.60 |
10 | 69.6% | 0.08 | 20 | 84.8% | 0.60 | 30 | 80.4% | 0.48 |
(b)

Number | ACC | Kappa | Number | ACC | Kappa | Number | ACC | Kappa
---|---|---|---|---|---|---|---|---
1 | 66.7% | 0.60 | 11 | 84.1% | 0.81 | 21 | 85.5% | 0.84 |
2 | 65.2% | 0.58 | 12 | 79.7% | 0.75 | 22 | 88.4% | 0.84 |
3 | 85.5% | 0.82 | 13 | 81.2% | 0.77 | 23 | 82.6% | 0.84 |
4 | 78.3% | 0.73 | 14 | 84.1% | 0.80 | 24 | 85.5% | 0.84 |
5 | 75.4% | 0.70 | 15 | 84.1% | 0.80 | 25 | 84.1% | 0.84 |
6 | 84.1% | 0.80 | 16 | 85.5% | 0.82 | 26 | 87.0% | 0.84 |
7 | 75.4% | 0.70 | 17 | 84.1% | 0.80 | 27 | 87.0% | 0.84 |
8 | 87.0% | 0.84 | 18 | 81.2% | 0.77 | 28 | 88.4% | 0.84 |
9 | 79.7% | 0.75 | 19 | 81.2% | 0.84 | 29 | 84.1% | 0.84 |
10 | 82.6% | 0.79 | 20 | 82.6% | 0.84 | 30 | 88.4% | 0.84 |
(c)

Number | ACC | Kappa | Number | ACC | Kappa | Number | ACC | Kappa
---|---|---|---|---|---|---|---|---
1 | 47.2% | −0.13 | 11 | 75.0% | 0.45 | 21 | 86.1% | 0.67 |
2 | 58.3% | 0.11 | 12 | 83.3% | 0.63 | 22 | 88.9% | 0.75 |
3 | 66.7% | 0.25 | 13 | 63.9% | 0.00 | 23 | 88.9% | 0.76 |
4 | 75.0% | 0.45 | 14 | 77.8% | 0.50 | 24 | 88.9% | 0.76 |
5 | 72.2% | 0.40 | 15 | 69.4% | 0.30 | 25 | 86.1% | 0.68 |
6 | 72.2% | 0.40 | 16 | 91.7% | 0.82 | 26 | 86.1% | 0.69 |
7 | 85.6% | 0.56 | 17 | 91.7% | 0.82 | 27 | 88.9% | 0.75 |
8 | 86.1% | 0.69 | 18 | 88.9% | 0.75 | 28 | 86.1% | 0.69 |
9 | 72.2% | 0.35 | 19 | 88.9% | 0.75 | 29 | 91.7% | 0.81 |
10 | 69.4% | 0.33 | 20 | 91.7% | 0.82 | 30 | 86.1% | 0.69 |
(d)

Number | ACC | Kappa | Number | ACC | Kappa | Number | ACC | Kappa
---|---|---|---|---|---|---|---|---
1 | 69.6% | 0.37 | 11 | 71.4% | 0.38 | 21 | 80.4% | 0.56 |
2 | 67.9% | 0.33 | 12 | 69.6% | 0.33 | 22 | 82.1% | 0.62 |
3 | 76.8% | 0.50 | 13 | 71.4% | 0.40 | 23 | 76.8% | 0.52 |
4 | 64.3% | 0.22 | 14 | 62.5% | 0.19 | 24 | 83.9% | 0.65 |
5 | 67.9% | 0.32 | 15 | 67.9% | 0.30 | 25 | 73.2% | 0.44 |
6 | 76.8% | 0.50 | 16 | 82.1% | 0.61 | 26 | 76.8% | 0.51 |
7 | 73.2% | 0.44 | 17 | 82.1% | 0.63 | 27 | 82.1% | 0.62 |
8 | 75.0% | 0.46 | 18 | 83.9% | 0.66 | 28 | 82.1% | 0.63 |
9 | 75.0% | 0.46 | 19 | 78.6% | 0.53 | 29 | 76.8% | 0.49 |
10 | 67.9% | 0.33 | 20 | 85.7% | 0.69 | 30 | 82.1% | 0.62 |
(e)

Number | ACC | Kappa | Number | ACC | Kappa | Number | ACC | Kappa
---|---|---|---|---|---|---|---|---
1 | 60.9% | 0.07 | 11 | 71.0% | 0.29 | 21 | 85.5% | 0.65 |
2 | 69.6% | 0.29 | 12 | 75.4% | 0.39 | 22 | 84.1% | 0.61 |
3 | 71.0% | 0.32 | 13 | 73.9% | 0.34 | 23 | 81.2% | 0.53 |
4 | 76.8% | 0.44 | 14 | 66.7% | 0.19 | 24 | 82.6% | 0.57 |
5 | 85.5% | 0.69 | 15 | 69.6% | 0.26 | 25 | 81.2% | 0.53 |
6 | 85.5% | 0.34 | 16 | 84.1% | 0.62 | 26 | 85.5% | 0.64 |
7 | 69.6% | 0.24 | 17 | 84.1% | 0.61 | 27 | 82.6% | 0.58 |
8 | 75.4% | 0.41 | 18 | 85.5% | 0.65 | 28 | 82.6% | 0.57 |
9 | 65.2% | 0.20 | 19 | 87.0% | 0.68 | 29 | 81.2% | 0.55 |
10 | 73.9% | 0.36 | 20 | 84.1% | 0.62 | 30 | 84.1% | 0.64 |
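Each run in panels (a)–(e) reports overall accuracy (ACC) and Cohen's kappa, which corrects accuracy for chance agreement. Both follow directly from the true and predicted labels:

```python
from collections import Counter

def accuracy_and_kappa(y_true, y_pred):
    """Overall accuracy and Cohen's kappa for one classification run."""
    n = len(y_true)
    p_o = sum(t == p for t, p in zip(y_true, y_pred)) / n   # observed agreement
    t_counts, p_counts = Counter(y_true), Counter(y_pred)
    # Expected chance agreement: sum over classes of the product of
    # true and predicted marginal frequencies
    p_e = sum(t_counts[c] * p_counts.get(c, 0) for c in t_counts) / n ** 2
    kappa = (p_o - p_e) / (1 - p_e) if p_e != 1 else 1.0
    return p_o, kappa
```

This is why a run can pair a respectable ACC with a kappa near zero, as in row 23 of panel (a): when the classifier mostly predicts the majority class, observed agreement barely exceeds chance agreement.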
References
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
© 2023 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Liu, X.; Ding, N.; Shi, J.; Sun, C. An Identity Recognition Model Based on RF-RFE: Utilizing Eye-Movement Data. Behav. Sci. 2023, 13, 620. https://doi.org/10.3390/bs13080620