Detection of Moving Point Symbols on Cartographic Backgrounds
Abstract
Introduction
Methods
Experimental design
Eye Movement Analysis
- “Success metric”: The percentage of subjects who detect the moving point symbol. Values lie in the range [0,1], where 0 indicates that no subject detected the moving point symbol and 1 indicates that all subjects detected it.
- “Duration metric”: The average total duration of the fixations linked with the detection of the moving point symbol, computed as the mean of the corresponding values over all subjects' recordings.
- “Duration percentage metric”: The average proportion of fixation duration devoted to the detection of the moving point symbol, computed as the mean of the corresponding values over all subjects' recordings. Values lie in the range [0,1], where 0 indicates that none of the fixation time within the visual scene is linked with the detection of the moving point symbol and 1 indicates that all of it is.
- “Number metric”: The average number of fixations related to the detection of the moving point symbol.
- “Number percentage metric”: The average ratio of the number of fixations linked with the detection of the moving point symbol to the total number of fixations in the visual scene. Values lie in the range [0,1], where 0 indicates that none of the fixations performed within the visual scene are linked with the detection of the moving point symbol and 1 indicates that all of them are.
- “Time to first fixation metric”: The average time elapsed between the appearance of a new frame and the first fixation (by central vision) on the moving point symbol.
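As an illustration, the first five metrics can be computed from per-subject fixation records. The sketch below assumes a simple data structure (a list of fixations per subject, each flagged as detection-related or not); the field names and sample values are hypothetical, not taken from the study:

```python
# Hypothetical per-subject fixation records; "on_target" marks fixations
# linked with the detection of the moving point symbol.
subjects = {
    "s1": [{"dur_ms": 220, "on_target": True},
           {"dur_ms": 180, "on_target": False},
           {"dur_ms": 300, "on_target": True}],
    "s2": [{"dur_ms": 250, "on_target": False},
           {"dur_ms": 210, "on_target": False}],
}

def mean(xs):
    return sum(xs) / len(xs)

# Success metric: fraction of subjects with at least one detection fixation.
success = mean([1.0 if any(f["on_target"] for f in fx) else 0.0
                for fx in subjects.values()])

# Duration metric: mean total detection-linked fixation duration per subject.
duration = mean([sum(f["dur_ms"] for f in fx if f["on_target"])
                 for fx in subjects.values()])

# Duration percentage metric: mean share of fixation time linked to detection.
duration_pct = mean([sum(f["dur_ms"] for f in fx if f["on_target"]) /
                     sum(f["dur_ms"] for f in fx)
                     for fx in subjects.values()])

# Number metric: mean count of detection-linked fixations per subject.
number = mean([sum(f["on_target"] for f in fx)
               for fx in subjects.values()])

# Number percentage metric: mean share of fixations linked to detection.
number_pct = mean([sum(f["on_target"] for f in fx) / len(fx)
                   for fx in subjects.values()])
```

The time-to-first-fixation metric would additionally require fixation onset timestamps relative to the appearance of each frame, which this minimal structure omits.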
Results
Overall visual reaction
Duration threshold
Discussion
Conclusion
Appendix
Point Symbol | X (pixels) | Y (pixels) | Distance (pixels) | Duration A (msec) | Duration B (msec)
---|---|---|---|---|---
T1 | 73 | 101 | - | 2000 | 700
T2 | 1039 | 743 | 1160 | 800 | 450
T3 | 270 | 873 | 780 | 400 | 300
T4 | 1245 | 86 | 1260 | 200 | 250
T5 | 932 | 78 | 300 | 2000 | 700
T6 | 465 | 455 | 600 | 600 | 350
T7 | 606 | 566 | 180 | 400 | 300
T8 | 1198 | 941 | 700 | 800 | 450
T9 | 1242 | 1008 | 80 | 600 | 350
T10 | 49 | 38 | 1540 | 200 | 250
T11 | 156 | 568 | 540 | 100 | 150
T12 | 1200 | 220 | 1100 | 2000 | 700
T13 | 861 | 250 | 340 | 3000 | 900
T14 | 288 | 864 | 840 | 600 | 350
T15 | 1221 | 47 | 1240 | 1000 | 500
T16 | 1252 | 155 | 100 | 800 | 450
T17 | 74 | 911 | 1400 | 100 | 150
T18 | 634 | 915 | 560 | 1000 | 500
T19 | 610 | 56 | 860 | 400 | 300
T20 | 1029 | 898 | 940 | 2000 | 700
T21 | 753 | 389 | 580 | 400 | 300
T22 | 324 | 489 | 440 | 800 | 450
T23 | 125 | 501 | 200 | 400 | 300
T24 | 1223 | 18 | 1200 | 600 | 350
T25 | 335 | 224 | 920 | 3000 | 900
T26 | 531 | 395 | 260 | 800 | 450
T27 | 212 | 950 | 640 | 3000 | 900
T28 | 1137 | 318 | 1120 | 200 | 250
T29 | 89 | 943 | 1220 | 100 | 150
T30 | 159 | 432 | 500 | 600 | 350
T31 | 810 | 741 | 720 | 2000 | 700
T32 | 1224 | 836 | 420 | 100 | 150
T33 | 79 | 179 | 1320 | 400 | 300
T34 | 329 | 53 | 280 | 1000 | 500
T35 | 1000 | 974 | 1140 | 1000 | 500
T36 | 543 | 392 | 740 | 200 | 250
T37 | 619 | 599 | 220 | 600 | 350
T38 | 448 | 938 | 380 | 3000 | 900
T39 | 1233 | 57 | 1180 | 800 | 450
T40 | 122 | 972 | 1440 | 100 | 150
T41 | 41 | 176 | 800 | 2000 | 700
T42 | 1244 | 612 | 1280 | 200 | 250
T43 | 77 | 39 | 1300 | 3000 | 900
T44 | 490 | 580 | 680 | 200 | 250
T45 | 1144 | 666 | 660 | 1000 | 500
T46 | 1025 | 678 | 120 | 100 | 150
T47 | 1004 | 644 | 40 | 3000 | 900
T48 | 1198 | 785 | 240 | 1000 | 500
T49 | 102 | 14 | 1340 | 2000 | 700
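The "Distance (pixels)" column above appears to give the straight-line (Euclidean) distance of each symbol from the previous symbol's position, coarsely rounded. The check below is a reconstruction from the tabled coordinates, not a procedure stated in the text:

```python
import math

# (X, Y) positions of the first four point symbols (T1-T4) from the table.
positions = [(73, 101), (1039, 743), (270, 873), (1245, 86)]

# Euclidean distance between successive symbols; the tabled "Distance"
# values (1160, 780, 1260) match these to within roughly 10 px,
# consistent with rounding to a coarse grid (an observation, not a
# rule stated in the text).
dists = [math.hypot(x2 - x1, y2 - y1)
         for (x1, y1), (x2, y2) in zip(positions, positions[1:])]
```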
Point Symbol | X (pixels) | Y (pixels) | Duration (msec) | Intensity change
---|---|---|---|---
T1 | 73 | 101 | 475 | 0.00%
T2 | 1039 | 743 | 750 | 35.00%
T3 | 270 | 873 | 200 | 70.00%
T4 | 1245 | 86 | 612.5 | 35.00%
T5 | 932 | 78 | 337.5 | 70.00%
T6 | 465 | 455 | 612.5 | 0.00%
T7 | 606 | 566 | 475 | 0.00%
T8 | 1198 | 941 | 750 | 70.00%
T9 | 1242 | 1008 | 337.5 | 0.00%
T10 | 49 | 38 | 200 | 35.00%
T11 | 156 | 568 | 475 | 70.00%
T12 | 1200 | 220 | 337.5 | 70.00%
T13 | 861 | 250 | 475 | 35.00%
T14 | 288 | 864 | 612.5 | 35.00%
T15 | 1221 | 47 | 200 | 0.00%
T16 | 1252 | 155 | 750 | 0.00%
T17 | 74 | 911 | 475 | 70.00%
T18 | 634 | 915 | 612.5 | 35.00%
T19 | 610 | 56 | 200 | 0.00%
T20 | 1029 | 898 | 337.5 | 35.00%
T21 | 753 | 389 | 475 | 35.00%
T22 | 324 | 489 | 200 | 70.00%
T23 | 125 | 501 | 612.5 | 0.00%
T24 | 1223 | 18 | 200 | 70.00%
T25 | 335 | 224 | 475 | 0.00%
T26 | 531 | 395 | 612.5 | 0.00%
T27 | 212 | 950 | 200 | 35.00%
T28 | 1137 | 318 | 612.5 | 70.00%
T29 | 89 | 943 | 337.5 | 35.00%
T30 | 159 | 432 | 612.5 | 70.00%
T31 | 810 | 741 | 337.5 | 0.00%
T32 | 1224 | 836 | 200 | 0.00%
T33 | 79 | 179 | 475 | 70.00%
T34 | 329 | 53 | 750 | 35.00%
T35 | 1000 | 974 | 337.5 | 0.00%
T36 | 543 | 392 | 750 | 35.00%
T37 | 619 | 599 | 612.5 | 70.00%
T38 | 448 | 938 | 337.5 | 70.00%
T39 | 1233 | 57 | 475 | 35.00%
T40 | 122 | 972 | 750 | 0.00%
T41 | 41 | 176 | 750 | 70.00%
T42 | 1244 | 612 | 337.5 | 35.00%
T43 | 77 | 39 | 750 | 70.00%
T44 | 490 | 580 | 200 | 35.00%
T45* | 578 | 599 | 750 | 0.00%
T46* | 740 | 422 | 475 | 0.00%
References
- Alaçam, Ö., and M. Dalci. 2009. A Usability Study of WebMaps with Eye Tracking Tool: The Effect of Iconic Representation of Information. In Human-Computer Interaction. Edited by J. A. Jacko. LNCS 5610. Berlin, Heidelberg: Springer-Verlag, pp. 12–21.
- Bargiota, T., V. Mitropoulos, V. Krassanakis, and B. Nakos. 2013. Measuring locations of critical points along cartographic lines with eye movements. Proceedings of the 26th International Cartographic Association Conference, Dresden, Germany.
- Bezdek, J. C. 1981. Pattern Recognition with Fuzzy Objective Function Algorithms. New York: Plenum Press.
- Bertin, J. 1983. Semiology of Graphics: Diagrams, Networks, Maps. Madison: University of Wisconsin Press (French edition 1967).
- Blignaut, P. 2009. Fixation identification: the optimum threshold for a dispersion algorithm. Attention, Perception, & Psychophysics 71, 4: 881–895.
- Blignaut, P., and T. Beelders. 2009. The effect of fixational eye movements on fixation identification with a dispersion-based fixation detection algorithm. Journal of Eye Movement Research 2, (5):4: 1–14.
- Bodade, R., and S. Talbar. 2010. Novel approach of accurate iris localisation from high resolution eye images suitable for fake iris detection. International Journal of Information Technology and Knowledge Management 3, 2: 685–690.
- Camilli, M., R. Nacchia, M. Terenzi, and F. Di Nocera. 2008. ASTEF: A simple tool for examining fixations. Behavior Research Methods 40, 2: 373–382.
- Ciołkosz-Styk, A. 2012. The visual search method in map perception research. Geoinformation Issues 4, 1: 33–42.
- Coey, C. A., S. Wallot, M. J. Richardson, and G. Van Orden. 2012. On the Structure of Measurement Noise in Eye-Tracking. Journal of Eye Movement Research 5, (4):5: 1–10.
- DiBiase, D., A. M. MacEachren, J. B. Krygier, and C. Reeves. 1992. Animation and the Role of Map Design in Scientific Visualization. Cartography and Geographic Information Systems 19, 4: 201–214.
- Dong, W., H. Liao, F. Xu, Z. Liu, and S. Zhang. 2014. Using eye tracking to evaluate the usability of animated maps. Science China Earth Sciences 57, 3: 512–522.
- Duchowski, A. T. 2002. A breadth-first survey of eye-tracking applications. Behavior Research Methods, Instruments & Computers 34, 4: 455–470.
- Duchowski, A. T. 2007. Eye Tracking Methodology: Theory & Practice, 2nd ed. London: Springer-Verlag.
- Fabrikant, S. 2005. Towards an understanding of geovisualization with dynamic displays: Issues and Prospects. In Proceedings, Reasoning with Mental and External Diagrams: Computational Modeling and Spatial Assistance. Stanford: American Association for Artificial Intelligence.
- Fish, C. 2010. Change detection in animated choropleth maps. MSc Thesis, Michigan State University.
- Fish, C., K. P. Goldsberry, and S. Battersby. 2011. Change blindness in animated choropleth maps: an empirical study. Cartography and Geographic Information Science 38, 4: 350–362.
- Garlandini, S., and S. I. Fabrikant. 2009. Evaluating the Effectiveness and Efficiency of Visual Variables for Geographic Information Visualization. In COSIT 2009. Edited by K. S. Hornsby et al. LNCS 5756. Berlin, Heidelberg: Springer-Verlag, pp. 195–211.
- Goldberg, J. H., and X. P. Kotval. 1999. Computer interface evaluation using eye movements: methods and constructs. International Journal of Industrial Ergonomics 24: 631–645.
- Griffin, A. L., and S. Bell. 2009. Applications of signal detection theory to geographic information science. Cartographica: The International Journal for Geographic Information and Geovisualization 44, 3: 145–158.
- Griffin, A. L., A. M. MacEachren, F. Hardisty, E. Steiner, and B. Li. 2006. A Comparison of Animated Maps with Static Small-Multiple Maps for Visually Identifying Space-Time Clusters. Annals of the Association of American Geographers 96, 4: 740–753.
- Harrower, M. 2007a. The Cognitive Limits of Animated Maps. Cartographica 42, 4: 349–357.
- Harrower, M. 2007b. Unclassed animated choropleth maps. The Cartographic Journal 44, 4: 313–320.
- Harrower, M., and S. Fabrikant. 2008. The role of map animation for geographic visualization. In Geographic Visualization. Edited by Dodge, McDerby and Turner. London: John Wiley & Sons, pp. 49–65.
- Jacob, R. J. K., and K. S. Karn. 2003. Eye Tracking in Human-Computer Interaction and Usability Research: Ready to Deliver the Promises. In The Mind's Eye: Cognitive and Applied Aspects of Eye Movement Research. Edited by J. Hyönä, R. Radach and H. Deubel. Oxford: Elsevier Science, pp. 573–605.
- Jenks, G. F. 1973. Visual Integration in Thematic Mapping: Fact or Fiction? International Yearbook of Cartography 13: 27–35.
- Just, M. A., and P. A. Carpenter. 1976. Eye fixation and cognitive processes. Cognitive Psychology 8: 441–480.
- Incoul, A., K. Ooms, and P. De Maeyer. 2015. Comparing paper and digital topographic maps using eye tracking. In Modern Trends in Cartography. Edited by J. Brus et al. Cham: Springer International Publishing, pp. 339–356.
- Karagiorgou, S., V. Krassanakis, V. Vescoukis, and B. Nakos. 2014. Experimenting with polylines on the visualization of eye tracking data from observations of cartographic lines. Proceedings of the 2nd International Workshop on Eye Tracking for Spatial Research (co-located with the 8th International Conference on Geographic Information Science, GIScience 2014), Vienna, Austria. Edited by P. Kiefer, I. Giannopoulos, M. Raubal and A. Krüger, pp. 22–26.
- Karl, D. 1992. Cartographic Animation: Potential and Research Issues. Cartographic Perspectives 13: 3–9.
- Keates, J. S. 1996. Understanding Maps, 2nd ed. Harlow, Essex, England: Longman.
- Kiik, A. 2015. Cartographic design of thematic polygons: a comparison using eye-movement metrics analysis. MSc Thesis, Department of Physical Geography and Ecosystem Science, Lund University, Sweden.
- Kraak, M. J., and A. M. MacEachren. 1994. Visualization of spatial data's temporal component. Proceedings, Spatial Data Handling, Advances in GIS Research, Edinburgh.
- Krassanakis, V. 2013. Exploring the map reading process with eye movement analysis. In Eye Tracking for Spatial Research, Proceedings of the 1st International Workshop (in conjunction with COSIT 2013). Edited by Kiefer, Giannopoulos, Raubal and Hegarty. Scarborough, United Kingdom, pp. 2–7.
- Krassanakis, V. 2014. Development of a methodology of eye movement analysis for the study of visual perception in animated maps. Doctoral Dissertation, School of Rural and Surveying Engineering, National Technical University of Athens.
- Krassanakis, V., V. Filippakopoulou, and B. Nakos. 2011a. The influence of attributes of shape in map reading process. Proceedings of the 25th International Cartographic Association Conference, Paris, France.
- Krassanakis, V., A. Lelli, I. E. Lokka, V. Filippakopoulou, and B. Nakos. 2013. Investigating dynamic variables with eye movement analysis. Proceedings of the 26th International Cartographic Association Conference, Dresden, Germany.
- Krassanakis, V., V. Filippakopoulou, and B. Nakos. 2011b. An Application of Eye Tracking Methodology in Cartographic Research. Proceedings of the EyeTrackBehavior 2011 (Tobii), Frankfurt, Germany.
- Krassanakis, V., V. Filippakopoulou, and B. Nakos. 2014. EyeMMV toolbox: An eye movement post-analysis tool based on a two-step spatial dispersion threshold for fixation identification. Journal of Eye Movement Research 7, (1):1: 1–10.
- Lloyd, R. 1997. Visual Search Processes Used in Map Reading. Cartographica 34, 1: 11–32.
- Lloyd, R. 2005. Attention on Maps. Cartographic Perspectives 52: 28–57.
- Lowe, R., and J. M. Boucheix. 2010. Attention direction in static and animated diagrams. In Diagrammatic Representation and Inference. Edited by A. K. Goel, M. Jamnik and N. H. Narayanan. Berlin, Heidelberg: Springer, pp. 250–256.
- MacEachren, A. M. 1995. How Maps Work: Representation, Visualization, and Design. New York: The Guilford Press.
- Maggi, S., and S. I. Fabrikant. 2014. Triangulating Eye Movement Data of Animated Displays. Proceedings of the 2nd International Workshop on Eye Tracking for Spatial Research (co-located with the 8th International Conference on Geographic Information Science, GIScience 2014), Vienna, Austria. Edited by P. Kiefer, I. Giannopoulos, M. Raubal and A. Krüger, pp. 27–31.
- Maggi, S., S. I. Fabrikant, J. P. Imbert, and C. Hurter. 2015. How Do Display Design and User Characteristics Matter in Animated Visualizations of Movement Data? Proceedings of the 27th International Cartographic Association Conference, Rio de Janeiro, Brazil.
- Manor, B. R., and E. Gordon. 2003. Defining the temporal threshold for ocular fixation in free-viewing visuocognitive tasks. Journal of Neuroscience Methods 128: 85–93.
- Marr, D. 1982. Vision: A Computational Investigation into the Human Representation and Processing of Visual Information. New York: W. H. Freeman and Company.
- Matsuno, T., and M. Tomonaga. 2006. Visual search for moving and stationary items in chimpanzees (Pan troglodytes) and humans (Homo sapiens). Behavioural Brain Research 172: 219–232.
- Michaelidou, E., V. Filippakopoulou, B. Nakos, and A. Petropoulou. 2005. Designing Point Map Symbols: The effect of preattentive attribute of shape. Proceedings of the 22nd International Cartographic Association Conference, A Coruña, Spain.
- Montello, D. R. 2002. Cognitive Map-Design Research in the Twentieth Century: Theoretical and Empirical Approaches. Cartography and Geographic Information Science 29, 3: 283–304.
- Moon, S., E. K. Kim, and C. S. Hwang. 2014. Effects of Spatial Distribution on Change Detection in Animated Choropleth Maps. Journal of the Korean Society of Surveying, Geodesy, Photogrammetry and Cartography 32, 6: 571–580.
- Nossum, A. S. 2012. Semistatic Animation: Integrating Past, Present and Future in Map Animations. The Cartographic Journal 49, 1: 43–54.
- Ooms, K., P. De Maeyer, V. Fack, E. Van Assche, and F. Witlox. 2012. Interpreting maps through the eyes of expert and novice users. International Journal of Geographical Information Science 26, 10: 1773–1788.
- Ooms, K., P. De Maeyer, and V. Fack. 2014. Study of the attentive behavior of novice and expert map users using eye tracking. Cartography and Geographic Information Science 41, 1: 37–54.
- Opach, T., I. Gołębiowska, and S. I. Fabrikant. 2013. How Do People View Multi-Component Animated Maps? The Cartographic Journal, 1–13.
- Opach, T., and A. Nossum. 2011. Evaluating the usability of cartographic animations with eye-movement analysis. Proceedings of the 25th International Cartographic Association Conference, Paris, France.
- Poole, A., and L. J. Ball. 2005. Eye Tracking in Human-Computer Interaction and Usability Research: Current Status and Future Prospects. In Encyclopedia of Human Computer Interaction. Edited by C. Ghaoui. Pennsylvania: Idea Group, pp. 211–219.
- Popelka, S., and A. Brychtova. 2013. Eye-tracking Study on Different Perception of 2D and 3D Terrain Visualization. The Cartographic Journal 50, 3: 240–246.
- Robinson, A. H., J. L. Morrison, P. C. Muehrcke, A. J. Kimerling, and S. C. Guptill. 1995. Elements of Cartography. New York: John Wiley & Sons.
- Royden, C. S., J. M. Wolfe, and N. Klempen. 2001. Visual search asymmetries in motion and optic flow fields. Perception & Psychophysics 63, 3: 436–444.
- Russo, P., C. Pettit, A. Çöltekin, M. Imhof, M. Cox, and C. Bayliss. 2014. Understanding soil acidification process using animation and text: An empirical user evaluation with eye tracking. In Cartography from Pole to Pole. Edited by M. Buchroithner et al. Berlin, Heidelberg: Springer, pp. 431–448.
- Salvucci, D. D., and J. H. Goldberg. 2000. Identifying Fixations and Saccades in Eye-Tracking Protocols. Proceedings of the Symposium on Eye Tracking Research and Applications, pp. 71–78.
- Slocum, T. A., R. B. McMaster, F. C. Kessler, and H. H. Howard. 2009. Thematic Cartography and Geovisualization, 3rd ed. Prentice Hall Series in Geographic Information Science.
- Sluter, R. S. 2001. New theoretical research trends in cartography. Revista Brasileira de Cartografia 53: 29–37.
- Steinke, T. R. 1987. Eye movement studies in cartography and related fields. Studies in Cartography, Monograph 37, Cartographica 24, 2: 40–73.
- Stofer, K., and X. Che. 2014. Comparing experts and novices on scaffolded data visualizations using eye tracking. Journal of Eye Movement Research 7, (5):2: 1–15.
- Strasburger, H., I. Rentschler, and M. Jüttner. 2011. Peripheral vision and pattern recognition: A review. Journal of Vision 11, (5):13: 1–82.
- Wandell, B. A. 1995. Foundations of Vision. Sunderland, MA: Sinauer Associates.
- Wolfe, J. M. 2005. Guidance of Visual Search by Preattentive Information. In Neurobiology of Attention. Edited by L. Itti, G. Rees and J. Tsotsos. San Diego, CA: Academic Press/Elsevier.
- Wolfe, J. M., and T. S. Horowitz. 2004. What attributes guide the deployment of visual attention and how do they do it? Nature Reviews Neuroscience 5: 1–7.
- Xiaofang, W., D. Qingyun, X. Zhiyong, and L. Na. 2005. Research and Design of Dynamic Symbols in GIS. Proceedings of the International Symposium on Spatio-temporal Modeling, Spatial Reasoning, Analysis, Data Mining and Data Fusion, Beijing.
Copyright © 2016. This article is licensed under a Creative Commons Attribution 4.0 International License.
Krassanakis, V.; Filippakopoulou, V.; Nakos, B. Detection of Moving Point Symbols on Cartographic Backgrounds. J. Eye Mov. Res. 2016, 9, 1-16. https://doi.org/10.16910/jemr.9.2.2