MatMouse: A Mouse Movements Tracking and Analysis Toolbox for Visual Search Experiments

Abstract: The present study introduces a new MATLAB toolbox, called MatMouse, suitable for performing experimental studies based on mouse movement tracking and analysis. MatMouse supports the implementation of task-based visual search experiments. The proposed toolbox provides specific functions which can be utilized for the experiment building and mouse tracking processes, the analysis of the recorded data through specific metrics, the production of related visualizations, as well as the generation of statistical grayscale heatmaps which could serve as an objective ground truth product. MatMouse can be executed as a standalone package or integrated in existing MATLAB scripts and/or toolboxes. In order to highlight the functionalities of the introduced toolbox, a complete case study example is presented. MatMouse is freely distributed to the scientific community under the third version of the GNU General Public License (GPL v3) on the GitHub platform.


Introduction
The examination of visual behavior and reaction requires scientific experimentation using different types of visual stimuli. Experimental stimuli might include simple symbols with specific topological or geometric attributes [1], natural (e.g., [2]) or artificial images (e.g., cartographic backgrounds [3]), and virtual reality representations (see e.g., the study presented by [4]). Over the last few years, several technical approaches have been established for the implementation of visual attention experiments and analyses, including simple as well as more sophisticated methods. Simple experimental techniques are usually based on reaction time measures [5], while more sophisticated ones involve the analysis of eye movements (i.e., the eye tracking method), methods used for testing brain activity (e.g., functional magnetic resonance imaging, fMRI), or combinations of several technical approaches (e.g., the study presented by [6]), where information collected by eye tracking, pupil dilation, and EEG analysis is fused in order to investigate user behavior and preferences on a web site.
Among the existing experimental techniques, mouse tracking constitutes one of the simplest methods implemented for the exploration of visual perception and cognition. The technique involves the recording and analysis of the trajectories produced by computer mouse movements [7]. Despite the simplicity of the method, it can produce critical results related to the study of cognitive processes [8,9] and decision making [10], while the development of related advanced metrics could substantially enhance novel psychological hypothesis testing [11,12]. Moreover, mouse movement analysis could also be applied for assessing emotional responses [13], evaluating the effectiveness of alternative design choices (e.g., in a graphical user interface of a software product or in a web page [14]), user satisfaction [15], and possible response differences [16], as well as for predicting the produced visual attention patterns [17,18]. Over the last decade, various software tools have been proposed and delivered to the scientific community for the analysis of mouse movement trajectories produced during the observation of visual stimuli on a computer monitor. Freeman and Ambady [19] presented the MouseTracker software, suitable for the production and implementation of mouse-tracking experiments as well as for processing and visualizing mouse movement trajectories. The reliability of MouseTracker was tested through direct comparison with traditional reaction time measures. MouseTracker works as a standalone package and has an interactive graphical user interface. Although the aforementioned tool is freely distributed, it does not offer the option to directly integrate it with existing platforms and toolboxes, such as, for example, PTB-Psychtoolbox [20]. Integration with existing architectures is offered by another open-source package, called Mousetrap, provided by Kieslich and Henninger [21].
Mousetrap is a cross-platform package which works as a plug-in to the well-established experiment builder OpenSesame [22]. This toolbox is additionally distributed as an R package available on CRAN [23,24]. Recently, Mathur and Reichling [25] also proposed another mouse-tracking JavaScript software tool (working alongside R code) which is adapted to the Qualtrics platform and could be used in category-competition experiments.
The main metrics implemented in mouse-tracking tools are based on the analysis of individual mouse trajectories during the reaction of a participant in an experimental trial. Existing tools compute basic and derived (e.g., minimum, maximum, or standard deviation) values connected to mouse trajectories. Specifically, these metrics involve values related to mouse positions (including the extraction of mouse positions without mouse movements), reaction time, directional changes, velocity, and acceleration [7]. Moreover, mouse tracking analysis examines the deviation of the produced trajectories from the theoretical optimal ones, which correspond to straight lines. This deviation can be illustrated by computing metrics such as maximum absolute deviation (MAD), area under curve (AUC), and maximum deviation (MD) [9,10,26]. The behavior examination of experimental participants (or users in the case of usability studies) could also be enhanced by the visual exploration of different visualizations of the produced trajectories on the visual stimuli. Although existing software solutions implement and support specific and advanced metrics to reveal visual behavior, the related techniques for data visualization can be improved. Moreover, the development of 'cumulative' metrics that indicate the overall visual behavior could help towards modeling this behavior as well as training models to predict participant/user reaction.
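To make these deviation measures concrete, the sketch below computes MD, MAD, and an AUC-style area for a sampled trajectory. It is an illustrative re-implementation of the general definitions, not code from any of the cited toolboxes; the function name `deviation_metrics` is ours.

```python
import numpy as np

def deviation_metrics(x, y):
    """MD, MAD, and AUC of a trajectory relative to the straight line
    joining its first and last point (illustrative sketch)."""
    x, y = np.asarray(x, float), np.asarray(y, float)
    p0, p1 = np.array([x[0], y[0]]), np.array([x[-1], y[-1]])
    d = p1 - p0
    norm = np.hypot(*d)
    # Signed perpendicular distance of every sample to the ideal line
    dev = ((x - p0[0]) * d[1] - (y - p0[1]) * d[0]) / norm
    md = dev[np.argmax(np.abs(dev))]   # maximum (signed) deviation, MD
    mad = np.max(np.abs(dev))          # maximum absolute deviation, MAD
    # Area between trajectory and ideal line, integrated along the line (AUC)
    t = ((x - p0[0]) * d[0] + (y - p0[1]) * d[1]) / norm
    auc = np.trapz(np.abs(dev), t)
    return md, mad, auc

md, mad, auc = deviation_metrics([0, 1, 2], [0, 1, 0])
```

For this toy trajectory the ideal line is the horizontal segment from (0, 0) to (2, 0), so the middle sample deviates by one unit.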
Furthermore, raw data produced by mouse tracking techniques share several similarities with eye tracking data, considering both their spatiotemporal distribution and their connection with perceptual and cognitive processes. The scientific literature shows that there is a strong correlation between mouse and eye movements (e.g., [27][28][29]), while there are also research studies trying to predict gaze behavior using mouse movements (e.g., [30]). Hence, the development of scientific software for mouse tracking and analysis could also follow approaches implemented in eye movement analysis, producing critical outcomes for both studying and modeling visual reaction to different types of visual stimuli. For example, the generation of grayscale statistical heatmaps based on the collected data could directly reveal participants' behavior. In accordance with the terminology applied in eye tracking research (e.g., the recent study presented by Perrin et al. [31]), such products could serve as objective ground truths of the human behavior revealed by mouse reaction.
In the present study, a new toolbox, called MatMouse (see Supplementary Materials), is introduced. The toolbox is appropriate for the building and analysis of experimental studies based on mouse movement tracking. MatMouse constitutes a MATLAB (Mathworks ® ) toolbox that is designed to support task-based experiments of visual search. The proposed toolbox consists of three main components which give the opportunity to build simple task-based experimental studies and capture mouse coordinates on selected visual stimuli, to analyze the captured mouse movements, and to produce mouse movement visualizations. MatMouse provides functions that can be easily incorporated in existing scripts or used in conjunction with other toolboxes. MatMouse is freely distributed to the scientific community.
In the sections below, analytical descriptions of both experiment building and data analysis are provided. Specifically, the Materials and Methods section provides specific examples and demos (through an example case study in the field of map perception) in order to highlight all the functionalities of the introduced toolbox, including metrics analysis, data visualizations, as well as the generation of an objective data ground truth (grayscale statistical heatmap). Sample data visualizations and a ground truth demo based on the collected data are also demonstrated in the Results section. Finally, the contribution of the proposed toolbox to experimental development is summarized and discussed in the Discussion and Conclusions section.

Materials and Methods
The principal idea behind the development of the MatMouse toolbox is to deliver a practical tool that can be directly utilized to build visual search experiments as well as to analyze the produced experimental mouse movement data. The toolbox is implemented in MATLAB and the functions provided aim to support three main components. The first component is used to build an experimental study and capture the mouse movements performed by the experimental participants. The second component computes basic and advanced metrics based on the spatiotemporal analysis of the produced trajectories, while the third component involves functionalities that produce individual visualizations of the collected raw data referred to each experimental visual stimulus. The aforementioned components are supported by five core functions. There is no one-to-one correspondence between functions and components, since some functions contribute to more than one component (see Sections 2.1-2.3 for further details). As mentioned above, MatMouse supports the execution of typical task-based experiments which require the visual reaction of the participant to a visual stimulus or to a sequence of visual stimuli. Hence, the information provided extends the results produced by simply capturing the reaction time. In the following sections, the functionalities of the provided components are presented, while illustrative examples and a case study are also provided in order to help potential toolbox users easily integrate MatMouse in their research studies.

Mouse Movements Tracking
A typical visual search task includes the indication (to the participants of the experiment) of a specific "target" object or symbol which has to be detected among several "distractors" [32]. Hence, a visual stimulus or a sequence of visual stimuli serves as the main input to an experiment building (software) environment. The fundamental export (raw data) of a mouse tracking process during a visual search includes the generated spatiotemporal information, namely, the spatial coordinates of the recorded mouse movements along with the corresponding time stamps. The collection of the corresponding mouse movement raw data is based on the execution of two main functions of MatMouse. More precisely, the MatMouse function "movement_track" is executed in order to collect raw mouse movement data using one simple visual stimulus, while the function "movement_track_seq" is used in cases where a specific sequence of visual stimuli must be presented to the participants of the experiment. In practice, the latter function could be used for building any experimental trial, since the toolbox user is able to define the sequence of selected visual stimuli. The transition to the next image (of the pre-selected sequence) is made upon the participant's reaction, revealed by a simple mouse click. All well-known image file formats (e.g., PNG, JPEG) are supported and can be imported in order to create a sequence of images.
MatMouse is designed to work with two monitor screens connected to the same computer, which is used to concurrently track the mouse movement coordinates and display the visual stimuli during the experimental process. The two monitor screens have to be set in extended mode. By default, the toolbox uses the secondary (extended) monitor for visual stimuli presentation, while the primary monitor is utilized by the experiment operator in order to run the corresponding MATLAB scripts. If only one monitor is available, both tasks, namely presentation and operation, are performed on the same display. The process of raw data collection is independent of the spatial resolution of the display monitor. Both functions capture the spatiotemporal coordinates of mouse movements (t, x, y) collected during the observation of visual stimuli. The time parameter corresponds to relative values measured using typical MATLAB functions. A pause time equal to 1 ms has been selected as the default value in order to give MATLAB the opportunity to display images and capture coordinates (a similar approach is followed in the Eyelink toolbox [33] integrated in Psychtoolbox). The exported mouse movement spatiotemporal coordinates are computed in image coordinates for both the horizontal and vertical dimensions and in seconds for the temporal dimension. The origin of the coordinate system corresponds to the upper left corner of the image.
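The recording loop described above can be sketched as a simple polling routine. The snippet below is a hypothetical illustration (the toolbox itself is written in MATLAB), assuming a `get_cursor` callable that returns absolute screen coordinates and a known upper-left `origin` of the displayed stimulus; both names are ours.

```python
import time

def screen_to_image(sx, sy, origin):
    """Map absolute screen coordinates to image coordinates whose origin
    is the upper-left corner of the displayed stimulus (hypothetical helper)."""
    return sx - origin[0], sy - origin[1]

def track(get_cursor, origin=(0, 0), duration_s=0.02, pause_s=0.001):
    """Poll a cursor-position callable, storing (t, x, y) samples in image
    coordinates; the 1 ms pause mirrors the default described above."""
    t0, samples = time.perf_counter(), []
    while (now := time.perf_counter()) - t0 < duration_s:
        x, y = screen_to_image(*get_cursor(), origin)
        samples.append((now - t0, x, y))
        time.sleep(pause_s)
    return samples

# A stationary fake cursor at screen position (110, 140), image origin (100, 100)
samples = track(lambda: (110, 140), origin=(100, 100))
```

In a real recording, `get_cursor` would query the operating system; here a stub keeps the sketch self-contained.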

Movement Metrics Analysis
Taking into consideration that the analysis of individual mouse movement trajectories should be adapted to the selected research questions, the metrics computed by the toolbox are mainly oriented to the calculation of representative indices which aim to highlight the overall searching behavior, extending the typical measurements of reaction time. Metrics computation is performed by the function "calc_metrics".
Function "calc_metrics" uses the captured mouse movement coordinates as input and extracts basic metrics related to the produced trajectory on each visual stimulus. The toolbox user also has the option to select as input a subset of the recorded data using typical MATLAB operations. The exported metrics involve the total reaction time (in seconds), the total trajectory length (in pixels) calculated using the Euclidean distance, as well as basic statistics (average, standard deviation, minimum, maximum, and range of values) of the direction angles of the line segments that compose the individual trajectory. Moreover, extending the idea of the typical AUC and MD measures used in other mouse tracking and analysis toolboxes, MatMouse also calculates the aforementioned statistics on the Euclidean distances from each captured mouse point to the corresponding optimal trajectory, that is, the straight line connecting the first and the last captured mouse point (an example is presented in Figure 1). These statistics highlight the overall deviation of the visual search behavior. The parameters of the optimal trajectory are computed considering the first and the last point of the trajectory and the typical form of the line equation (ax + by + c = 0). Furthermore, in order to indicate the overall spatial dispersion of the recorded mouse movement points, the corresponding convex hull area is also computed (in pixels).
The convex hull area highlights the percentage of the searched area in comparison with the area that corresponds to the monitor used for visual stimuli presentation. It should be noted that, in order to have meaningful statistics, the trajectory should consist of at least three points.
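The convex hull computation described above can be sketched as follows, using Andrew's monotone chain algorithm and the shoelace formula. The function names, including the `searched_fraction` helper that expresses the hull area as a fraction of the display area, are illustrative and not part of MatMouse.

```python
def convex_hull_area(points):
    """Area (in px^2) of the convex hull of recorded mouse points,
    via Andrew's monotone chain + the shoelace formula (sketch)."""
    pts = sorted(set(points))
    if len(pts) < 3:
        return 0.0

    def cross(o, a, b):
        return (a[0] - o[0]) * (b[1] - o[1]) - (a[1] - o[1]) * (b[0] - o[0])

    def half(seq):
        h = []
        for p in seq:
            while len(h) >= 2 and cross(h[-2], h[-1], p) <= 0:
                h.pop()
            h.append(p)
        return h[:-1]

    hull = half(pts) + half(reversed(pts))      # counter-clockwise hull
    return 0.5 * abs(sum(x0 * y1 - x1 * y0
                         for (x0, y0), (x1, y1) in zip(hull, hull[1:] + hull[:1])))

def searched_fraction(points, screen_w, screen_h):
    """Convex-hull area as a fraction of the stimulus display area."""
    return convex_hull_area(points) / (screen_w * screen_h)

area = convex_hull_area([(0, 0), (4, 0), (4, 3), (0, 3), (2, 1)])
frac = searched_fraction([(0, 0), (4, 0), (4, 3), (0, 3)], 8, 6)
```

In the example, the interior point (2, 1) does not affect the 4 x 3 rectangular hull.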
Following the approach implemented in eye movement analysis, where an eye tracking protocol is analyzed into fixation events before any further level of analysis, the function "calc_metrics" also extracts the unique mouse movement positions generated during the visual search process. The positions where there are temporarily no mouse movements are analogous to the fixation points calculated in eye tracking analysis. The main difference between a fixation point and a mouse position without movement is that in the first case the fixation center corresponds to a cluster which consists of several points within a specific spatial range, rather than the single mouse point of the latter case. Indeed, during a fixation event the eyes remain relatively stationary [34], while a typical range for fixation points is approximately equal to 1° of visual angle (see e.g., the study presented by Ooms & Krassanakis [35] for further details on spatial thresholds used in fixation identification algorithms). Additionally, in comparison to mouse movements, fixation events have a minimum duration (the value of 80 ms is reported as the minimum duration threshold according to the literature [36]). For the case of mouse movements, the spatial threshold corresponds to 0° of visual angle, while there is no lower limit on the duration of a stationary mouse cursor position.
The recorded raw data might be very dense, especially considering the high recording frequency as well as experimental cases where the duration of the visual search process could be quite extensive. Therefore, analyzing the collected mouse movement data in order to extract unique mouse movement positions could facilitate the exploration and any further manipulation of the experimental raw data.
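The extraction of unique, temporarily stationary positions can be sketched as below. This illustrative version applies the zero-pixel spatial threshold and no minimum duration described above; its record fields mirror, but are not, the toolbox's `uniq` structure.

```python
def unique_positions(t, x, y):
    """Collapse runs of consecutive samples with an identical (x, y) cursor
    position into 'stationary' points, each with its dwell duration.
    Sketch: exact-position matching (0-pixel threshold), no minimum dwell."""
    points = []
    i, n = 0, len(t)
    while i < n:
        j = i
        while j + 1 < n and x[j + 1] == x[i] and y[j + 1] == y[i]:
            j += 1
        if j > i:  # cursor stayed put for more than one sample
            points.append({"x": x[i], "y": y[i], "d": t[j] - t[i]})
        i = j + 1
    return points

pts = unique_positions([0.0, 0.1, 0.2, 0.3, 0.4],
                       [5, 5, 5, 9, 9],
                       [2, 2, 2, 4, 4])
```

Here the cursor dwells at (5, 2) for 0.2 s and at (9, 4) for 0.1 s, so two stationary points are returned.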

Visualizations and Heatmap Ground Truth Generation
Although the computation of specific metrics directly supports the analysis of experimental results as well as the comparison of different trials among participants and the examined visual stimuli, data visualization could substantially contribute to the visual exploration of the experimental raw data and the illustration of the produced mouse movement patterns. MatMouse provides a variety of data visualizations. As already mentioned, the toolbox user can apply the corresponding functions for data visualization either on the whole collected data set or on a particular subset of the collected data that is extracted using typical MATLAB operations. Function "show_visualizations" exports 2D plots of mouse movement trajectories that demonstrate the spatial dispersion of the individual visual search and illustrate the variations of both horizontal and vertical mouse coordinates in time. Moreover, the curvature of the path is calculated at each trajectory point and is depicted on a 2D plot as a line with varying colors that depend on the curvature level. For this purpose, the Savitzky-Golay method, which is based on local least-squares polynomial approximation, is used for data smoothing [37].
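The curvature computation can be illustrated as follows: each coordinate series is first smoothed by local least-squares polynomial fits (the Savitzky-Golay idea cited above), and the planar curvature formula kappa = |x'y'' - y'x''| / (x'^2 + y'^2)^(3/2) is then evaluated with finite differences. This is a simplified sketch with assumed parameter names (`window`, `order`), not the MatMouse implementation.

```python
import numpy as np

def trajectory_curvature(x, y, window=5, order=2):
    """Curvature along a mouse path after local least-squares polynomial
    smoothing of each coordinate (illustrative sketch)."""
    x, y = np.asarray(x, float), np.asarray(y, float)

    def smooth(v):
        out = np.empty_like(v)
        half = window // 2
        for i in range(len(v)):
            lo, hi = max(0, i - half), min(len(v), i + half + 1)
            idx = np.arange(lo, hi)
            # Local polynomial fit centered (as far as possible) on sample i
            coef = np.polyfit(idx, v[lo:hi], min(order, len(idx) - 1))
            out[i] = np.polyval(coef, i)
        return out

    xs, ys = smooth(x), smooth(y)
    dx, dy = np.gradient(xs), np.gradient(ys)
    ddx, ddy = np.gradient(dx), np.gradient(dy)
    denom = np.clip((dx**2 + dy**2) ** 1.5, 1e-12, None)
    return np.abs(dx * ddy - dy * ddx) / denom

# A straight horizontal path should have (near-)zero curvature everywhere
k = trajectory_curvature(np.linspace(0, 1, 11), np.zeros(11))
```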
Another 2D plot demonstrates the duration at each mouse point as a circle whose radius depends on the corresponding duration. A label is also generated that denotes the point with the longest duration.
Apart from the aforementioned visualizations, MatMouse also supports the generation of heatmap visualizations. Heatmaps are created using function "show_heatmap" and are represented either as a typical 2D plot or as a 2.5D isolines surface which is based on the spatial distribution of the collected mouse movement points. The same function is also used for the generation of grayscale (statistical) heatmaps that could serve as objective ground truths, since they indicate the overall visual behavior of the participants who take part in a research study. Grayscale heatmaps are created based on the approach followed by Krassanakis et al. [38]. More specifically, the grayscale heatmap consists of an image where each pixel's value represents the corresponding frequency of the existing mouse points. The values of the frequencies are normalized in the range 0-255 (8-bit image). The production of grayscale heatmaps is based on the values of the selected parameters, which include the standard deviation and the kernel size of the performed Gaussian filtering.
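The grayscale heatmap construction described above can be sketched as follows: per-pixel point frequencies are accumulated, smoothed with a Gaussian kernel, and normalized to the 0-255 range. The parameter names `sigma` and `ksize` are illustrative stand-ins for the standard deviation and kernel size mentioned above; the actual MatMouse function signature may differ.

```python
import numpy as np

def grayscale_heatmap(xs, ys, width, height, sigma=10, ksize=31):
    """Statistical grayscale heatmap: accumulate per-pixel point frequencies,
    smooth with a Gaussian kernel, normalize to an 8-bit image (sketch)."""
    freq = np.zeros((height, width))
    for x, y in zip(xs, ys):
        if 0 <= x < width and 0 <= y < height:
            freq[int(y), int(x)] += 1
    # Separable Gaussian filtering via two 1-D convolutions
    r = ksize // 2
    g = np.exp(-0.5 * (np.arange(-r, r + 1) / sigma) ** 2)
    g /= g.sum()
    pad = np.pad(freq, r, mode="constant")
    rows = np.apply_along_axis(lambda v: np.convolve(v, g, mode="valid"), 1, pad)
    smoothed = np.apply_along_axis(lambda v: np.convolve(v, g, mode="valid"), 0, rows)
    if smoothed.max() > 0:
        smoothed = smoothed / smoothed.max() * 255
    return smoothed.astype(np.uint8)

# Single mouse point at image coordinates (50, 40) on a 100 x 80 stimulus
hm = grayscale_heatmap([50], [40], 100, 80)
```

The brightest pixel (value 255) coincides with the densest point location, matching the frequency-based interpretation above.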

MatMouse Functions
The functionality of all MatMouse functions is summarized in Table 1. The toolbox is implemented and tested in MATLAB R2020b running on a PC with Windows 10. For each function, a short description and its syntax are given, accompanied by a detailed description of its input and output parameters. Comments are also provided, followed by an example showing how to use the function.

Output parameters
A: An array that contains the tracked mouse movements for all the stimuli images. Array A(i) is a structure with 3 fields containing the tracked movements for the i-th stimulus image, with 1 ≤ i ≤ N, where N denotes the number of images in the ImagesList text file.
A(i).t: time stamp (in seconds) for the i-th image
A(i).x: point's x coordinate (in image pixels) for the i-th image
A(i).y: point's y coordinate (in image pixels) for the i-th image
Comments
The origin of the coordinate system is on the top left corner of the input images.
Example
A = movement_track_seq('images_list.txt',1)
In this example, the images whose filenames are given in the text file "images_list.txt" are displayed on the current monitor. As a result, an array A is created that contains the tracked mouse movements for all the stimuli images.

Function name
calc_metrics
Description
Provides statistics regarding the recorded trajectory as well as its comparison to the optimal trajectory.
Syntax
[react,len,uniq,lineq,dstat,charea,curv] = calc_metrics(A);
Input parameters
An array A containing the tracked mouse movements of a trajectory. It can be provided by functions movement_track or movement_track_seq.
Output parameters
react: total reaction time (in seconds)
len: total trajectory length (in pixels)
uniq: structure of unique points. The structure fields are:
uniq.d: duration (in seconds)
uniq.x: point's x coordinate (in image pixels)
uniq.y: point's y coordinate (in image pixels)
lineq: structure with the coefficients (a, b, and c) of the line equation ax + by + c = 0 describing the optimal trajectory. The line is calculated from the starting and ending trajectory points. The structure fields are:
lineq.a: line parameter a
lineq.b: line parameter b
lineq.c: line parameter c
dstat: structure of distance-based statistics relative to the optimal trajectory. The structure fields are:
dstat.avg: average
dstat.std: standard deviation
dstat.min: minimum value
dstat.max: maximum value
dstat.range: range of values
charea: convex hull area (in pixels) generated by the recorded trajectory
curv: curvature at each unique trajectory point

Example
[react,~,uniq,lineq,dstat,~,curv] = calc_metrics(A)
In this example, various statistics are calculated based on the tracked mouse movements array A. Specifically, the calculated values are: the total reaction time react, the unique trajectory points uniq, the parameters lineq of the line that describes the optimal trajectory, the distance-based statistics dstat, as well as the curvature curv at each unique trajectory point. The output parameters len and charea are ignored.

In the following paragraphs, a step-by-step guide is presented to describe how an experiment is built and how the collected data are analyzed.


Tracking Data Collection
Data_DemoExpMap1=movement_track(ImageFilename,1,'Data_DemoExpMap1.txt');
If more than one image is involved, a simple text file is created that contains the sequence of the experimental stimuli images. In this example, the text file is titled "DemoExpMapSeq.txt" and contains the filenames of the three stimulus images, one per line. There is no limit to the number of images that can be used. Moreover, an image can be imported multiple times if a research study requires it. This might be very helpful in cases where a symbol or a specific instruction is given to the participant (e.g., for searching for the same target in different visual scenes). In order to collect data for a specific participant, the corresponding tracking command is executed. Hence, in case multiple runs of the same experiment must be integrated in the same script, the command can be executed repeatedly with different input and output variables.
The example command above is related only to the experimental data collection process and not to the analysis of the collected data. The exported text file "Data_p1.txt" contains the time stamps and coordinates of all the collected points using a structure described in the movement_track_seq section of Table 1.

Analysis and Visualization
The next step involves the analysis and the visualization of the collected data. Once the subsets of the raw data have been extracted, the toolbox user can apply the corresponding commands to compute metrics and produce visualizations. Typical MATLAB operations can be applied when heatmap-based visualizations and the ground truth need to be produced considering data collected by different participants. For instance, suppose that the collected data produced by three participants are stored in the variables Data_p1, Data_p2, and Data_p3, and we want to analyze the data of the second stimulus image. In this case, the first input variable of "show_visualizations" is generated using the following commands:
Data.x = [Data_p1(2).x; Data_p2(2).x; Data_p3(2).x];
Data.y = [Data_p1(2).y; Data_p2(2).y; Data_p3(2).y];
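For readers less familiar with MATLAB's bracket concatenation, the same merging step can be expressed as follows; the per-participant variables below are invented stand-ins for the case-study data, and only the stacking pattern mirrors the MATLAB commands above.

```python
import numpy as np

# Hypothetical per-participant records for the second stimulus image
data_p1 = {"x": np.array([10, 20]), "y": np.array([5, 6])}
data_p2 = {"x": np.array([30]),     "y": np.array([7])}
data_p3 = {"x": np.array([40, 50]), "y": np.array([8, 9])}

# Stack the coordinate fields across participants, as in the MATLAB snippet
merged = {
    "x": np.concatenate([p["x"] for p in (data_p1, data_p2, data_p3)]),
    "y": np.concatenate([p["y"] for p in (data_p1, data_p2, data_p3)]),
}
```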

Results
In the previous sections, the functions provided by the MatMouse toolbox were presented in accordance with a case study example. Full examples are also available in the MatMouse repository. In order to demonstrate the potential of MatMouse, Figures 3-10 provide sample visualizations which are based on the collected data of the case study and the visualization commands reported in Section 2.5. Figure 3 demonstrates the tracked mouse trajectory generated by the participants' reaction on the experimental stimulus. The mouse trace is presented as a continuous blue line, while the captured mouse points are highlighted with red circles. Taking into account that the tracking frequency can be considered constant, denser spatial distributions indicate regions with smaller mouse transitions.
The spatiotemporal horizontal and vertical coordinates of the produced trajectories can be visualized in the stimulus image coordinate system, as shown in Figure 4. Considering that the horizontal and vertical dimensions are indicated by different curves along passing time, their spatial variations could point out the mouse transitions on the experimental visual stimulus. Hence, bigger changes in these curves indicate larger horizontal or vertical mouse transitions. Figure 5 demonstrates the curvature values on the generated mouse trajectory using a typical color bar. Lower curvature values are highlighted in green, while higher values are shown in red.
In Figure 6, the duration time at each unique mouse point is depicted using a scanpath-like (by analogy with eye movements) visualization. Durations are depicted as circles, where larger circle radii correspond to longer durations and vice versa. In addition, the point with the highest duration value is explicitly labeled. Alternatively, the cumulative behavior can also be highlighted using a 2.5D isolines visualization, as shown in Figure 8. In both Figures 7 and 8, the generated clusters are clearly illustrated, indicating the allocation of overall participants' behavior during the execution of the requested task. It is important to mention that the titles of the produced visualizations (Figures 3-8) are automatically generated by the MatMouse toolbox considering the inputs defined by the user in the corresponding function. Additionally, the exported figures can be manipulated using the interactive tools available in MATLAB either for data exploration or for further editing.
Besides the aforementioned visualizations illustrated in Figures 3-8, MatMouse exports grayscale statistical heatmaps, as shown in Figure 9. Such visualizations can serve as an objective ground truth of the performed experiment. More specifically, the grayscale statistical heatmap depicted in Figure 9 is produced by implementing the corresponding function of MatMouse. In Figure 10, the generated mouse trajectories for the three participants of the demo case study are illustrated on top of the calculated grayscale statistical heatmap. The ground truth produced after considering the collected data from all participants reveals (through a statistical image) possible stimulus positions that correspond to mouse movements. It is evident that regions corresponding to higher frequencies match denser mouse point clusters.

Discussion and Conclusions
In this study, a new toolbox, titled MatMouse, is introduced that provides mouse tracking and analysis functionality. Several aspects of the proposed toolbox are highlighted that are related to visual search experiments. Moreover, a complete case study is presented that elaborates further the toolbox features and functionality in order to apply it on both experimental building and data capturing as well as for mouse movement data analysis and visualization. Since MatMouse is developed in MATLAB environment, it constitutes a cross-platform package which can be executed in every operating system (MS Windows, Mac OS, and Linux) where MATLAB software is preinstalled. In practice, the toolbox can be used either as a standalone package or by integrating it in existing scripts or toolboxes, e.g., for experimental building (e.g., Psychtoolbox), eye movement (e.g., ILAB [41], GazeAlyze [42], EyeMMV [38], EALab [43], LandRate [44] etc.), and/or EEG (e.g., EEGLAB [45], DETECT [46], etc.) analysis.  Alternatively, the cumulative behavior can also be highlighted using a 2.5D isolines visualization, as shown in Figure 8. In both Figures 7 and 8, the generated clusters are clearly illustrated indicating the allocation of overall participants' behavior during the execution of the requested task.
It is important to mention that the titles of the produced visualizations (Figures 3-8) are automatically produced by MatMouse toolbox considering the inputs defined by the user in the corresponding function. Additionally, the exported figures can be manipulated using the interactive tools available in MATLAB software either for data exploration or for further editing.
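The duration measure behind the Figure 6 visualization, i.e., the total dwell time credited to each unique mouse position, mapped to a circle radius, can be sketched as follows. This is an illustrative Python sketch of the computation only, not MatMouse's actual MATLAB code; the function names and the proportional radius scaling are assumptions.

```python
def point_durations(samples):
    """Aggregate dwell duration at each unique mouse position.

    samples: list of (t, x, y) tuples with timestamps in ms, sorted by t.
    The time between consecutive samples is credited to the earlier point.
    Returns a dict mapping (x, y) -> total duration in ms.
    """
    durations = {}
    for (t0, x, y), (t1, _, _) in zip(samples, samples[1:]):
        durations[(x, y)] = durations.get((x, y), 0) + (t1 - t0)
    return durations


def radii(durations, r_max=30.0):
    """Map durations to circle radii (largest duration -> r_max),
    as in a scanpath-style duration plot."""
    peak = max(durations.values())
    return {p: r_max * d / peak for p, d in durations.items()}
```

The linear radius scaling is one plausible choice; an area-proportional scaling (radius proportional to the square root of duration) is a common alternative when circle area should encode the value.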
Besides the aforementioned visualizations illustrated in Figures 3-8, MatMouse exports grayscale statistical heatmaps, as shown in Figure 9. Such visualizations can serve as an objective ground truth of the performed experiment. More specifically, the grayscale statistical heatmap depicted in Figure 9 is produced by applying the corresponding function of MatMouse to the second visual stimulus and is based on the raw data collected from the three experimental trials. Hence, it represents the cumulative mouse reaction behavior of all participants.
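The general recipe for such a statistical heatmap, accumulating the pooled mouse points on a pixel grid, smoothing with a Gaussian kernel, and normalizing to grayscale, can be sketched in Python. This is a hedged illustration of the standard technique, not MatMouse's implementation; the function name, the 3-sigma kernel truncation, and the [0, 1] normalization are assumptions.

```python
import numpy as np


def grayscale_heatmap(points, width, height, sigma=20.0):
    """Statistical grayscale heatmap from aggregated mouse points.

    points: iterable of (x, y) pixel coordinates pooled over participants.
    Accumulates point frequencies on a grid, smooths them with a separable
    Gaussian kernel (truncated at 3*sigma), and normalizes to [0, 1].
    """
    grid = np.zeros((height, width))
    for x, y in points:
        if 0 <= x < width and 0 <= y < height:
            grid[int(y), int(x)] += 1
    # Build a 1D Gaussian kernel and apply it along each axis in turn
    half = int(3 * sigma)
    k = np.exp(-0.5 * (np.arange(-half, half + 1) / sigma) ** 2)
    k /= k.sum()
    smoothed = np.apply_along_axis(
        lambda row: np.convolve(row, k, mode="same"), 1, grid)
    smoothed = np.apply_along_axis(
        lambda col: np.convolve(col, k, mode="same"), 0, smoothed)
    peak = smoothed.max()
    return smoothed / peak if peak > 0 else smoothed
```

In MATLAB itself, the equivalent smoothing step is typically a call to a built-in Gaussian filtering function applied to the accumulated frequency grid.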
In Figure 10, the mouse trajectories generated by the three participants of the demo case study are illustrated on top of the calculated grayscale statistical heatmap. The ground truth produced from the collected data of all participants reveals (through a statistical image) possible stimulus positions that correspond to mouse movements. It is evident that regions corresponding to higher frequencies match denser clusters of mouse points.

Discussion and Conclusions
In this study, a new toolbox, titled MatMouse, is introduced that provides mouse tracking and analysis functionality. Several aspects of the proposed toolbox related to visual search experiments are highlighted. Moreover, a complete case study is presented that further elaborates the toolbox features and functionality, applying it to experimental building and data capturing as well as to mouse movement data analysis and visualization. Since MatMouse is developed in the MATLAB environment, it constitutes a cross-platform package which can be executed on every operating system (MS Windows, macOS, and Linux) where MATLAB is pre-installed. In practice, the toolbox can be used either as a standalone package or by integrating it into existing scripts or toolboxes, e.g., for experimental building (e.g., Psychtoolbox), eye movement analysis (e.g., ILAB [41], GazeAlyze [42], EyeMMV [38], EALab [43], LandRate [44], etc.), and/or EEG analysis (e.g., EEGLAB [45], DETECT [46], etc.).
Although the majority of the analyzed metrics are also implemented by existing toolboxes (e.g., the package provided by Kieslich et al. [24]), the present study aims to deliver an easy-to-use toolbox that additionally produces interactive visualizations as well as objective ground truths based on the collected raw data. More specifically, the generated grayscale statistical heatmaps can be directly compared with the corresponding ground truth produced by eye tracking recordings (see, e.g., the eye tracking datasets recently described and provided by Krassanakis et al. [47] and by Perrin et al. [31]). Indeed, considering the correlation between eye movements and mouse movements, the appropriate parameters for heatmap generation, including the standard deviation and the kernel size of the performed Gaussian filtering, could be selected following approaches similar to those used in eye movement analysis. Building such a ground truth contributes to the development of corresponding datasets that can serve as a robust basis for both examining and modeling visual behavior. Therefore, the exported products can be used directly as the main input for machine learning techniques (e.g., for predicting visual search behavior during typical visual search tasks such as target identification).
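One simple way to carry out the comparison described above, scoring the similarity between a mouse-based heatmap and an eye-tracking ground truth, is a Pearson correlation over the flattened maps. This metric is a common choice in saliency evaluation but is an assumption here; the source does not prescribe a specific similarity measure, and alternatives such as AUC or NSS follow the same flattening pattern.

```python
import numpy as np


def heatmap_correlation(hm_a, hm_b):
    """Pearson correlation between two heatmaps of identical shape.

    A basic similarity score for comparing, e.g., a mouse-movement
    grayscale heatmap against an eye-tracking heatmap. Both inputs are
    flattened so the correlation is computed over all pixels at once.
    """
    a, b = hm_a.ravel(), hm_b.ravel()
    return float(np.corrcoef(a, b)[0, 1])
```

Because Pearson correlation is invariant to linear rescaling, the two heatmaps do not need to share a normalization convention before being compared.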
Moreover, the successive mouse movement records have a temporal interval of approximately 1 ms. Although the produced records have not been validated against existing tools, this value can be achieved using the powerful built-in functions of MATLAB, while it can be considered more than adequate for data collection, especially taking into account the recording frequencies implemented in existing software.
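A reported recording rate like this can be checked directly from the logged timestamps by summarizing the inter-sample intervals. The following Python sketch shows the idea; the function name is hypothetical, and the source performs no such validation itself.

```python
def sampling_intervals(timestamps_ms):
    """Summarize inter-sample intervals of a mouse recording.

    timestamps_ms: sorted sample timestamps in milliseconds.
    Returns (mean, max) interval; a mean near 1 ms would match the
    recording rate reported for the toolbox, while the max exposes
    occasional dropped or delayed samples.
    """
    gaps = [t1 - t0 for t0, t1 in zip(timestamps_ms, timestamps_ms[1:])]
    return sum(gaps) / len(gaps), max(gaps)
```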
The provided toolbox follows a different approach to clustering mouse trajectories compared with the methods reported in existing packages, such as Mousetrap. Here, the generated clusters are highlighted using a normalized grayscale statistical heatmap. Although this method is well suited to describing the spatial allocation of behavior, it does not capture the dynamic change in mouse coordinates during visual search on a specific visual stimulus. However, this limitation can easily be overcome using the supported metrics and visualizations provided by the proposed toolbox.
The performance of visual search experiments constitutes an essential process in several research fields which aim to study and model visual behavior. Such experiments are based on theories related to vision and visual attention. For example, the case study presented here can be based on the hypothesis that pre-attentive features (e.g., specific shape topological properties, such as holes or line terminations) can guide visual attention during visual search. Similar hypotheses could be tested through task-based visual experimentation, which can involve different types of examinations (e.g., ranking the performance of observers during the observation of different types of visual stimuli). Hence, the MatMouse toolbox could serve as a complete platform in this direction, providing data and analysis metrics derived from the captured reactions of observers.