
A Generic Approach toward Indoor Navigation and Pathfinding with Robust Marker Tracking

Interactive Media Design Lab, Nara Institute of Science and Technology, Nara 630-0192, Japan
Department of Computer Science and IT, University of Malakand, Chakdara 18800, Pakistan
Author to whom correspondence should be addressed.
Remote Sens. 2019, 11(24), 3052;
Received: 7 November 2019 / Revised: 11 December 2019 / Accepted: 12 December 2019 / Published: 17 December 2019


Indoor navigation and localization have gained considerable attention from researchers in recent decades. Various technologies such as WiFi, Bluetooth, Ultra-Wideband (UWB), and Radio-Frequency Identification (RFID) have been used for indoor navigation and localization. However, most of these existing methods fail to provide a reasonable solution to key challenges such as implementation cost, accuracy, and extendibility. In this paper, we propose a low-cost and extendable framework for indoor navigation. We use simple markers printed on paper and placed on the ceilings of the building. These markers are detected by a smartphone’s camera, and the audio and visual information associated with them is used to guide the user. The system finds the shortest path between any two arbitrary nodes for user navigation. In addition, it is extendable, with the capability to cover new sections by installing new nodes at any place in the building. The system can be used to guide blind people, tourists, and new visitors in an indoor environment. The evaluation results reveal that the proposed system can guide users toward their destination in an efficient and accurate manner.
Keywords: augmented reality; indoor navigation; automatic path generation; marker tracking

1. Introduction

Indoor navigation systems are growing rapidly alongside emerging technologies. Typically, these systems are used for the assistance of disabled or aged people, robot path planning, AR gaming, tourist guidance, and training [1,2]. Indoor navigation systems are either infrastructure-dependent [3,4], using sensors embedded in the environment for user tracking, or infrastructure-independent [5]. Typically, a user navigating in an indoor environment needs two types of information: his/her own position and a path toward a specific destination [5]. A recent study [6] identified the following challenges for user navigation and localization in large-scale environments.
Accuracy and continuity: The accuracy and continuity of location estimates are important, especially for visually impaired people. For real-time guidance, a localization accuracy of about two meters is desired. A higher localization error could mislead a user onto a wrong path or cause a collision in the environment.
Scaling and extendibility: A number of previous methods are available for single-story and 2D-plan environments [7]. However, many buildings such as shopping malls, universities, and hospitals are multi-story. User localization in such multi-story buildings is more challenging, for example, localization across floors and during floor transitions. Similarly, extending an existing indoor system to new areas, such as a new floor or newly installed rooms, is a further challenge where existing methods often fail.
Signal strength: Some state-of-the-art methods rely on signal processing. These methods suffer from signal issues; for example, some devices may receive weaker signals than others.
Computational cost and efficiency: Computational cost is another challenging issue, especially for large-scale buildings. Efficient methods are required to accurately localize a user in real time.
Motion recognition: Recognizing a user from his/her walking style, or detecting the user’s steps during walking, is also challenging. It helps in accurate localization and in estimating the time to reach a destination. However, if the internal sensors miss steps, a noticeable localization error may result.
Delay detection: Signal delay is also a challenging issue in indoor navigation. Delayed data may mislead a user, particularly at decision points such as corridor intersections, stairs, and entrances.
In addition to the aforementioned challenges, other issues include installation cost (money, time, space, weight, and energy), complexity (software and hardware), and robustness [8]. Different methods have been proposed to address these issues; however, there is no generic system that covers most of them. For example, the Global Positioning System (GPS) is considered the de facto standard and an ideal solution for outdoor navigation and user tracking [9]. However, in large multi-floor indoor environments, GPS signals become weak and lose accuracy and robustness. Other state-of-the-art systems have also made significant progress in indoor navigation, but most share common limitations, including high cost, low accuracy, signal issues, difficulty of installation and use, specificity to a particular building (not generic), and lack of extendibility. For example, infrastructure-dependent systems [3,4] incur a high cost, while the cascaded deep neural network (CDNN) [10] suffers from computational complexity and limited accuracy.
A realistic solution is the use of ARToolKit [11] markers, i.e., black-and-white images printed on paper, placed on the ceiling, and tracked by a video camera. This approach was proposed in a previous work [12] of this project. However, that method [12] was specific to blind users, not extendable, used manual path planning, and loses robustness if the light is not optimal. In addition, users had to carry a laptop, which is inconvenient while navigating in an indoor environment.
To the best of our knowledge, there is no mechanism for user guidance and navigation in indoor environments that provides an optimal solution to the aforementioned issues. Therefore, a robust system that covers most of these issues is in high demand.
In this paper, we present a simple, easy-to-use, low-cost, and generic system for navigation in complex indoor environments. The system is based on ARToolKit [11] markers placed on the ceiling of the building and recognized with a smartphone’s camera. These markers are associated with location information, audio/textual information, and connectivity with other markers in the building. The system can be used by blind people and new visitors, including tourists and guests, to find their destination in an unfamiliar indoor environment. The proposed system works in three different modes: it can guide a user from a source to a destination via the shortest path (guided mode); a user can check his/her current position (free mode); and the administrator can extend the number of markers at any place (admin mode). The proposed system is generic and can be implemented in any building. It has the capability of extension to new areas, such as a new floor in the building. The extension is achieved by installing new markers in the environment and building their connections with existing markers. The system was implemented and evaluated in the new academic building of the University of Malakand, Chakdara 18800, Dir Lower, Pakistan. The evaluation results reveal that the proposed system can guide users toward their destination in an efficient and accurate manner. The system was evaluated with the system usability scale (SUS) questionnaire [13], as well as another simple questionnaire. The subjective results showed a significant achievement in system usability; the calculated SUS score was 92.0. The application is freely downloadable from the link:
In summary, we have the following contributions.
  • We propose a smartphone-based indoor navigation system with automatic path generation and user guidance in audio/textual form.
  • The proposed system is efficient, low-cost, accurate, easy-to-install, and easy-to-use.
  • The system is implemented as an extendable Android application, which allows the building administrator to manage floor plans and to add or delete nodes (fiducial markers) with corresponding audio/textual information. It is generic and can be implemented in any arbitrary indoor environment.
  • We evaluated the proposed system with users using four different paths of navigation in an indoor environment, and found it accurate and efficient.
The rest of the paper is organized as follows. Section 2 presents basic terms, including augmented reality, fiducial markers, and ARToolKit. Section 3 presents the related work. Section 4 presents our system and its modules. Section 5 presents the system evaluation and experimental results. Section 6 discusses the results and concludes the paper with future plans.

2. Preliminaries and Definitions

In this section, we describe basic terms, including Augmented Reality (AR) and different toolkits used for AR marker tracking.

2.1. Augmented Reality

Augmented Reality (AR) superimposes virtual information over the real environment to provide additional information to viewers. AR is used in many fields such as industrial manufacturing, medical visualization, entertainment, consumer design, education, training, direction finding, object identification, location-oriented communication, aircraft localization and pilot assistance, military aircraft navigation, and others [14].

2.2. Fiducial Markers and ARToolKit

Fiducial markers are pattern images printed on paper and tracked by a video camera for pose estimation, virtual overlay, human-computer interaction, or other purposes. Different toolkits are available for marker tracking, such as ARTag [15], ARToolKit [11], ArUco [16,17], and ARToolKitPlus [18]. Figure 1 shows different examples of ARToolKit markers.
Marker-based tracking is commonly used in AR, where predefined markers are placed in the real scene for pose estimation and for overlaying virtual objects [20]. ARToolKit [11] uses markers to translate and rotate virtual objects over the real-world environment [21].
ARToolKit is an open-source, well-documented, and easy-to-use marker tracking system that is considered among the pioneering toolkits for marker-based AR applications. Other toolkits, such as ARTag and ARToolKitPlus, were introduced later [22]. The National Research Council of Canada introduced ARTag, which is also gaining popularity in recent AR systems due to its improved performance [23,24]. Most toolkits use square-shaped markers, which simplify pose calculation [25]. However, some marker tracking systems also use other marker shapes instead of a square [26].

3. Related Work

Indoor navigation systems fall into two main categories: infrastructure-dependent [27] and infrastructure-independent [5] systems. Infrastructure-dependent systems use physical sensors embedded in the environment, whereas infrastructure-independent systems follow other alternatives such as computer vision and visible light communication (VLC). Wireless networks and computer vision have a number of applications in indoor navigation and path planning. Further categories include dead reckoning techniques and VLC-based methods [28,29]. Dead reckoning techniques use a variety of sensors such as the accelerometer, gyroscope, compass, and magnetometer. Wireless methods include satellite GPS signals [30,31], Near-Field Communication (NFC) [32], Infrared (IR), Radio-Frequency Identification (RFID) [33,34], and Bluetooth/WiFi [35,36]. Computer-vision-based methods, being low-cost, accurate, and easy to install, are also attracting attention [12,37]. Computer-vision-based navigation systems use either marker-less or marker-based methods [38]. In this section, we review the existing indoor navigation systems. A recent survey [39] is recommended for more details about indoor localization techniques and technologies.

3.1. Wireless Networking for Indoor Navigation

Several researchers have used wireless techniques to help users navigate in indoor environments. These works rely heavily on the installation of networking infrastructure such as access points, beacons, and other sensors. Bluetooth, WiFi, and RFID sensors have been used for indoor guidance of sightless users with the help of audio instructions [40,41]. An NFC-based indoor navigation system [32] is another approach, which stores map information on a server; location information is retrieved by touching an NFC tag with a smartphone. The main limitation of this approach is that the user has to manually search for and touch the next NFC tag on the path.
Infrared and magnetic sensors are also used for user navigation. These sensors detect unique location-based codes attached to the ceiling [42]. When the user enters a building, the building map is downloaded to a handheld IR device via Bluetooth, and voice commands are used to retrieve location information from the system. This solution relies on a large IR sensor infrastructure deployed within a building; moreover, users find movement difficult, as they have to carry bulky IR devices while walking. Similarly, GPS signals are also used for user tracking in large indoor environments like single-roof houses, schools, and high buildings. GPS may lead to incorrect tracking for different reasons, such as the use of nonstandard building materials, a low signal-to-noise ratio, and poor satellite signal reception [43].
Dead reckoning techniques are also used for user localization and navigation by calculating a new position from a previously determined one. One method calculates the user’s steps and direction toward a particular destination using the smartphone’s accelerometer and compass [44]. However, such methods have higher error rates due to inaccurate estimation of step length. The use of a magnetometer and accelerometer in conjunction with WiFi is another approach, which guides a user by superimposing directional arrows on a scene picture [45]. Similarly, pedestrian dead reckoning systems aided by the smartphone’s accelerometer and gyroscope are also used [46]. However, such a system needs user intervention to submit a building map manually; therefore, neither its usability nor its accuracy is satisfactory.

3.2. Computer Vision Applications in Indoor Navigation

Computer vision applications for indoor navigation and positioning are available as marker-based or marker-less methods. In marker-based methods, plain markers are fixed on the floor, ceiling, or walls of the indoor environment and captured using the video stream from a camera device. Marker-less techniques gather features from a video stream, such as corners, walls, and objects, and compute a path based on those features.
ARToolKit markers tracked with a camera connected to a laptop have been used for indoor navigation [12,47,48,49]. Huey and Sebastian [47] used a predefined two-dimensional map fed into the system, which is augmented with directional arrows using a route planner algorithm. The paths are not automatically generated, as the map and its node matrix have to be manually updated from the floor plan of the building. Audio information is played when a marker is detected by the camera. Carrying a laptop for guidance is usually impractical for the user.
Similarly, overlaying location information on the detected markers is another aid for indoor navigation [48]. This navigation system uses a head-mounted camera attached to a tablet for marker recognition and calculates the user’s position from the detected markers. It also helps in finding the shortest path toward the user’s destination. However, this system uses a WiFi network to communicate with a remote server for marker recognition and location retrieval, which may reduce efficiency. Another approach is a wrist-mounted touch screen connected to a laptop used for marker tracking [49]. It gives accurate results under normal light conditions. However, carrying a laptop while walking is difficult, and the system lacks map-generation capability, as it requires manual editing of map coordinates.
Ebsar [50] is another assistance tool for blind navigation in indoor environments. It guides blind users via an optical head-mounted display (Google Glass). It tracks the user’s movement and builds a map, which is uploaded to a server. It generates QR codes for various points of interest such as rooms, offices, and restrooms. Voice information is provided in Arabic. Subjective analysis proves the significance of Ebsar in terms of usability, effectiveness, and ease of use. However, the use of Google Glass is an additional burden for the user and is not widely available to common people.
Figure 2 shows a previous work [12] of our current project. It works as a desktop application based on ARToolKit markers detected with a webcam attached to a laptop. Markers are deployed inside a building, and their connectivity is established manually using hardcoded entries in the application’s database, along with auditory information about each marker. A blind user can then navigate through the building by detecting the markers with the webcam and receiving audio information through headphones. This work provided a baseline for our current work; it has the advantage of low cost and works robustly under optimal light conditions. However, it has a few limitations that we address in our current work. First, the system [12] was implemented on a laptop, which is difficult to carry while navigating the building. Second, the paths are added to the application manually via hard-coding, which makes it hard to extend or update the current path setup; there is no extendibility for installing new markers after the first installation. The system may also fail if the environmental light is insufficient for marker detection.

3.3. Smartphone-Based Indoor Navigation

Smartphone-based applications are common in indoor navigation, as they allow free user mobility. Murata et al. [6] identified six challenges mainly concerned with user mobility and localization in large-scale indoor environments (see Section 1). They proposed a new system to handle these issues, aimed at accurate and efficient user localization in large-scale indoor environments. It consists of a series of steps to improve probabilistic localization algorithms, using the inertial sensors of the smartphone and the Received Signal Strength (RSS) from Bluetooth Low Energy (BLE) beacons. The results show significant accuracy, efficiency, and independent user mobility. However, the system is specific to visually impaired persons.
BLE beacons are also used for indoor localization via a type-2 fuzzy-logic-based fingerprinting algorithm [51], which calculates the geometric distance between a beacon and a fingerprint point. The results show that the algorithm achieves accurate localization with precise navigation. However, it requires pre-configuration for fingerprint localization, and for large-scale environments it requires more beacons for accurate localization. A CDNN can also be used to collect data from a smartphone for indoor navigation [10]. It uses several deep neural networks (DNNs) in a tree structure with independent nodes. The results were satisfactory, and it outperforms a number of existing methods. However, it suffers from space and computational complexity, especially in the training phase of each DNN.
Another smartphone-based indoor navigation system uses custom 2D colored markers and an accelerometer for step detection [52]. Colored markers are printed on plain paper and placed at key points such as entrances, rooms, offices, and intersections inside the building. However, the exact position of each marker in the building is recorded offline, so the indoor paths are built manually. The smartphone’s accelerometer computes the distance between markers. The system proves to be scalable and simple, but it has several limitations: poor detection of colored markers in low light, no support for multi-floor buildings, and inaccuracy in measuring steps with the accelerometer.
A number of dead reckoning techniques [44,46] use smartphones for indoor navigation. Particle filter (PF)-based indoor localization is another approach, generally used for a group of users [53]. Here, anchor and mobile emitters are used for RSS measurements with a centralized cooperative PF. Each user in the group receives radio frequency (RF) signals via his/her smartphone; the signals are generated by fixed-location beacons (anchors) in the indoor environment. The results show considerable accuracy. However, the approach is limited to small groups of users.
VLC [29] is another approach that calculates the user’s location from data received in on-off keying (OOK) format. The OOK data is transmitted by light-emitting diodes (LEDs) and received by a smartphone’s camera. The method gives accurate results. However, the installation cost increases because it requires LEDs; large-scale environments in particular need more of them.

3.4. Problems with State-of-the-Art

The literature review indicates that a number of methods have been proposed for indoor navigation. Each method attempts to solve particular issues while retaining other limitations. There has been significant progress toward accurate and robust localization. However, unlike outdoor navigation, indoor navigation methods still face some common challenges that existing methods fail to address, including installation cost, usability, ease of use, configuration complexity, extendibility, and applicability to arbitrary buildings in a generic way. Existing methods suffer from various limitations, such as high cost, restriction to a specific building, and dependency on factors such as light intensity and signal strength. Similarly, the use of additional hardware or special sensors increases both the cost and the user’s fatigue.

4. Our Method

The main goal of this study is to design a system that is easy to deploy, low-cost, and produces accurate results with comparatively little computing power. Figure 3 illustrates a block diagram of the proposed system, in which a user employs the mobile application to interact with the fiducial markers and receives location-based guidance information in auditory form. This information is stored as a textual description of the location, which is converted to audio using a text-to-speech module. If a text-to-speech module is not available for a certain language, we store this information as direct audio files recorded by the building administrator.
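The fallback between text-to-speech and pre-recorded audio described above can be sketched as follows. This is a hypothetical illustration rather than the application's actual code; the marker record fields `text`, `lang`, and `audio_file` are assumed names.

```python
def guidance_output(marker, tts_languages):
    """Decide how to voice a marker's guidance information: use
    text-to-speech when the marker's language is supported, otherwise
    fall back to the administrator-recorded audio file.

    `marker` is a dict with assumed fields 'text', 'lang', 'audio_file'.
    """
    if marker.get("text") and marker.get("lang") in tts_languages:
        return ("tts", marker["text"])          # synthesize speech on the fly
    if marker.get("audio_file"):
        return ("audio", marker["audio_file"])  # play the recorded file
    return ("none", None)                       # no guidance available

# Example: English is supported by TTS, the local language is not
supported = {"en"}
m_en = {"text": "Office on your right", "lang": "en", "audio_file": "m5.wav"}
m_local = {"text": "", "lang": "ps", "audio_file": "m6.wav"}
```

In practice, `guidance_output(m_en, supported)` selects synthesized speech, while `guidance_output(m_local, supported)` falls back to the recorded file.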
The system has the following key features.
  • We designed a low-cost navigation system that uses simple fiducial markers. The markers are printed on a plain paper, and placed on the ceilings of the building near different places such as offices, stairs, rooms, and corridors (see Figure 4).
  • The system automatically generates paths by detecting the fiducial markers with the smartphone camera and creating a graph in the phone that connects the markers.
  • The system has audio/textual information played/displayed to guide the user upon the recognition of each marker.
  • The user is guided toward a given destination by following a shortest path inside a single or multi-floor building.
  • The system is dynamically extendable. It provides a way to edit an already generated path, and to extend it for incorporating newly deployed markers in the building.
Figure 5 shows different screenshots of the smartphone application, describing the various modules and modes of the proposed system. In administrative mode, the system requires login information; the administrator can perform different activities such as extending markers, deleting markers, and updating maps. Free mode lets end-users access the markers via their smartphones and navigate the building with the guidance provided by the application. The different modules of the proposed system are described in the following subsections. Some of these modules are concerned with installation, while others are executed on the user’s side.

4.1. Algorithm Overview and Marker Placement

We have tested our system at the University of Malakand, Pakistan. However, the proposed system is generic and can be applied in any building. In this regard, we have designed 512 markers, which can be used to mark different nodes in a building. The first step is system installation by the administrator, who downloads the markers, prints them, and places them at different locations.
The markers should be placed parallel to the user’s path (see Figure 6). All the markers are then scanned via our algorithm, which allows the administrator to follow simple instructions and prepare the application for a specific building for the end-users. Marker scanning by the administrator has two phases. The first scan establishes the interconnectivity among the markers. In the second scanning phase, audio information is added for the corresponding locations; this information can be an audio file or text (to be converted via text-to-speech). Furthermore, map information is also added in the second scan.

4.2. Path Generation and Augmentation

The path generation process is an installation step that connects all the markers installed in the building and identifies the indoor paths. Path generation works as follows.
ARToolKit markers are prepared, printed on plain paper, and stuck on the ceiling of the building at different locations such as offices, laboratories, stairs, corridors, and restrooms. With our Android application installed on a smartphone, the user (administrator) scans these markers by traversing the building. The markers are detected one by one, and each marker is added as a new node in the floor graph. On the detection of a subsequent marker, it is connected to the previous marker. Assuming that the algorithm has tracked a previous marker m1 and is currently tracking marker m2, the steps for guiding a user along the path are as follows.
  • θ ← Angle(Camera_y, Marker_y)
  • Switch(θ)
    Case 0°: “Direct straight”
    Case 90°: “Direct right”
    Case 180°: “Direct back”
    Case 270°: “Direct left”
  • End.
Suppose we have detected the first marker m1 and created its node in the graph. Upon detection of the next marker m2, the application checks the angle θ between the y-axis of the camera and the y-axis of the marker. The θ does not need to match these nominal values exactly; instead, we accept a tolerance of −45° to +44° around each nominal direction when calculating the direction. For example, to connect a new marker in the left direction (i.e., 270°) to the previous marker, we consider 225° ≤ θ ≤ 314°. The marker interconnection is depicted in Figure 7. Figure 4 shows an image of the corridor with markers deployed on the ceiling adjacent to each key-point location.
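The angle-to-direction mapping with its tolerance bands can be sketched as follows. This is a minimal illustration, not the authors' Android code; the function name is an assumption.

```python
def direction_from_angle(theta):
    """Map the camera-to-marker angle theta (degrees) to a guidance
    direction, using a tolerance of -45 to +44 degrees around each
    nominal direction (0, 90, 180, 270), as in the 'left' example:
    270 degrees covers 225 <= theta <= 314.
    """
    theta = theta % 360
    if theta >= 315 or theta <= 44:
        return "Direct straight"   # nominal 0 degrees
    elif 45 <= theta <= 134:
        return "Direct right"      # nominal 90 degrees
    elif 135 <= theta <= 224:
        return "Direct back"       # nominal 180 degrees
    else:                          # 225 <= theta <= 314
        return "Direct left"       # nominal 270 degrees
```

The four bands partition the full circle, so every measured angle maps to exactly one of the four guidance directions.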
Similarly, the administrator also needs to install the auditory and textual information. In this regard, all the installed markers are traversed again with the smartphone’s camera. This audio and textual information is used to guide a user inside the building for navigation and localization. The path augmentation procedure works as follows.
When the application detects a marker, the user is asked to provide the corresponding audio or textual information to be stored in the database. In addition, the distances between consecutive markers are also stored. Figure 8 depicts the path generation and augmentation process in a block diagram. The marker database created by the path generation algorithm is then augmented with auditory and textual details using the path augmentation procedure.
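The two scanning phases can be summarized in a small sketch: path generation adds nodes and weighted edges as consecutive markers are detected, and path augmentation attaches the guidance information. The class and method names here are assumptions for illustration, not the authors' implementation.

```python
class FloorGraph:
    """Minimal sketch of the floor graph built during installation:
    each detected marker becomes a node, and consecutive detections
    become weighted, bidirectional edges."""

    def __init__(self):
        self.edges = {}   # marker_id -> {neighbour_id: distance}
        self.info = {}    # marker_id -> audio/textual description

    def add_marker(self, marker_id, prev_id=None, distance=1.0):
        """First scanning phase: add a node and connect it to the
        previously tracked marker, recording the stored distance."""
        self.edges.setdefault(marker_id, {})
        if prev_id is not None:
            self.edges[prev_id][marker_id] = distance
            self.edges[marker_id][prev_id] = distance

    def augment(self, marker_id, description):
        """Second scanning phase: attach guidance info to a marker."""
        self.info[marker_id] = description

# Example: three markers scanned in sequence along a corridor
g = FloorGraph()
g.add_marker(1)
g.add_marker(2, prev_id=1, distance=5.0)
g.add_marker(3, prev_id=2, distance=4.0)
g.augment(2, "Office on your right")
```

After both phases, the graph holds the connectivity and distances needed for shortest-path routing, and each node carries its guidance text or audio reference.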

4.3. Path Extension

Path extension is needed when all the markers in a building have been generated and augmented, and new markers must be added to cover additional areas. These new markers are added to the existing floor graph with the path extension algorithm.
Here, the administrator executes the path extension module. This module allows the administrator to scan an existing marker as the starting point where he/she wants to add the new marker(s), followed by the list of new markers. Finally, if the new markers are installed inside the marker loop, the administrator completes the extension by scanning a second existing marker from among the previously installed ones; alternatively, if the new markers are at the end of the previously installed markers, he/she presses “complete extension”.
As shown in Figure 9a, we deployed new markers in the building, depicted as circles with gray shading. Using the path extension interface of the Android application, we select the marker (id: 7) and move toward the new marker (id: 8). In this manner, we move along the new path and scan each of the markers to connect it to the existing path.
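The splice performed by the path extension step can be sketched as a standalone function over an adjacency map (marker_id → {neighbour_id: distance}). The function name and the uniform default distance are assumptions for illustration only.

```python
def extend_path(edges, start_id, new_ids, end_id=None, distance=1.0):
    """Sketch of path extension: chain newly installed markers onto an
    existing adjacency map `edges`.

    `start_id` is the existing marker scanned first; `end_id` is the
    second existing marker if the new markers close a loop, or None if
    they extend a dead end ("complete extension").
    """
    chain = [start_id] + list(new_ids)
    if end_id is not None:
        chain.append(end_id)
    for a, b in zip(chain, chain[1:]):      # connect consecutive markers
        edges.setdefault(a, {})[b] = distance
        edges.setdefault(b, {})[a] = distance
    return edges

# Example matching Figure 9a: extend from existing marker 7
# through new markers 8 and 9 at the end of the path.
edges = {7: {}}
extend_path(edges, 7, [8, 9])
```

Because the new nodes join the same adjacency map used for routing, the shortest-path computation picks them up without any further configuration.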

4.4. User Guidance

Guiding a user inside a building is the main objective of this research. The user guidance procedure works as follows.
The user selects a destination from a list (see Figure 5) and is directed toward the nearest marker m1. Based on the connectivity of the marker nodes in the floor graph, the application determines the shortest path to the destination starting from m1, using Dijkstra’s shortest path algorithm [54]. On this path, for the next marker m2, the application uses the marker connection information (Figure 7) to guide the user. On detecting each marker, the application gives audio/textual information to guide the user until he/she reaches the final destination; new guidance information is announced dynamically with the detection of each marker on the calculated path. Figure 10 shows the guidance/directions given after detecting each marker on the path while traveling from one node to another. The application may fail to detect a marker if there is not enough light in the environment [55]. To solve this issue, the smartphone’s torch is automatically switched on when the light falls below the optimal threshold. The lower threshold for light intensity is 55 foot-candles (FC), as identified in our previous study [55].
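The routing step is standard Dijkstra [54]; a minimal sketch follows, assuming the floor graph is stored as an adjacency map (marker_id → {neighbour_id: distance}). This illustrates the algorithm, not the authors' Android code.

```python
import heapq

def shortest_path(edges, source, dest):
    """Dijkstra's algorithm over the floor graph; returns the ordered
    list of marker ids the user should follow, or None if the
    destination is unreachable."""
    dist = {source: 0.0}
    prev = {}
    pq = [(0.0, source)]
    visited = set()
    while pq:
        d, node = heapq.heappop(pq)
        if node in visited:
            continue
        visited.add(node)
        if node == dest:
            break
        for nbr, w in edges.get(node, {}).items():
            nd = d + w
            if nd < dist.get(nbr, float("inf")):
                dist[nbr] = nd
                prev[nbr] = node
                heapq.heappush(pq, (nd, nbr))
    if dest not in dist:
        return None
    path, node = [], dest          # walk predecessors back to the source
    while node != source:
        path.append(node)
        node = prev[node]
    path.append(source)
    return path[::-1]
```

At each marker along the returned path, the application would then emit the direction and audio/textual guidance stored for that node.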

5. Experimental Study and Results

We selected the first and second floors of the Academic Block building of the University of Malakand, Pakistan for demonstration and testing of the system. We deployed ARToolKit markers along the corridors, on the ceilings adjacent to key points such as classrooms, offices, and stairs. Figure 11 shows the east-side portion of the second floor of the building (i.e., the Department of Computer Science). The west side of the second floor has a similar structure, and the first floor has a similar design to the second floor.
We selected four different paths in the building for testing the proposed algorithms. There are five staircases at various locations between the first and second floors of the building. Table 1 provides the details of the four selected paths, which are shown in Figure 11 and Figure 12 and described as follows.
  • Path-1: goes across the corridor of the first floor, from an office (ID: 7) to a classroom (ID: 22) on the same floor.
  • Path-2: starts at a classroom (ID: 24) on the first floor and reaches an office (ID: 75) on the second floor via stairs (ID: 72-73).
  • Path-3: starts at a classroom (ID: 25) on the first floor and leads to a classroom (ID: 34) on the second floor via stairs (ID: 46-42).
  • Path-4: starts at a classroom (ID: 25) and leads to a hall (ID: 75) on the second floor via stairs (ID: 51-55).

5.1. Guidance Test

We evaluated the system in experiments with ten different users (aged between 22 and 38 years) navigating along four different paths, as shown in Figure 12. Unfortunately, we could not recruit blind users; therefore, to reduce familiarity with the building and its floor map, we covered the users’ eyes with a black cloth. Figure 13 shows a user navigating in the building during our experiments. All users navigated all four paths (Figure 12). During the experiments we recorded the time taken, the number of markers missed, and the number of false detections. Figure 14 shows the average time taken for guidance along each selected path. Details of the user guidance evaluation on each path are given in Table 2.

5.2. Evaluation and Results

During the system assessment, we evaluated several factors: (a) the time to reach the destination from the starting key point, (b) whether any marker along the path was missed by the detection, and (c) whether any marker was misidentified by the algorithm.
Figure 14 shows the average time taken along each path, whereas Table 2 shows the statistical results recorded during the experiments. Regarding missed detections, no marker was left undetected while the users moved with a normal gaze along the paths. User-2 experienced two false detections during a run on Path-1, where the marker was not clearly identified and the direction was therefore calculated incorrectly. This happened because of low light on that specific marker: the ambient light at the smartphone itself was within the normal range, so the torch was not switched on. Subjective analysis: To analyze the users’ opinions about the proposed system, we conducted a subjective study using the standard System Usability Scale (SUS) [13]. SUS gives a single value that represents the overall usability of the system. It uses ten statements about system usability (see Table 3), answered by the users with numeric values from 1 to 5, where 1 indicates strong disagreement and 5 indicates strong agreement. In addition, we used another simple questionnaire containing five more general questions.
Table 3 shows the results collected using the ten statements of the SUS questionnaire [13]. For odd-numbered statements the score is the scale value minus one, whereas for even-numbered statements it is five minus the scale value. Each row in Table 3 presents the opinions of all ten users about one particular statement. For example, in response to statement 1, nine users rated it 5 and one user rated it 4; the average score is therefore [((5 − 1) × 9) + ((4 − 1) × 1)]/10 = 3.9. Similarly, for statement 2, seven users answered 1 and three users answered 2, so the average is [((5 − 1) × 7) + ((5 − 2) × 3)]/10 = 3.7, where 10 is the total number of users. In this manner, the total of the average scores was 36.8 and the SUS score was 36.8 × 2.5 = 92.0.
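The SUS scoring rule described above can be written compactly as follows. The function is a generic sketch of the standard SUS computation [13]; the sample response list is illustrative, not the actual data from Table 3.

```python
def sus_score(responses):
    """Compute the System Usability Scale score on a 0-100 scale.

    responses: list of per-user answer lists, each with ten values in 1..5
    (statement 1 first). Odd-numbered statements contribute (value - 1),
    even-numbered statements contribute (5 - value); the per-user sum is
    averaged over users and multiplied by 2.5.
    """
    total = 0
    for user in responses:
        for i, value in enumerate(user, start=1):
            total += (value - 1) if i % 2 == 1 else (5 - value)
    return (total / len(responses)) * 2.5

# One hypothetical user giving the best possible answers: 5 on every
# odd-numbered (positive) statement, 1 on every even-numbered (negative)
# statement. Each statement then contributes 4, so the score is 40 * 2.5.
best = [5, 1, 5, 1, 5, 1, 5, 1, 5, 1]
print(sus_score([best]))  # 100.0
```

With the full ten-user response matrix underlying Table 3, this computation yields the reported total of 36.8 per user and an SUS score of 92.0.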
In addition, we asked five simple questions covering system reliability, user satisfaction, usability, response time, and freedom of navigation. The participants rated these factors as poor, satisfactory, good, very good, or excellent based on their experience in the experiments. We asked each user the following questions; the results are shown in Table 4 and Figure 15.
Q. 1: How do you rate the reliability of the system?
Q. 2: How satisfied are you with the guidance information for reaching your destination?
Q. 3: How do you rate the ease of use of the proposed system?
Q. 4: How do you rate the response time?
Q. 5: How free did you feel while navigating with the proposed system?

6. Discussion and Conclusions

Typically, the main objective of an indoor navigation and localization system is to localize users in the indoor environment robustly and accurately, and to guide them from source to destination efficiently and easily. In our experiments, we asked the users to navigate through four different paths. Robustness was assessed by counting missed detections and false detections of the markers, while efficiency was measured by the time taken to reach a destination. Subjective analysis helped evaluate other qualitative aspects of the system.
Table 2 shows the results of our experiments, in which ten users navigated through four different paths. There were only two false detections in all the experiments, giving a false-detection rate of 3.1% (2 out of 64 markers were falsely identified) and a missed-detection rate of 0%. Similarly, the time taken by each user on each path was satisfactory. The results reveal that the proposed methodology is efficient, robust, and accurate. User satisfaction, system usability, ease of use, and freedom of mobility were evaluated subjectively by interviewing the users with a questionnaire.
Table 4 shows the results of the subjective analysis. It shows that 90 % of the users rated the proposed system reliability as excellent and 10 % as very good. The participants were asked about their satisfaction while relying on the proposed system and the guidance provided for indoor navigation. The results show that 70 % rated the user satisfaction as excellent, 20 % as very good, and 10 % as good. Similarly, the usability was also rated as very good ( 20 % ) and excellent ( 80 % ), and the response time was rated as good ( 20 % ), very good ( 20 % ) and excellent ( 60 % ). Similarly, the freedom in mobility was rated as satisfactory ( 30 % ), good ( 10 % ), very good ( 20 % ) and excellent ( 40 % ).
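The percentage figures above follow directly from the per-question counts in Table 4. A minimal sketch of the tally, using the Q.1 distribution as the worked example:

```python
from collections import Counter

SCALE = ["poor", "satisfactory", "good", "very good", "excellent"]

def rating_percentages(answers):
    """Convert one question's raw ratings into percentage shares per label."""
    counts = Counter(answers)
    n = len(answers)
    return {label: 100.0 * counts.get(label, 0) / n for label in SCALE}

# Q.1 (system reliability) from Table 4: nine "excellent" and one
# "very good" out of ten users -> 90% excellent, 10% very good.
q1 = ["excellent"] * 9 + ["very good"]
print(rating_percentages(q1))
```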
Similarly, the system was evaluated with the SUS questionnaire, which reflects its overall usability; SUS yielded a value of 92.0. We are not aware of any existing indoor navigation system that has been evaluated with SUS; however, a value of 92.0 generally indicates very high usability. The overall results indicate the effectiveness of the proposed method and its ease of use. The system was evaluated with a small sample of participants, and a different set of users may yield somewhat different results.

6.1. Limitations and Future Work

  • In our system, the user holds the smartphone in his/her hand, which is inconvenient while navigating in the environment. Figure 16 shows a common and convenient smartphone placement used in an existing method [56]. We plan to address this issue in future work.
  • Marker placement is still difficult, and the markers may affect the aesthetics of the building. To address this issue, we plan to use hidden markers or to track natural images.
  • We could not recruit visually impaired participants for our experiments. Furthermore, we did not implement any existing system for a direct comparison of results.
  • One limitation of our work is that it directs users in only four directions (forward, backward, left, and right), which requires a structured indoor environment in which markers are placed parallel to their corresponding paths (see Figure 6). We plan to relax this requirement in future work.

6.2. Conclusions

We proposed a novel approach to assisting users with indoor navigation and localization. The proposed solution is an automated system that can efficiently identify and generate paths inside a large building. Its deployment cost is very low compared with solutions provided by other researchers. Fiducial markers are printed on plain paper and pasted on the corridor ceilings, and are detected by a smartphone camera through an Android application. Audio/textual information is associated with each installed marker. The guidance module leads users to a destination along the shortest path in a single- or multi-story building. Testing the system with real users and evaluating the data showed that the overall navigation system is an efficient, accurate, and easy-to-use solution.

Author Contributions

S.U. initiated the main idea, S.N. implemented the software. D.K. wrote the paper. All the authors designed the experimental plan. S.U. and S.N. conducted the experiments, collected the results and reviewed the paper. S.U. supervised the project. All the authors finalized the paper.


This project is partially funded by IGNITE National Technology Fund (formerly known as ICT R&D Fund), Ministry of Information Technology, Government of Pakistan (ICTRDF/TR&D/2014/12).


We are thankful to the participants of the experiments. We are also thankful to the anonymous reviewers for their valuable comments.

Conflicts of Interest

The authors declare no conflict of interest.


The following abbreviations are used in this manuscript:
UWB	Ultra-wideband
RFID	Radio-frequency identification
AR	Augmented reality
RSS	Received signal strength
VLC	Visible light communications
OOK	On-off keying
LED	Light-emitting diode
GPS	Global positioning system
CDNN	Cascaded deep neural network
SUS	System usability scale
NFC	Near field communication


  1. Díaz-Vilariño, L.; Boguslawski, P.; Khoshelham, K.; Lorenzo, H. Obstacle-Aware Indoor Pathfinding Using Point Clouds. ISPRS Int. J. Geo-Inf. 2019, 8, 233. [Google Scholar] [CrossRef]
  2. Ganz, A.; Schafer, J.; Gandhi, S.; Puleo, E.; Wilson, C.; Robertson, M. PERCEPT Indoor Navigation System for the Blind and Visually Impaired: Architecture and Experimentation. Int. J. Telemed. Appl. 2012, 2012, 19:1–19:12. [Google Scholar] [CrossRef] [PubMed]
  3. Khoshelham, K.; Zlatanova, S. Sensors for Indoor Mapping and Navigation. Sensors 2016, 16, 655. [Google Scholar] [CrossRef] [PubMed]
  4. Wang, Y.T.; Peng, C.C.; Ravankar, A.A.; Ravankar, A. A Single LiDAR-Based Feature Fusion Indoor Localization Algorithm. Sensors 2018, 18, 1294. [Google Scholar] [CrossRef]
  5. Winter, S.; Tomko, M.; Vasardani, M.; Richter, K.F.; Khoshelham, K.; Kalantari, M. Infrastructure-Independent Indoor Localization and Navigation. ACM Comput. Surv. 2019, 52, 61:1–61:24. [Google Scholar] [CrossRef]
  6. Murata, M.; Ahmetovic, D.; Sato, D.; Takagi, H.; Kitani, K.M.; Asakawa, C. Smartphone-based localization for blind navigation in building-scale indoor environments. Pervasive Mob. Comput. 2019, 57, 14–32. [Google Scholar] [CrossRef]
  7. Lymberopoulos, D.; Liu, J.; Yang, X.; Choudhury, R.R.; Handziski, V.; Sen, S. A Realistic Evaluation and Comparison of Indoor Location Technologies: Experiences and Lessons Learned. In Proceedings of the 14th International Conference on Information Processing in Sensor Networks IPSN ’15, Seattle, WA, USA, 14–16 April 2015; ACM: New York, NY, USA, 2015; pp. 178–189. [Google Scholar]
  8. Liu, H.; Darabi, H.; Banerjee, P.; Liu, J. Survey of Wireless Indoor Positioning Techniques and Systems. IEEE Trans. Syst. Man Cybern. Part C (Appl. Rev.) 2007, 37, 1067–1080. [Google Scholar] [CrossRef]
  9. Abbott, E.; Powell, D. Land-vehicle navigation using GPS. Proc. IEEE 1999, 87, 145–162. [Google Scholar] [CrossRef]
  10. Hassan, M.R.; Haque, M.S.M.; Hossain, M.I.; Hassan, M.M.; Alelaiwi, A. A novel cascaded deep neural network for analyzing smart phone data for indoor localization. Futur. Gener. Comput. Syst. 2019, 101, 760–769. [Google Scholar] [CrossRef]
  11. Kato, H.; Billinghurst, M. Marker Tracking and HMD Calibration for a Video-based Augmented Reality Conferencing System. In Proceedings of the 2nd IEEE and ACM International Workshop on Augmented Reality (IWAR ’99), San Francisco, CA, USA, 20–21 October 1999; pp. 85–94. [Google Scholar]
  12. Zeb, A.; Ullah, S.; Rabbi, I. Indoor vision-based auditory assistance for blind people in semi controlled environments. In Proceedings of the 2014 4th International Conference on Image Processing Theory, Tools and Applications (IPTA), Paris, France, 14–17 October 2014; pp. 1–6. [Google Scholar]
  13. Brooke, J. SUS: A quick and dirty usability scale. In Usability Evaluation in Industry; Jordan, P.W., Thomas, B., Eds.; Taylor and Francis: London, UK, 1996; pp. 189–194. [Google Scholar]
  14. Aloor, J.J.; Sahana, P.S.; Seethal, S.; Thomas, S.; Pillai, M.T.R. Design of VR headset using augmented reality. In Proceedings of the 2016 International Conference on Electrical, Electronics, and Optimization Techniques (ICEEOT), Chennai, India, 3–5 March 2016; pp. 3540–3544. [Google Scholar]
  15. Fiala, M. ARTag, a Fiducial marker system using digital techniques. In Proceedings of the 2005 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR’05), San Diego, CA, USA, 20–25 June 2005; pp. 590–596. [Google Scholar]
  16. Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.; Medina-Carnicer, R. Generation of fiducial marker dictionaries using Mixed Integer Linear Programming. Pattern Recognit. 2016, 51, 481–491. [Google Scholar] [CrossRef]
  17. Garrido-Jurado, S.; Muñoz-Salinas, R.; Madrid-Cuevas, F.; Marín-Jiménez, M. Automatic generation and detection of highly reliable fiducial markers under occlusion. Pattern Recognit. 2014, 47, 2280–2292. [Google Scholar] [CrossRef]
  18. Wagner, D.; Schmalstieg, D. ARToolKitPlus for Pose Tracking on Mobile Devices. In Proceedings of the 12th Computer Vision Winter Workshop (CVWW’07), Paris, France, 4–6 February 2007; pp. 139–146. [Google Scholar]
  19. Kato, H.; Billinghurst, M.; Poupyrev, I. ARToolKit Version 2.33. Available online: (accessed on 27 August 2019).
  20. Sun, R.; Sui, Y.; Li, R.; Shao, F. The Design of a New Marker in Augmented Reality. In Proceedings of the Int. Conf. on Economics and Finance Research, Singapore, 26–28 February 2011; pp. 129–132. [Google Scholar]
  21. Vriends, T.; Coroporaal, H. Evaluation of High Level Synthesis for the Implementation of Marker Detection on FPGA. Master’s Thesis, Eindhoven University of Technology, Eindhoven, The Netherlands, 2011. [Google Scholar]
  22. Fiala, M. Comparing ARTag and ARToolkitPlus Fiducial Marker Systems. In Proceedings of the HAVE 2005-IEEE International Workshop on Haptic Audio Visual Environments and Applications, Ottawa, ON, Canada, 1–2 October 2005; pp. 148–153. [Google Scholar]
  23. Fiala, M. Designing highly reliable fiducial markers. IEEE Trans. Pattern Anal. Mach. Intell. 2010, 32, 1317–1324. [Google Scholar] [CrossRef] [PubMed]
  24. Wu, H.; Shao, F.; Sun, R. Research of quickly identifying markers on Augmented Reality. In Proceedings of the IEEE International Conference on Advanced Management Science (ICAMS), Chengdu, China, 9–11 July 2010; pp. 671–675. [Google Scholar]
  25. Siltanen, S.; Teknillinen Tutkimuskeskus, V. Theory and Applications of Marker-based Augmented Reality; VTT Science: Espoo, Finland, 2012. [Google Scholar]
  26. Owen, C.B.; Xiao, F.; Middlin, P. What is the best fiducial? In Proceedings of the First IEEE International Augmented Reality Toolkit Workshop, Darmstadt, Germany, 29 September 2002; pp. 98–105. [Google Scholar]
  27. Tran, H.; Mukherji, A.; Bulusu, N.; Pandey, S.; Zhang, X. Improving Infrastructure-based Indoor Positioning Systems with Device Motion Detection. In Proceedings of the 2019 IEEE International Conference on Pervasive Computing and Communications (PerCom), Kyoto, Japan, 11–15 March 2019; pp. 176–185. [Google Scholar]
  28. Ghassemlooy, Z.; Arnon, S.; Uysal, M.; Xu, Z.; Cheng, J. Emerging Optical Wireless Communications-Advances and Challenges. IEEE J. Sel. Areas Commun. 2015, 33, 1738–1749. [Google Scholar] [CrossRef]
  29. Li, Y.; Ghassemlooy, Z.; Tang, X.; Lin, B.; Zhang, Y. A VLC Smartphone Camera Based Indoor Positioning System. IEEE Photonics Technol. Lett. 2018, 30, 1171–1174. [Google Scholar] [CrossRef]
  30. Barnard, M. The Global Positioning System. IEE Rev. 1992, 38, 99–103. [Google Scholar] [CrossRef]
  31. Panzieri, S.; Pascucci, F.; Ulivi, G. An outdoor navigation system using GPS and inertial platform. IEEE/ASME Trans. Mechatron. 2002, 7, 134–142. [Google Scholar] [CrossRef]
  32. Ozdenizci, B.; Ok, K.; Coskun, V.; Aydin, M.N. Development of an Indoor Navigation System Using NFC Technology. In Proceedings of the 2011 Fourth International Conference on Information and Computing, Washington, DC, USA, 28–29 March 2011; pp. 11–14. [Google Scholar]
  33. Yelamarthi, K.; Haas, D.; Nielsen, D.; Mothersell, S. RFID and GPS integrated navigation system for the visually impaired. In Proceedings of the 2010 53rd IEEE International Midwest Symposium on Circuits and Systems, Seattle, WA, USA, 1–4 August 2010. [Google Scholar]
  34. Fallah, N.; Bekris, K.E.; Folmer, E. Indoor Human Navigation Systems: A Survey. Interact. Comput. 2013, 25, 21–33. [Google Scholar]
  35. Blattner, A.; Vasilev, Y.; Harriehausen-Mühlbauer, B. Mobile Indoor Navigation Assistance for Mobility Impaired People. Procedia Manuf. 2015, 3, 51–58. [Google Scholar] [CrossRef]
  36. Mahmood, A.; Javaid, N.; Razzaq, S. A review of wireless communications for smart grid. Renew. Sustain. Energy Rev. 2015, 41, 248–260. [Google Scholar] [CrossRef]
  37. Abu Doush, I.; Alshatnawi, S.; Al-Tamimi, A.K.; Alhasan, B.; Hamasha, S. ISAB: Integrated Indoor Navigation System for the Blind. Interact. Comput. 2016, 29, 181–202. [Google Scholar] [CrossRef]
  38. Rabbi, I.; Ullah, S. A Survey on Augmented Reality Challenges and Tracking. Acta Gr. 2013, 24, 29–46. [Google Scholar]
  39. Zafari, F.; Gkelias, A.; Leung, K.K. A Survey of Indoor Localization Systems and Technologies. IEEE Commun. Surv. Tutor. 2019, 21, 2568–2599. [Google Scholar] [CrossRef]
  40. Kasprzak, S.; Komninos, A.; Barrie, P. Feature-Based Indoor Navigation Using Augmented Reality. In Proceedings of the 2013 9th International Conference on Intelligent Environments, Athens, Greece, 18–19 July 2013; pp. 100–107. [Google Scholar]
  41. Xie, T.; Jiang, H.; Zhao, X.; Zhang, C. A Wi-Fi-Based Wireless Indoor Position Sensing System with Multipath Interference Mitigation. Sensors 2019, 19, 3983. [Google Scholar] [CrossRef] [PubMed]
  42. Mehta, P.; Kant, P.; Shah, P.; Roy, A.K. VI-Navi: A Novel Indoor Navigation System for Visually Impaired People. In Proceedings of the 12th International Conference on Computer Systems and Technologies, CompSysTech ’11, Vienna, Austria, 16–17 June 2011; pp. 365–371. [Google Scholar]
  43. Kjærgaard, M.B.; Blunck, H.; Godsk, T.; Toftkjær, T.; Christensen, D.L.; Grønbæk, K. Indoor Positioning Using GPS Revisited. In Pervasive Computing; Floréen, P., Krüger, A., Spasojevic, M., Eds.; Springer: Berlin/Heidelberg, Germany, 2010; pp. 38–56. [Google Scholar]
  44. Mulloni, A.; Seichter, H.; Schmalstieg, D. Handheld Augmented Reality Indoor Navigation with Activity-based Instructions. In Proceedings of the 13th International Conference on Human Computer Interaction with Mobile Devices and Services, Stockholm, Sweden, 30 August–2 September 2011; pp. 211–220. [Google Scholar]
  45. Lo, C.C.; Lin, T.C.; Wang, Y.C.; Tseng, Y.C.; Ko, L.C.; Kuo, L.C. Using Intelligent Mobile Devices for Indoor Wireless Location Tracking, Navigation, and Mobile Augmented Reality. In Proceedings of the IEEE VTS Asia Pacific Wireless Commun. Symposium (APWCS), Kaohsiung, Taiwan, 20–21 May 2010. [Google Scholar]
  46. Mohamed, A.; Adel Moussa, N.E.S. Map Aided Pedestrian Dead Reckoning Using Buildings Information for Indoor Navigation Applications. Positioning 2013, 4, 227–239. [Google Scholar]
  47. Huey, L.C.; Sebastian, P.; Drieberg, M. Augmented reality based indoor positioning navigation tool. In Proceedings of the 2011 IEEE Conference on Open Systems, Langkawi, Malaysia, 25–28 September 2011; pp. 256–260. [Google Scholar]
  48. Kim, J.; Jun, H. Vision-based location positioning using augmented reality for indoor navigation. IEEE Trans. Consum. Electron. 2008, 54, 954–962. [Google Scholar] [CrossRef]
  49. Kalkusch, M.; Lidy, T.; Knapp, N.; Reitmayr, G.; Kaufmann, H.; Schmalstieg, D. Structured visual markers for indoor pathfinding. In Proceedings of the First IEEE International Workshop Agumented Reality Toolkit, Darmstadt, Germany, 29 September 2002; pp. ART02:1–ART02:8. [Google Scholar]
  50. Al-Khalifa, S.; Al-Razgan, M. Ebsar: Indoor guidance for the visually impaired. Comput. Electr. Eng. 2016, 54, 26–39. [Google Scholar] [CrossRef]
  51. AL-Madani, B.; Orujov, F.; Maskeliunas, R.; Damaševičius, R.; Venčkauskas, A. Fuzzy Logic Type-2 Based Wireless Indoor Localization System for Navigation of Visually Impaired People in Buildings. Sensors 2019, 19, 2114. [Google Scholar] [CrossRef]
  52. Chandgadkar, A. An Indoor Navigation System for Smartphones; Department of Computer Science: London, UK, 2013. [Google Scholar]
  53. Seco, F.; Jiménez, A.R. Smartphone-Based Cooperative Indoor Localization with RFID Technology. Sensors 2018, 18, 266. [Google Scholar] [CrossRef]
  54. Dijkstra, E.W. A note on two problems in connexion with graphs. Numer. Math. 1959, 1, 269–271. [Google Scholar] [CrossRef]
  55. Khan, D.; Ullah, S.; Rabbi, I. Factors affecting the design and tracking of ARToolKit markers. Comput. Stand. Interfaces 2015, 41, 56–66. [Google Scholar] [CrossRef]
  56. Zhao, H.; Cheng, W.; Yang, N.; Qiu, S.; Wang, Z.; Wang, J. Smartphone-Based 3D Indoor Pedestrian Positioning through Multi-Modal Data Fusion. Sensors 2019, 19, 4554. [Google Scholar] [CrossRef] [PubMed]
Figure 1. Examples of ARToolKit markers [19].
Figure 2. Architecture of a previous ARToolKit markers based indoor navigation system [12].
Figure 3. Block diagram of the overall system for user guidance within an indoor environment.
Figure 4. Marker deployment in the corridor of the building.
Figure 5. Different functionalities of the proposed system. Left: Mode selection. Middle: Map selection. Right: Different locations in the building. The user will set his/her destination.
Figure 6. Marker placement in the building. (a): The marker is placed in parallel to the corridor (correct). (b): Wrong placement (not parallel).
Figure 7. Direction calculation during marker interconnection.
Figure 8. Path generation and augmentation process.
Figure 9. The extension mechanism. (a) Deploying new markers in the building. (b) Extending path-1 to include the new markers.
Figure 10. Guidance direction calculation at each marker when detected by camera.
Figure 11. The indoor environment for the experiments. Top: Second floor. Bottom: First floor. The four paths selected for experiments are highlighted with colors. Light green color represents path 1, dark green color represents path 2, brown color represents path 3, and the red color represents path 4.
Figure 12. Layout of the selected paths for the experiments.
Figure 13. A user navigating in the building using the proposed system. The user’s eyes are covered with a black cloth so that he/she cannot see anything and navigation is fully dependent on the proposed system.
Figure 14. Average time taken in guidance along all the paths.
Figure 15. Results collected from the users after performing the experiments.
Figure 16. A convenient way of placing and orienting the smartphone [56].
Table 1. Details of selected paths.
Path | Source Node: Floor (ID) | Destination Node: Floor (ID) | Total Markers | Total Distance (m)
1 | First Floor (7) | First Floor (22) | 14 | 35.0
2 | First Floor (24) | Second Floor (75) | 12 | 30.5
3 | First Floor (25) | Second Floor (34) | 18 | 39.0
4 | First Floor (25) | Second Floor (61) | 23 | 45.1
Table 2. Detail of user guidance evaluation on each path.
Path | User | Time Taken (s) | Missed Detections | False Detections
(a) Guidance along path-1 | 1 | 180 | 0 | 0
(b) Guidance along path-2 | 1 | 210 | 0 | 0
(c) Guidance along path-3 | 1 | 178 | 0 | 0
(d) Guidance along path-4 | 1 | 357 | 0 | 0
Table 3. Subjective results collected from ten users using SUS questionnaire [13]. The total of the average scores is 36.8 and SUS score is 36.8 × 2.5 = 92.0 .
Columns 1–5 give the number of users choosing each rating, from 1 (strongly disagree) to 5 (strongly agree).
# | Statement | 1 | 2 | 3 | 4 | 5 | Average Score
1 | I think I would like to use this system in any new indoor environment (if available). | 0 | 0 | 0 | 1 | 9 | 3.9
2 | I think the system is unnecessarily complex. | 7 | 3 | 0 | 0 | 0 | 3.7
3 | I think the system is easy to use. | 0 | 0 | 1 | 1 | 8 | 3.7
4 | I think that I would need the support of a technical person to be able to use this system. | 7 | 2 | 1 | 0 | 0 | 3.6
5 | Various functions in this system were well integrated. | 0 | 0 | 0 | 5 | 5 | 3.5
6 | I found too much inconsistency in this system. | 10 | 0 | 0 | 0 | 0 | 4.0
7 | I would imagine that most people would learn to use this system very quickly. | 0 | 0 | 0 | 4 | 6 | 3.6
8 | I think the system is very difficult to use. | 8 | 2 | 0 | 0 | 0 | 3.8
9 | I felt very confident using this system. | 0 | 0 | 0 | 5 | 5 | 3.5
10 | I needed to learn a lot of things before I could start navigating with this system. | 6 | 3 | 1 | 0 | 0 | 3.5
Table 4. Subjective results collected from the users after the experiments using a simple questionnaire.
Concerned Opinion | Total | Poor | Satisfactory | Good | Very Good | Excellent
Q.1: System reliability | 10 | 0 | 0 | 0 | 1 | 9
Q.2: Satisfaction from the guidance | 10 | 0 | 0 | 1 | 2 | 7
Q.3: Usability | 10 | 0 | 0 | 0 | 2 | 8
Q.4: Response time | 10 | 0 | 0 | 2 | 2 | 6
Q.5: Freedom in navigation | 10 | 0 | 3 | 1 | 2 | 4