# Exploration of Indoor Barrier-Free Plane Intelligent Lofting System Combining BIM and Multi-Sensors


## Abstract


## 1. Introduction

- (1) Mecanum wheels are deployed in the system, which improves localization accuracy at the hardware level;
- (2) In data acquisition, we treat localization and attitude separately, which effectively reduces the amount of estimation and calculation. Figure 1 shows the integration framework;
- (3) We adopt a simple and efficient ground-robot motion control method, suitable for Mecanum-wheel robots and indoor barrier-free environments;
- (4) We propose a system combining multi-sensors and a BIM laser lofting instrument. It expands the relevant market and provides a reference for future three-dimensional intelligent lofting in multiple scenes.

## 2. Related Work

## 3. General Design

#### 3.1. Robot Platform Design

#### 3.2. Mecanum Wheel

When the wheel rolls, a force **F** perpendicular to the centerline of the roller is generated. **F** can be decomposed into a lateral direction force **F**_x and a longitudinal direction force **F**_y, as shown by the dotted line. It can be seen that, unlike a conventional rubber wheel, the Mecanum wheel generates an extra component force **F**_x perpendicular to the forward direction.

Each of the four wheels produces a force **F**_1, **F**_2, **F**_3, and **F**_4 perpendicular to the centerline of its roller, and the forces are decomposed: **F**_xi is the lateral component of force **F**_i, and **F**_yi is the longitudinal component of force **F**_i, with **F**_xi = **F**_yi (i = 1, 2, 3, 4). Taking **F**_x1 + **F**_x2 and **F**_x3 + **F**_x4 as an example, the two parallel forces are equal in magnitude and opposite in direction, so the resulting force couple produces a pure rotation effect and makes the robot rotate. The cooperation of the four Mecanum wheels provides the robot with the 3 degrees of freedom necessary for omnidirectional motion in the horizontal plane.
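The 45° roller decomposition above can be sketched in a few lines. This is an illustrative helper, not from the paper; the function name and the roller angle parameter `gamma` are assumptions:

```python
import math

def roller_force_components(F, gamma=math.pi / 4):
    """Decompose the roller contact force F (roller angle gamma,
    45 degrees for a typical Mecanum wheel) into the lateral component
    F_x and the longitudinal component F_y. With gamma = 45 degrees the
    two magnitudes are equal, as stated in the text."""
    return F * math.sin(gamma), F * math.cos(gamma)

fx, fy = roller_force_components(10.0)  # equal components at 45 degrees
```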

#### 3.3. Inverse Kinematic Analysis

The position and heading angle of the robot are taken as state variables. The vehicle coordinate system ∑o_r is established based on point o_r, which is also the placement point of the prism. The pose of the robot is ${\mathit{P}}_{\mathrm{r}}={[\begin{array}{ccc}{x}_{\mathrm{r}}& {y}_{\mathrm{r}}& {\psi}_{\mathrm{r}}\end{array}]}^{\mathrm{T}}$, and the kinematic formula of each wheel is expressed in terms of these states, where V_i (i = 1, 2, 3, 4) is the linear velocity of wheel i, a is half the length of the car, and b is half the width of the car.
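The inverse kinematics described above can be sketched as follows. This is a minimal illustration under one common sign convention for 45° rollers (wheel order front-left, front-right, rear-left, rear-right); the exact sign pattern of the paper's matrix depends on how the rollers are mounted, so treat the Jacobian below as an assumption:

```python
import numpy as np

def mecanum_inverse_kinematics(vx, vy, omega, a, b):
    """Wheel linear velocities V1..V4 from the body twist (vx, vy, omega).

    a: half the length of the car, b: half the width of the car.
    Signs follow one common 45-degree roller convention; the actual
    pattern depends on the roller mounting of the platform."""
    J = np.array([
        [1.0, -1.0, -(a + b)],
        [1.0,  1.0,  (a + b)],
        [1.0,  1.0, -(a + b)],
        [1.0, -1.0,  (a + b)],
    ])
    return J @ np.array([vx, vy, omega], dtype=float)

# Pure forward motion: all four wheels spin at the same speed.
V = mecanum_inverse_kinematics(1.0, 0.0, 0.0, a=0.2, b=0.15)
```

For a pure rotation command, opposite wheel pairs get equal and opposite speeds, which matches the force-couple argument of Section 3.2.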

The velocity of the robot in the navigation coordinate system ∑o_n is ${\mathbf{V}}^{(\mathrm{n})}=\mathit{R}({\psi}_{\mathrm{r}}){\mathbf{V}}^{(\mathrm{r})}$, where **V**^(n) collects the horizontal velocity component, vertical velocity component, and angular velocity of the robot in ∑o_n, and $\mathit{R}({\psi}_{\mathrm{r}})$ is the transformation matrix from ∑o_r to ∑o_n.
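The frame transformation above is a planar rotation by the heading ψ_r; a minimal sketch (function name assumed):

```python
import numpy as np

def body_to_nav(v_r, psi_r):
    """Rotate the planar body velocity [vx, vy, omega] from the vehicle
    frame to the navigation frame by the heading psi_r. The angular
    velocity is unchanged by a planar rotation."""
    c, s = np.cos(psi_r), np.sin(psi_r)
    R = np.array([[c,  -s,  0.0],
                  [s,   c,  0.0],
                  [0.0, 0.0, 1.0]])
    return R @ np.asarray(v_r, dtype=float)
```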

where k_p is the proportional coefficient, k_i is the integral coefficient, k_d is the differential coefficient, ${e}_{v}={V}_{i}-{V}_{i}^{(real)}(i=1,2,3,4)$, and ${V}_{i}^{(real)}$ is the real velocity of the wheels, obtained from wheel encoder feedback.
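A discrete PID loop on the wheel-velocity error e_v, as described above, can be sketched like this. The class and its interface are illustrative assumptions, not the paper's implementation:

```python
class PID:
    """Discrete PID on the wheel-velocity error e_v = V_i - V_i_real,
    with gains k_p, k_i, k_d as in the text. dt is the control period."""

    def __init__(self, kp, ki, kd, dt):
        self.kp, self.ki, self.kd, self.dt = kp, ki, kd, dt
        self.integral = 0.0
        self.prev_err = 0.0

    def update(self, v_target, v_real):
        err = v_target - v_real
        self.integral += err * self.dt           # accumulate integral term
        deriv = (err - self.prev_err) / self.dt  # backward-difference derivative
        self.prev_err = err
        return self.kp * err + self.ki * self.integral + self.kd * deriv
```

One controller instance per wheel would track the four target velocities from the inverse kinematics.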

## 4. BIM Laser Lofting Instrument

- Resection: the instrument is set up arbitrarily and measures two or more known points to establish a coordinate system;
- Reference Axis (Base Point and Reference Axis) Measurement: the instrument is set up arbitrarily and measures the base point (0, 0) and a point on the reference axis (x axis or y axis) to establish a coordinate system;
- Backsight Point (Known Point) Measurement: the instrument is positioned on a known point and measures another known point to establish a coordinate system;
- Backsight Point (Reference Axis on the Base Point) Measurement: the instrument is positioned on the base point (0, 0) and measures a point on the reference axis (x axis or y axis) to establish a coordinate system.
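The backsight methods above reduce to computing an orientation correction for the instrument's horizontal circle. The helper below is a hypothetical sketch of the "Backsight Point (Known Point)" case, not the LN-100's internal procedure:

```python
import math

def backsight_orientation(instr, known, observed_bearing):
    """Orientation correction when the instrument stands on a known
    point: the grid azimuth to the backsight minus the instrument's raw
    horizontal reading. Adding the correction to later raw readings
    yields grid azimuths. Purely illustrative of the backsight method."""
    dx = known[0] - instr[0]
    dy = known[1] - instr[1]
    azimuth = math.atan2(dy, dx)       # grid azimuth to the backsight
    return azimuth - observed_bearing
```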

## 5. Multi-Sensors Fusion Algorithm

#### 5.1. Sensors Angle Output

#### 5.1.1. Accelerometer

Assume the accelerometer T_a is obliquely placed in space, its rotation process is $\mathit{R}(\cdot)=\mathit{X}(\cdot)\mathit{Y}(\cdot)\mathit{Z}(\cdot)$, and all rotations are in the positive direction of the attitude angles. It finally outputs the projection component ${\mathit{A}}_{\mathrm{c}}={\left[\begin{array}{ccc}{A}_{x}^{(\mathrm{c})}& {A}_{y}^{(\mathrm{c})}& {A}_{z}^{(\mathrm{c})}\end{array}\right]}^{\mathrm{T}}$ of the gravitational acceleration g on the three axes x_c, y_c, z_c of the carrier coordinate system ∑o_c.

Here g ≈ 9.8 m/s², $\mathit{R}(\cdot)$ is the rotation matrix, ${\mathit{A}}_{\mathrm{c}}$ is the component vector of the gravitational acceleration on the three axes of ∑o_c, and φ, θ, and ψ are the Roll, Pitch, and Yaw angles of the accelerometer T_a in ∑o_c.
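Inverting the gravity projection gives roll and pitch; a minimal sketch assuming a Z-down carrier frame and the X(φ)Y(θ)Z(ψ) rotation order stated above (yaw is unobservable from gravity alone):

```python
import math

def tilt_from_accel(ax, ay, az):
    """Roll (phi) and pitch (theta) from the gravity projection
    [ax, ay, az] on the carrier axes. Standard formulation for a
    Z-down frame; exact signs depend on the axis convention."""
    phi = math.atan2(ay, az)
    theta = math.atan2(-ax, math.hypot(ay, az))
    return phi, theta
```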

#### 5.1.2. Magnetometer

Assume the magnetometer T_m is obliquely placed on the ground. Then, T_m outputs the magnetic field intensity components ${\mathit{M}}_{\mathrm{c}}={\left[\begin{array}{ccc}{M}_{x}^{(\mathrm{c})}& {M}_{y}^{(\mathrm{c})}& {M}_{z}^{(\mathrm{c})}\end{array}\right]}^{\mathrm{T}}$ of the earth’s magnetic field **H** on the three axes x_c, y_c, z_c of the carrier coordinate system ∑o_c. Finally, the Yaw angle ψ can be calculated as follows.

Assume T_m is placed obliquely in space and its rotation matrix is $\mathit{R}(\cdot)=\mathit{X}(\cdot)\mathit{Y}(\cdot)\mathit{Z}(\cdot)$, with all rotations in the positive direction of the attitude angles. Then the rotation matrix can be calculated as follows, where **H** is the earth’s magnetic field expressed in the earth’s magnetic field coordinate system, H_x = 0 because the geomagnetic direction is from the geomagnetic south pole to the geomagnetic north pole, and ${\mathit{M}}_{\mathrm{c}}$ is the component of the earth’s magnetic field on the three axes of the carrier coordinate system ∑o_c.
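With roll and pitch known (e.g., from the accelerometer), the yaw follows from tilt compensation of the magnetometer reading. This is a standard tilt-compensation sketch, not the paper's exact derivation; signs depend on the chosen axes and rotation order:

```python
import math

def yaw_from_mag(mx, my, mz, phi, theta):
    """Tilt-compensated yaw from magnetometer components [mx, my, mz]
    in the carrier frame, given roll phi and pitch theta. The measured
    field is de-rotated into the horizontal plane before taking the
    arctangent; sign conventions are assumptions."""
    mxh = mx * math.cos(theta) + mz * math.sin(theta)
    myh = (mx * math.sin(phi) * math.sin(theta)
           + my * math.cos(phi)
           - mz * math.sin(phi) * math.cos(theta))
    return math.atan2(-myh, mxh)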

#### 5.1.3. Gyroscope

#### 5.2. Data Fusion Algorithm

#### Extended Kalman Filter

where ${\mathit{h}}_{1}^{\mathrm{a}}$ is the observation equation of the accelerometer, ${\mathit{h}}_{2}^{\mathrm{m}}$ is the observation equation of the magnetometer, and ${\mathit{v}}_{k}$ is the observation noise, which follows a Gaussian distribution with zero mean, namely ${\mathit{v}}_{k}\sim N(0,R)$.
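A single EKF measurement-update step, as used with the accelerometer and magnetometer observation equations above, can be sketched generically. The function is illustrative; `h` stands for either observation equation and `H` for its Jacobian:

```python
import numpy as np

def ekf_update(x, P, z, h, H, R):
    """One EKF measurement update: state x, covariance P, measurement z,
    observation function h(x), its Jacobian H, and observation-noise
    covariance R (v_k ~ N(0, R) as in the text)."""
    y = z - h(x)                      # innovation
    S = H @ P @ H.T + R               # innovation covariance
    K = P @ H.T @ np.linalg.inv(S)    # Kalman gain
    x_new = x + K @ y
    P_new = (np.eye(len(x)) - K @ H) @ P
    return x_new, P_new
```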

## 6. Motion Control

## 7. Experiment and Results

#### 7.1. Introduction of Power Distribution

#### 7.2. Experimental Results

## 8. Discussion

## 9. Conclusions

## Author Contributions

## Funding

## Acknowledgments

## Conflicts of Interest

## References


**Figure 5.** The force model of the robot: (**a**) force model of clockwise rotation; (**b**) simplified force model.

**Figure 7.** Building Information Modeling (BIM) laser lofting instrument: (**a**) the Topcon LN-100; (**b**) Sokkia 360° prism matched with the Topcon LN-100.

**Figure 9.** The methods of instrument layout: (**a**) Resection; (**b**) Reference Axis (Base Point and Reference Axis) Measurement; (**c**) Backsight Point (Known Point) Measurement; (**d**) Backsight Point (Reference Axis on the Base Point) Measurement.

| Experiments | Starting Position/m | Expected Position/m | Actual Position/m | Error/mm |
|---|---|---|---|---|
| 1 | (0, 0) | (10, 0) | 9.990 | 10 |
| 2 | (0, 0) | (10, 0) | 9.996 | 4 |
| 3 | (0, 0) | (10, 0) | 9.991 | 9 |
| 4 | (0, 0) | (10, 0) | 9.993 | 7 |
| 5 | (0, 0) | (10, 0) | 9.993 | 7 |
| 6 | (0, 0) | (10, 0) | 9.992 | 8 |
| 7 | (0, 0) | (10, 0) | 9.993 | 7 |
| 8 | (0, 0) | (10, 0) | 9.997 | 3 |
| 9 | (0, 0) | (10, 0) | 9.990 | 10 |
| 10 | (0, 0) | (10, 0) | 9.992 | 8 |

| LN-100 | Parameters |
|---|---|
| Measuring accuracy | ±3 mm (distance)/5″ (angle) |
| Laser tracking range | 0.9–100 m |
| Working range | 360° (horizontal), ±25° (vertical) |
| Leveling range | ±3° |
| Wireless LAN | 802.11 n/b/g |
| Communication range | 100 m |

| Points | Points Position/m | Actual Position/m | Error/mm |
|---|---|---|---|
| Instrument location | (0, 0) | | |
| Start point | (1, 0) | | |
| 1 | (0.6, 0.6) | (0.603, 0.593) | 7.6 |
| 2 | (1, 2) | (0.992, 1.993) | 10.6 |
| 3 | (3, 4) | (3.000, 3.991) | 9.0 |
| 4 | (7, 8) | (6.994, 7.993) | 9.2 |
| 5 | (8, 8.5) | (8.004, 8.495) | 6.4 |
| 6 | (15, 10) | (14.993, 10.004) | 8.0 |
| 7 | (11, 12) | (10.994, 11.998) | 6.3 |
| 8 | (14, 14) | (14.005, 13.996) | 6.4 |
| 9 | (17, 16) | (16.995, 16.007) | 8.6 |
| 10 | (22, 20) | (21.994, 19.998) | 6.3 |

| | Full Floor Length | Allowable Deviation |
|---|---|---|
| Single door window | <30 m | 15 mm |
| | ≥30 m | 20 mm |
| Adjacent doors and windows | | 10 mm |
| Fire hydrant | | 20 mm |
| Plumbing equipment | | 15 mm |
| Central position of floral decoration | | 10 mm |

© 2020 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (http://creativecommons.org/licenses/by/4.0/).

## Share and Cite

Zhang, Z.; Cheng, X.; Yang, B.; Yang, D. Exploration of Indoor Barrier-Free Plane Intelligent Lofting System Combining BIM and Multi-Sensors. *Remote Sens.* **2020**, *12*, 3306. https://doi.org/10.3390/rs12203306