Vision-Based Cable Displacement Measurement Using Side View Video

Recent tragedies around the world have shown how accidents involving cable-stayed bridges can wreak havoc on society. To ensure the safety of cable-stayed bridges, several studies have estimated the cable tension force using the vibration of the cables. Most of these methods start with measuring the displacement of the cable. Recent developments in commercial cameras provide an opportunity for a more convenient and efficient way to measure cable displacement. However, traditional vision-based displacement measurement methods rely on the assumption that the cable moves parallel to the camera plane. This assumption limits the possible installation locations of the camera when measuring the displacement of a cable. Therefore, this study introduces a new vision-based cable displacement measurement system that can measure the displacement of a cable from various locations, even when the camera is installed to the side of the cable. The proposed method consists of three phases: (1) camera projection matrix estimation, (2) cable tracking in the image coordinate, and (3) cable displacement estimation in the world coordinate. To validate the performance of the proposed method, a simulation-based validation test, a lab-scale validation test, and an on-site validation test were conducted. The simulation-based validation test verified the performance of the proposed method under an ideal condition, and the lab-scale validation test demonstrated its performance in a physical environment. Finally, the on-site validation test showed that the proposed method can measure the cable displacement with a side view camera.


Introduction
Recent tragedies around the world have shown how accidents involving cable-stayed bridges can wreak havoc on society. In 2015, a lightning accident at the Seohae Grand Bridge in Pyeongtaek, Gyeonggi-do, South Korea caused damage and breakage of cables, resulting in casualties [1]. In addition, at the Cheonsadaegyo Bridge in Mokpo, Jeollanam-do, South Korea, which was completed in 2019, citizens have felt anxious due to the vibration of the bridge cables [2]. Furthermore, in Minneapolis, Minnesota, USA, in 2012, cable-related damage of varying severity occurred, including a cable break and the resulting closure of a pedestrian bridge [3].
Various studies have been conducted to monitor the condition of cable-stayed bridges by estimating the cable tension force. Kim et al. estimated cable tension forces based on the natural frequency of cables [4], and Yin et al. analyzed the cable tension force using

System Development
The displacement estimated by conventional vision-based displacement measurement methods may contain large errors depending on the camera location. When the cable displacement is parallel to the camera plane, as shown in Figure 1a, the displacement of the structure can be accurately estimated by applying a conventional tracking algorithm. However, when using a vision-based displacement measurement method in the actual field, it is often difficult to install the camera perpendicular to the cable displacement. For example, when installing a camera on the opposite side of the bridge, the line of sight might be blocked by an obstacle such as a vehicle. The line of sight of the camera can be secured when the camera is installed next to the cable, as shown in Figure 1b. However, in this case, the displacement estimated using the side view camera will contain projection error due to the different field of view. Therefore, this study introduces a new method to estimate the displacement of the cable using a side view camera, even when the displacement of the cable-stayed bridge does not coincide with the camera plane.
The proposed method for measuring the cable displacement using a side view camera consists of three phases, as shown in Figure 2. First, a camera calibration process is conducted in phase 1 to remove radial distortion and to obtain the camera projection matrix, which contains the camera intrinsic parameters and the pose of the camera. Next, in phase 2, the feature points on the cable are tracked in the image coordinate.
Finally, in phase 3, the displacement of the cable in the world coordinate can be calculated by combining the information from phase 1 (i.e., camera projection matrix) and phase 2 (i.e., tracking result in 2D image coordinate).

Figure 2. Overview of cable displacement measurement using side view video.

Phase 1. Camera Projection Matrix Estimation
The first step for measuring the cable displacement using a side view camera is calibrating the camera. Camera calibration serves two purposes in this study. The first is to remove radial distortion in the images. Most commercial cameras use wide-angle lenses, which have large radial distortion; the calibration process can remove this distortion and minimize the resulting error. The second is to obtain the camera projection matrix, which contains intrinsic parameters such as the focal length and extrinsic parameters such as the camera pose. A camera projection matrix projects 3D world coordinate points into the 2D image coordinate of the corresponding camera. In this study, the camera projection matrix is used to restore the displacement of the cable from the video recorded from the side.
The configuration for the camera calibration process proposed in this study is shown in Figure 3. The camera projection matrix can be estimated using multiple images of a checkerboard taken at various locations and orientations. This study adopted the camera calibration method introduced by Zhang [23]. As a result of the camera calibration, the camera intrinsic matrix and the camera extrinsic matrix can be obtained. By combining the camera intrinsic matrix K and the camera extrinsic matrix [R|t], the camera projection matrix P = K[R|t] can be obtained, as shown in Equation (1).
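The composition in Equation (1) can be sketched in a few lines of NumPy. The matrices below are illustrative values only, not the paper's calibration results:

```python
import numpy as np

def projection_matrix(K, R, t):
    """Compose the 3x4 camera projection matrix P = K [R | t]."""
    Rt = np.hstack([R, t.reshape(3, 1)])  # 3x4 extrinsic matrix [R | t]
    return K @ Rt

def project(P, X_world):
    """Project a 3D world point into pixel coordinates."""
    Xh = np.append(X_world, 1.0)          # homogeneous coordinates
    x = P @ Xh
    return x[:2] / x[2]                   # divide out the scale factor s

# Toy example: camera 2 m in front of the world origin, no rotation
K = np.array([[1000.0, 0.0, 960.0],
              [0.0, 1000.0, 540.0],
              [0.0, 0.0, 1.0]])
R = np.eye(3)
t = np.array([0.0, 0.0, 2.0])
P = projection_matrix(K, R, t)
uv = project(P, np.array([0.0, 0.0, 0.0]))  # world origin -> principal point
```

With this camera, the world origin projects onto the principal point (960, 540), which is a quick sanity check for any estimated projection matrix.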


Phase 2. Cable Tracking in the Image Coordinate
Phase 2 tracks the cable in the 2D image coordinate using the video taken from the side view, as shown in Figure 4. The proposed method adopted the structural displacement measurement method proposed by Yoon et al. (2016), which uses the optical-flow-based KLT tracker. The first step for tracking is to select a region of interest (ROI). The ROI should include as many points on the cable as possible while avoiding points outside of the cable, as shown in Figure 5. If the ROI is selected too large, feature points not related to the cable may be tracked; if it is selected too small, the number of feature points available for tracking may be insufficient. Therefore, it is important to select the ROI appropriately so that the features can be tracked in the subsequent steps. The next step is to extract feature points in the ROI. This study adopted Harris corner points [24] as feature points. Harris corner detection defines a specific window (w) in an image and, while moving the window, detects a part with a large difference in intensity within the window as a corner point. Since the KLT tracker calculates the intensity difference between the previous frame and the next frame, corner points are generally used as feature points.

When a window in the image is moved by (∆x, ∆y), the difference in intensity within the window is calculated as a sum of squares, as in Equation (2). When the movement of the window (∆x, ∆y) is very small, Equation (2) can be approximated as in Equation (3).
Corner points can be selected as the points that have large eigenvalues of M (λ1, λ2) obtained from Equation (3). Since the eigenvalue analysis requires a large amount of computation, Equation (4) was used instead for detecting corner points.
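Consistent with the description above, Equations (2)–(4) can be written in the standard Harris form (a reconstruction; the paper's exact notation may differ slightly):

```latex
E(\Delta x,\Delta y)=\sum_{(x,y)\in w} w(x,y)\,\bigl[I(x+\Delta x,\,y+\Delta y)-I(x,y)\bigr]^{2} \tag{2}

E(\Delta x,\Delta y)\approx
\begin{bmatrix}\Delta x & \Delta y\end{bmatrix} M
\begin{bmatrix}\Delta x \\ \Delta y\end{bmatrix},
\qquad
M=\sum_{(x,y)\in w} w(x,y)
\begin{bmatrix} I_x^2 & I_x I_y \\ I_x I_y & I_y^2 \end{bmatrix} \tag{3}

R=\det M-k\,(\operatorname{tr} M)^{2}=\lambda_1\lambda_2-k(\lambda_1+\lambda_2)^{2} \tag{4}
```

Here k is an empirical constant (typically 0.04–0.06), and points with a large positive response R are kept as corner points.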
Once the feature points are extracted, the next step is to track the feature points using an optical flow. If the intensity of the current frame is J(x) and the intensity of the previous frame is I(x), it can be expressed as in Equation (5).
In Equation (5), d is the displacement vector between consecutive frames, and g = (∂I/∂x, ∂I/∂y)ᵀ is the gradient vector obtained from the Taylor series, assuming that d is very small. The residual of the window around each feature point is defined as in Equation (6).
In Equation (6), w is a weighting function and h = I(x) − J(x) is the intensity difference between successive frames. Setting the derivative of the residual with respect to d equal to zero yields Equation (7).
As a result, the displacement vector d can be expressed as Equation (8).
In Equation (8), G = ∫∫ g gᵀ w dA and e = ∫∫ (I − J) g w dA. By applying this process to every frame, the displacement vector d for each feature point can be obtained.
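Using the symbols defined above, the derivation in Equations (5)–(8) can be written out as follows (a reconstruction of the standard KLT formulation):

```latex
J(\mathbf{x}) = I(\mathbf{x}-\mathbf{d}) \tag{5}

\epsilon = \iint_{W} \bigl[h(\mathbf{x})-\mathbf{g}^{\mathsf T}\mathbf{d}\bigr]^{2}\, w(\mathbf{x})\, dA,
\qquad h = I(\mathbf{x})-J(\mathbf{x}) \tag{6}

\frac{\partial \epsilon}{\partial \mathbf{d}}
= -2\iint_{W} \bigl[h-\mathbf{g}^{\mathsf T}\mathbf{d}\bigr]\,\mathbf{g}\, w\, dA = 0 \tag{7}

\mathbf{d} = G^{-1}\mathbf{e},
\qquad G=\iint_{W}\mathbf{g}\mathbf{g}^{\mathsf T}\, w\, dA,
\quad \mathbf{e}=\iint_{W}\bigl(I-J\bigr)\,\mathbf{g}\, w\, dA \tag{8}
```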
Finally, the feature points of the cable in the image coordinates can be obtained by removing the outliers. The movement of the feature points from the current frame to the next frame can be represented by a transformation matrix T. In this study, the MLESAC (Maximum Likelihood Estimation SAmple Consensus) algorithm proposed by Torr and Zisserman [25] was used to estimate the transformation matrix T while removing the outliers.
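The outlier-removal step can be illustrated with a simplified robust estimator. The sketch below uses plain RANSAC-style inlier counting on a pure translation model; the paper's MLESAC instead scores each hypothesis by its likelihood and estimates a full transformation matrix T, so this is a stand-in to show the idea, not the method itself:

```python
import numpy as np

def robust_translation(prev_pts, next_pts, n_iter=200, thresh=2.0, seed=0):
    """Estimate a translation between matched point sets while rejecting
    outliers by repeated hypothesize-and-verify (RANSAC-style)."""
    rng = np.random.default_rng(seed)
    moves = next_pts - prev_pts                 # per-point motion vectors
    best_inliers = np.zeros(len(moves), dtype=bool)
    for _ in range(n_iter):
        d = moves[rng.integers(len(moves))]     # hypothesis from one sample
        inliers = np.linalg.norm(moves - d, axis=1) < thresh
        if inliers.sum() > best_inliers.sum():
            best_inliers = inliers
    # refine the translation using only the consensus set
    return moves[best_inliers].mean(axis=0), best_inliers

# 20 points moving by (3, -1) px, plus two grossly corrupted matches
prev = np.random.default_rng(1).uniform(0, 100, (22, 2))
nxt = prev + np.array([3.0, -1.0])
nxt[0] += 40.0
nxt[1] -= 25.0
d_hat, inl = robust_translation(prev, nxt)      # recovers (3, -1), 20 inliers
```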

Phase 3. Cable Displacement Estimation in 3D World Coordinate
Phase 3 estimates the displacement of the cable in the world coordinate using the camera projection matrix obtained from Phase 1 and the feature points of the cable in the image coordinate obtained from Phase 2.
The relationship between a point [X Y Z] in the 3D world coordinate and its projected point [x y] in the image is shown in Equation (9) below.
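Written out, this projection relation takes the standard pinhole form, where C is the 3 × 4 camera projection matrix from Phase 1:

```latex
s\begin{bmatrix} x \\ y \\ 1 \end{bmatrix}
=
\begin{bmatrix}
C_{11} & C_{12} & C_{13} & C_{14}\\
C_{21} & C_{22} & C_{23} & C_{24}\\
C_{31} & C_{32} & C_{33} & C_{34}
\end{bmatrix}
\begin{bmatrix} X \\ Y \\ Z \\ 1 \end{bmatrix} \tag{9}
```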
where s is an arbitrary scale factor and C_ij are the elements of the camera projection matrix. The camera projection matrix is obtained from Phase 1, and the feature points in the 2D image coordinates are obtained from Phase 2. The unknown parameters in Equation (9) are the scale factor and the point in the world coordinate. There are four unknowns (X, Y, Z, s) but only three equations; therefore, the point in the world coordinate (X, Y, Z) cannot be directly solved from this equation.
Therefore, in this study, it was assumed that the out-of-plane displacement of the cable is negligible. By defining the out-of-plane direction to be the Z-axis, Z can be assumed to be a constant value, and the equation can be rewritten as below.
To simplify Equation (10), the plane of the cable can be defined as the plane where Z = 0. In this case, Equation (10) reduces to Equation (11), and the feature points in the world coordinate can be estimated from the feature points in the image coordinate.
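Setting Z = 0 reduces the 3 × 4 projection to an invertible 3 × 3 plane-to-image homography, which is the essence of Equation (11). A minimal NumPy sketch, using an illustrative synthetic side-view camera rather than the paper's calibration:

```python
import numpy as np

def world_from_image(P, uv):
    """Recover (X, Y) on the Z = 0 plane from a pixel (x, y).
    Dropping the Z column of P leaves a 3x3 homography H = [p1 p2 p4]
    mapping the cable plane to the image, which can be inverted."""
    H = P[:, [0, 1, 3]]                  # columns for X, Y, and translation
    Xh = np.linalg.solve(H, np.array([uv[0], uv[1], 1.0]))
    return Xh[:2] / Xh[2]                # divide out the scale factor s

# Synthetic camera viewing the cable plane at a 40-degree side angle
K = np.array([[1200.0, 0.0, 640.0],
              [0.0, 1200.0, 360.0],
              [0.0, 0.0, 1.0]])
theta = np.deg2rad(40.0)
R = np.array([[np.cos(theta), 0.0, np.sin(theta)],
              [0.0, 1.0, 0.0],
              [-np.sin(theta), 0.0, np.cos(theta)]])
t = np.array([0.1, -0.05, 3.0])
P = K @ np.hstack([R, t.reshape(3, 1)])

X_true = np.array([0.25, -0.4, 0.0])     # a point on the cable plane (Z = 0)
uvh = P @ np.append(X_true, 1.0)
uv = uvh[:2] / uvh[2]                    # its projection in the image
X_rec = world_from_image(P, uv)          # recovers (0.25, -0.4)
```

Because the mapping is exact for points on the assumed plane, any remaining error in practice comes from the Z = 0 assumption, the calibration, and the tracking, not from this inversion step.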

Validation Test
Overall, three validation tests were conducted to verify the performance of the proposed method. First, a simulation-based validation test was conducted to assess the accuracy of the proposed method under an idealized condition. Next, a lab-scale validation test was conducted to validate the performance of the proposed method in a physical environment. Finally, an on-site validation test was conducted at a cable-stayed bridge to assess the applicability of the proposed method to a real bridge. To analyze the performance of the proposed method, the displacements measured by the proposed method were compared with the displacements measured by the traditional KLT tracker (without compensating for the side view effect). While the displacement measured by the proposed method is in a physical unit (i.e., mm), the displacement measured by the traditional KLT tracker is in pixels. Therefore, to convert the pixel displacement into a physical unit, the scale factor for the traditional KLT tracker was obtained manually by measuring the length of a known object.
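The manual scale-factor conversion used for the KLT baseline can be sketched as follows (the numbers are hypothetical; note that a single scale factor is only valid when the motion is parallel to the camera plane, which is exactly the assumption the proposed method removes):

```python
import numpy as np

def pixel_to_mm(disp_px, known_length_mm, known_length_px):
    """Convert a pixel displacement to millimetres using a manual scale
    factor measured from an object of known physical length."""
    scale = known_length_mm / known_length_px   # mm per pixel
    return np.asarray(disp_px) * scale

# Hypothetical example: a 100 mm checkerboard edge spans 250 px in the image
disp_mm = pixel_to_mm([5.0, -2.5], known_length_mm=100.0, known_length_px=250.0)
```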

Simulation-Based Validation Test
A simulation-based validation test was conducted to calculate the accuracy of the proposed method under an idealized condition. A simulation of cable vibration was generated using MATLAB and Simulink. The simulation was then visualized and converted into a 1920 × 962, 60 fps video, as if the video had been taken by a side view camera from a distance of 2.6 m. Three targets were attached to the cable, and a 100 mm × 60 mm checkerboard was located next to the cable, as shown in Figure 6, defining the out-of-plane direction of the cable as the Z-axis.
The pose (rotation and translation) of the camera estimated using the camera calibration is shown in Figure 7. Since the simulation test was performed in an ideal environment, the reprojection error was only 0.02081, and the intrinsic matrix was obtained accordingly. Once the camera projection matrix was obtained, the displacement in the side view video was corrected using the proposed method. Figure 8 illustrates the displacement results of the simulation validation test. To evaluate the performance of the proposed method, the corrected displacement obtained with the proposed method (red line) was compared with the reference displacement (black line) and the displacement without the proposed method (blue line). As the figure indicates, the proposed method was able to estimate the displacement even with the side view, while the displacement without the proposed method showed a significant difference from the reference displacement.
The RMSE (root mean square error) of the displacement is shown in Table 1. To quantitatively analyze how much the angle affects the performance of the proposed method, the simulation-based validation test was repeated while changing the angle between the plane of the camera and the plane of the cable displacement from 30° to 90°. The camera was assumed to be installed 3 m away from the cable, as shown in Figure 9.
The displacements estimated using the proposed method and the traditional KLT tracker (without compensating for the side angle effect) are compared with the reference displacement in Figure 10, and the corresponding RMSE values are listed in Table 2. As shown in the figure, the displacement estimated using the proposed method and the displacement using the traditional KLT tracker were almost identical to the reference displacement when the camera was installed at the ideal location (no side angle). When the angle between the plane of the camera and the plane of the cable displacement was 30°, the RMSE for the traditional KLT tracker was 1.5135, while the RMSE for the proposed method was 0.4356. As the angle increased, the RMSE for the traditional KLT tracker increased significantly, while that of the proposed method increased only slightly. When the angle reached 80°, neither the proposed method nor the traditional KLT tracker was able to estimate the cable displacement, because the feature points on the cable were no longer visible.
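The RMSE metric used throughout these comparisons is simply:

```python
import numpy as np

def rmse(estimate, reference):
    """Root mean square error between an estimated and a reference
    displacement time history (same units, same sampling)."""
    e = np.asarray(estimate, dtype=float) - np.asarray(reference, dtype=float)
    return float(np.sqrt(np.mean(e ** 2)))

# Toy check: a constant 0.5 mm offset gives an RMSE of 0.5 mm
r = rmse([1.5, 2.5, 3.5], [1.0, 2.0, 3.0])
```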
From the simulation-based validation test, it was shown that the proposed method can reduce the projection error due to the side view angle, especially when the angle between the camera plane and the displacement plane is around 75°. However, the test also revealed a limitation of the proposed method: it cannot compensate for the side angle effect if the angle is equal to or larger than 80°.

Lab-Scale Validation Test
A lab-scale validation test was conducted to validate the performance of the proposed method in a physical environment, as shown in Figure 11. To obtain the reference displacement accurately, the test was conducted by tracking a point on a tracking pad. Points on the tracking pad were assumed to move only in the Y-axis direction. The video was recorded by a side view camera with 4032 × 3024 resolution at 30 fps, installed about 70 cm away from the checkerboard.

Figure 11. Configuration of the lab-scale validation test.
As a first step, the pose of the camera was estimated by taking a total of 19 photos from different positions, as shown in Figure 12. The reprojection error for the camera calibration was 0.0679 mm, slightly larger than that of the simulation test, and the intrinsic matrix was obtained accordingly. Once the distortion of the images was removed and the camera projection matrix was obtained, the feature points in the image coordinate were measured. In this lab-scale validation test, the points in the 2D image coordinates were obtained using the checkerboard point detection method proposed by Geiger et al. [26]. It was assumed that each point moved one space (20 mm) along the Y-axis per time step. Finally, the points in the image coordinate were converted into the world coordinate using the proposed method. Figure 13 shows the displacement results without (blue) and with (red) the proposed method. The errors for both methods were lower than those of the simulation-based validation test, since the camera was installed at a closer distance and the tracking error was negligible. The RMSE of the displacement without the proposed method was 9.0292 mm, and the RMSE with the proposed method was 0.9318 mm. From this result, it can be concluded that the proposed method can significantly reduce the displacement error by compensating for the projection error of the side view video.

On-Site Validation Test
An on-site validation test was conducted at the Cheonsa Bridge, South Korea, to assess the applicability of the proposed method. The configuration of the on-site validation test is shown in Figure 14a. A target was attached to the cable to be measured, and a checkerboard was installed next to the cable. The axes of the global coordinate system X, Y, Z were defined according to the checkerboard. Two cameras with 3840 × 2160 resolution at 30 fps were installed at the site: one with a front view (Figure 14b) and another with a side view (Figure 14c). The front view camera was installed approximately 15 m away from the checkerboard, and the side view camera was installed 3 m away from the checkerboard.

To estimate the pose of the side view camera, a total of 237 images were taken from various locations with different angles, as shown in Figure 15. The reprojection error for the camera calibration was 0.3563 pixels, which was higher than in the simulation and lab-scale tests, and the intrinsic matrix was estimated accordingly. Next, the ROI was selected from the initial frame of the side view video, and the feature points in the image coordinate were tracked for each video frame.
Then, the feature points in the image coordinate were transformed into the world coordinate using the camera projection matrix. Finally, the displacement of the cable in the world coordinate was calculated.
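The paper does not spell out this restoration step. One common way to realize it, assuming the cable motion stays in the checkerboard plane (Z = 0) so that the 3×4 projection matrix reduces to an invertible 3×3 homography, is sketched below in NumPy; the camera matrices and the point are illustrative, not values from the test:

```python
import numpy as np

def image_to_world_planar(P, uv):
    """Map pixel coordinates to world coordinates on the Z = 0 plane.

    P  : 3x4 camera projection matrix K [R | t].
    uv : (N, 2) array of tracked pixel coordinates.
    For points with Z = 0, columns 0, 1, 3 of P form a homography H
    from (X, Y, 1) to (u, v, 1); inverting H recovers (X, Y).
    """
    H = P[:, [0, 1, 3]]               # 3x3 plane-to-image homography
    uv1 = np.hstack([uv, np.ones((len(uv), 1))])
    xyw = uv1 @ np.linalg.inv(H).T    # back-project each pixel
    return xyw[:, :2] / xyw[:, 2:3]   # dehomogenize -> (X, Y)

# Synthetic round trip (assumed intrinsics/pose, not the calibrated ones):
K = np.array([[3102.0, 0.0, 1920.0],
              [0.0, 3102.0, 1080.0],
              [0.0, 0.0, 1.0]])
Rt = np.hstack([np.eye(3), np.array([[0.5], [0.2], [3.0]])])
P = K @ Rt
uvw = P @ np.array([0.10, 0.05, 0.0, 1.0])   # project a point on the plane
uv = (uvw[:2] / uvw[2]).reshape(1, 2)
print(image_to_world_planar(P, uv))          # recovers (0.10, 0.05)
```

Taking the frame-to-frame difference of the recovered (X, Y) positions then yields the cable displacement in world units.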
In this on-site validation test, the displacement measured by the front view camera was used as the reference displacement. To obtain the reference displacement, the following procedure was conducted. A video was recorded by the front view camera as shown in Figure 14b. Feature points were tracked in the 2D image without applying the proposed method. Next, the scale factor was calculated using a known length in the image (i.e., the checkerboard). Finally, the displacement was calculated. Although the front view camera was used as the reference, it is not a perfect measurement, since it contains a tracking error and a projection error. Figure 16 shows the displacement of the cable without (blue) and with (red) the proposed method, together with the front view camera result (black). The RMSE of the proposed method was 1.6803 mm, while the RMSE of the conventional method was 6.4672 mm; the proposed method thus estimated the cable displacement more accurately by 4.7869 mm. Since the reference displacement obtained from the front view camera contains error, the reported RMSE values are not exact; the actual RMSE might be lower than what was achieved. However, it was shown clearly that the proposed method was able to reduce the error significantly when measuring displacement from a side view video.
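The reference-displacement procedure reduces to multiplying tracked pixel motion by a scale factor obtained from a known physical length, and the comparison metric is an RMSE. A small sketch with assumed example values (none taken from the test):

```python
import numpy as np

# Scale factor from a known physical length visible in the image
# (e.g., one checkerboard square). All numbers below are assumed examples.
known_length_mm = 100.0                      # physical reference length
known_length_px = 40.0                       # same length measured in pixels
scale = known_length_mm / known_length_px    # mm per pixel

# Convert tracked pixel displacement to physical displacement
pixel_disp = np.array([0.0, 1.2, -0.8, 2.0])     # tracked motion [px]
disp_mm = pixel_disp * scale

# RMSE against a reference displacement, as reported in the test
ref_mm = np.array([0.0, 3.1, -1.9, 4.8])
rmse = np.sqrt(np.mean((disp_mm - ref_mm) ** 2))
print(rmse)                                      # ~0.1225 mm
```

Note that this simple scale-factor conversion is exactly the "conventional" approach that degrades under a side view: a single mm-per-pixel factor is only valid when the motion is parallel to the image plane.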

Conclusions
This paper presented a new method for estimating the displacement of a cable from a side view camera. The proposed method comprises three phases. Phase 1 estimates the camera projection matrix, which contains the intrinsic matrix and the information related to the pose of the camera. Phase 2 tracks the feature points of the cable in the image coordinate. Finally, in Phase 3, the cable displacement in the world coordinate is restored by combining the results of Phases 1 and 2.
Through the simulation-based validation test, it was possible to evaluate the accuracy of the proposed method under ideal conditions. In addition, by conducting the lab-scale validation test, it was possible to confirm the performance of the proposed method in a physical environment. Finally, an on-site validation test was conducted at Cheonsa Bridge, South Korea, to examine the applicability of the proposed method to a real cable-stayed bridge. Compared to the results of the simulation-based and lab-scale validation tests, the error was larger in the on-site validation test. This was due to various environmental conditions, but also to the inaccuracy of the reference value. However, even taking that into account, it was possible to show that the proposed method can significantly reduce the error compared to the conventional method. Therefore, it is expected that the proposed method will contribute toward vision-based cable displacement measurement by broadening the possible camera installation locations.