Article

A Wireless Visualized Sensing System with Prosthesis Pose Reconstruction for Total Knee Arthroplasty

Institute of Microelectronics, Tsinghua University, Beijing 100084, China
*
Author to whom correspondence should be addressed.
Sensors 2019, 19(13), 2909; https://doi.org/10.3390/s19132909
Submission received: 16 May 2019 / Revised: 21 June 2019 / Accepted: 28 June 2019 / Published: 1 July 2019
(This article belongs to the Special Issue Multi-Modal Sensors for Human Behavior Monitoring)

Abstract
The surgery quality of total knee arthroplasty (TKA) depends on how accurately the knee prosthesis is implanted. The knee prosthesis is composed of the femoral component, the plastic spacer and the tibia component. The instant and kinetic relative pose of the knee prosthesis is one key aspect of the surgery quality evaluation. In this work, a wireless visualized sensing system with instant and kinetic prosthesis pose reconstruction has been proposed and implemented. The system consists of a multimodal sensing device, a wireless data receiver and a data processing workstation. The sensing device has the identical shape and size as the spacer. During the surgery, the sensing device temporarily replaces the spacer and captures the images and the contact force distribution inside the knee joint prosthesis. It is connected to the external data receiver wirelessly through a 416 MHz data link, and the data is then sent to the workstation for processing. The signal processing method to reconstruct the instant and kinetic prosthesis pose from the image data has been investigated. Experiments on the prototype system show that the absolute reconstruction errors of the flexion-extension rotation angle (the pitch rotation of the femoral component around the horizontal long axis of the spacer), the internal–external rotation angle (the yaw rotation of the femoral component around the spacer vertical axis) and the mediolateral translation displacement between the centers of the femoral component and the spacer based on the image data are less than 1.73°, 1.08° and 1.55 mm, respectively. The system provides a force balance measurement with an error of less than ±5 N. The experiments also show that the kinetic pose reconstruction can detect surgical defects that cannot be detected by the force measurement or the instant pose reconstruction.

1. Introduction

An increasing number of people suffer from arthritis or chronic joint diseases. In the United States, for example, the number of people aged 65 and older with these symptoms will exceed 41 million by 2030 [1]. Among them, knee arthritis contributes considerably to morbidity and results in a low quality of life. The primary surgical treatment for these patients is the total knee arthroplasty (TKA), namely, the replacement of the worn knee joint with a prosthesis implant [2,3,4,5,6]. As shown in Figure 1, the TKA implant is composed of three components: the femoral component, the tibia component and the polyethylene plastic spacer.
A successful TKA surgery improves the knee function and relieves the knee joint pain. However, improper placement of the knee prosthesis may accelerate the attrition of the plastic spacer, which shortens its working life and causes severe pain to patients [2]. The success of a TKA surgery depends on several factors, including the appropriate alignment of the components, the rotational congruency between the prosthetic proximal tibia and the prosthetic femoral condyles, and the ligamentous balance of the knee joint [3]. Recent research [4] shows that the overall TKA failure rate is about 2.3%, and more than 74% of TKA failures are caused by non-infection reasons such as instability, aseptic loosening and stiffness. In fact, most TKA failures are directly or indirectly caused by operative mechanical factors such as malalignment [5]. Traditionally, the quality of the TKA surgery is determined exclusively by the experience of the surgeons. Digitally guided assistance equipment would therefore be valuable for improving the surgery quality and reducing the failure rate caused by operative factors.
Many efforts have been made to instrument the knee joint prosthesis for surgery quality evaluation or long-term monitoring. Some investigations focused on measuring the contact force inside the joint prosthesis under different conditions [7,8,9,10,11,12,13,14,15]. In [7,8,9], a tibia tray was designed to measure the six load components in TKA. In-vivo experiments under walking and stair-climbing conditions were conducted during a postoperative follow-up of 6 to 10 months, in which the contact forces of the prosthesis components were measured for five subjects. Similarly, the contact forces were measured in [10,11,12,13] for three subjects during exercise and recreational activities after the TKA surgery, and the in-vivo experiments validated the effectiveness of the method. An implantable tibia prosthesis with multiaxial force sensing was implemented and reported in [13]. In [14], a sensing system inside the knee implant for force measurement was designed, fabricated and tested; it could measure contact forces up to 1.5 times the body weight and was suitable for long-term monitoring. In [15,16,17], an instrumented smart knee prosthesis for the in-vivo measurement of contact force and kinematics was proposed. With three magnetic sensors and a permanent magnet, this system could monitor the knee prosthesis after implantation and prevent possible damage to the prosthesis by detecting load imbalance or abnormal forces and kinematics. In [18], a floating-gate sensor array was introduced for long-term, battery-less fatigue monitoring of knee implants. All the above-mentioned designs successfully measured the knee prosthesis contact force, and some could also measure the kinetic movement of the knee prosthesis. Nevertheless, they were designed to monitor the knee prosthesis after the TKA surgery. It is equally meaningful to develop techniques that improve the implantation quality during the TKA surgery itself. The motivation of this work is to provide a device that helps the surgeons improve the TKA surgery quality during the surgery; such a device is used only during the TKA surgery, for less than one hour of measurement.
Some efforts have been made to provide auxiliary methods that help the surgeons improve the quality of the surgical procedure. In [19], the "VERASENSE Knee System" is presented, in which an array of sensors provides dynamic, intraoperative feedback on the tibiofemoral position and the quantitative pressure at peak contact points in the medial and lateral compartments during the TKA surgery; kinematic tracking can also be assessed. In [20], a wireless knee joint force measurement system is proposed to increase the accuracy of the ligament balancing procedure. In [21], a force amplitude- and location-sensing device is designed to improve the ligament balancing procedure in TKA. However, the above-mentioned systems acquire only single-modal sensor data, which may miss important information. The sensing can be made more comprehensive with additional sensor data such as image data.
A wireless visualized sensing system has been proposed and implemented in this work for multimodal signal sensing inside the knee joint prosthesis. The system is used during the surgery as a trial component for adjustment and calibration of the implant, without modifying the standard knee prosthesis or changing the standard clinical procedures. The proposed system consists of a small-sized sensing device, a wireless data receiver and a data processing workstation. The system can acquire both the direct images and the contact force distribution inside the prosthesis. The real-time images help the surgeons directly see and understand the real situation in the joint. More importantly, the image sensors have higher resolution than piezoelectric, magneto-resistive and other types of physical sensors, so the proposed system can acquire much richer information about the knee joint prosthesis through vision sensing.
The proposed sensing device is used during the surgery while the patient lies flat. Once the sensing device is placed in the knee joint, the surgeons move the shin manually, and the device records the inside images and the contact forces. Subsequent signal processing reconstructs the instant pose and the kinematic trajectory of the relative movement between the femoral component and the spacer. The surgeons can use the obtained information to calibrate the position of the components in the joint prosthesis. During the surgery, the sensing device is temporarily inserted into the knee implant as a trial component and is replaced by the real spacer component once the femoral component and tibia component are fitted to the proper position. The influence on the TKA procedure is quite small, and the TKA surgeons can easily use it in the standard clinical procedures. In general, the proposed sensing device cannot be used to monitor the progression of the prosthesis over time after the surgery, partly due to the limited battery lifetime given the high power consumption of image data acquisition and processing.
The rest of the paper is organized as follows. In Section 2, the proposed system architecture and the design considerations are presented. Section 3 describes the hardware design details of the sensing device as well as the wireless data receiver. In Section 4, the detailed data processing procedures and algorithms will be introduced. Section 5 gives the experimental results. This work is concluded in Section 6.

2. Design Considerations and System Design

The proposed wireless sensing measurement system consists of three parts, namely, the multimodal sensing device, the wireless data receiver, and the workstation, as shown in Figure 2.
The small-sized multimodal signal sensing device has the identical shape and size as the real spacer of the knee joint prosthesis. During the surgery, before the final placement of the real spacer, the sensing device is placed in its position inside the joint prosthesis as a trial of the spacer. The sensing device can take two types of data: (1) the images of the femoral component viewed from the spacer side; (2) the contact force distribution between the spacer and the femoral component. To facilitate the image data processing, some simple but easily recognizable markers are printed on the surface of the femoral component.
The wireless data receiver receives the sensing data acquired by the sensing device through a 416 MHz wireless link. The data receiver is connected to the workstation through a USB cable. The signal processing is carried out in the workstation, which recognizes the real-time relative position between the spacer and the femoral component from the multimodal sensing data. The algorithm implemented in the workstation can also reconstruct the 3D kinematic trajectory of the femoral component with respect to the spacer. The surgeons can use the reconstructed information to evaluate whether the knee joint prosthesis is properly installed, and make adjustments if necessary.
Generally speaking, the force data acquired through this system is only 2D data, and it is difficult to obtain the 3D information from the force data only. The image data is also 2D data, but the high-resolution images also contain the depth information. With the proposed kinetic pose estimation algorithm in this work, the presented system can output a 3D motion trajectory which is not available from other systems that acquire the force data only.
There are some abnormal TKA situations that cannot be identified using the force data only, and there are also some abnormal situations that cannot be detected using the image data only. Figure 3 gives some examples to show the advantage of multimodal sensing data over single-modal data. As shown in Figure 3a, if the knee joint is appropriately installed, the spacer and the femoral component are well aligned with a certain relative pose in both the static and kinetic conditions, and the ligament around the two sides of the knee joint is well balanced. The sensing device presented in [21] could be used to determine if the ligament is well balanced by comparing the contact forces of the two sides, since it could only measure the contact force distribution between the spacer and the femoral component. Such a device could detect the inappropriate condition shown in Figure 3b, in which the spacer is tilted, and the contact force distribution is not in equilibrium when the patient stands straight. However, the sensing device in [21] could not report the inappropriate situation shown in Figure 3c. In this situation, the contact forces on the two sides of the knee joint are equal, and the device in [21] would therefore falsely judge it as a good situation, although the spacer and the femoral component are misaligned. Nevertheless, the malalignment can be detected by processing the direct images taken inside the joint using the multimodal sensing device presented in this work. In general, the surgery flaws shown in both Figure 3b,c can be detected using the presented multimodal sensing system.
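To make the complementarity of the two modalities concrete, the sketch below shows one way such a combined check could be coded. It is purely illustrative: the function name and all thresholds are hypothetical, not taken from the paper.

```python
# Hypothetical decision sketch combining force balance (Figure 3b) with
# image-based pose (Figure 3c). All names and thresholds are illustrative.

def evaluate_prosthesis(medial_force_n, lateral_force_n, yaw_deg, translation_mm,
                        force_tol_n=5.0, yaw_tol_deg=2.0, trans_tol_mm=2.0):
    """Return the list of detected issues from the multimodal sensing data."""
    issues = []
    # A tilted spacer (Figure 3b) shows up as a medial/lateral force imbalance.
    if abs(medial_force_n - lateral_force_n) > force_tol_n:
        issues.append("force imbalance: possible spacer tilt")
    # Misalignment with balanced forces (Figure 3c) is only visible in the
    # image-based pose reconstruction.
    if abs(yaw_deg) > yaw_tol_deg or abs(translation_mm) > trans_tol_mm:
        issues.append("malalignment: detected from image-based pose only")
    return issues

# Equal contact forces but a 6-degree yaw: a force-only device would miss
# this case, while the pose check flags it.
print(evaluate_prosthesis(20.0, 20.0, yaw_deg=6.0, translation_mm=0.5))
```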

3. Hardware Implementation

This section describes the hardware implementation of the presented sensing system.

3.1. Multimodal Sensing Device

The functional diagram of the image and force sensing device is shown in Figure 4. It consists of five key functional blocks, i.e., the image sensor, the force sensor array together with an analog-to-digital converter (ADC) for signal conversion, the wireless data transmitter, the sensor interface chip that bridges the sensors and the transmitter, and the microcontroller (MCU) for system initialization. There are four white LEDs distributed evenly around the image sensor lens. These LEDs provide the necessary lighting for the image sensing, since the sensing device works in the dark or dim environment inside the knee joint. The illuminance can be tuned by adjusting the driving current of the LEDs to avoid overexposure or underexposure.
The MSP430 MCU by Texas Instruments Inc. provides the initial settings for all the other chips in the sensing device during the power-up phase. Since the wireless data transmitter and the sensors operate autonomously, the MCU is programmed to power off once the device enters full operation, to save power.
The OV7670 image sensor used in this device is a CMOS sensor manufactured by Omnivision with a maximum resolution of 640 × 480. In this application, it can be programmed to output images with 480 × 480 or 240 × 240 resolution. A view angle of 140° is achieved by using a specially designed wide-angle macro lens. Since the image sensor is very close to the femoral component, this wide view angle macro lens can guarantee that most of the surface of the femoral component can be “seen” by the image sensor.
The sensing device contains six low profile force sensors manufactured by Honeywell based on silicon-implanted piezo resistors. Each sensor has a measurement range of 0–45 N. A 6-channel 24-bit ΣΔ ADC by ADI is used to quantize the force sensors’ output.
The wireless data transmitter chip is a highly integrated system-on-a-chip (SoC), an improved design of that reported in [22]. The ultra-low power SoC was originally designed to transmit image sensor data. It is mainly composed of a minimum shift keying (MSK) transmitter working in the 400 MHz band and an image data compressor. In this application, the transmitter is configured to work at 416 MHz, and it provides a raw data rate of 3 Mbps. The image data compressor provides a near-lossless data compression ratio of ~3, which helps boost the transmission frame rate. The transmitter SoC can be configured to transmit 480 × 480 images at a frame rate of ~3 fps, or 240 × 240 images at 6–8 fps. The SoC contains one charge-pump boost regulator and three programmable low-dropout (LDO) linear regulators, which provide 4 V, 2.5 V, 1.8 V and 1.2 V power supplies to all the other circuits in the SoC and the other function blocks in the sensing device. The 4 V supply drives the four white LEDs, which serve as the flash lights for image sensing. A coil antenna with −8.9 dBi peak gain and a voltage standing wave ratio (VSWR) of less than 2.0 at the 416 MHz center frequency is used, with proper matching between the antenna and the RF power amplifier. The RF transmitter has an energy efficiency of 1.3 nJ/bit, not including the power amplifier.
Since the wireless transmitter SoC was originally designed to transmit image data, it can only receive data from the image sensor through an 8-bit parallel data port. The dedicated sensor interface chip packs the sensing data from the force sensor array and the image sensor into the format of 8-bit parallel image data readable by the SoC. Specifically, the first four data rows of each 480 × 480 or 240 × 240 image are replaced by the force sensor data, with careful consideration of data synchronization. From the viewpoint of the transmitter SoC, it simply receives and transmits the "image" data from the interface chip. Each image frame therefore loses its first four rows, which is affordable considering that each image contains at least 240 rows.
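The packing scheme can be illustrated with a short sketch. The byte layout below (big-endian 24-bit samples packed from the start of row 0) is an assumption for illustration only; the interface chip's actual format is not documented here.

```python
import numpy as np

def pack_frame(image: np.ndarray, forces: np.ndarray) -> np.ndarray:
    """Overwrite the first 4 rows of an 8-bit frame with six 24-bit force words."""
    frame = image.copy()
    payload = b"".join(int(f).to_bytes(3, "big") for f in forces)   # 18 bytes
    rows = np.zeros(4 * frame.shape[1], dtype=np.uint8)
    rows[:len(payload)] = np.frombuffer(payload, dtype=np.uint8)
    frame[:4, :] = rows.reshape(4, frame.shape[1])
    return frame

def unpack_frame(frame: np.ndarray):
    """Recover the six force words and the image minus its first 4 rows."""
    raw = frame[:4, :].tobytes()[:18]
    forces = [int.from_bytes(raw[i:i + 3], "big") for i in range(0, 18, 3)]
    return np.array(forces), frame[4:, :]

img = np.zeros((240, 240), dtype=np.uint8)
f, rest = unpack_frame(pack_frame(img, np.array([1, 2, 3, 4, 5, 6])))
assert list(f) == [1, 2, 3, 4, 5, 6] and rest.shape == (236, 240)
```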
The sensing device is supplied by a 3 V lithium manganese battery with a capacity of 170 mAh. There is a potential risk in using this type of battery since it is not designed for medical applications; however, this battery is only used for the experiments before the clinical trial. In the future, the lithium manganese battery can be replaced by a safer battery, such as a lithium/iodine cell, for the real medical product. The battery sees a peak current of 30 mA, which occurs when the image sensor is enabled, with the image sensor drawing 15 mA from a regulated 2.5 V supply and the flash lights consuming ~14 mA. The transmitter SoC has a peak current of ~6 mA. Note that the image sensor and the transmitter are not enabled simultaneously, to avoid an excessive peak current. The force sensors and the ADC have a peak current of less than 1 mA. Overall, the sensing device consumes a peak current of ~30 mA.
Since all the circuits in the sensing device are powered on only when necessary, the sensing device draws an average current of ~10 mA from the 3 V supply. Roughly 40% of the total average current is contributed by the transmitter SoC, and almost all of the remaining 60% by the image sensor together with its flash lights. The force sensors and the following ADC are enabled at a very low duty cycle, and their contribution to the average current is less than 1%.
As shown in Figure 5a, the force sensors and the image sensor reside on the same printed circuit board (PCB), which has the same outline as the spacer. The PCB is sealed inside a transparent shell composed of an upper shell and a lower shell. The central region of the upper shell is transparent and polished to ensure the image quality. The entire sensing device is shown in Figure 5b. It has the identical shape and size as the real spacer used in the knee joint prosthesis. Since there are many versions of spacers with various sizes to meet the requirements of different patients, the sensing device in a real product should likewise be offered in as many versions to match the spacers. The shell of the sensing device is made of medical-grade polycarbonate. Since the device is not permanently implanted, wear is not a concern. Note that the sensing device is used when the patient lies flat; the force applied to the shell of the device is estimated to be less than 20 kilograms. The mechanical architecture of the sensing device is designed to tolerate loads up to 50 kilograms, and experiments have been performed to validate the mechanical reliability of the sensing device under a 50 kg load.
The performance of the sensing device is summarized in Table 1.

3.2. Wireless Data Receiver

The data receiver in the proposed wireless visualized measurement system receives the multimodal data from the sensing device. The block diagram of the receiver is shown in Figure 6. The key parts of the data receiver are the RF receiver and the FPGA-based MSK demodulator. The digital demodulator receives the digitized intermediate frequency (IF) signals from a pair of 8-bit 24 Msps ADCs and performs the MSK demodulation. Note that in the sensing device, the image sensor output and the force sensors' output are packed together; correspondingly, the image data and the force data are separated from the received data in the data receiver. The data receiver is connected to the workstation through a USB cable for further data processing.
The PCB and package of the data receiver are shown in Figure 7. The data receiver has a size of 124 × 86 × 22 mm³. It contains a 4000 mAh Li-ion battery, and the battery life is about 10 h.

4. Multimodal Sensor Data Processing

The multimodal sensing data is eventually sent to the workstation for further processing to acquire the relative pose of the components in the knee joint prosthesis.
The force sensors’ data can be used to check the ligament balance of the knee prosthesis, and the details of the force data processing can be found in the paper previously published by our group [21]. As shown in Figure 8, the image data processing is composed of three steps, i.e., the image data pre-processing, the instant pose reconstruction from the individual image, and the kinetic pose reconstruction based on multiple consecutive images.

4.1. Image Pre-Processing

In the image pre-processing, the images are denoised, the contrast is enhanced, and the lens distortion is corrected. Note that the contrast enhancement is necessary since all the images are taken under low illumination inside the knee prosthesis. The distortion introduced by the wide-angle macro lens must also be corrected before further processing. The steps of the pre-processing are shown in Figure 8.
The Block-Matching and 3D filtering (BM3D) algorithm [23] is used for the denoising. In this algorithm, the similar blocks in the femoral component surface images are used to eliminate the noise, and the image quality is improved in terms of both the peak signal-to-noise ratio (PSNR) and the subjective visual quality.
For the contrast enhancement, a modified Multi-Scale Retinex (MSR) algorithm [24] is used to enhance the edge features of the images. For each input image data matrix S(x,y), the output of the MSR processing is the reflectance matrix R(x,y) given by:
$$R(x,y) = \frac{S(x,y)}{F(x,y) \ast S(x,y)} \quad (1)$$
in which F(x,y) is a Gaussian surround function and $\ast$ denotes the 2-D convolution. F(x,y) is given by:
$$F(x,y) = \exp\!\left(-\frac{x^2+y^2}{c^2}\right) \quad (2)$$
where c is the scale factor. To simplify the calculation, the logarithmic matrix r(x,y) is used, and r(x,y) is calculated as:
$$r(x,y) = \ln R(x,y) = \ln S(x,y) - \ln\big(F(x,y) \ast S(x,y)\big) \quad (3)$$
Following the method in [25], three Gaussian filters $F_k(x,y) = \exp\!\left(-(x^2+y^2)/c_k^2\right)$, $k = 1, 2, 3$, are used to generate three outputs $r_k(x,y)$, $k = 1, 2, 3$, and the weighted summation of the $r_k(x,y)$ gives the final MSR output:
$$R_{MSR}(x,y) = \exp\!\left(\sum_{k=1}^{3} W_k\, r_k(x,y)\right) \quad (4)$$
In this application, based on the experiments on the image data taken in the emulated environment, the MSR processing is only applied to the G channel of the original RGB image data, which yields the best subjective quality. Also based on the experimental data, the following set of parameters are used for the MSR processing, i.e., c1 = 15, c2 = 80, c3 = 250, and W1 = W2 = W3 = 1/3.
The R, G and B channels after the MSR processing are normalized separately to balance the relative intensities of the three color channels.
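A compact sketch of this contrast-enhancement step is given below, assuming OpenCV's GaussianBlur as the convolution in Equation (1) and treating the scale factors c_k directly as the Gaussian sigmas; these mappings, like the function name, are illustrative assumptions rather than the authors' exact implementation.

```python
import cv2
import numpy as np

def msr_enhance(bgr: np.ndarray) -> np.ndarray:
    """Multi-scale retinex (Eqs. (1)-(4)) on the G channel, then per-channel
    normalization, following the parameters quoted in the text."""
    out = bgr.astype(np.float64)
    g = out[:, :, 1] + 1.0                                 # avoid log(0)
    r_sum = np.zeros_like(g)
    for c in (15, 80, 250):                                # scales c1, c2, c3
        blurred = cv2.GaussianBlur(g, (0, 0), sigmaX=c)    # F_k(x,y) * S(x,y)
        r_sum += (np.log(g) - np.log(blurred)) / 3.0       # W_k = 1/3, Eq. (3)
    out[:, :, 1] = np.exp(r_sum)                           # Eq. (4)
    for ch in range(3):                                    # balance R, G, B
        out[:, :, ch] = cv2.normalize(out[:, :, ch], None, 0, 255,
                                      cv2.NORM_MINMAX)
    return out.astype(np.uint8)
```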
The last step in the pre-processing is to correct the lens distortion. The camera calibration method from OpenCV [26] is used. A test board with a chessboard pattern is utilized to find the camera matrix, distortion coefficients, and the rotation and translation vectors, which are then used for the distortion correction.
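For completeness, the standard OpenCV calibration flow referenced in [26] looks roughly as follows; the 9 × 6 inner-corner pattern and the file paths are placeholders.

```python
import glob
import cv2
import numpy as np

pattern = (9, 6)                                   # inner corners, assumed
objp = np.zeros((pattern[0] * pattern[1], 3), np.float32)
objp[:, :2] = np.mgrid[0:pattern[0], 0:pattern[1]].T.reshape(-1, 2)

obj_pts, img_pts = [], []
for path in glob.glob("calib/*.png"):              # placeholder image set
    gray = cv2.imread(path, cv2.IMREAD_GRAYSCALE)
    found, corners = cv2.findChessboardCorners(gray, pattern)
    if found:
        obj_pts.append(objp)
        img_pts.append(corners)

# Camera matrix and distortion coefficients for the wide-angle macro lens.
_, mtx, dist, _, _ = cv2.calibrateCamera(obj_pts, img_pts,
                                         gray.shape[::-1], None, None)
undistorted = cv2.undistort(cv2.imread("frame.png"), mtx, dist)
```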

4.2. Establishment of the Pose Reconstruction Problem

To facilitate the image data processing, five pairs of control points are marked on the femoral component by printing special markers on its surface. The control points are numbered 1L/1R, 2L/2R, 3L/3R, 4L/4R and 5L/5R. As shown in the side view of the femoral component in Figure 9, the five pairs of control points are spaced approximately evenly on the femoral component surface with an angle difference of 30°. Any two pairs of control points form a rectangle. The femoral component is usually made of reflective material, which makes it difficult to take clear images. However, the features of most concern are the five pairs of control points. A non-reflective material with high contrast should be chosen to print the control points, so that they can be easily distinguished even if the entire image is affected by surface reflection.
Only three types of markers are used, namely, triangle, round and square markers. The 1L/1R markers are triangles, the 2L/2R and 3L/3R markers are round, and the 4L/4R and 5L/5R markers are square. Since the movement of the femoral component has limited freedom, the control point numbers can be easily determined by recognizing the shapes of the markers. For example, if the surface image of the femoral component contains two round markers and two square markers, the control points are recognized as 3L/3R and 4L/4R.
The image data processing of this system thus reduces to finding the relative position between the spacer (sensing device) and the femoral component from the surface images of the femoral component "seen" by the spacer (sensing device), with the aid of the control points.
Geometrically, the spacer and the femoral component are represented by two coordinate systems, namely, the spacer coordinate system {OCXCYCZC} and the femoral coordinate system {OFXFYFZF}, as shown in Figure 10. For the spacer coordinate system, the origin OC is the center of the spacer bottom plane, the X axis OCXC is the long axis of the spacer bottom plane, the Y axis OCYC is perpendicular to OCXC within the spacer bottom plane, and the Z axis OCZC is perpendicular to the spacer bottom plane. Ideally, when the femur is aligned with the tibia, the femoral coordinate system {OFXFYFZF} is fully parallel to {OCXCYCZC}, and the two origins OF and OC are separated only vertically. The spacer coordinate system is used as the camera coordinate system in this implementation. The relative position of the spacer and the femoral component can be described by the rotational angles [ϕ, θ, ψ] and the 3-dimensional translation tCF = [tx, ty, tz]T between the two origins. ϕ is the roll angle about the OCYC axis. θ is the flexion-extension rotation angle, namely, the pitch angle about the OCXC axis. ψ is the internal–external rotation angle, namely, the yaw angle about the OCZC axis between the two coordinate systems. tx is the mediolateral translation, namely, the horizontal distance between the centers of the femoral component and the spacer along the OCXC axis. For a successful TKA surgery, ϕ, ψ and tx are all expected to remain zero as the tibia moves relative to the femur.
In the image data processing, it is more convenient to use the rotation matrix R instead of the rotational angles [ϕ, θ, ψ]. [ϕ, θ, ψ] can be easily calculated from R based on the definition of R as follows:
$$R_{CF} = \begin{bmatrix} \cos\phi\cos\psi & -\cos\phi\sin\psi & \sin\phi \\ \cos\theta\sin\psi + \cos\psi\sin\phi\sin\theta & \cos\psi\cos\theta - \sin\phi\sin\psi\sin\theta & -\sin\theta\cos\phi \\ \sin\psi\sin\theta - \cos\psi\cos\theta\sin\phi & \cos\theta\sin\phi\sin\psi + \cos\psi\sin\theta & \cos\theta\cos\phi \end{bmatrix} \quad (5)$$
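Given this matrix layout, the three angles can be read back from R_CF; the short helper below is a sketch assuming Equation (5) (it degenerates near ϕ = ±90°, which is outside the motion range of interest).

```python
import numpy as np

def angles_from_R(R: np.ndarray):
    """Recover [phi, theta, psi] in degrees from R_CF as laid out in Eq. (5)."""
    phi = np.arcsin(R[0, 2])                   # roll about O_C Y_C
    theta = np.arctan2(-R[1, 2], R[2, 2])      # flexion-extension (pitch)
    psi = np.arctan2(-R[0, 1], R[0, 0])        # internal-external (yaw)
    return np.degrees([phi, theta, psi])
```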
The relative pose reconstruction problem is converted to the problem to find RCF and tCF between two coordinate systems {OCXCYCZC} and {OFXFYFZF}.
It is not easy for the sensing device to directly recognize the coordinate system {OFXFYFZF}. However, with the wide view angle of the image sensor, any femoral component image taken by the sensing device contains at least two pairs of control points, which can define an auxiliary control point coordinate system {OAXAYAZA}. There is a series of control point coordinate systems {OAXAYAZA} defined by the control points, and the rotation matrix RAF and the translational distance tAF between any {OAXAYAZA} and {OFXFYFZF} are exactly known. On the other hand, RCA and tCA between {OCXCYCZC} and {OAXAYAZA} can be calculated using the image data processing described next.
Assume that the coordinates of a given point in these three coordinate systems are CC = [xC, yC, zC]T, CA = [xA, yA, zA]T, and CF = [xF, yF, zF]T, respectively. The coordinate transformation gives the following equations:
$$C_F = R_{CF}\,(C_C - t_{CF}) \quad (6)$$
$$C_F = R_{AF}\,(C_A - t_{AF}) \quad (7)$$
$$C_A = R_{CA}\,(C_C - t_{CA}) \quad (8)$$
Substituting (8) into (7) gives:
$$C_F = R_{AF}\big(R_{CA}(C_C - t_{CA}) - t_{AF}\big) = R_{AF}R_{CA}\big(C_C - (t_{CA} + R_{CA}^{-1}t_{AF})\big) \quad (9)$$
Comparing (6) and (9), it follows that:
$$\left[\, R_{CF} \;\; t_{CF} \,\right] = \left[\, R_{AF}R_{CA} \;\;\; R_{CA}^{-1}t_{AF} + t_{CA} \,\right] \quad (10)$$
The problem to solve RCF and tCF is finally converted to the coplanar perspective-4-points problem [27] to find [RCA, tCA], with [RAF, tAF] as the known parameters.
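In code, the composition of Equation (10) is a one-liner per term; the sketch below is illustrative (using the transpose as the rotation inverse), and the function name is hypothetical.

```python
import numpy as np

def compose_pose(R_AF, t_AF, R_CA, t_CA):
    """Eq. (10): fuse the known marker-to-femur transform with the estimated
    camera-to-marker transform."""
    R_CF = R_AF @ R_CA
    t_CF = R_CA.T @ t_AF + t_CA     # R_CA^-1 == R_CA.T for a rotation
    return R_CF, t_CF
```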

4.3. Instant Pose Reconstruction

After the pre-processing, the control points in the femoral component images are recognized. The images are first filtered by an adaptive threshold filter to generate binary images. The contours of the control point markers are then extracted using the method in [28]. The numbers of the control points are determined from the markers' shapes. For each individual image, the four adjacent control points that form the biggest rectangle in the image are chosen to establish the control point coordinate system {OAXAYAZA}.
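A minimal OpenCV sketch of this recognition chain is shown below; the threshold block size, area bounds and polygon-approximation tolerance are hypothetical tuning values, not the paper's.

```python
import cv2

def find_control_points(gray):
    """Adaptive threshold -> contours [28] -> shape-based marker labeling."""
    binary = cv2.adaptiveThreshold(gray, 255, cv2.ADAPTIVE_THRESH_MEAN_C,
                                   cv2.THRESH_BINARY_INV, 31, 5)
    contours, _ = cv2.findContours(binary, cv2.RETR_EXTERNAL,
                                   cv2.CHAIN_APPROX_SIMPLE)
    points = []
    for c in contours:
        if not 50 < cv2.contourArea(c) < 5000:      # reject noise blobs
            continue
        approx = cv2.approxPolyDP(c, 0.04 * cv2.arcLength(c, True), True)
        shape = {3: "triangle", 4: "square"}.get(len(approx), "round")
        m = cv2.moments(c)
        points.append((shape, (m["m10"] / m["m00"], m["m01"] / m["m00"])))
    return points   # the shape identifies the control point pair (1L/1R, ...)
```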
An analytic, non-iterative method is then proposed to solve the coplanar perspective-4-point problem, i.e., to find [RCA, tCA] as described in the previous subsection. There are many classical methods to solve this perspective problem, including the non-iterative method EPnP [29], the iterative linear solutions using unbiased statistics [30,31], simple methods based on the P3P problem [32,33], and many linear analytic solutions [34,35,36,37,38]. Compared to these classical methods, the proposed method has the advantage of low computational complexity.
The geometry of the proposed method is explained in Figure 11. As shown in Figure 11, the four control points C1, C2, C3 and C4 are projected onto the image plane as the image points m1, m2, m3 and m4, and the correspondence between the control points and the image points is known. The point O is the perspective center of the camera. Solving the pose estimation problem is equivalent to solving for the depths OC1, OC2, OC3 and OC4. The vector $\vec{r}_i$ is the unit vector along $\overrightarrow{Om_i}$ ($i$ = 1, 2, 3, 4), and $\gamma_1$, $\gamma_2$, $\gamma_3$ are the angles between $\vec{r}_1$ and $\vec{r}_2$, $\vec{r}_3$, $\vec{r}_4$, respectively. H1, H2 and H3 are the points on the lines OC2, OC3 and OC4 such that $C_1H_1 \perp OC_2$, $C_1H_2 \perp OC_3$ and $C_1H_3 \perp OC_4$.
To simplify this problem, assume that OC1 has a length of x, and that H1C2, H2C3 and H3C4 equal k1x, k2x and k3x, respectively (with $C_1H_1 \perp OC_2$, the leg C1H1 then equals $x\sin\gamma_1$, and similarly for the other two perpendiculars). How to find the values of k1, k2, k3 and x is explained as follows. In the triangle C1C2C3:
$$|C_1C_2|^2 = D_1^2 = |C_1H_1|^2 + |H_1C_2|^2 = x^2\,(\sin^2\gamma_1 + k_1^2) \quad (11)$$
$$|C_1C_3|^2 = D_1^2 + D_2^2 = |C_1H_2|^2 + |H_2C_3|^2 = x^2\,(\sin^2\gamma_2 + k_2^2) \quad (12)$$
where D1 and D2 are the two side lengths of the rectangle C1C2C3C4. From the right triangle C1C2C3 and the definition of the dot product:
$$\cos\angle C_2C_1C_3 = \frac{D_1}{\sqrt{D_1^2+D_2^2}} = \frac{\overrightarrow{C_1C_2}\cdot\overrightarrow{C_1C_3}}{|\overrightarrow{C_1C_2}|\,|\overrightarrow{C_1C_3}|} \;\;\Rightarrow\;\; \overrightarrow{C_1C_2}\cdot\overrightarrow{C_1C_3} = \frac{D_1^2}{D_1^2+D_2^2}\,|\overrightarrow{C_1C_3}|^2 \quad (13)$$
According to the geometric relationships:
$$\overrightarrow{C_1C_2} = \overrightarrow{OC_2} - \overrightarrow{OC_1} = x\left[(\cos\gamma_1 + k_1)\,\vec{r}_2 - \vec{r}_1\right] \quad (14)$$
$$\overrightarrow{C_1C_3} = \overrightarrow{OC_3} - \overrightarrow{OC_1} = x\left[(\cos\gamma_2 + k_2)\,\vec{r}_3 - \vec{r}_1\right] \quad (15)$$
Substitute (14) and (15) into (13):
$$\frac{D_1^2}{D_1^2+D_2^2}\,(\sin^2\gamma_2 + k_2^2)\,x^2 = x^2\left[(\cos\gamma_1 + k_1)\,\vec{r}_2 - \vec{r}_1\right]\cdot\left[(\cos\gamma_2 + k_2)\,\vec{r}_3 - \vec{r}_1\right] \quad (16)$$
(16) can be expanded to (17):
$$a_1k_2^2 + b_1k_2 + c_1 = k_1\,(d_1k_2 + e_1) \quad (17)$$
Substituting (17) into the ratio of (11) to (12) eliminates k1 and x, yielding a quartic in k2:
$$f_1k_2^4 + g_1k_2^3 + h_1k_2^2 + i_1k_2 + j_1 = 0 \quad (18)$$
Similarly, in the triangle C1C4C3:
$$a_2k_2^2 + b_2k_2 + c_2 = k_3\,(d_2k_2 + e_2) \quad (19)$$
$$f_2k_2^4 + g_2k_2^3 + h_2k_2^2 + i_2k_2 + j_2 = 0 \quad (20)$$
Meanwhile, in the triangle C1C2C4, the rectangle sides C1C2 and C1C4 are perpendicular, so:
C 1 C 2 C 1 C 4 x 2 [ ( cos γ 1 + k 1 ) r 2 r 1 ] [ ( cos γ 3 + k 3 ) r 4 r 1 ] = 0
Combining (17), (19) and (21) leads to an equation without k1 or k3:
$$f_3k_2^4 + g_3k_2^3 + h_3k_2^2 + i_3k_2 + j_3 = 0 \quad (22)$$
In (17)–(22), the coefficients ai, bi, ci, di, ei, fi, gi, hi, ii, ji (i = 1, 2, 3) are used without derivation; they all have analytical expressions in terms of $\gamma_1$, $\gamma_2$ and $\gamma_3$, which are given in Appendix A. (18), (20) and (22) can be combined and written in matrix form:
$$\begin{pmatrix} f_1 & g_1 & h_1 & i_1 & j_1 \\ f_2 & g_2 & h_2 & i_2 & j_2 \\ f_3 & g_3 & h_3 & i_3 & j_3 \end{pmatrix} \begin{pmatrix} k_2^4 & k_2^3 & k_2^2 & k_2 & 1 \end{pmatrix}^T = \begin{pmatrix} 0 \\ 0 \\ 0 \end{pmatrix} \quad (23)$$
The unique solution of k2 can be found from these homogeneous linear equations, based on the right singular vector spanning the null space of the 3 × 5 coefficient matrix [20]. Consequently, k1, k3 and the length x can be calculated from (17), (19) and (11), respectively. Note that:
$$\begin{cases} \overrightarrow{OC_1} = x\,\vec{r}_1 \\ \overrightarrow{OC_2} = x\,(\cos\gamma_1 + k_1)\,\vec{r}_2 \\ \overrightarrow{OC_3} = x\,(\cos\gamma_2 + k_2)\,\vec{r}_3 \\ \overrightarrow{OC_4} = x\,(\cos\gamma_3 + k_3)\,\vec{r}_4 \end{cases} \quad (24)$$
Consequently, RCA and tCA can be calculated as:
$$t_{CA} = [t_x\;\; t_y\;\; t_z]^T = \tfrac{1}{4}\left(\overrightarrow{OC_1} + \overrightarrow{OC_2} + \overrightarrow{OC_3} + \overrightarrow{OC_4}\right) \quad (25)$$
$$R_{CA} = \left[\; \frac{\overrightarrow{C_1C_2}}{|\overrightarrow{C_1C_2}|} \;\;\; \frac{\overrightarrow{C_1C_4}}{|\overrightarrow{C_1C_4}|} \;\;\; \frac{\overrightarrow{C_1C_2}\times\overrightarrow{C_1C_4}}{|\overrightarrow{C_1C_2}\times\overrightarrow{C_1C_4}|} \;\right] \quad (26)$$
RCF and tCF can then be calculated using (10), and the rotational angles [ϕ, θ, ψ] can be calculated from RCF using (5).
Obviously, the proposed method can solve the problem without any iteration, and the computation complexity can be characterized as O(1). As a result, the instant pose reconstruction can be implemented in real time, while requiring limited computation overhead.
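The numerical core of this procedure is small. The sketch below is not the authors' code: it assumes the coefficient rows of Eqs. (18), (20) and (22) are already computed from Appendix A, and it normalizes the null-space vector by its last component before reading off k2, then assembles the pose from the recovered 3-D control points per Eqs. (24)–(26).

```python
import numpy as np

def k2_from_quartics(M: np.ndarray) -> float:
    """M: 3x5 matrix of rows (f_i, g_i, h_i, i_i, j_i) from Eqs. (18)/(20)/(22).
    The common root k2 is read off the null-space right-singular vector."""
    _, _, Vt = np.linalg.svd(M)
    v = Vt[-1] / Vt[-1][-1]            # ~ [k2^4, k2^3, k2^2, k2, 1], Eq. (23)
    return float(v[3])

def pose_from_points(OC1, OC2, OC3, OC4):
    """Given the recovered 3-D control points of Eq. (24), build [R_CA, t_CA]
    per Eqs. (25)-(26)."""
    t_CA = (OC1 + OC2 + OC3 + OC4) / 4.0
    c12, c14 = OC2 - OC1, OC4 - OC1
    x_axis = c12 / np.linalg.norm(c12)
    y_axis = c14 / np.linalg.norm(c14)
    z_axis = np.cross(c12, c14)
    z_axis /= np.linalg.norm(z_axis)
    R_CA = np.column_stack([x_axis, y_axis, z_axis])
    return R_CA, t_CA
```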

4.4. Kinetic Pose Reconstruction

During the TKA surgeries, the instant relative pose reconstructed by the image data and the ligament balance indicated by the force sensor data should be checked by slowly moving the tibia with respect to the femur. As shown in Figure 12, there are some typical angles between the tibia and the femur, such as 0°, 45°, 90° and 130° [39].
It is also of great significance to check the kinematic trajectory of the femoral component "seen" by the spacer when moving the tibia with respect to the femur. A successful surgery leads to a trajectory that is centrally symmetric and smooth. The symmetry of the kinetic trajectory indicates the balance of the prosthesis, and the smoothness indicates that the knee joint can move freely without interference between the prosthesis components.
In the instant pose reconstruction, the position of the femoral coordinate origin in the spacer coordinate system is described by tCF = [tx, ty, tz]T. The change of tCF when moving the tibia with respect to the femur can serve as the kinetic trajectory. However, this trajectory has a limited amplitude. To amplify it, another point P, the lowest central point of the femoral component, is defined, and the relative trajectory of P in the spacer coordinate system is traced instead. As shown in Figure 13, the point P has the coordinate [0, 0, −50]T (in mm) in the femoral coordinate system, so in the spacer coordinate system the coordinate of point P is given by:
$$C_{P,C} = R_{CF}^{-1}\,C_{P,F} + t_{CF} = R_{CF}^{-1}\,[0\;\; 0\;\; -50]^T + [t_x\;\; t_y\;\; t_z]^T \quad (27)$$
The kinetic pose reconstruction simply traces the trajectory of point P, with the coordinate CP,C, in the camera coordinate system.
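Tracing the trajectory is then a direct application of Equation (27) to every reconstructed frame, as sketched below (the pose list format is an assumption). A smooth, centrally symmetric path in the YOZ plane then indicates a balanced installation (cf. Section 5.4).

```python
import numpy as np

P_F = np.array([0.0, 0.0, -50.0])   # point P in the femoral frame, Eq. (27)

def trajectory(poses):
    """poses: iterable of (R_CF, t_CF) pairs from the instant reconstruction.
    Returns the path of P in the spacer/camera coordinate system."""
    return np.array([R.T @ P_F + t for R, t in poses])   # R^-1 == R.T
```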

5. Experimental Results

5.1. Prototype Sensing System

The prototype system implemented in this work is shown in Figure 14a. The sensing device can be programmed to take images with a resolution of 240 × 240 or 480 × 480. The frame rate of the image transmission reaches 8 fps when transmitting the compressed 240 × 240 images; the frame rate is limited by the data rate of the wireless transmitter. Considering that the knee joint moves quite slowly, this frame rate is acceptable. The refresh rate of the force data acquisition is 30 Hz.
In the experiment, the spacer (sensing device) is placed on a test platform, which emulates the movement of the femoral component around the spacer during the TKA surgery. As shown in Figure 14b, the test platform includes a base that fixes the spacer (sensing device) and a movable femoral component. The femoral component can rotate along a track to emulate the flexion-extension (pitch) rotation around the OCXC axis in the spacer coordinate system. The base holding the spacer can also rotate to emulate the internal–external (yaw) rotation of the femoral component around the OCZC axis. The mediolateral distance between the femoral component and the spacer can also be adjusted to emulate the translational displacement. The test platform provides rotation and translation tracks with scales, so the pitch and yaw angles and the translational displacement of the femoral component can be precisely controlled. The sensing device takes images of the femoral component and records the contact force between the sensing device and the femoral component during the rotation and translation.

5.2. Experiment Results of Instant Pose Reconstruction

The signal processing steps of the instant relative pose reconstruction are shown in Figure 15. Figure 15a gives one typical original image before the processing. Figure 15b–d give the image after the denoising and contrast enhancement, the distortion correction, and the control point recognition, respectively. Figure 15e gives the relative pose reconstruction result, including the pitch, yaw and roll angles, and the translational distance.
To evaluate the pose reconstruction accuracy, more than 400 images of the femoral component are taken by the sensing device, with the flexion-extension (pitch) angle varying from 0° to 90° in steps of 10°, the internal–external (yaw) angle varying from −15° to 15° in steps of 5°, and the mediolateral translation varying from 0 to 30 mm in steps of 5 mm. During the instant pose reconstruction experiment, the femoral component is placed on the test platform. The relative flexion-extension (pitch) angle, internal–external (yaw) angle and translational displacement of the femoral component with respect to the spacer are adjusted, and the motion angles/distances are read from the scales on the test platform. The measurement errors are obtained by comparing the reconstructed pose to the values read from the scales. The detailed pose reconstruction error distribution is shown in Figure 16. With the proposed system, the flexion-extension angle estimation error is 0.01° ± 1.06° (mean ± σ), and the internal–external estimation error is 0.03° ± 0.50° (mean ± σ). By contrast, in [17], the flexion-extension angle estimation error is 0.0° ± 0.9° (mean ± σ) and the internal–external estimation error is 0.2° ± 1.1° (mean ± σ); the presented vision system has a smaller estimation error. In addition, the most important performance metric for this system is the absolute maximum error. The absolute maximum flexion-extension (pitch) and internal–external (yaw) reconstruction errors are 1.73° and 1.08°, respectively, and the absolute maximum mediolateral translation reconstruction error is only 1.55 mm. The instant relative pose reconstruction accuracy is summarized in Table 2.

5.3. Balance Measurement with Force Sensing

The force sensing data can be used to determine whether the contact force between the spacer and the femoral component is balanced between the medial and lateral sides. In the experiment, imbalanced forces are applied to the two sides of the spacer (sensing device), and the force sensors in the sensing device measure the force difference (imbalance). Figure 17a shows the measured force imbalance versus the actual force imbalance, and Figure 17b shows the force imbalance measurement error versus the actual force imbalance. The maximum force imbalance measurement error is less than ±5 N, which is sufficiently small to judge the ligament balance of the knee joint prosthesis.
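For illustration, a balance check over the six sensors could look like the sketch below; the assignment of the first three sensors to the medial side is a hypothetical layout, not the device's documented one.

```python
import numpy as np

def force_imbalance(forces_n: np.ndarray) -> float:
    """forces_n: six sensor readings in newtons, medial side assumed first."""
    return float(forces_n[:3].sum() - forces_n[3:].sum())

print(force_imbalance(np.array([10.0, 12.0, 9.0, 11.0, 10.0, 10.5])))  # -0.5
```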

5.4. Kinetic Pose Reconstruction

In this kinetic pose reconstruction experiment, the femoral component is placed in two typical situations, and the relative trajectories of the femoral component “seen” by the spacer are acquired and plotted. Specifically, the femoral component rotates around the OCXC and OCZC axes of the sensing device slowly with a rotation velocity of ~30° per second, and this rotation velocity is close to that in the TKA procedure in which the surgeons check the trial implants by slowly moving the tibia with respect to the femur.
As shown in Figure 18a, the femoral component is in an appropriate position. The measured force data shows good ligament balance, the instant relative pose meets the expectation, and the kinetic trajectory is smooth and lies in the YOZ plane.
As shown in Figure 18b, the femoral component is in a good position, but the spacer does not exactly fit the space between the femoral component and the tibia component. In this case, both the force measurement and the instant relative pose at any specific point show good results. However, the kinetic trajectory shows that the relative movement between the spacer and the femoral component has some fluctuation, which does not meet the expectation. This case is an example of a TKA surgical defect that can be found only using the kinetic pose reconstruction.
The experimental results show that combining the force sensing, the instant pose reconstruction and the kinetic pose reconstruction provides a more comprehensive evaluation of the TKA surgery than force sensing alone as in [8].

6. Conclusions

A wireless visualized measurement system has been proposed and implemented to improve the quality of TKA surgeries. The system consists of a multimodal sensing device, a wireless data receiver and a multimodal data processing workstation. The multimodal sensing device can take images of the femoral component of the knee prosthesis and measure the contact force distribution between the spacer and the femoral component. The system is capable of instant and kinetic prosthesis pose reconstruction based on the image sensing. With the image processing algorithms proposed in this work, the system provides pose reconstruction with high accuracy: the absolute reconstruction errors of the flexion-extension rotation angle, the internal–external rotation angle and the mediolateral distance between the femoral component and the spacer are less than 1.73°, 1.08° and 1.55 mm, respectively. The force imbalance measurement error is less than ±5 N. With the kinetic pose estimation algorithm, the presented system can output a 3D motion trajectory that is not available from other systems that acquire only single-modal data such as the force data. The system is used during the surgery as a trial component, so its influence on the TKA procedure is quite small, and the TKA surgeons can easily use it in the standard clinical procedures.
The current implementation has several limitations that can be improved in the future. The image sensing frame rate is limited to 8 fps by the data rate of the transmitter, and the wireless transmission distance is limited by the antenna gain. In the future, the transmitter and antenna design will be improved so that the system can provide a higher image frame rate and a longer wireless communication distance. The device will soon be validated in the real clinical environment; since the data processing performance may degrade under real conditions, the algorithms will then be optimized for the real situations.

Author Contributions

Conceptualization, Z.W. and H.J.; methodology, H.J. and S.X.; validation, S.X. and Y.G.; data curation, S.X. and Y.G.; writing—original draft preparation, S.X.; writing—review and editing, H.J.; project administration, Z.W.; funding acquisition, H.J.

Funding

This work was funded, in part, by the National Natural Science Foundation of China under contract number 61661166010, the Suzhou-Tsinghua Innovation Leadership Program under contract number 2016SZ0214, the National Key R&D Program of China under contract number 2016YFC0105603, and the Beijing Engineering Research Center under No. BG0149.

Acknowledgments

The authors want to acknowledge the invaluable contribution of Hong Chen and Shaojie Su in this project.

Conflicts of Interest

The authors declare no conflict of interest.

Appendix A

In (17)–(22), the coefficients ai, bi, ci, di, ei, fi, gi, hi, ii, ji (i = 1, 2, 3) are listed below:
$$\begin{cases} a_1 = D_1^2/(D_1^2+D_2^2) \\ b_1 = \cos\gamma_2 - \cos\gamma_1\,(\vec{r}_2\cdot\vec{r}_3) \\ c_1 = a_1\sin^2\gamma_2 - \cos\gamma_1\cos\gamma_2\,(\vec{r}_2\cdot\vec{r}_3) + \cos^2\gamma_2 + \cos^2\gamma_1 - 1 \\ d_1 = \vec{r}_2\cdot\vec{r}_3 \\ e_1 = \cos\gamma_2\,(\vec{r}_2\cdot\vec{r}_3) - \cos\gamma_1 \end{cases}$$
$$n_1 = a_1\sin^2\gamma_2 - \sin^2\gamma_1$$
$$\begin{cases} f_1 = a_1^2 - a_1d_1^2 \\ g_1 = 2(a_1b_1 - a_1d_1e_1) \\ h_1 = b_1^2 + 2a_1c_1 - a_1e_1^2 - n_1d_1^2 \\ i_1 = 2(b_1c_1 - n_1d_1e_1) \\ j_1 = c_1^2 - n_1e_1^2 \end{cases}$$
$$\begin{cases} a_2 = D_2^2/(D_1^2+D_2^2) \\ b_2 = \cos\gamma_2 - \cos\gamma_3\,(\vec{r}_4\cdot\vec{r}_3) \\ c_2 = a_2\sin^2\gamma_2 - \cos\gamma_3\cos\gamma_2\,(\vec{r}_4\cdot\vec{r}_3) + \cos^2\gamma_2 + \cos^2\gamma_3 - 1 \\ d_2 = \vec{r}_4\cdot\vec{r}_3 \\ e_2 = \cos\gamma_2\,(\vec{r}_4\cdot\vec{r}_3) - \cos\gamma_3 \end{cases}$$
$$n_2 = a_2\sin^2\gamma_2 - \sin^2\gamma_3$$
$$\begin{cases} f_2 = a_2^2 - a_2d_2^2 \\ g_2 = 2(a_2b_2 - a_2d_2e_2) \\ h_2 = b_2^2 + 2a_2c_2 - a_2e_2^2 - n_2d_2^2 \\ i_2 = 2(b_2c_2 - n_2d_2e_2) \\ j_2 = c_2^2 - n_2e_2^2 \end{cases}$$
$$\begin{cases} m = \vec{r}_2\cdot\vec{r}_4 \\ n_3 = \cos\gamma_3\,(\vec{r}_2\cdot\vec{r}_4) - \cos\gamma_1 \\ p = \cos\gamma_1\,(\vec{r}_2\cdot\vec{r}_4) - \cos\gamma_3 \\ q = \cos\gamma_1\cos\gamma_3\,(\vec{r}_2\cdot\vec{r}_4) - \cos^2\gamma_1 + \sin^2\gamma_3 \end{cases}$$
$$\begin{cases} f_3 = m\,a_1a_2 \\ g_3 = m(a_1b_2 + a_2b_1) + n_3\,a_1d_2 + p\,a_2d_1 \\ h_3 = m(a_1c_2 + a_2c_1 + b_1b_2) + n_3(a_1e_2 + b_1d_2) + p(a_2e_1 + b_2d_1) + q\,d_1d_2 \\ i_3 = m(b_1c_2 + b_2c_1) + n_3(c_1d_2 + b_1e_2) + p(c_2d_1 + b_2e_1) + q(e_1d_2 + d_1e_2) \\ j_3 = m\,c_1c_2 + n_3\,c_1e_2 + p\,c_2e_1 + q\,e_1e_2 \end{cases}$$

References

  1. Kane, R.L.; Saleh, K.J.; Wilt, T.J.; Bershadsky, B.; Cross, W.W., 3rd; MacDonald, R.M.; Rutks, I. Total Knee Replacement: Summary; US Department of Health and Human Services, Public Health Service, Agency for Healthcare Research and Quality: Rockville, MD, USA, 2003.
  2. Bono, J.V.; Scott, R.D. Revision Total Knee Arthroplasty; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
  3. Victor, J.; Dujardin, J.; Vandenneucker, H.; Arnout, N.; Bellemans, J. Patient-specific guides do not improve accuracy in total knee arthroplasty: A prospective randomized controlled trial. Clin Orthop Relat Res. 2014, 472, 263–271. [Google Scholar] [CrossRef] [PubMed]
  4. Pitta, M.; Esposito, C.I.; Li, Z.; Lee, Y.Y.; Wright, T.M.; Padgett, D.E. Failure After Modern Total Knee Arthroplasty: A Prospective Study of 18,065 Knees. J. Arthrop. 2018, 33, 407–414. [Google Scholar] [CrossRef] [PubMed]
  5. Kim, K.T.; Lee, S.; Ko, D.O.; Seo, B.S.; Jung, W.S.; Chang, B.K. Causes of Failure after Total Knee Arthroplasty in Osteoarthritis Patients 55 Years of Age or Younger. Knee Surg Relat Res. 2014, 26, 13–19. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  6. Hafezalkotob, A.; Hafezalkotob, A. Comprehensive MULTIMOORA method with target-based attributes and integrated significant coefficients for materials selection in biomedical applications. Mater. Des. 2015, 87, 949–959. [Google Scholar] [CrossRef]
  7. Heinlein, B.; Graichen, F.; Bender, A.; Rohlmann, A.; Bergmann, G. Design, calibration and pre-clinical testing of an instrumented tibial tray. J. Biomech. 2007, 40, S4–S10. [Google Scholar] [CrossRef] [PubMed]
  8. Heinlein, B.; Kutzner, I.; Graichen, F.; Bender, A.; Rohlmann, A.; Halder, A.M.; Beier, A.; Bergmann, G. Complete data of total knee replacement loading for level walking and stair climbing measured in vivo with a follow-up of 6–10 months. Clin. Biomech. 2009, 24, 315–326. [Google Scholar] [CrossRef]
  9. Kutzner, I.; Heinlein, B.; Graichen, F.; Bender, A.; Rohlmann, A.; Halder, A.; Beier, A.; Bergmann, G. Loading of the knee joint during activities of daily living measured in vivo in five subjects. J. Biomech. 2010, 43, 2164–2173. [Google Scholar] [CrossRef]
  10. Morris, B.A.; D’Lima, D.D.; Slamin, J.; Kovacevic, N.; Arms, S.W.; Townsend, C.P.; Colwell Jr, C.W. e-Knee: Evolution of the electronic knee prosthesis-Telemetry technology development. J. Bone Joint Surg. 2001, 83, 62–66. [Google Scholar] [CrossRef]
  11. D’Lima, D.D.; Patil, S.; Steklov, N.; Slamin, J.E.; Colwell, C.W., Jr. In vivo knee forces after total knee arthroplasty. Clin. Ortho. Relat. Res. 2005, 440, 45–49. [Google Scholar]
  12. D’Lima, D.D.; Steklov, N.; Patil, S.; Colwell, C.W., Jr. The Mark Coventry Award: In vivo knee forces during recreation and exercise after knee arthroplasty. Clin. Ortho. Relat. Res. 2008, 466, 2605–2611. [Google Scholar] [CrossRef]
  13. Kirking, B.; Krevolin, J.; Townsend, C.; Colwell, C.W., Jr.; D’Lima, D.D. A multiaxial force-sensing implantable tibial prosthesis. J. Biomech. 2006, 39, 1744–1751. [Google Scholar] [CrossRef] [PubMed]
  14. Forchelet, D.; Simoncini, M.; Arami, A.; Bertsch, A.; Meurville, E.; Aminian, K.; Ryser, P.; Renaud, P. Enclosed Electronic System for Force Measurements in Knee Implants. Sensors 2014, 14, 15009–15021. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  15. Arami, A.; Dejnabadi, H.; Leclercq, V.; Aminian, K. An Implantable System for Angles Measurement in Prosthetic Knee. In Proceedings of the International Society of Biomechanics (ISB), Brussels, Belgium, 3–7 July 2011. [Google Scholar]
  16. Arami, A.; Simoncini, M.; Atasoy, O.; Ali, S.; Hasenkamp, W.; Bertsch, A.; Meurville, E.; Tanner, S.; Renaud, P.; Dehollain, C.; et al. Instrumented Knee Prosthesis for Force and Kinematics Measurements. IEEE Trans. Autom. Sci. Eng. 2013, 10, 615–624. [Google Scholar] [CrossRef]
  17. Arami, A.; Vallet, A.; Aminian, K. Accurate Measurement of Concurrent Flexion–Extension and Internal–External Rotations in Smart Knee Prostheses. IEEE Trans. Biomed. Eng. 2013, 60, 2504–2510. [Google Scholar] [CrossRef]
  18. Lajnef, N.; Elvin, N.G.; Chakrabartty, S. A Piezo-Powered Floating-Gate Sensor Array for Long-Term Fatigue Monitoring in Biomechanical Implants. IEEE Trans. Biomed. Circuits Syst. 2008, 2, 164–172. [Google Scholar] [CrossRef] [PubMed]
  19. Gustke, K.A.; Golladay, G.J.; Roche, M.W.; Elson, L.C.; Anderson, C.R. A New Method for Defining Balance: Promising Short-Term Outcomes of Sensor-Guided TKA. J. Arthroplast. 2014, 29, 955–960. [Google Scholar] [CrossRef]
  20. Luo, H.; Liu, M.; Chen, H.; Zhang, C.; Wang, Z. A wireless force measurement system for Total Knee Arthroplasty. In Proceedings of the 2012 IEEE International Symposium on Circuits and Systems, Seoul, Korea, 20–23 May 2012. [Google Scholar]
  21. Crottet, D.; Maeder, T.; Fritschy, D.; Bleuler, H.; Nolte, L.P.; Pappas, I.P. Development of a Force Amplitude- and Location-Sensing Device Designed to Improve the Ligament Balancing Procedure in TKA. IEEE Trans. Biomed. Eng. 2005, 52, 1609–1611. [Google Scholar] [CrossRef]
  22. Jiang, H.; Li, F.; Chen, X.; Ning, Y.; Zhang, X.; Zhang, B.; Teng, M.; Wang, Z. A SoC with 3.9 mW 3 Mbps UHF transmitter and 240 μW MCU for capsule endoscope with bidirectional communication. In Proceedings of the 2010 IEEE Asian Solid State Circuits Conference (A-SSCC), Beijing, China, 8–10 November 2010. [Google Scholar]
  23. Dabov, K.; Foi, A.; Katkovnik, V.; Egiazarian, K. Image Denoising by Sparse 3-D Transform-Domain Collaborative Filtering. IEEE Trans. Image Process. 2007, 16, 2080–2095. [Google Scholar] [CrossRef]
  24. Jobson, D.J.; Rahman, Z.; Woodell, G.A. Properties and performance of a center/surround retinex. IEEE Trans. Image Process. 1997, 6, 451–462. [Google Scholar] [CrossRef]
  25. Rahman, Z.; Jobson, D.J.; Woodell, G.A. Multi-scale retinex for color image enhancement. In Proceedings of the 3rd IEEE International Conference on Image Processing, Lausanne, Switzerland, 19 September 1996. [Google Scholar]
  26. Classic Camera Calibration Method of OpenCV. Available online: https://opencv-python-tutroals.readthedocs.io/en/latest/py_tutorials/py_calib3d/py_calibration/py_calibration.html (accessed on 29 June 2019).
  27. Fischler, M.A.; Bolles, R.C. Random Sample Consensus: A Paradigm for Model Fitting with Applications to Image Analysis and Automated Cartography. Graph. Image Process. 1981, 24, 381–395. [Google Scholar] [CrossRef]
  28. Classic Contours Recognition Method of OpenCV. Available online: https://opencv-python-tutroals.readthedocs.io/en/latest/py_tutorials/py_imgproc/py_contours/py_contour_features/py_contour_features.html?highlight=contours (accessed on 29 June 2019).
  29. Lepetit, V.; Moreno-Noguer, F.; Fua, P. Epnp: An accurate o(n) solution to the pnp problem. Int. J. Comput. Vis. 2009, 88, 155–166. [Google Scholar] [CrossRef]
  30. Vigueras, F.; Hernández, A.; Maldonado, I. Iterative Linear Solution of the Perspective n-Point Problem Using Unbiased Statistics. In Proceedings of the 2009 Eighth Mexican International Conference on Artificial Intelligence, Guanajuato, Mexico, 9–13 November 2009. [Google Scholar]
  31. Gao, J.; Zhang, Y. An Improved Iterative Solution to the PnP Problem. In Proceedings of the 2013 International Conference on Virtual Reality and Visualization, Xi’an, China, 14–15 September 2013. [Google Scholar]
  32. Merzban, M.H.; Abdellatif, M.; Abouelsoud, A.A. A simple solution for the non perspective three point pose problem. In Proceedings of the 2014 International Conference on 3D Imaging (IC3D), Liege, Belgium, 9–10 December 2014. [Google Scholar]
  33. Li, S.; Xu, C. A Stable Direct Solution of Perspective-Three-Point Problem. Int. J. Pattern Recogn. Artif. Intell. 2010, 25, 627–642. [Google Scholar] [CrossRef]
  34. Horaud, R.; Conio, B.; Leboulleux, O. An Analytic Solution for the Perspective 4-Point Problem. In Proceedings of the CVPR ’89: IEEE Computer Society Conference on Computer Vision and Pattern Recognition, San Diego, CA, USA, 4–8 June 1989. [Google Scholar]
  35. Quan, L.; Lan, Z. Linear N-point camera pose determination. IEEE Trans. Pattern Anal. Mach. Intell. 1999, 21, 774–780. [Google Scholar] [CrossRef]
  36. Schweighofer, G.; Pinz, A. Robust Pose Estimation from a Planar Target. IEEE Trans. Pattern Anal. Mach. Intell. 2006, 28, 2024–2030. [Google Scholar] [CrossRef] [PubMed]
  37. Zheng, Y.; Kuang, Y.; Sugimoto, S.; Åström, K.; Okutomi, M. Revisiting the PnP Problem: A Fast, General and Optimal Solution. In Proceedings of the 2013 IEEE International Conference on Computer Vision, Darling Harbour, Sydney, 1–8 December 2013. [Google Scholar]
  38. Xu, C.; Zhang, L.; Cheng, L.; Koch, R. Pose Estimation from Line Correspondences: A Complete Analysis and a Series of Solutions. IEEE Trans. Pattern Anal. Mach. Intell. 2017, 39, 1209–1222. [Google Scholar] [CrossRef] [PubMed]
  39. Virtual knee surgery. Available online: https://www.silvergames.com/en/virtual-knee-surgery (accessed on 29 June 2019).
Figure 1. A total knee replacement prosthesis (side view).
Figure 2. Proposed system architecture.
Figure 3. Typical situations of the knee joint prosthesis of total knee arthroplasty (TKA) surgery: (a) the joint prosthesis is well installed, (b) inappropriate condition with the spacer tilted, (c) inappropriate condition with malalignment between the spacer and the femoral component.
Figure 4. Functional diagram of the multimodal sensing device.
Figure 4. Functional diagram of the multimodal sensing device.
Sensors 19 02909 g004
Figure 5. Package and assembling of the sensing device. (a) printed circuit board (PCB), (b) the entire sensing device with the transparent shell.
Figure 6. Block diagram of the data receiver.
Figure 7. PCB and package of the data receiver.
Figure 8. Data processing flow in the workstation.
Figure 9. Femoral component with five pairs of control points on its surface: (a) side view, (b) front view (partial).
Figure 10. The coordinate systems to represent the relative position between the spacer and the femoral component.
Figure 11. Geometry of the proposed analytic and non-iterative method.
Figure 12. Typical angles for instant prosthesis pose check: (a) 0°, (b) 90°.
Figure 13. Kinetic pose reconstruction actually reconstructs the trajectory of point P in the femoral component.
Figure 14. Prototype system and the test platform.
Figure 15. Instant relative pose reconstruction: (a) original image, (b) after denoising and contrast enhancement, (c) after lens distortion correction, (d) control point recognition, (e) instant pose reconstruction.
Figure 16. Experimental pose reconstruction errors of flexion-extension (pitch) angle, internal–external (yaw) angle, and mediolateral translation.
Figure 17. Contact force imbalance measurement: (a) measured force difference vs. actual force difference, (b) force imbalance measurement error vs. actual force difference.
Figure 18. Examples of kinetic pose reconstruction: (a) a successful surgery gives a smooth kinetic trajectory, (b) the kinetic trajectory reveals an inappropriate installation that cannot be found using the force measurement or the instant pose reconstruction.
Table 1. Sensing device performance summary.
Specification                                          Performance
Typical size (varies with the prosthesis model)        76 mm × 52 mm × 17 mm
Typical weight (varies with the prosthesis model)      25.56 g
Force sensing: number of sensors                       6
Force sensing: measurement range (each sensor)         0–45 N
Image sensing: resolution                              240 × 240 or 480 × 480
Image sensing: maximum frame rate                      8 fps (240 × 240)
Wireless transmitter: carrier frequency                416 MHz
Wireless transmitter: maximum data rate                3 Mbps
Power supply                                           3 V
Average power consumption                              ~10 mA @ 3 V
Battery lifetime (CR2025 Li/MnO2)                      >5 h
Table 2. Pose Reconstruction Errors.
                               Maximum Error      Average Error      Mean Error    Standard Deviation σ
                               (Absolute Value)   (Absolute Value)
Flexion-extension (pitch)      1.73°              0.67°              −0.01°        1.06°
Internal–external (yaw)        1.08°              0.51°              0.03°         0.50°
Mediolateral translation       1.55 mm            0.82 mm            −0.04 mm      0.72 mm
