Article

Multimodal and Multiview Wound Monitoring with Mobile Devices

by Evelyn Gutierrez 1,2,*, Benjamín Castañeda 1, Sylvie Treuillet 2 and Ivan Hernandez 3,4

1 Laboratorio de Imágenes Médicas, Pontificia Universidad Católica del Perú, Av. Universitaria Cuadra 18, Lima 15086, Peru
2 Laboratoire PRISME, Université d’Orléans, 8 Rue Léonard de Vinci, 45100 Orléans, France
3 Hospital Nacional Hipólito Unanue, Av. Cesar Vallejo 1390, Lima 15007, Peru
4 Facultad de Medicina, Universidad Ricardo Palma, Av. Benavides 5440, Lima 15039, Peru
* Author to whom correspondence should be addressed.
Photonics 2021, 8(10), 424; https://doi.org/10.3390/photonics8100424
Submission received: 29 August 2021 / Revised: 28 September 2021 / Accepted: 28 September 2021 / Published: 2 October 2021
(This article belongs to the Special Issue Tissue Optics)

Abstract

Along with geometric and color indicators, thermography is another valuable source of information for wound monitoring. The interaction of geometry with thermography can provide predictive indicators of wound evolution; however, existing processes rely on high-cost devices in a static configuration, which restricts the scanning of large surfaces. In this study, we propose the use of commercial devices, such as mobile devices and portable thermal cameras, to integrate information from different wavelengths onto the surface of a 3D model. A handheld acquisition is proposed in which color images are used to create a 3D model by Structure from Motion (SfM), and thermography is incorporated onto the 3D surface through a pose estimation refinement based on optimizing the temperature correlation between multiple views. Thermal and color 3D models were successfully created for six patients with multiple views from a low-cost commercial device. The results show the successful application of the proposed methodology: thermal mapping on 3D models is not limited in scanning area and provides consistent information across multiple thermal camera views. Further work will focus on studying the quantitative metrics obtained from the multi-view 3D models created with the proposed methodology.

1. Introduction

Three-dimensional measurements (i.e., length, width, area, and depth) are useful tools to accurately monitor wounds [1,2]. Measurements obtained with computer vision-based methods are noninvasive and more reliable than those obtained manually [3,4]. In addition, reconstructed 3D virtual color models allow for remote consultation and tracking history [5]. However, a complete wound assessment is necessary for clinicians to choose better treatments, and this requires not only volumetric metrics but additional information.
Temperatures in and around the wound, such as the periwound temperature and the wound bed temperature, appear to be useful for clinical wound assessment [6,7]. Viewing these zones independently or in relation to one another can help identify stagnant or infection-prone wounds. For example, an increased periwound temperature has been found to be associated with infection [8], and wound temperature correlates with a wound bed score, which combines several wound characteristics and has been validated as a useful predictor of wound closure [9].
In addition, temperature comparisons between different areas of the wound bed and outside the wound also provide valuable information. A temperature difference of 4 °C or more between the wound bed and normal reference skin is associated with infected wounds and may be useful in differentiating them from normally inflamed wounds [10]. The difference in temperature between the wound periphery and normal skin can help identify a tendency toward non-healing, and a large variation in wound bed temperature also seems to indicate such a tendency [11]. A more recent study proposes calculating an area indicator based on the temperature pattern [12]. Wound thermal areas are then used to compute area ratios between two consecutive weeks, and these appear to be useful for predicting the healing status at week four.
The 3D model and the thermal surface distribution can provide geometrical (areas and volume) and thermal metrics in addition to a visualization that helps raise patient awareness of their wound status. Most studies show the feasibility of creating 3D surface thermography using expensive devices and cameras in static positions [13,14]; however, thermography from a static viewing position limits the observation and analysis of wounds to a single view and underexploits the 3D component.
A single thermal image provides a limited analysis of a wound, which is a problem in the case of large wounds, multiple wounds, or wounds located in strongly curved areas, such as the contour of the foot. In these cases, a single view provides only partial information about the inside and outside of the wound. In curved areas, the emissivity of the skin, estimated at 0.98 from a frontal view, changes with the viewing angle, and this can result in temperature estimation errors of up to 4 °C [15,16,17]. Therefore, multiple thermal images are needed to provide a detailed and reliable measurement of the examination region.
Although large wounds or multiple wounds in the same area require more than a single view for a proper analysis, most previous work has used single-view static cameras. An interesting alternative is the use of portable thermal devices [18,19]. Moghadam presents a device that combines a thermal camera, a light projector, and a depth camera [18]. Xu et al. propose a calibration between portable thermal cameras and depth imaging [19]. Both alternatives can be used in handheld mode, but they require non-commercially available hardware that can be expensive or difficult to obtain, especially for hospitals in remote locations.
In this work, we propose a comprehensive process to create a multiview and multimodal (thermal and color) 3D model using low-cost commercial devices to facilitate wide deployment in hospitals and remote care centers. The contributions of this study can be summarized as follows:
  • The process of creating a 3D surface thermography from scratch using only portable devices: a mobile device and a low-cost portable thermal camera.
  • A novel approach to adjust the poses of the multiple views captured with the thermal camera to have an accurate mapping of thermal data on the 3D model surface.
  • The fusion of temperatures from multiple 2D thermograms to create a robust analysis of thermography using compelling 3D surface thermal mapping.

2. Materials and Methods

2.1. Data Acquisition

Images were acquired using a low-cost commercial thermal camera, FlirOne Pro (FLIR Systems, Inc., Wilsonville, OR, USA), and a mobile device, Samsung Galaxy Tab S4, at Hipólito Unanue National Hospital in Lima, Peru. The FlirOne Pro thermal camera captures a pair of images in one shot: a 1080 × 1440 pixel RGB image and a 480 × 640 pixel thermal image. According to the technical specifications, the thermal accuracy of the FlirOne Pro is ±3 °C or ±5%, with a thermal sensitivity of 70 mK and a spectral range of 8–14 µm. The mobile device provides high-resolution RGB images of 4000 × 3000 pixels.
Prior to acquisition, patients were informed, and their consent to participate was obtained. Patients were asked to remain in a comfortable position, and a reference card was placed near the wound to adjust the scale of the 3D model.
Acquisition is performed handheld and starts with high-resolution color images captured with the mobile device. Around 40 images are taken from different views following a classical photogrammetry acquisition, in which the device moves along a recommended arc-shaped path with the camera always oriented towards the center of the wound. The goal of this acquisition is to obtain a variety of overlapping views from which to build a detailed 3D model. For larger wounds, the number of images can be increased to cover the entire wound with multiple views.
Next, with the thermal camera also in handheld mode, around 8 images are obtained from different views using a similar arc-shaped motion. In addition, to provide a reference view of the temperature distribution in the wound and surrounding skin, a frontal thermal view of the wound is captured approximately 10 cm farther away than the other thermal images. This farthest frontal image is taken at a time chosen by the operator, i.e., it is not necessarily the last one. Figure 1 shows an example of the acquisition performed with the mobile devices in handheld mode and the camera poses used.
The goal of the multiple thermal images is to capture the thermography of the wound and the surrounding area from various perspectives in order to combine them and map the temperatures onto the surface of the 3D model. Fewer images are needed than for color; however, a larger number of images from different views can help cover a more extensive surface of the 3D model with thermography. Additionally, since this is a manual acquisition, the operator may take a few more color or thermal images than requested by inadvertently pressing the camera shutter button. This is not a problem, as additional images of the wound and its surroundings help in reconstructing the 3D model and creating the 3D surface thermography. The entire image acquisition usually takes about 2 min and depends on the operator's skill in taking the pictures.

2.2. Thermal Mapping on 3D Model

From a 3D point cloud and a single thermal image with a known camera pose, temperatures can be assigned to the surface of the 3D model. For this, the 3D point cloud is projected onto the 2D thermal image plane, and the temperatures of each pixel are mapped to the projected points.
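As a minimal illustration of this step, the following Python sketch (our own simplified code with hypothetical names; it assumes a pinhole camera model with known intrinsics and ignores occlusions) projects a point cloud into a thermal image and samples a per-point temperature:

```python
import numpy as np

def map_temperatures(points, R, t, K, thermal_img):
    """Assign per-point temperatures by projecting a 3D point cloud
    onto the 2D thermal image plane (pinhole model, occlusions ignored).

    points:      (N, 3) points in world coordinates
    R, t:        thermal camera pose (world -> camera convention)
    K:           (3, 3) thermal camera intrinsic matrix
    thermal_img: (H, W) array of temperatures in degrees Celsius
    """
    h, w = thermal_img.shape
    cam = (R @ points.T).T + t                  # world -> camera coordinates
    temps = np.full(len(points), np.nan)        # NaN where a point is unseen
    front = cam[:, 2] > 0                       # points in front of the camera
    uv = (K @ cam[front].T).T
    u = np.round(uv[:, 0] / uv[:, 2]).astype(int)
    v = np.round(uv[:, 1] / uv[:, 2]).astype(int)
    inside = (u >= 0) & (u < w) & (v >= 0) & (v < h)
    idx = np.flatnonzero(front)[inside]         # visible point indices
    temps[idx] = thermal_img[v[inside], u[inside]]
    return temps
```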
The 3D model is created using the Structure from Motion (SfM) technique, as it only requires a set of color images to create the point cloud, mesh, and texture of an object in 3D. This technique is particularly useful since it allows any commercially available camera to take multiple images from different perspectives and provides useful geometrical metrics for wounds [20]. In this work, an incremental SfM pipeline is used that accepts images from cameras with different internal calibrations: the color images obtained with the mobile device and the color images obtained with the FlirOne device [21]. The whole process is performed using the Meshroom software [22].
The registration of 2D temperatures to the 3D point cloud is accurate when the color and thermal cameras are in a fixed relative position and the captures are performed at the same time. However, in handheld acquisition, despite the cameras being in a fixed relative position on the device, the captures are performed asynchronously with a small delay between them; therefore, accurately registering the 2D thermal images to the 3D surface becomes problematic. To address this problem, we propose a baseline registration between the FlirOne color and thermal image pair based on multimodal calibration, followed by a refinement of the infrared (IR) camera pose that optimizes the correlation of temperatures mapped from different thermal views.

2.3. Baseline Registration

The two sensors of the FlirOne thermal device, a color and an infrared image sensor, are separated by a small distance inside the device. The camera poses obtained from 3D modeling are those of the FlirOne's color image sensor. Thus, to estimate the baseline pose of the corresponding thermal image sensor, a geometric transformation consisting of a rotation and a translation has to be applied.
The rotation ($R_0$) and translation ($t_0$) required for this transformation are obtained through a multimodal calibration using a thermal checkerboard created in our previous work [23]. The transformation is then applied to the extrinsic camera pose parameters of the $i$-th color (RGB) image estimated by SfM. Let $R_i^{RGB}$ and $t_i^{RGB}$ be the extrinsic parameters of the $i$-th color image. The baseline estimate of the $i$-th thermal camera pose is obtained as:

$$R_i^{IR_{base}} = R_0 R_i^{RGB}, \qquad t_i^{IR_{base}} = R_0 t_i^{RGB} + t_0$$
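Applying this transformation to every SfM pose is direct; a small numpy sketch (function and variable names are ours):

```python
import numpy as np

def baseline_ir_pose(R_rgb, t_rgb, R0, t0):
    """Baseline IR sensor pose from the SfM pose of the FlirOne color
    sensor and the calibrated color-to-IR transform (R0, t0)."""
    R_ir = R0 @ R_rgb        # rotate the color-sensor orientation
    t_ir = R0 @ t_rgb + t0   # rotate and offset the translation
    return R_ir, t_ir
```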

2.4. Multiview Thermography

The creation of a multi-view 3D surface thermography consists of three steps: first, selecting a reference thermal mapping; then, adjusting the camera poses according to the reference mapping; and finally, combining the thermal mapping information from the different views. The result is a multi-view thermal mapping of the 3D surface, which has a higher resolution and can cover a larger area than single-view thermal mapping. The workflow is presented in Figure 2.

2.4.1. Reference Surface Thermography

The reference surface thermography, or reference view, in this study is the surface thermography associated with a reference thermal frame: the thermal image taken from the farthest frontal view of the wound.
The reference thermal frame is automatically selected from all thermal frames by calculating the angle and distance between each camera and the wound. For each thermal image, a wound segmentation is performed on its corresponding RGB image; the 2D segmentation is then projected onto the 3D model surface; and finally, the distance and angle between the thermal camera and the wound region are calculated. These distances and angles are used to automatically select the farthest frontal frame, i.e., the reference thermal frame. The automated 2D wound segmentation is based on a deep learning model developed in a previous study by the same research group [24].
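A possible implementation of this selection is sketched below; the exact frontality criterion is not specified in the text, so the 20° threshold and all names are our assumptions:

```python
import numpy as np

def select_reference_frame(poses, wound_center, wound_normal, max_angle_deg=20.0):
    """Pick the farthest frame among those viewing the wound frontally.

    poses:        list of (R, t) thermal camera poses (world -> camera)
    wound_center: (3,) centroid of the wound region on the 3D surface
    wound_normal: (3,) unit normal of the wound region
    """
    best, best_dist = None, -np.inf
    for i, (R, t) in enumerate(poses):
        cam_center = -R.T @ t                       # camera center in world frame
        view = wound_center - cam_center
        dist = np.linalg.norm(view)
        cosang = abs(view @ wound_normal) / dist
        angle = np.degrees(np.arccos(np.clip(cosang, -1.0, 1.0)))
        if angle <= max_angle_deg and dist > best_dist:
            best, best_dist = i, dist               # frontal enough and farther
    return best                                     # None if no frontal view exists
```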
The reference surface thermography provides an overview of the temperature pattern of the wound and its surroundings. Though this surface thermography does not contain detailed information, it serves to fine-tune the camera poses of the remaining views, which will provide finer temperature details in the final 3D surface thermography.

2.4.2. Thermal Camera Pose Adjustments

The relative baseline transformation between the cameras, denoted $R_0$ (rotation) and $t_0$ (translation), gives an estimate of the thermal sensor pose that must then be adjusted. For this adjustment, the reference view, i.e., the reference 3D surface thermography, is used as the basis. The thermal camera poses are adjusted to ensure similarity between the temperatures projected from the reference view and those of each thermal image.
The adjustment of the camera poses is performed for each of the individual frames obtained with the thermal camera. It starts by projecting the reference view to a thermal image plane using the baseline thermal camera pose. Then, an optimization is performed by adjusting the thermal camera pose in order to maximize the similarity in temperatures between the synthetic thermal image (projected reference surface thermography) and the real thermal image. Since both synthetic and real thermal images are in the same modality, the similarity indicator chosen to perform this optimization is the intensity-based 2D correlation coefficient [25,26]. Moreover, the 2D correlation coefficient provides an indicator that is easy to interpret: the closer it is to one, the more similar the images are.
$$R^{IR_{adj}}, t^{IR_{adj}} = \underset{R,t}{\operatorname{argmax}}\; \mathrm{Corr}\left(\hat{I}(R,t),\, I\right)$$

where $\hat{I}(R,t)$ is a one-dimensional vector of temperature values from the synthetic thermal image created with the camera pose $R$ and $t$, and $I$ is a one-dimensional vector of temperature values from the real thermal image.
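A compact sketch of this refinement follows. The paper does not specify the optimizer, so we use a derivative-free Nelder–Mead search over a 6-parameter pose update as an example; render_reference(R, t) is an assumed helper that projects the reference surface thermography into the image plane and returns NaN outside the model:

```python
import numpy as np
from scipy.optimize import minimize
from scipy.spatial.transform import Rotation

def refine_pose(render_reference, thermal_img, R_base, t_base):
    """Refine one IR camera pose by maximizing the 2D correlation between
    the real thermal image and the reference thermography rendered from
    the candidate pose."""
    real = thermal_img.ravel()

    def cost(x):
        dR = Rotation.from_rotvec(x[:3]).as_matrix()    # small rotation update
        R, t = dR @ R_base, t_base + x[3:]              # candidate pose
        synth = render_reference(R, t).ravel()
        ok = ~np.isnan(synth)
        if ok.sum() < 100:                              # degenerate overlap
            return 1.0
        return -np.corrcoef(synth[ok], real[ok])[0, 1]  # maximize correlation

    res = minimize(cost, np.zeros(6), method="Nelder-Mead")
    dR = Rotation.from_rotvec(res.x[:3]).as_matrix()
    return dR @ R_base, t_base + res.x[3:]
```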

2.4.3. Combining Multi-View Thermal Data

The thermal mapping to the 3D model surface is performed for each of the thermal images using the refined camera poses obtained after the camera pose adjustment. The 3D point cloud projected onto the 2D thermal image plane is used to map temperatures from pixels to points. As a result, multiple thermal data measurements are obtained for each of the points on the surface of the 3D model.
The multiple measurements for each point are combined with a weighted average to obtain a temperature summary at each point on the surface. The weighted average of the temperatures from multiple views is computed with an algorithm similar to the one proposed by Vidas et al. [27]. In our case, the weighted average takes into account the distance and viewing angle between the camera and each point. The distance between the camera and each point is calculated from the camera pose. The angle is calculated between the principal axis of the camera and the normal of each point, the latter estimated via principal component analysis of the covariance matrix of the nearest neighbor points. Distance and angle calculations are computed with the Open3D library in Python [28].
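A sketch of these two computations with Open3D and numpy (our code; Open3D's normal estimation performs the PCA over nearest neighbors internally, and the neighborhood size of 30 is an assumption):

```python
import numpy as np
import open3d as o3d

def view_distance_and_angle(points, R, t, knn=30):
    """Per-point camera distance and viewing angle for one thermal view.

    Normals come from Open3D, which fits them via PCA of the covariance
    matrix of each point's nearest neighbors."""
    pcd = o3d.geometry.PointCloud(o3d.utility.Vector3dVector(points))
    pcd.estimate_normals(o3d.geometry.KDTreeSearchParamKNN(knn=knn))
    normals = np.asarray(pcd.normals)

    cam_center = -R.T @ t                 # camera center in world frame
    dist = np.linalg.norm(points - cam_center, axis=1)
    axis = R[2, :]                        # camera principal (z) axis in world frame
    cosang = np.abs(normals @ axis)       # sign-agnostic: normals are unoriented
    angle = np.degrees(np.arccos(np.clip(cosang, 0.0, 1.0)))
    return dist, angle
```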
Experimentally and theoretically, it has been shown that the viewing angle of a surface influences its emissivity, and consequently, errors in temperature measurements can be recorded [15,16]. When the viewing angle is less than about 40°, no significant changes in skin emissivity, and therefore no significant temperature errors, are reported. At wider angles, however, measurement errors appear: between 40° and 60°, errors of about 2 °C or less can occur, and beyond 60°, the errors grow larger.
Similarly, errors in temperature measurements appear as the distance between the skin and the camera increases [17]. Environmental conditions, reflection from other surfaces, and ambient radiation are factors that contribute to error when imaging at long distances. Therefore, thermographic images obtained at short distances, below 40 cm, are considered more reliable than those farther away.
Thus, in the proposed fusion of temperatures from different views, higher priority is given to thermal information obtained at a distance of up to 40 cm and a viewing angle of less than 60° between the camera and the skin. The weights are inversely associated with distance and angle via an inverse logistic function, assigning high weights to distances below 40 cm and angles below 60°. The weighting functions used to prioritize information obtained from different angles and distances are shown in Figure 3. The weighted average combining the information of k thermal images from different perspectives is computed at each point of the 3D point cloud as follows:
$$T_{final} = \sum_{i=1}^{k} w_i T_i, \quad \text{where } w_i = w_i^{dist}\, w_i^{ang}, \quad w_i^{dist} = \frac{1}{1+e^{(x-40)/4}}, \quad w_i^{ang} = \frac{1}{1+e^{(x-60)/4}}$$

where $T_{final}$ is the final temperature for a point in the 3D point cloud, and $w_i$ is the final weight of the thermal information at that point obtained from the $i$-th of the $k$ thermal images; $w_i^{dist}$ and $w_i^{ang}$ are the weights of the information obtained from the $i$-th thermal image according to the camera-point distance and angle, respectively.
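A vectorized sketch of this fusion (our code; we normalize the weights so that the result is a true weighted average, a detail the equation leaves implicit, and we use NaN to mark points not seen by a view):

```python
import numpy as np

def fuse_temperatures(temps, dists, angles):
    """Fuse per-point temperatures from k views.

    temps, dists, angles: (k, N) arrays; temps is NaN where unseen.
    Distances in cm, angles in degrees."""
    w_dist = 1.0 / (1.0 + np.exp((dists - 40.0) / 4.0))   # soft cutoff at 40 cm
    w_ang = 1.0 / (1.0 + np.exp((angles - 60.0) / 4.0))   # soft cutoff at 60 degrees
    w = w_dist * w_ang
    w[np.isnan(temps)] = 0.0                               # unseen views get no vote
    wsum = w.sum(axis=0)
    fused = np.full(temps.shape[1], np.nan)
    seen = wsum > 0
    fused[seen] = (w * np.nan_to_num(temps)).sum(axis=0)[seen] / wsum[seen]
    return fused
```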

2.5. 3D Wound and Periwound Area

Defining the wound and periwound areas in the 3D model is necessary to summarize the temperature measurements inside the wound and in the surrounding (periwound) skin. In this section, we explain how the wound and periwound areas are defined in the 3D model.
The wound segmentation in the 3D model was created from segmentations of the 2D color images. The 2D segmentation was performed using a deep learning model previously developed by the same research group [24]. All 2D segmentations were reprojected to the 3D point cloud in order to create the wound segmentation in 3D. Discrepancies between multiple segmentations were resolved according to the majority vote of the multiple views at each point in the cloud.
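A minimal sketch of this majority vote (our code and conventions: labels hold 1 for wound, 0 for background, and NaN where a view does not see the point):

```python
import numpy as np

def vote_3d_segmentation(labels):
    """Majority vote across views; labels is a (k_views, n_points) array."""
    votes = np.nansum(labels, axis=0)            # 'wound' votes per point
    seen = np.sum(~np.isnan(labels), axis=0)     # views observing each point
    wound = np.zeros(labels.shape[1], dtype=bool)
    ok = seen > 0
    wound[ok] = votes[ok] * 2 > seen[ok]         # strict majority of observing views
    return wound
```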
The periwound was then defined as the area outside the wound with a distance of up to 4 cm from the wound edges. For the calculation of the distances, the geodesic distance was used to account for the curvature of the body. Figure 4 illustrates the 2D wound segmentation, the 3D wound segmentation, and periwound area definition.

3. Results

The proposed methodology was successfully applied to wound images from six patients. Figure 5 illustrates the multiview thermal models for two cases: a wound located in a curved area and multiple wounds in the same area. By using multiview thermography with the proposed methodology, a more extensive skin area can be covered with thermal information, making it possible to obtain thermal metrics of both the wound bed and the periwound.
During pose refinement, the 2D correlation usually improves from 0.5 or less with the baseline poses to 0.75 or more with the proposed refinement. The improvements can be observed by projecting the reference 3D surface thermography onto each 2D view using the estimated camera pose and comparing the result with the original thermal image, as in Figure 6. This figure shows examples of synthetic thermal images created by projecting the reference 3D surface thermography with the baseline and the adjusted camera pose, respectively. The synthetic image created with the adjusted pose is clearly closer to the original thermal image.
Another way to observe these improvements is to view the 3D model from the thermal camera perspective and overlay the thermal image to check their correspondence. Figure 7 shows examples of thermal images superimposed on the 3D model view using the baseline and the adjusted camera pose. A better correspondence between the thermal image and the 3D color model is observed with the adjusted poses than with the baseline poses. However, small offsets can remain; for example, the second case from left to right in Figure 7 shows an improvement in the vertical alignment but not along the horizontal axis.
Furthermore, to quantitatively assess the camera pose improvement, we used an indicator of consistency between temperatures mapped from multiple views: the Intraclass Correlation Coefficient (ICC), a widely used consistency index [29]. Unlike the correlation coefficient used within our process, which evaluates the pixel-level similarity between each image and the reference thermal view, the ICC provides a summary measure of the overall consistency between the multiple temperature measurements assigned to each point of the 3D point cloud.
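The exact ICC form computed is not stated; as an illustration, here is a numpy sketch of the two-way, single-rater consistency form, ICC(C,1) in the notation of Koo and Li [29], assuming a complete matrix of points observed by all views:

```python
import numpy as np

def icc_consistency(X):
    """ICC(C,1) for an (n_points, k_views) matrix X of temperatures."""
    n, k = X.shape
    grand = X.mean()
    ms_rows = k * np.sum((X.mean(axis=1) - grand) ** 2) / (n - 1)  # between points
    ms_cols = n * np.sum((X.mean(axis=0) - grand) ** 2) / (k - 1)  # between views
    ss_total = np.sum((X - grand) ** 2)
    ms_err = (ss_total - ms_rows * (n - 1) - ms_cols * (k - 1)) / ((n - 1) * (k - 1))
    return (ms_rows - ms_err) / (ms_rows + (k - 1) * ms_err)
```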
Table 1 shows the ICC between temperatures from the different thermal views using the baseline and the adjusted camera poses. The ICC improves overall from moderate consistency (0.67 on average) with the baseline poses to good consistency (0.87 on average) with the adjusted poses. The largest improvements occur for patients with only moderate baseline consistency (baseline ICC of 0.65 or lower): P1, P2, and P4. Patients with a baseline ICC of 0.85 or higher show only slight or no improvement. Patient P5 is the only case in which adjusting the camera poses did not improve the consistency between the different thermal views; note, however, that the baseline consistency was already high (ICC = 0.89). In such cases, one can keep the baseline poses when combining the multiple thermal views.
Finally, the 3D models and surface temperature distributions created with the proposed methodology are used to obtain metrics of the inside and outside of the wound. Table 2 shows the results of the visual inspection of the wounds, the evaluation usually performed for wound follow-up at the hospital, while the quantitative metrics obtained from the 3D models and thermography created with our methodology are shown in Table 3.
The results show a diversity of patterns across patients, in particular in the temperature difference between the wound bed and the periwound zone, which may help reveal proper healing or problems around the wound. For example, patient P4 has a lower periwound temperature, which may suggest problems with wound progress; this could also be related to the suspected bone contact noted by visual inspection in Table 2.
Variability is also observed in the standard deviation of the periwound temperature. The patients with clinical observations are those with the greatest standard deviation in temperature around the wound, of 1 °C or more. These results may be related to circulation problems around the wound and encourage further analysis of more cases to evaluate all of these measurements.

4. Discussion

In this article, we present the process of creating 3D thermal models that help evaluate large wounds or multiple wounds in the same area. For the first time, a methodology is presented for integrating information from the visible and infrared spectra into 3D wound models using low-cost, portable commercial devices.
In addition to the importance of visible and geometric indicators for objective wound monitoring (such as areas, volume and different tissue segmentation), thermography could also provide other types of indicators in the non-visible spectrum that could help distinguish normal inflammation from wound infections or predict wound evolution [6,7,8,9,10,12]. Thus, our work expands the range of options for creating multimodal 3D thermal and color models, which currently rely primarily on the use of high-cost, fixed-position thermal cameras. The advantage of our approach is that it does not restrict the extent of the wound to be scanned.
Additionally, our proposed methodology helps to combine temperatures from different thermograms and achieves consistency between different thermal views. This allows us to have consistent 3D models with greater coverage of the thermography than those currently available.
Thermal and geometrical measurements were obtained from the 3D models constructed with our methodology. From the observed cases, the temperature difference between the wound bed and the periwound, and the standard deviation of periwound temperatures, are potential metrics of wound evolution and require further investigation. Moreover, further research is needed on thermal metrics obtained from low-cost portable devices, which have lower thermal accuracy and image resolution than expensive high-end cameras. Previous studies propose different metrics, and most of them use high-end thermal cameras with higher thermal accuracy than low-cost commercial devices [11,13,14]. We therefore plan a longitudinal study to analyze the metrics in the literature and identify those with predictive power for wound evolution that can serve as follow-up metrics in the specific case of low-cost handheld devices.
Finally, the presented process provides an excellent resource for future teledermatology and, in particular, for the prevention, follow-up, and monitoring of chronic wounds in remote areas, offering more timely examinations and diagnoses with lower transport costs for the patient while also decreasing hospital expenses. Our work provides access to multimodal 3D models using low-cost handheld devices through an automated process based on free, easy-to-use software. Basic knowledge of photo acquisition and basic computer skills are sufficient to use the proposed process. Its application in hospitals and health centers would therefore only require basic training in the software and in handheld photo acquisition, which is increasingly familiar thanks to the widespread adoption of smartphones and, more recently, portable thermal cameras.

5. Conclusions

We presented a methodology for creating 3D color and thermal models from multiple thermographic views obtained with low-cost handheld devices. The methodology was applied to the wounds of six patients, and the results show that consistency between the different thermographic views is achieved. Additionally, the methodology increases the spatial and thermal resolution of the surface in the 3D model.
Different metrics combining 3D color information and thermography were obtained based on the models created with the proposed methodology. These metrics could be potentially useful for clinical evaluation; however, they require further study. We look forward to performing larger studies with more participants to help define the relationship between the different metrics and the evolution of the wound.

Author Contributions

Conceptualization, E.G., S.T., and B.C.; methodology, E.G.; resources I.H.; software, E.G.; validation, E.G. and I.H.; formal analysis, E.G.; investigation, E.G. and I.H.; data curation, E.G.; writing—original draft preparation, E.G.; writing—review and editing, E.G., S.T., and B.C.; supervision, S.T. and B.C.; project administration, S.T. and B.C.; funding acquisition, S.T. and B.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research has received funding from the European Union’s Horizon 2020 research and innovation program under the Marie Sklodowska-Curie grant agreement N° 777661 (STANDUP project).

Institutional Review Board Statement

The study was conducted according to the guidelines of the Declaration of Helsinki and approved by the Ethics Committee of Hipolito Unanue National Hospital, Lima, Peru (protocol code: 22494; date of approval: 4 December 2019).

Informed Consent Statement

Informed consent was obtained from all subjects involved in the study.

Data Availability Statement

The data presented in this study are available on request from the corresponding author.

Acknowledgments

The authors thank the Laboratory of Medical Imaging of Lima, Peru, for their support with data acquisition.

Conflicts of Interest

The authors declare no conflict of interest.

References

1. Zvietcovich, F.; Castaneda, B.; Valencia, B.; Llanos-Cuentas, A. A 3D assessment tool for accurate volume measurement for monitoring the evolution of cutaneous Leishmaniasis wounds. In Proceedings of the 2012 Annual International Conference of the IEEE Engineering in Medicine and Biology Society, San Diego, CA, USA, 28 August–1 September 2012; pp. 2025–2028.
2. Casas, L.; Castaneda, B.; Treuillet, S. Imaging technologies applied to chronic wounds: A survey. In Proceedings of the 4th International Symposium on Applied Sciences in Biomedical and Communication Technologies (ISABEL '11), Barcelona, Spain, 26–29 October 2011; pp. 1–5.
3. Jørgensen, L.B.; Halekoh, U.; Jemec, G.B.; Sørensen, J.A.; Yderstræde, K.B. Monitoring Wound Healing of Diabetic Foot Ulcers Using Two-Dimensional and Three-Dimensional Wound Measurement Techniques: A Prospective Cohort Study. Adv. Wound Care 2020, 9, 553–563.
4. Bowling, F.L.; King, L.; Fadavi, H.; Paterson, J.A.; Preece, K.; Daniel, R.W.; Matthews, D.J.; Boulton, A.J.M. An assessment of the accuracy and usability of a novel optical wound measurement system. Diabet. Med. 2009, 26, 93–96.
5. Lucas, Y.; Niri, R.; Treuillet, S.; Douzi, H.; Castaneda, B. Wound Size Imaging: Ready for Smart Assessment and Monitoring. Adv. Wound Care 2021, 10, 641–661.
6. Serbu, D.G. Infrared Imaging of the Diabetic Foot. InfraMation Proc 2009, 86, 5–20.
7. Alametsä, J.; Oikarainen, M.; Perttunen, J.; Viik, J.; Vaalasti, A. Thermal imaging in skin trauma evaluation: Observations by CAT S60 mobile phone. Finn. J. Ehealth eWelfare 2018, 10, 192–199.
8. Fierheller, M.; Sibbald, R.G. A Clinical Investigation into the Relationship between Increased Periwound Skin Temperature and Local Wound Infection in Patients with Chronic Leg Ulcers. Adv. Ski. Wound Care 2010, 23, 369–379.
9. Dini, V.; Salvo, P.; Janowska, A.; Di Francesco, F.; Barbini, A.; Romanelli, M. Correlation Between Wound Temperature Obtained With an Infrared Camera and Clinical Wound Bed Score in Venous Leg Ulcers. Wounds Compend. Clin. Res. Pract. 2015, 27, 274–278.
10. Chanmugam, A.; Langemo, D.; Thomason, K.; Haan, J.; Altenburger, E.A.; Tippett, A.; Henderson, L.; Zortman, T.A. Relative Temperature Maximum in Wound Infection and Inflammation as Compared with a Control Subject Using Long-Wave Infrared Thermography. Adv. Ski. Wound Care 2017, 30, 406–414.
11. Chang, M.C.; Yu, T.; Luo, J.; Duan, K.; Tu, P.; Zhao, Y.; Nagraj, N.; Rajiv, V.; Priebe, M.; Wood, E.A.; et al. Multimodal Sensor System for Pressure Ulcer Wound Assessment and Care. IEEE Trans. Ind. Inform. 2018, 14, 1186–1196.
12. Aliahmad, B.; Tint, A.N.; Poosapadi Arjunan, S.; Rani, P.; Kumar, D.K.; Miller, J.; Zajac, J.D.; Wang, G.; Ekinci, E.I. Is Thermal Imaging a Useful Predictor of the Healing Status of Diabetes-Related Foot Ulcers? A Pilot Study. J. Diabetes Sci. Technol. 2019, 13, 561–567.
13. Barone, S.; Paoli, A.; Razionale, A.V. Assessment of Chronic Wounds by Three-Dimensional Optical Imaging Based on Integrating Geometrical, Chromatic, and Thermal Data. Proc. Inst. Mech. Eng. Part H J. Eng. Med. 2011, 225, 181–193.
14. van Doremalen, R.F.M.; van Netten, J.J.; van Baal, J.G.; Vollenbroek-Hutten, M.M.R.; van der Heijden, F. Infrared 3D Thermography for Inflammation Detection in Diabetic Foot Disease: A Proof of Concept. J. Diabetes Sci. Technol. 2020, 14, 46–54.
15. Watmough, D.J.; Fowler, P.W.; Oliver, R. The thermal scanning of a curved isothermal surface: Implications for clinical thermography. Phys. Med. Biol. 1970, 15, 1–8.
16. Keenan, E.; Gethin, G.; Flynn, L.; Watterson, D.; O’Connor, G.M. Enhanced thermal imaging of wound tissue for better clinical decision making. Physiol. Meas. 2017, 38, 1104–1115.
17. Gutierrez, E.; Castañeda, B.; Treuillet, S. Correction of Temperature Estimated from a Low-Cost Handheld Infrared Camera for Clinical Monitoring. In Advanced Concepts for Intelligent Vision Systems; Blanc-Talon, J., Delmas, P., Philips, W., Popescu, D., Scheunders, P., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12002, pp. 108–116.
18. Moghadam, P. 3D Medical Thermography Device; International Society for Optics and Photonics: Bellingham, WA, USA, 2015; p. 94851J.
19. Xu, B.; Ye, Z.; Wang, F.; Yang, J.; Cao, Y.; Tisse, C.L.; Li, X.; Cao, Y. On-the-fly extrinsic calibration of multimodal sensing system for fast 3D thermographic scanning. Appl. Opt. 2019, 58, 3238.
20. Zenteno, O.; González, E.; Treuillet, S.; Valencia, B.M.; Castaneda, B.; Llanos-Cuentas, A.; Lucas, Y. Volumetric monitoring of cutaneous leishmaniasis ulcers: Can camera be as accurate as laser scanner? Comput. Methods Biomech. Biomed. Eng. Imaging Vis. 2019, 7, 667–675.
21. Shah, R.; Deshpande, A.; Narayanan, P. Multistage SFM: Revisiting Incremental Structure from Motion. In Proceedings of the 2014 2nd International Conference on 3D Vision, Tokyo, Japan, 8–11 December 2014; Volume 1, pp. 417–424.
22. AliceVision. Meshroom: A 3D Reconstruction Software. 2018. Available online: https://alicevision.org (accessed on 29 August 2021).
23. Gutierrez, E.; Castañeda, B.; Treuillet, S.; Lucas, Y. Combined thermal and color 3D model for wound evaluation from handheld devices. In Proceedings of the Medical Imaging 2021: Imaging Informatics for Healthcare, Research, and Applications, San Diego, CA, USA, 15 February 2021; SPIE: Bellingham, WA, USA, 2021; p. 7.
24. Rania, N.; Douzi, H.; Yves, L.; Sylvie, T. Semantic Segmentation of Diabetic Foot Ulcer Images: Dealing with Small Dataset in DL Approaches. In Image and Signal Processing; El Moataz, A., Mammass, D., Mansouri, A., Nouboud, F., Eds.; Lecture Notes in Computer Science; Springer: Cham, Switzerland, 2020; Volume 12119, pp. 162–169.
25. Brown, L.G. A survey of image registration techniques. ACM Comput. Surv. 1992, 24, 325–376.
26. Valero, M.M.; Verstockt, S.; Mata, C.; Jimenez, D.; Queen, L.; Rios, O.; Pastor, E.; Planas, E. Image Similarity Metrics Suitable for Infrared Video Stabilization during Active Wildfire Monitoring: A Comparative Analysis. Remote Sens. 2020, 12, 540.
27. Vidas, S.; Moghadam, P.; Bosse, M. 3D thermal mapping of building interiors using an RGB-D and thermal camera. In Proceedings of the 2013 IEEE International Conference on Robotics and Automation, Karlsruhe, Germany, 6–10 May 2013; pp. 2311–2318.
28. Zhou, Q.Y.; Park, J.; Koltun, V. Open3D: A Modern Library for 3D Data Processing. arXiv 2018, arXiv:1801.09847.
29. Koo, T.K.; Li, M.Y. A Guideline of Selecting and Reporting Intraclass Correlation Coefficients for Reliability Research. J. Chiropr. Med. 2016, 15, 155–163.
Figure 1. Acquisition setup: (a) the FlirOne thermal camera is directly connected to the top of the mobile device; (b) an example of the mobile device and FlirOne camera poses during image acquisition.
Figure 2. The proposed pipeline for creating a multi-view 3D surface thermography starting from single-view 3D surface thermographs.
Figure 3. Weighting functions based on distance and camera angle used to combine information from multiple thermal images. Weights are inversely associated with distance and angle via an inverse logistic function, assigning high weights to measurements obtained from distances smaller than 40 cm and angles below 60°.
Figure 4. Illustration of the wound segmentation in 2D, as well as the definition of the wound and periwound area in the 3D model.
Figure 5. Illustration of the proposed process with two real cases: (left) color 3D models, (center) single-view thermal mapping, and (right) multi-view 3D surface thermography. Extensive coverage of thermal information is shown in the multi-view 3D thermal model compared to the single view.
Figure 6. Illustration of synthetic thermal images created by projecting the reference view model using the baseline (left), the adjusted IR camera pose (center), and the original thermal image (right). The dashed horizontal lines show that the synthetic images are in better alignment with the actual thermal image when the camera pose is adjusted.
Figure 7. Illustration of the 2D thermal images overlaid onto the 3D model viewed from the corresponding camera pose: (top) using the baseline camera pose and (bottom) after the proposed camera pose adjustment. A better alignment of the skin edges is observed with the adjusted thermal camera poses.
Table 1. Results on the intraclass correlation coefficient (ICC) when using baseline and adjusted camera poses, and improvement in the ICC using adjusted poses. The table shows that the consistency between thermal information from multiple thermal views generally improves using adjusted camera poses, and large improvements are found especially when the baseline ICC consistency is 0.65 or lower.
| Patient | Baseline ICC | Adjusted ICC | ICC Improvement (Adj − Base) |
|---------|--------------|--------------|------------------------------|
| P1      | 0.65         | 0.97         | 0.32                         |
| P2      | 0.27         | 0.91         | 0.64                         |
| P3      | 0.87         | 0.94         | 0.07                         |
| P4      | 0.42         | 0.66         | 0.24                         |
| P5      | 0.89         | 0.78         | −0.11                        |
| P6      | 0.93         | 0.94         | 0.01                         |
Table 2. Results of the clinical visual assessment of wounds in the lower extremities of the 6 patients for which we created the 3D color and thermal model with the proposed methodology.
| Patient | Wound (Visual Aspect) | Periwound (Visual Aspect) | Observations |
|---------|------------------------|---------------------------|--------------|
| P1 | Clean, little fibrinoid, large granular tissue. | Moist with maceration. | Probable adjacent epithelial necrosis. |
| P2 | Clean. | Normal appearance. | |
| P3 | Clean, adherent fibrinoid. | Slight inflammation. | |
| P4 | Dry fibrinoid. | Dry. | Suspected bone contact. |
| P5 | Clean, granular tissue. | Normal appearance. | Wound after amputation. |
| P6 | Clean. | Normal appearance. | |
Table 3. Geometrical and thermal indicators obtained with the multimodal and multiview 3D surface thermography. Different patterns are observed especially in the temperature difference between the wound bed (WB) and the periwound (PW) and in the standard deviation of the periwound temperature.
| Patient | Wound Area (cm²) | Median Wound Bed Temp. (°C) | Median Periwound Temp. (°C) | Diff. (PW − WB) (°C) | Std Dev Wound Bed Temp. (°C) | Std Dev Periwound Temp. (°C) |
|---------|------------------|-----------------------------|-----------------------------|----------------------|------------------------------|------------------------------|
| P1 | 9.29  | 31.03 | 31.95 | 0.92  | 0.59 | 1.22 |
| P2 | 4.49  | 33.31 | 34.61 | 1.30  | 0.45 | 0.90 |
| P3 | 0.80  | 32.98 | 34.10 | 1.12  | 0.42 | 0.43 |
| P4 | 1.49  | 28.29 | 27.65 | −0.64 | 0.46 | 1.00 |
| P5 | 25.43 | 28.92 | 29.15 | 0.23  | 0.71 | 1.35 |
| P6 | 2.47  | 36.06 | 36.89 | 0.83  | 0.43 | 0.44 |
Publisher’s Note: MDPI stays neutral with regard to jurisdictional claims in published maps and institutional affiliations.
