Article

Prediction of Contaminated Areas Using Ultraviolet Fluorescence Markers for Medical Simulation: A Mobile Phone Application Approach

1 Department of Emergency Medicine, National Cheng Kung University Hospital, College of Medicine, National Cheng Kung University, Tainan 70101, Taiwan
2 Department of Mold and Die Engineering, National Kaohsiung University of Science and Technology, Kaohsiung 80782, Taiwan
* Authors to whom correspondence should be addressed.
These authors contributed equally to this work.
Bioengineering 2023, 10(5), 530; https://doi.org/10.3390/bioengineering10050530
Submission received: 9 March 2023 / Revised: 14 April 2023 / Accepted: 23 April 2023 / Published: 26 April 2023
(This article belongs to the Special Issue Computer Vision and Machine Learning in Medical Applications)

Abstract
The use of ultraviolet fluorescence markers in medical simulations has become popular in recent years, especially during the COVID-19 pandemic. Healthcare workers use ultraviolet fluorescence markers to replace pathogens or secretions, and then calculate the regions of contamination. Health providers can use bioimage processing software to calculate the area and quantity of fluorescent dyes. However, traditional image processing software has its limitations and lacks real-time capabilities, making it more suitable for laboratory use than for clinical settings. In this study, mobile phones were used to measure areas contaminated during medical treatment. During the research process, a mobile phone camera was used to photograph the contaminated regions at an orthogonal angle. The fluorescence marker-contaminated area and photographed image area were proportionally related. The areas of contaminated regions can be calculated using this relationship. We used Android Studio software to write a mobile application to convert photos and recreate the true contaminated area. In this application, color photographs are converted into grayscale, and then into black and white binary photographs using binarization. After this process, the fluorescence-contaminated area is calculated easily. The results of our study showed that within a limited distance (50–100 cm) and with controlled ambient light, the error in the calculated contamination area was 6%. This study provides a low-cost, easy, and ready-to-use tool for healthcare workers to estimate the area of fluorescent dye regions during medical simulations. This tool can promote medical education and training on infectious disease preparation.

Graphical Abstract

1. Introduction

Ultraviolet fluorescence markers have been used in the medical field for many years, including in training simulations, infection control [1,2], dermal contamination in occupational hygiene [3], and fluorescence staining of microbial cells [4]. During the COVID-19 pandemic, increasing numbers of healthcare workers used ultraviolet fluorescence markers to replace pathogens or secretions and detect regions of contamination [5]. The more extensive the contaminated regions, the more severe the environmental contamination. Several bioimage processing software programs can be used to calculate the area and quantity of fluorescent dyes, including ImageJ, FIJI [6], CellProfiler [7,8], and Icy [9,10]. Currently, ImageJ is the most commonly used software for measuring fluorescence marker areas in the medical field [11,12]. ImageJ is a publicly available, Java-based image processing program developed by the National Institutes of Health [13,14]. It is frequently used to analyze medical problems such as contaminated areas [15], sperm density [16], fluorescent cell stains [4], and corneal neovascularization [17]. However, the usual workflow is to first take a photo with a mobile phone or camera and then analyze it with ImageJ on a computer; based on the available information, it is currently difficult to run ImageJ on a smartphone [18,19]. Therefore, in this study, we aim to establish a direct, in situ method for estimating the area of dye-contaminated regions from photos taken with a smartphone.
There are two approaches to obtaining the size of a contamination region using a camera. The first uses a camera with fixed focal-length lenses to orthogonally capture a photograph at a fixed distance. However, this approach requires an understanding of the conversion rules (for translation and rotation) between the coordinate system of the camera and the actual coordinate system, which is often referred to as a homography matrix. Homography has been applied to image correction, image stitching, camera pose estimation, and vision construction in the field of computer vision [20,21,22,23]. Although homography is a mature technology, it is relatively complicated [24,25]. Instead of using homography, this study proposes a method that compares an orthogonal photo of a contaminated area with a photo of an area of known size and calculates the size of the contaminated region from their area ratio.

2. Materials and Methods

2.1. Mathematical Theory of Contamination Area in Photographs

Color photographs are generally used when calculating a contaminated area. If a black-and-white (grayscale) photo is used to determine the size of a region, it is necessary to perform additional binarization because the computer is not capable of calculating the exact area using the given photo. Binarization converts a grayscale image into a binary image by setting the grayscale value of a certain pixel as the maximum grayscale value when the original grayscale value exceeds a certain threshold. The grayscale value of a certain pixel is set as the minimum grayscale value when the original grayscale value is below this threshold. Binarization typically converts an image such that only black and white are present after conversion. Therefore, it is necessary to control the threshold grayscale value. Depending on the method used to select the value of this threshold, binarization algorithms can be categorized into those that use a fixed threshold and those that use a self-adaptive threshold. Commonly used binarization algorithms include the bimodal, P parameter, iteration, and Otsu methods [26,27]. The following describes the research approach used in this study. Figure 1 shows the setup used to photograph the contaminated region.
The coordinates (Xc, Yc, Zc) represent the center of the 3D image, and the axes (X, Y) represent the 2D plane of the image. The coordinates (Xw, Yw, Zw) represent the 3D coordinates of the plane of the object to be measured, and the goal is to measure the area of a contaminated region on this plane. To calibrate the calculation, an additional region of known area is placed on the same plane. When the camera lens is orthogonal to the area to be measured, the image area and the area to be measured in the image plane satisfy the following relationship:
Image area / Area to be measured = f / (f + d)   (1)
where f is the focal length of the image and d is the planar distance from the lens to the area to be measured, and is determined by a laser when taking the photo. Because d >> f, the equation above can be rewritten as follows:
Image area / Area to be measured ≈ f / d   (2)
and rearranged as follows:
Area to be measured = (d / f) × Image area   (3)
Subsequently, binarization is performed using a computer to calculate the size of the image area in the image plane. This is not the exact area but rather the percentage covered in the image plane. The above equation shows that the image area is proportional to the area to be measured, and the unknown 1/f can be replaced by a constant λ to produce the following equation:
Area to be measured = d × λ × Image area percentage   (4)
If the area to be measured and the distance to the photographed contaminated region d are known, the value of λ can be calculated using the above equation. For the same camera, the value of λ will slightly change with the distance d of the photograph. Hence, if d is known when capturing the photograph, then once the image area percentage is calculated, the area of the contaminated region can be calculated.
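As a sketch of this calibrate-then-measure procedure, the relation can be expressed in a few lines of code (function and variable names are ours, not from the paper): λ is first solved from a reference object of known area photographed at a known distance, and the same relation is then used to estimate an unknown contaminated area.

```python
def calibrate_lambda(known_area_mm2: float, d_cm: float, area_pct: float) -> float:
    """Solve the area relation for lambda using a reference object of known
    area: lambda = area / (d * image-area percentage)."""
    return known_area_mm2 / (d_cm * area_pct)


def estimate_area(lam: float, d_cm: float, area_pct: float) -> float:
    """Area to be measured = d * lambda * image-area percentage."""
    return lam * d_cm * area_pct
```

With the coin example from Section 2.3.1 (a ~380 mm² coin at 33.4 cm giving a white/total ratio of 0.002605), `calibrate_lambda` returns a value on the order of 4.4 × 10³, matching the magnitude of λ reported there.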

2.2. Image Processing Theory and Method

There are three important steps in image processing: grayscale processing, image binarization, and calculation of the binarized area.

2.2.1. Grayscale Processing

Software packages such as MATLAB, Python, and Java are generally used for image processing [22]. Because an image is stored as a matrix, these packages can convert it to grayscale using their respective program instructions to obtain the desired image.
Grayscale images show different shades of gray (varying brightness levels), which can be represented by pixel values between 0 and 255, where 0 represents fully black and 255 represents fully white; that is, the closer the pixel value is to 0, the darker the pixel, and vice versa. Figure 2a shows the contaminated region to be measured, and Figure 2b shows the grayscale result after processing using the software. The grayscale result in Figure 2b is a continuous spectrum, which makes it impossible to determine the area of the contaminated region directly. Further binarization of the image is necessary to calculate the area of the contaminated region: binarization converts a grayscale image into a binary image with only black (0) and white (255) pixels, allowing the black-to-white area ratio to be calculated from the pixel counts.
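For illustration, the grayscale step can be sketched with NumPy (a stand-in for the application's OpenCV conversion; the ITU-R BT.601 luma weights below are the common default and an assumption on our part):

```python
import numpy as np

def to_grayscale(rgb: np.ndarray) -> np.ndarray:
    """Convert an H x W x 3 RGB image (values 0-255) to an 8-bit grayscale
    image using the ITU-R BT.601 luma weights."""
    weights = np.array([0.299, 0.587, 0.114])
    gray = rgb.astype(np.float64) @ weights
    return np.clip(gray.round(), 0, 255).astype(np.uint8)
```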

2.2.2. Image Binarization

The maximum interclass variance method, proposed by the Japanese scholar Otsu in 1979, is a self-adaptive approach to determine the threshold value. This is also known as the Otsu method [27]. This method divides the image into two parts—background and target—according to its grayscale characteristics. A larger interclass variance between the background and the target indicates a greater difference between the two parts of the image. Therefore, if part of the target is misclassified as the background, or vice versa, the interclass variance decreases. Thus, maximizing the interclass variance can minimize the likelihood of misclassification.
The Otsu method is mainly based on the following principle: the pixels of an image, at individual coordinates (x, y), can be classified into foreground (i.e., target) and background using a threshold K. The ratio of the number of foreground pixels to the total number of pixels is denoted by ω0, with an average grayscale of μ0; the ratio of the number of background pixels to the total is denoted by ω1, with an average grayscale of μ1. The average grayscale of the entire original image is denoted by μ, and the interclass variance is g = ω0 ω1 (μ0 − μ1)². The value of K is changed iteratively to determine the maximum g, and the corresponding K is the desired threshold.
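The iterative search described above can be sketched directly from these definitions (a plain NumPy illustration; production code would typically use OpenCV's built-in Otsu option instead):

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the threshold K that maximizes the interclass variance
    g = w0 * w1 * (mu0 - mu1)^2 over all candidate thresholds."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(np.float64)
    p = hist / hist.sum()            # fraction of pixels at each grey level
    levels = np.arange(256)
    best_k, best_g = 0, -1.0
    for k in range(1, 256):          # pixels < k form one class, >= k the other
        w0, w1 = p[:k].sum(), p[k:].sum()
        if w0 == 0.0 or w1 == 0.0:
            continue                 # empty class: variance undefined
        mu0 = (levels[:k] * p[:k]).sum() / w0
        mu1 = (levels[k:] * p[k:]).sum() / w1
        g = w0 * w1 * (mu0 - mu1) ** 2
        if g > best_g:
            best_k, best_g = k, g
    return best_k
```

On a bimodal image with one dark and one bright population, the returned threshold falls between the two modes, as expected.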
In this study, the Otsu method was used to obtain binarized images. The Otsu method has the advantage of allowing quick and effective determination of the optimum threshold value and results in the maximum interclass variance. However, if the grayscale range of the target to be measured is too large, a portion of the target will be missing after processing. In this study, the contaminated region to be measured and the background color in the image were monotonic, which did not affect the binarization calculation results.

2.2.3. Calculation of Binarized Area

After binarizing an image, only black and white pixels are shown. At this time, it is straightforward to calculate the ratio between the two based on the pixel coordinates using software packages such as Python, Java, or MATLAB, and then calculate the area.
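As a minimal sketch (names ours), the ratio step amounts to counting white pixels:

```python
import numpy as np

def white_fraction(binary: np.ndarray) -> float:
    """Fraction of white (255) pixels in a binarized image; this is the
    'image area percentage' that enters the area formula."""
    return float((binary == 255).mean())
```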
Our mobile application was developed using Android Studio (android-studio-2021.2.1.14-windows, Singapore), with the OpenCV library imported into the project. An Android-based mobile phone (ASUS ZenFone3 Zoom ZE553KL Z01HDA, 2017; ASUSTeK Computer Inc., Taipei, Taiwan) was used for processing. The photography distance was obtained using a built-in phone application (Laser Ruler, edition 1.0.67.0-170922, Singapore), which can measure distances between 0 and 150 cm.

2.3. Estimation of Pollution Area

2.3.1. Preliminary Analysis Results

Next, we use a series of illustrations (Figure 3) to demonstrate how the known area of a 10-dollar nickel coin is used to calculate λ, which is then used to calculate the area of the fluorescent dye-contaminated region.
Figure 3a shows the contaminated regions to be measured. Figure 3b shows the grayscale image obtained after software processing, and Figure 3c shows the binarized result, from which the white/total ratio was determined to be 0.008918. Figure 3d shows the binarized result for an image of a 10-dollar nickel coin (with a known area) captured at the same distance, from which the white/total ratio was determined to be 0.002605. Because the radius of the coin is 11 mm, its area is 11² × π ≈ 380 mm², and the value of d is 33.4 cm. Thus, λ can be calculated as follows:
λ = 380 / (33.4 × 0.002605) ≈ 4376   (5)
The area of the contaminated region in Figure 3 can then be calculated using λ , as shown below:
Area of contaminated region = 4376 × 33.4 × 0.008918 = 1303.44 mm²   (6)
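Recomputing this worked example from the stated inputs is a useful sanity check. Note that an exact recomputation gives λ ≈ 4369 rather than the 4376 quoted above; the small gap is presumably due to rounding of intermediate values in the published text.

```python
import math

# Inputs stated in the text: coin radius 11 mm, d = 33.4 cm,
# white/total ratios 0.002605 (coin) and 0.008918 (contaminated region).
coin_area = math.pi * 11.0 ** 2        # ~380.13 mm^2
lam = coin_area / (33.4 * 0.002605)    # ~4369 (text quotes ~4376)
contaminated = lam * 33.4 * 0.008918   # ~1301 mm^2 (text quotes 1303.44)
```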

2.3.2. Contaminated Area Based on Least Squares Regression with Image Linearity

The characteristic constant λ varies at different capture distances. To provide greater flexibility in the distance captured in the photographs, we utilized the least squares regression interpolation method to obtain optimal results. An on-site photography simulation is provided as an example.
Figure 4 shows the geometry of the setup used to photograph a contaminated region. The measured area in a photograph can be calculated using Equation (4): as long as the image area ratio is known, the area of the contaminated region can be calculated from the photography distance d and the camera area parameter λ. Equation (4) also shows that the area to be measured depends on the distance and, as previously mentioned, λ varies slightly with d. To accurately calculate the contaminated area from a photo taken at any distance, the linearity of the relationship was exploited using least-squares regression and interpolation. The area of the contaminated region ranged from 100 to 1600 mm², and the measurement distance ranged from 50 to 100 cm. Figure 4 shows a digital photograph of a piece of red paper (4 cm × 4 cm = 1600 mm²) on a gray wall, which was taken as the contaminated region to be measured and captured at a distance of 50 cm. In the same manner, 21 digital photographs of contaminated regions with sizes of 100, 200, 400, 450, 800, 900, and 1600 mm² were captured at distances of 50, 75, and 100 cm to simulate on-site photography distances.
Photographs of contaminated regions of different sizes were taken at different distances, and Table 1 lists the areas (in pixels) of the contaminated regions calculated by a computer using their binarized images.

2.3.3. Pixel Linear Interpolation Based on Least Squares Regression

A first-order linear function is given by
f(x) = a0 + a1x   (7)
where a0 and a1 are the coefficients to be determined.
The data in Table 1 were calculated using the least-squares regression to obtain the function f(x) and coefficient of determination r 2 as follows:
For 50 cm:
f(x) = 18.7376 + 0.0285x,   (8)
where r² = 0.9951, indicating 99.51% agreement.
For 75 cm:
f(x) = 18.6109 + 0.0725x,   (9)
where r² = 0.9981, indicating 99.81% agreement.
For 100 cm:
f(x) = 18.1121 + 0.136x,   (10)
where r² = 0.9968, indicating 99.68% agreement.
The results are presented in Figure 5. For each set area and distance, the fitted values agree closely with the measurements, indicating a linear relationship. To obtain the area of a contaminated region, the appropriate fitted functions are first selected based on the distance d, and the result is then obtained by interpolation. For example, if d = 50.5 cm, which lies between 50 and 75 cm, the area can be calculated using interpolation as follows:
Area (pixels) of contaminated region = f_50(x) + ((50.5 − 50)/(75 − 50)) × (f_75(x) − f_50(x)),   (11)
where f_50 and f_75 denote the fitted functions at 50 and 75 cm, respectively.
Equation (11) can be used to calculate the area (pixels) of the contaminated region. Its area ratio to that of the entire image can then be calculated using the total number of pixels in the image. Regarding the methodology of the entire study, we have compiled a flowchart in Figure 6.
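Assuming the three fitted lines above are used as-is, the distance interpolation can be sketched as follows (helper names are ours):

```python
def fitted_line(d_cm: int):
    """Regression lines reported in the text for the calibrated distances."""
    coeffs = {50: (18.7376, 0.0285), 75: (18.6109, 0.0725), 100: (18.1121, 0.1360)}
    a0, a1 = coeffs[d_cm]
    return lambda x: a0 + a1 * x

def interpolated_pixels(d_cm: float, x: float) -> float:
    """Linearly interpolate f_d(x) between the two calibrated distances
    (50, 75, 100 cm) that bracket the actual photography distance d."""
    lo, hi = (50, 75) if d_cm <= 75 else (75, 100)
    f_lo, f_hi = fitted_line(lo), fitted_line(hi)
    t = (d_cm - lo) / (hi - lo)
    return f_lo(x) + t * (f_hi(x) - f_lo(x))
```

At a calibrated distance the interpolation reduces to the corresponding fitted line; in between, it blends the two bracketing lines in proportion to the distance offset.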

3. Results

3.1. Binarization of Photo to Calculate Target Area

In Figure 7, we use two red rectangular shapes affixed to a white wall to provide an example of utilizing our application to perform binarization on a photograph captured by a smartphone placed 50 cm from the wall surface. We further illustrate how the application calculates the total area of the two rectangles.
In this case study, the red rectangles in Figure 7a had a total area of (40 × 20) + (20 × 10) mm² = 1000 mm², and the distance was set to 50 cm. The steps were as follows:
  • The LOAD button was pressed to input the sample image and the GRAY button was pressed to convert the image into grayscale, as shown in Figure 7b.
  • The OTSU button was pressed to obtain the initial threshold value based on the OTSU method. The threshold given was 139, which is evidently too high for effective processing (see Figure 7c).
  • A threshold lower than 139 (e.g., 110) was input and the THRESHOLD button was pressed; the resulting image still showed the shade in the lower half.
  • The threshold value was adjusted further until the shade in the lower half of the resulting image disappeared; eventually, a threshold value of 80 was reached. The THRESHOLD button was pressed to obtain Figure 7d. The target area obtained was 1002.214 mm².
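The GRAY/OTSU/THRESHOLD button sequence above can be mimicked in a few lines (a NumPy stand-in for the app's OpenCV calls, with grey values invented for illustration). Since the red targets are darker than the white wall in grayscale, we interpret the THRESHOLD step as marking pixels below the threshold as target; lowering the threshold from the Otsu value toward 80 then removes the unwanted shade, as described in the steps:

```python
import numpy as np

def binarize_dark_target(gray: np.ndarray, k: int) -> np.ndarray:
    """THRESHOLD step for a dark target on a bright wall: grey values below
    k are marked as target (255), everything else as background (0)."""
    return np.where(gray < k, 255, 0).astype(np.uint8)

# Hypothetical scene (grey values ours): red rectangles ~76 on a white wall
# (~250) whose lower half carries a shaded region (~110).
scene = np.full((10, 10), 250, dtype=np.uint8)
scene[7:, :] = 110             # shade in the lower half of the image
scene[2:4, 2:8] = 76           # the taped rectangles
too_high = binarize_dark_target(scene, 139)  # shade wrongly kept as target
good = binarize_dark_target(scene, 80)       # only the rectangles remain
```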

3.2. Error Analysis

We also analyzed the errors that arose in the study. These errors mainly relate to two factors: the photography distance and the ambient illumination. Considering practicability, this study used a photography distance of 50 cm under indoor lighting. However, when a handheld mobile phone is used, the photography distance can be imprecise and the lighting conditions can vary. This section discusses the magnitude of the errors caused by instability in the photography distance or ambient illumination.
Figure 7a shows a photograph of red rectangles captured at a standard distance of 50.0 cm using a handheld mobile phone. Photographs were captured at night under indoor lighting conditions. The red rectangles had a total area of (4 × 2) + (2 × 1) cm² = 10 cm² = 1000 mm² and were attached to a white wall to simulate a contaminated region. Because the white balance was not adjusted for indoor photography, the photograph shows a severe yellow cast. This scenario was chosen deliberately because photographs taken under poor lighting conditions lead to larger analysis errors.
Figure 8 shows the result for an actual distance of d = 50.5 cm, with the binarization threshold set to 80. After grayscale conversion and binarization, the area was calculated to be 1055.03 mm², corresponding to a numerical error of approximately 5%.
Table 2 lists the pixel values obtained for different areas and distances. The photography distance was measured by the laser-based distance-measuring software on the mobile phone. Taking 50.5 cm as the median value (i.e., the photo was taken at a distance of 50.5 cm), the software was operated at distances between 49.5 and 51.5 cm, yielding an error of approximately 6% for a displacement of ±1 cm. For a maximum likely displacement of ±0.5 cm during handheld photography, the error should be within 3%.
When the lighting conditions were poor, an error of approximately 6% occurred in the measurement of the contaminated area owing to poor imaging; this error can be reduced if the lighting conditions are improved. Furthermore, capturing a photo and measuring the distance using a handheld mobile phone can lead to an error of approximately 3%, which can be eliminated by fixing the mobile phone in place using a tripod.

3.3. Application in a Medical Simulation

We also applied the application to estimate contamination during a simulated intubation scenario. A team consisting of one emergency physician and two nurses was used in this simulation. Before the simulation began, fluorescence markers (Glo Germ, Moab, UT, USA) were applied to the mouth, tongue, trachea, chin, and lips of the manikin used for intubation. Subsequently, an ultraviolet tracer was used to scan the environment in detail to determine the level of environmental contamination. The simulation was conducted in a simulated emergency room. The results are shown in Figure 9 and Table 3.

4. Discussion

Fluorescence-based simulations are commonly used by healthcare workers [28], who use fluorescence markers as quality indicators in their environment cleaning protocols [29]. Procedures for cleaning high-touch surfaces serve as an important step in controlling the transmission of multidrug-resistant pathogens in hospital environments. Two common methods are used to evaluate hospital cleaning protocols: fluorescence markers and environmental pathogen cultures. Fluorescence markers are considered a simple and cost-effective method for assessing environment cleaning practices compared with the environmental pathogen culture method [30]. However, it is difficult to quantify the fluorescent area unless the contaminated regions are calculated. In this study, we provide a straightforward method for quantifying contamination. Nevertheless, this method has some limitations.
Image errors that may occur during photography on a mobile phone are primarily due to three factors. First, holding a mobile phone in one’s hand rather than keeping it in a fixed position when taking a photo can result in displacement errors. Second, both the illumination of the ambient light source and color of the contaminated region to be measured affect the subsequent grayscale conversion and binarization of the image. Third, image errors are caused by reflection from the metal surface of medical equipment. In our case, the image editing software can cover the dark background color of metal images to eliminate this error. These three factors led to errors in the analysis. In particular, although the distance is measured using laser-based software installed on the phone, which is held with a hand when capturing the photo, the two operations do not occur simultaneously. This leads to a small difference in the location of the mobile phone and creates further analysis errors.
The currently known bioimage software includes ImageJ/FIJI (Bethesda, MD, USA), Icy (Institut Pasteur, Paris, France), and CellProfiler (Cambridge, MA, USA). ImageJ/FIJI is a user-friendly, open-source software with a large community and pre-installed plugins for biological image analysis [13,14]. Icy is a 3D imaging and visualization-focused software with a large plugin library and support for automation [9]. CellProfiler is a high-throughput analysis software designed for cell-based assays with a good community and automation capabilities [8]. Nevertheless, these software applications exhibit inadequacies, such as the requirement for substantial computational resources, potential errors in image analysis, challenges in standardizing analysis methods across multiple laboratories, and restrictions in functionality or compatibility with specific types of images or data formats.
Our application and these current bioimage software packages differ in that current software requires exporting the images and conducting analysis on a computer, which lacks real-time on-site use and is more suitable for laboratory research. However, our application is convenient for high-fidelity medical training, in which quantitative feedback on contaminated areas must be provided to trainees in a timely manner. Compared to current bioimage software, this application has limitations, as it cannot detect the intensity of fluorescent dye regions. Nevertheless, an application installed on a smartphone is convenient for clinical workers or even in prehospital settings.
In the future, our application is expected to offer real-time feedback suitable for medical education or EMS training, particularly in the realm of simulation medicine. Additionally, it has the potential for use in pre-hospital settings, including EMS and disaster response scenarios. The app’s quantitative testing capabilities, which provide instant feedback on a mobile device, hold promise for assisting frontline personnel in effectively addressing problems arising from industrial pollution, toxicological disasters, and nuclear disasters.

5. Conclusions

In healthcare settings, fluorescence markers are widely used by healthcare workers as a simple and cost-effective method to assess the quality of environment cleaning protocols. This study proposes a straightforward method for quantifying contamination using smartphones. However, image errors during photography with a mobile phone, caused by factors such as displacement, ambient illumination, and reflection from metal surfaces, can lead to analysis errors. Although the distance is measured using laser-based software installed on the phone, this measurement does not occur simultaneously with capturing the photo, which can further contribute to analysis errors.
The proposed method offers a convenient and timely solution for high-fidelity medical training, in which feedback on the contaminated area needs to be provided to trainees. Nevertheless, the limitations of the proposed method include its inability to detect the intensity of fluorescent regions. Despite these limitations, the use of an application installed on a smartphone provides significant convenience for clinical workers.
In conclusion, we provide a simple and cost-effective method using smartphones to quantify contamination in healthcare settings, which can offer convenient and real-time feedback for frontline medical training.

Author Contributions

P.-W.C. conceived the study and developed the study protocols. P.-W.C. and W.-Y.C. provided technical and administrative support. P.-W.C., C.-T.H. and S.-P.H. contributed to the acquisition of data. P.-W.C., C.-T.H. and W.-Y.C. performed data analysis. P.-W.C., W.-Y.C. and C.-H.L. interpreted the study results. P.-W.C., W.-Y.C. and C.-H.L. provided critical comments. P.-W.C., C.-T.H. and C.-H.L. drafted the manuscript. All authors contributed substantially to its revision. P.-W.C. and C.-T.H. contributed equally as first authors. W.-Y.C. and P.-W.C. contributed equally as corresponding authors and take responsibility for the paper as a whole. All authors have read and agreed to the published version of the manuscript.

Funding

This study was supported and funded by the Taiwan Ministry of Science and Technology (MOST 111-2321-B-006-009-) and National Cheng Kung University Hospital, Tainan, Taiwan (NCKUH-11103042 and NCKUH-11203011). The funding sponsor was not involved in the study design; collection, analysis, and interpretation of data; writing of the report; or decision to submit the article for publication.

Institutional Review Board Statement

Not applicable. This study did not involve any human participants, materials, or data.

Informed Consent Statement

Not applicable. This study did not involve any human participants, materials, or data.

Data Availability Statement

The data presented in this study are available on request from the corresponding author. The data are not publicly available due to confidentiality agreements with participants and institutional policies.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Porteous, G.H.; Bean, H.A.; Woodward, C.M.; Beecher, R.P.; Bernstein, J.R.; Wilkerson, S.; Porteous, I.; Hsiung, R.L. A simulation study to evaluate improvements in anesthesia work environment contamination after implementation of an infection prevention bundle. Anesth. Analg. 2018, 127, 662–670.
  2. Andonian, J.; Kazi, S.; Therkorn, J.; Benishek, L.; Billman, C.; Schiffhauer, M.; Nowakowski, E.; Osei, P.; Gurses, A.P.; Hsu, Y.J.; et al. Effect of an intervention package and teamwork training to prevent healthcare personnel self-contamination during personal protective equipment doffing. Clin. Infect. Dis. 2019, 69 (Suppl. S3), S248–S255.
  3. Roff, M.W. Accuracy and reproducibility of calibrations on the skin using the FIVES fluorescence monitor. Ann. Occup. Hyg. 1997, 41, 313–324.
  4. Veal, D.A.; Deere, D.; Ferrari, B.; Piper, J.; Attfield, P.V. Fluorescence staining and flow cytometry for monitoring microbial cells. J. Immunol. Methods 2000, 243, 191–210.
  5. Canelli, R.; Connor, C.W.; Gonzalez, M.; Nozari, A.; Ortega, R. Barrier enclosure during endotracheal intubation. N. Engl. J. Med. 2020, 382, 1957–1958.
  6. Thomas, L.S.V.; Schaefer, F.; Gehrig, J. Fiji plugins for qualitative image annotations: Routine analysis and application to image classification. F1000Research 2020, 9, 1248.
  7. Bray, M.A.; Carpenter, A.E. CellProfiler Tracer: Exploring and validating high-throughput, time-lapse microscopy image data. BMC Bioinform. 2015, 16, 368.
  8. Meijering, E.; Dzyubachyk, O.; Smal, I. Methods for cell and particle tracking. Methods Enzymol. 2012, 504, 183–200.
  9. de Chaumont, F.; Dallongeville, S.; Chenouard, N.; Hervé, N.; Pop, S.; Provoost, T.; Meas-Yedid, V.; Pankajakshan, P.; Lecomte, T.; Le Montagner, Y.; et al. Icy: An open bioimage informatics platform for extended reproducible research. Nat. Methods 2012, 9, 690–696.
  10. Available online: https://icy.bioimageanalysis.org/ (accessed on 21 October 2022).
  11. Hartig, S.M. Basic image analysis and manipulation in ImageJ. Curr. Protoc. Mol. Biol. 2013, 102, 14.15.1–14.15.12.
  12. Schroeder, A.B.; Dobson, E.T.A.; Rueden, C.T.; Tomancak, P.; Jug, F.; Eliceiri, K.W. The ImageJ ecosystem: Open-source software for image visualization, processing, and analysis. Protein Sci. 2021, 30, 234–249.
  13. Schneider, C.A.; Rasband, W.S.; Eliceiri, K.W. NIH Image to ImageJ: 25 years of image analysis. Nat. Methods 2012, 9, 671–675.
  14. Rueden, C.T.; Schindelin, J.; Hiner, M.C.; DeZonia, B.E.; Walter, A.E.; Arena, E.T.; Eliceiri, K.W. ImageJ2: ImageJ for the next generation of scientific image data. BMC Bioinform. 2017, 18, 529.
  15. Weng, C.H.; Chiu, P.W.; Kao, C.L.; Lin, Y.Y.; Lin, C.H. Combating COVID-19 during airway management: Validation of a protection tent for containing aerosols and droplets. Appl. Sci. 2021, 11, 7245.
  16. Polo Freitag, G.; Freitag de Lima, L.G.; Ernandes Kozicki, L.; Simioni Felicio, L.C.; Romualdo Weiss, R. Use of a smartphone camera attached to a light microscope to determine equine sperm concentration in ImageJ software. Arch. Vet. Sci. 2020, 25, 33–45.
  17. Rabiolo, A.; Bignami, F.; Rama, P.; Ferrari, G. VesselJ: A new tool for semiautomatic measurement of corneal neovascularization. Investig. Ophthalmol. Vis. Sci. 2015, 56, 8199–8206.
  18. Available online: https://imagej.net/imagej-wiki-static/Android (accessed on 21 October 2022).
  19. Cardona, A.; Tomancak, P. Current challenges in open-source bioimage informatics. Nat. Methods 2012, 9, 661–665.
  20. Tsai, R.Y. A versatile camera calibration technique for high-accuracy 3D machine vision metrology using off-the-shelf TV cameras and lenses. IEEE J. Robot. Autom. 1987, 3, 323–344.
  21. Zhang, Z. A flexible new technique for camera calibration. IEEE Trans. Pattern Anal. Mach. Intell. 2000, 22, 1330–1334.
  22. Cui, C.; Ngan, K.N. Plane-based external camera calibration with accuracy measured by relative deflection angle. Signal Process. Image Commun. 2010, 25, 224–234.
  23. Kwak, K.; Huber, D.F.; Badino, H.; Kanade, T. Extrinsic calibration of a single line scanning lidar and a camera. In Proceedings of the 2011 IEEE/RSJ International Conference on Intelligent Robots and Systems, San Francisco, CA, USA, 25–30 September 2011; pp. 3283–3289.
  24. Buades, A.; Coll, B.; Morel, J.M. A review of image denoising algorithms, with a new one. Multiscale Model. Simul. 2005, 4, 490–530.
  25. Gonzalez, R.C.; Woods, R.E. Digital Image Processing, 4th ed.; Pearson: London, UK, 2018.
  26. Barney Smith, E.H.; Likforman-Sulem, L.; Darbon, J. Effect of pre-processing on binarization. SPIE Proc. 2010, 7534, 154–161.
  27. Otsu, N. A threshold selection method from gray-level histograms. IEEE Trans. Syst. Man Cybern. 1979, 9, 62–66.
  28. Hall, S.; Poller, B.; Bailey, C.; Gregory, S.; Clark, R.; Roberts, P.; Tunbridge, A.; Poran, V.; Evans, C.; Crook, B. Use of ultraviolet-fluorescence-based simulation in evaluation of personal protective equipment worn for first assessment and care of a patient with suspected high-consequence infectious disease. J. Hosp. Infect. 2018, 99, 218–228.
  29. Blue, J.; O’Neill, C.; Speziale, P.; Revill, J.; Ramage, L.; Ballantyne, L. Use of a fluorescent chemical as a quality indicator for a hospital cleaning program. Can. J. Infect. Control 2008, 23, 216–219. [Google Scholar]
  30. Dewangan, A.; Gaikwad, U. Comparative evaluation of a novel fluorescent marker and environmental surface cultures to assess the efficacy of environmental cleaning practices at a tertiary care hospital. J. Hosp. Infect. 2020, 104, 261–268. [Google Scholar] [CrossRef]
Figure 1. Geometry of the contaminated region during photography.
Figure 2. Images of the contaminated area to be measured: (a) image before processing, (b) image after software-based grayscale processing.
Figure 3. Using the known area of a coin to calculate λ and the fluorescent dye-contaminated region: (a) contaminated region to be measured, (b) software-processed grayscale image, (c) binarization result based on the Otsu method with a threshold of 87.0, (d) binarized result for a coin with a threshold of 142.0.
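The coin-based calibration in Figure 3 can be sketched in a few lines: Otsu's method selects the binarization threshold from the grayscale histogram, and the coin's known physical area gives the pixel-to-area scale factor λ, which converts the dye's pixel count to mm². The following Python sketch is illustrative only, not the authors' Android implementation; the function names and values are our own.

```python
import numpy as np

def otsu_threshold(gray: np.ndarray) -> int:
    """Return the Otsu threshold (0-255) that maximizes between-class variance."""
    hist = np.bincount(gray.ravel(), minlength=256).astype(float)
    prob = hist / hist.sum()
    omega = np.cumsum(prob)                    # cumulative class probability
    mu = np.cumsum(prob * np.arange(256))      # cumulative class mean
    mu_t = mu[-1]                              # global mean intensity
    # Between-class variance for every candidate threshold; 0/0 cases become 0.
    with np.errstate(divide="ignore", invalid="ignore"):
        sigma_b = (mu_t * omega - mu) ** 2 / (omega * (1.0 - omega))
    return int(np.argmax(np.nan_to_num(sigma_b)))

def area_from_pixels(dye_pixels: int, coin_pixels: int, coin_area_mm2: float) -> float:
    """Scale a dye pixel count to mm^2 via lambda = coin area / coin pixel count."""
    lam = coin_area_mm2 / coin_pixels
    return lam * dye_pixels
```

For example, if a coin of 500 mm² occupies 2500 pixels in the binarized image, λ = 0.2 mm²/pixel, and a dye region of 5000 pixels maps to 1000 mm².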
Figure 4. A simulated 4 cm × 4 cm contaminated region.
Figure 5. Least-squares regression to obtain the function f(x) and coefficient of determination r2 at: (a) 50 cm, (b) 75 cm, (c) 100 cm.
Figure 6. Flow diagram for the methodology.
Figure 7. Numerical settings for binarization obtained for the example (two red rectangles): (a) area of 1000 mm2, (b) grayscale image, (c) image with a high threshold of 139.0, (d) image with a final threshold of 80.0.
Figure 8. Results obtained by the application at a distance of 50.5 cm.
Figure 9. Areas of fluorescent dye and calculation results: (a) contaminated regions on the gloves, (b) results for the regions on the gloves, (c) contaminated regions on the face of the manikin, (d) results for the regions on the face.
Table 1. Pixel values based on different areas and distances.
Area (mm2)    50 cm     75 cm    100 cm
50             2127       736       356
100            3846      1334       793
200          10,547      2672      1979
400          14,263      5220      2944
450          15,480      5579      3807
800          28,019    10,309      5721
900          31,204    11,978      6499
1600         57,437    22,185    12,013
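The near-proportional relation between pixel count and physical area in Table 1 is what the least-squares fits in Figure 5 capture at each distance. A sketch of that fitting step, using the 50 cm column of Table 1 (illustrative only; the paper's actual fitting code is not shown, and the variable names are ours):

```python
import numpy as np

# 50 cm column of Table 1: physical areas (mm^2) and their pixel counts.
areas = np.array([50, 100, 200, 400, 450, 800, 900, 1600], dtype=float)
pixels = np.array([2127, 3846, 10547, 14263, 15480, 28019, 31204, 57437], dtype=float)

# Fit pixels = a * area + b by least squares, then report r^2.
a, b = np.polyfit(areas, pixels, 1)
pred = a * areas + b
ss_res = np.sum((pixels - pred) ** 2)
ss_tot = np.sum((pixels - pixels.mean()) ** 2)
r2 = 1.0 - ss_res / ss_tot
```

Inverting the fitted line, area ≈ (pixels − b) / a, recovers the physical area from a pixel count at that photography distance.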
Table 2. Pixel values (areas) based on photography distance and area measurement (with 50.0 cm as the standard distance and 1000 mm2 as the area).
Distance (cm)    48.5      49.5      50.5       51.5       52.5
Area (mm2)     923.48    989.25   1055.03    1120.80    1186.57
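Table 2 shows that, with the app calibrated at 50.0 cm for a true 1000 mm2 patch, the reported area drifts by a nearly constant ~65.8 mm2 for each 1 cm of distance offset. A short sketch of that error calculation (illustrative; the variable names are ours):

```python
# Relative error of the reported area as the camera moves off the
# standard calibration distance of 50.0 cm (data from Table 2).
true_area = 1000.0                              # mm^2, actual patch size
distances = [48.5, 49.5, 50.5, 51.5, 52.5]      # cm
measured = [923.48, 989.25, 1055.03, 1120.80, 1186.57]  # mm^2 reported

errors = [(m - true_area) / true_area for m in measured]
worst = max(abs(e) for e in errors)
# At 50.5 cm the error is about +5.5%; it grows roughly linearly
# with the offset from the 50.0 cm calibration distance.
```

This linear drift is why the application assumes a fixed, known photography distance: a 1 cm ranging error translates directly into a ~6.6% area error.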
Table 3. Results from intubation simulation.
Sites          Face      Hands    Chest Wall
Area (mm2)   3351.3     5040.6          89.7

Chiu, P.-W.; Hsu, C.-T.; Huang, S.-P.; Chiou, W.-Y.; Lin, C.-H. Prediction of Contaminated Areas Using Ultraviolet Fluorescence Markers for Medical Simulation: A Mobile Phone Application Approach. Bioengineering 2023, 10, 530. https://doi.org/10.3390/bioengineering10050530
