Review

Artificial Intelligence in Wound Care: A Narrative Review of the Currently Available Mobile Apps for Automatic Ulcer Segmentation

by
Davide Griffa
1,2,†,
Alessio Natale
1,3,†,
Yuri Merli
1,3,
Michela Starace
1,3,
Nico Curti
4,*,
Martina Mussi
1,3,
Gastone Castellani
1,
Davide Melandri
1,2,
Bianca Maria Piraccini
1,3 and
Corrado Zengarini
1,3,*,†
1
Department of Medical and Surgical Sciences, University of Bologna, 40138 Bologna, Italy
2
SOC Centro Grandi Ustionati/Dermatologia Cesena (Forlì), AUSL Romagna, 47521 Cesena, Italy
3
Dermatology Unit, IRCCS Azienda Ospedaliero-Universitaria di Bologna, 40138 Bologna, Italy
4
Department of Physics and Astronomy, University of Bologna, 40138 Bologna, Italy
*
Authors to whom correspondence should be addressed.
†
These authors contributed equally to this work.
BioMedInformatics 2024, 4(4), 2321-2337; https://doi.org/10.3390/biomedinformatics4040126
Submission received: 6 October 2024 / Revised: 11 November 2024 / Accepted: 3 December 2024 / Published: 11 December 2024
(This article belongs to the Section Imaging Informatics)

Abstract

Introduction: Chronic ulcers significantly burden healthcare systems, requiring precise measurement and assessment for effective treatment. Traditional methods, such as manual segmentation, are time-consuming and error-prone. This review evaluates the potential of artificial intelligence (AI)-powered mobile apps for automated ulcer segmentation and their application in clinical settings. Methods: A comprehensive literature search was conducted across the PubMed, CINAHL, Cochrane, and Google Scholar databases. The review focused on mobile apps that use fully automatic AI algorithms for wound segmentation. Apps requiring additional hardware or lacking sufficient technical documentation were excluded. Key technological features, clinical validation, and usability were analysed. Results: Ten mobile apps were identified, showing varying levels of segmentation accuracy and clinical validation. However, many apps did not publish sufficient information on the segmentation methods or algorithms used, and most lacked details on the databases employed for training their AI models. Additionally, several apps were unavailable in public repositories, limiting their accessibility and independent evaluation. These factors challenge their integration into clinical practice despite promising preliminary results. Discussion: AI-powered mobile apps offer significant potential for improving wound care by enhancing diagnostic accuracy and reducing the burden on healthcare professionals. Nonetheless, the lack of transparency regarding segmentation techniques, unpublished databases, and the limited availability of many apps in public repositories remain substantial barriers to widespread clinical adoption. Conclusions: AI-driven mobile apps for ulcer segmentation could revolutionise chronic wound management. However, overcoming limitations related to transparency, data availability, and accessibility is essential for their successful integration into healthcare systems.

Graphical Abstract

1. Introduction

A chronic wound is one that fails to progress through the normal timely repair process or does not restore anatomical and functional integrity within three months [1,2]. Chronic ulcers are well recognised as a significant burden in healthcare, affecting patient quality of life and imposing economic strain on health systems [3].
The management of a chronic wound is complex and typically involves collecting and analysing essential information to determine its aetiology, understand its prognosis, and decide on the appropriate treatment. Key characteristics to evaluate in a lesion include its location, size, depth, the presence of drainage, its borders, the tissue type, and the patient’s symptoms [2]. Monitoring these characteristics over time is also crucial for assessing the healing process and selecting the most appropriate interventions [4,5].
This assessment is typically performed manually by healthcare professionals through segmentation and measurement, a process that is often time-consuming, prone to human error, and may increase the risk of superinfection [6].
Furthermore, the prevalence of chronic wounds, currently estimated at 2.21 cases per 1000 individuals, is expected to rise due to the increasing life expectancy and the growing global incidence of diabetes and obesity [7,8]. Therefore, it is now more important than ever to consider effective strategies that could facilitate accurate wound assessment, enable the selection of optimal interventions, and improve patient outcomes [9].
For several years, the widespread use of mobile phones has enabled the capture and storage of ulcerative lesion images directly on these devices, thanks to their integrated cameras. Numerous studies have demonstrated that photographing a wound facilitates a more comprehensive assessment and reliable monitoring [10,11]. In this context, advancements in biomedical informatics have enabled the development of tools, such as artificial intelligence (AI)-based mobile applications, capable not only of capturing and storing images of ulcers but also of analysing their main characteristics and providing valuable information to support their management [12].
Biomedical informatics is an interdisciplinary field focused on the effective use of biomedical data, information, and knowledge to improve human health. By leveraging advanced computational techniques, such as AI and machine learning, biomedical informatics facilitates the processing and analysis of complex medical data [13]. AI is the capability of machines to perform tasks that typically require human intelligence, such as image recognition, language understanding, and decision-making [14]. Machine learning is a particular branch of AI that enables systems to learn and improve automatically from experience without explicit programming [14].
AI-based mobile apps are increasingly developed to support physicians and nurses, positioning AI as a promising tool in healthcare, particularly in wound care [15]. These mobile apps appear to have considerable potential and could be valuable tools for professionals working in the field [16]. In fact, the increasing use of AI technologies and handheld devices such as smartphones has led to the timely development of remote diagnostic and prognostic systems for wound care [17]. For example, smartphone cameras have enabled the capturing of images and the development of analysis algorithms to assess wound area, segment, and extract colour correlated with wound tissue [18].
Mobile apps can include one or many built-in features and range from simple ones that detect differences in colour to assess ulcer composition or allow photographic follow-up and monitoring, to advanced ones that use convolutional neural networks (CNNs), a type of neural network specifically designed for visual data analysis, to provide ulcer analysis and generate accurate segmentations [19]. In fact, deep learning, an advanced form of machine learning based on deep neural networks such as CNNs, is currently the most used technique for wound analysis and is one of the most active research areas.
Such tools can guide clinical decision-making, reduce the burden on healthcare professionals, and improve patient outcomes [20]. Additionally, advances in hardware and imaging equipment capable of capturing, classifying, and segmenting images hold promise for improving diagnostic accuracy in healthcare [19]. However, the high heterogeneity of lesion types, image background composition, and acquisition conditions present significant challenges for effective image segmentation [21,22].
This literature review aims to provide an updated overview of mobile applications currently available for wound assessment and the automatic segmentation of skin ulcers. Furthermore, the applied technological frameworks and their clinical validation will be scrutinised to investigate their application in real-world clinical settings. Finally, we will discuss the challenges and limitations of these technologies, along with the future directions for research and development in this rapidly evolving field.

2. Material and Methods

2.1. Literature Search Strategy

A literature review was conducted to identify AI-based mobile apps capable of automatically segmenting and assessing wounds. Figure 1 shows an example flowchart of the automated wound assessment process used by the evaluated AI-based mobile apps.
The search was performed using major electronic databases, including PubMed, CINAHL, Cochrane, Google Scholar, and IEEE Xplore, with a publication date range set from 2015 to 2024. The search string, adapted to different databases, used the following expression with MeSH terms and Boolean operators: ((“Wounds and Injuries” [Mesh]) OR (“Skin Ulcer” [Mesh])) AND ((((“Artificial Intelligence” [Mesh]) OR (“Machine Learning” [Mesh])) OR (segmentation)) AND (“Mobile Applications” [Mesh])). The initial search was conducted in May 2024 and updated in September 2024. Additionally, a search was performed in major app repositories, such as the Apple App Store and Google Play Store, to identify commercially relevant applications for wound management and assessment, using generic keywords such as “wound” and “ulcer”. Inclusion criteria for the apps required mobile device compatibility and the use of AI algorithms for fully automated wound segmentation (without the need for manual or semi-automatic input). Apps that required integration with external devices or lacked sufficient technical documentation were excluded.
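To make the retrieval step concrete, the following is a minimal, purely illustrative sketch (not part of the authors' actual workflow) of how a comparable PubMed query could be run programmatically with Biopython's Entrez utilities; the e-mail address and retmax value are placeholder assumptions.

```python
# Illustrative sketch only: running a PubMed query similar to the one above
# with Biopython's Entrez utilities.
from Bio import Entrez

Entrez.email = "reviewer@example.org"  # placeholder address, required by NCBI etiquette

query = (
    '(("Wounds and Injuries"[Mesh]) OR ("Skin Ulcer"[Mesh])) AND '
    '(((("Artificial Intelligence"[Mesh]) OR ("Machine Learning"[Mesh])) OR (segmentation)) '
    'AND ("Mobile Applications"[Mesh]))'
)

# Restrict to the 2015-2024 publication window used in the review
handle = Entrez.esearch(db="pubmed", term=query, mindate="2015", maxdate="2024",
                        datetype="pdat", retmax=200)
record = Entrez.read(handle)
handle.close()

print(f"{record['Count']} records found")
print(record["IdList"][:10])  # first ten PubMed IDs
```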

2.2. Data Extraction

For each application, data on the segmentation technique (e.g., edge detection, pixel counting, deep learning), algorithm type, and dataset used for training were extracted. If available, information on the validation method, such as comparing algorithmic segmentation with clinician assessments, was also collected. The presence of public datasets, the algorithm’s transparency (e.g., black box or explainable AI), and the availability of published results were documented.

2.3. App Selection Criteria and Scoring Classification

  • Regulatory approval (up to 2 points): Applications were awarded up to two points if they had been approved or registered with recognised regulatory agencies such as the FDA or relevant European agencies. Regulatory approval reflects adherence to safety and efficacy standards, critical for clinical application. While important, this criterion was given a relatively low score to ensure that it did not unduly influence the overall ranking compared to performance metrics.
  • Platform availability (up to 2 points): Applications received up to two points for availability on major mobile platforms (iOS and Android). Platform accessibility is important for widespread adoption by healthcare providers and patients, but it was assigned a lower weight to reflect its supportive role in usability rather than core clinical effectiveness.
  • Peer-reviewed studies or validation data (up to 2 points): Up to two points were awarded if the application had been validated in peer-reviewed studies, as this provides a measure of scientific credibility. This criterion was scored modestly to acknowledge validation without allowing it to disproportionately affect the overall ranking.
  • Disclosure of methods/algorithms and public dataset (up to 2 points): Points were given for transparency in algorithmic methods and the use of public datasets. Applications that disclosed their segmentation algorithms and datasets scored higher, as transparency facilitates scientific scrutiny and reproducibility. This criterion was also limited in score to keep it from outweighing core performance metrics.
  • Inter-rater reliability and performance metrics (up to 10 points): This parameter was the most heavily weighted, with applications awarded up to ten points based on statistical measures of performance, such as inter-rater reliability, the Dice similarity coefficient, pixel-based accuracy, and AUC scores. High inter-rater reliability indicates consistent and reliable results, making it a crucial factor in clinical contexts. Given its importance in reflecting true algorithmic efficiency, this parameter had the highest potential score to ensure it influenced the overall ranking meaningfully.
By structuring our scoring system this way, we emphasised algorithmic efficiency as the primary metric for app evaluation, acknowledging some arbitrariness in the weighting of secondary parameters. These secondary values (regulatory approval, platform availability, peer-reviewed validation, and method transparency) were rated mostly on a spectrum from insufficient to sufficient, occasionally reaching completely fulfilled. This reduced absolute scoring ensured that these factors, while important, could not significantly alter the overall app rankings if the app's primary performance metric was lacking. This design allowed us to maintain algorithmic efficiency as the focal point of our assessment while still acknowledging the role of other essential aspects in clinical usability. A minimal sketch of the scoring computation is shown below.
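As a purely illustrative aid (field names and the example values are hypothetical, not the authors' records), the 18-point scheme described above can be summarised in a few lines of code:

```python
# Minimal sketch of the 18-point scoring scheme described above.
from dataclasses import dataclass

@dataclass
class AppScore:
    regulatory_approval: int       # 0-2
    platform_availability: int     # 0-2
    peer_reviewed_validation: int  # 0-2
    method_transparency: int       # 0-2
    performance_metrics: int       # 0-10 (inter-rater reliability, Dice, accuracy, AUC)

    def total(self) -> int:
        return (self.regulatory_approval + self.platform_availability
                + self.peer_reviewed_validation + self.method_transparency
                + self.performance_metrics)  # maximum 18

# Example: an app fulfilling all secondary criteria but lacking performance data
example = AppScore(2, 2, 2, 2, 0)
print(f"{example.total()}/18")  # -> 8/18, so secondary criteria alone cannot dominate the ranking
```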

3. Results

Ten apps met the inclusion criteria, each with declared performance levels but inconsistent availability of public datasets and clinical validation. The evaluation of the apps also revealed varying accuracy and reliability in AI-powered ulcer segmentation.
Table 1 provides a brief description of each app, including its main features and characteristics. Figure 2 shows a graphic summary of their characteristics.
  • Healthy.io’s Minuteful for Wound (2019, Israel)
Minuteful for Wound (also known as the Wound at Home app) by Healthy.io, launched around 2019, is an advanced digital tool designed to improve wound care management using AI-powered computer vision and 3D wound imaging [23]. Based in Tel Aviv, Israel, with offices in Boston, USA, and London, UK, the company offers an app that allows healthcare providers to assess wounds accurately using smartphone cameras and calibration stickers. It standardises wound measurements across various lighting conditions and offers a comprehensive system for tracking wound progression. The app is FDA-registered and CE-marked, ensuring compliance with global medical device standards. However, no detailed data have been published on the specific segmentation techniques employed by the app’s camera or on the image database used in developing the AI algorithms [23]. The data published on the segmentation techniques for Healthy.io’s Minuteful for Wound refer to its use of AI-powered computer vision and 3D wound imaging integrated with smartphone cameras [8,24]. The system employs colour calibration stickers to ensure consistent measurements across lighting conditions. However, these details apply only to the external calibration used during wound assessment and monitoring through the app, not to internal methods such as segmentation via the camera alone.
Currently, there are no detailed published data regarding the segmentation techniques explicitly employed by the app’s camera without additional devices nor any information on the image database used to develop the segmentation models. The only data published are clinical, regarding its efficacy in providing telemedicine healthcare support [25,26].
  • Wound Vision Scout Mobile App (USA, 2019)
The WoundVision Scout Mobile app [27], released in 2019, is a comprehensive wound imaging and documentation solution developed by WoundVision LLC (Indianapolis, IN, USA). The app is part of an ecosystem built principally around long-wave infrared thermography (LWIT), using an external device that is FDA 510(k)-cleared (K131596) and HIPAA-compliant, designed to streamline wound assessment and integrate seamlessly into electronic medical records (EMRs). However, our exclusive interest is its camera-based wound documentation and assessment capabilities. The hardware instruments, such as the Scout multi-modal imaging device, are beyond the scope of this review. The mobile app analysed provides visual photography via the phone camera to record, assess, and document wounds, particularly hospital-acquired pressure injuries (HAPIs).
The published data on segmentation techniques [28,29] only apply to the external thermal camera device (like the Scout device used in the studies). The app was claimed to be downloadable and available in stores (like Google Play) but is hardly retrievable in repositories, and there are no published data supporting the segmentation methods employed by the smartphone camera app. Furthermore, there is no information available on the number or source of images used in the app’s development or on any associated image database.
  • APD Skin Monitoring App (Singapore, 2019)
APD Lab, based in Singapore, developed the APD Skin Monitoring App in collaboration with APD SKEG Pte Ltd. It was initially part of a project aimed at improving wound care by using smartphone technology to help patients and healthcare providers monitor wound healing. The app is available on both the Google Play and Apple App Store and was first released in 2019 [30].
The APD Skin Monitoring App performs wound segmentation using the following two main approaches:
  • GrabCut algorithm: This method uses an interactive segmentation based on graph cuts, requiring the user to draw a rectangle around the wound. While accurate, it is slow and demands manual input, making it less efficient [31].
  • Colour thresholding: The second approach leverages colour detection based on typical wound hues (e.g., shades of red). It quickly separates wound pixels from the background and uses contour detection for area calculation. This method is faster and more accurate [32].
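As an illustration of the general colour-thresholding idea described above (not the APD app's actual implementation), the following OpenCV sketch thresholds reddish hues in HSV space and reports the pixel area of the largest detected region; the HSV ranges and file path are assumptions chosen for illustration.

```python
# Illustrative colour-thresholding segmentation in the spirit of the second approach above.
import cv2
import numpy as np

img = cv2.imread("wound_photo.jpg")          # illustrative path to a smartphone photo (BGR)
hsv = cv2.cvtColor(img, cv2.COLOR_BGR2HSV)

# Red hues wrap around the hue axis, so combine two HSV ranges
mask = cv2.inRange(hsv, (0, 60, 60), (10, 255, 255)) | \
       cv2.inRange(hsv, (170, 60, 60), (180, 255, 255))

# Keep the largest contour as the candidate wound region and compute its pixel area
contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
if contours:
    wound = max(contours, key=cv2.contourArea)
    print("wound area (pixels):", cv2.contourArea(wound))
```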
The APD Skin Monitoring App lacks published data regarding its clinical efficiency, database, or formal results comparison. In the development process, the app primarily relied on wound images from a single volunteer for testing, which were used to demonstrate the functionality of its image processing algorithms. While these internal tests showed promising results regarding wound detection accuracy and processing speed, no extensive clinical trials or formal studies have been published to validate its effectiveness in broader patient populations [30].
  • NDKare App (Singapore, 2019)
The NDKare app was developed by Nucleus Dynamics Pte. Ltd., Singapore, and is designed for the secure and fast sharing of wound images between patients, clinicians, and healthcare organisations. The app’s first data were shared in 2018 [33], and it allowed users to upload realistic wound photos directly from their smartphone, making them accessible from any desktop or mobile device for further review. It also has FDA registration, CE marking, and HSA approval, ensuring that it meets necessary regulatory standards for medical devices.
The app provides a practical tool for wound assessment, making it easier to track wound healing and share data with healthcare providers efficiently. As of its latest update in June 2019, it focuses on improving the user experience and secure handling of sensitive medical data. NDKare is available for download in different app repositories.
In the only known study, the use of a proprietary database of approximately 60 patients with 203 measurements was described for the app’s development [33]. The results were as follows:
- For 2D wound segmentation, the NDKare app uses an image processing technique that automatically distinguishes the wound area from the surrounding tissue based on pixel analysis. The app identifies the ulcer boundaries and allows users to adjust the outline manually, if needed. This segmentation calculates 2D metrics such as length, width, perimeter, and surface area, offering precise measurements of wounds captured by the smartphone’s camera.
- For 3D wound segmentation, the technique is based on “structure from motion”. This technique creates a 3D model by analysing a video of the wound, capturing images from different angles, identifying key points, and reconstructing the wound in 3D using triangulation. The app then generates a “dense 3D point cloud” and a smooth surface reconstruction for depth and volume measurement.
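To illustrate the general principle behind pixel-based 2D measurement (not NDKare's proprietary code), the sketch below converts a segmented pixel count into a physical area using a calibration marker of known size; all masks and dimensions are toy values.

```python
# Toy illustration of converting a segmented pixel count to physical units
# via a reference marker of known size.
import numpy as np

# Stand-ins for the outputs of a segmentation step:
wound_mask = np.zeros((480, 640), dtype=bool); wound_mask[100:220, 150:330] = True
marker_mask = np.zeros((480, 640), dtype=bool); marker_mask[20:60, 20:60] = True

MARKER_AREA_CM2 = 4.0  # e.g. a 2 cm x 2 cm reference sticker (hypothetical)

cm2_per_pixel = MARKER_AREA_CM2 / marker_mask.sum()
wound_area_cm2 = wound_mask.sum() * cm2_per_pixel
print(f"wound surface area: {wound_area_cm2:.1f} cm^2")
```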
To date, the site is offline, and the application cannot be downloaded.
  • Clinicgram (Barcelona, 2019)
Clinicgram [34] was developed by Skilled Skin SL, a company based in Barcelona, Spain. The app was first released in 2019 as a solution for managing and diagnosing chronic wounds and other clinical conditions using smartphone technology. It is a Class I medical device certified under European standards.
The producer declared that it uses AI-based algorithms to automatically analyse clinical images and segment various wound parameters. These include detecting the wound’s perimeter, area, and tissue types and helping clinicians assess wound progression.
However, neither the database size nor the segmentation technique is disclosed, and there are no data on its efficiency in wound recognition and segmentation.
Furthermore, no articles or scientific works have been published on this application, which is reported to be undergoing “clinical trials” and is described as a CE class IIb-certified device.
  • Swift Skin and Wound App (Canada, 2017)
The Swift Skin and Wound App was developed by Swift Medical, Toronto, Canada [35]. The medical device enables the capture, measurement, and documentation of wound progression and was presented in 2017 [36]. The app works on most iOS and Android devices and complies with HIPAA (Health Insurance Portability and Accountability Act 1996, USA) and PHIPA (Personal Health Information Protection Act 2004, Canada).
The Swift Skin and Wound App uses an artificial intelligence algorithm to measure wound circumference, type, and progress accurately.
An update of the models used by the digital tool was presented in 2022 [37], as follows:
- YOLOv3 [38]: a separate object detection model.
- AutoTrace [37]: the segmentation model for lesion localisation, a deep convolutional encoder–decoder neural network with attention gates in the skip connections.
- AutoTissue [39,40]: the segmentation model for wound tissue, a convolutional encoder–decoder neural network using an EfficientNet-B0 architecture as the encoder.
Tissue segmentation was considered good to fair in terms of tissue identification and quality.
These models were trained using 465,187 and 17,000 image–label pairs, respectively, the largest and most diverse chronic wound image dataset in terms of wound types and skin tones.
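For readers unfamiliar with this architecture family, the following is a minimal, hypothetical sketch of an encoder–decoder segmenter with an EfficientNet-B0 encoder, in the spirit of the AutoTissue description above; it is not Swift Medical's code and assumes torchvision (>= 0.13) purely for illustration.

```python
# Hypothetical sketch of an encoder-decoder segmentation network with an
# EfficientNet-B0 encoder, implemented with torchvision for illustration only.
import torch
import torch.nn as nn
from torchvision.models import efficientnet_b0

class TinyEffNetSegmenter(nn.Module):
    def __init__(self, n_classes: int = 4):  # e.g. granulation, slough, necrosis, background
        super().__init__()
        # EfficientNet-B0 feature extractor as the encoder (output stride 32, 1280 channels)
        self.encoder = efficientnet_b0(weights=None).features
        # Deliberately simple decoder: 1x1 projection plus bilinear upsampling to input size
        self.head = nn.Conv2d(1280, n_classes, kernel_size=1)

    def forward(self, x):
        size = x.shape[-2:]
        feats = self.encoder(x)    # (B, 1280, H/32, W/32)
        logits = self.head(feats)  # (B, n_classes, H/32, W/32)
        return nn.functional.interpolate(logits, size=size, mode="bilinear", align_corners=False)

model = TinyEffNetSegmenter()
mask_logits = model(torch.randn(1, 3, 224, 224))  # -> (1, 4, 224, 224) per-pixel class logits
```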
Measurement is carried out by applying a small HealX™ reference adhesive marker next to the wound during image acquisition, which acts as a scale and colour reference.
The published data regarding accuracy are quite good; the app showed a mean ICC of 0.86 (range 0.75–0.94) for measuring height and width and has been shown to be 57% faster than the traditional ruler method with paper charting [41].
Specific published data regarding segmentation and area calculation in a testing set showed an ICC of 1 (0.99–1), although the measurements were performed on a small number of wounds and in a specific environment [36].
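As a hypothetical illustration of how such ICC values are typically computed from paired app and ruler measurements (the data below are invented), one option is the pingouin library:

```python
# Invented example of computing an intraclass correlation coefficient (ICC)
# from long-format paired measurements using pingouin.
import pandas as pd
import pingouin as pg

# Each wound measured by two "raters": the app and the traditional ruler method
data = pd.DataFrame({
    "wound":    [1, 1, 2, 2, 3, 3, 4, 4],
    "rater":    ["app", "ruler"] * 4,
    "area_cm2": [3.1, 3.0, 5.4, 5.6, 2.2, 2.1, 7.8, 7.5],
})

icc = pg.intraclass_corr(data=data, targets="wound", raters="rater", ratings="area_cm2")
print(icc[["Type", "ICC", "CI95%"]])
```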
Moreover, the app allows for interfacing with other devices (such as FLIR One™) to assess wound temperatures.
The Swift Skin and Wound solution includes a dashboard to view a full range of statistics and trends in the patient population; it allows for the export of images, measurements, assessments, and treatments via PDF documents. All patient data are protected and encrypted. Finally, patients and caregivers for home monitoring can also use the tool.
  • Cares4wounds (Singapore, 2019)
Cares4wounds (C4W) is a wound care imaging, assessment, and management system developed by Tetsuyu Healthcare Holdings Pte Ltd, Singapore [42]. The purpose of this tool is to provide a standardised approach to monitoring wound healing progress by using a non-contact approach.
The system consists of a software application that is utilised with commercial off-the-shelf (COTS) Apple mobile computing platforms (an iPad with a 3D structure sensor and an iPhone with a dual camera) and a web-based software application that is optimised for mobile platforms. The mobile application is supported on the iPhone 11 or an equivalent device (no attachment sensor or reference sticker required) or, alternatively, on any iPad with an attached structure sensor. The application is accessible from the Apple App Store. C4W is available in two versions; while the C4W-A is designed only for the iPad mini fifth generation with a structure sensor, the C4W-B is also supported on the iPhone 11 or later. The C4W-A version uses a point-and-measure system to measure the length, width, and depth of wounds, while the tissue classification is manual. In contrast, the C4W-B version has automatic tissue classification (based on epithelialisation, granulation, necrosis, and slough) and measurement. Both versions allow for recording clinical documentation and conducting teleconsultations. The patient’s data can be accessed remotely through a secured web portal to improve overall patient management. Additionally, the C4W-B has received the CE mark and is listed in the HSA Class B Medical Device Register for Singapore.
Chan et al. conducted a study to validate the C4W system against traditional wound assessment measurements in 28 patients diagnosed with diabetic foot ulcers. The study evidences the optimal inter-rater reliability of C4W measurements for length (range 0.825–0.934), width (range 0.825–0.930), and area (range 0.872–0.932) against traditional wound assessment by a trained specialist wound nurse [43].
Kitamura et al. published a case study of a 90-year-old woman presenting with a pressure injury during the COVID-19 pandemic [44]. The visiting nurse utilised the C4W app to conduct teleconsultations with a wound care specialist. By assessing the wound through the C4W app, they asserted how they have been able to provide valuable instructions for patient assistance, leading to better wound care.
Data on the segmentation technique, database size, and efficiency in wound recognition are not available; although the data published by Chan et al. [43] showed good inter-rater reliability of C4W wound measurements, these were limited to length, width, and area (not available for depth and undermining) and to diabetic foot ulcers.
  • Tissue Analytics (USA, 2014)
The Tissue Analytics (TA) app is a cloud-based application that uses machine learning and computer vision to autonomously segment, classify, and measure wounds and to assist in wound treatment [45]. It was developed in 2014 by Tissue Analytics, Inc., Baltimore, MD, USA, and the start-up was later acquired by the Net Health Company in 2020. The workflow consists of capturing an image of the wound, enabling the TA app to analyse its dimensions, perimeter, surface area, and tissue composition.
The app is available for iOS and Android devices with an integrated camera. There is no need for additional optics or hardware. The TA app includes a clinician interface that allows for taking photos and documenting wound assessment and management, and an interface for patients to monitor wounds that is connected to the clinician interface for oversight.
The Net Health company has released data that show a measurement error of less than 5% when using the TA app compared to conventional methods.
The TA app was granted Breakthrough Device Status by the FDA in 2022.
Data on the segmentation technique or database size are not available.
The study conducted by Barakat-Johnson et al. aimed to evaluate the usability and effectiveness of the TA app in improving wound assessment and management [46]. Using the TA app for wound care in 124 patients during the COVID-19 pandemic, a significant improvement in wound documentation and remote patient monitoring was demonstrated. The wound management schedule showed a 6.1% improvement compared to standard-care patients, with a 32.8% improvement in pain, 91.7% improvement in size, 55.2% improvement in exudate, and 38.7% improvement in odour. According to the authors, the app has the ability to capture wound images, integrate with EMRs, aggregate wound data, integrate telehealth, and provide clinical decision-support capability.
Another study by Fong and colleagues aimed to verify the clinical inter-rater reliability of the TA app in measuring venous leg ulcers by comparing it to traditional wound assessment measurements conducted by a trained wound nurse [47]. The inter-rater reliability between traditional standard measurements and TA measurements ranged from 0.799 to 0.919. The inter-rater reliability for using TA on iOS or Android systems ranged from 0.987 to 0.989.
  • ImitoWound (Switzerland 2020)
ImitoWound is a digital wound management app provided by Imito, based in Zurich, Switzerland [48]. The app is compatible with both iOS and Android devices; once a picture is taken, the app computes the wound size and reviews wound images on a timeline to assess the progress of wound healing, relying on calibration markers of standard size available as a PDF file. It is not regulated under the European Union MDR or by the FDA, since the developers state that the application is intended for medical and medico-administrative documentation (storage and archival), communication, simple searches, and data representation/embellishment.
The database size and the exact segmentation technique are not revealed, but there are published data regarding ICC reliability; Bayoumi et al. reported an ICC of 0.99 (95% CI 0.998–0.999) in measuring the lower-limb chronic wounds of 61 patients, while Guarro et al. reported an ICC of 0.95 in a case series of 32 wounds [49]. Moreover, Biagioni et al. published a study comparing 85 wounds on toes, feet, and legs caused by vascular disease, measured with Imito and ImageJ [50]. The ICCs comparing the results were 0.995 for iOS (95% CI, 0.991–0.997) and 0.970 for Android (95% CI, 0.946–0.984). Further studies have been published on the app ecosystem provided by the same company [51,52]. The developers state that, in the future, the app could be utilised to evaluate the tissue characteristics of wounds and aid in making treatment decisions.
  • WoundWiseIQ (USA, 2015)
WoundWiseIQ, released by Med-Compliance IQ, Inc. in 2015, is a mobile application for capturing wound images that is compatible with iOS devices [53]. The technology enables automatic wound analysis based on colour differences, but the algorithm is declared to be patented and undisclosed. It is FDA-registered and HIPAA-compliant.
It consists of two mobile applications. The first one is used by clinicians to capture wounds and access image analysis; the second one is made available for patients to take wound images on their own. The mobile application sends captured images to a HIPAA-compliant cloud server, where they are processed.
The 3D wound image analysis allows users to automatically calculate wound depth and volume. The patented algorithm performs image analysis of the wound perimeter, square area, planimetric area, and various colours of the wound.
The WoundWiseIQ app currently lacks connectivity to other applications, such as wound EMRs or enterprise EMRs. Advanced analytics for decision support and predictive analysis are currently in progress.
Data on the segmentation technique, database size, and efficiency in wound recognition are not available. There are no studies that validate the use of WoundWiseIQ in clinical practice or its inter-reliability in comparison to standard wound assessment.
Table 1. This table summarises the key characteristics, segmentation techniques, and clinical validations of ten mobile applications developed for wound segmentation using artificial intelligence (AI). The table evaluates each app based on several criteria, including availability on mobile platforms, regulatory approval, published studies, segmentation methods, and reliability metrics. The apps are ranked according to their segmentation accuracy and the level of clinical validation available, providing an assessment of their potential utility in clinical practice.
App Name | State | Company, Industry, Other | Available on App Stores and/or Public Repositories | Studies | Healthcare Agency Evaluation | Public Dataset | Segmentation Technique | Reliability | Our Classification
--- | --- | --- | --- | --- | --- | --- | --- | --- | ---
Minuteful for Wound (Wound at Home), Healthy.io | Israel | Healthy.io, a private company | 2/2 (App Store, Android store) | 1/2 | 2/2, Yes | No | N/A | N/A | 5/18
WoundVision Scout Mobile App | USA | WoundVision LLC, a private company | N/A | 2/2 | No | No | N/A | N/A | 2/18
APD Skin Monitoring App | Singapore | APD Lab, a private company | 2/2 (App Store, Android store) | 1/2, scarce | No | No | 1/1, GrabCut [31], RGB thresholds [32] | N/A | 4/18
NDKare app | Singapore | Nucleus Dynamics Pte. Ltd., a private company | 2/2 (App Store, Android store, other repositories) | 2/2 | 2/2, Yes | No | 1/1 for 2D reconstruction, pixel analysis [54] | 10/10 [55] | 17/18
Clinicgram | Spain | Skilled Skin SL, a private company | No | No | 2/2, Yes | No | N/A | N/A | 2/18
Swift Skin and Wound | Canada | Swift Medical Inc. | No | 2/2 | 2/2, Yes | No | 1/1, AutoTissue: tissue segmentation model; AutoTrace: wound segmentation model | 10/10 [41] | 15/18
Cares4wounds | Singapore | Tetsuyu Healthcare Holdings Pte Ltd. | 2/2 (App Store, Android store) | 2/2 | 2/2, Yes | No | N/A | 9/10 [43] | 15/18
Tissue Analytics | USA | Net Health Company | 2/2 (App Store, Android store) | 2/2 | No | No | N/A | 10/10 [47] | 14/18
ImitoWound | Switzerland | Imito AG | 2/2 (App Store, Android store) | 2/2 | No | No | N/A | 10/10 [56] | 14/18
WoundWiseIQ | USA | Med-Compliance IQ, Inc. | No | No | 2/2 | No | N/A | N/A | 2/18

4. Discussion

In recent years, the use of smartphones and mobile apps in clinical practice has increased considerably, not only due to their capability to capture high-resolution photographs but also for the ability to store medical information, monitor chronic conditions, and enhance care management [57]. In particular, mobile apps dedicated to skin injury detection and assessment are attracting growing interest for their ability to offer diagnostic and therapeutic support, thereby improving the quality of care for patients with acute and chronic skin ulcers [16].
The use of these tools has been made possible through the development of advanced artificial intelligence (AI) algorithms, deep learning, and convolutional neural networks (CNNs). Artificial intelligence is defined as the ability of a machine to perform tasks that normally require human intellect, such as visual perception, speech recognition, or decision-making [58]. Belonging to this field is deep learning, a sub-discipline of AI that relies on artificial neural networks with different levels of depth, enabling the model to automatically learn complex features from rough data. Convolutional neural networks (CNNs), a type of deep learning model particularly well suited for image processing, can identify the most salient features through multiple layers of processing, mimicking the structure and function of the human brain [58].
Artificial intelligence methodologies show significant positive impacts and promising prospects in the field of wound care and management [59]. In fact, the application of AI and deep learning has facilitated the development of advanced tools for the automatic segmentation and detection of clinical images. Segmentation involves dividing an image into relevant parts, such as the contours of a lesion or different tissue types, while detection focuses on identifying key features, such as edges or colour changes. With regard to wounds, this process enables the algorithms to recognise and analyse the various types of tissue present, such as epithelial tissue, granulation tissue, slough, and necrotic tissue [15]. Therefore, it automatically analyses the status of the lesion and provides an accurate assessment, eliminating the need for a subjective manual analysis that may be influenced by the clinician’s experience and environmental conditions.
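The Dice similarity coefficient listed among the performance metrics in Section 2.3 is the standard way to quantify how closely such an automatic segmentation overlaps a clinician-traced reference; a minimal sketch with toy masks follows.

```python
# Minimal illustration of the Dice similarity coefficient between a predicted
# wound mask and a clinician-traced reference mask (toy data).
import numpy as np

def dice(pred: np.ndarray, truth: np.ndarray) -> float:
    """Dice = 2*|A intersect B| / (|A| + |B|) for two boolean masks."""
    pred, truth = pred.astype(bool), truth.astype(bool)
    intersection = np.logical_and(pred, truth).sum()
    denom = pred.sum() + truth.sum()
    return 2.0 * intersection / denom if denom else 1.0

a = np.zeros((10, 10), bool); a[2:6, 2:6] = True   # predicted mask
b = np.zeros((10, 10), bool); b[3:7, 3:7] = True   # reference mask
print(round(dice(a, b), 3))  # partial overlap, approximately 0.56
```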
The COVID-19 pandemic has significantly accelerated the development of apps focused on skin lesion management, as they facilitate remote assessment [60]. Some of these apps enable real-time storage, management, and sharing of clinical information while also employing algorithms that can automate and standardise skin lesion assessments. They also promote greater diagnostic consistency among different healthcare providers, which is crucial, especially in settings where staff training and experience may differ.
Through this literature review, we aim to present the main automated assessment tools available on the market to healthcare professionals. Significant advantages are recognised from the use of these applications, such as the standardisation of assessment and improved diagnostic accuracy, which helps mitigate the subjective variability among operators. In addition, they help reduce the time of ulcer diagnosis and measurement compared to traditional methods [41]. Moreover, with the added benefit of remote assessment, they eliminate direct contact with the patient, reducing the risk of infection and avoiding pain, unlike traditional measurement techniques that involve using a paper ruler or graph paper [61,62].
The standardisation of diagnostic processes not only reduces the risk of human error but also enhances the tracking of injury progression over time. In addition, apps can serve as valuable support tools for less experienced staff by offering preliminary diagnoses and therapeutic suggestions, thereby improving the overall quality of care.
However, there are several limitations associated with the use of these tools. The review revealed that it was not possible to assess the algorithms’ level of learning due to the lack of sufficiently large and diverse public databases for effective training. In addition, the existing scientific literature validating many of these apps is still limited, with published studies often featuring a small number of images, which limits the external validity of the results and the direct comparisons between them, as segmentation and measurement methods vary widely among applications. However, since this work underscores the need to establish uniform evaluation standards as an initial point of reflection, we preferred to present the results in percentage terms to stimulate an indirect comparison where possible and to be as informative as possible about the current landscape.
Finally, most app developers do not provide adequate technical details regarding the deep learning algorithms employed or the data used for training these models, making it challenging to evaluate the actual effectiveness of the apps in diverse clinical settings. In addition, assessing the quality of mobile applications proved difficult due to their unavailability in major digital stores. In this context, the literature suggests using the mobile app rating scale (MARS) as an evaluation tool, which comprises five main criteria, namely engagement, functionality, aesthetics, information, and subjective quality [63].
Research in this field is continuously advancing, aiming to integrate automated skin injury assessments into routine clinical practice. In this context, significant innovations include the development of a fully automated model capable of performing wound area identification and segmentation, achieving a fully automatic clinical severity assessment from smartphone-acquired images. Such approaches utilise active semi-supervised learning to train a convolutional neural network model [64]. Published models, subsequently trained and compared against Photographic Wound Assessment Tool (PWAT) scores, obtained a Spearman correlation coefficient of 0.85, indicating strong predictive accuracy [65].
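As a hypothetical illustration of that comparison (the scores below are invented), a Spearman correlation between clinician-assigned and model-predicted PWAT scores can be computed with SciPy:

```python
# Invented example of the Spearman correlation between clinician-assigned
# and model-predicted wound severity (PWAT-style) scores.
from scipy.stats import spearmanr

clinician_pwat = [8, 12, 15, 20, 22, 25, 9, 17]   # invented reference scores
predicted_pwat = [9, 11, 16, 19, 24, 24, 10, 15]  # invented model outputs

rho, p_value = spearmanr(clinician_pwat, predicted_pwat)
print(f"Spearman rho = {rho:.2f} (p = {p_value:.3f})")
```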
Despite the many advantages of using these apps, concerns about data privacy, security, and integration with existing healthcare systems must be addressed [66]. As these apps collect sensitive clinical information and personal patient images, it is essential to ensure that data are protected in accordance with current regulations, such as the General Data Protection Regulation (GDPR) in Europe, Regulation (EU) 2016/679 [67].
Various challenges remain to be addressed with automated mobile applications, but future prospects look promising. Indeed, it is expected that algorithms will soon be capable of automatically identifying the wound aetiology, offering accurate classification and recommending the most appropriate treatment and type of dressing. In this context, a learning model has recently been introduced that classifies wounds into the following categories: deep, infected, arterial, venous, and pressure wounds [68,69,70].
Finally, deep learning is a key area of active research in wound image analysis, particularly in image classification, detection, and segmentation, which can significantly enhance diagnostic efficiency for healthcare providers [71]. The advancement of this technology could be especially valuable in addressing the needs of regions where expertise and access to healthcare services are limited.
Nevertheless, it remains essential to develop robust algorithms capable of analysing images captured in suboptimal conditions, such as poor lighting and inaccurate positioning [72].
As mobile applications for wound care continue to evolve in clinical practice, they have the potential to significantly enhance and improve patient care. Therefore, it is crucial to implement and promote the development of digital skills throughout the training and professional development of healthcare practitioners [73].

5. Conclusions

Mobile apps for skin wound management, powered by artificial intelligence and deep learning, are revolutionising the field of wound care and already serve as valuable advanced tools for wound diagnosis and treatment. Despite the many advantages in standardisation, accuracy, and speed, some challenges persist, including a lack of adequate public databases and concerns regarding data security. Above all, one of the major limitations is the inconsistent clinical validation, often constrained by small datasets and narrow patient demographics. To increase the reliability and generalisability of these applications in varied clinical settings, future validation efforts should involve larger, multi-centre studies with diverse patient cohorts. This approach would better capture the heterogeneity of wound types and ulcer presentations encountered in real-world practice, strengthening the clinical robustness of these algorithms.
Additionally, the absence of standardised benchmarks and evaluation metrics significantly hinders the comparability of different applications. Each application often uses proprietary segmentation methods and unique measurement techniques, which limits consistent performance evaluation across platforms. To address this, the establishment of standardised datasets and uniform metrics is crucial. These improvements would not only enhance comparability but also provide a scientifically rigorous foundation for future advancements in wound care technology.
However, as algorithms advance and technologies become integrated into daily clinical practice, these apps have the potential to significantly enhance the quality of wound management and care, offering even more efficient support for wound care professionals.

Author Contributions

D.G.: conceptualization, methodology, software. A.N.: data curation, writing—original draft preparation. Y.M.: visualisation, data curation. M.S.: methodology, validation, writing—reviewing and editing. N.C.: software, formal analysis, visualisation. M.M.: formal analysis, data curation. G.C.: project administration, supervision. D.M.: supervision, resources. B.M.P.: supervision, resources. C.Z.: validation, writing—reviewing and editing. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Institutional Review Board Statement

Not applicable.

Data Availability Statement

Data are available upon request from the authors.

Conflicts of Interest

The authors declare no conflicts of interest.

References

  1. Mustoe, T.A.; O’Shaughnessy, K.; Kloeters, O. Chronic wound pathogenesis and current treatment strategies: A unifying hypothesis. Plast. Reconstr. Surg. 2006, 117 (Suppl. S7), 35S–41S. [Google Scholar] [CrossRef] [PubMed]
  2. Bowers, S.; Franco, E. Chronic Wounds: Evaluation and Management. Am. Fam. Physician 2020, 101, 159–166. [Google Scholar] [PubMed]
  3. Olsson, M.; Järbrink, K.; Divakar, U.; Bajpai, R.; Upton, Z.; Schmidtchen, A.; Car, J. The humanistic and economic burden of chronic wounds: A systematic review. Wound Repair Regen. Off. Publ. Wound Health Soc. Eur. Tissue Repair Soc. 2019, 27, 114–125. [Google Scholar] [CrossRef] [PubMed]
  4. Yee, A.; Harmon, J.; Yi, S. Quantitative Monitoring Wound Healing Status Through Three-Dimensional Imaging on Mobile Platforms. J. Am. Coll. Clin. Wound Spec. 2016, 8, 21–27. [Google Scholar] [CrossRef]
  5. Samaniego-Ruiz, M.-J.; Llatas, F.P.; Jiménez, O.S. Assessment of chronic wounds in adults: An integrative review. Rev. Esc. Enferm. USP 2018, 52, e03315. [Google Scholar] [CrossRef]
  6. Foltynski, P.; Ciechanowska, A.; Ladyzynski, P. Wound surface area measurement methods. Biocybern. Biomed. Eng. 2021, 41, 1454–1465. [Google Scholar] [CrossRef]
  7. Martinengo, L.; Olsson, M.; Bajpai, R.; Soljak, M.; Upton, Z.; Schmidtchen, A.; Car, J.; Järbrink, K. Prevalence of chronic wounds in the general population: Systematic review and meta-analysis of observational studies. Ann. Epidemiol. 2019, 29, 8–15. [Google Scholar] [CrossRef]
  8. Sen, C.K. Human Wounds and Its Burden: An Updated Compendium of Estimates. Adv. Wound Care 2019, 8, 39–48. [Google Scholar] [CrossRef]
  9. Smet, S.; Verhaeghe, S.; Beeckman, D.; Fourie, A.; Beele, H. The process of clinical decision-making in chronic wound care: A scenario-based think-aloud study. J. Tissue Viability 2024, 33, 231–238. [Google Scholar] [CrossRef]
  10. Khoo, R.; Jansen, S. The Evolving Field of Wound Measurement Techniques: A Literature Review. Wounds Compend. Clin. Res. Pract. 2016, 28, 175–181. [Google Scholar]
  11. Wang, C.; Yan, X.; Smith, M.; Kochhar, K.; Rubin, M.; Warren, S.M.; Wrobel, J.; Lee, H. A unified framework for automatic wound segmentation and analysis with deep convolutional neural networks. In Proceedings of the 2015 37th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC), Milan, Italy, 5 November 2015; pp. 2415–2418. [Google Scholar] [CrossRef]
  12. Koepp, J.; Baron, M.V.; Martins, P.R.H.; Brandenburg, C.; Kira, A.T.F.; Trindade, V.D.; Dominguez, L.M.L.; Carneiro, M.; Frozza, R.; Possuelo, L.G.; et al. The Quality of Mobile Apps Used for the Identification of Pressure Ulcers in Adults: Systematic Survey and Review of Apps in App Stores. JMIR MHealth UHealth 2020, 8, e14266. [Google Scholar] [CrossRef] [PubMed]
  13. Sutton, R.T.; Pincock, D.; Baumgart, D.C.; Sadowski, D.C.; Fedorak, R.N.; Kroeker, K.I. An overview of clinical decision support systems: Benefits, risks, and strategies for success. NPJ Digit. Med. 2020, 3, 17. [Google Scholar] [CrossRef]
  14. Iglesias, L.L.; Bellón, P.S.; Del Barrio, A.P.; Fernández-Miranda, P.M.; González, D.R.; Vega, J.A.; Mandly, A.A.G.; Blanco, J.A.P. A primer on deep learning and convolutional neural networks for clinicians. Insights Imaging 2021, 12, 117. [Google Scholar] [CrossRef]
  15. Le, D.T.P.; Pham, T.D. Unveiling the role of artificial intelligence for wound assessment and wound healing prediction. Explor. Med. 2023, 4, 589–611. [Google Scholar] [CrossRef]
  16. Shamloul, N.; Ghias, M.H.; Khachemoune, A. The Utility of Smartphone Applications and Technology in Wound Healing. Int. J. Low. Extrem. Wounds 2019, 18, 228–235. [Google Scholar] [CrossRef]
  17. Anisuzzaman, D.M.; Wang, C.; Rostami, B.; Gopalakrishnan, S.; Niezgoda, J.; Yu, Z. Image-Based Artificial Intelligence in Wound Assessment: A Systematic Review. Adv. Wound Care 2022, 11, 687–709. [Google Scholar] [CrossRef]
  18. Poon, T.W.K.; Friesen, M.R. Algorithms for Size and Color Detection of Smartphone Images of Chronic Wounds for Healthcare Applications. IEEE Access 2015, 3, 1799–1808. [Google Scholar] [CrossRef]
  19. Zhang, R.; Tian, D.; Xu, D.; Qian, W.; Yao, Y. A Survey of Wound Image Analysis Using Deep Learning: Classification, Detection, and Segmentation. IEEE Access 2022, 10, 79502–79515. [Google Scholar] [CrossRef]
  20. Kermany, D.S.; Goldbaum, M.; Cai, W.; Valentim, C.C.S.; Liang, H.; Baxter, S.L.; McKeown, A.; Yang, G.; Wu, X.; Yan, F.; et al. Identifying Medical Diagnoses and Treatable Diseases by Image-Based Deep Learning. Cell 2018, 172, 1122–1131.e9. [Google Scholar] [CrossRef]
  21. Scebba, G.; Zhang, J.; Catanzaro, S.; Mihai, C.; Distler, O.; Berli, M.; Karlen, W. Detect-and-segment: A deep learning approach to automate wound image segmentation. Inform. Med. Unlocked 2022, 29, 100884. [Google Scholar] [CrossRef]
  22. Sorour, S.E.; El-Mageed, A.A.A.; Albarrak, K.M.; Alnaim, A.K.; Wafa, A.A.; El-Shafeiy, E. Classification of Alzheimer’s disease using MRI data based on Deep Learning Techniques. J. King Saud Univ.—Comput. Inf. Sci. 2024, 36, 101940. [Google Scholar] [CrossRef]
  23. Healthy.io|Digital Wound Management. Available online: https://healthy.io/services/wound/ (accessed on 30 September 2024).
  24. Nussbaum, S.R.; Carter, M.J.; Fife, C.E.; DaVanzo, J.; Haught, R.; Nusgart, M.; Cartwright, D. An Economic Evaluation of the Impact, Cost, and Medicare Policy Implications of Chronic Nonhealing Wounds. Value Health J. Int. Soc. Pharmacoeconomics Outcomes Res. 2018, 21, 27–32. [Google Scholar] [CrossRef] [PubMed]
  25. Keegan, A.C.; Bose, S.; McDermott, K.M.; Starks White, M.P.; Stonko, D.P.; Jeddah, D.; Lev-Ari, E.; Rutkowski, J.; Sherman, R.; Abularrage, C.J.; et al. Implementation of a patient-centered remote wound monitoring system for management of diabetic foot ulcers. Front. Endocrinol. 2023, 14, 1157518, Erratum in Front. Endocrinol. 2023, 14, 1235970. [Google Scholar] [CrossRef]
  26. Kivity, S.; Rajuan, E.; Arbeli, S.; Alcalay, T.; Shiri, L.; Orvieto, N.; Alon, Y.; Saban, M. Optimising wound monitoring: Can digital tools improve healing outcomes and clinic efficiency. J. Clin. Nurs. 2024, 33, 4014–4023. [Google Scholar] [CrossRef] [PubMed]
  27. Wound Imaging Solutions—WoundVision. Available online: https://woundvision.com/ (accessed on 30 September 2024).
  28. Langemo, D.; Spahn, J.; Snodgrass, L. Accuracy and Reproducibility of the Wound Shape Measuring and Monitoring System. Adv. Skin Wound Care 2015, 28, 317–323. [Google Scholar] [CrossRef]
  29. Langemo, D.; Spahn, J.; Spahn, T.; Pinnamaneni, V.C. Comparison of standardized clinical evaluation of wounds using ruler length by width and Scout length by width measure and Scout perimeter trace. Adv. Skin Wound Care 2015, 28, 116–121. [Google Scholar] [CrossRef]
  30. Wu, W.; Yong, K.Y.W.; Federico, M.A.J.; Gan, S.K.-E. The APD Skin Monitoring App for Wound Monitoring: Image Processing, Area Plot, and Colour Histogram. Spamd [Internet]. 2019. Available online: https://scienceopen.com/hosted-document?doi=10.30943/2019/28052019 (accessed on 30 September 2024).
  31. Tang, M.; Gorelick, L.; Veksler, O.; Boykov, Y. GrabCut in One Cut; CVF Open Access: Salt Lake City, UT, USA, 2013; pp. 1769–1776. Available online: https://openaccess.thecvf.com/content_iccv_2013/html/Tang_GrabCut_in_One_2013_ICCV_paper.html (accessed on 30 September 2024).
  32. Gupta, S.; Girshick, R.; Arbeláez, P.; Malik, J. Learning Rich Features from RGB-D Images for Object Detection and Segmentation. In Proceedings of the Computer Vision—ECCV 2014, Zurich, Switzerland, 6–12 September 2014; Fleet, D., Pajdla, T., Schiele, B., Tuytelaars, T., Eds.; Springer International Publishing: Cham, Switzerland, 2014; pp. 345–360. [Google Scholar]
  33. Nair, H.K.R. Increasing productivity with smartphone digital imagery wound measurements and analysis. J. Wound Care 2018, 27, S12–S19. [Google Scholar] [CrossRef]
  34. Clinicgram—The Revolutionary App that Allows You to Diagnose Diseases with Your Smartphone. Available online: https://www.clinicgram.com/ (accessed on 30 September 2024).
  35. Swift Skin and Wound Mobile App and Dashboards. Swift. Available online: https://swiftmedical.com/solution/ (accessed on 30 September 2024).
  36. Wang, S.C.; Anderson, J.A.E.; Evans, R.; Woo, K.; Beland, B.; Sasseville, D.; Moreau, L. Point-of-care wound visioning technology: Reproducibility and accuracy of a wound measurement app. PLoS ONE 2017, 12, e0183139. [Google Scholar] [CrossRef]
  37. Ramachandram, D.; Ramirez-GarciaLuna, J.L.; Fraser, R.D.J.; Martínez-Jiménez, M.A.; Arriaga-Caballero, J.E.; Allport, J. Fully Automated Wound Tissue Segmentation Using Deep Learning on Mobile Devices: Cohort Study. JMIR MHealth UHealth 2022, 10, e36977. [Google Scholar] [CrossRef]
  38. Redmon, J.; Farhadi, A. YOLOv3: An Incremental Improvement. arXiv 2018, arXiv:1804.02767. [Google Scholar] [CrossRef]
  39. Liu, Z.; Agu, E.; Pedersen, P.; Lindsay, C.; Tulu, B.; Strong, D. Chronic Wound Image Augmentation and Assessment Using Semi-Supervised Progressive Multi-Granularity EfficientNet. IEEE Open J. Eng. Med. Biol. 2023, 5, 1–17. [Google Scholar] [CrossRef] [PubMed]
  40. EfficientNetB0—Furiosa Models. Available online: https://furiosa-ai.github.io/furiosa-models/latest/models/efficientnet_b0/ (accessed on 5 October 2024).
  41. Au, Y.; Beland, B.; Anderson, J.A.E.; Sasseville, D.; Wang, S.C. Time-Saving Comparison of Wound Measurement Between the Ruler Method and the Swift Skin and Wound App. J. Cutan. Med. Surg. 2019, 23, 226–228. [Google Scholar] [CrossRef] [PubMed]
  42. CARES4WOUNDS Wound Management System|Tetusyu Healthcare. Tetsuyu Healthcare. Available online: https://tetsuyuhealthcare.com/solutions/wound-management-system/ (accessed on 5 October 2024).
  43. Chan, K.S.; Chan, Y.M.; Tan, A.H.M.; Liang, S.; Cho, Y.T.; Hong, Q.; Yong, E.; Chong, L.R.C.; Zhang, L.; Tan, G.W.L.; et al. Clinical validation of an artificial intelligence-enabled wound imaging mobile application in diabetic foot ulcers. Int. Wound J. 2022, 19, 114–124. [Google Scholar] [CrossRef] [PubMed]
  44. Kitamura, A.; Nakagami, G.; Okabe, M.; Muto, S.; Abe, T.; Doorenbos, A.; Sanada, H. An application for real-time, remote consultations for wound care at home with wound, ostomy and continence nurses: A case study. Wound Pract. Res. 2022, 30, 158–162. [Google Scholar] [CrossRef]
  45. Home. Tissue Analytics. 2020. Available online: https://www.tissue-analytics.com/ (accessed on 5 October 2024).
  46. Barakat-Johnson, M.; Jones, A.; Burger, M.; Leong, T.; Frotjold, A.; Randall, S.; Kim, B.; Fethney, J.; Coyer, F. Reshaping wound care: Evaluation of an artificial intelligence app to improve wound assessment and management amid the COVID-19 pandemic. Int. Wound J. 2022, 19, 1561–1577. [Google Scholar] [CrossRef]
  47. Fong, K.Y.; Lai, T.P.; Chan, K.S.; See, I.J.L.; Goh, C.C.; Muthuveerappa, S.; Tan, A.H.; Liang, S.; Lo, Z.J. Clinical validation of a smartphone application for automated wound measurement in patients with venous leg ulcers. Int. Wound J. 2023, 20, 751–760. [Google Scholar] [CrossRef]
  48. Wound Assessment Tool—imitoWound App. imito AG. Available online: https://imito.io/en/imitowound (accessed on 5 October 2024).
  49. Guarro, G.; Cozzani, F.; Rossini, M.; Bonati, E.; Del Rio, P. Wounds morphologic assessment: Application and reproducibility of a virtual measuring system, pilot study. Acta Biomed. Atenei Parm. 2021, 92, e2021227. [Google Scholar] [CrossRef]
  50. Schroeder, A.B.; Dobson, E.T.A.; Rueden, C.T.; Tomancak, P.; Jug, F.; Eliceiri, K.W. The ImageJ ecosystem: Open-source software for image visualization, processing, and analysis. Protein Sci. Publ. Protein Soc. 2021, 30, 234–249. [Google Scholar] [CrossRef]
  51. Sia, D.K.; Mensah, K.B.; Opoku-Agyemang, T.; Folitse, R.D.; Darko, D.O. Mechanisms of ivermectin-induced wound healing. BMC Vet. Res. 2020, 16, 397. [Google Scholar] [CrossRef]
  52. Khac, A.D.; Jourdan, C.; Fazilleau, S.; Palayer, C.; Laffont, I.; Dupeyron, A.; Verdun, S.; Gelis, A. mHealth App for Pressure Ulcer Wound Assessment in Patients With Spinal Cord Injury: Clinical Validation Study. JMIR MHealth UHealth 2021, 9, e26443. [Google Scholar] [CrossRef]
  53. WoundWiseIQ—Image Analytics. Improved Outcomes. Available online: https://woundwiseiq.com/ (accessed on 5 October 2024).
  54. Phung, S.L.; Bouzerdoum, A.; Chai, D. Skin segmentation using color pixel classification: Analysis and comparison. IEEE Trans. Pattern Anal. Mach. Intell. 2005, 27, 148–154. [Google Scholar] [CrossRef] [PubMed]
  55. Kuang, B.; Pena, G.; Szpak, Z.; Edwards, S.; Battersby, R.; Cowled, P.; Dawson, J.; Fitridge, R. Assessment of a smartphone-based application for diabetic foot ulcer measurement. Wound Repair Regen. 2021, 29, 460–465. [Google Scholar] [CrossRef] [PubMed]
  56. Younis, P.H.M.; El Sebaie, A.E.M.; Waked, I.S.; Bayoumi, M.B.I. Validity and Reliability of a Smartphone Application in Measuring Surface Area of Lower Limb Chronic Wounds. Egypt. J. Hosp. Med. 2022, 89, 6612–6616. [Google Scholar] [CrossRef]
  57. El-Rashidy, N.; El-Sappagh, S.; Islam, S.M.R.; El-Bakry, H.M.; Abdelrazek, S. Mobile Health in Remote Patient Monitoring for Chronic Diseases: Principles, Trends, and Challenges. Diagnostics 2021, 11, 607. [Google Scholar] [CrossRef] [PubMed]
  58. Ohura, N.; Mitsuno, R.; Sakisaka, M.; Terabe, Y.; Morishige, Y.; Uchiyama, A.; Okoshi, T.; Shinji, I.; Takushima, A. Convolutional neural networks for wound detection: The role of artificial intelligence in wound care. J. Wound Care 2019, 28, S13–S24. [Google Scholar] [CrossRef]
  59. Dabas, M.; Schwartz, D.; Beeckman, D.; Gefen, A. Application of Artificial Intelligence Methodologies to Chronic Wound Care and Management: A Scoping Review. Adv. Wound Care 2023, 12, 205–240. [Google Scholar] [CrossRef]
  60. Kim, P.J.; Homsi, H.A.; Sachdeva, M.; Mufti, A.; Sibbald, R.G. Chronic Wound Telemedicine Models Before and During the COVID-19 Pandemic: A Scoping Review. Adv. Skin Wound Care 2022, 35, 87–94. [Google Scholar] [CrossRef]
  61. Foltynski, P.; Ladyzynski, P. Internet service for wound area measurement using digital planimetry with adaptive calibration and image segmentation with deep convolutional neural networks. Biocybern. Biomed. Eng. 2023, 43, 17–29. [Google Scholar] [CrossRef]
  62. Lucas, Y.; Niri, R.; Treuillet, S.; Douzi, H.; Castaneda, B. Wound Size Imaging: Ready for Smart Assessment and Monitoring. Adv. Wound Care 2021, 10, 641–661. [Google Scholar] [CrossRef]
  63. Stoyanov, S.R.; Hides, L.; Kavanagh, D.J.; Zelenko, O.; Tjondronegoro, D.; Mani, M. Mobile app rating scale: A new tool for assessing the quality of health mobile apps. JMIR MHealth UHealth 2015, 3, e27. [Google Scholar] [CrossRef]
  64. Curti, N.; Merli, Y.; Zengarini, C.; Giampieri, E.; Merlotti, A.; Dall’Olio, D.; Marcelli, E.; Bianchi, T.; Castellani, G. Effectiveness of Semi-Supervised Active Learning in Automated Wound Image Segmentation. Int. J. Mol. Sci. 2023, 24, 706. [Google Scholar] [CrossRef] [PubMed]
  65. Curti, N.; Merli, Y.; Zengarini, C.; Starace, M.; Rapparini, L.; Marcelli, E.; Carlini, G.; Buschi, D.; Castellani, G.C.; Piraccini, B.M.; et al. Automated Prediction of Photographic Wound Assessment Tool in Chronic Wound Images. J. Med. Syst. 2024, 48, 14. [Google Scholar] [CrossRef] [PubMed]
  66. Deniz-Garcia, A.; Fabelo, H.; Rodriguez-Almeida, A.J.; Zamora-Zamorano, G.; Castro-Fernandez, M.; Ruano, M.d.P.A.; Solvoll, T.; Granja, C.; Schopf, T.R.; Callico, G.M.; et al. Quality, Usability, and Effectiveness of mHealth Apps and the Role of Artificial Intelligence: Current Scenario and Challenges. J. Med. Internet Res. 2023, 25, e44030. [Google Scholar] [CrossRef] [PubMed]
  67. General Data Protection Regulation (GDPR)|EUR-Lex. Available online: https://eur-lex.europa.eu/EN/legal-content/summary/general-data-protection-regulation-gdpr.html (accessed on 5 October 2024).
  68. Huang, P.-H.; Pan, Y.-H.; Luo, Y.-S.; Chen, Y.-F.; Lo, Y.-C.; Chen, T.P.-C.; Perng, C.-K. Development of a deep learning-based tool to assist wound classification. J. Plast. Reconstr. Aesthet. Surg. 2023, 79, 89–97. [Google Scholar] [CrossRef] [PubMed]
  69. Patel, Y.; Shah, T.; Dhar, M.K.; Zhang, T.; Niezgoda, J.; Gopalakrishnan, S.; Yu, Z. Integrated image and location analysis for wound classification: A deep learning approach. Sci. Rep. 2024, 14, 7043. [Google Scholar] [CrossRef]
  70. Malihi, L.; Hüsers, J.; Richter, M.L.; Moelleken, M.; Przysucha, M.; Busch, D.; Heggemann, J.; Hafer, G.; Wiemeyer, S.; Heidemann, G.; et al. Automatic Wound Type Classification with Convolutional Neural Networks. Stud. Health Technol. Inform. 2022, 295, 281–284. [Google Scholar] [CrossRef]
  71. Zhang, P. Image Enhancement Method Based on Deep Learning. Math. Probl. Eng. 2022, 2022, 6797367. [Google Scholar] [CrossRef]
  72. Chairat, S.; Chaichulee, S.; Dissaneewate, T.; Wangkulangkul, P.; Kongpanichakul, L. AI-Assisted Assessment of Wound Tissue with Automatic Color and Measurement Calibration on Images Taken with a Smartphone. Healthcare 2023, 11, 273. [Google Scholar] [CrossRef]
  73. Gagnon, J.; Probst, S.; Chartrand, J.; Lalonde, M. Self-supporting wound care mobile applications for nurses: A scoping review protocol. J. Tissue Viability 2023, 32, 79–84. [Google Scholar] [CrossRef]
Figure 1. Example of the automated wound assessment process using an AI-powered mobile app.
Figure 2. The radar charts provide a visual comparison of various wound monitoring apps based on six key criteria, namely platform availability, regulatory approval, inter-rater reliability, peer-reviewed studies, the disclosure of methods/algorithms, and the use of public datasets. Each chart represents a single app, highlighting strengths and weaknesses across these categories.
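For readers who wish to build a similar per-app comparison during app selection, the short Python sketch below shows one way to draw a single six-criterion radar chart with matplotlib. It is an illustrative example only: the scores and the 0–1 scale are hypothetical placeholders, not values reported in this review.

import numpy as np
import matplotlib.pyplot as plt

# Six criteria used in Figure 2; the scores below are hypothetical placeholders.
criteria = ["Platform availability", "Regulatory approval", "Inter-rater reliability",
            "Peer-reviewed studies", "Methods/algorithms disclosed", "Public datasets"]
scores = [1.0, 1.0, 0.5, 1.0, 0.0, 0.0]  # 0 = criterion absent, 1 = fully met (example values)

# Repeat the first point so the polygon closes.
angles = np.linspace(0, 2 * np.pi, len(criteria), endpoint=False).tolist()
angles += angles[:1]
values = scores + scores[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, values, linewidth=1.5)
ax.fill(angles, values, alpha=0.25)
ax.set_xticks(angles[:-1])
ax.set_xticklabels(criteria, fontsize=8)
ax.set_ylim(0, 1)
ax.set_title("Hypothetical app profile")
plt.tight_layout()
plt.show()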
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
