Self-Detection of Early Breast Cancer Application with Infrared Camera and Deep Learning

Abstract: Breast cancer is among the leading causes of death in women worldwide. A new tool for its early detection has been developed based on thermal imaging, deep convolutional neural networks, smartphone health applications, and cloud computing. Development of the smart app involved the Database for Mastology Research with Infrared Images (DMR-IR) and the training of a modified version of the Inception V4 deep convolutional neural network (MV4). The application was designed with a graphical user interface and linked to the AirDroid application to send thermal images from the smartphone to the cloud and to return the suggested diagnostic result from the cloud server to the smartphone. To verify proper operation of the app, sets of thermal images were sent from the smartphone to the cloud server from different distances and with different acquisition procedures to assess image quality. Four effects (blur, shake, tilt, and flip) were applied to the images to verify detection accuracy. After repeated experiments, the early-detection classification results generated by MV4 showed high accuracy. The response time from sending an image from the smartphone to the cloud until the diagnostic result returned via the AirDroid application was six seconds. The quality of the thermal images was not affected by the different distances and transfer methods, except when the images were compressed by 5%, 15%, and 26%; in that case, detection accuracy changed by at most 1%. In addition, detection accuracy increased by 0.0002% for blurred and shaken images, while it decreased by nearly 11% for tilted images.
Early detection of breast cancer using a thermal camera, a deep convolutional neural network, cloud computing, and smartphone health applications is a valuable and reliable complementary tool for radiologists to reduce mortality rates.


Introduction
New smartphone technology has become a strong competitor to computers: a smartphone is effectively a portable computer combining communication features with service applications. Recently, specialized applications have appeared across many areas of healthcare, making care easier to access and benefit from, encouraging self-responsibility, and improving access to healthcare in remote areas. This growth has produced a huge number of mobile healthcare applications, showing that it is now possible for individuals to take on much of the responsibility for their own healthcare. In 2018, there were about 600 applications for breast cancer awareness, screening, diagnosis, treatment, and disease management [1].
Mobile phone applications carry many advantages and have become part of daily life, including in medicine. The researchers in [8] show that the MOCHA application can follow up patients with breast cancer in terms of nutrition, physical activity, and routine contact with healthcare professionals; in the study, patients were followed over six months. The MOCHA system consists of three components: (1) the smartphone tool, (2) the application server, and (3) the provider's client. The smartphone tool and the provider's client are applications that run on the Android and iOS operating systems. The smartphone tool, a tracking and communication tool, is installed on the participants' phones; the provider's client is installed on the caregivers' phones to monitor participants' behavior and communicate in real time. The app is also used to guide the behavior of patients with breast cancer to prevent disease, and the work focuses on long-term behavior modification to reduce common comorbidities among breast cancer survivors and improve the quality of cancer care for this growing population.
The study presented in [9] indicates the feasibility of a smartphone application and social media intervention for improving the health outcomes of breast cancer survivors. The My Guide app includes a set of features that contribute to breast cancer self-awareness, including social communication, contact with the health awareness team, medical advice, educational audio, and prescriptions for patients with breast cancer. Twenty-five women participated in this four-week study with a remote coaching protocol (for example, encouraging use of the application and helping resolve barriers that the participant identified while using My Guide).
The researchers in [10] used an Android app (WalkON®) to collect daily walking steps, together with weekly distress scores gathered through app-based Distress Thermometer (DT) questionnaires, from participants over approximately 12 weeks. The study investigated the impact of a community-based mobile phone application on promoting walking and reducing distress among breast cancer survivors. Sixty-four participants (20-60 years old) took part. Participants were instructed to open and update the app at least once a week so their daily walking data would be sent to a central database. The mobile community showed a significant increase in weekly steps and a decrease in distress scores.
The researchers in [11] surveyed the mobile phone applications for breast cancer survivorship and self-management available on the Android and Apple operating systems. A total of 294 applications were screened against the following criteria: available in English, free of charge to the user, and having a digital rating available on the corresponding mobile app store. A content analysis was performed on the nine applications that met the inclusion criteria to assess the presence of the following mobile health self-management features derived from the chronic care model: symptom tracking; survivorship education; sharing information with family and/or caregivers; scheduling follow-up visits; personal alerts and reminders; and social networking. Survivorship education was the most common self-management feature among the apps reviewed, followed by social networking. The results highlight the paucity of mobile health resources available to breast cancer survivors.
The researchers in [12] describe the introduction of a smartphone application as a healthcare tool for patients with breast cancer. It provides patients with individually tailored information and a support group of peers and healthcare professionals. Online breast cancer support aims to enhance women's self-efficacy, social support, and symptom management. The study followed 108 women over six months. The application contains an educational forum, a discussion forum, an Ask Experts forum, and a forum for personal stories. The study demonstrated the role of self-efficacy and social support in reducing symptom distress and the value of using a theoretical framework when developing a breast cancer support intervention.
In [13], a comparison was made between a high-quality 640 × 480 pixel thermal camera and a small 160 × 120 pixel thermal camera. The thermal images were converted to grayscale and four features were extracted using a gray-level co-occurrence matrix (GLCM); the images were then classified using a k-Nearest Neighbor (k-NN) classifier. Classification accuracy exceeded 98% for both cameras, and the method achieved 99.21% accuracy with k-NN, outperforming traditional methods. With the rapid development of smartphones incorporating advanced digital camera technology, such tools will become more accessible as auxiliary screening tools for early breast cancer detection, reducing heavy costs, strict technical limitations, and the strain on scarce medical resources.
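The GLCM-plus-k-NN pipeline of [13] can be sketched compactly. The Python below is an illustrative stand-in for the authors' method, not their code: it builds a normalized co-occurrence matrix for a single pixel offset, extracts two classic texture features (contrast and energy), and classifies with a majority-vote k-NN.

```python
from collections import Counter
import math

def glcm(image, levels, dx=1, dy=0):
    """Normalized gray-level co-occurrence matrix for one pixel offset."""
    h, w = len(image), len(image[0])
    mat = [[0.0] * levels for _ in range(levels)]
    total = 0
    for y in range(h):
        for x in range(w):
            ny, nx = y + dy, x + dx
            if 0 <= ny < h and 0 <= nx < w:
                mat[image[y][x]][image[ny][nx]] += 1
                total += 1
    return [[v / total for v in row] for row in mat]

def glcm_features(p):
    """Two classic GLCM texture features: contrast and energy."""
    n = len(p)
    contrast = sum(p[i][j] * (i - j) ** 2 for i in range(n) for j in range(n))
    energy = sum(v * v for row in p for v in row)
    return [contrast, energy]

def knn_predict(train, query, k=3):
    """train: list of (feature_vector, label); majority vote of k nearest."""
    nearest = sorted(train, key=lambda t: math.dist(t[0], query))[:k]
    return Counter(label for _, label in nearest).most_common(1)[0][0]
```

Feature vectors computed by `glcm_features` on training thermograms would form the `train` list; the choice of offset, number of gray levels, and k are assumptions for the sketch.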
The researchers in [14] used a mobile phone equipped with a thermal camera. Images captured by the thermal imager (a Cat® S60 equipped with a FLIR™ Lepton) are transmitted to an FPGA via an ultra-low-power Bluetooth link and stored on an SD card; gray-level co-occurrence matrix (GLCM) and run-length matrix (RLM) features are then computed and fed into a machine learning (ML) classifier for early detection. The breast cancer screening processor targets a portable home environment and achieves a sensitivity of 79.06% and a specificity of 88.57%.
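The sensitivity and specificity figures reported in [14] come from confusion-matrix counts. A minimal Python sketch (the labels in the test below are made up for illustration):

```python
def sensitivity_specificity(y_true, y_pred, positive=1):
    """Sensitivity = TP/(TP+FN); specificity = TN/(TN+FP)."""
    tp = sum(t == positive and p == positive for t, p in zip(y_true, y_pred))
    tn = sum(t != positive and p != positive for t, p in zip(y_true, y_pred))
    fn = sum(t == positive and p != positive for t, p in zip(y_true, y_pred))
    fp = sum(t != positive and p == positive for t, p in zip(y_true, y_pred))
    return tp / (tp + fn), tn / (tn + fp)
```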
Previous studies on early breast cancer diagnosis have not explored the Inception family of deep learning algorithms. Our proposed system uses Inception V3, Inception V4, and the modified Inception MV4 [15]. Furthermore, previous studies indicate a lack of home diagnostic tools for early breast cancer detection. Study [14] used a mobile phone app to collect images with a thermal camera and transmit them over Bluetooth to machine learning code running on a nearby FPGA card; however, such solutions may suffer from low accuracy. In this paper, the proposed system uses Inception V3, Inception V4, and the modified Inception MV4 with very high accuracy and efficiency, allowing detection of breast cancer at an early stage and supporting regular, continuous follow-up examinations without violating patient privacy or introducing any side effects.

Materials and Methods
The breast cancer screening scheme uses a mobile app connected to a thermal camera. We present the outline first and then describe each process in detail. Figure 1 illustrates our approach to breast cancer screening using a mobile app. The main processes are as follows. First, we train the deep convolutional neural network model Inception MV4. Second, we create a graphical user interface in MATLAB's Graphical User Interface Development Environment (GUIDE), using visual elements such as icons, buttons, scroll bars, windows, and boxes to simplify human-computer interaction. Third, we use cloud computing for the intensive computation and large-scale data processing that deep convolutional neural networks require. Fourth, we use the mobile application to send thermal images to the cloud, receive the diagnostic results, and display them on the user's smartphone screen. In addition, sets of thermal images were sent from the smartphone to the cloud from different distances and by different methods to verify image quality, as shown in Figure 2. Four effects (blur, shake, tilt, and flip) were applied to the thermal images to verify detection accuracy.

Deep Learning in Matlab
Deep learning is a branch of machine learning that uses deep convolutional neural networks to extract features directly from a database, which allows it to achieve classification accuracy that can exceed human performance. We used a deep convolutional neural network consisting of 192 layers. Training a deep convolutional neural network requires a database that includes thermal images of healthy breasts and breasts with cancer. Breast thermal images were downloaded from the dynamic thermogram DMR-IR dataset, the deep convolutional neural network model Inception MV4 was loaded, the learning rate was set, and the optimization method was chosen. We then divided the database into training and testing subsets.
Our app employs the deep learning algorithms we presented in [15], namely Inception V3, Inception V4, and the modified Inception MV4. The modified Inception MV4 was developed for higher detection accuracy and faster arithmetic operations compared to Inception V3 and Inception V4; the major change in MV4 is that the number of layers in Inception B is smaller than in Inception V4. Table 1 compares the deep convolutional neural networks Inception V3, Inception V4, and Inception MV4. All filter sizes were 3 × 3 and 1 × 1, with average pooling and max pooling [15].
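The division of the database into training and testing subsets can be sketched as a reproducible shuffle-and-split. This is an illustrative Python stand-in for the MATLAB workflow; the 70/30 ratio is an assumed placeholder, not a value stated in the paper.

```python
import random

def split_dataset(items, train_frac=0.7, seed=0):
    """Shuffle reproducibly, then split into training and testing subsets.

    train_frac=0.7 is an assumed example ratio, not the paper's value.
    """
    rng = random.Random(seed)
    shuffled = items[:]
    rng.shuffle(shuffled)
    cut = int(len(shuffled) * train_frac)
    return shuffled[:cut], shuffled[cut:]
```

Fixing the seed keeps the split identical across runs, which matters when comparing Inception V3, V4, and MV4 on the same data.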

Graphical User Interface Development Environment (GUIDE)
The advantage of a graphical user interface is that it is usable by someone who does not know MATLAB. The interface is built from a set of elements such as buttons, scroll bars, windows, and boxes to simplify its use. The interface created in Figure 4 contains a dedicated area for displaying the thermal image and showing the diagnostic result. Auxiliary fields were added for the patient's name, age, gender, and room temperature, along with some questions about the patient's condition before the examination and a button to reset all fields in the GUIDE. The interface shows one of two messages to the patient: if there is a suspicion of cancer, it displays "It is advisable to pay a visit to a specialist clinic"; otherwise, it displays "You are Safe", as shown in Figure 4. In addition, the interface is linked to two folders: one for the input diagnostic thermal image and one for storing the diagnostic result.
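The two GUIDE messages can be driven by thresholding the classifier's output. The sketch below is illustrative Python rather than the MATLAB GUIDE code, and the 0.5 cut-off is an assumption, not a value from the paper:

```python
def diagnostic_message(cancer_probability, threshold=0.5):
    """Map the classifier output to the two GUIDE messages.

    threshold=0.5 is an illustrative assumption, not the paper's value.
    """
    if cancer_probability >= threshold:
        return "It is advisable to pay a visit to a specialist clinic"
    return "You are Safe"
```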




Cloud Computing
Cloud computing can process large volumes of data with low cost, high performance, and effectively unlimited storage, and its use is therefore increasing rapidly [16]. It also allows a set of high-specification GPUs to be added. A PC was used as the cloud computing platform: thermal images received from the smartphone were processed on the PC, and the results were sent back to the smartphone via the application, as shown in Figure 5.

Smartphone Health Application
Mobile health applications provide healthcare via mobile devices and have become popular in recent times due to people's interest in public health. Some health apps rely on periodic monitoring and are usually connected to sensors to collect data such as heart rate, or track the exact geographical location [17]. These applications therefore provide new solutions for digital health services. Previous studies have addressed breast cancer recurrence and prevention, but no primary diagnostic aid for breast cancer has been reported. By pairing a thermal camera with a smartphone application, we add a feature for early detection of breast cancer, as shown in Figure 6. The proposal is an application that transfers data from the smartphone to the cloud computing platform and returns the results from the cloud platform to the mobile phone, as shown in Figure 7.
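In our system, the transfer itself is handled by AirDroid, but the general idea of shipping an image to a cloud endpoint can be sketched as packaging the image bytes plus metadata into a JSON payload. This Python sketch is illustrative only; the field names are hypothetical and are not part of the AirDroid protocol.

```python
import base64
import json

def pack_image_for_upload(image_bytes, patient_id):
    """Bundle image bytes and metadata into a JSON payload (hypothetical fields)."""
    return json.dumps({
        "patient_id": patient_id,
        "image_b64": base64.b64encode(image_bytes).decode("ascii"),
    })

def unpack_image(payload):
    """Reverse of pack_image_for_upload, as the cloud side would do."""
    msg = json.loads(payload)
    return msg["patient_id"], base64.b64decode(msg["image_b64"])
```

Base64 keeps the binary thermal image intact inside a text transport; the cloud side decodes it before running the classifier.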
Figure 5. Diagnostic results on the application interface.

Figure 1 shows the steps for implementing early detection of breast cancer using a smartphone application equipped with a thermal camera. We used a PC (Core i7, 36 GB RAM, GTX 1660 GPU with 6 GB RAM), MATLAB version 2020a, the DMR-IR database, a Huawei smartphone, a FLIR One Pro thermal camera, and the AirDroid app. A deep convolutional neural network was designed as Inception MV4. To train it, we set the learning rate to 1 × 10⁻⁴ and used the Stochastic Gradient Descent with Momentum (SGDM) optimization method. After training the model and verifying it on a set of tests, we transferred it to MATLAB's GUIDE. The user interface is designed in two parts: the first displays the thermal image and the other the diagnostic result, as shown in Figures 8 and 9. The interface is programmed to automatically read thermal images from a folder (input images) created on the desktop and to write the diagnostic results to a folder (diagnostic output) on the desktop.
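The SGDM update combines the gradient with a velocity term. A minimal Python sketch of one parameter update, using the paper's learning rate of 1 × 10⁻⁴ by default and an assumed momentum of 0.9 (a common default; the paper does not state its value):

```python
def sgdm_step(w, grad, velocity, lr=1e-4, momentum=0.9):
    """One stochastic-gradient-descent-with-momentum update.

    lr matches the paper's 1e-4; momentum=0.9 is an assumed default.
    """
    velocity = [momentum * v - lr * g for v, g in zip(velocity, grad)]
    w = [wi + vi for wi, vi in zip(w, velocity)]
    return w, velocity
```

The velocity accumulates past gradients, which smooths the descent direction across noisy mini-batches.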

Experiment Setup
The AirDroid application was installed on the smartphone and on the desktop to transfer thermal images from the smartphone to the cloud and to send the diagnostic results from the cloud server back to the smartphone. A group of thermal images captured with a thermal camera (model FLIR One Pro) attached to the smartphone in a Shiraz hospital was tested, as shown in Figures 8 and 9. A comparison of the performance of the three deep learning algorithms implemented in the app is shown in Table 1 below.
The experiments used two sources of thermal images: the DMR-IR database, and a FLIR One Pro connected to a smartphone (thermography from the Shiraz cancer hospital). The first experiment used six thermal images from the database, including three healthy images and three breast cancer images. The second experiment used five thermal images taken with the FLIR One Pro, all from patients with breast cancer.
To study the factors affecting the quality of the thermal images, they were sent from the smartphone to the cloud in different ways. In the first stage, thermal images from the database and from the FLIR One Pro were sent to the cloud via Wi-Fi in different scenarios (1 m, 5 m, and 7 m, all without barriers). In the second stage, the images were sent via Wi-Fi with barriers between the smartphone and the access point (one wall, two walls, a roof, a roof with one wall, and a roof with two walls). In the third stage, the images were sent by cable, and in the fourth stage via the 4G network. In the fifth stage, the thermal images were compressed by different percentages (5%, 15%, and 26%) and then sent from the smartphone to the cloud.
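Image quality after transfer or compression can be quantified with full-reference metrics. The paper uses eight metrics that are not enumerated here, so the two below (mean squared error and PSNR) are common stand-ins for illustration, sketched in Python on raw pixel grids:

```python
import math

def mse(a, b):
    """Mean squared error between two same-sized pixel grids."""
    n = len(a) * len(a[0])
    return sum((pa - pb) ** 2
               for ra, rb in zip(a, b)
               for pa, pb in zip(ra, rb)) / n

def psnr(a, b, max_val=255.0):
    """Peak signal-to-noise ratio in dB; higher means the received image
    is closer to the original. Identical images give infinity."""
    m = mse(a, b)
    return float("inf") if m == 0 else 10 * math.log10(max_val ** 2 / m)
```

An unchanged transfer (stages one through four) would give infinite PSNR, while compression (stage five) yields a finite value that drops as quality degrades.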
In addition, eight metrics were used to measure the quality of the thermal images. Image blur occurs when the camera moves during the exposure; in a thermal camera, blurring around temperature gradients may lead to incorrect pixel readings and unreliable thermal images [18]. The researchers in [19] studied the extent to which blurring affects the accuracy of breast cancer detection. Tilting a thermal image is a common processing step when training a deep convolutional neural network; the tilt depends on the angle and the pivot point [20]. The goal of blurring, flipping, and tilting is to enrich network training and achieve the best diagnostic outcome [21]. Furthermore, the increasing acquisition of thermal images by inexperienced users introduces many distortions, including shaken images caused by camera shake [22]. Accordingly, four factors affecting the accuracy of thermal imaging diagnostics were applied (blurred images in Figure 10, flipped images in Figure 11, tilted images in Figure 12, and shaken images in Figure 13), and the diagnostic accuracy was verified for each case.
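Two of the four image effects, flipping and blurring, are easy to sketch on a raw pixel grid. The Python below is a simplified stand-in for whatever image tooling was actually used; the 3 × 3 box blur is an assumed example kernel:

```python
def flip_horizontal(img):
    """Mirror each row of a 2D pixel grid."""
    return [row[::-1] for row in img]

def box_blur(img):
    """3x3 mean filter; edge pixels average over the neighbors that exist."""
    h, w = len(img), len(img[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            vals = [img[ny][nx]
                    for ny in range(max(0, y - 1), min(h, y + 2))
                    for nx in range(max(0, x - 1), min(w, x + 2))]
            out[y][x] = sum(vals) / len(vals)
    return out
```

Flipping preserves every pixel value (consistent with flipped images keeping full detection accuracy), whereas blurring averages neighboring temperatures and can shift pixel readings.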

Results and Discussion
We conducted many experiments applying the proposed method, and the evaluation results showed successful detection of breast cancer at an early stage. The deep convolutional neural network model Inception MV4 showed excellent performance in the training phase, with diagnostic accuracy reaching 100%. Moreover, the diagnostic period is only 6 s from sending the image through the application until receiving the diagnostic result on the smartphone interface. The size of the application and user interface reached 1.5 GB due to the large underlying database. In addition, thermal image processing with deep convolutional neural networks requires a high-speed GPU not available in smartphones, so cloud computing provides the fast diagnosis and returns the results to the smartphone application.
The results showed that the thermal images (from the database and from the FLIR One Pro thermal camera) sent from the smartphone to the cloud computing server via Wi-Fi, either directly or through barriers, did not change in quality, as shown in Tables 2 and 3. However, when thermal images from the database were compressed and sent from the smartphone to the cloud, their quality changed, as shown in Table 4. When the thermal images were compressed by 5%, 15%, and 26%, the diagnostic accuracy changed by a very small amount, at most 0.0001%, along with the change in image quality. For the compressed (5%, 15%, and 26%) thermal images from the FLIR One Pro, diagnostic accuracy changed by at most 0.1% compared to the original images, along with the change in image quality (Table 5).
In the second part of the experiment, the results indicated that diagnostic accuracy fluctuated under the four applied factors. For the thermal images from the database, detection accuracy on the first, second, and third healthy thermal images decreased by at most 1.6% for blurred, tilted, and shaken images, while the flipped images maintained 100% detection accuracy. For the breast cancer thermograms, detection accuracy increased by 0.0002% for blurred and shaken images, while it decreased by nearly 11% for tilted images; for the flipped images, the accuracy remained the same as for the original images (Table 6).
On the other hand, in the experiment using FLIR One Pro thermal images, compressing the images and sending them from the smartphone to the cloud computing server decreased the quality of the transmitted images. This change in quality affected detection accuracy only slightly, with a fluctuation of around 0.03%. When blurred, shaken, and flipped images were used, detection accuracy increased by a small percentage (about 0.4% at most), but with tilted images the detection accuracy fluctuated by ±0.5% (Table 7).

Results and Discussion
We conducted many experiments applying the proposed method, and the evaluation results showed successful detection of breast cancer at an early stage. The deep convolutional neural network model MV4 (the modified Inception V4) showed excellent performance in the training phase, reaching a diagnostic accuracy of 100%. Moreover, the diagnostic period is only 6 s from sending the image through the application until receiving the diagnostic result in the interface on the smartphone. The results showed that the application and its user interface files amounted to 1.5 GB because of their large database. In addition, thermal image processing with deep convolutional neural networks requires a high-speed GPU that is not available in smartphones, so cloud computing is used to perform the diagnosis quickly and send the result back to the smartphone application.
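The round-trip workflow described above (image out, suggestive diagnosis back, about 6 s end to end) can be sketched as a simple timed client. This is an illustrative sketch only: `send` and `receive` are hypothetical stand-ins for the AirDroid transfer and the cloud-side MV4 classifier, not their actual APIs.

```python
import time

def diagnose_remote(image_bytes, send, receive):
    """Time one diagnosis round trip: upload the thermal image,
    then wait for the suggestive diagnostic result.

    `send` and `receive` are placeholders for the transfer layer
    (AirDroid in the paper) and the cloud-side classifier.
    """
    t0 = time.perf_counter()
    token = send(image_bytes)           # upload image to the cloud
    result = receive(token)             # retrieve the diagnostic label
    elapsed = time.perf_counter() - t0  # ~6 s in the reported setup
    return result, elapsed
```

With stub transfer functions the helper returns the label together with the measured latency, so the same code path that produces the diagnosis also reports the response time.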
The results showed that the thermal images (from the database and from the FLIR One Pro thermal camera) sent from the smartphone to the cloud computing server via Wi-Fi, either directly or with barriers, did not change in quality, as shown in Tables 2 and 3. However, when thermal images from the database were compressed and sent from the smartphone to the cloud, their quality changed, as shown in Table 4. When these images were compressed by 5%, 15%, and 26%, the diagnostic accuracy changed by at most 0.0001% as the image quality changed. For the FLIR One Pro thermal images compressed by the same ratios, the diagnostic accuracy changed by at most 0.1% compared with the original images (Table 5).

In the second part of the experiment, the results indicated that the four image effects caused the diagnostic accuracy to fluctuate. Using thermal images from the database, detection accuracy for the first, second, and third healthy thermal images decreased by at most 1.6% for Blurry, Tilted, and Shaken images, whereas the Flipped images maintained 100% detection accuracy. For the breast cancer thermograms, detection accuracy increased by 0.0002% for Blurry and Shaken images, while it decreased by nearly 11% for Tilted images; for Flipped images, the accuracy remained the same as for the original images (Table 6).
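The four image effects (Blur, Shaken, Tilted, Flipped) can be mimicked with simple array operations. The sketch below, assuming a 2-D grayscale thermogram stored as a float NumPy array, is an illustrative approximation of such perturbations, not the exact transforms used in the experiments.

```python
import numpy as np

def blur(img, k=3):
    """Box blur: average each pixel over its k x k neighbourhood."""
    out = np.zeros_like(img, dtype=float)
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    for dy in range(k):
        for dx in range(k):
            out += padded[dy:dy + img.shape[0], dx:dx + img.shape[1]]
    return out / (k * k)

def shake(img, shift=2):
    """Simulate camera shake by blending the image with a shifted copy."""
    return (img + np.roll(img, shift, axis=1)) / 2.0

def tilt(img, shear=0.1):
    """Approximate a small tilt with a horizontal shear (row-wise shift)."""
    out = np.empty_like(img)
    for y in range(img.shape[0]):
        out[y] = np.roll(img[y], int(shear * y))
    return out

def flip(img):
    """Horizontal mirror of the image."""
    return img[:, ::-1]
```

Applying each transform to a thermogram before classification makes it possible to compare the model's accuracy on the perturbed image against the original, which is the comparison reported in Tables 6 and 7.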
On the other hand, in the experiment in which FLIR One Pro thermal images were compressed and sent from the smartphone to the cloud computing server, the quality of the transmitted images decreased. This change in image quality had only a minor effect on detection, as the results showed a very slight fluctuation in detection accuracy of about 0.03%. In addition, for Blurry, Shaken, and Flipped images, detection accuracy increased slightly (by at most about 0.4%), whereas for Tilted images it fluctuated by ±0.5% (Table 7).
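The quality loss introduced by compression can be quantified by comparing each compressed image with its original, for example via PSNR. The sketch below models compression crudely as coarser pixel quantisation (a hypothetical model, not a real image codec); only the compression ratios 5%, 15%, and 26% are taken from the paper.

```python
import numpy as np

def compress(img, ratio):
    """Crudely mimic lossy compression by quantising pixel values;
    a larger `ratio` gives coarser quantisation (illustrative model,
    not an actual JPEG codec)."""
    step = max(1, int(256 * ratio))
    return (img // step) * step

def psnr(a, b):
    """Peak signal-to-noise ratio in dB for 8-bit images."""
    mse = np.mean((a.astype(float) - b.astype(float)) ** 2)
    return float("inf") if mse == 0 else 10.0 * np.log10(255.0 ** 2 / mse)

# Quality drop at the paper's three compression levels:
img = np.arange(256, dtype=np.uint8).reshape(16, 16)
for ratio in (0.05, 0.15, 0.26):
    print(ratio, round(psnr(img, compress(img, ratio)), 1))
```

Stronger compression yields a lower PSNR, mirroring the observed pattern in which only the compressed transmissions changed image quality while the accuracy shift stayed small.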

Conclusions
Health applications on smartphones have contributed to a growing culture of self-care. Previous studies have sought to reduce the incidence of breast cancer, but a primary diagnostic tool compatible with the health applications of modern smartphones is still needed. The current paper proposes a home-based automated diagnostic tool built from smartphone applications, cloud computing, and a thermal camera. The experimental results confirm the effectiveness of the proposal, as the accuracy in breast cancer detection reached 100%. We conclude that breast thermography combined with smartphone health applications and cloud computing is a good tool for early detection of breast cancer, especially for remote areas and elderly patients, in addition to providing features related to health education, rapid response, and periodic follow-up for patients. As a smartphone application, this tool represents a quantum leap in the efficiency of initial self-diagnosis. Moreover, the technology supports repeated use and can serve as a family diagnostic tool. Additionally, the results show that the quality of the thermal images was not affected by the different distances and acquisition methods, except when the images were compressed by 5%, 15%, and 26%; under that compression, the maximum change in detection accuracy was 1%. The results also indicate that detection accuracy increased by 0.0002% for Blurry and Shaken images, while it decreased by nearly 11% for Tilted images. Future work could lead to an integrated health application that includes health education, periodic follow-up, communication with the health care center, updating patient data, prescriptions, and exercise, as well as applying this technique to detect other diseases such as lung cancer and foot ulcers.
Moreover, future work should apply a set of effects to the thermal images before training the deep convolutional neural network. Future work should also focus on improving classification and detection accuracy across different age groups, genders, and comorbid medical preconditions, factors that previous studies have ignored. An interesting fault diagnosis method that combines thermal imaging and machine learning is introduced in [23]; future work may look into the possible application of this method to the early detection of breast cancer.