Search Results (12)

Search Parameters:
Authors = Mazliham Mohd Su’ud

37 pages, 12595 KiB  
Article
A Systematic Parameter Analysis of Cloud Simulation Tools in Cloud Computing Environments
by Muhammad Asim Shahid, Muhammad Mansoor Alam and Mazliham Mohd Su’ud
Appl. Sci. 2023, 13(15), 8785; https://doi.org/10.3390/app13158785 - 29 Jul 2023
Cited by 12 | Viewed by 5439
Abstract
Large-scale cloud data centers are required to provide applications across many domains. Cloud computing enables access to nearly infinite computing resources on demand, and as it grows in popularity, researchers in this field must conduct real-world experiments; configuring and running such tests in an actual cloud environment, however, is costly. Modeling and simulation methods are therefore acceptable solutions for emulating cloud computing environments. This paper reviews several cloud computing simulation tools from the literature, presents the most effective simulation methods in this research domain, and analyzes a variety of cloud simulation tools. CloudSim, CloudSim Plus, CloudAnalyst, iFogSim, and CloudReports were evaluated, and a parametric evaluation of the tools is presented based on the identified parameters. Several five-parameter tests were performed to demonstrate the capabilities of each cloud simulator, and the results show the value of the proposed simulation system. The five simulators are used to evaluate host processing elements; virtual machine processing elements; cloudlet processing elements; user-base average, minimum, and maximum; and cloudlet start time, finish time, average start, and average finish, and the outcomes compare the simulators on these metrics. After reading this paper, the reader will be able to compare popular simulators in terms of supported models, architecture, and high-level features. The goal is to provide insight into each tool's features and functionality, and guidelines to help researchers choose the tool best suited to their work.
(This article belongs to the Special Issue Advanced Technology of Intelligent Control and Simulation Evaluation)
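The timing metrics compared across the simulators (cloudlet start time, finish time, and their averages) reduce to simple aggregates over per-cloudlet records. A minimal stand-alone sketch of that reduction; the `Cloudlet` record and its values are hypothetical illustrations, not taken from the paper:

```python
from dataclasses import dataclass


@dataclass
class Cloudlet:
    """One simulated unit of work, as a cloud simulator might report it."""
    cloudlet_id: int
    start_time: float   # simulation time at which execution began
    finish_time: float  # simulation time at which execution completed


def timing_summary(cloudlets):
    """Average start/finish times across a batch, mirroring a simulator report."""
    n = len(cloudlets)
    avg_start = sum(c.start_time for c in cloudlets) / n
    avg_finish = sum(c.finish_time for c in cloudlets) / n
    return {"avg_start": avg_start, "avg_finish": avg_finish}


batch = [Cloudlet(0, 0.1, 10.1), Cloudlet(1, 0.1, 12.3), Cloudlet(2, 0.2, 9.8)]
summary = timing_summary(batch)
```

Real simulators such as CloudSim produce these figures from their event engines; the point here is only what the compared averages mean.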

26 pages, 903 KiB  
Article
Improving Reliability for Detecting Anomalies in the MQTT Network by Applying Correlation Analysis for Feature Selection Using Machine Learning Techniques
by Imran, Megat Farez Azril Zuhairi, Syed Mubashir Ali, Zeeshan Shahid, Muhammad Mansoor Alam and Mazliham Mohd Su’ud
Appl. Sci. 2023, 13(11), 6753; https://doi.org/10.3390/app13116753 - 1 Jun 2023
Cited by 8 | Viewed by 3151
Abstract
Anomaly detection (AD) has attracted a significant amount of research focus in recent years with the rise of Internet of Things (IoT) applications. Anomalies, often known as outliers, are anomalous occurrences or observations that differ considerably from the mainstream of the data. The IoT is a network of Internet-based digital sensors that continuously generate massive volumes of data and communicate with one another using the Message Queuing Telemetry Transport (MQTT) protocol. Brute-force, Denial-of-Service (DoS), Malformed, Flood, and Slowite attacks are the most common in the MQTT network. One of the significant factors in IoT AD is the time consumed to predict an attack and take preemptive measures; if an attack is detected too late, the damage it causes is irreversible. This paper investigates the time to detect an attack using machine learning approaches and proposes a novel approach that applies correlation analysis to reduce the training and testing time of these algorithms. The new approach has been evaluated on Random Forest, Decision Tree, Naïve Bayes, Multi-Layer Perceptron, Artificial Neural Network, Logistic Regression, and Gradient Boost. The findings indicate that correlation analysis is significantly beneficial in feature engineering, primarily for determining the most relevant features in the MQTT dataset. This is, to the best of our knowledge, the first study on MQTTset to reduce the prediction time: for DoS, from 0.92 (95% CI −0.378, 2.22) to 0.77 (95% CI −0.414, 1.97), and for Malformed, from 2.92 (95% CI −2.6, 8.44) to 0.49 (95% CI −0.273, 1.25).
(This article belongs to the Special Issue Machine Learning for Network Security)
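The correlation-analysis step described above, keeping a feature only if it is not highly correlated with one already kept, so the models train on fewer columns, can be sketched in a few lines. The `drop_correlated` helper, the 0.9 threshold, and the toy columns are illustrative assumptions, not the paper's implementation:

```python
import math


def pearson(xs, ys):
    """Pearson correlation coefficient of two equal-length sequences."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = math.sqrt(sum((x - mx) ** 2 for x in xs))
    sy = math.sqrt(sum((y - my) ** 2 for y in ys))
    return cov / (sx * sy)


def drop_correlated(features, threshold=0.9):
    """Greedily keep a feature only if |r| with every kept feature is
    below the threshold; fewer columns means faster training/testing."""
    kept = []
    for name, column in features.items():
        if all(abs(pearson(column, features[k])) < threshold for k in kept):
            kept.append(name)
    return kept


# Toy data: f2 is a perfect linear copy of f1, f3 is weakly correlated.
data = {
    "f1": [1.0, 2.0, 3.0, 4.0],
    "f2": [2.0, 4.0, 6.0, 8.0],   # r(f1, f2) = 1.0, so f2 is dropped
    "f3": [4.0, 1.0, 3.0, 2.0],
}
selected = drop_correlated(data)   # ["f1", "f3"]
```

On a real dataset such as MQTTset one would compute the matrix with pandas and then retrain the classifiers on the surviving columns only.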

55 pages, 29708 KiB  
Article
Achieving Reliability in Cloud Computing by a Novel Hybrid Approach
by Muhammad Asim Shahid, Muhammad Mansoor Alam and Mazliham Mohd Su’ud
Sensors 2023, 23(4), 1965; https://doi.org/10.3390/s23041965 - 9 Feb 2023
Cited by 17 | Viewed by 3566
Abstract
Cloud computing (CC) benefits and opportunities are among the fastest growing technologies in the computer industry. Cloud computing's challenges include resource allocation, security, quality of service, availability, privacy, data management, performance compatibility, and fault tolerance. Fault tolerance (FT) refers to a system's ability to continue performing its intended task in the presence of defects. Fault-tolerance challenges include heterogeneity and a lack of standards, the need for automation, cloud downtime reliability, and consideration of recovery point objectives, recovery time objectives, and cloud workload. The proposed research applies machine learning (ML) algorithms, namely naïve Bayes (NB), library support vector machine (LibSVM), multinomial logistic regression (MLR), sequential minimal optimization (SMO), K-nearest neighbor (KNN), and random forest (RF), together with a fault-tolerance method known as delta-checkpointing, to achieve higher accuracy, a lower fault-prediction error, and greater reliability. The secondary data were collected from the experimental high-performance computing (HPC) system at the Swiss Federal Institute of Technology (ETH), Zurich, and the primary data were generated using virtual machines (VMs) to select the best machine learning classifier. The secondary and primary data were divided into split ratios of 80/20 and 70/30, respectively, and five-fold cross-validation was used to assess accuracy and fault prediction in terms of true, false, repair, and failure states of the virtual machines. The secondary-data results show that naïve Bayes performed exceptionally well on CPU-Mem mono and multi blocks, and sequential minimal optimization performed very well on HDD mono and multi blocks, in terms of accuracy and fault prediction. The primary-data results revealed that random forest performed very well in accuracy and fault prediction but had poor time complexity, while sequential minimal optimization had good time complexity with only minor differences from random forest in accuracy and fault prediction; we therefore modified sequential minimal optimization. Finally, the modified sequential minimal optimization (MSMO) algorithm combined with the delta-checkpointing (D-CP) fault-tolerance method is proposed to improve accuracy, fault-prediction error, and reliability in cloud computing.
(This article belongs to the Special Issue Fuzzy Systems and Neural Networks for Engineering Applications)
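Delta-checkpointing in general stores only the state that changed since the previous checkpoint, so each checkpoint stays small while recovery replays the deltas in order. A minimal sketch of that general idea; the `DeltaCheckpointer` class and its dictionary-based state are hypothetical illustrations, not the paper's D-CP method:

```python
import copy


class DeltaCheckpointer:
    """Minimal delta-checkpointing: the first checkpoint stores the full
    state; later checkpoints store only the keys that changed since the
    last one. Recovery rebuilds state by replaying the deltas in order."""

    def __init__(self):
        self._checkpoints = []   # list of deltas, oldest first
        self._last = {}          # state as of the most recent checkpoint

    def checkpoint(self, state):
        delta = {k: v for k, v in state.items() if self._last.get(k) != v}
        self._checkpoints.append(copy.deepcopy(delta))
        self._last = copy.deepcopy(state)

    def restore(self):
        state = {}
        for delta in self._checkpoints:
            state.update(delta)
        return state


cp = DeltaCheckpointer()
cp.checkpoint({"task": 1, "progress": 0.2})
cp.checkpoint({"task": 1, "progress": 0.7})   # only "progress" is stored
recovered = cp.restore()                      # full state reassembled
```

A production system would also periodically write a full checkpoint so the replay chain stays bounded; that refinement is omitted here for brevity.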

24 pages, 5843 KiB  
Article
Performance Evaluation of Load-Balancing Algorithms with Different Service Broker Policies for Cloud Computing
by Muhammad Asim Shahid, Muhammad Mansoor Alam and Mazliham Mohd Su’ud
Appl. Sci. 2023, 13(3), 1586; https://doi.org/10.3390/app13031586 - 26 Jan 2023
Cited by 38 | Viewed by 6125
Abstract
Cloud computing has seen a major boom during the past few years. Many people have switched to cloud computing because traditional systems require complex resource distribution and cloud solutions are less expensive. Load balancing (LB) is one of the essential challenges in cloud computing, used to balance the workload of cloud services. This research paper presents a performance evaluation of existing load-balancing algorithms: particle swarm optimization (PSO), round robin (RR), equally spread current execution (ESCE), and throttled load balancing. The study offers a detailed performance evaluation of these algorithms using the CloudAnalyst platform. Efficiency under various service broker policy configurations was calculated for each algorithm's virtual machine load balancing using metrics such as optimized response time (ORT), data center processing time (DCPT), virtual machine costs, data transfer costs, and total cost for different workloads and user bases. Many past papers in the literature studied round robin, equally spread current execution, and throttled load-balancing algorithms only in terms of efficiency and response time in virtual machines, without recognizing the relationship between the tasks and the virtual machines or the practical significance of the application. Here, a comparison of specific load-balancing algorithms has been investigated, and different service broker policy (SBP) tests have been conducted to illustrate the load-balancing algorithms' capabilities.
(This article belongs to the Special Issue Advanced Technology of Intelligent Control and Simulation Evaluation)
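Two of the compared policies differ mainly in whether they consult VM load: round robin cycles through VMs blindly, while a throttled balancer checks a load cap before assigning. A toy sketch of that contrast; the function names and the per-VM cap are illustrative assumptions, not CloudAnalyst's implementation:

```python
from itertools import cycle


def round_robin(vm_ids, requests):
    """Round robin: assign each request to the next VM in a fixed cycle,
    regardless of how loaded that VM already is."""
    ring = cycle(vm_ids)
    return [(req, next(ring)) for req in requests]


def throttled(vm_load, max_per_vm, requests):
    """Throttled: send a request to the first VM below its load cap;
    when every VM is saturated the request is queued (None here)."""
    assignments = []
    for req in requests:
        target = next(
            (vm for vm, load in vm_load.items() if load < max_per_vm), None
        )
        if target is not None:
            vm_load[target] += 1
        assignments.append((req, target))
    return assignments


rr = round_robin(["vm0", "vm1"], ["r1", "r2", "r3"])
th = throttled({"vm0": 0, "vm1": 0}, max_per_vm=1, requests=["r1", "r2", "r3"])
```

With a cap of one, the throttled sketch spreads the first two requests and queues the third, whereas round robin happily sends the third back to `vm0`; that difference is what the response-time metrics in the paper quantify at scale.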

18 pages, 377 KiB  
Article
AI-Enabled Wearable Medical Internet of Things in Healthcare System: A Survey
by Fazli Subhan, Alina Mirza, Mazliham Bin Mohd Su’ud, Muhammad Mansoor Alam, Shibli Nisar, Usman Habib and Muhammad Zubair Iqbal
Appl. Sci. 2023, 13(3), 1394; https://doi.org/10.3390/app13031394 - 20 Jan 2023
Cited by 68 | Viewed by 14975
Abstract
Technology has played a vital part in improving quality of life, especially in healthcare. Artificial intelligence (AI) and the Internet of Things (IoT) are extensively employed to link accessible medical resources and deliver dependable and effective intelligent healthcare. Body-wearable devices have garnered attention as powerful devices for healthcare applications, leading to various commercially available devices for multiple purposes, including individual healthcare, activity alerts, and fitness. This paper covers the advancements made in the wearable Internet of Medical Things (IoMT) for healthcare systems, scrutinized from the perspective of their efficacy in detecting, preventing, and monitoring diseases. The latest healthcare issues, such as COVID-19 and monkeypox, are also included. The paper thoroughly discusses the directions proposed by researchers to improve healthcare through wearable devices and artificial intelligence, and details the approaches adopted to improve the overall accuracy, efficiency, and security of healthcare systems. It also highlights the constraints and opportunities involved in developing AI-enabled, IoT-based healthcare systems.
(This article belongs to the Special Issue Artificial Intelligence and Robotics in Healthcare)

14 pages, 2680 KiB  
Review
Application of Internet of Things (IoT) in Sustainable Supply Chain Management
by Yasser Khan, Mazliham Bin Mohd Su’ud, Muhammad Mansoor Alam, Syed Fayaz Ahmad, Ahmad Y. A. Bani Ahmad (Ayassrah) and Nasir Khan
Sustainability 2023, 15(1), 694; https://doi.org/10.3390/su15010694 - 30 Dec 2022
Cited by 87 | Viewed by 26545
Abstract
Extending the traditional supply chain system with smart objects enhances intelligence, automation capabilities, and intelligent decision-making. Internet of Things (IoT) technologies provide unprecedented opportunities to enhance efficiency and reduce the cost of the existing supply chain system. This article studies the prevailing supply chain system and explores the benefits obtained after smart objects and embedded IoT networks are implemented. Short-range communication technologies, radio frequency identification (RFID), middleware, and cloud computing are examined in depth to conceptualize a smart supply chain management system, through which manufacturers achieve maximum benefits in terms of safety, cost, intelligent inventory management, and decision-making. This study also offers concepts of smart carriage, loading/unloading, transportation, warehousing, and packaging for the secure distribution of products. Furthermore, tracking customers to encourage further purchases and modifying shops with the assistance of the Internet of Things are thoroughly idealized.

20 pages, 3572 KiB  
Review
Architectural Threats to Security and Privacy: A Challenge for Internet of Things (IoT) Applications
by Yasser Khan, Mazliham Bin Mohd Su’ud, Muhammad Mansoor Alam, Sayed Fayaz Ahmad, Nur Agus Salim and Nasir Khan
Electronics 2023, 12(1), 88; https://doi.org/10.3390/electronics12010088 - 26 Dec 2022
Cited by 23 | Viewed by 5396
Abstract
The Internet of Things (IoT) is one of the growing platforms of the current era; it has drawn a large population into its domain, and daily life increasingly depends on this technology. A significant amount of data is generated by an immense number of smart devices and their allied applications, and is constructively utilized to automate our daily activities. This big data requires fast processing, storage, and safe passage through secure channels to safeguard it from malicious attacks. In such a situation, security is crucial to protect technological resources from unauthorized access or any interruption that could disrupt the seamless and ubiquitous connectivity of the IoT, from the perception layer to cloud computers. Motivated by this, this article presents a general overview of the technology and layered architecture of the IoT, followed by critical applications with a particular focus on key features of smart homes, smart agriculture, smart transportation, and smart healthcare. Next, security threats and vulnerabilities, including attacks on each layer of the IoT, are explicitly elaborated. The classification of security challenges such as confidentiality, integrity, privacy, availability, authentication, non-repudiation, and key management is thoroughly reviewed. Finally, future research directions for security concerns are identified and presented.
(This article belongs to the Topic Internet of Things: Latest Advances)

19 pages, 6345 KiB  
Article
Rice Crop Counting Using Aerial Imagery and GIS for the Assessment of Soil Health to Increase Crop Yield
by Syeda Iqra Hassan, Muhammad Mansoor Alam, Muhammad Yousuf Irfan Zia, Muhammad Rashid, Usman Illahi and Mazliham Mohd Su’ud
Sensors 2022, 22(21), 8567; https://doi.org/10.3390/s22218567 - 7 Nov 2022
Cited by 14 | Viewed by 4810
Abstract
Rice is one of the vital foods consumed in most countries throughout the world. To estimate the yield, crop counting is used to indicate improper growth, identify loam land, and control weeds. As demand for food supplies increases, it is becoming necessary to grow crops healthily, precisely, and proficiently. Traditional counting methods have numerous disadvantages: they introduce long delays, are highly sensitive, and are easily disturbed by noise. In this research, rice plants are detected and counted using an unmanned aerial vehicle (UAV), aerial images, and a geographic information system (GIS). The technique is implemented over forty acres of rice crop in Tando Adam, Sindh, Pakistan. To validate the performance of the proposed system, the obtained results were compared with standard plant-count techniques and were approved by an agronomist after soil testing and monitoring of the rice crop count in each acre. The results show that the proposed system is precise, detects rice crops accurately, differentiates them from other objects, and estimates soil health based on plant-counting data; however, in the case of clusters, counting is performed in semi-automated mode.
(This article belongs to the Section Smart Agriculture)

15 pages, 4630 KiB  
Article
Cardiovascular and Diabetes Diseases Classification Using Ensemble Stacking Classifiers with SVM as a Meta Classifier
by Asfandyar Khan, Abdullah Khan, Muhammad Muntazir Khan, Kamran Farid, Muhammad Mansoor Alam and Mazliham Bin Mohd Su’ud
Diagnostics 2022, 12(11), 2595; https://doi.org/10.3390/diagnostics12112595 - 26 Oct 2022
Cited by 16 | Viewed by 3094
Abstract
Cardiovascular disease includes coronary artery disease (CAD), which covers angina and myocardial infarction (commonly known as a heart attack), and coronary heart disease (CHD), which is marked by the buildup of a waxy material called plaque inside the coronary arteries. Heart attacks are still the main cause of death worldwide, and if not treated promptly they have the potential to cause major health problems, such as diabetes. If ignored, diabetes can result in a variety of health problems, including heart disease, stroke, blindness, and kidney failure. Machine learning methods can be used to identify and diagnose diabetes and other illnesses, and both diabetes and cardiovascular disease can be diagnosed using several classifier types. Naive Bayes, K-nearest neighbor (KNN), linear regression, decision trees (DT), and support vector machines (SVM) have been employed as classifiers, but all of these models had poor accuracy; new research is therefore required to diagnose diabetes and cardiovascular disease more reliably. This study developed an ensemble approach, a "stacking classifier", to improve the performance of integrated flexible individual classifiers and decrease the likelihood of misclassifying a single instance. Naive Bayes, KNN, Linear Discriminant Analysis (LDA), and Decision Tree (DT) are among the base classifiers used in this study, with Random Forest and SVM serving as meta-classifiers. For diagnosing diabetes, the suggested stacking classifier obtains a superior accuracy of 97.35%, compared to 76.46%, 74.60%, 78.57%, and 77.35% for Naive Bayes, KNN, DT, and LDA, respectively. For cardiovascular disease, the suggested stacking classifier likewise performed better, obtaining an accuracy of 88.71%, compared to 83.77%, 82.56%, 84.26%, 85.23%, and 84.72% for KNN, NB, DT, LDA, and SVM, respectively.
(This article belongs to the Special Issue Implementing AI in Diagnosis of Cardiovascular Diseases)
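The core idea of a stacking classifier is that the base learners' outputs become the input features of the meta-classifier. A deliberately tiny sketch of that data flow, with hypothetical rule-based stand-ins for the base learners and a toy unanimity rule standing in for the SVM meta-classifier (a real pipeline would train both levels on data):

```python
def stack_predict(base_learners, meta_learner, x):
    """Stacking: collect each base learner's prediction for x, then feed
    that prediction vector to the meta-learner as its feature vector."""
    meta_features = [learner(x) for learner in base_learners]
    return meta_learner(meta_features)


# Hypothetical rule-based base learners over a single numeric feature x.
base = [
    lambda x: 1 if x > 3 else 0,       # stand-in for e.g. a decision stump
    lambda x: 1 if x % 2 == 0 else 0,  # stand-in for a second weak learner
]


def meta(feats):
    """Toy meta-learner standing in for the SVM: predict the positive
    class only when every base learner agrees on it."""
    return 1 if sum(feats) == len(feats) else 0


pred = stack_predict(base, meta, 4)   # both base learners vote 1
```

In practice the meta-level inputs are out-of-fold base predictions on the training set, which is what lets the meta-classifier learn when to trust each base model.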

14 pages, 2552 KiB  
Article
Application of Machine Learning Algorithms for Sustainable Business Management Based on Macro-Economic Data: Supervised Learning Techniques Approach
by Muhammad Anees Khan, Kumail Abbas, Mazliham Mohd Su’ud, Anas A. Salameh, Muhammad Mansoor Alam, Nida Aman, Mehreen Mehreen, Amin Jan, Nik Alif Amri Bin Nik Hashim and Roslizawati Che Aziz
Sustainability 2022, 14(16), 9964; https://doi.org/10.3390/su14169964 - 12 Aug 2022
Cited by 14 | Viewed by 5274
Abstract
Macroeconomic indicators are key to the successful development of any country and are very important for the overall economy. In the past, researchers used traditional regression methods for estimating macroeconomic variables; however, the advent of efficient machine learning (ML) methods has improved the intelligent mechanisms available for solving time-series forecasting problems in economies around the globe. This study focuses on forecasting Pakistan's inflation rate and exchange rate from January 1989 to December 2020. We used several ML algorithms: k-nearest neighbor (KNN), polynomial regression, artificial neural networks (ANNs), and support vector machines (SVM). The data set was split into two parts: data from January 1989 to December 2018 formed the training set, and the remaining data from January 2019 to December 2020 served as the test set. To assess the accuracy of the algorithms, we used root mean square error (RMSE) and mean absolute error (MAE). The experimental results showed that the ANN achieved the lowest RMSE and MAE of all the algorithms used in the study. When forecasting inflation rates based on prediction error on the test set, polynomial regression (degree 1) and the ANN outperformed SVM and KNN; for forecasting the exchange rate, however, the SVM with an RBF kernel outperformed KNN, polynomial regression, and the ANN.
(This article belongs to the Special Issue Circular Economy and Artificial Intelligence)
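The two error metrics used for the comparison are standard and easy to state exactly: RMSE penalizes large misses quadratically, while MAE reports the average magnitude of the misses. A self-contained sketch of both, evaluated on hypothetical observed-vs-forecast values rather than the paper's data:

```python
import math


def rmse(actual, predicted):
    """Root mean square error: sqrt of the mean squared residual."""
    n = len(actual)
    return math.sqrt(sum((a - p) ** 2 for a, p in zip(actual, predicted)) / n)


def mae(actual, predicted):
    """Mean absolute error: mean magnitude of the residuals."""
    n = len(actual)
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / n


# Hypothetical inflation-rate observations vs. model forecasts (in %).
observed = [8.0, 9.5, 10.1]
forecast = [7.5, 9.0, 11.1]
```

Because RMSE squares the residuals, the single 1.0-point miss above pulls RMSE higher than MAE; a model ranking can therefore differ depending on which metric is used, which is why the study reports both.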

29 pages, 1562 KiB  
Review
A Review of Urdu Sentiment Analysis with Multilingual Perspective: A Case of Urdu and Roman Urdu Language
by Ihsan Ullah Khan, Aurangzeb Khan, Wahab Khan, Mazliham Mohd Su’ud, Muhammad Mansoor Alam, Fazli Subhan and Muhammad Zubair Asghar
Computers 2022, 11(1), 3; https://doi.org/10.3390/computers11010003 - 27 Dec 2021
Cited by 27 | Viewed by 19726
Abstract
Research efforts in the field of sentiment analysis have increased exponentially in the last few years due to its applicability in areas such as online product purchasing, marketing, and reputation management. Social media and online shopping sites have become a rich source of user-generated data, and manufacturing, sales, and marketing organizations are progressively turning to this source for worldwide feedback on their activities and products. Millions of sentences in Urdu and Roman Urdu are posted daily on social sites such as Facebook, Instagram, Snapchat, and Twitter. Disregarding people's opinions in Urdu and Roman Urdu while considering only the resource-rich English language leads to the loss of this vast amount of data. Our research focused on collecting research papers on the Urdu and Roman Urdu languages and analyzing them in terms of preprocessing, feature extraction, and classification techniques. This paper contains a comprehensive study of research conducted on Roman Urdu and Urdu text for product reviews, divided into categories such as collection of relevant corpora, data preprocessing, feature extraction, classification platforms and approaches, limitations, and future work. The comparison was made by evaluating different research factors, such as corpus, lexicon, and opinions; each reviewed paper was evaluated against the stated benchmarks and categorized accordingly. Based on the results obtained and the comparisons made, we suggest some helpful directions for future study.

45 pages, 4733 KiB  
Review
Resource Allocation Schemes for 5G Network: A Systematic Review
by Muhammad Ayoub Kamal, Hafiz Wahab Raza, Muhammad Mansoor Alam, Mazliham Mohd Su’ud and Aznida binti Abu Bakar Sajak
Sensors 2021, 21(19), 6588; https://doi.org/10.3390/s21196588 - 2 Oct 2021
Cited by 73 | Viewed by 17113
Abstract
Fifth-generation (5G) communication technology is intended to offer higher data rates, outstanding user coverage, lower power consumption, and extremely low latency. Such cellular networks will implement a diverse multi-layer model comprising device-to-device networks, macro-cells, and different categories of small cells to provide customers with the desired quality of service (QoS). This multi-layer model has motivated several studies addressing interference management and resource allocation in 5G networks. With the growing need for cellular service and the limited resources available to provide it, capably handling network traffic and operation has become a problem of resource distribution, and one of the most serious problems is alleviating congestion in the network to deliver better QoS. Although a limited number of review papers have been written on resource distribution, none has focused specifically on 5G resource allocation. Hence, this article analyzes the issue by classifying the various 5G resource allocation schemes reported in the literature and assessing their ability to enhance service quality, basing its discussion on the metrics used to evaluate network performance. After considering the current evidence on resource allocation methods in 5G, the review suggests future research areas on which scholars can focus.
(This article belongs to the Section Communications)
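Resource allocation schemes of the kind surveyed typically score users by some fairness metric when assigning spectrum; proportional-fair scheduling, for example, balances a user's instantaneous channel rate against the throughput that user has already received. A toy sketch of that one metric under stated assumptions (the scheduler, user names, and rates below are illustrative, not drawn from the review):

```python
def proportional_fair(channel_rates, avg_throughput):
    """Proportional-fair metric: instantaneous achievable rate divided by
    the user's running average throughput; the next resource block goes
    to whichever user maximizes that ratio."""
    scores = {u: channel_rates[u] / avg_throughput[u] for u in channel_rates}
    return max(scores, key=scores.get)


# ue1 has the better channel right now, but ue2 has been starved so far.
rates = {"ue1": 10.0, "ue2": 8.0}   # instantaneous achievable rates
avgs = {"ue1": 5.0, "ue2": 2.0}     # running average throughputs
winner = proportional_fair(rates, avgs)   # ue2: 8/2 = 4 beats 10/5 = 2
```

A pure max-rate scheduler would always pick `ue1` here; the proportional-fair ratio is one concrete way a scheme trades peak throughput for the fairness metrics the review uses to compare schemes.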
