Advances of Machine Learning and Data Mining Using Mathematical Optimization in Honor of the 65th Birthday of Prof. Adil M. Bagirov

A special issue of Mathematics (ISSN 2227-7390). This special issue belongs to the section "Computational and Applied Mathematics".

Deadline for manuscript submissions: closed (29 February 2024) | Viewed by 7247

Special Issue Editors


Guest Editor
School of Science, RMIT University, Melbourne, VIC 3001, Australia
Interests: nonsmooth optimization; data mining; machine learning

Guest Editor
School of Engineering, Information Technology and Physical Sciences, Federation University Australia, Ballarat, Victoria 3350, Australia
Interests: optimisation; in particular, nonsmooth optimisation and its various applications

Special Issue Information

Dear Colleagues,

Machine learning (ML) is one of the fastest-growing areas of computer science. The term refers to the automated detection of meaningful patterns in data. Over the past couple of decades, ML has gained tremendous popularity for its ability to extract information from data sets, and the field has branched into several subfields dealing with different types of learning tasks, among them supervised (classification and regression), unsupervised (clustering), and semi-supervised learning.

Recent results show that optimization models and methods are vital in designing efficient and accurate ML techniques. Using optimization in ML, particularly non-smooth optimization (NSO), leads to a significant reduction in the number of variables, and thus allows essential knowledge to be extracted from huge volumes of data efficiently and accurately.

The purpose of this Special Issue is to gather a collection of articles reflecting the latest developments in optimization-based machine learning and data mining techniques and their applications. Fields of interest include operations research and numerical optimization as applied to machine learning, data mining, big data, and related areas.

The Guest Editors are grateful to Professor Adil Bagirov, with whom they have had the privilege of conducting research in the area of data mining, machine learning and optimization. We wish him a very happy 65th birthday!

Dr. Sona Taheri
Dr. Nargiz Sultanova
Guest Editors

Manuscript Submission Information

Manuscripts should be submitted online at www.mdpi.com by registering and logging in to this website. Once you are registered, go to the submission form. Manuscripts can be submitted until the deadline. All submissions that pass pre-check are peer-reviewed. Accepted papers will be published continuously in the journal (as soon as accepted) and will be listed together on the special issue website. Research articles, review articles, and short communications are invited. For planned papers, a title and short abstract (about 100 words) can be sent to the Editorial Office for announcement on this website.

Submitted manuscripts should not have been published previously, nor be under consideration for publication elsewhere (except conference proceedings papers). All manuscripts are thoroughly refereed through a single-blind peer-review process. A guide for authors and other relevant information for submission of manuscripts is available on the Instructions for Authors page. Mathematics is an international peer-reviewed open access semimonthly journal published by MDPI.

Please visit the Instructions for Authors page before submitting a manuscript. The Article Processing Charge (APC) for publication in this open access journal is 2600 CHF (Swiss Francs). Submitted papers should be well formatted and use good English. Authors may use MDPI's English editing service prior to publication or during author revisions.

Keywords

  • data mining
  • machine learning
  • optimization
  • big data

Published Papers (2 papers)


Research

14 pages, 318 KiB  
Article
Absolute Value Inequality SVM for the PU Learning Problem
by Yongjia Yuan and Fusheng Bai
Mathematics 2024, 12(10), 1454; https://doi.org/10.3390/math12101454 - 8 May 2024
Viewed by 318
Abstract
Positive and unlabeled learning (PU learning) is a significant binary classification task in machine learning; it focuses on training accurate classifiers using positive data and unlabeled data. Most of the works in this area are based on a two-step strategy: the first step is to identify reliable negative examples from unlabeled examples, and the second step is to construct the classifiers based on the positive examples and the identified reliable negative examples using supervised learning methods. However, these methods always underutilize the remaining unlabeled data, which limits the performance of PU learning. Furthermore, many methods require the iterative solution of the formulated quadratic programming problems to obtain the final classifier, resulting in a large computational cost. In this paper, we propose a new method called the absolute value inequality support vector machine, which applies the concept of eccentricity to select reliable negative examples from unlabeled data and then constructs a classifier based on the positive examples, the selected negative examples, and the remaining unlabeled data. In addition, we apply a hyperparameter optimization technique to automatically search and select the optimal parameter values in the proposed algorithm. Numerical experimental results on ten real-world datasets demonstrate that our method is better than the other three benchmark algorithms.
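The generic two-step PU-learning strategy the abstract contrasts with can be sketched as follows. This is an illustrative toy, not the paper's absolute value inequality SVM: here the "eccentricity"-style score is simply distance from the positive-class centroid, and the second-step classifier is a plain nearest-centroid rule; all names and data are hypothetical.

```python
# Toy sketch of two-step PU learning (assumed simplifications, not the
# paper's method): step 1 picks reliable negatives from the unlabeled pool
# by an eccentricity-style score (distance from the positive centroid);
# step 2 trains a simple nearest-centroid classifier on the two sets.

def centroid(points):
    n = len(points)
    return [sum(p[i] for p in points) / n for i in range(len(points[0]))]

def dist2(a, b):
    return sum((x - y) ** 2 for x, y in zip(a, b))

def select_reliable_negatives(positives, unlabeled, k):
    """Step 1: the k unlabeled points farthest from the positive centroid."""
    c = centroid(positives)
    ranked = sorted(unlabeled, key=lambda p: dist2(p, c), reverse=True)
    return ranked[:k]

def nearest_centroid_classifier(positives, negatives):
    """Step 2: label 1 if nearer the positive centroid, else 0."""
    cp, cn = centroid(positives), centroid(negatives)
    return lambda p: 1 if dist2(p, cp) < dist2(p, cn) else 0

pos = [(1.0, 1.0), (1.2, 0.9), (0.9, 1.1)]
unl = [(1.1, 1.0), (5.0, 5.2), (4.8, 5.1), (0.95, 1.05)]
neg = select_reliable_negatives(pos, unl, k=2)
clf = nearest_centroid_classifier(pos, neg)
print(clf((1.0, 0.95)))  # near the positives -> 1
print(clf((5.0, 5.0)))   # near the selected negatives -> 0
```

Note how the points in `unl` that were not selected as reliable negatives play no further role here; the paper's contribution is precisely to keep such remaining unlabeled data in the final classifier.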

28 pages, 7316 KiB  
Article
Supply Chain Demand Forecasting and Price Optimisation Models with Substitution Effect
by Keun Hee Lee, Mali Abdollahian, Sergei Schreider and Sona Taheri
Mathematics 2023, 11(11), 2502; https://doi.org/10.3390/math11112502 - 29 May 2023
Cited by 2 | Viewed by 5386
Abstract
Determining the optimal price of products is essential, as it plays a critical role in improving a company’s profitability and market competitiveness. This requires the ability to calculate customers’ demand in the Fast Moving Consumer Goods (FMCG) industry, as various effects exist between multiple products within a product category. The substitution effect is one of the most challenging effects at retail stores, as it requires investigating an exponential number of combinations of price changes and the availability of other products. This paper suggests a systematic price decision support tool for demand prediction and price optimisation in online and stationary retailers that considers the substitution effect. Two procedures reflecting the product price changes and the demand correlation structure are introduced for the demand prediction and price optimisation models. First, the developed demand prediction procedure considers the combination of price changes of all products, reflecting the effect of substitution. Time series and several well-known machine learning approaches with hyperparameter tuning and rolling forecasting methods are utilised to select each product’s best demand forecast. The demand forecast results are used as input to the price optimisation model. Second, the developed price optimisation procedure is a constraint programming problem based on a weekly time frame and a product-category-level aggregation, capable of maximising profit over the many price combinations. The results using real-world transaction data with 12 products and 4 discount rates demonstrate that including some business rules as constraints in the proposed price optimisation model reduces the number of price combinations from 11,274,924 to 19,440 and the execution time from 129.59 to 25.831 min. The utilisation of the presented price optimisation support tool enables supply chain managers to identify the optimal discount rate for individual products in a timely manner, resulting in a net profit increase.
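The pruning effect of business-rule constraints on the price-combination search space can be illustrated with a small sketch. The numbers, products, and rule below are hypothetical (a 6-product, 3-rate toy with a cap on simultaneous discounts), not the paper's 12-product model or its constraints.

```python
# Toy illustration of how business-rule constraints shrink a price-
# combination search space (hypothetical rule and figures, not the
# paper's): each product takes one of several discount rates, and a
# rule caps how many products may be discounted in the same week.

from itertools import product as cartesian

def feasible_combinations(n_products, rates, max_discounted):
    """Enumerate discount-rate assignments satisfying the business rule."""
    combos = []
    for assignment in cartesian(rates, repeat=n_products):
        if sum(1 for r in assignment if r > 0) <= max_discounted:
            combos.append(assignment)
    return combos

rates = (0.0, 0.1, 0.2)                 # no discount, 10%, 20%
unconstrained = len(rates) ** 6         # 6 products, 3 rates each -> 729
constrained = feasible_combinations(6, rates, max_discounted=2)
print(unconstrained, len(constrained))  # the rule prunes most assignments
```

In a real model the surviving assignments would then be scored by the demand forecasts and the profit-maximising one selected; the point here is only that even a single rule removes most of the exponential search space before any optimisation runs.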
