Search Results (70)

Search Parameters:
Keywords = lot-streaming

20 pages, 1518 KB  
Article
An Effective Hybrid Rescheduling Method for Wafer Chip Precision Packaging Workshops in Complex Manufacturing Environments
by Ziyue Wang, Weikang Fang and Yichen Yang
Micromachines 2025, 16(12), 1403; https://doi.org/10.3390/mi16121403 - 12 Dec 2025
Viewed by 180
Abstract
As semiconductor manufacturing and information technology continue to develop, wafer chips are becoming smaller and more varied, placing high demands on wafer chip precision manufacturing and packaging workshops. On the one hand, market demand for many varieties in small batches increases the difficulty of scheduling. On the other hand, the complex manufacturing environment introduces various dynamic events that disrupt production plans. Accordingly, this work studies the wafer chip precision packaging workshop rescheduling problem under machine breakdowns, emergency order insertions and modifications to original orders. First, a mathematical model of the problem is established, and rolling horizon technology is adopted to handle multiple dynamic events. Then, a hybrid algorithm combining an improved firefly optimization framework with a variable neighborhood search strategy is proposed. The population evolution mechanism is designed based on the location-updating law of fireflies in nature, and the variable neighborhood search is applied to avoid local optima and explore neighborhoods thoroughly. Finally, comparative experiments and engineering cases indicate that the proposed method is effective, stable and superior to current state-of-the-art algorithms.
(This article belongs to the Special Issue Future Trends in Ultra-Precision Machining)
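The variable neighborhood search component mentioned in the abstract can be illustrated generically. The sketch below is a minimal textbook VNS over job permutations, not the paper's hybrid firefly method; the function and operator names (`vns`, `swap`, `insert`) are hypothetical.

```python
import random

def vns(perm, cost, neighborhoods, iters=200, seed=0):
    """Basic variable neighborhood search over job permutations.

    perm          -- initial job order (list)
    cost          -- function mapping a permutation to its objective value
    neighborhoods -- list of move operators, each (perm, rng) -> new perm
    """
    rng = random.Random(seed)
    best, best_c = perm[:], cost(perm)
    for _ in range(iters):
        k = 0
        while k < len(neighborhoods):
            cand = neighborhoods[k](best[:], rng)
            c = cost(cand)
            if c < best_c:      # improvement: restart from the first neighborhood
                best, best_c, k = cand, c, 0
            else:               # no improvement: escalate to the next neighborhood
                k += 1
    return best, best_c

def swap(p, rng):
    i, j = rng.sample(range(len(p)), 2)
    p[i], p[j] = p[j], p[i]
    return p

def insert(p, rng):
    i, j = rng.sample(range(len(p)), 2)
    p.insert(j, p.pop(i))
    return p
```

Escalating to larger neighborhoods only when the current one stalls is what lets VNS escape local optima of any single move type.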

31 pages, 1107 KB  
Article
Length–Weight Distribution of Non-Zero Elements in Randomized Bit Sequences
by Christoph Lange, Andreas Ahrens, Yadu Krishnan Krishnakumar and Olaf Grote
Sensors 2025, 25(12), 3825; https://doi.org/10.3390/s25123825 - 19 Jun 2025
Viewed by 963
Abstract
Randomness plays an important role in data communication as well as in cybersecurity. In the simulation of communication systems, randomized bit sequences are often used to model a digital source information stream, and cryptographic outputs should look random rather than deterministic in order to give an attacker as little information as possible. The investigation of randomness, especially in cybersecurity, has therefore attracted considerable attention and research activity. Common randomness tests are hypothesis-based and focus on analyzing the distribution and independence of zero and non-zero elements in a given random sequence. In this work, a novel approach grounded in gap-based burst analysis is presented and analyzed. Such approaches have been successfully implemented, e.g., in data communication systems and data networks. The focus of the current work is on detecting deviations from the ideal gap-density function describing randomized bit sequences. For testing and verification purposes, the well-researched post-quantum cryptographic CRYSTALS suite, including its Kyber and Dilithium schemes, is utilized. The proposed technique allows the level of randomness in given cryptographic outputs to be verified quickly. Results for different sequence-generation techniques are presented, validating the approach. They show that key-encapsulation and key-exchange algorithms, such as CRYSTALS-Kyber, achieve a lower level of randomness than digital signature algorithms, such as CRYSTALS-Dilithium.
(This article belongs to the Section Communications)
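The gap-based idea can be sketched concretely: for i.i.d. bits, the run of zeros between consecutive ones follows a geometric law, and deviations of the empirical gap histogram from that ideal density signal non-randomness. This is a minimal illustration of the principle, not the paper's test; the helper names are hypothetical.

```python
def gap_lengths(bits):
    """Zero-run lengths ("gaps") between consecutive one-bits."""
    gaps, run, seen_one = [], 0, False
    for b in bits:
        if b == 1:
            if seen_one:
                gaps.append(run)
            run, seen_one = 0, True
        else:
            run += 1
    return gaps

def ideal_gap_density(k, p=0.5):
    """P(gap == k) for i.i.d. bits with P(bit == 1) == p: a geometric law."""
    return p * (1 - p) ** k
```

Comparing the normalized histogram of `gap_lengths(sequence)` against `ideal_gap_density` is the kind of deviation check the abstract describes.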

13 pages, 1412 KB  
Article
Harnessing Raman Spectroscopy for Enhanced Bioprocess Monitoring: Predictive CO2 Analysis and Robust pH Determination in Bioreactor Off-Gas Stream
by Tobias Wallocha and Michaela Poth
Fermentation 2025, 11(6), 317; https://doi.org/10.3390/fermentation11060317 - 2 Jun 2025
Viewed by 3689
Abstract
The accurate measurement of CO2 concentration in fermentation off-gas is crucial for monitoring and optimizing bioprocesses, particularly in mammalian cell cultures. In this study, we successfully utilized Raman off-gas spectroscopy to achieve time-resolved prediction of CO2 concentrations in the fermentation off-gas. Our experiments were conducted using two different media: a commercial medium (medium 1) and an in-house Roche medium (medium 2), each tested with two different lots. The results demonstrated that Raman spectroscopy provides precise, real-time CO2 measurements, which are essential for effective process monitoring and control. Furthermore, we established that CO2 off-gas analysis can be directly correlated with the pH value of the fermentation medium. This correlation allows pH to be predicted with precision comparable to traditional methods: CO2 levels are first determined via Raman spectroscopy or an off-gas analyzer and then used to infer pH through a correlation curve. In the final step of our study, we employed a submersible Raman probe to predict CO2 and pH directly within the fermentation medium. Compared to the model accuracy in the off-gas stream, the performance of the submersible probe in predicting CO2 and pH within the medium was significantly worse, likely due to the absence of a pretrained model for CO2. Our findings highlight the potential of Raman off-gas spectroscopy as a powerful tool for real-time bioprocess monitoring and control, offering significant advantages in accuracy and efficiency.
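The CO2-to-pH correlation the abstract describes typically rests on the carbonate buffer equilibrium. The sketch below uses the standard Henderson-Hasselbalch relation with textbook constants (pKa ≈ 6.1, CO2 solubility ≈ 0.0307 mmol/L/mmHg for bicarbonate-buffered aqueous systems); it is a generic illustration, not the paper's calibrated correlation curve.

```python
import math

def ph_from_co2(pco2_mmhg, bicarbonate_mM, pKa=6.1, s=0.0307):
    """Henderson-Hasselbalch estimate: pH = pKa + log10([HCO3-] / (s * pCO2)).

    s is the CO2 solubility coefficient in mmol/L/mmHg; pKa and s are
    textbook values for bicarbonate-buffered aqueous systems.
    """
    return pKa + math.log10(bicarbonate_mM / (s * pco2_mmhg))
```

With a known bicarbonate concentration, any measured pCO2 (from Raman off-gas data, say) maps directly to an estimated pH, which is why the two quantities correlate so tightly.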

24 pages, 1589 KB  
Review
Lot-Streaming Workshop Scheduling with Operation Flexibility: Review and Extension
by Zhiqiang Tian, Xingyu Jiang, Weijun Liu, Baohai Zhao, Shun Liu, Qingze Tan and Guangdong Tian
Systems 2025, 13(4), 271; https://doi.org/10.3390/systems13040271 - 9 Apr 2025
Cited by 4 | Viewed by 1340
Abstract
Lot-streaming scheduling methods with operation flexibility are widely used in aerospace, semiconductor, automotive, pharmaceutical and other manufacturing enterprises. Lot-splitting scheduling methods have attracted growing attention from academia and industry because of the urgent need for effective ways to improve the productivity of flexible workshop scheduling, and over the past decade much work has been devoted to different lot-streaming scheduling methods for flexible workshops. The scope of this review is journal publications collected in the Web of Science database, of which 80% are from high-ranked journals. This paper aims to provide a comprehensive survey of lot-streaming workshop scheduling with operation flexibility. First, lot-streaming methods for jobs are discussed, and the objectives and constraints arising in applications are summarized. Then, the problem models and their solution approaches are reviewed. Next, research trends in problem applications, modeling and solution approaches are recalled. Finally, potential future research directions are outlined.
(This article belongs to the Special Issue Production Scheduling and Planning in Manufacturing Systems)

15 pages, 49237 KB  
Technical Note
A Novel Two-Stream Network for Few-Shot Remote Sensing Image Scene Classification
by Yaolin Lei, Yangyang Li and Heting Mao
Remote Sens. 2025, 17(7), 1192; https://doi.org/10.3390/rs17071192 - 27 Mar 2025
Viewed by 1033
Abstract
Recently, remote sensing image scene classification (RSISC) has gained considerable interest from the research community. Numerous approaches have been developed to tackle this issue, with deep learning techniques standing out due to their strong performance in RSISC. Nevertheless, there is a general consensus that deep learning techniques usually need large amounts of labeled data to work best, and collecting sufficient labeled data usually requires substantial human labor and resources. Hence, few-shot learning has become increasingly significant for RSISC. The recently proposed discriminative enhanced attention-based deep nearest neighbor neural network (DEADN4) introduced episodic training and attention-based strategies to reduce the effect of background noise on classification accuracy. Furthermore, DEADN4 uses deep global–local descriptors that extract both overall and detailed features, adjusts the loss function to better distinguish between classes, and adds a term that pulls features within the same class closer together. This addresses the problem of intra-class features being spread out and inter-class features being too similar in remote sensing images. However, DEADN4 does not address the impact of large-scale variations in objects on RSISC. Therefore, we propose a two-stream deep nearest neighbor neural network (TSDN4) to resolve this problem. Our framework consists of two streams: a global stream that assesses the likelihood of the whole image belonging to a particular class and a local stream that evaluates the probability of the most significant area corresponding to a particular class. The final classification outcome is determined by combining the results from both streams. Our method was evaluated on three distinct remote sensing image datasets and compared with a range of advanced techniques, including MatchingNet, RelationNet, MAML, Meta-SGD, DLA-MatchNet, DN4, DN4AM, and DEADN4, showing encouraging results in addressing the challenges of few-shot RSISC.
(This article belongs to the Section Remote Sensing Image Processing)

18 pages, 825 KB  
Article
Modeling Rollover Crash Risks: The Influence of Road Infrastructure and Traffic Stream Characteristics
by Abolfazl Khishdari, Hamid Mirzahossein, Xia Jin and Shahriar Afandizadeh
Infrastructures 2025, 10(2), 31; https://doi.org/10.3390/infrastructures10020031 - 27 Jan 2025
Cited by 2 | Viewed by 2063
Abstract
Rollover crashes are among the most prevalent types of accidents in developing countries, and various factors may contribute to their occurrence. However, few studies have simultaneously investigated both traffic stream and road-related variables. For instance, the effects of T-intersection density, U-turns, roadside parking lots, and the entry and exit ramps of side roads, as well as traffic stream characteristics (e.g., the standard deviation of vehicle speeds, speed violations, the presence or absence of speed cameras, and road surface deterioration), have not been thoroughly explored in previous research. Additionally, the simultaneous modeling of crash frequency and intensity remains underexplored. This study examines single-vehicle rollover crashes in Yazd Province, located in central Iran, as a case study and evaluates all of these variables simultaneously. A dataset comprising three years of crash data (2015–2017) was collected and analyzed. A crash index was developed based on the weight of crash intensity, road type and road length (as dependent variables), and road infrastructure and traffic stream properties (as independent variables). Initially, the dataset was refined to determine the significance of explanatory variables on the crash index. Correlation analysis was conducted to assess the linear independence of variable pairs using the variance inflation factor (VIF). Subsequently, various models were compared based on goodness-of-fit (GOF) indicators and odds ratio (OR) calculations. The results indicated that among ten crash modeling techniques, namely Poisson, negative binomial (NB), zero-truncated Poisson (ZTP), zero-truncated negative binomial (ZTNB), zero-inflated Poisson (ZIP), zero-inflated negative binomial (ZINB), fixed-effect Poisson (FEP), fixed-effect negative binomial (FENB), random-effect Poisson (REP), and random-effect negative binomial (RENB), the FENB model outperformed the others. The Akaike information criterion (AIC) and Bayesian information criterion (BIC) values for the FENB model were 1305.7 and 1393.6, respectively, demonstrating its superior performance. The findings also revealed a declining trend in the frequency and severity of rollover crashes.
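The AIC and BIC figures quoted above follow the standard definitions AIC = 2k − 2 ln L and BIC = k ln(n) − 2 ln L. As a minimal sketch (not the paper's FENB fit), the snippet below scores a one-parameter Poisson model on toy count data; the data and function names are hypothetical.

```python
import math

def poisson_ll(counts, lam):
    """Poisson log-likelihood at rate lam: sum of y*ln(lam) - lam - ln(y!)."""
    return sum(y * math.log(lam) - lam - math.lgamma(y + 1) for y in counts)

def aic_bic(counts):
    """AIC = 2k - 2lnL, BIC = k*ln(n) - 2lnL for the MLE rate (k = 1 parameter)."""
    n = len(counts)
    lam = sum(counts) / n          # the Poisson MLE is the sample mean
    ll = poisson_ll(counts, lam)
    return 2 * 1 - 2 * ll, 1 * math.log(n) - 2 * ll
```

Scoring each candidate model (Poisson, NB, and so on) on the same data and picking the lowest criterion value is the comparison procedure the abstract reports.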

28 pages, 5341 KB  
Review
Aromatics Alkylated with Olefins Utilizing Zeolites as Heterogeneous Catalysts: A Review
by Samaa H. Al-Sultani, Ali Al-Shathr and Bashir Y. Al-Zaidi
Reactions 2024, 5(4), 900-927; https://doi.org/10.3390/reactions5040048 - 13 Nov 2024
Cited by 5 | Viewed by 3585
Abstract
The alkylation of aromatic compounds has gained considerable attention because of its wide application in bulk and fine chemical production. Alkylating aromatics with olefins is a well-known process, particularly for the production of linear alkylbenzene, phenyloctanes, and heptyltoluene. As octane boosters and precursors for various petrochemical and bulk chemical products, a wide range of alkylated compounds are in high demand. Numerous unique structures have been proposed in addition to the usual zeolites (Y and beta) utilized in alkylation procedures. One disadvantage of industrial catalysts is their inevitable deactivation over time on stream, accompanied by a decrease in catalytic activity and product selectivity. Therefore, careful consideration of catalyst deactivation in the setup and operation of the catalytic process is necessary. Although much work has been carried out to prevent coking and increase catalyst lifespan, deactivation is still unavoidable: coke deposition deactivates catalysts in industrial processes by obstructing pores and/or covering acid sites. It is thus highly desirable to regenerate inactive catalysts, removing the coke and restoring catalytic activity at the same time. Each regeneration approach has pros and cons, depending on the kind of catalyst, the deactivation processes, and the regeneration settings. This comprehensive study focuses on the reaction mechanisms of 1-octene isomerization and toluene alkylation as an example of isomerization and alkylation reactions that occur simultaneously, shedding detailed light on the catalysts used for this type of complex reaction and on the challenges facing catalyst deactivation and reactivation procedures.

24 pages, 2139 KB  
Article
A Decision Support Model for Lean Supply Chain Management in City Multifloor Manufacturing Clusters
by Bogusz Wiśnicki, Tygran Dzhuguryan, Sylwia Mielniczuk, Ihor Petrov and Liudmyla Davydenko
Sustainability 2024, 16(20), 8801; https://doi.org/10.3390/su16208801 - 11 Oct 2024
Cited by 5 | Viewed by 3511
Abstract
City manufacturing has once again become a priority area for the sustainable development of smart cities thanks to a wide range of green technologies and, above all, additive technologies. Shortening the supply chain between producers and consumers has significant economic, social, and environmental effects. Zoning city multifloor manufacturing (CMFM) in compactly populated areas of large cities, in the form of clusters with their own city logistics nodes (CLNs), creates favorable conditions for promptly meeting citizens' needs for everyday goods and for passenger and freight transportation. City multifloor manufacturing clusters (CMFMCs) have already been studied extensively for their possible uses; nevertheless, an identified research gap concerns supply chain design efficiency in CMFMCs. The main objective of this study was therefore to explore the possibilities of lean supply chain management (LSCM) as the integrated application of lean manufacturing (LM) approaches and I4.0 technologies for customer-centric value stream management, based on eliminating all types of waste, reducing the use of natural and energy resources, and continuously improving processes related to logistics activities. This paper presents a deterministic mathematical decision support model for LSCM in CMFMCs. The model minimizes the number of road transport transfers within the urban area and the amount of stock stored in CMFMC buildings and CLNs, while also regulating supplier lead time. It was verified and validated using test data based on a case study designed as a typical CMFM manufacturing system with various parameters of CMFMCs and urban freight transport frameworks. The feasibility of using the proposed model for value stream mapping (VSM) and for managing logistics processes and inventories in clusters is discussed. The findings can help decision-makers and researchers improve the planning and management of logistics processes and inventory in clusters, even in the face of unexpected disruptions.

21 pages, 3407 KB  
Article
An Automated Approach to the Heterogeneity Test for Sampling Protocol Optimization
by Gabriela Cardoso Prado, Ana Carolina Chieregati and Simon C. Dominy
Minerals 2024, 14(4), 434; https://doi.org/10.3390/min14040434 - 22 Apr 2024
Cited by 1 | Viewed by 1808
Abstract
The fundamental sampling error is one of the sampling errors defined by Pierre Gy's Theory of Sampling and is related to the constitution heterogeneity of the mineralisation. Even if a sampling procedure is considered ideal or perfect, this error will still exist and therefore cannot be eliminated. A key input to Gy's fundamental sampling error equation is the intrinsic heterogeneity of a fragmented lot, which can be estimated by Gy's "calibrated" formula, written as a function of the sampling constants K and α. These constants can be calibrated by the standard heterogeneity test, originally developed by Pierre Gy and Francis Pitard, which is based on selecting rock fragments individually, randomly and equiprobabilistically from a lot of particulate material in order to estimate the lot's intrinsic heterogeneity. This test demands time and space, can be influenced by human biases, and those biases are difficult to quantify or measure. Aiming to simplify the test and eliminate the variance generated by human biases, a prototype called the intrinsic heterogeneity tester was developed as an automated alternative for heterogeneity testing. The prototype selects fragments from a falling stream, one by one, by means of a predefined laser count. To evaluate it, a study was carried out using painted chickpeas to simulate mineralisation grades, processing the same lot through the prototype several times. Statistical and mineral content analyses were undertaken, comparing the sampling constants and constitution heterogeneities obtained with the intrinsic heterogeneity tester against those from the standard heterogeneity test. The authors conclude that the intrinsic heterogeneity tester prototype can be used as an alternative to the manual selection of individual fragments for estimating the intrinsic heterogeneity of particulate material lots, supporting sampling protocol optimization.
(This article belongs to the Section Mineral Processing and Extractive Metallurgy)
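The calibrated formula mentioned in the abstract is commonly written IH_L = K·d^α, and the relative variance of the fundamental sampling error then follows from sample mass Ms and lot mass ML. The sketch below uses this standard Theory of Sampling notation; the numeric values in the usage are purely illustrative, not from the paper.

```python
def fse_rel_variance(K, alpha, d_cm, Ms_g, ML_g):
    """Relative variance of the fundamental sampling error (Gy).

    IH_L  = K * d**alpha          -- calibrated intrinsic heterogeneity,
                                     d in cm (nominal top particle size)
    sigma^2 = IH_L * (1/Ms - 1/ML) -- sample mass Ms and lot mass ML in grams
    """
    ihl = K * d_cm ** alpha
    return ihl * (1.0 / Ms_g - 1.0 / ML_g)
```

The formula makes the practical trade-off visible: taking a larger sample mass Ms (or crushing to a smaller d) reduces the error variance, which is exactly what a calibrated K and α let a sampling protocol optimize.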

20 pages, 5779 KB  
Article
A Geoscience-Aware Network (GASlumNet) Combining UNet and ConvNeXt for Slum Mapping
by Wei Lu, Yunfeng Hu, Feifei Peng, Zhiming Feng and Yanzhao Yang
Remote Sens. 2024, 16(2), 260; https://doi.org/10.3390/rs16020260 - 9 Jan 2024
Cited by 9 | Viewed by 3028
Abstract
Approximately 1 billion people worldwide currently inhabit slum areas. UN Sustainable Development Goal 11.1 underscores the imperative of upgrading all slums by 2030 to ensure adequate housing for everyone, and the geo-locations of slums help local governments upgrade slums and alleviate urban poverty. Remote sensing (RS) technology, with its excellent Earth observation capabilities, can play an important role in slum mapping, and deep learning (DL)-based RS information extraction methods have attracted considerable attention. Current DL-based slum mapping studies typically use three optical bands to fit existing models, neglecting essential geo-scientific information, such as spectral and textural characteristics, that is beneficial for slum mapping. Inspired by the geoscience-aware DL paradigm, we propose the Geoscience-Aware Network for slum mapping (GASlumNet), which aims to improve slum mapping accuracy by incorporating geoscientific prior knowledge into the DL model. GASlumNet employs a two-stream architecture combining ConvNeXt and UNet: one stream concentrates on optical feature representation, while the other emphasizes geo-scientific features. Feature-level and decision-level fusion mechanisms are applied to optimize deep features and enhance model performance. We used Jilin-1 Spectrum 01 and Sentinel-2 images to perform experiments in Mumbai, India. The results demonstrate that GASlumNet achieves higher slum mapping accuracy than the comparison models, with an intersection over union (IoU) of 58.41%. Specifically, GASlumNet improves the IoU by 4.60–5.97% over the baseline models, i.e., UNet and ConvNeXt-UNet, which exclusively utilize optical bands, and by 10.97% compared to FuseNet, a model that combines optical bands and geo-scientific features. Our method presents a new technical solution for accurate slum mapping, offering potential benefits for regional and global slum mapping and upgrading initiatives.
(This article belongs to the Section AI Remote Sensing)
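The IoU metric the abstract reports is the standard overlap measure for segmentation masks. A minimal sketch over binary masks (the function name is hypothetical):

```python
def iou(pred, target):
    """Intersection over union of two same-shaped binary (0/1) masks,
    given as nested lists: |pred AND target| / |pred OR target|."""
    inter = union = 0
    for row_p, row_t in zip(pred, target):
        for p, t in zip(row_p, row_t):
            inter += p and t   # 1 only where both masks mark the pixel
            union += p or t    # 1 where either mask marks the pixel
    return inter / union if union else 1.0
```

An IoU of 58.41% therefore means that, of all pixels marked as slum by either the prediction or the ground truth, 58.41% are marked by both.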

7 pages, 1876 KB  
Proceeding Paper
Watershed Development Plans as an Approach to Rediscover Lost Crops in the Sarguja Division of Chhattisgarh, India
by Kashi Gupta, Sulab Kumar, Sandeep Banjara, Aayushi Sinha, Mohan Shrivastava and Sushma Kerketta
Biol. Life Sci. Forum 2023, 27(1), 33; https://doi.org/10.3390/IECAG2023-14972 - 13 Oct 2023
Viewed by 1512
Abstract
Over the last three decades, the Government of India (GOI) has used watershed management to address sustainable agricultural output in rainfed areas, and since 2003 it has made watershed management a national policy. In the watershed development programs (WDPs) of India's current development plans, which are primarily focused on increasing and sustaining productivity levels, considerable thought is given to the significant crops that have disappeared from farming systems. In the Sarguja division of Chhattisgarh, the present study attempted to document the on-site and off-site effects of these programs, observing an increase in the groundwater level, a rise in surface water and stream flow levels, reductions in runoff and soil erosion, increased agricultural and dairy production, improved livelihoods and employment generation, and changes in land use and farming patterns. The findings showed that the percentage of cropland increased in both Kharif and Rabi, because farmers began planting crops in Zaid, particularly cucumber, melon, and vegetables, on land that had previously been kept fallow. Land-use patterns in the WDP regions have improved over time: because farmers utilize more wasteland productively, the net sown area of these locations has risen. Additionally, many crops that were previously abandoned due to water shortages and other constraints are reportedly being cultivated again. Responses from the region's population have favored the introduction of innovative techniques such as agroforestry systems.
(This article belongs to the Proceedings of The 3rd International Electronic Conference on Agronomy)

23 pages, 2611 KB  
Article
Fuzzy CNN Autoencoder for Unsupervised Anomaly Detection in Log Data
by Oleg Gorokhov, Mikhail Petrovskiy, Igor Mashechkin and Maria Kazachuk
Mathematics 2023, 11(18), 3995; https://doi.org/10.3390/math11183995 - 20 Sep 2023
Cited by 4 | Viewed by 3066
Abstract
Maintaining cybersecurity and reliability in various computer systems is an important current task. It can be addressed by detecting anomalies in log data, which are represented as a stream of textual descriptions of events taking place; for this purpose, the problem is reduced to one-class classification. Standard one-class classification methods do not achieve good results. Deep learning approaches are more effective, but they are not robust to outliers and require considerable computational effort. In this paper, we propose a new robust approach based on a convolutional autoencoder using fuzzy clustering. The proposed approach uses a parallel convolution operation for feature extraction, which makes it more efficient than the currently popular Transformer architecture. In our experiments, the proposed approach showed the best results for both the cybersecurity and the reliability problems compared to existing approaches, and it was also shown to be robust to outliers in the training set.
(This article belongs to the Special Issue Mathematical Modeling, Optimization and Machine Learning, 2nd Edition)
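The core idea of autoencoder-based anomaly detection is that points the model reconstructs poorly are flagged as anomalous. The sketch below is not the paper's fuzzy CNN autoencoder; it substitutes a PCA projection as a simple linear stand-in to illustrate the reconstruction-error principle, with hypothetical names and toy data.

```python
import numpy as np

def reconstruction_scores(X, n_components=1):
    """Anomaly score = distance between each point and its reconstruction
    from the top principal components (a linear stand-in for an autoencoder)."""
    mu = X.mean(axis=0)
    Xc = X - mu
    _, _, Vt = np.linalg.svd(Xc, full_matrices=False)
    W = Vt[:n_components]              # top principal directions (rows)
    recon = Xc @ W.T @ W + mu          # project onto the subspace, map back
    return np.linalg.norm(X - recon, axis=1)
```

Points lying along the dominant structure of the data reconstruct almost perfectly, while an off-pattern point keeps a large residual; thresholding that residual yields the one-class decision.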

4 pages, 990 KB  
Proceeding Paper
Improved Spider Monkey Optimization Algorithm for Hybrid Flow Shop Scheduling Problem with Lot Streaming
by Jinhao Du, Jabir Mumtaz and Jingyan Zhong
Eng. Proc. 2023, 45(1), 23; https://doi.org/10.3390/engproc2023045023 - 11 Sep 2023
Cited by 4 | Viewed by 1310
Abstract
This paper investigates the hybrid flow shop scheduling problem with lot streaming, which integrates the order lot problem (OLP), order sequence problem (OSP), and lots assignment problem (LAP), with the objective of simultaneously minimizing both the maximum completion time (Cmax) and the total tardiness (TT). An improved spider monkey optimization (I-SMO) algorithm is proposed by combining the crossover and mutation operations of a genetic algorithm (GA) with the spider monkey optimization algorithm. The contribution value method is employed to select both global and local leaders. Experimental comparisons with classical optimization algorithms, including particle swarm optimization (PSO) and differential evolution (DE), demonstrate the superiority of the proposed I-SMO algorithm.
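The benefit of lot streaming for the Cmax objective is easy to see on a tiny flow line: splitting a lot into sublots lets downstream machines start before the whole lot finishes upstream. The sketch below computes the makespan of one lot on a simple flow shop; it is a generic illustration with hypothetical names, not the paper's I-SMO model.

```python
def flow_shop_makespan(sublots, unit_times):
    """Makespan of a single lot split into consecutive sublots on a flow line.

    sublots    -- sizes of the sublots, processed in order
    unit_times -- per-unit processing time on each machine, in machine order
    """
    finish = [0.0] * len(unit_times)   # previous sublot's completion time per machine
    for q in sublots:
        cur = [0.0] * len(unit_times)
        for m, u in enumerate(unit_times):
            # a sublot starts when the machine is free AND it has left the previous machine
            start = max(finish[m], cur[m - 1] if m else 0.0)
            cur[m] = start + q * u
        finish = cur
    return finish[-1]
```

For a lot of 2 units on two machines with unit time 1.0 each, processing the lot whole gives a makespan of 4.0, while splitting it into two unit sublots overlaps the machines and gives 3.0.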

23 pages, 6290 KB  
Article
Exploiting Dynamic Vector-Level Operations and a 2D-Enhanced Logistic Modular Map for Efficient Chaotic Image Encryption
by Hongmin Li, Shuqi Yu, Wei Feng, Yao Chen, Jing Zhang, Zhentao Qin, Zhengguo Zhu and Marcin Wozniak
Entropy 2023, 25(8), 1147; https://doi.org/10.3390/e25081147 - 31 Jul 2023
Cited by 67 | Viewed by 3047
Abstract
Over the past few years, chaotic image encryption has gained extensive attention. Nevertheless, current studies on chaotic image encryption still possess certain constraints. To break these constraints, we first created a two-dimensional enhanced logistic modular map (2D-ELMM) and then devised a chaotic image encryption scheme based on vector-level operations and 2D-ELMM (CIES-DVEM). In contrast to some recent schemes, CIES-DVEM has remarkable advantages in several respects. First, 2D-ELMM is not only simpler in structure, but its chaotic performance is also significantly better than that of some newly reported chaotic maps. Second, the keystream generation process of CIES-DVEM is more practical: there is no need to replace the secret key or recreate the chaotic sequence when handling different images. Third, the encryption process of CIES-DVEM is dynamic and closely related to the plaintext image, enabling it to withstand various attacks more effectively. Finally, CIES-DVEM incorporates numerous vector-level operations, resulting in a highly efficient encryption process. Numerous experiments and analyses indicate that CIES-DVEM not only has highly significant advantages in encryption efficiency but also surpasses many recent encryption schemes in practicality and security.
(This article belongs to the Special Issue Image Encryption and Privacy Protection Based on Chaotic Systems)
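The general pattern behind chaos-based stream encryption can be sketched with the classic one-dimensional logistic map (not the paper's 2D-ELMM): iterate the map from a secret initial condition, quantize the orbit into a byte keystream, and XOR it with the data. The parameter choices below are illustrative only.

```python
def logistic_keystream(x0, r, n, burn_in=100):
    """Byte keystream from the logistic map x <- r*x*(1-x), 0 < x0 < 1, r near 4."""
    x = x0
    for _ in range(burn_in):           # discard transient iterations
        x = r * x * (1.0 - x)
    ks = bytearray()
    for _ in range(n):
        x = r * x * (1.0 - x)
        ks.append(int(x * 256) % 256)  # quantize the chaotic orbit to a byte
    return bytes(ks)

def xor_cipher(data, x0=0.37, r=3.99):
    """Encrypt or decrypt by XOR with the chaotic keystream (XOR is self-inverse)."""
    ks = logistic_keystream(x0, r, len(data))
    return bytes(d ^ k for d, k in zip(data, ks))
```

Sensitivity to the initial condition x0 plays the role of the secret key; the vector-level and plaintext-dependent refinements the abstract describes build on this same keystream-XOR skeleton.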

29 pages, 3727 KB  
Review
DC Microgrids: A Propitious Smart Grid Paradigm for Smart Cities
by Shriram S. Rangarajan, Rahul Raman, Amritpal Singh, Chandan Kumar Shiva, Ritesh Kumar, Pradip Kumar Sadhu, E. Randolph Collins and Tomonobu Senjyu
Smart Cities 2023, 6(4), 1690-1718; https://doi.org/10.3390/smartcities6040079 - 3 Jul 2023
Cited by 83 | Viewed by 11135
Abstract
Recent years have seen a surge in interest in DC microgrids as DC loads and DC sources such as solar photovoltaic systems, fuel cells, and batteries have become more mainstream. As more distributed energy resources (DERs) are integrated into existing smart grids, DC networks have come to the forefront of the industry. DC systems completely sidestep the need for synchronization, reactive power control, and frequency control; because AC systems are prone to all of these issues, DC systems can be more dependable and efficient. There is much unrealized potential in DC power, but it also faces significant challenges. Protecting a DC system is difficult because the current has no natural zero crossings at which it can be interrupted, and DC microgrid stability, which depends on inertia, must also be considered during the planning stage. Further problems of DC microgrids include insufficient power quality and poor communication. The power quality, inertia, communication, and economic operation of these value streams, as well as their underlying architectures and protection schemes, are all extensively discussed in this paper. This review examines the pros and cons of both grid-connected and isolated DC microgrids. In addition, it compares the different kinds of microgrids in terms of power distribution and energy management, including the prerequisites for a DC microgrid's planning, operation, and control that must be met before state-of-the-art systems can be implemented.
