Discriminating and Clustering Ordered Permutations Using Artificial Neural Networks: A Potential Application in ANN-Guided Genetic Algorithms
Abstract
1. Introduction
2. Adaptive Resonance Theory Neural Network
2.1. ART-1
- Step 1:
- The vigilance parameter, learning rate, and weights are initialized as follows:
- Step 2:
- When an input vector is presented to the ART neural network, the recognition layer compares it with the stored patterns and finds the neuron with the maximum net output.
- Step 3:
- Run the vigilance test. A neuron (j) in the recognition layer passes the vigilance test only if
- Step 4:
- If the vigilance test fails, then obscure the current winner and return to Step 2 to find another winning neuron. Repeat the whole process until a winning neuron passes the vigilance test, and then go to Step 5.
- Step 5:
- If no neuron passes the vigilance test, then create a new neuron to accommodate the new input pattern.
- Step 6:
- Adjust and update the weights associated with the winning neuron. The updated bottom-up (feed-forward) and top-down (feedback) weights can be obtained as follows:
- Step 7:
- If there are no more input vectors, stop; otherwise, go to Step 2.
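The steps above can be sketched in code. The following is a minimal illustration only: it uses fast learning (template intersection) and a standard ART-1 choice function, and the variable names and the tie-breaking constant `beta` are ours, not the paper's.

```python
import numpy as np

def art1_cluster(patterns, rho=0.7, beta=1.0):
    """Cluster binary vectors with a simplified ART-1 procedure.

    rho  -- vigilance parameter in (0, 1]
    beta -- small constant in the choice function (tie-breaking)
    """
    prototypes = []          # top-down templates, one per category
    labels = []
    for x in patterns:
        x = np.asarray(x, dtype=bool)
        disabled = set()     # categories obscured after a failed vigilance test
        while True:
            # Step 2: choice function over the recognition layer
            scores = [
                -1.0 if j in disabled
                else np.logical_and(x, w).sum() / (beta + w.sum())
                for j, w in enumerate(prototypes)
            ]
            j = int(np.argmax(scores)) if scores and max(scores) >= 0 else -1
            if j < 0:
                # Step 5: no eligible neuron left -> create a new category
                prototypes.append(x.copy())
                labels.append(len(prototypes) - 1)
                break
            # Step 3: vigilance test
            match = np.logical_and(x, prototypes[j]).sum() / max(x.sum(), 1)
            if match >= rho:
                # Step 6: fast learning -- template becomes the intersection
                prototypes[j] = np.logical_and(prototypes[j], x)
                labels.append(j)
                break
            disabled.add(j)  # Step 4: obscure the winner and search again
    return labels, prototypes
```

For example, with vigilance 0.7, two overlapping patterns fall into one category while a disjoint pattern opens a second one.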
2.2. Improved-ART-1
- (i)
- Columns and rows are arranged in decreasing order of their number of 1s.
- (ii)
- The prototype patterns are stored during the training period according to one of the following equations:
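Improvement (i) can be sketched as a preprocessing step before presenting patterns to the network (the prototype-storage equations of improvement (ii) are not reproduced here; function and variable names are ours):

```python
import numpy as np

def reorder_by_ones(patterns):
    """Sort the rows and columns of a binary pattern matrix by
    decreasing number of 1s before presenting them to ART-1."""
    X = np.asarray(patterns, dtype=int)
    row_order = np.argsort(-X.sum(axis=1), kind="stable")  # rows: most 1s first
    col_order = np.argsort(-X.sum(axis=0), kind="stable")  # columns likewise
    return X[row_order][:, col_order]
```

Presenting denser patterns first tends to stabilize the templates that ART-1 stores early on.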
3. Proposed Permutation to Binary Conversion Methods
3.1. Conversion Method-1
3.2. Conversion Method-2
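The two proposed conversion methods are defined by equations not reproduced in this excerpt. Purely as a generic illustration of mapping an ordered permutation to the binary input that ART-1 expects (not necessarily either of the authors' methods), a permutation can be encoded as a flattened permutation matrix:

```python
def permutation_to_binary(perm):
    """Encode an ordered permutation of jobs 1..N as a flat binary vector:
    bit (pos * N + job - 1) is 1 when `job` occupies position `pos`
    (i.e., a flattened N x N permutation matrix)."""
    n = len(perm)
    bits = [0] * (n * n)
    for pos, job in enumerate(perm):
        bits[pos * n + (job - 1)] = 1
    return bits
```

The resulting vector has exactly N ones, so two permutations differ in as many bit positions as twice the number of jobs placed differently.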
4. Clustering Ordered Permutations
4.1. Experimental Data Generation
4.2. Clustering Performance Criteria
4.2.1. Misclassification
4.2.2. Homogeneity
4.2.3. Average Distance
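The three criteria named above are defined by equations in the full text. The functions below are generic stand-ins, assuming majority-class agreement for misclassification and homogeneity and intra-cluster Hamming distance for average distance; they are not claimed to be the paper's exact formulas.

```python
from collections import Counter
from itertools import combinations

def misclassification(labels, truth):
    """Objects whose cluster disagrees with that cluster's majority true class."""
    clusters = {}
    for lab, t in zip(labels, truth):
        clusters.setdefault(lab, []).append(t)
    return sum(len(m) - Counter(m).most_common(1)[0][1]
               for m in clusters.values())

def homogeneity(labels, truth):
    """Percentage of objects that match their cluster's majority class."""
    n = len(labels)
    return 100.0 * (n - misclassification(labels, truth)) / n

def average_distance(vectors, labels):
    """Mean intra-cluster Hamming distance between binary vectors."""
    clusters = {}
    for v, lab in zip(vectors, labels):
        clusters.setdefault(lab, []).append(v)
    dists = [sum(a != b for a, b in zip(u, v))
             for members in clusters.values()
             for u, v in combinations(members, 2)]
    return sum(dists) / len(dists) if dists else 0.0
```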
4.3. Empirical Investigations
5. Proposed Architecture of the ANN-Guided GA
6. Flow Shop Scheduling—A Case Study
6.1. Assumptions and Notations
6.1.1. Problem Description
6.1.2. Notations
| Indexes and Input Data | |
|---|---|
| I | Total number of stages, where the stages are indexed by i or l; |
| Mi | Total number of machines in stage i, where the machines are denoted by m; |
| N | Total number of jobs, where the jobs are represented by n or p; |
| Pn | A set of pairs of stages for job n, i.e., the processing of job n in stage l is followed by its processing in stage i; |
| Tn,m,i | Processing time for one unit of job n on machine m in stage i; |
| Qn | Batch size of job n; |
| Rm,i | Maximum number of production runs of machine m in stage i, where the production runs are indexed by r; |
| Sm,i,n,p | Setup time on machine m in stage i for processing job n following the processing of job p on this machine; if n = p, the setup may be called a minor setup; |
| An,i | Binary data equal to 1 if the setup of job n in stage i is attached (non-anticipatory), or 0 if this setup is detached (anticipatory); |
| Bn,i | Binary data equal to 1 if job n needs processing in stage i; otherwise, 0; |
| Dn,m,i | Binary data equal to 1 if job n can be processed on machine m in stage i; otherwise, 0; |
| Fm,i | Release date of machine m in stage i; |
| Ω | A large positive number. |
| Variables | |
| Continuous variables: | |
| cn,i | Completion time of job n from stage i; |
| | Completion time of the rth run of machine m in stage i; |
| Binary variables: | |
| xr,m,i,n | Binary variable that takes the value 1 if the rth run on machine m in stage i is for job n, 0 otherwise; |
| zr,m,i | Binary variable that equals 1 if the rth potential run of machine m in stage i has been assigned a job to process, 0 otherwise; |
| cmax | Makespan of the schedule. |
6.2. Components of Pure GA
6.2.1. Solution Representation
6.2.2. Fitness Evaluation
- Case 1:
- If job n is considered the first job to be assigned to machine m, and stage i is the first stage for this job n; then, becomes .
- Case 2:
- If job p is the last job assigned to machine m and i is the first stage to be visited for job n, then becomes .
- Case 3:
- If job n is the first job to be assigned to machine m and this job n has to visit stage l before visiting stage i, then becomes .
- Case 4:
- If job p is the last job to be assigned to machine m, and job n has to visit stage l before visiting stage i, then becomes .
Assignment Rule
- Step 1:
- Decode the sizes of each job for the considered chromosome and obtain the permutation from the chromosome (as shown in Figure 7). Initialize and .
- Step 2:
- Set .
- Step 3:
- If , go to Step 4; otherwise, go to Step 7.
- Step 4:
- If , go to Step 5; otherwise, go to Step 6.
- Step 5:
- Assign job n to one of the available machines m in stage i, where it will be completed at the earliest completion time .
- Step 6:
- If , set and go to Step 4; otherwise, go to Step 7.
- Step 7:
- If , set and go to Step 2; otherwise, go to Step 8.
- Step 8:
- Calculate . Here, defines the makespan, i.e., the time by which all jobs of the considered chromosome are completed in the system.
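The greedy decoding loop of Steps 1–8 can be sketched as follows. This is a minimal illustration under simplifying assumptions: sequence-dependent setups, machine release dates, and batch sizes from the full model are omitted, and all names (`proc_time`, `eligible`, etc.) are hypothetical stand-ins for the paper's notation.

```python
def assign_jobs(permutation, stages, proc_time, eligible):
    """Decode a job permutation into a schedule and return its makespan.

    permutation             -- job visit order decoded from the chromosome
    stages                  -- stage ids in routing order
    proc_time[(job, stage)] -- processing time of the job's batch (absent if
                               the job skips the stage)
    eligible[(job, stage)]  -- machines allowed to process the job there
    """
    machine_free = {}            # (stage, machine) -> time it becomes free
    cmax = 0.0
    for job in permutation:               # Steps 2-3: next job in the sequence
        ready = 0.0                       # completion time of previous stage
        for stage in stages:              # Steps 4-6: walk the job's routing
            if (job, stage) not in proc_time:
                continue                  # job does not visit this stage
            # Step 5: pick the machine giving the earliest completion time
            finish, m = min(
                (max(machine_free.get((stage, m), 0.0), ready)
                 + proc_time[(job, stage)], m)
                for m in eligible[(job, stage)]
            )
            machine_free[(stage, m)] = finish
            ready = finish
        cmax = max(cmax, ready)           # Step 8: makespan over all jobs
    return cmax
```

For instance, with two jobs on one stage with two parallel machines, the second job is routed to the idle machine and the makespan is set by the first job.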
6.2.3. Genetic Operators
Selection Operator
Crossover Operator
- Step 1:
- First, a crossover point is arbitrarily chosen in each parent. All genes to the left of the crossover point are copied from each parent to generate two children, while the positions to the right of the crossover point are left unfilled in the new children (see Figure 8a).
- Step 2:
- The children are then exchanged between the parents, i.e., child 1 goes to parent 2 and child 2 to parent 1. Finally, the missing jobs are filled in, in the relative order in which they appear in the other parent (as shown in Figure 8b).
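The two steps above can be sketched as a one-point order-preserving crossover (an illustrative implementation of the description, with our own function names):

```python
import random

def one_point_order_crossover(parent1, parent2, point=None):
    """One-point crossover for permutations: keep each parent's genes to the
    left of the cut, then fill the remaining positions with the missing jobs
    in the relative order they appear in the OTHER parent."""
    if point is None:
        point = random.randint(1, len(parent1) - 1)
    def make_child(head_parent, order_parent):
        head = head_parent[:point]                       # Step 1: copy left part
        tail = [g for g in order_parent if g not in head]  # Step 2: fill from other
        return head + tail
    return make_child(parent1, parent2), make_child(parent2, parent1)
```

Both children are always valid permutations, since the tail contains exactly the jobs absent from the head.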
Mutation Operator
6.3. Comparative Study: Pure GA vs. Improved-ART-1-Guided GA
6.3.1. Small Prototype Problem
6.3.2. Typical Solution for Small Problem
6.3.3. Performance Analysis When Solving Large Problems
7. Conclusions and Future Research
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
| | ART-1, M1 | ART-1, M2 | Imp. ART-1, M1 | Imp. ART-1, M2 |
|---|---|---|---|---|
| Misclassification | 1834 | 1900 | 22 | 183 |
| Homogeneity (%) | 42 | 35 | 99 | 72 |
| Average Distance | 47 | 55 | 8 | 21 |
| Problem | Network | Conversion Method | Misclassification (M) | Homogeneity H (%) | Average Distance (A.D.) |
|---|---|---|---|---|---|
| P1 (10 Objects) | ART-1 | M1 | 2059 | 26 | 8 |
| P1 (10 Objects) | ART-1 | M2 | 2215 | 26 | 8 |
| P1 (10 Objects) | Imp. ART-1 | M1 | 941 | 39 | 5 |
| P1 (10 Objects) | Imp. ART-1 | M2 | 1407 | 52 | 7 |
| P2 (20 Objects) | ART-1 | M1 | 1829 | 44 | 18 |
| P2 (20 Objects) | ART-1 | M2 | 1984 | 34 | 20 |
| P2 (20 Objects) | Imp. ART-1 | M1 | 58 | 96 | 11 |
| P2 (20 Objects) | Imp. ART-1 | M2 | 170 | 85 | 12 |
| P3 (30 Objects) | ART-1 | M1 | 1921 | 34 | 33 |
| P3 (30 Objects) | ART-1 | M2 | 1726 | 47 | 33 |
| P3 (30 Objects) | Imp. ART-1 | M1 | 46 | 98 | 17 |
| P3 (30 Objects) | Imp. ART-1 | M2 | 176 | 75 | 18 |
| P4 (40 Objects) | ART-1 | M1 | 1834 | 42 | 47 |
| P4 (40 Objects) | ART-1 | M2 | 1900 | 35 | 55 |
| P4 (40 Objects) | Imp. ART-1 | M1 | 22 | 99 | 8 |
| P4 (40 Objects) | Imp. ART-1 | M2 | 183 | 72 | 21 |
| Stage (i) | Number of Parallel Machines (Mi) | Machine (m) | Release Date of Machine m at Stage i (Fm,i) |
|---|---|---|---|
| 1 | 2 | 1 | 0 |
| 1 | 2 | 2 | 40 |
| 2 | 2 | 1 | 40 |
| 2 | 2 | 2 | 80 |
| 3 | 2 | 1 | 0 |
| 3 | 2 | 2 | 120 |
| 4 | 2 | 1 | 80 |
| 4 | 2 | 2 | 240 |
| Job | Batch Size | Operation | Stage | Setup Property | Processing Time on Eligible Machines (m, Tn,m,i) |
|---|---|---|---|---|---|
| 1 | 20 | 1 | 1 | 0 | (1, 6) |
| 1 | 20 | 2 | 2 | 1 | (1, 4)(2, 3.5) |
| 1 | 20 | 3 | 4 | 0 | (1, 6)(2, 6.5) |
| 2 | 50 | 1 | 1 | 0 | (1, 4.5)(2, 4) |
| 2 | 50 | 2 | 2 | 1 | (1, 6.5)(2, 5.5) |
| 2 | 50 | 3 | 3 | 1 | (1, 6.5)(2, 6.5) |
| 3 | 20 | 1 | 1 | 1 | (1, 3.5)(2, 4) |
| 3 | 20 | 2 | 2 | 0 | (1, 3)(2, 3) |
| 3 | 20 | 3 | 3 | 1 | (1, 3)(2, 4.5) |
| 3 | 20 | 4 | 4 | 1 | (2, 5.5) |
| 4 | 20 | 1 | 1 | 1 | (1, 4)(2, 3) |
| 4 | 20 | 2 | 2 | 0 | (1, 4.5)(2, 4) |
| 4 | 20 | 3 | 4 | 0 | (2, 3.5) |
| 5 | 40 | 1 | 1 | 0 | (1, 4)(2, 3) |
| 5 | 40 | 2 | 2 | 1 | (1, 3)(2, 4.5) |
| 5 | 40 | 3 | 3 | 0 | (1, 3.5)(2, 4) |
| 5 | 40 | 4 | 4 | 1 | (1, 5) |
| 6 | 50 | 1 | 1 | 1 | (1, 4) |
| 6 | 50 | 2 | 2 | 1 | (1, 5)(2, 5.5) |
| 6 | 50 | 3 | 3 | 0 | (1, 5.5)(2, 6.5) |
| 7 | 30 | 1 | 1 | 0 | (1, 5)(2, 6) |
| 7 | 30 | 2 | 2 | 1 | (1, 6.5)(2, 5.5) |
| 7 | 30 | 3 | 4 | 1 | (1, 3) |
| 8 | 30 | 1 | 1 | 1 | (1, 6.5)(2, 6.5) |
| 8 | 30 | 2 | 2 | 0 | (1, 3.5)(2, 3.5) |
| 8 | 30 | 3 | 3 | 1 | (1, 5)(2, 5) |
| 8 | 30 | 4 | 4 | 0 | (1, 5)(2, 6.5) |
| Job | Operation | Stage | Eligible Machine (m) | Setup Time (if the Operation Is the First to Be Processed) | Setup Time (Sm,i,n,p) if Job n Follows Job p |
|---|---|---|---|---|---|
| 1 | 1 | 1 | 1 | 120 | (2, 340)(3, 180)(4, 80)(5, 200)(6, 280)(7, 360)(8, 220) |
| 1 | 2 | 2 | 1 | 100 | (2, 100)(3, 80)(4, 260)(5, 400)(6, 320)(7, 180)(8, 240) |
| 1 | 2 | 2 | 2 | 120 | (2, 340)(3, 400)(4, 260)(5, 320)(6, 380)(7, 60)(8, 360) |
| 1 | 3 | 4 | 1 | 100 | (5, 80)(7, 340)(8, 260) |
| 1 | 3 | 4 | 2 | 40 | (3, 300)(4, 240)(8, 260) |
| 2 | 1 | 1 | 1 | 80 | (1, 320)(3, 240)(4, 300)(5, 140)(6, 340)(7, 260)(8, 380) |
| 2 | 1 | 1 | 2 | 60 | (3, 380)(4, 320)(5, 120)(7, 80)(8, 260) |
| 2 | 2 | 2 | 1 | 60 | (1, 240)(3, 140)(4, 100)(5, 300)(6, 280)(7, 60)(8, 140) |
| 2 | 2 | 2 | 2 | 120 | (1, 80)(3, 340)(4, 280)(5, 140)(6, 120)(7, 400)(8, 120) |
| 2 | 3 | 3 | 1 | 100 | (3, 120)(5, 40)(6, 60)(8, 400) |
| 2 | 3 | 3 | 2 | 80 | (3, 260)(5, 140)(6, 100)(8, 140) |
| 3 | 1 | 1 | 1 | 60 | (1, 100)(2, 360)(4, 300)(5, 240)(6, 200)(7, 260)(8, 380) |
| 3 | 1 | 1 | 2 | 120 | (2, 260)(4, 240)(5, 260)(7, 280)(8, 260) |
| 3 | 2 | 2 | 1 | 80 | (1, 60)(2, 260)(4, 60)(5, 120)(6, 100)(7, 400)(8, 100) |
| 3 | 2 | 2 | 2 | 100 | (1, 100)(2, 400)(4, 400)(5, 240)(6, 120)(7, 280)(8, 320) |
| 3 | 3 | 3 | 1 | 100 | (2, 120)(5, 200)(6, 400)(8, 260) |
| 3 | 3 | 3 | 2 | 40 | (2, 220)(5, 360)(6, 200)(8, 240) |
| 3 | 4 | 4 | 2 | 40 | (1, 380)(4, 140)(8, 160) |
| 4 | 1 | 1 | 1 | 40 | (1, 260)(2, 180)(3, 380)(5, 220)(6, 200)(7, 320)(8, 260) |
| 4 | 1 | 1 | 2 | 60 | (2, 400)(3, 360)(5, 80)(7, 280)(8, 300) |
| 4 | 2 | 2 | 1 | 120 | (1, 100)(2, 60)(3, 140)(5, 300)(6, 400)(7, 280)(8, 360) |
| 4 | 2 | 2 | 2 | 60 | (1, 120)(2, 40)(3, 240)(5, 280)(6, 40)(7, 280)(8, 60) |
| 4 | 3 | 4 | 2 | 40 | (1, 260)(3, 320)(8, 180) |
| 5 | 1 | 1 | 1 | 100 | (1, 140)(2, 260)(3, 120)(4, 160)(6, 120)(7, 120)(8, 380) |
| 5 | 1 | 1 | 2 | 120 | (2, 160)(3, 240)(4, 180)(7, 60)(8, 100) |
| 5 | 2 | 2 | 1 | 100 | (1, 140)(2, 140)(3, 160)(4, 340)(6, 160)(7, 320)(8, 120) |
| 5 | 2 | 2 | 2 | 120 | (1, 240)(2, 400)(3, 220)(4, 100)(6, 320)(7, 80)(8, 380) |
| 5 | 3 | 3 | 1 | 80 | (2, 60)(3, 320)(6, 120)(8, 280) |
| 5 | 3 | 3 | 2 | 60 | (2, 220)(3, 280)(6, 160)(8, 220) |
| 5 | 4 | 4 | 1 | 80 | (1, 300)(7, 220)(8, 180) |
| 6 | 1 | 1 | 1 | 40 | (1, 140)(2, 120)(3, 200)(4, 280)(5, 200)(7, 180)(8, 320) |
| 6 | 2 | 2 | 1 | 100 | (1, 160)(2, 60)(3, 340)(4, 300)(5, 100)(7, 160)(8, 280) |
| 6 | 2 | 2 | 2 | 80 | (1, 180)(2, 400)(3, 260)(4, 220)(5, 40)(7, 280)(8, 40) |
| 6 | 3 | 3 | 1 | 100 | (2, 200)(3, 220)(5, 280)(8, 320) |
| 6 | 3 | 3 | 2 | 40 | (2, 340)(3, 240)(5, 40)(8, 100) |
| 7 | 1 | 1 | 1 | 80 | (1, 140)(2, 400)(3, 240)(4, 300)(5, 220)(6, 80)(8, 300) |
| 7 | 1 | 1 | 2 | 80 | (2, 140)(3, 280)(4, 200)(5, 120)(8, 240) |
| 7 | 2 | 2 | 1 | 80 | (1, 100)(2, 260)(3, 160)(4, 340)(5, 260)(6, 60)(8, 280) |
| 7 | 2 | 2 | 2 | 80 | (1, 400)(2, 320)(3, 260)(4, 60)(5, 340)(6, 320)(8, 80) |
| 7 | 3 | 4 | 1 | 40 | (1, 340)(5, 300)(8, 380) |
| 8 | 1 | 1 | 1 | 40 | (1, 60)(2, 320)(3, 400)(4, 260)(5, 120)(6, 380)(7, 280) |
| 8 | 1 | 1 | 2 | 80 | (2, 280)(3, 180)(4, 400)(5, 120)(7, 380) |
| 8 | 2 | 2 | 1 | 100 | (1, 320)(2, 320)(3, 220)(4, 400)(5, 360)(6, 320)(7, 240) |
| 8 | 2 | 2 | 2 | 80 | (1, 240)(2, 160)(3, 200)(4, 260)(5, 360)(6, 60)(7, 140) |
| 8 | 3 | 3 | 1 | 120 | (2, 220)(3, 320)(5, 240)(6, 100) |
| 8 | 3 | 3 | 2 | 80 | (2, 320)(3, 280)(5, 60)(6, 200) |
| 8 | 4 | 4 | 1 | 40 | (1, 300)(5, 60)(7, 120) |
| 8 | 4 | 4 | 2 | 120 | (1, 40)(3, 260)(4, 300) |
| s | n | i | m | Illustration of Steps |
|---|---|---|---|---|
| 1 | 1 | 1 | 1 | Step 1: This is the 1st stage for and the 1st job to be processed on this machine; according to the conditions, Case 1 will be considered; Step 2: Set (n, i) = (1, 1), then = + + × = min |
| 1 | 1 | 1 | 2 | Machine is not eligible for |
| | | | | Decision: is considered at this stage for , where completion time is 240 min. |
| 1 | 1 | 2 | 1 | Step 1: This is the 1st job assigned to this machine at stage ; according to the condition, Case 3 is applied; Step 2: Set (n, i) = (1, 2), = max{ + (1 − ) × ;} + × + × = max{40 + (1 − 1) × 100; } + 20 × 4 + 1 × 100 = max{40; 240} + 180 = 420 min |
| 1 | 1 | 2 | 2 | Step 1: This is the 1st job assigned to this machine at stage ; according to the condition, Case 3 is applied; Step 2: Set (n, i) = (1, 2), = max{ + (1 − ) × ;} + × + × = max{80 + (1 − 1) × 120; } + 20 × 3.5 + 1 × 120 = max{80; 240} + 190 = 430 min |
| | | | | Decision: Since takes less time than , the completion time will be 460 min. |
| 1 | 1 | 3 | 1 | Machine is not required |
| 1 | 1 | 3 | 2 | Machine is not required |
| 1 | 1 | 4 | 1 | Step 1: This is the 1st job assigned to this machine at stage ; according to the condition, Case 3 is applied; Step 2: Set (n, i) = (1, 4), = max{ + (1 − ) × ;} + × + × = max{80 + (1 − 0) × 100;} + 20 × 6 + 0 × 100 = max{180; 420} + 120 = 540 min |
| 1 | 1 | 4 | 2 | Step 1: This is the 1st job assigned to this machine at stage ; according to the condition, Case 3 is applied; Step 2: Set (n, i) = (1, 4), = max{ + (1 − ) × ;} + × + × = max{240 + (1 − 0) × 40; } + 20 × 6.5 + 0 × 40 = max{240; 420} + 130 = 550 min |
| | | | | Decision: Since takes less time than , the completion time of 540 min will be chosen. |
| 2 | 8 | 1 | 1 | Step 1: This is the 1st stage for but not the 1st job for this machine; according to the conditions, Case 2 will be considered; Step 2: Set (n, i) = (8, 1), = + + × = min |
| 2 | 8 | 1 | 2 | Step 1: This is the 1st stage for and the 1st job to be processed on this machine; according to the conditions, Case 1 will be considered; Step 2: Set (n, i) = (8, 1), = + + × = 315 min |
| | | | | Decision: is considered at this stage for , where completion time is 315 min. |
| 2 | 8 | 2 | 1 | Step 1: This is neither the 1st stage for nor the 1st job that needs to be completed on this machine at this stage; according to the conditions, Case 4 is applied; Step 2: Set (n, i) = (8, 2), = max{ + (1 − ) × ;} + × + × = max{420 + (1 − 0) × 320; } + 30 × 3.5 + 0 × 320 = max{420; 315} + 425 = 845 min |
| 2 | 8 | 2 | 2 | Step 1: This is the 1st job assigned to this machine at stage ; according to the condition, Case 3 is applied; Step 2: Set (n, i) = (8, 2), = max{ + (1 − ) × ;} + × + × = 420 min |
| | | | | Decision: Since takes less time than , the completion time will be 420 min. |
| 2 | 8 | 3 | 1 | According to the conditions, Case 3 is applied, where: Set (n, i) = (8, 3), = max{ + (1 − ) × ;} + × + × = 690 min |
| 2 | 8 | 3 | 2 | According to the conditions, Case 3 is applied, Set (n, i) = (8, 3), where = max{ + (1 − ) × ;} + × + × = 650 min |
| | | | | Decision: Since takes less time than , the completion time will be 650 min. |
| 2 | 8 | 4 | 1 | According to the conditions, Case 4 is applied: Set (n, i) = (8, 4), where = max{ + (1 − ) × ;} + × + × = 990 min |
| 2 | 8 | 4 | 2 | According to the conditions, Case 3 is applied: Set (n, i) = (8, 4), where = max{ + (1 − ) × ;} + × + × = 845 min |
| | | | | Decision: Since takes less time than , the completion time of 845 min will be chosen. |
| Problem No. | No. of Jobs (n) | No. of Stages (i) | No. of Parallel Machines (m) | Batch Size (Qn) |
|---|---|---|---|---|
| 1 | 20 | 10 | 4 to 2 | 30 to 60 |
| 2 | 6 | 10 | 5 to 2 | 20 to 50 |
| 3 | 60 | 20 | 4 to 3 | 20 to 50 |
| 4 | 90 | 6 | 4 to 3 | 20 to 50 |
| 5 | 90 | 1 | 6 to 4 | 20 to 50 |
| 6 | 90 | 8 | 8 to 4 | 20 to 50 |
© 2022 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Tahsien, S.M.; Defersha, F.M. Discriminating and Clustering Ordered Permutations Using Artificial Neural Networks: A Potential Application in ANN-Guided Genetic Algorithms. Appl. Sci. 2022, 12, 7784. https://doi.org/10.3390/app12157784