Performance Guarantees of Recurrent Neural Networks for the Subset Sum Problem
Abstract
1. Introduction
1.1. Heuristic Algorithm for SSP
1.2. Dynamic Programming for SSP
1.3. Others
1.4. Our Contributions
- We introduce a recurrent neural network, denoted SS-NN, to solve the classical SSP. By defining a new activation function, we develop a dynamic programming equation for the classical SSP; this activation function maps every negative number to −1 and keeps the number of inputs to the neural network fixed. SS-NN is then used to mimic the dynamic programming approach for solving the SSP. We define the mathematical model of each hidden layer of SS-NN and prove its correctness.
- We propose an approximate solution method for a variant of the SSP in which the subset sum must be closest to a given value without exceeding it. The dynamic programming equations for this variant are defined using a rounding granularity and the ReLU activation function. We construct a recurrent neural network, denoted ASS-NN, that mimics this dynamic programming approach to determine an approximate solution. We rigorously prove that ASS-NN correctly solves the SSP and analyze both its time complexity and its approximation error.
- We verify the correctness of the proposed methods and demonstrate, through a series of illustrative examples, that the actual errors of the approximate solutions align with the theoretical error bounds.
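As context for the first contribution, the classical dynamic program that SS-NN is built to mimic can be sketched in a few lines. This is a plain reference implementation of the textbook recurrence, not the network itself; the item values below are illustrative.

```python
# Classical subset-sum DP: B[i][w] is True iff some subset of the
# first i numbers sums exactly to w.
def subset_sum_table(numbers, target):
    n = len(numbers)
    B = [[False] * (target + 1) for _ in range(n + 1)]
    for i in range(n + 1):
        B[i][0] = True  # the empty subset always sums to 0
    for i in range(1, n + 1):
        a = numbers[i - 1]
        for w in range(1, target + 1):
            # either skip item i, or take it (when it fits)
            B[i][w] = B[i - 1][w] or (w >= a and B[i - 1][w - a])
    return B

table = subset_sum_table([3, 5, 12, 4, 34, 7], 9)
print(table[6][9])  # True: 5 + 4 = 9
```

The index `w - a` in the take-branch can go negative, which is exactly the situation the paper's new activation function handles inside the network.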
2. RNNs for an Exact Solution to the SSP
2.1. The First Hidden Layer of SS-NN
2.2. The Second Hidden Layer of SS-NN
2.3. The Third Hidden Layer of SS-NN
2.4. The Fourth Hidden Layer of SS-NN
1. If , then it follows that and = 1. Hence, we obtain .
2. If , then we conclude that = 0. Therefore, we have .
3. If , this implies that and we have . Thus, we conclude that .
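The case analysis above is driven by the activation function introduced in Section 1.4, which maps every negative input to −1. A minimal sketch of that behavior, under the assumption (not stated explicitly in the excerpt) that non-negative inputs pass through unchanged:

```python
def neg_to_minus_one(x):
    # Maps every negative input to -1; non-negative inputs are assumed
    # to pass through unchanged. This keeps a shifted index such as
    # w - a_i inside a fixed range when it would otherwise go negative,
    # so the number of inputs to the network stays fixed.
    return x if x >= 0 else -1

print(neg_to_minus_one(-7))  # -1
print(neg_to_minus_one(5))   # 5
```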
2.5. The Fifth Hidden Layer of SS-NN
3. RNNs for the Approximate Solution to the SSP
3.1. The First Layer of ASS-NN
3.2. The Second Layer of ASS-NN
3.3. The Third Hidden Layer of ASS-NN
3.4. The Final Hidden Layer of ASS-NN
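The ASS-NN layers above mimic a rounded dynamic program for the "largest subset sum not exceeding a target" variant. One standard way to write such a recurrence, with the ReLU clamp made explicit, is sketched below; the granularity K, the pruning rule, and the item values are illustrative assumptions, not the paper's precise equations.

```python
def relu(x):
    # The ReLU activation used in ASS-NN's dynamic-programming layers.
    return max(x, 0)

def approx_best_sum(numbers, target, K):
    # Largest subset sum not exceeding `target`, computed on item sizes
    # rounded down to multiples of the granularity K. Column w of the DP
    # stands for a rounded budget of w * K units.
    m = target // K
    best = [0] * (m + 1)  # best[w]: max true sum with rounded cost <= w
    for a in numbers:
        r = a // K  # rounded cost of this item, in units of K
        for w in range(m, r - 1, -1):  # descending w: each item used once
            # relu mirrors the clamp in the network layers: shifted
            # indices never go below zero.
            cand = best[relu(w - r)] + a
            if best[w] < cand <= target:
                best[w] = cand
    return best[m]

# With K = 1 this reduces to the exact DP; larger K trades accuracy
# for a smaller table.
print(approx_best_sum([3, 5, 12, 4, 34, 7], 31, 1))  # 31
```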
4. Verification and Analysis with Examples
4.1. Example for SS-NN
4.2. Example for ASS-NN
1. Given that , it follows that 3 belongs to the required subset.
2. Given that , it follows that 5 belongs to the required subset.
3. Given that , 12 also belongs to the required subset.
4. Given that , 4 belongs to the required subset.
5. Given that , 34 does not belong to the required subset.
6. Given that , we conclude that 7 is included in the required subset.
7. Given that the second argument of is zero, the solution process concludes here. Ultimately, we determine that the required subset of is .
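The membership decisions above can be reproduced with an ordinary dynamic program plus backtracking. The sketch below assumes the instance S = {3, 5, 12, 4, 34, 7} with target value 31, inferred from the selected elements (every element except 34, summing to 31); the paper's actual target may differ slightly.

```python
def best_subset(numbers, target):
    # Largest subset sum not exceeding `target`, with one optimal
    # subset recovered by backtracking through the DP table.
    n = len(numbers)
    # reach[i][w]: can some subset of the first i numbers sum to w?
    reach = [[False] * (target + 1) for _ in range(n + 1)]
    reach[0][0] = True
    for i in range(1, n + 1):
        a = numbers[i - 1]
        for w in range(target + 1):
            reach[i][w] = reach[i - 1][w] or (w >= a and reach[i - 1][w - a])
    best = max(w for w in range(target + 1) if reach[n][w])
    # Backtrack: item i must be taken iff the remaining sum is
    # unreachable without it.
    subset, w = [], best
    for i in range(n, 0, -1):
        a = numbers[i - 1]
        if not reach[i - 1][w]:
            subset.append(a)
            w -= a
    return best, sorted(subset)

print(best_subset([3, 5, 12, 4, 34, 7], 31))  # (31, [3, 4, 5, 7, 12])
```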
4.3. Error Analysis
5. Discussion
6. Conclusions
Author Contributions
Funding
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
References
| i \ w | −1 | 0 | 1 | 2 | 3 | 4 | 5 | 6 | 7 |
|---|---|---|---|---|---|---|---|---|---|
| 0 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 2 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 3 | 0 | 1 | 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 4 | 0 | 1 | 0 | 0 | 0 | 1 | 0 | 0 | 0 |
| 5 | 0 | 1 | 0 | 0 | 0 | 1 | 1 | 0 | 0 |
| 6 | 0 | 1 | 0 | 0 | 1 | 1 | 1 | 0 | 1 |
| i \ w | 1 | 2 | 3 | 4 | 5 | 6 |
|---|---|---|---|---|---|---|
| 0 | 0 | 0 | 0 | 0 | 0 | 0 |
| 1 | 0 | 0 | 0 | 7 | 7 | 7 |
| 2 | 7 | 7 | 7 | 7 | 34 | 41 |
| 3 | 7 | 11 | 11 | 11 | 38 | 45 |
| 4 | 7 | 19 | 23 | 38 | 50 | 57 |
| 5 | 7 | 19 | 28 | 43 | 55 | 62 |
| 6 | 10 | 22 | 31 | 43 | 55 | 65 |
| w | 1 | 2 | 3 | 4 | 5 | 6 | 7 | 8 | 9 | 10 | 11 | 12 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Approximate value | 102 | 198 | 326 | 436 | 540 | 654 | 757 | 862 | 971 | 1090 | 1175 | 1289 |
| Exact value | 106 | 218 | 327 | 436 | 545 | 654 | 763 | 872 | 981 | 1090 | 1199 | 1304 |
| Error | 4 | 20 | 1 | 0 | 5 | 0 | 6 | 10 | 10 | 0 | 24 | 15 |

| w | 13 | 14 | 15 | 16 | 17 | 18 | 19 | 20 | 21 | 22 | 23 | 24 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Approximate value | 1417 | 1525 | 1614 | 1725 | 1846 | 1961 | 2053 | 2166 | 2274 | 2383 | 2499 | 2593 |
| Exact value | 1417 | 1526 | 1635 | 1744 | 1853 | 1962 | 2071 | 2180 | 2289 | 2398 | 2507 | 2616 |
| Error | 0 | 1 | 21 | 19 | 7 | 1 | 18 | 14 | 15 | 15 | 8 | 23 |

| w | 25 | 26 | 27 | 28 | 29 | 30 | 31 | 32 | 33 | 34 | 35 | 36 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Approximate value | 2708 | 2828 | 2942 | 3044 | 3158 | 3263 | 3365 | 3483 | 3569 | 3683 | 3797 | 3898 |
| Exact value | 2725 | 2834 | 2943 | 3052 | 3161 | 3270 | 3379 | 3488 | 3597 | 3706 | 3815 | 3924 |
| Error | 17 | 6 | 1 | 8 | 3 | 7 | 14 | 5 | 28 | 23 | 18 | 26 |

| w | 37 | 38 | 39 | 40 | 41 | 42 | 43 | 44 | 45 | 46 | 47 | 48 |
|---|---|---|---|---|---|---|---|---|---|---|---|---|
| Approximate value | 4007 | 4111 | 4208 | 4355 | 4432 | 4546 | 4685 | 4750 | 4870 | 4990 | 5109 | 5223 |
| Exact value | 4033 | 4142 | 4251 | 4360 | 4469 | 4578 | 4687 | 4796 | 4905 | 5014 | 5123 | 5231 |
| Error | 26 | 31 | 43 | 5 | 37 | 32 | 2 | 46 | 35 | 24 | 14 | 8 |

| w | 49 | 50 |
|---|---|---|
| Approximate value | 5308 | 5422 |
| Exact value | 5377 | 5422 |
| Error | 29 | 0 |
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content. |
© 2025 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).
Share and Cite
Wang, Z.; Liao, W.; Jin, Y.; Wang, Z. Performance Guarantees of Recurrent Neural Networks for the Subset Sum Problem. Biomimetics 2025, 10, 231. https://doi.org/10.3390/biomimetics10040231