Simplicity and Complexity in Combinatorial Optimization
Abstract
1. Introduction
2. Background and Problem Setup
2.1. Theoretical Framework
2.2. Problem Description
3. Complexity of Optima, Sets, and Objective Functions
3.1. Bounding the Complexity of Optima
- (i) Enumerate in order all elements x of the set X
- (ii) For each x, evaluate f(x)
- (iii) List the elements in descending order of optimality: x(1), x(2), …
- (iv) Set x* = x(1), print x* and halt
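The four enumeration steps above can be sketched directly in code. The following is a minimal Python illustration assuming maximization, with binary strings of length n standing in for the abstract set X and a toy objective f standing in for the objective function in the text (both are hypothetical placeholders):

```python
from itertools import product

def f(x):
    # Toy objective (hypothetical stand-in): count of '1' characters.
    return x.count("1")

def brute_force_optimum(n):
    # (i) Enumerate in order all elements x of the set X.
    xs = ["".join(bits) for bits in product("01", repeat=n)]
    # (ii) For each x, evaluate f(x).
    scored = [(f(x), x) for x in xs]
    # (iii) List the elements in descending order of optimality.
    scored.sort(reverse=True)
    # (iv) Set x* to the first listed element, print it, and halt.
    x_star = scored[0][1]
    print(x_star)
    return x_star

brute_force_optimum(4)  # prints "1111"
```

Because these four steps form a short program, the optimum can be described by that program together with the index of x* in the ordered list, which is the idea behind bounding the complexity of optima in terms of the complexity of f and the size of X.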
3.2. Complexity of the Set
3.3. Complexity of the Objective Function
3.4. Constraints
4. Simplicity and Complexity in Optima and Extrema
4.1. Simplicity from Simplicity
This means that base 1 is chemically bonded to base 9, base 2 is bonded to base 8, and base 3 is bonded to base 7. The other bases (numbered 4, 5, 6, 10, 11, 12, 13) are not chemically bonded to any other base. Different underlying sequences give rise to different bonding patterns, and these bonding patterns define the RNA secondary structures. For a sequence of length L bases, there are around 1.8^L different possible (valid) secondary structures [39]. Underlying these structures are 4^L possible sequences of length L made up of the letters A, T(U), C, and G. The set of all RNA sequences can be obtained by merely enumerating all possible sequences in order, which is a simple complexity set, assuming we are given L. Also, the mapping [40] which generates a given structure from a sequence is computable [31] for RNA folding (structures are adopted based on their free energy values). Therefore, the set of all structures of length L is simple complexity, given L.
4.2. Complexity from Simplicity
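The bonding pattern in the example above can be written in the standard dot-bracket notation used for RNA secondary structures, where matched parentheses mark bonded pairs and dots mark unbonded bases. A minimal Python sketch, with the pair list and length taken from the example (positions are 1-indexed):

```python
def dot_bracket(L, pairs):
    # Build a dot-bracket string of length L from a list of
    # 1-indexed bonded pairs (i, j) with i < j.
    s = ["."] * L
    for i, j in pairs:
        s[i - 1] = "("  # opening base of the bond
        s[j - 1] = ")"  # closing base of the bond
    return "".join(s)

# Bases 1-9, 2-8, 3-7 are bonded; bases 4, 5, 6, 10, 11, 12, 13 are not.
print(dot_bracket(13, [(1, 9), (2, 8), (3, 7)]))  # (((...)))....
```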
4.3. Complexity from Complexity
4.4. Simplicity from Complexity
- (i) Enumerate in order all sequences s of length L
- (ii) For each s, evaluate f(s) to find the corresponding structure x
- (iii) Record the frequency with which each structure x appears, where x = f(s)
- (iv) Define P(x) as the probability of observing structure x on a random choice of s
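The enumeration above can be illustrated with a toy sequence-to-structure map. In this sketch, f is a hypothetical stand-in for a real folding map (such as the one computed by the ViennaRNA package), chosen only because it is deterministic and many-to-one; the alphabet and length are kept tiny so that exhaustive enumeration is feasible:

```python
from itertools import product
from collections import Counter

ALPHABET = "AUCG"

def f(s):
    # Toy stand-in for a folding map: the sorted multiset of letters
    # in s plays the role of the structure x. Any deterministic
    # many-to-one map would serve for this illustration.
    return "".join(sorted(s))

def phenotype_probabilities(L):
    counts = Counter()
    total = 0
    # (i) Enumerate in order all sequences s of length L.
    for letters in product(ALPHABET, repeat=L):
        s = "".join(letters)
        # (ii) Evaluate f(s) to find the corresponding structure x.
        x = f(s)
        # (iii) Record the frequency with which each structure x appears.
        counts[x] += 1
        total += 1
    # (iv) Define P(x) as the probability of observing structure x
    # on a uniformly random choice of s.
    return {x: c / total for x, c in counts.items()}

P = phenotype_probabilities(3)
print(len(P), sum(P.values()))  # 20 distinct structures, probabilities summing to 1.0
```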
5. Optimization by Algorithmic Probability Sampling
6. Coincidences of Extrema
6.1. Null Expectation for Simultaneously Optimizing Multiple Objective Functions
6.2. Coincidence of Near Optima
7. Discussion
Author Contributions
Funding
Data Availability Statement
Acknowledgments
Conflicts of Interest
References
- Hildebrandt, S.; Tromba, A. The Parsimonious Universe: Shape and Form in the Natural World; Springer: New York, NY, USA, 1996.
- Mezard, M.; Montanari, A. Information, Physics, and Computation; Oxford University Press: New York, NY, USA, 2009.
- Cohn, H. Order and disorder in energy minimization. In Proceedings of the International Congress of Mathematicians 2010 (ICM 2010), Hyderabad, India, 19–27 August 2010; World Scientific: Singapore, 2010; pp. 2416–2443.
- Cohn, H.; Kumar, A. Universally optimal distribution of points on spheres. J. Am. Math. Soc. 2007, 20, 99–148.
- Li, H.; Helling, R.; Tang, C.; Wingreen, N. Emergence of preferred structures in a simple model of protein folding. Science 1996, 273, 666–669.
- Mélin, R.; Li, H.; Wingreen, N.; Tang, C. Designability, thermodynamic stability, and dynamics in protein folding: A lattice model study. J. Chem. Phys. 1999, 110, 1252.
- Dias, C.; Grant, M. Designable structures are easy to unfold. Phys. Rev. E 2006, 74, 042902.
- Greenbury, S.F.; Schaper, S.; Ahnert, S.E.; Louis, A.A. Genetic correlations greatly increase mutational robustness and can both reduce and enhance evolvability. PLoS Comput. Biol. 2016, 12, e1004773.
- Nelson, E.; Teneyck, L.; Onuchic, J. Symmetry and kinetic optimization of proteinlike heteropolymers. Phys. Rev. Lett. 1997, 79, 3534–3537.
- Valverde, S.; Cancho, R.; Sole, R. Scale-free networks from optimal design. EPL (Europhys. Lett.) 2002, 60, 512.
- Cohen, R.; Havlin, S. Scale-free networks are ultrasmall. Phys. Rev. Lett. 2003, 90, 058701.
- Solomonoff, R.J. A Preliminary Report on a General Theory of Inductive Inference (Revision of Report V-131), Contract AF 49(639)-376; Zator Co.: Cambridge, MA, USA, 1960.
- Kolmogorov, A. Three approaches to the quantitative definition of information. Probl. Inf. Transm. 1965, 1, 1–7.
- Chaitin, G.J. A theory of program size formally identical to information theory. J. ACM 1975, 22, 329–340.
- Lloyd, S. Measures of complexity: A nonexhaustive list. IEEE Control Syst. Mag. 2001, 21, 7–8.
- Mitchell, M. Complexity: A Guided Tour; Oxford University Press: Oxford, UK, 2009.
- Turing, A.M. On computable numbers, with an application to the Entscheidungsproblem. Proc. Lond. Math. Soc. 1936, 42, 230–265.
- Li, M.; Vitanyi, P. An Introduction to Kolmogorov Complexity and Its Applications, 4th ed.; Texts in Computer Science; Springer: Berlin/Heidelberg, Germany; New York, NY, USA, 2019.
- Delahaye, J.; Zenil, H. Numerical evaluation of algorithmic complexity for short strings: A glance into the innermost structure of algorithmic randomness. Appl. Math. Comput. 2012, 219, 63–77.
- Leyva-Acosta, Z.; Acuña Yeomans, E.; Hernandez-Quiroz, F. An additively optimal interpreter for approximating Kolmogorov prefix complexity. Entropy 2024, 26, 802.
- Li, Z.; Huang, C.; Wang, X.; Hu, H.; Wyeth, C.; Bu, D.; Yu, Q.; Gao, W.; Liu, X.; Li, M. Lossless data compression by large models. Nat. Mach. Intell. 2025, 7, 794–799.
- Grunwald, P.; Vitányi, P. Shannon information and Kolmogorov complexity. arXiv 2004, arXiv:cs/0410002.
- Bennett, C. The thermodynamics of computation—A review. Int. J. Theor. Phys. 1982, 21, 905–940.
- Kolchinsky, A.; Wolpert, D.H. Thermodynamic costs of Turing machines. Phys. Rev. Res. 2020, 2, 033312.
- Zurek, W. Algorithmic randomness and physical entropy. Phys. Rev. A 1989, 40, 4731.
- Ebtekar, A.; Hutter, M. Foundations of algorithmic thermodynamics. Phys. Rev. E 2025, 111, 014118.
- Ebtekar, A.; Hutter, M. Modeling the arrows of time with causal multibaker maps. Entropy 2024, 26, 776.
- Avinery, R.; Kornreich, M.; Beck, R. Universal and accessible entropy estimation using a compression algorithm. Phys. Rev. Lett. 2019, 123, 178102.
- Martiniani, S.; Chaikin, P.M.; Levine, D. Quantifying hidden order out of equilibrium. Phys. Rev. X 2019, 9, 011031.
- Vitányi, P.M. Similarity and denoising. Philos. Trans. R. Soc. A Math. Phys. Eng. Sci. 2013, 371, 20120091.
- Dingle, K.; Camargo, C.Q.; Louis, A.A. Input–output maps are strongly biased towards simple outputs. Nat. Commun. 2018, 9, 761.
- Cilibrasi, R.; Vitányi, P.M.B. Clustering by compression. IEEE Trans. Inf. Theory 2005, 51, 1523–1545.
- Johnston, I.G.; Dingle, K.; Greenbury, S.F.; Camargo, C.Q.; Doye, J.P.; Ahnert, S.E.; Louis, A.A. Symmetry and simplicity spontaneously emerge from the algorithmic nature of evolution. Proc. Natl. Acad. Sci. USA 2022, 119, e2113883119.
- Calude, C. Information and Randomness: An Algorithmic Perspective; Springer: Berlin/Heidelberg, Germany, 2002.
- Gács, P. Lecture Notes on Descriptional Complexity and Randomness; Boston University, Graduate School of Arts and Sciences, Computer Science Department: Boston, MA, USA, 1988.
- Shen, A.; Uspensky, V.A.; Vereshchagin, N. Kolmogorov Complexity and Algorithmic Randomness; American Mathematical Society: Providence, RI, USA, 2022; Volume 220.
- Hutter, M. The loss rank principle for model selection. In Proceedings of the 20th Annual Conference on Learning Theory (COLT’07), San Diego, CA, USA, 13–15 June 2007; Lecture Notes in Computer Science (LNAI); Volume 4539, pp. 589–603.
- Borges, J.L. The Library of Babel. In Collected Fictions; Penguin Random House Ireland Limited: Dublin, Ireland, 1998.
- Dingle, K.; Schaper, S.; Louis, A.A. The structure of the genotype–phenotype map strongly constrains the evolution of non-coding RNA. Interface Focus 2015, 5, 20150053.
- Lorenz, R.; Bernhart, S.H.; Zu Siederdissen, C.H.; Tafer, H.; Flamm, C.; Stadler, P.F.; Hofacker, I.L. ViennaRNA Package 2.0. Algorithms Mol. Biol. 2011, 6, 26.
- Wolfram, S. Undecidability and intractability in theoretical physics. Phys. Rev. Lett. 1985, 54, 735.
- Wolfram, S. A New Kind of Science; Wolfram Media: Champaign, IL, USA, 2002.
- Vitányi, P. How incomputable is Kolmogorov complexity? Entropy 2020, 22, 408.
- Sethna, J. Statistical Mechanics: Entropy, Order Parameters, and Complexity; Oxford University Press: New York, NY, USA, 2006; Volume 14.
- Levin, L. Laws of information conservation (nongrowth) and aspects of the foundation of probability theory. Probl. Peredachi Informatsii 1974, 10, 30–35.
- Hutter, M.; Legg, S.; Vitanyi, P.M. Algorithmic probability. Scholarpedia 2007, 2, 2572.
- Dingle, K.; Hagolani, P.; Zimm, R.; Umar, M.; O’Sullivan, S.; Louis, A. Bounding phenotype transition probabilities via conditional complexity. J. R. Soc. Interface 2025, 22, 20240916.
- Valle-Perez, G.; Camargo, C.Q.; Louis, A.A. Deep learning generalizes because the parameter-function map is biased towards simple functions. arXiv 2018, arXiv:1805.08522.
- Mingard, C.; Rees, H.; Valle-Pérez, G.; Louis, A.A. Deep neural networks have an inbuilt Occam’s razor. Nat. Commun. 2025, 16, 220.
- Dingle, K.; Batlle, P.; Owhadi, H. Multiclass classification utilising an estimated algorithmic probability prior. Phys. D Nonlinear Phenom. 2023, 448, 133713.
- Hu, T.; Banzhaf, W.; Ochoa, G. How neutrality shapes evolution: Simplicity bias and search. In Proceedings of the Genetic and Evolutionary Computation Conference, Malaga, Spain, 14–18 July 2025; pp. 1008–1016.
- Dingle, K.; Kamal, R.; Hamzi, B. A note on a priori forecasting and simplicity bias in time series. Phys. A Stat. Mech. Its Appl. 2023, 609, 128339.
- Dingle, K.; Alaskandarani, M.; Hamzi, B.; Louis, A.A. Exploring simplicity bias in 1d dynamical systems. Entropy 2024, 26, 426.
- Hamzi, B.; Dingle, K. Simplicity bias, algorithmic probability, and the random logistic map. Phys. D Nonlinear Phenom. 2024, 463, 134160.
- Ma, Y.A.; Chen, Y.; Jin, C.; Flammarion, N.; Jordan, M.I. Sampling can be faster than optimization. Proc. Natl. Acad. Sci. USA 2019, 116, 20881–20885.
- Mingard, C.; Valle-Pérez, G.; Skalse, J.; Louis, A.A. Is SGD a Bayesian sampler? Well, almost. J. Mach. Learn. Res. 2021, 22, 1–64.
- Schmidhuber, J. The speed prior: A new simplicity measure yielding near-optimal computable predictions. In Proceedings of the 15th Conference on Computational Learning Theory (COLT’02), Sydney, Australia, 8–10 July 2002; LNAI; Volume 2375, pp. 216–228.
- Grau-Moya, J.; Genewein, T.; Hutter, M.; Orseau, L.; Deletang, G.; Catt, E.; Ruoss, A.; Wenliang, L.K.; Mattern, C.; Aitchison, M.; et al. Learning universal predictors. In Proceedings of the 41st International Conference on Machine Learning, Vienna, Austria, 21–27 July 2024.
- Orseau, L.; Hutter, M.; Lelis, L.H. Levin tree search with context models. In Proceedings of the 32nd International Joint Conference on Artificial Intelligence (IJCAI’23), Macao, China, 19–25 August 2023; pp. 5622–5630.
- Alaskandarani, M.; Dingle, K. Low complexity, low probability patterns and consequences for algorithmic probability applications. Complexity 2023, 2023, 9696075.
- Kreinovich, V. Coincidences are not accidental: A theorem. Cybern. Syst. 1999, 30, 429–440.
- Dessalles, J.L. Coincidences and the encounter problem: A formal account. arXiv 2011, arXiv:1106.3932.
- Yanofsky, N.S. Kolmogorov complexity and our search for meaning: What math can teach us about finding order in our chaotic lives. In The Best Writing on Mathematics 2019; Princeton University Press: Princeton, NJ, USA, 2019; Volume 8, pp. 208–213.
- Cohn, H.; Kumar, A. Algorithmic design of self-assembling structures. Proc. Natl. Acad. Sci. USA 2009, 106, 9570–9575.
- Bormashenko, E. Fibonacci sequences, symmetry and order in biological patterns, their sources, information origin and the Landauer principle. Biophysica 2022, 2, 292–307.
- Borenstein, Y.; Poli, R. Kolmogorov complexity, optimization and hardness. In Proceedings of the 2006 IEEE International Conference on Evolutionary Computation, Vancouver, BC, Canada, 16–21 July 2006; IEEE: New York, NY, USA, 2006; pp. 112–119.
- Borenstein, Y.; Poli, R. Information perspective of optimization. In Proceedings of the International Conference on Parallel Problem Solving from Nature, Reykjavik, Iceland, 9–13 September 2006; Springer: Berlin/Heidelberg, Germany, 2006; pp. 102–111.
- Bloom, J.D.; Silberg, J.J.; Wilke, C.O.; Drummond, D.A.; Adami, C.; Arnold, F.H. Thermodynamic prediction of protein neutrality. Proc. Natl. Acad. Sci. USA 2005, 102, 606–611.
Share and Cite
Dingle, K.; Hutter, M. Simplicity and Complexity in Combinatorial Optimization. Entropy 2026, 28, 226. https://doi.org/10.3390/e28020226

