Development of a Quality-Based Model for Software Architecture Optimization: A Case Study of Monolith and Microservice Architectures
Abstract
1. Introduction
2. Background
2.1. Monolithic and Microservice Software Architectures
2.2. Software Quality Standards
- Coupling—A software system may encompass multiple components grouped into modules, and components of one module can invoke components of another module. Coupling can be defined as a measure of the interdependence among modules in a software system [4]. In that context, the coupling between components and modules should be reduced to a minimum (see the code sketch after this list).
- Testability—Apart from the programming code, a software system can contain numerous tests (e.g., unit tests, component tests, integration tests). Testability can be defined as the degree of effectiveness and efficiency with which test criteria can be established for a system, product, or component, and tests can be performed to determine whether those criteria have been met [17]. A high level of testability is a required characteristic within the software development process.
- Security—Security considers the ways in which a software system protects information and data so that persons or other products or systems have the degree of data access appropriate to their types and levels of authorization [17]. Given the importance of the information and data held in a software system, security should be maintained at a high level.
- Complexity—As previously stated, a software system incorporates multiple components. Complexity can be defined as the degree to which a system or component has a design or implementation that is difficult to understand and verify [9]. A low level of complexity is a required characteristic within the software development process.
- Deployability—After a feature is implemented or changed, a software system should be deployed to the appropriate environment (e.g., a test, staging, or production environment). Deployability considers all artefacts and activities required to put a software system into operation [9]. A high level of software system deployability is required.
- Availability—Availability can be defined as the degree to which a system or component is operational and accessible when required for use [17]. A software system should maintain a high level of availability.
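To make the coupling and testability attributes concrete, the following minimal Java sketch (the class, method, and test names are hypothetical and not taken from the case-study application) shows a service class that depends on two collaborators, giving it a Coupling Between Objects (CBO) value of 2, together with a JUnit unit test of the kind counted by the Unit Tests and Code Coverage metrics used later in the evaluation.

```java
// Hypothetical example: OrderService couples to two collaborators (CBO = 2 for OrderService).
// The unit test below illustrates testability: test criteria can be established
// and verified for the component in isolation.
import org.junit.jupiter.api.Test;
import static org.junit.jupiter.api.Assertions.assertEquals;

class PriceCatalog {                       // collaborator #1
    double priceOf(String item) { return "book".equals(item) ? 10.0 : 25.0; }
}

class TaxCalculator {                      // collaborator #2
    double withTax(double net) { return net * 1.2; }
}

class OrderService {                       // depends on two classes -> CBO = 2
    private final PriceCatalog catalog = new PriceCatalog();
    private final TaxCalculator tax = new TaxCalculator();

    double total(String item) { return tax.withTax(catalog.priceOf(item)); }
}

class OrderServiceTest {                   // contributes to the Unit Tests and
    @Test                                  // Code Coverage metrics
    void totalIncludesTax() {
        assertEquals(12.0, new OrderService().total("book"), 1e-9);
    }
}
```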
3. Evaluation of Monolithic and Microservice Software Architectures
3.1. Technology Stack
3.2. Data Analysis
3.3. Evaluation
- API Gateway—provides an effective way to route requests to APIs (i.e., microservices), as illustrated in the sketch after this list;
- Service Registry—used for microservice registration and management;
- Messaging—enables asynchronous microservice communication via messaging interfaces;
- Spring Cloud Sleuth—enables auto-configuration for distributed tracing.
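As a rough illustration of how the API Gateway and Service Registry components work together, the sketch below shows a Spring Cloud Gateway application that registers with the service registry and routes requests to a hypothetical order-service. The route id, path, and service name are assumptions made for illustration only, and the class presumes that the Spring Cloud Gateway and a discovery-client starter are on the classpath.

```java
import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.cloud.client.discovery.EnableDiscoveryClient;
import org.springframework.cloud.gateway.route.RouteLocator;
import org.springframework.cloud.gateway.route.builder.RouteLocatorBuilder;
import org.springframework.context.annotation.Bean;

@SpringBootApplication
@EnableDiscoveryClient   // registers the gateway itself with the service registry
public class ApiGatewayApplication {

    public static void main(String[] args) {
        SpringApplication.run(ApiGatewayApplication.class, args);
    }

    @Bean
    public RouteLocator routes(RouteLocatorBuilder builder) {
        // "order-service" is a hypothetical microservice name; "lb://" asks the
        // gateway to resolve it through the service registry (load-balanced lookup).
        return builder.routes()
                .route("orders", r -> r.path("/api/orders/**")
                        .uri("lb://order-service"))
                .build();
    }
}
```

The "lb://" scheme delegates instance lookup to the registry, so the gateway does not need hard-coded microservice addresses.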
3.3.1. Coupling
3.3.2. Testability
3.3.3. Security
3.3.4. Complexity
3.3.5. Deployability
- database driver;
- Spring-related libraries;
- data-persistence-related libraries;
- test-related libraries.
3.3.6. Availability
4. Quality-Based Model for Software Architecture Optimization
4.1. The Model for Software Architecture Optimization
- The feature that is the center of a cluster, together with all the features assigned to that cluster, forms one monolith. In this application, the center of the cluster itself is not relevant; what matters is which features are grouped into a cluster (monolith);
- A cluster containing a single feature is a microservice;
- For each metric, a maximal, minimal, or target value is defined;
- Deviations from the maximal, minimal, and target values of the metrics are allowed but should be minimized (a sketch of such an evaluation follows this list).
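The following Java sketch is a minimal illustration, under stated assumptions, of how a candidate partition of features could be evaluated against such metric conditions; it is not the authors' exact mathematical model. It reuses the aggregation pattern observed in Section 3 (e.g., Coupling Between Objects of 3·n + 1 for a monolith of n features and 4 for each single-feature microservice) and counts only deviations beyond a defined maximal value.

```java
import java.util.List;

// A candidate solution is a partition of the k features into clusters:
// a cluster of size 1 is a microservice, a cluster of size > 1 is a monolith.
record Partition(List<Integer> clusterSizes) {

    // Coupling Between Objects contributed by one cluster, following the pattern
    // measured in the case study: 3·n + 1 for a monolith with n features,
    // 4 for a single-feature microservice.
    static int cbo(int n) {
        return n == 1 ? 4 : 3 * n + 1;
    }

    int totalCbo() {
        return clusterSizes.stream().mapToInt(Partition::cbo).sum();
    }

    // One security configuration (and deployment pipeline, AWS instance,
    // healthcheck endpoint) per deployable unit, i.e., per cluster.
    int totalSecurityConfigurations() {
        return clusterSizes.size();
    }

    // Deviation from a maximal allowed value: zero when the limit is met.
    static int deviationAbove(int value, int maxAllowed) {
        return Math.max(0, value - maxAllowed);
    }
}

class PartitionDemo {
    public static void main(String[] args) {
        // Hypothetical candidate for k = 10 features: one monolith with
        // 8 features plus 2 microservices.
        Partition p = new Partition(List.of(8, 1, 1));

        int cbo = p.totalCbo();                               // (3*8 + 1) + 4 + 4 = 33
        int cboDeviation = Partition.deviationAbove(cbo, 36); // 36 = CBO condition for k = 10

        System.out.println("CBO = " + cbo
                + ", security configurations = " + p.totalSecurityConfigurations()
                + ", CBO deviation = " + cboDeviation);
    }
}
```

An optimizer would then search over such partitions and minimize the (weighted) sum of these deviations.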
4.2. The MicroMono Application
5. Threats to Validity
6. Discussion
- the security configuration of each microservice;
- the security configuration of the communication between microservices;
- the configuration of external libraries (e.g., database driver, data-persistence-related libraries, test-related libraries);
- the configuration of each instance located on the cloud infrastructure (a per-service security configuration sketch follows this list).
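To give a sense of the security-related overhead listed above, the sketch below shows the kind of Spring Security configuration class that would have to be repeated, with service-specific details, in every microservice so that calls between services are authenticated. It is a minimal assumed example (JWT-based resource-server support with an externally configured issuer), not the configuration used in the case study.

```java
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.security.config.annotation.web.builders.HttpSecurity;
import org.springframework.security.web.SecurityFilterChain;

// Repeated, with service-specific details, in every microservice: each
// deployable unit carries (and must maintain) its own security configuration.
@Configuration
public class ServiceSecurityConfig {

    @Bean
    SecurityFilterChain securityFilterChain(HttpSecurity http) throws Exception {
        http
            // every endpoint of this service requires an authenticated caller
            .authorizeHttpRequests(auth -> auth.anyRequest().authenticated())
            // calls between microservices carry a JWT; the issuer/decoder is
            // assumed to be configured via application properties
            .oauth2ResourceServer(oauth2 -> oauth2.jwt(jwt -> {}));
        return http.build();
    }
}
```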
- In the case of only one feature (i.e., k = 1), either a monolithic or a microservice software architecture can be applied;
- In the case of multiple features (i.e., k > 1), the importance of the software quality attributes should be examined and a decision made on whether to implement a monolithic or a microservice software architecture.
7. Conclusions
Author Contributions
Funding
Data Availability Statement
Conflicts of Interest
References
- Pressman, R.S.; Maxim, B.R. Software Engineering: A Practitioner’s Approach, 9th ed.; McGraw-Hill Education: New York, NY, USA, 2019.
- Sommerville, I. Software Engineering, 9th ed.; Addison-Wesley Publishing: Boston, MA, USA, 2015.
- Kumar, G.; Bhatia, P.K. Comparative analysis of software engineering models from traditional to modern methodologies. In Proceedings of the 2014 Fourth International Conference on Advanced Computing & Communication Technologies, Rohtak, India, 8–9 February 2014; pp. 189–196.
- Bourque, P.; Fairley, R.E. Guide to the Software Engineering Body of Knowledge (SWEBOK (R)): Version 3.0; IEEE Computer Society Press: Piscataway, NJ, USA, 2014.
- Medvidovic, N.; Taylor, R.N. Software architecture: Foundations, theory, and practice. In Proceedings of the 32nd ACM/IEEE International Conference on Software Engineering-Volume 2, Cape Town, South Africa, 2–8 May 2010; pp. 471–472.
- Medvidovic, N.; Taylor, R. A classification and comparison framework for software architecture description languages. IEEE Trans. Softw. Eng. 2000, 26, 70–93.
- Bass, L.; Clements, P.; Kazman, R. Software Architecture in Practice; Addison-Wesley Professional: Boston, MA, USA, 2003.
- Chen, H.-M.; Kazman, R.; Perry, O. From Software Architecture Analysis to Service Engineering: An Empirical Study of Methodology Development for Enterprise SOA Implementation. IEEE Trans. Serv. Comput. 2010, 3, 145–160.
- ISO/IEC/IEEE 24765:2017 Systems and Software Engineering—Vocabulary. Available online: http://www.iso.org (accessed on 31 July 2022).
- Clements, P.; Garlan, D.; Bass, L.; Stafford, J.; Nord, R.; Ivers, J.; Little, R. Documenting Software Architectures: Views and Beyond; Pearson Education: London, UK, 2002.
- Dragoni, N.; Giallorenzo, S.; Lafuente, A.L.; Mazzara, M.; Montesi, F.; Mustafin, R.; Safina, L. Microservices: Yesterday, today, and tomorrow. In Present and Ulterior Software Engineering; Mazzara, M., Meyer, B., Eds.; Springer: Cham, Switzerland, 2017; pp. 195–216.
- Martin, R.C.; Martin, M. Agile Principles, Patterns, and Practices in C#; Prentice Hall: Hoboken, NJ, USA, 2006.
- Fazio, M.; Celesti, A.; Ranjan, R.; Liu, C.; Chen, L.; Villari, M. Open Issues in Scheduling Microservices in the Cloud. IEEE Cloud Comput. 2016, 3, 81–88.
- Newman, S. Building Microservices: Designing Fine-Grained Systems; O’Reilly Media, Inc.: Sebastopol, CA, USA, 2015.
- Bansiya, J.; Davis, C.G. A hierarchical model for object-oriented design quality assessment. IEEE Trans. Softw. Eng. 2002, 28, 4–17.
- Crosby, P.B. Quality is Free; Signet Book: New York, NY, USA, 1980.
- ISO/IEC 25010:2011 Systems and Software Engineering—Systems and Software Quality Requirements and Evaluation (SQuaRE)—System and Software Quality Models. Available online: http://www.iso.org (accessed on 31 July 2022).
- ISO/IEC 9126:2001 Software Engineering—Product Quality—Part 1: Quality Model. Available online: http://www.iso.org (accessed on 31 July 2022).
- Franch, X.; Carvallo, J. Using quality models in software package selection. IEEE Softw. 2003, 20, 34–41.
- Suresh, Y.; Pati, J.; Rath, S.K. Effectiveness of Software Metrics for Object-oriented System. Procedia Technol. 2012, 6, 420–427.
- Kan, S.H. Metrics and Models in Software Quality Engineering; Addison-Wesley Longman Publishing Co., Inc.: Boston, MA, USA, 2002.
- Balogun, A.; Basri, S.; Mahamad, S.; Abdulkadir, S.; Almomani, M.; Adeyemo, V.; Al-Tashi, Q.; Mojeed, H.; Imam, A.; Bajeh, A. Impact of Feature Selection Methods on the Predictive Performance of Software Defect Prediction Models: An Extensive Empirical Study. Symmetry 2020, 12, 1147.
- Alshuqayran, N.; Ali, N.; Evans, R. A systematic mapping study in microservice architecture. In Proceedings of the 2016 IEEE 9th International Conference on Service-Oriented Computing and Applications (SOCA), Macau, China, 4–6 November 2016; pp. 44–51.
- Chung, L.; do Prado Leite, J.C.S. On non-functional requirements in software engineering. In Conceptual Modeling: Foundations and Applications; Borgida, A.T., Chaudhri, V.K., Giorgini, P., Yu, E.S., Eds.; Springer: Berlin/Heidelberg, Germany, 2009; pp. 363–379.
- Chen, L. Microservices: Architecting for continuous delivery and DevOps. In Proceedings of the 2018 IEEE International Conference on Software Architecture (ICSA), Seattle, WA, USA, 30 April–4 May 2018; pp. 39–397.
- Balalaie, A.; Heydarnoori, A.; Jamshidi, P. Microservices Architecture Enables DevOps: Migration to a Cloud-Native Architecture. IEEE Softw. 2016, 33, 42–52.
- Palmer, S.R.; Felsing, M. A Practical Guide to Feature-Driven Development; Pearson Education: London, UK, 2001.
- ISO/IEC 26550:2015 Software and Systems Engineering—Reference Model for Product Line Engineering and Management. Available online: http://www.iso.org (accessed on 31 July 2022).
- Taibi, D.; Lenarduzzi, V.; Pahl, C. Architectural Patterns for Microservices: A Systematic Mapping Study. In Proceedings of the International Conference on Cloud Computing and Services Science (CLOSER), Funchal, Portugal, 19–21 March 2018; pp. 221–232.
- Hasselbring, W.; Reussner, R.; Jaekel, H.; Schlegelmilch, J.; Teschke, T.; Krieghoff, S. The dublo architecture pattern for smooth migration of business information systems: An experience report. In Proceedings of the 26th IEEE International Conference on Software Engineering (ICSE), Edinburgh, UK, 28–28 May 2004; pp. 117–126.
- Campbell, G.A. Cognitive complexity: An overview and evaluation. In Proceedings of the 2018 International Conference on Technical Debt, Gothenburg, Sweden, 27–28 May 2018; pp. 57–58.
- Atkinson, C.; Bostan, P.; Hummel, O.; Stoll, D. A practical approach to web service discovery and retrieval. In Proceedings of the IEEE International Conference on Web Services (ICWS 2007), Salt Lake City, UT, USA, 9–13 July 2007; pp. 241–248.
- Cardarelli, M.; Iovino, L.; Di Francesco, P.; Di Salle, A.; Malavolta, I.; Lago, P. An extensible data-driven approach for evaluating the quality of microservice architectures. In Proceedings of the 34th ACM/SIGAPP Symposium on Applied Computing, Limassol, Cyprus, 8–12 April 2019; pp. 1225–1234.
- Zimmermann, O. Microservices tenets. Comput. Sci. Res. Dev. 2017, 32, 301–310.
- Calderón-Gómez, H.; Mendoza-Pittí, L.; Vargas-Lombardo, M.; Gómez-Pulido, J.; Rodríguez-Puyol, D.; Sención, G.; Polo-Luque, M.-L. Evaluating Service-Oriented and Microservice Architecture Patterns to Deploy eHealth Applications in Cloud Computing Environment. Appl. Sci. 2021, 11, 4350.
- Al-Naeem, T.; Gorton, I.; Babar, M.A.; Rabhi, F.; Benatallah, B. A quality-driven systematic approach for architecting distributed software applications. In Proceedings of the 27th International Conference on Software Engineering, St. Louis, MO, USA, 15–21 May 2005; pp. 244–253.
- Tahvildari, L.; Kontogiannis, K.; Mylopoulos, J. Quality-driven software re-engineering. J. Syst. Softw. 2003, 66, 225–239.
- Villegas, N.M.; Müller, H.A.; Tamura, G.; Duchien, L.; Casallas, R. A framework for evaluating quality-driven self-adaptive software systems. In Proceedings of the 6th International Symposium on Software Engineering for Adaptive and Self-Managing Systems, Honolulu, HI, USA, 23–24 May 2011; pp. 80–89.
- Mayer, T.; Hall, T. A Critical Analysis of Current OO Design Metrics. Softw. Qual. J. 1999, 8, 97–110.
- Jamshidi, P.; Pahl, C.; Mendonca, N.C.; Lewis, J.; Tilkov, S. Microservices: The Journey So Far and Challenges Ahead. IEEE Softw. 2018, 35, 24–35.
- Hitz, M.; Montazeri, B. Measuring coupling and cohesion in object-oriented systems. In Proceedings of the International Symposium on Applied Corporate Computing (ISACC), Cairns, Australia, 4–6 December 1995; pp. 1–10.
- Jin, W.; Liu, T.; Qu, Y.; Zheng, Q.; Cui, D.; Chi, J. Dynamic structure measurement for distributed software. Softw. Qual. J. 2017, 26, 1119–1145.
- Gamma, E.; Helm, R.; Johnson, R.; Vlissides, J. Design Patterns: Elements of Reusable Object-Oriented Software; Addison-Wesley: Boston, MA, USA, 1995.
- Bächle, M.; Kirchberg, P. Ruby on rails. IEEE Softw. 2007, 24, 105–108.
- Pahl, C.; Jamshidi, P. Microservices: A Systematic Mapping Study. In Proceedings of the 6th International Conference on Cloud Computing and Services Science (CLOSER 2016), Rome, Italy, 23–25 April 2016; Volume 1, pp. 137–146.
- Hasselbring, W.; Steinacker, G. Microservice architectures for scalability, agility and reliability in e-commerce. In Proceedings of the 2017 IEEE International Conference on Software Architecture Workshops (ICSAW), Gothenburg, Sweden, 5–7 April 2017; pp. 243–246.
- Villamizar, M.; Garcés, O.; Ochoa, L.; Castro, H.; Salamanca, L.; Verano, M.; Casallas, R.; Gil, S.; Valencia, C.; Zambrano, A.; et al. Cost comparison of running web applications in the cloud using monolithic, microservice, and AWS Lambda architectures. Serv. Oriented Comput. Appl. 2017, 11, 233–247.
- Crnkovic, I.; Hnich, B.; Jonsson, T.; Kiziltan, Z. Specification, implementation, and deployment of components. Commun. ACM 2002, 45, 35–40.
- Burns, B.; Oppenheimer, D. Design patterns for container-based distributed systems. In Proceedings of the 8th USENIX Workshop on Hot Topics in Cloud Computing (HotCloud 16), Denver, CO, USA, 20–21 June 2016.
- Fowler, M. Patterns of Enterprise Application Architecture; Addison-Wesley Longman Publishing Co., Inc.: Boston, MA, USA, 2002.
- Ampatzoglou, A.; Frantzeskou, G.; Stamelos, I. A methodology to assess the impact of design patterns on software quality. Inf. Softw. Technol. 2012, 54, 331–346.
- Khomh, F.; Gueheneuce, Y.G. Do design patterns impact software quality positively? In Proceedings of the 2008 12th European Conference on Software Maintenance and Reengineering, Athens, Greece, 1–4 April 2008; pp. 274–278.
- Levcovitz, A.; Terra, R.; Valente, M.T. Towards a technique for extracting microservices from monolithic enterprise systems. In Proceedings of the 3rd Brazilian Workshop on Software Visualization, Evolution and Maintenance (VEM), Belo Horizonte, Brazil, 23 September 2015; pp. 97–104.
- Carrasco, A.; Bladel, B.V.; Demeyer, S. Migrating towards microservices: Migration and architecture smells. In Proceedings of the 2nd International Workshop on Refactoring, Montpellier, France, 4 September 2018; pp. 1–6.
- Sneed, H.M. Integrating legacy software into a service oriented architecture. In Proceedings of the Conference on Software Maintenance and Reengineering (CSMR′06), Bari, Italy, 22–24 March 2006; pp. 11–14.
- Bogner, J.; Fritzsch, J.; Wagner, S.; Zimmermann, A. Microservices in Industry: Insights into Technologies, Characteristics, and Software Quality. In Proceedings of the 2019 IEEE International Conference on Software Architecture Companion (ICSA-C), Hamburg, Germany, 25–26 March 2019; pp. 187–195.
- Taibi, D.; Lenarduzzi, V.; Pahl, C. Processes, Motivations, and Issues for Migrating to Microservices Architectures: An Empirical Investigation. IEEE Cloud Comput. 2017, 4, 22–32.
- Fowler, M. Refactoring: Improving the Design of Existing Code; Addison-Wesley Professional: Boston, MA, USA, 2018.
- Mens, T.; Tourwé, T. A survey of software refactoring. IEEE Trans. Softw. Eng. 2004, 30, 126–139.
- Alshayeb, M. Empirical investigation of refactoring effect on software quality. Inf. Softw. Technol. 2009, 51, 1319–1326.
- Fontana, F.A.; Spinelli, S. Impact of refactoring on quality code evaluation. In Proceedings of the 4th Workshop on Refactoring Tools, Honolulu, HI, USA, 22 May 2011; pp. 37–40.
- Hassan, S.; Bahsoon, R. Microservices and their design trade-offs: A self-adaptive roadmap. In Proceedings of the 2016 IEEE International Conference on Services Computing (SCC), San Francisco, CA, USA, 27 June–2 July 2016; pp. 813–818.
| Quality Attribute | Software Metric | Value |
|---|---|---|
| Coupling | Coupling Between Objects | 16 |
| Testability | Unit Tests | 90 |
| | Code Coverage (%) | 94.2 |
| | Covered Conditions | 30 |
| | Uncovered Conditions | 0 |
| Security | Security Hotspots | 3 |
| | Security Configurations | 1 |
| Complexity | Cyclomatic Complexity | 75 |
| | Cognitive Complexity | 15 |
| Deployability | Build Time (min) | 02:07 |
| | Deployment Pipelines | 1 |
| | Deployable Size (MB) | 41.2 |
| | AWS Instances | 1 |
| | Deployment Profiles | 3 |
| Availability | Healthcheck Endpoints | 1 |
| Quality Attribute | Software Metric | Feature A | Feature B | Feature C | Feature D | Feature E |
|---|---|---|---|---|---|---|
| Coupling | Coupling Between Objects | 4 | 4 | 4 | 4 | 4 |
| Testability | Unit Tests | 18 | 18 | 18 | 18 | 18 |
| | Code Coverage (%) | 87.5 | 86.3 | 87.5 | 86.3 | 84.9 |
| | Covered Conditions | 6 | 6 | 6 | 6 | 6 |
| | Uncovered Conditions | 0 | 0 | 0 | 0 | 0 |
| Security | Security Hotspots | 3 | 3 | 3 | 3 | 3 |
| | Security Configurations | 1 | 1 | 1 | 1 | 1 |
| Complexity | Cyclomatic Complexity | 18 | 20 | 18 | 20 | 21 |
| | Cognitive Complexity | 3 | 3 | 3 | 3 | 3 |
| Deployability | Build Time (min) | 01:28 | 01:37 | 01:24 | 01:27 | 01:30 |
| | Deployment Pipelines | 1 | 1 | 1 | 1 | 1 |
| | Deployable Size (MB) | 40.4 | 53.2 | 40.4 | 53.2 | 53.2 |
| | AWS Instances | 1 | 1 | 1 | 1 | 1 |
| | Deployment Profiles | 3 | 3 | 3 | 3 | 3 |
| Availability | Healthcheck Endpoints | 1 | 1 | 1 | 1 | 1 |
| Software Metric | Monolithic, 1 feature | Monolithic, 2 features | Monolithic, 3 features | ... | Monolithic, k features | Microservice, 1 feature | Microservice, 2 features | Microservice, 3 features | ... | Microservice, k features | Δ (Microservice − Monolithic) |
|---|---|---|---|---|---|---|---|---|---|---|---|
| Coupling Between Objects | 4 | 7 | 10 | ... | 3·k + 1 | 4 | 8 | 12 | ... | 4·k | k − 1 |
| Unit Tests | 18 | 36 | 54 | ... | 18·k | 18 | 36 | 54 | ... | 18·k | 0 |
| Covered Conditions | 6 | 12 | 18 | ... | 6·k | 6 | 12 | 18 | ... | 6·k | 0 |
| Cognitive Complexity | 3 | 6 | 9 | ... | 3·k | 3 | 6 | 9 | ... | 3·k | 0 |
| Security Hotspots | 3 | 3 | 3 | ... | 3 | 3 | 6 | 9 | ... | 3·k | 3·(k − 1) |
| Security Configurations | 1 | 1 | 1 | ... | 1 | 1 | 2 | 3 | ... | k | k − 1 |
| Deployment Pipelines | 1 | 1 | 1 | ... | 1 | 1 | 2 | 3 | ... | k | k − 1 |
| AWS Instances | 1 | 1 | 1 | ... | 1 | 1 | 2 | 3 | ... | k | k − 1 |
| Deployment Profiles | 3 | 3 | 3 | ... | 3 | 3 | 6 | 9 | ... | 3·k | 3·(k − 1) |
| Healthcheck Endpoints | 1 | 1 | 1 | ... | 1 | 1 | 2 | 3 | ... | k | k − 1 |
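As a quick arithmetic check of the Δ column, the formulas can be instantiated for k = 5, the number of features in the case-study application, and compared with the values measured in the two tables above (CBO abbreviates Coupling Between Objects, SH Security Hotspots, and DP Deployment Pipelines):

```latex
% Worked check of the \Delta column for k = 5 (features A--E of the case-study application)
\begin{align*}
  \mathrm{CBO}_{\text{mono}}(5) &= 3\cdot 5 + 1 = 16, &
  \mathrm{CBO}_{\text{micro}}(5) &= 4\cdot 5 = 20, &
  \Delta_{\mathrm{CBO}} &= 20 - 16 = 4 = k - 1,\\
  \mathrm{SH}_{\text{mono}}(5) &= 3, &
  \mathrm{SH}_{\text{micro}}(5) &= 3\cdot 5 = 15, &
  \Delta_{\mathrm{SH}} &= 15 - 3 = 12 = 3(k - 1),\\
  \mathrm{DP}_{\text{mono}}(5) &= 1, &
  \mathrm{DP}_{\text{micro}}(5) &= 5, &
  \Delta_{\mathrm{DP}} &= 5 - 1 = 4 = k - 1.
\end{align*}
```

The monolithic values for k = 5 match the measurements for the monolithic application (e.g., CBO = 16), and the microservice values match five single-feature services from the per-feature table.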
| No. of Features | Solution | (0.8, 0.2) | (0.7, 0.3) | (0.6, 0.4) | (0.5, 0.5) | (0.4, 0.6) | (0.3, 0.7) | (0.2, 0.8) |
|---|---|---|---|---|---|---|---|---|
| 10 | # of monoliths | 1 (8) | 1 (7) | 1 (6) | 2 (4, 2) | 1 (4) | 1 (3) | 1 (2) |
| | # of microservices | 2 | 3 | 4 | 4 | 6 | 7 | 8 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s |
| 20 | # of monoliths | 1 (16) | 1 (14) | 1 (12) | 2 (8, 3) | 1 (8) | 2 (5, 2) | 1 (4) |
| | # of microservices | 4 | 6 | 8 | 9 | 12 | 13 | 16 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.0 s |
| 30 | # of monoliths | 1 (24) | 1 (21) | 1 (18) | 1 (15) | 3 (6, 6, 2) | 1 (9) | 1 (6) |
| | # of microservices | 6 | 9 | 12 | 15 | 16 | 21 | 24 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.1 s | 0.0 s | 0.1 s |
| 40 | # of monoliths | 1 (32) | 1 (28) | 1 (24) | 1 (20) | 1 (16) | 2 (10, 3) | 3 (4, 3, 3) |
| | # of microservices | 8 | 12 | 16 | 20 | 24 | 27 | 30 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.0 s | 0.0 s | 0.0 s | 0.0 s | 0.2 s | 0.1 s | 0.2 s |
| 50 | # of monoliths | 1 (40) | 1 (35) | 1 (30) | 4 (22, 2, 2, 2) | 3 (15, 5, 2) | 1 (15) | 2 (8, 3) |
| | # of microservices | 10 | 15 | 20 | 22 | 28 | 35 | 39 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.0 s | 0.0 s | 0.0 s | 0.3 s | 0.8 s | 0.3 s | 0.3 s |
| 100 | # of monoliths | 1 (80) | 1 (70) | 1 (60) | 5 (29, 4, 4, 2, 15) | 3 (29, 8, 5) | 3 (14, 9, 9) | 2 (12, 9) |
| | # of microservices | 20 | 30 | 40 | 46 | 58 | 68 | 79 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.1 s | 0.2 s | 0.2 s | 2.5 s | 2.2 s | 0.8 s | 0.7 s |
| 150 | # of monoliths | 1 (120) | 1 (105) | 1 (90) | 2 (62, 14) | 2 (58, 3) | 6 (36, 6, 2, 2, 2, 2) | 4 (21, 2, 7, 3) |
| | # of microservices | 30 | 45 | 60 | 74 | 89 | 100 | 117 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.4 s | 0.3 s | 0.4 s | 3.4 s | 18.8 s | 2.0 s | 40.8 s |
| 200 | # of monoliths | 1 (160) | 1 (140) | 1 (120) | 7 (49, 8, 37, 2, 3, 4, 3) | 1 (80) | 3 (45, 15, 2) | 4 (28, 11, 2, 2) |
| | # of microservices | 40 | 60 | 80 | 94 | 120 | 138 | 157 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.6 s | 0.6 s | 0.7 s | 33.2 s | 15.9 s | 23.2 s | 17.3 s |
| 250 | # of monoliths | 1 (200) | 1 (175) | 1 (150) | 18 * | 35 ** | 1 (75) | 15 *** |
| | # of microservices | 50 | 75 | 100 | 108 | 116 | 175 | 186 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 0.9 s | 0.9 s | 0.9 s | 198.4 s | 161.9 s | 22.7 s | 74.6 s |
| 300 | # of monoliths | 1 (240) | 1 (210) | 1 (120) | 8 (87, 38, 10, 9, 3, 6, 2, 2) | 5 (76, 17, 22, 6, 3) | 6 (21, 50, 4, 12, 4, 4) | 3 (42, 18, 2) |
| | # of microservices | 60 | 90 | 180 | 143 | 176 | 205 | 238 |
| | deviation | 0 | 0 | 1 | 1 | 1 | 2 | 2 |
| | time used | 1.3 s | 1.3 s | 1.3 s | 41.0 s | 54.5 s | 87.5 s | 37.4 s |
Software metric column abbreviations correspond to the metrics in the comparison table above: CBO = Coupling Between Objects, TU = Unit Tests, CC = Covered Conditions, CCM = Cognitive Complexity, SH = Security Hotspots, SC = Security Configurations, DP = Deployment Pipelines, AWS = AWS Instances, DPF = Deployment Profiles, HE = Healthcheck Endpoints.

| k | | CBO | TU | CC | CCM | SH | SC | DP | AWS | DPF | HE |
|---|---|---|---|---|---|---|---|---|---|---|---|
| | constraint type | < | = | = | = | < | < | > | > | > | > |
| 10 | condition | 36 | 180 | 60 | 30 | 17 | 6 | 6 | 6 | 17 | 6 |
| | monolith | - | - | - | - | - | - | 5 | 5 | 14 | 5 |
| | microservice | 4 | - | - | - | 13 | 4 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 20 | condition | 71 | 360 | 120 | 60 | 32 | 11 | 11 | 11 | 32 | 11 |
| | monolith | - | - | - | - | - | - | 10 | 10 | 29 | 10 |
| | microservice | 9 | - | - | - | 28 | 9 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 30 | condition | 106 | 540 | 180 | 90 | 47 | 16 | 16 | 16 | 47 | 16 |
| | monolith | - | - | - | - | - | - | 15 | 15 | 44 | 15 |
| | microservice | 14 | - | - | - | 43 | 14 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 40 | condition | 141 | 720 | 240 | 120 | 62 | 21 | 21 | 21 | 62 | 21 |
| | monolith | - | - | - | - | - | - | 20 | 20 | 59 | 20 |
| | microservice | 19 | - | - | - | 58 | 19 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 50 | condition | 176 | 900 | 300 | 150 | 77 | 26 | 26 | 26 | 77 | 26 |
| | monolith | - | - | - | - | - | - | 25 | 25 | 74 | 25 |
| | microservice | 24 | - | - | - | 73 | 24 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 100 | condition | 351 | 1800 | 600 | 300 | 152 | 51 | 51 | 51 | 152 | 51 |
| | monolith | - | - | - | - | - | - | 50 | 50 | 149 | 50 |
| | microservice | 49 | - | - | - | 148 | 49 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 150 | condition | 526 | 2700 | 900 | 450 | 227 | 76 | 76 | 76 | 227 | 76 |
| | monolith | - | - | - | - | - | - | 75 | 75 | 224 | 75 |
| | microservice | 74 | - | - | - | 223 | 74 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 200 | condition | 701 | 3600 | 1200 | 600 | 302 | 101 | 101 | 101 | 302 | 101 |
| | monolith | - | - | - | - | - | - | 100 | 100 | 299 | 100 |
| | microservice | 99 | - | - | - | 298 | 99 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 250 | condition | 876 | 4500 | 1500 | 750 | 377 | 126 | 126 | 126 | 377 | 126 |
| | monolith | - | - | - | - | - | - | 125 | 125 | 374 | 125 |
| | microservice | 124 | - | - | - | 373 | 124 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |
| 300 | condition | 1051 | 5400 | 1800 | 900 | 452 | 151 | 151 | 151 | 452 | 151 |
| | monolith | - | - | - | - | - | - | 150 | 150 | 449 | 150 |
| | microservice | 149 | - | - | - | 448 | 149 | - | - | - | - |
| | optimal | - | - | - | - | 1 | - | - | - | - | - |