1. Introduction
In the modern information society, computers are a basic work tool. Tasks performed by Information Technology (IT) can be simple, i.e., requiring little computing power (e.g., document processing), or characterized by high computational complexity (e.g., services provided by data centers). This variety of tasks means that many types of IT equipment and computers dedicated to specific applications are available. For example, the market offers ultrabooks (portable computers with low energy consumption), laptops (general-purpose portable computers), and desktops for efficient stationary workstations. From an electrical point of view, however, all IT devices share a common feature: they are nonlinear loads with variable power demand. The nonlinearity stems from the rectifier circuits in their power supplies, while the load variability stems from the way the devices are used, since complex computational tasks require increased power consumption. Therefore, the right selection of computer equipment, in terms of both performance and electricity consumption, can contribute to reducing electricity use, which directly affects the operating costs of the organization. In addition, current computers are equipped with advanced technical solutions that help optimize (minimize) energy consumption, which, given the growing share of IT devices in use and the scale of data centers, is becoming extremely important not only financially, but also for the environment and the sustainable development of the global economy.
Software is an integral part of daily and working life; it is used across industries and companies to manage processes, communication, and data analysis. During our daily commutes, navigation apps inform us about current traffic and optimize our route. Childcare providers use apps to organize schedules, monitor children's safety, and communicate with their parents. In the kitchen, software makes it easier to plan meals, access recipes, and control the cooking process with dedicated applications. In government offices, software allows administrative matters to be handled electronically, reducing waiting times and facilitating access to public services. In healthcare and medicine, software supports diagnosis, treatment, and patient data management, improving the overall quality of care. Most of these applications require interaction with services and components on the global IT network. Software is ubiquitous and has thus enabled a giant leap in civilization; reducing its use or abandoning digital technologies would mean an inevitable regression. However, software requires computers and communication links, which means ever more hardware, and that hardware consumes huge amounts of energy.
A study conducted by the Boston Consulting Group has already shown that the Internet is responsible for 2.4% of total global greenhouse gas emissions, which is equivalent to the entire aviation industry [1]. These data are from 2020. Since then (as of 2025), these figures have risen rapidly, driven among other things by the COVID pandemic, and will continue to rise. This growth is due to the information-based economy, and factors that restrict the free movement of people, such as pandemics, further intensify the process. The ICT sector is now responsible for 3 to 4% of global CO2 emissions, about twice the share of the much more heavily scrutinized aviation sector, and with global data use estimated to grow by 60% this year, the industry could be responsible for up to 14% of global CO2 emissions by 2040 unless significant steps are taken to lower the environmental impact of telecommunication and communication technology companies [2]. Undoubtedly, electricity is one of the most essential resources for modern life. Five years ago, in 2020, many organizations and modern enterprises did not measure energy consumption in as many respects as they do now, although the parameter was, of course, present in the public consciousness and its values have always mattered. Over time, however, the power consumption factor has become far more important, primarily because of its price, but also because its fundamental impact on the environment and the global economy has been recognized. This has created an unprecedented need to look for ways to save this resource and reduce production costs. Measuring energy consumption allows the information society to optimize its use in many ways. Metering specific devices is now necessary because it enables many methods of optimizing energy consumption, allows more precise conclusions to be drawn, and makes it possible to reduce costs or include them in the prices of products or services, as is the case with data centers. The great interest in resources, consumption measurements, and methods of optimizing electricity use is undoubtedly driven by drastically rising prices and large, hard-to-predict fluctuations in those prices. At the moment, there is no way to slow down technological development without a dramatic drop in quality of life. Therefore, we need to consider how best to use existing resources so that future generations can live in a world no worse than the one we live in.
This article is divided into several sections. Section 2 shows where the impulse to begin this study came from. Section 3 presents the methodology of the conducted research, including the research questions. Section 4 shows the results of a literature review organized according to the order of decisions made in software development processes; its aim is to support decision-makers in searching for energy-efficient solutions. Section 5 presents the current state of application of good practices and methods for reducing the energy consumption of software. The final section summarizes the research results, presents our conclusions, and indicates directions for changing the current situation.
The aim of this article is to show the need to optimize the use of electricity by computing devices in accordance with the concept of sustainable development and to analyze methods of reducing energy consumption. The article discusses the most popular techniques for reducing the energy consumption of software, which is of fundamental importance for organizations that provide IT services, translates into the costs of those services, and indirectly affects the future of the global economy.
2. Motivation
At the end of 2021, the English edition of
Le Monde Diplomatique published an article by Guillaume Pitron, “No Such Place as the Cloud” [
3]. The article broadly discusses the impact of digital technologies on the natural environment. The author, as a journalist who has been dealing with the geopolitics of rare earth metal extraction for many years [
4], presents not only the direct impact of the increasing extraction of raw materials needed to build the latest digital equipment, but also discusses the negative consequences of software for the environment of our planet. The problems associated with the digital economy and its impact on the environment are becoming increasingly visible as new technologies develop. Although digitalization brings many benefits, it also involves certain ecological challenges that are worth considering. Here are some of the most important issues:
Energy consumption—The development of the digital economy is associated with a huge demand for energy. Data centers that store data and support cloud services require huge amounts of energy to operate, cool, and store information. Technologies such as artificial intelligence, blockchain, and the Internet of Things (IoT) also increase the demand for computing power, which is associated with large carbon dioxide emissions.
Electrosmog and e-waste—Digitalization is associated with the production and use of many electronic devices, which become waste after some time. E-waste is one of the most dynamically growing types of waste in the world. Old phones, computers, peripherals, and other electronics end up in landfills, which is a big problem related to their recycling and environmental pollution.
CO2 emissions related to data transport—Internet services and data transmission also have an impact on the environment. Sending information over the network requires infrastructure that not only consumes energy but also contributes to greenhouse gas emissions. As more and more processes move to the cloud, the number of servers and forms of network infrastructure increases, which increases total emissions.
Raw material production and extraction—The production of electronic devices requires a large amount of natural resources, such as metal ores, valuable minerals, and rare earths. The extraction of these resources is associated with environmental degradation, water pollution, and the destruction of ecosystems. In addition, electronics production generates large amounts of pollution and waste.
Sustainability and social responsibility—In response to these challenges, many companies, organizations, and governments are starting to implement sustainable development strategies. Examples include investing in green energy (e.g., renewable energy sources for data centers), manufacturing more energy-efficient devices, and developing e-waste recycling technologies. These changes are necessary to minimize the negative impact of the digital economy on the environment.
Sustainable development within digitalization—Another solution could be the implementation of the concept of “green digitalization”. This involves using new digital technologies in a way that supports sustainable development. Examples include energy management systems that help reduce energy consumption in buildings and cities, or platforms supporting energy efficiency and sustainable production.
The challenges related to the digital economy and its impact on the environment are real and require a coordinated approach. In order to minimize the negative effects of technological development, it is necessary to seek innovative solutions that combine the development of digitalization with care for our planet. This is a task not only for technologists, but also for political decision-makers and companies that have an impact on shaping the future of the digital economy.
Figure 1 shows a map of the problems described in the article. Thanks to a new method of estimating environmental costs, Material Input per Service-Unit (MIPS), it turned out that the more advanced the production technology, the higher the environmental costs [
5]. This applies to the most advanced technologies of electronic equipment production. This equipment is used to build the largest infrastructure on our planet.
The reliability requirements of information processing and storage systems further magnify the environmental costs. Data center power systems have backup circuits with high-power diesel generators. Software reliability requirements force the replication of resources across multiple data centers in different geographic locations. Redundancy multiplies the demand for equipment and for the energy to power it. We can consider several questions in this context. Is there an immediate need for all the data collected and transmitted on the Internet? What about photos of cute kittens posted on social networks? It is worth considering the priorities of data importance.
A change in the approach to creating fault-tolerant software, so that it detects a failure and restores all functions once the failure has passed, should reduce the need for hardware redundancy.
Designing infrastructure to handle the maximum network load, which rarely occurs in practice, also contributes to increased power consumption. Another factor increasing demand is the exponential growth of data. Software produces and collects data to analyze network user behavior for marketing and advertising purposes. Companies collect profiles of individual users and resell them at a high price. Various types of spam generators, trolls, botnets, and deepfakes consume power and are responsible for much of this data growth. The flood of unwanted and malicious information also generates a high demand for software to defend against attacks. The Internet of Things also produces large amounts of data transported and stored on the network. Properly designed software could help reduce this burden.
The last negative phenomenon described in the article is automated trading. Specialized algorithmic software generating profits for shareholders creates a significant demand for computing power and data storage in computing centers, thereby introducing vast amounts of CO2 into the atmosphere.
The article’s author cited an estimate that in 2022 the data centers that make up the cloud would consume about 10% of global energy production. All this growing infrastructure is used to store and process data. It has been estimated that the Information and Communication Technology (ICT) sector will be responsible for 9% of all electricity demand by 2030, and for 20% by 2050 [
6]. Most of the efforts to reduce the carbon footprint of ICT have been dedicated to improving hardware energy efficiency. However, it is largely agreed that improving the sustainability of ICT will also need to involve making software more sustainable by reducing the energy consumption or the carbon footprint associated with its deployment [
7]. The potential impact of “Green IT” in general, covering hardware and software, and “Green Coding”, covering ecologically sustainable software in particular, is explained in several articles [
8,
9,
10,
11]. Software is largely responsible for this energy consumption. Reading the aforementioned article came as a surprise to the authors of this paper, people with extensive experience in the ICT industry (programming, IT project management, and education). The picture seen from a great height looks different from the perspective of a person working in the industry. Many practitioners still believe the myth that software is something abstract and intangible and that software production creates something out of nothing. It is difficult to understand how these intangible entities can threaten the natural environment or limit the ability of future generations to meet their basic needs. However, the pollution generated by digital technology is a fact. After recognizing the importance of the problem, the authors decided to look for solutions to improve the situation in their own areas of competence.
3. Research Method
In this article, we followed the guidelines for conducting a systematic literature review [
12], adapted to the capabilities of a small research team and to the guidelines for conducting a lightweight review. We performed the following steps of the procedure for systematically searching for and selecting relevant sources:
Step 1. Define the research objective and research questions.
The aim of this research is to answer the question of how software engineering can contribute to the sustainable development of digital technologies. The systematic literature review and classification of methods used in practice aims to answer the following research questions:
RQ1: What are the possibilities (methods, practices, patterns) for reducing the energy consumption of computer devices?
RQ2: What is the current state of application of existing methods and practices in the software industry?
RQ3: How can existing solutions be initiated in the industry?
RQ4: What methods, tools, and techniques have been proposed to achieve sustainable software development?
RQ5: How should we choose among the available methods to optimize the energy consumed to power all computer devices and new technologies?
Step 2. Conduct a pilot search.
After defining the research objective and research questions, we conducted pilot searches using the Google Scholar search engine and various queries. The results of the initial searches allowed us to orient ourselves in the existing literature and appropriately select the words in the search string.
Step 3. Define the search string.
The query was intended to find articles on software energy consumption or software energy efficiency. The phrase “green software” directly indicates the topic of interest. Hence, the query consisted of six elements connected by logical connectives. As a result, we constructed the following query:
“((((energy OR power) AND (efficiency OR consumption)) OR green) AND software)”.
In addition to the results of the main query, we included words that specified the topic of the searched studies, e.g., technique, practice, methodology, requirement, architecture, algorithm, approximate, survey, etc.
Step 4. Specify the searched article databases.
First, we used the Google Scholar search engine because it is easy to use, provides a wide range of materials in the search, and thus provides a broad perspective on the problem and its solutions.
Figure 2 shows the number of articles meeting the criteria on the following pages of Google Scholar results. Second, we used the IEEE Xplore [
13], ScienceDirect [
14], Springer Link [
15], Scopus [
16], and Web of Science (WoS) [
17] search engines. In order to expand the article database, we also used the snowballing method [
18]. This method refers to a research technique commonly used in literature reviews, data collection, or identifying sources of information. It is a strategy where you start with a few key sources or studies and then progressively identify additional relevant sources through citations and references within those initial sources, much like a snowball gaining size as it rolls. There are two main types of snowballing [
19]:
Backward snowballing: This involves looking at the references or citations in a key paper or source you start with. By reviewing the works cited in that paper, you identify other relevant articles that may be helpful for your research.
Forward snowballing: In contrast, forward snowballing means looking at who has cited the key paper you are working with (e.g., through citation databases like Google Scholar). This allows you to track the development of the research and find newer studies that have built on the original paper.
Both methods allow a body of relevant literature to be found and built systematically without having to search for each paper from scratch; snowballing is often used when conducting systematic reviews in academic research.
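To make the procedure concrete, the following minimal Java sketch illustrates one round of backward and forward snowballing over an in-memory citation graph; the Paper record and the paper identifiers are purely hypothetical and stand in for entries retrieved from a citation database.

```java
import java.util.*;

/** Illustrative sketch (not the full protocols of [18,19]): one round of backward
 *  and forward snowballing over an in-memory citation graph. */
public class SnowballingSketch {

    /** Hypothetical minimal paper record: identifiers of cited and citing papers. */
    record Paper(String id, Set<String> references, Set<String> citedBy) {}

    /** Backward snowballing: follow the reference lists of the seed papers. */
    static Set<String> backward(Collection<Paper> seeds) {
        Set<String> found = new HashSet<>();
        for (Paper p : seeds) found.addAll(p.references());
        return found;
    }

    /** Forward snowballing: follow the papers that cite the seed papers. */
    static Set<String> forward(Collection<Paper> seeds) {
        Set<String> found = new HashSet<>();
        for (Paper p : seeds) found.addAll(p.citedBy());
        return found;
    }

    public static void main(String[] args) {
        Paper seed = new Paper("key-paper",
                Set.of("older-study-a", "older-study-b"),   // its reference list
                Set.of("newer-study-c"));                    // papers that cite it
        System.out.println("Backward candidates: " + backward(List.of(seed)));
        System.out.println("Forward candidates:  " + forward(List.of(seed)));
        // In practice, each candidate is screened against the inclusion and exclusion
        // criteria before it becomes a new seed for the next round.
    }
}
```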
Step 5. Defining the inclusion and exclusion criteria.
We were interested in articles from journals and conferences on software engineering in English. We searched for texts that answered our research questions, i.e., contained descriptions of methods, techniques, and practices, preferably verified in practice. We were also interested in articles, surveys, and reviews showing the state of application of these methods in practice and proposals for improving the situation. The articles had to be publicly available. As a result, we obtained a set of inclusion and exclusion criteria, which we present in
Table 1.
Step 6. Conducting the main search.
We conducted the main search in December 2022. As a result, Google Scholar returned about 1,870,000 results. The results were sorted in order of best matching.
Step 7. Selection of publications by title, keywords, and abstract.
Then, publications were reviewed based on title, abstract, and keywords. At this stage, publications on topics other than software engineering, or dealing with energy consumption by Machine Learning (ML) and Deep Learning (DL) models, as well as publications without full text available, were eliminated. The results were reviewed in pages containing 20 results. The number of matching articles decreased on subsequent pages, so the search was stopped after page 8 (
Figure 2).
Step 8. Read the full text.
We read the remaining 33 publications in full and obtained a list of seven articles suitable for inclusion in the review. This is, of course, far too few to review software energy efficiency methods, examine the situation in the industry, and draw conclusions on how to improve it. Therefore, we decided to use an agile approach and dynamically changed the query.
Another argument for moving away from a rigid protocol is that systematic reviews were created to collect medical research results and then build large data sets from them that would allow for meta-studies. Our goal was different, so we used the IEEE Xplore [
13], ScienceDirect [
14], Springer Link [
15], Scopus [
16], and WoS [
17] search engines to create a lightweight review. We attached words specifying the topic of the searched studies to the individual software engineering disciplines in the order shown in
Figure 3: for example, technique, practice, methodology, requirement, architecture, algorithm, approximate, survey, etc. To retrieve more articles, we used the snowballing method [
20]. This resulted in 98 articles, 68 of which we used in this article.
4. Literature Review
A systematic review of articles in the field of software engineering on the energy consumption of IT systems and applications revealed a wide range of methods and best-practice techniques that address the problem and have been validated by researchers and practitioners in the IT industry.
In this article, we provide an overview of research articles that consider different aspects of software engineering and their impact on energy consumption.
Figure 3 shows them in the order discussed in the article.
Software engineering is the application of a systematic, disciplined, and quantifiable approach to the development, testing, implementation, operation, and maintenance of software systems [
21]. There are various approaches to dividing the above disciplines into processes and sub-processes performed sequentially or in parallel. These approaches are called software life cycle methodologies or models. Each of the processes in these models is supported by recommended, proven practices and techniques. Are there software development methodologies that better support energy-efficient software development? The known literature does not answer this question at this level of generality. Currently, the most commonly used approaches to software development are agile, and with their adoption a greater share of projects ends in success. Since we care about minimizing the environmental burden of software, agile methodologies can be considered the most effective, because other approaches more often end in wasted time, effort, energy, and money. The question of which agile practices most often support the development of efficient software is answered by the authors of a broad review of articles from 2005 to 2017 [
22]. The results of the literature review were verified by means of a survey conducted among experts from the IT industry. The paper identifies six main success factors, which are supported by 36 identified practices. The results show that agile software development methods are emerging as the best way to produce green and sustainable software. A similar opinion is expressed by Dick, who developed a model that integrates Green IT aspects with software engineering processes using agile methods to create “greener” software from scratch [
23]. Therefore, in this paper, we consider software engineering from the point of view of agile methodologies.
4.1. Software Architecture
Designing the architecture of an IT system is the most important design stage in traditional software development methodologies. Its purpose is to divide the system into components, determine how they work together, and select the technology. Working on the architecture of the system under development yields greater cohesion of components and looser coupling between them. Scrum teams have no dedicated architect role; the entire team is responsible for such decisions. Architectural decisions can be made at the very beginning of the development process, but in agile methodologies they are made after software implementation begins, and together they make up an evolving architecture. This architecture consists of emergent decisions, from which the target architecture emerges, and refactoring decisions that change previous decisions. Regardless of when these decisions are made in the software development process, they have a great impact on the energy consumption of the software being created. The architecture design process requires a number of key trade-off decisions concerning how to meet the customer's non-functional requirements and key functional requirements, and it is therefore strongly connected to requirements elicitation. In agile methodologies, we do not know all the requirements at the beginning; we learn the details during project work. For Green IT systems, it is important to have clearly defined, verifiable requirements for the energy efficiency of the software. Seyff et al. describe a Win–Win model of requirements negotiation extended with sustainability [
24]. On this basis, appropriate decisions can be made and verified in practice. Solid support for architectural decisions based on a model that allows forecasting energy consumption with an error below 6% is presented in [
25]. Architectural tactics, i.e., decisions based on quality measures, are proposed by the authors of [
26]. Thanks to the perspective of energy consumption by the software architecture, it was possible to reduce energy consumption by 67% in the example application. Support for making architectural decisions in the context of energy consumption is also provided by decision maps presented in the literature. Maintainability is an important quality in making architectural decisions [
27]. Its relationship with energy consumption by software is the subject of promising preliminary work described in [
28]. Fontana de Nardin et al. developed the Elergy model, which uses the advantages of the elasticity-driven approach of microservice architecture in cloud applications [
29]. It allows us to predict the load changes in individual microservices and make a decision about scaling the service in advance. Its energy efficiency allows us to save up to 27.92% of energy. The methods supporting decision-making at the architectural level and their effectiveness in reducing energy consumption are given in
Table 2.
At a similar level of generality are decisions about the choice of programming language and software technology. Below, we present an overview of the information on this topic.
4.2. Programming Languages
A programming language allows us to express the collected knowledge about the system being built in the form of code. Decisions regarding the selection of the programming language and the related technologies, libraries, and frameworks are made at the beginning of the agile development process. They are related to the technological competences of the developers who make up the project teams. If we are building a monolithic system, it will be difficult to change the decision once it is made. If a microservice architecture is chosen, decision-making can be delayed until subsequent business areas emerge; such an architecture makes it easier to build a system from components created in different technologies. Both the programming language and the related technologies have a strong relationship with the energy consumption of the software being built. In an organization producing green software, decisions regarding the selection of a programming language and technology should be made based on their energy efficiency. The energy consumption of different programming languages has been the subject of many studies in recent years, described in the following articles. Georgiou et al. conducted an experiment comparing the energy consumption of nine basic tasks from the Rosetta Code collection [
17] implemented in seven interpreted and compiled programming languages [
30]. For simple tasks, the lowest energy consumption was measured for compiled languages with optimization enabled. For URL encoding and decoding, Java achieved the highest energy efficiency. Among interpreted languages, JavaScript achieved the highest efficiency.
Pereira et al. compared the energy consumption, execution time, and memory usage of ten programming problems from the Computer Language Benchmark Game (CLBG) collection [
31]. Measuring the execution of programs in 27 tested programming languages on a desktop machine with an Intel Core i5-4460 processor showed that compiled programming languages are usually the fastest and most energy-efficient (C was the best for almost all tested problems) [
32]. In second place in these respects are languages run in virtual machines (Java consumes about twice as much energy as C). Interpreted languages turned out to be the least energy-efficient and slowest (Python consumes about 76 times more energy than C). The results of the research were later validated by the authors using tasks from the aforementioned Rosetta Code repository [
33]. Oliveira et al. compared the energy consumption of a set of applications and benchmarks from Rosetta Code, CLBG, and F-Droid [
34] on the Android mobile platform [
35]. Their results indicate that for intensive computations, JavaScript is more energy-efficient than Java. The authors suggest using a hybrid approach, running Java applications with JavaScript scripts to save energy. The summary of the energy consumption results of popular programming languages is given in
Table 3.
The entire team is responsible for software design in agile methodologies. One of the main decisions in the design of software components is the use of design patterns. Design patterns are proven solutions to problems that often appear in a specific context. The use of patterns allows for better understanding and modification of the code of the created component and increasing the flexibility of the solution [
36]. Feitosa et al. [
1] describe both modern patterns oriented towards increasing energy efficiency and energy consequences of using standard design patterns.
Energy patterns understood as good practices reducing the energy consumption of mobile applications were described by Cruz et al. in [
35]. It contains a catalog of 22 practices obtained by searching GitHub code repositories and then applying thematic analysis [
37] and classification. Schaarschmidt et al. built a framework for describing design patterns for energy-efficient embedded systems [
38]. They proposed a numerical efficiency coefficient to calculate energy savings due to the use of the pattern. The framework was used to describe new and well-known design patterns.
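As an illustration of the kind of practice collected in such catalogs, the following hedged Java sketch shows a simple batching pattern: small pieces of work are grouped so that an energy-hungry resource, such as a radio or disk, is activated once per batch rather than once per item. The class and method names are hypothetical, and the transmission step is only simulated.

```java
import java.util.ArrayList;
import java.util.List;

/** Illustrative sketch of a batching energy pattern: group small pieces of work so
 *  the energy-hungry resource (radio, disk) is woken once instead of many times. */
public class BatchingPatternSketch {

    private final List<String> pending = new ArrayList<>();
    private final int batchSize;

    BatchingPatternSketch(int batchSize) { this.batchSize = batchSize; }

    /** Hypothetical expensive operation: one transmission wakes the radio once. */
    private void transmit(List<String> messages) {
        System.out.println("Sending " + messages.size() + " messages in one connection");
    }

    /** Queue a message; transmit only when a full batch has accumulated. */
    public void send(String message) {
        pending.add(message);
        if (pending.size() >= batchSize) flush();
    }

    /** Flush whatever is pending (e.g., before the app goes to the background). */
    public void flush() {
        if (!pending.isEmpty()) {
            transmit(new ArrayList<>(pending));
            pending.clear();
        }
    }

    public static void main(String[] args) {
        BatchingPatternSketch sender = new BatchingPatternSketch(3);
        sender.send("a"); sender.send("b"); sender.send("c");  // one batched transmission
        sender.send("d");
        sender.flush();                                         // transmit the remainder
    }
}
```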
Table 4 summarizes the research on the impact of design patterns on energy consumption. The next subsection presents the results of the search for works in the field of software implementation.
4.3. Software Implementation
In agile methodologies, software development is performed in iterations lasting several weeks. At the beginning of each iteration, the team details the requirements and chooses which ones to implement. Then, they design, code, and test the emerging software functions. As a result of the iteration, the client receives another version of working software. Based on the feedback, the team adjusts the software implementation to the client’s needs. In the part describing implementation, we present research on reducing the energy consumption of lower-level elements, such as selecting appropriate algorithms, the impact of parallel programming, using approximate calculations, and using good practices.
One way to increase the energy efficiency of software is to choose an appropriate algorithm that consumes less energy. Energy consumption depends on the execution time, which results from the computational complexity of the algorithm. However, computational complexity does not account for all the factors. Among others, the locality of computations matters, as it increases cache utilization, performance, and energy efficiency. There are comparisons of different versions of algorithms on different hardware platforms, which allow us to choose the optimal energy solution. Rashid, Ardito, and Torchiano performed an experiment in which they measured the energy consumption of different sorting algorithms implemented in different programming languages on the Raspberry Pi device [
39]. The results show which sorting algorithms implemented in which language consumed the least energy. Schmitt et al. performed similar experiments comparing the energy efficiency of sorting algorithms on desktop computers [
40]. Collections are abstract data types used to store objects and treat them as a single set of data. They allow operations on the set, e.g., adding, removing, and browsing its elements. There are many types of collections, which differ in properties and implementation. The proper selection of a specific collection for the problem and context increases performance and reduces the energy consumption of the application.
Pereira et al. measured the energy consumption of standard Java collections and used the measurement results to optimize the energy consumption of a set of sample applications [
41]. In preliminary results, they achieved a 6.2% reduction in energy consumption. Hasan et al. created detailed profiles of the energy consumed by standard operations performed on Java abstract collections [
42]. The results of the study show that using the wrong collection, according to the profiles, can result in a 300% increase in energy consumption. Optimizing the sample application using WALA (the T. J. Watson Libraries for Analysis, a framework for static analysis of Java bytecode) resulted in a 38% reduction in energy consumption and a speedup of about 1.6. Pinto et al. empirically studied 16 implementations of Java collections and conducted energy consumption measurements [
43]. They obtained significant results in comparisons of individual implementations, where a reduction in energy consumption by a factor of up to 2.19 was achieved. For the real Tomcat and Xalan applications, an improvement of 17% was obtained. As we can see, a significant reduction in energy consumption can be achieved by using appropriate types and implementations of collections.
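The following minimal Java sketch, which is not the measurement setup of the cited studies, illustrates why the choice of collection matters: the same index-based access pattern does constant-time work per element on an ArrayList but traverses the chain on a LinkedList, so it runs much longer and, at comparable power draw, consumes correspondingly more energy.

```java
import java.util.ArrayList;
import java.util.LinkedList;
import java.util.List;

/** Illustrative sketch: the same access pattern performs very different amounts of
 *  work depending on the collection chosen, which translates into time and energy. */
public class CollectionChoiceSketch {

    static long sumByIndex(List<Integer> list) {
        long sum = 0;
        // ArrayList answers get(i) in constant time; LinkedList walks the chain,
        // so the same loop performs far more memory traffic on a LinkedList.
        for (int i = 0; i < list.size(); i++) sum += list.get(i);
        return sum;
    }

    public static void main(String[] args) {
        List<Integer> arrayList = new ArrayList<>();
        List<Integer> linkedList = new LinkedList<>();
        for (int i = 0; i < 30_000; i++) { arrayList.add(i); linkedList.add(i); }

        long t0 = System.nanoTime();
        sumByIndex(arrayList);
        long t1 = System.nanoTime();
        sumByIndex(linkedList);
        long t2 = System.nanoTime();
        System.out.printf("ArrayList: %d ms, LinkedList: %d ms%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
    }
}
```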
An important class of frequently used algorithms with high computational complexity are cryptographic algorithms. Classical algorithms used until recently are threatened on the one hand by the growing power of conventional computers and on the other hand by the ever-closer perspective of introducing quantum computers with enormous computational power. Therefore, we tend to extend hashes and cryptographic keys and introduce new, more complex algorithms. In 2016, the American National Institute of Standards and Technology (NIST) began the process of standardizing post-quantum cryptography (PQC) algorithms [
44]. The energy consumption of these algorithms is particularly important for mobile devices or IoT. C. A. Roma et al. conducted research on the energy efficiency of PQC algorithms [
45] running on the Intel Core i7 processor. The obtained results for different algorithms and security levels allowed them to select the appropriate algorithm for the context. Elsadeka et al. investigated the energy efficiency of lightweight cryptographic algorithms standardized by NIST [
46]. Thanks to parallel processing, up to 49% and 28% increases in energy efficiency were achieved, and up to 96% and 45% for the Elephant and ISAP algorithms. Specialized processors are developed for IoT applications to reduce energy consumption. An example of such a solution is an energy-efficient cryptoprocessor supporting the post-quantum version of the TLS protocol [
47]. This is a hybrid solution containing a low-power RISC-V microprocessor core, on which the software part is implemented, and hardware accelerators for AES, ECC, and SHA2 encryption. Thanks to the use of accelerators, the energy consumption spent on encryption was reduced by two to eight times. Even better energy efficiency results were obtained thanks to the hardware implementation of the lightweight cryptography algorithm PRESENT [
48] for IoT devices. The authors report an increase in energy efficiency of 63 times compared to the standard AES algorithm. The most energy-efficient cryptographic algorithms were implemented in hardware for use in IoT devices. Maybe it is time to consider hardware support for cryptography in desktop devices and server solutions for data centers?
Table 5 summarizes the results of work on energy-efficient algorithms.
4.4. Parallel Programming
Parallel programming is the process of dividing large problems into smaller ones and solving them in parallel. The interest in parallel programming is largely due to the physical limitations of increasing the frequency of processors. For this reason, modern processors consist of many cores enabling parallel operation.
A comprehensive review of software methods for improving the energy efficiency of parallel processing includes the most advanced methods for supercomputers intended for scientific computing [
49]. This 2017 article presents energy efficiency as a major problem in the design of modern computing systems from supercomputers to laptops with multicore processors. The theoretical models described in the article show that energy efficiency is dependent on both the application and the platform; energy consumption and performance are closely correlated, and the choice of processor frequency has a smaller impact on efficiency than changing parallelism. The article presents methods for improving energy efficiency in five groups: resource management, parallelization optimization, communication, automatic energy tuning, and approximation. Although advanced methods are mainly related to scientific computations performed on supercomputers, they can be an inspiration for software solutions for data centers. Kambadur and Kim worked on discovering configurations and parameters that will reduce the energy consumption of parallel applications [
50]. The authors measured the energy consumption and execution time of 41 test applications in 220 configurations. The results indicate that the most important factors for low energy consumption are the efficiency of parallelization and optimizations introduced by the compiler.
The literature review [
51] also lists two general approaches to minimizing energy consumption in multithreaded processing. Both reduce energy consumption by effectively managing threads. The first one, DVFS (Dynamic Voltage and Frequency Scaling), adjusts the supply voltage and changes the processor clock frequency depending on the current load. However, this approach does not work well when using multithreaded processors. To overcome this limitation, the authors used the thread shuffling technique, which reduced energy consumption by 56% [
52]. In the HERMES solution, also based on DVFS, a strategy based on the thief–victim approach [
53] is used. It consists of two algorithms, workpath-sensitive and workload-sensitive, which adjust the clock frequency of the processor core to its load. As a result of using the HERMES approach, the energy consumption can be reduced by about 3–4%.
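The following Java sketch illustrates the basic lever discussed above under simplifying assumptions: because energy is roughly average power multiplied by execution time, an efficiently parallelized CPU-bound task that finishes sooner usually consumes less total energy, provided the additional cores are actually kept busy. The sketch only compares wall-clock times; it does not measure energy.

```java
import java.util.stream.LongStream;

/** Illustrative sketch (not the setups of the cited studies): parallelizing a
 *  CPU-bound computation shortens wall-clock time; since energy is roughly average
 *  power times time, efficient parallelization usually lowers the job's total energy. */
public class ParallelEnergySketch {

    static double sumOfSquareRoots(long n, boolean parallel) {
        LongStream range = LongStream.rangeClosed(1, n);
        if (parallel) range = range.parallel();          // spread the work over all cores
        return range.mapToDouble(v -> Math.sqrt(v)).sum();
    }

    public static void main(String[] args) {
        long n = 50_000_000L;
        long t0 = System.nanoTime();
        sumOfSquareRoots(n, false);
        long t1 = System.nanoTime();
        sumOfSquareRoots(n, true);
        long t2 = System.nanoTime();
        System.out.printf("sequential: %d ms, parallel: %d ms%n",
                (t1 - t0) / 1_000_000, (t2 - t1) / 1_000_000);
    }
}
```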
Table 6 summarizes the research results on the impact of parallel implementation on energy efficiency. The next subsection lists solutions based on adjusting the accuracy of calculations.
Approximate computing is an approach to reducing the precision of computations in order to save energy. With this approach, we can distinguish techniques based on approximate data types and on approximate iterations and functions. When using approximate data types, the most important problem is separating data that should be calculated precisely from those that can be approximated. Sampson et al. proposed the EnerJ language, which is an extension of the Java language introducing approximate data types [
54]. Using EnerJ annotations, one can mark approximate and precise application data. Thanks to the annotations, the data are placed in approximate memory, and the code processing them has reduced energy consumption with reduced accuracy. The authors’ experiments have shown that it is possible to reduce energy consumption by 10–50% with a small loss of accuracy.
Programmers often use double-precision floating-point variables. In many applications, such precision is not needed. Compared to single-precision variables, they occupy twice as much memory and twice the bandwidth. Dongarra et al. have evaluated the performance and energy consumption of PLASMA and FLAME numerical libraries using single-precision variables and a hybrid approach [
55]. The time and energy consumption were reduced by 50% for the single-precision approach and by 25–30% for the hybrid approach. Loop perforation is a technique that skips some loop iterations so that only a fraction of them are executed, which reduces execution time and energy consumption. Hoffmann et al. have described a technique called code perforation that automatically extends an application to increase performance and energy efficiency at the expense of accuracy [
56]. Applications of this technique include scientific computing, video and audio coding, Monte Carlo simulations, and Machine Learning algorithms. It is based on the SpeedPress compiler, which uses code perforation and the SpeedGuard module that continuously monitors the loss of precision and controls the code perforation accordingly.
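A minimal sketch of the loop perforation idea, not of the SpeedPress compiler itself, is shown below in Java: only every stride-th iteration is executed, trading a bounded loss of accuracy for proportionally less work and energy.

```java
/** Illustrative sketch of loop perforation: execute only every stride-th iteration
 *  and rescale the result, trading bounded accuracy for less work and less energy. */
public class LoopPerforationSketch {

    /** Approximate mean of the samples, visiting only every stride-th element. */
    static double perforatedMean(double[] samples, int stride) {
        double sum = 0;
        int visited = 0;
        for (int i = 0; i < samples.length; i += stride) {
            sum += samples[i];
            visited++;
        }
        return sum / visited;   // fewer iterations, slightly less accurate result
    }

    public static void main(String[] args) {
        double[] samples = new double[1_000_000];
        for (int i = 0; i < samples.length; i++) samples[i] = Math.sin(i);
        System.out.println("full loop  : " + perforatedMean(samples, 1));
        System.out.println("perforated : " + perforatedMean(samples, 4));
    }
}
```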
A very effective approach to saving energy by simplifying computations is memoization. It consists of memorizing the results of a computationally intensive function for each parameter value. Agosta et al. proposed an approach based on dynamic memoization, which consists of the automatic identification of functions to be memorized and the automatic memorization of results for the most frequently used parameters [
57]. They performed the verification for a set of financial functions. In practice, they achieved a reduction in energy consumption of 74% and an increase in efficiency of 79%. Approximate calculations enable a significant reduction in software energy consumption.
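The sketch below shows plain memoization in Java, assuming a pure and expensive function; the cited work automates the identification of such functions and memorizes results dynamically, which this example does not attempt.

```java
import java.util.Map;
import java.util.concurrent.ConcurrentHashMap;
import java.util.function.Function;

/** Illustrative sketch of memoization: results of an expensive pure function are
 *  stored so that repeated calls with the same argument cost a table lookup instead
 *  of a recomputation. */
public class MemoizationSketch {

    /** Wraps a pure function with a cache keyed by its argument. */
    static <T, R> Function<T, R> memoize(Function<T, R> f) {
        Map<T, R> cache = new ConcurrentHashMap<>();
        return arg -> cache.computeIfAbsent(arg, f);
    }

    /** Hypothetical expensive computation, e.g. pricing one financial scenario. */
    static double expensivePrice(int scenario) {
        double acc = 0;
        for (int i = 1; i <= 5_000_000; i++) acc += Math.sin(scenario * 1e-6 * i);
        return acc;
    }

    public static void main(String[] args) {
        Function<Integer, Double> price = memoize(MemoizationSketch::expensivePrice);
        price.apply(42);   // computed once
        price.apply(42);   // served from the cache: near-zero extra work and energy
    }
}
```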
Table 7 summarizes the mentioned studies on this topic.
5. Results
5.1. Good Practices
Best coding practices are sets of rules, formally or informally established by different coding communities, that help software practitioners improve software quality. On the hardware side, one solution is to divide the system into so-called voltage islands, a technique that supplies different parts of the system with a lower voltage depending on the requirements imposed on the operating frequency. Power management and optimization solutions are sought both at the stage of designing integrated circuits (methods at this stage include primarily the selection of hardware, architecture, and design strategies) and in techniques applied during system operation.
Techniques currently used to achieve low energy consumption include innovative low-power modes, intelligent analog peripherals, clock gating, and supply voltage scaling. It should be noted, however, that without well-designed, effectively operating software, it is not possible to use the full functionality provided by the above power reduction techniques. As solutions with ever lower power consumption appear on the microcontroller market, the ultimate limit on the energy savings of systems will increasingly be the software that enables maximum use of the saving modes. For energy-saving solutions, the most important thing is to design the algorithm so that the processor can stay in a state of deep power reduction for as long as possible. Such a long stay in sleep is possible because the microcontroller and its peripherals can be driven by various clock signal sources. Moreover, peripheral systems can operate without the supervision of the processor, allowing it to be turned off and the total consumption of electricity to be reduced. This means that analog-to-digital converters can sample the signal, timers can count pulses, and serial communication modules can send data in energy-saving mode, without the processor having to intervene or be woken from sleep. Therefore, developers should consider using power-saving modes, which are now becoming a standard expectation.
A literature search found one reference on best coding practices for achieving energy-efficient programming. Pinto et al. searched StackOverflow, a popular developer question-and-answer site, for discussions about software energy efficiency [
58]. The authors identified seven main causes of increased energy consumption cited by developers. These include unnecessary resource consumption, background activities, and excessive synchronization. They also identified eight solutions that were most frequently recommended on the resource. They then compared the recommended solutions with the state of the art and found numerous shortcomings. The solutions that are consistent with the state of the art are listed as best practices.
Table 8 lists these practices. For details on the application of the practices and the contexts in which they are effective, we refer to the original article.
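As a hedged illustration of one of the recurring recommendations (avoiding busy waiting and unnecessary background activity), the Java sketch below replaces a hypothetical polling loop with a blocking queue, which lets the waiting thread be parked so the CPU can drop into a low-power state until work actually arrives.

```java
import java.util.concurrent.BlockingQueue;
import java.util.concurrent.LinkedBlockingQueue;

/** Illustrative sketch of one practice family reported in [58] (avoiding busy
 *  waiting): a consumer that blocks on a queue lets the CPU enter a low-power
 *  state instead of burning cycles in a polling loop. */
public class AvoidBusyWaitingSketch {

    public static void main(String[] args) throws InterruptedException {
        BlockingQueue<String> work = new LinkedBlockingQueue<>();

        Thread consumer = new Thread(() -> {
            try {
                while (true) {
                    // take() parks the thread until work arrives; a polling loop
                    // such as "while (work.isEmpty()) {}" would keep a core fully busy.
                    String task = work.take();
                    if (task.equals("STOP")) break;
                    System.out.println("processing " + task);
                }
            } catch (InterruptedException e) {
                Thread.currentThread().interrupt();
            }
        });
        consumer.start();

        work.put("task-1");
        work.put("STOP");
        consumer.join();
    }
}
```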
The remaining literature found refers to bad code smells, which are the result of not applying good practices. Refactoring is the essence of agile methodologies. It is the process of improving the design of existing code without changing its behavior [
60]. Thanks to refactoring, the source code becomes more readable, duplicates are eliminated, errors are easier to find, and the complexity of the system is managed. Without refactoring, the software slowly degrades and it becomes increasingly difficult to make changes to it. Bad code smells are symptoms of deeper structural problems that should be recognized and refactored.
Cruz and Abreu described bad code smells associated with the low energy efficiency of mobile applications [
61]. In the next paper, they proposed a tool for automatic refactoring of source code that fixes five common problems in mobile applications [
62]. Pinto et al. conducted an extensive review of the literature on energy refactorings published in 2015 [
63]. The literature identifies 11 energy efficiency issues and presents the possibility of their refactoring. The following refactoring areas were identified: User Interfaces, CPU Offloading, HTTP Requests, I/O Operations, Continuously Running Apps, Excessive Copy Chains, Embracing Parallelism, GPU Programming, Approximate Programming, Energy Types, and Stream Programming. A detailed discussion of the areas can be found in the aforementioned publication. Şanlıalp et al. conducted an energy efficiency analysis of five refactoring techniques and three of their combinations for 2048 desktop and 2048 mobile applications [
64]. The energy consumption of applications written in C# and Java was measured using the Intel Power Gadget tool [
65]. The analysis found that all refactoring techniques reduced energy consumption, but using a combination of incompatible techniques did not reduce energy consumption as much as expected.
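As an illustration of the I/O-oriented refactoring area listed above, the following Java sketch contrasts a byte-by-byte write with a buffered write; the file names and data size are arbitrary, and the buffered variant issues far fewer system calls and device activations for the same result.

```java
import java.io.BufferedOutputStream;
import java.io.FileOutputStream;
import java.io.IOException;
import java.io.OutputStream;

/** Illustrative sketch of an I/O-oriented energy refactoring: buffering small writes
 *  so the storage device is activated a few times instead of once per byte. */
public class BufferedIoRefactoringSketch {

    static void writeUnbuffered(String path, byte[] data) throws IOException {
        try (OutputStream out = new FileOutputStream(path)) {
            for (byte b : data) out.write(b);          // one system call per byte
        }
    }

    static void writeBuffered(String path, byte[] data) throws IOException {
        try (OutputStream out = new BufferedOutputStream(new FileOutputStream(path))) {
            for (byte b : data) out.write(b);          // flushed to disk in large chunks
        }
    }

    public static void main(String[] args) throws IOException {
        byte[] data = new byte[200_000];
        writeUnbuffered("unbuffered.bin", data);
        writeBuffered("buffered.bin", data);
    }
}
```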
Table 9 presents a summary of energy efficiency refactoring methods.
Work on reducing software energy consumption requires measuring the actual energy consumed. Thanks to this feedback, we can assess the effectiveness of the solutions used and steer changes in the right direction. Sometimes the total energy consumed by software and hardware must be measured, and sometimes the energy consumption of a specific hardware component and of the software component cooperating with it. Hardware methods, which treat the system as a black box, are best for measuring total energy consumption. To optimize energy efficiency, however, we most often need information about the impact of a software component on energy consumption; in such cases, the best measurement methods are white-box methods, i.e., software methods. The simplest solution used in the literature is the standard electrical engineering method of measuring the total energy consumed by the device on which the software is running. Wattmeters can be used to determine the energy consumption of software in mobile and embedded devices as well as in computers and servers.
The studies described in the articles [
39] and [
40] use the NI USB-6210 [
65] and Yokogawa WT210 power analyzer [
66] devices. Another of these devices is Watts Up? [
67]—most often used to study the energy consumption of mobile devices. This is a power meter that saves up to 2000 measurements in the device’s memory, which is read and presented in the application on a PC. The application communicates with the device via a USB connection. GreenMiner [
68] is a device dedicated to testing the energy consumption of smartphones. It physically measures the energy consumption of mobile devices (Android phones) and automates application testing and reporting measurements to developers and researchers. There are also specialized devices, such as EET (Energy Efficiency Tester) [
69]. It is a hardware energy efficiency meter used to measure power and store measurement results for a PC running the analyzed software. It was designed as a component of the FEETINGS framework (Framework for Energy Efficiency Testing to Improve eNvironmental Goals of the Software). The framework supports automatic collection of energy consumption data from the most important components of the computer, along with processing and visualization of the collected data. SEFLab (Software Energy Footprint Lab) [
70] is a laboratory consisting of a server and a measuring computer. The server is equipped with component power sensors (processors, motherboard, memory, hard drive, fans). The measuring computer uses transducers to read the power consumption of the server components caused by the software running on it.
5.2. Software Tools
Software tools enable white-box measurements. This gives them a big advantage over hardware tools, as they enable the measurement of energy consumption by components of the measured system, sometimes even with a resolution down to the methods or functions of the tested software. Measuring energy consumption before and after changes to a component or function allows for making rational decisions to increase the energy efficiency of the software being created. Profilers are specialized tools that allow for dynamic analysis of a program in terms of energy consumption. Software methods are mostly based on interfaces provided by modern processors. One of them is the Running Average Power Limit (RAPL) developed by Intel [
71,
72]. It is used to report the cumulative energy consumption of various domains: the CPU system, the attached DRAM memory, and the embedded graphics processor. Below is a list of tools described in the review articles or used in the described studies. These tools work on different hardware platforms and provide different measurement resolutions. The AEP (Android Energy Profiler) [
73] provides detailed information about the performance and energy consumption of any Android app in real time, without the need for source code or external power meters. Android Runner (AR) [
74] is a framework for automatically executing measurement-based experiments in native and web applications. The main goal of AR is to streamline the execution of measurement-based experiments on Android devices. BitWatts [
75] relies on distributed middleware to collect information about process utilization and infer detailed energy consumption without the need to invest in hardware (e.g., power meters).
IgProf [
76] is an application profiler developed at CERN for scientific computing workloads. The energy profiler is based on sampling and obtains energy measurements from the Running Average Power Limit (RAPL) interface present in the latest Intel processors. Jalen [
77] is a software-level profiler. It is responsible for profiling running applications and estimating their energy consumption at a higher level, i.e., at the thread or method level. Jolinar [
78] is an open-source energy measurement tool that allows profiling a specific application at a given time. Jolinar can profile the energy consumed by various resources, such as CPU, RAM, and disk, except for the network card. jRAPL [
79] is an open-source library that provides a set of APIs for profiling Java programs running on processors with Running Average Power Limit (RAPL) support. PETRA (Power Estimation Tool for Android) [
80] is able to measure the power consumption of Android applications based on the tools and APIs provided with the publicly available Project Volta. The tool provides an estimated measurement of power consumption at the method level.
PowerAPI [
81] is an operating system-level system monitoring library. It estimates the power consumption of running processes in real time, based on raw information collected from hardware devices (e.g., CPU, network card) via the operating system. Power Gadget 3.7.0 [
64] is a software-based tool for monitoring the power consumption of Intel Core processors (2nd to 7th generation). It is supported on Windows and Mac OS X. It includes an application, driver, and libraries for monitoring and estimating real-time CPU power in watts using in-processor energy meters. The product has been discontinued [
82]. PowerJoular [
83] enables monitoring of power consumption during the operation of many hardware components of different devices and architectures. The initial version monitors the power consumption of CPUs and GPUs in computers and servers, and CPUs in Raspberry Pi devices.
Trepn Profiler [
84] is a device power modeling and performance measurement tool to estimate the energy of the device, components, and smartphone applications. A review of articles on methods and practices for reducing software power consumption also provides information on the measurement methods used. We have compiled a summary of the methods used in the described studies. The most commonly used method in the studies is the use of the RAPL interface [
71,
72]. The second most common is the use of various types of wattmeters. The remaining methods occur in isolated cases. Some of the methods described by us were not used in the studies.
Table 10 shows the complete list of methods used in the studies described in the articles, supplemented with the results of the literature review, sorted from the most frequently used and then alphabetically. This review thus provides a classification of methods, tools, and techniques that support the goal addressed in research question RQ4 on sustainable software development.
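To make the most common measurement method concrete, the following Java sketch reads the cumulative RAPL energy counter that the Linux powercap driver exposes in sysfs and reports the difference around a measured code section. The path assumes an Intel CPU with the intel_rapl driver and package domain 0, reading the file usually requires elevated permissions, and the wrap-around of the counter is ignored here.

```java
import java.nio.file.Files;
import java.nio.file.Path;

/** Illustrative sketch of a white-box measurement with the RAPL interface on Linux:
 *  the powercap driver exposes a cumulative energy counter in microjoules, so the
 *  energy of a code section is the difference between two readings. */
public class RaplMeasurementSketch {

    // Assumed location of the package-domain counter provided by the intel_rapl driver.
    private static final Path ENERGY_UJ =
            Path.of("/sys/class/powercap/intel-rapl:0/energy_uj");

    static long readMicrojoules() throws Exception {
        return Long.parseLong(Files.readString(ENERGY_UJ).trim());
    }

    public static void main(String[] args) throws Exception {
        long before = readMicrojoules();
        double acc = 0;                               // code section under measurement
        for (int i = 0; i < 50_000_000; i++) acc += Math.sqrt(i);
        long after = readMicrojoules();
        System.out.printf("section consumed about %.3f J (acc=%.1f)%n",
                (after - before) / 1_000_000.0, acc);
    }
}
```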
Building energy-efficient software is an increasingly important task for developers [
85]. The cumulative body of knowledge about techniques that support this goal is therefore very important. We conducted a systematic literature review to gather information on existing techniques that allow developers to increase the energy efficiency of applications. Based on a synthesis of the findings from the included primary studies, we propose a taxonomy of techniques for improving the energy efficiency of software.
5.3. Research Results
The problem of the energy consumption of software has been noticed only recently, both by the research community and by practitioners working on software development in industry. In this article, we reviewed methods, techniques, and practices that allow the energy consumption of software to be reduced; they represent the research achievements of roughly the last ten years. For these results to be used in practice to create energy-efficient software, developers should be aware both of the existence of the problem and of the ready-made, effective solutions. In recent years, many authors have investigated this issue using surveys covering different groups of project team members. The surveys concerned awareness of the problem of software energy consumption, of the existence of methods to reduce it, and of supporting tools. The level of knowledge about methods of optimizing efficiency and about measurement methods was also checked, and the respondents stated whether they used these methods and tools in practice. Additionally, one of the authors checked whether beliefs about climate change are correlated with this awareness.
Figure 4 shows a map of the aspects examined in these surveys.
By reviewing these studies, we were able to answer the second research question, RQ2: What is the current state of application of existing methods and practices in the software industry? Pang et al. conducted a survey among 122 software developers in 2013; the results, along with an analysis relating them to the literature, were published in 2016 [86]. The anonymous online survey consisted of 13 questions, 10 of which concerned knowledge of and experience with software energy consumption. Only 18% of respondents said that they considered energy consumption when developing software, 14% treated reducing energy consumption as a non-functional requirement, and 21% had modified software to reduce its energy consumption. To reduce energy consumption, one must first measure it, yet only 10% of respondents reported doing so. This analysis shows that developers have limited knowledge about energy efficiency, are not familiar with best practices for reducing software energy consumption, and are unsure how software consumes energy. These results emphasize the need for energy efficiency training.
Bashroush et al. described the issue of increasing energy consumption in data centers from the perspective of software architects [34]. They conducted a survey among 12 experienced software architects from different types of ICT organizations. When asked about energy consumption challenges in the past five years, only 17% reported that they had dealt with such problems. When asked whether they believed that energy consumption would become a major architectural concern in the next five years, 67% answered positively, yet only 25% agreed that they had the right tools to address energy efficiency problems at the design stage. The reasons for this state of affairs are a lack of knowledge about how design affects software energy efficiency, a lack of cooperation and information flow between organizational structures, and the lack of any correlation between energy savings and the price paid for the service. The conclusions emphasize the need to establish a measure of software energy efficiency so that energy-efficient solutions can be compared and selected.
Pinto and Castor conducted a survey in 2017 among 62 developers working on applications for mobile platforms [63]. Over 68% of respondents were experienced developers with 8 or more years of experience, and 60% of them had experienced problems with the energy consumption of the software they created. When asked whether they had found the main cause of the problem, 50% did not answer; among those who did, the recurring answers included background activities, GPS, and unnecessary resource usage. On the other hand, 32% of respondents did not observe a significant improvement in energy consumption after applying their solutions. Of those who did notice an improvement, only 5% used specialist tools; the rest relied on subjective impressions and indicated websites as their source of information. Despite this, over 67% stated that the problem of energy consumption is important or very important. This shows that developers of applications for mobile platforms are aware of the energy consumption problem, but not of the existence of measurement methods or of the techniques and practices that address it. There is also a lack of tools that break energy measurements down into hardware and software components, and of tools supporting code refactoring.
One of the authors of this article, W. Wysocki, conducted a telephone survey among employees of software development companies in early 2023 [
87]. The main goal of the study was to explain why, despite the existence of many techniques, methods, and practices aimed at reducing software energy consumption, developers so rarely use them in the software industry. The results show that the state of awareness among developers and customers still needs to be improved; little has changed since 2013, when Pang et al. [88] conducted their research. Developers, even when they were not familiar with the problem of software energy efficiency, were able to generate ideas for reducing and measuring energy consumption; their responses indicated flexibility and a willingness to solve problems rather than specific knowledge. We found that few developers working in the software industry have experience in reducing software energy consumption. Interestingly, developers’ belief that humans are not responsible for climate change correlated with not considering software energy efficiency to be an important problem. Developers’ open statements showed that some believe their customers should be the ones to demand reduced software energy consumption, and some believe that using cloud computing ensures the energy efficiency of the software. A conversation with a data center employee revealed that economic factors (the cost of data center services) do not force reduced software energy consumption.
Proposals for changing the situation in the software industry, and consequently in the world, result from reflection on the current state; such reflection is most often undertaken by the authors conducting the surveys.
Figure 5 shows the main factors and their impact on improving the situation. The main problem emerging from the research is that the awareness of developers and customers is too low. Neither group is fully aware of the existence of the problem or of the possibilities for action: developers do not know about existing solutions and tools, and customers do not know that their decisions affect the situation [89]. The introduction of energy consumption standards for software may encourage institutional customers to order energy-efficient software. This, in turn, may trigger training for managers and developers in the software industry, increasing their knowledge and practical skills. The introduction of standardized software efficiency labeling would make it easier for individual customers to decide which application to purchase. A separate path is to introduce the topic of ensuring software efficiency into higher-education curricula; theoretical and practical training would bring prepared, aware professionals into the industry.
The described proposals for changes constitute an answer to the third research question:
RQ3: How can existing solutions be initiated in the industry?
Pang et al., based on their survey results, state that only 3% of respondents received complaints from users about excessive energy consumption [90]. According to them, this may indicate that users are not aware of the problem. Therefore, Zhang et al. propose the introduction of benchmarks, energy consumption standards, and appropriate product labels [91]. Clearly defining the energy consumption class of applications can increase users’ awareness and help them make informed choices about the right product. Pinto et al. [92] write that developers need to be more aware of the energy consumption problem: they currently do not know how to develop and maintain energy-efficient software systems, and they lack sufficient tool support. Bashroush et al. remind us that data centers are subject to standards such as the European Code of Conduct for Data Centres [
93] and the Green Grid Data Centre Maturity Model [
94]. Therefore, a similar software certification process seems inevitable.
The authors, based on the survey results [95], formulate the thesis that the most important factor is awareness of the problem and of existing ready-made solutions. The answer to question RQ5, on how to choose between the available methods for optimizing energy consumption, lies in the ability to measure the reduction in energy consumption achieved by specific methods. This will make it possible to create rules defining the standards to be used for this purpose, which, among other things, will need to be taught as part of certification in the broader IT profession. We should therefore focus on appropriately educating students who, when entering the job market, will change the software industry by becoming conscious developers, architects, managers, and customers. The introduction of appropriate standards and certificates for software will trigger professional training in adapting products to those standards, which will increase the awareness of both producers and institutional users. We must provide users with the skills and knowledge to work more sustainably, hopefully leading to a reduced environmental impact [96]. Through continuous training and education, it will be possible to create a community of researchers whose mindset includes sustainability aspects in the use of computing resources.
6. Conclusions
Currently, the main problem emerging from this research is the low awareness of developers and customers. Both groups are not fully aware of the problem or of the possibilities for action: developers are not aware of existing methods and tools, while customers are not aware of the impact of their decisions. The introduction of energy consumption standards for software may encourage customers to order energy-efficient software. This situation will, however, undoubtedly change due to rising prices and, at the same time, growing demand for electricity [96]. Because the desire to remain competitive on the market, a key issue for companies, will force intelligent energy management in the face of rising energy prices, interest in this topic will undoubtedly be enormous. In fact, this is already happening, as demonstrated by the systematic literature review conducted in this article. This trend will additionally be strengthened by the active promotion of sustainable economic development and a high quality of life. To achieve these goals, however, it will be necessary to manage natural resources effectively, and thus also to optimize energy consumption, which is the foundation of a knowledge-based economy [97]. At the same time, the global economy is currently developing particularly quickly in the area of intelligent technologies, which drives an ever-increasing demand for energy. Interest in solutions of this type is growing dynamically due to ongoing urbanization and the digitization of more and more aspects of residents’ lives. This also creates many new challenges, hence the need for intelligent solutions in areas such as public safety, urban mobility, and environmental monitoring, including the optimization of the energy used to power computer devices and new technologies.
Educating users to use the available computing resources sensibly and efficiently is key to achieving sustainability in fundamental research. Tutorials on sustainable computing practices must reach as many users as possible. Thus, an important avenue for future research would be to develop appropriate replication mechanisms for the different platforms. This could, for example, take the form of a simulation testbed that mimics the platform’s runtime environment as closely as possible in order to bypass restrictions. A further interesting observation is that negative results are rarely published. We came across only a single publication that explicitly reported results labeled as not reliable: in their research on adjusting deep parameters to improve the energy efficiency of mobile applications, the authors clearly state that it is safe for developers to ignore this technique. We contend that such findings are just as significant for the research community and for practitioners, as they offer valuable insights into the specific scenarios and contexts in which a particular approach may fail to yield any significant energy savings. In the future, the techniques identified in the proposed taxonomy can be analyzed quantitatively, and newly discovered techniques can be incorporated into the taxonomy as well.
Climate change is one of the most important challenges facing today’s national economies. Extreme weather phenomena, rapidly rising river levels, and shrinking resources have an unprecedented impact on the functioning of entire economies, and the existing mechanisms of functioning are unsustainable in the long term. Economies are therefore focusing on more ecological goals, seeking to reduce their impact on the environment, and are implementing technologies for monitoring threats and the effects of extreme weather. Coping with the burden of climate change requires optimizing energy consumption; given the additional problems of ensuring the security of energy supplies, optimizing energy use will give a fundamental advantage to a modern economy.
Practical Contributions, Limitations, and Future Research Directions
The authors outlined a taxonomy for the issue of energy efficiency in the IT area and presented lists of the methods used and of solutions proposed in practice. In this paper, we investigated techniques that improve the energy efficiency of software. We conducted a systematic literature review across five scientific databases. Based on our findings, we contributed to the body of knowledge in software engineering by systematically developing a taxonomy of techniques and by establishing a set of methods available to software developers and testers when conducting energy efficiency testing.
In more and more applications, energy consumption is becoming critical. This is especially true for battery-powered devices and all mobile equipment, and a growing number of suppliers of electronic components are paying close attention to these issues. This is of great importance for knowledge- and technology-based economies because it has a direct impact on the operating costs of economic entities and thus on economic profitability. In addition, the use of energy optimization methods benefits the environment in line with the concept of sustainable development. Energy efficiency is an extremely important feature of all computing devices, and finding new ways to save energy during the operation of these devices, which are the foundation of the knowledge-based economy, should now be a priority for many integrated circuit designers and software developers. There are already many hardware and software methods for optimizing energy consumption, yet we are only at the beginning of their exploration and wide-scale application. Techniques for reducing the consumption of electricity, which is the foundation of every economy, should be analyzed and sought in many segments and from different perspectives, for the good of the environment and for the possibility of sustainable global development. Modern developed countries are looking for solutions to this problem to enable the further development of world economies, which may be threatened by an insufficient supply of energy.