Development of Evaluation Criteria for Robotic Process Automation (RPA) Solution Selection

Abstract: When introducing a robotic process automation (RPA) solution for business automation, selecting an RPA solution that is suitable for the automation target and goals is extremely difficult for customers. One reason for this difficulty is that standardised evaluation items and indicators that can support the evaluation of RPA have not been defined. The broader adoption of RPA is still in its infancy, and only a few studies have been conducted on this subject. In this study, an evaluation breakdown structure for RPA selection was developed by deriving evaluation items from prior studies related to RPA selection, and a feasibility study was conducted. Consequently, a questionnaire was administered three times, and the coefficients of variation, content validity, consensus, and convergence of the factors and criteria were measured from the survey results. All of these measurement results are reflected in the final suitability value that was calculated to verify the stability of the evaluation system and the evaluation criteria indicators. This study is the first to develop an RPA solution selection evaluation standard, and the proposed evaluation breakdown structure provides useful evaluation criteria and a checklist for successful RPA application and introduction.


Introduction
Robotic process automation (RPA) is a business-process-based software solution that automates and processes simple and repetitive tasks using software robots [1-5]. Specifically, it processes an organisation's structured data and provides rule-based outputs [6,7]. The term was coined in the early 2000s by the company Blue Prism, which introduced software robots based on screen scraping technology [8]. RPA automates human behaviour and tasks, whereas artificial intelligence (AI) automates analysis and decision making by imitating intelligence and reasoning. Both technologies can enable various services individually or in combination [9]. RPA has facilitated innovation in productivity improvement across many industries [10]. It originated in the field of information systems as a disruptive innovation that, among other automation solutions, has had a profound effect on job descriptions and work itself [11]. Since then, RPA adoption has grown because RPA does not require dedicated software development and is a low-cost solution that requires a small workforce and minimal implementation time to automate operations. It can automate the functions of existing software, promote communication between IT and other departments, and demands less coding-related capability and knowledge than other development methods. In particular, software integration is made possible across various tasks in existing environments, which facilitates low complexity with high efficiency and productivity [12], as well as the inevitable process improvement [7,13] in Industry 4.0. Therefore, automation using AI technology has recently been widely introduced across many industrial sectors. RPA has been applied to more than 20 diverse business areas, including internal organisational operations, functional improvements, risk management audits, data analysis, and reporting [2,3], and many enterprises have successfully integrated RPA [2,6,14-18].

Preliminary Research

Screen Scraping
In the term RPA, 'robotic' does not refer to a physical robot but to a 'computer process', in the sense that it replaces human cognitive work [14]. This implies that perception and behaviour are connected intelligently. Therefore, when introducing RPA, it is necessary to distinguish between the role of RPA in existing standardised processes and the role of employees.
Figure 1 compares pre- and post-RPA business processing for an 'Order Details Processing Task'. Prior to applying RPA, an employee periodically logs in directly to the system. After confirming and verifying orders, the employee applies prices and discount rates that meet specific conditions, applies any additional discounts, and then charges the post-delivery price. After applying RPA, however, the employee only needs to verify order information against contract terms; the other tasks are completed entirely by the RPA software. The technology that enables this process is called screen scraping (also known as web scraping or web harvesting). This term refers to a technique used to capture and decode text and bitmap data on a computer screen; it is used primarily in web environments to extract structured data from output data and convert it into a human-readable form [8]. Screen scraping allows users to specify the outline of a box around icons and labels [22], which then allows robots to identify and click areas that are not accessible through existing limited pixel-based coded screen scraping [1]. A screen scraper communicates with the system as if it were an ordinary user, explores the user screen of the system, and reads information [23]. Additionally, a screen scraper can serve as a component of a larger program outside the information system [23]. A one-time scraper retrieves all information from an old information system and stores it in a new database, whereas a continuous scraper keeps the existing system active and retrieves information from the system screen when requested. Based on this principle, RPA identifies the patterns through which users perform tasks on existing legacy system screens. Developing automated task lists from the extracted patterns allows an RPA robot to repeat tasks automatically, directly in a graphical user interface (GUI).
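As an aside, the web-scraping flavour of screen scraping described above can be sketched in a few lines using only Python's standard library: rendered output intended for a human is parsed back into structured records. The HTML snippet and the order fields below are purely illustrative and not taken from any real system.

```python
from html.parser import HTMLParser

# Minimal web-scraping sketch: recover structured (order id, price) records
# from an HTML table that was rendered for human consumption.
class OrderScraper(HTMLParser):
    def __init__(self):
        super().__init__()
        self.in_cell = False   # True while inside a <td> element
        self.cells = []        # cells of the row currently being read
        self.rows = []         # completed rows

    def handle_starttag(self, tag, attrs):
        if tag == "td":
            self.in_cell = True

    def handle_endtag(self, tag):
        if tag == "td":
            self.in_cell = False
        elif tag == "tr" and self.cells:
            self.rows.append(tuple(self.cells))
            self.cells = []

    def handle_data(self, data):
        if self.in_cell:
            self.cells.append(data.strip())

html = """
<table>
  <tr><td>ORD-1</td><td>1200</td></tr>
  <tr><td>ORD-2</td><td>850</td></tr>
</table>
"""
scraper = OrderScraper()
scraper.feed(html)
orders = [(oid, int(price)) for oid, price in scraper.rows]
print(orders)  # [('ORD-1', 1200), ('ORD-2', 850)]
```

A continuous scraper would simply repeat this read against the live system screen on each request instead of storing the result once.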

Comparative Studies on RPA Solutions
Kim [2] divided RPA solution functions into robots, robot managers, and script-editing tools, as shown in Figure 2, and compared various RPA solutions. Ribeiro et al. [4] compared the RPA intelligence functions of six RPA solutions with high market shares by dividing them into AI-related goals, technologies, and algorithms, as shown in Figure 3.

The results for each major RPA vendor presented in the Forrester (Cambridge, MA, USA) Wave evaluation [24] were divided into solution functions and strategies. Scores are provided on a scale of weak (0) to strong (5). These data represent an evaluation of the top vendors in the RPA market and do not describe the entire vendor landscape. Each vendor's position on the vertical axis of the graphic indicates the strength of its current offering [24]. The key criteria for these solutions include task and process discovery, portfolio analysis, bot design and development, deployment and management, security and governance, scaling experience, and architecture [24]. Placement on the horizontal axis indicates the strength of vendor strategies [24]. This represents the product vision and innovation roadmap, delivery and support models, financial performance, and partner ecosystem [24]. The functional analysis results are presented in Figure 4.

Studies on RPA Solution Evaluation Elements
The US Federal RPA Community of Practice [25] has proposed evaluation elements for each department in terms of technical capabilities, process management, and operations, as shown in Table 1. For processes, only the review items applicable to the RPA solution selection evaluation criteria were extracted from [8], reconstructed, and included in the listed categories.
Major RPA vendors attended the RPA Introduction Guide Seminar [26] sponsored by the Korea Electronic Newspaper and announced evaluation criteria for RPA solution selection. Here, Chan Sik Bong from KPMG proposed selecting a solution with sufficient references to prioritise enterprise-level introduction and stable construction when introducing RPA. Sean Lee, the managing director of Automation Anywhere, explained the derivation and verification of non-functional requirements, including automation functional requirements, architectural requirements, and development convenience/operability/maintenance/security requirements. Gye Kwan Kim, the CEO of Grid One, opined that 'Korea's IT environment should include not only company businesses, similar cases, business areas, and investment efficiency (return on investment, ROI), but also the ability to perform tasks in non-standardised GUI environments such as ActiveX and Flash'. Myung Su Jo, the managing director of Deloitte, announced that the prime considerations for RPA solution selection should be application capabilities, technical compatibility, manufacturing capabilities, and pricing.

Business Structural Optimisation Studies on Improving RPA Operational Efficiency
When introducing RPA, algorithms that minimise the number of robots [27] and algorithms that optimise area clustering and storage location allocation in process automation cloud systems are important considerations for operational efficiency. Strategically, new capabilities in terms of the average cost of automation, total investment cost, quality control, optimisation of management, and control of automation productivity have been presented as important factors for consideration [25]. Following in-depth experimentation on the RPA tools available in the market, the authors of [28] developed a classification framework for product categorisation, along with a methodology for selecting target tasks for robotic process automation using human interface records and process mining techniques [29]. Additionally, the authors of [30] developed an application to automate data acquisition process management and control by applying an RPA implementation workflow.

Research Procedure and Methodology
The sequence and method of implementation adopted in this study are summarised in Figure 5.

Research Procedure
The first step in this study was to structure the collected evaluation items. To this end, the items that should be evaluated when selecting RPA solutions were comprehensively collected and structured from existing resources, including literature reviews, press releases, and seminar videos. It was necessary to consider the efficiency and productivity of the introducing organisation and to be practical from both strategic and construction perspectives. I applied a solution-lifecycle-level approach progressing from user or organisational requirements to actual construction, management, and control tasks. Consequently, the initial introduction, functional, infrastructure, and vendor support aspects were evaluated comprehensively.

The second step was to derive evaluation criteria for RPA solution selection. To this end, I developed a draft RPA solution evaluation standard based on the detailed evaluation departments and evaluation items finalised in the structured evaluation item results. The proposed RPA solution selection evaluation system consists of three layers: evaluation department, evaluation item, and evaluation criteria. Each layer is grouped based on similarity, and the group names of the evaluation departments and evaluation items were defined by referencing existing resources [2,4,8,24-26].
The third step was to verify the RPA solution evaluation criteria. To this end, the Delphi survey method was used to verify the proposed evaluation criteria. The validity of the questionnaire was evaluated using a seven-point Likert scale, and the coefficient of variation (CV), content validity ratio (CVR), consensus degree (CSD), convergence degree (CGD), and conformity assessment (CA) of the questionnaire results were calculated. If all validity measurements were satisfactory, then the RPA solution selection evaluation criteria were deemed to be satisfied.
(1) Stability measured using the CV
The CV measures the dispersion of the measurement data and uses the measured values as the basis for determining the agreement between panels [31,32]. It is the ratio of the standard deviation to the mean (average), CV = σ/μ, as defined in Equation (1) [33]. Based on the study by Khorramshahgol and Moustakis [34], a CV value below 0.5 was judged to be stable, a value of 0.5 to 0.8 relatively stable, and a CV above 0.8 to require additional questionnaires [34].
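Since Equation (1) is simply the ratio of the standard deviation to the mean, the stability check can be sketched as follows; the panel ratings (seven-point scores from 11 panellists) are invented for illustration.

```python
import statistics

# Illustrative 7-point Likert ratings from an 11-member panel (made-up data).
responses = [6, 7, 6, 5, 7, 6, 6, 7, 5, 6, 7]

def coefficient_of_variation(xs):
    # Equation (1): CV = standard deviation / mean.
    return statistics.pstdev(xs) / statistics.mean(xs)

def stability(cv):
    # Thresholds follow Khorramshahgol and Moustakis [34].
    if cv < 0.5:
        return "stable"
    if cv <= 0.8:
        return "relatively stable"
    return "additional survey round required"

cv = coefficient_of_variation(responses)
print(round(cv, 3), stability(cv))
```

A tightly clustered panel yields a small CV, so the sample above comfortably passes the 0.5 threshold.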
(2) CVR
The CVR is calculated from the number of experts who rated an item as 'essential' relative to the total number of experts [31], as shown in Equation (2). The effective minimum value of the CVR based on the number of experts was determined by Lawshe [35,36]. In this study, there were 11 experts, so a CVR value of 0.59 or more was judged to satisfy the relevant conditions.
Here, n_r refers to the number of panel members indicating an item as 'essential', and N refers to the total number of panel members.
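A minimal sketch of this computation, assuming Lawshe's standard formulation CVR = (n_r − N/2)/(N/2); the counts are illustrative.

```python
# Lawshe's content validity ratio (Equation (2)):
# CVR = (n_r - N/2) / (N/2), where n_r of N panellists rated the item 'essential'.
def content_validity_ratio(n_essential, n_total):
    half = n_total / 2
    return (n_essential - half) / half

# With N = 11 experts, Lawshe's minimum acceptable CVR of 0.59 corresponds to
# at least 9 of the 11 experts rating an item 'essential'.
print(content_validity_ratio(9, 11))   # ≈ 0.636 -> valid
print(content_validity_ratio(7, 11))   # ≈ 0.273 -> not valid
```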
(3) Consensus degree (CSD) and convergence degree (CGD)
To determine whether the panel converged toward agreement, the results presented by Delbecq et al. [37] were applied to measure the CSD and CGD, where the CSD was required to be at least 0.75 and the CGD was required to be less than 0.5, as defined in Equations (3) and (4).
Here, Median is the median value, Q1 is the first quartile (25% of the total), and Q3 is the third quartile (75% of the total).
(4) CA
The CA applies Equation (5) to the CV, CVR, CSD, and CGD values calculated using the equations presented above. As shown in Equation (5), an indicator conforms to the RPA solution selection evaluation criteria only when the CV, CVR, CSD, and CGD are all 'conforming'.
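Equations (3)-(5) are not reproduced in the text, so the sketch below assumes the quartile-based definitions commonly attributed to Delbecq et al. [37]: consensus CSD = 1 − (Q3 − Q1)/Median and convergence CGD = (Q3 − Q1)/2, combined with the CV and CVR checks into a single conformity screen. The ratings are illustrative.

```python
import statistics

# Quartile-based consensus/convergence measures (assumed forms, see lead-in):
#   CSD = 1 - (Q3 - Q1) / median   (conforming when >= 0.75)
#   CGD = (Q3 - Q1) / 2            (conforming when <= 0.5)
def quartiles(xs):
    q = statistics.quantiles(xs, n=4, method="inclusive")
    return q[0], q[2]  # Q1, Q3

def csd(xs):
    q1, q3 = quartiles(xs)
    return 1 - (q3 - q1) / statistics.median(xs)

def cgd(xs):
    q1, q3 = quartiles(xs)
    return (q3 - q1) / 2

def conforms(xs, n_essential, n_total, cvr_min=0.59):
    # Equation (5): adopt an indicator only when all four checks pass.
    cv = statistics.pstdev(xs) / statistics.mean(xs)
    cvr = (n_essential - n_total / 2) / (n_total / 2)
    return cv <= 0.5 and cvr >= cvr_min and csd(xs) >= 0.75 and cgd(xs) <= 0.5

ratings = [6, 6, 7, 6, 7, 6, 6, 7, 6, 6, 7]  # illustrative 7-point ratings
print(csd(ratings), cgd(ratings), conforms(ratings, 10, 11))
```

Running the screen per indicator after each Delphi round reproduces the filtering workflow described above.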
The final step was to define the RPA solution evaluation criteria. The Delphi survey method verifies the Nth-round evaluation criteria and determines the appropriate evaluation criteria for RPA solutions. Therefore, anonymous experts were surveyed about the evaluation criteria for RPA solutions, the opinions of the experts from the first round were incorporated, and the revised evaluation criteria were presented for re-questioning.

Research Methodology
In this study, the Delphi methodology was employed to develop indicators for the developed RPA breakdown structure and selection criteria with help from experts. The Delphi method refers to a 'set of procedures to guide experts' views on the issues they want to predict and summarise them into a comprehensive judgement' [38]. This method can be used in scenarios where relevant research is insufficient or new evaluation standards are to be developed [17,31]. It can also be used as a judgement, decision support, or prediction tool because Delphi surveys often help in understanding problems, opportunities, and solutions, as well as developing predictions [22].
In this study, the evaluation departments, evaluation items, and evaluation standards of the RPA solution derived from existing sources were evaluated based on a seven-point Likert scale. Currently, there are no international standards or guidelines for RPA selection criteria. To the best of my knowledge, this is the first study to address this problem. The Delphi methodology is suitable in that expert opinions are considered as much as possible to derive meaningful items by collecting informed ideas. The stability of the factors and criteria was measured using the CV, CVR, CSD, CGD, and CA, and the items were then filtered accordingly.
To apply the Delphi survey method, RPA consulting and construction experts were defined as people with at least three years of relevant work experience. Delbecq et al. [37] suggested that between 10 and 15 people constitute an appropriate Delphi group. Therefore, this study used a panel of 11 experts.

Developing Evaluation Criteria for RPA Solution Selection

Structuring Selected Evaluation Items
A total of 87 major criteria were derived, and the structured selected evaluation items were obtained from the collection of candidate items for the RPA evaluation criteria that an organisation should consider when selecting RPA solutions, as described in previous studies. The details of the collected criteria are presented in Table 3.

Deriving Evaluation Criteria for RPA Solution Selection
The selection criteria were divided into evaluation departments, evaluation items, and evaluation criteria. The evaluation departments converged the contents collected from existing resources [2,4,8,24-26] as closely as possible, resulting in the categories of 'introduction strategy', 'functionality', 'technical architecture', and 'operational management'. Regarding the evaluation criteria, the collected considerations and evaluation criteria defined in previous studies were rearranged according to their affinity. Next, similar standard names were merged into a single name with the same meaning, and duplicate items for the same standard were reduced to a single entry.
Finally, evaluation items were defined as comprehensive representations of the considerations contained in each group to form a final refined evaluation criteria group. In this process, I attempted to maintain the framework of the evaluation criteria groups by referring to the results of prior research [2,4,8,24-26].
Consequently, seven evaluation items for the introduction strategy evaluation department were evaluated based on economic validity, supply maintenance, technical compatibility, real-time decision-making support, strategic compatibility, and process.
The functionality evaluation items map to the robot management and operation, analysis/categorisation/prediction, automation, and process evaluation criteria. The technical architecture evaluation department derives the security and architecture evaluation items and maps detailed evaluation criteria to each evaluation item. The operational management evaluation department derives the operational and standardised asset management evaluation items and defines the key evaluation criteria for RPA operations. The final evaluation criteria for RPA solution selection are presented in Table 4. For convenience in composing the questionnaires and in preparing and analysing the questionnaire results according to the evaluation index, the numbers in the 'No.' column refer to the evaluation items and criteria. Additionally, for the same purpose, the evaluation departments, evaluation items, and evaluation criteria are represented by I, II, and III, respectively.
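For readers who prefer a concrete picture, the three-layer breakdown (evaluation department → evaluation item → evaluation criteria) can be modelled as a nested mapping. The department names follow the text, but the items and criteria shown are only an illustrative subset, not the full content of Table 4.

```python
# Hypothetical sketch of the three-layer evaluation breakdown structure.
# Departments follow the text; items/criteria are an illustrative subset only.
breakdown = {
    "Introduction strategy": {
        "Economic validity": ["Return on investment (ROI)"],
        "Technical compatibility": ["OS/hardware requirements"],
    },
    "Functionality": {
        "Robot management and operation": ["Scheduling", "Monitoring"],
    },
    "Technical architecture": {
        "Architecture": ["On-premise/cloud support"],
    },
    "Operational management": {
        "Operations": ["RPA lifecycle management"],
    },
}

# Flatten to (department, item, criterion) rows, e.g. to build a checklist
# or to number the rows I/II/III as in Table 4.
rows = [
    (dept, item, criterion)
    for dept, items in breakdown.items()
    for item, criteria in items.items()
    for criterion in criteria
]
print(len(rows))
```

Each flattened row corresponds to one questionnaire line, which is how the I/II/III numbering scheme described above maps onto survey items.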

RPA Solution Evaluation Criteria Verification
To verify the RPA solution evaluation criteria, experts with more than three years of experience in RPA construction and operation in Korea were invited. These experts consisted of RPA service supply groups, such as RPA solution vendors, builders, and consultants, as well as customer groups that introduce and operate RPA services. Detailed information regarding the final participating experts is provided in Table 5. A total of three Delphi surveys were conducted to verify the criteria for RPA solution selection. The questionnaire contained 34 items in the first round, and evaluation indicators for 27 items were included in the second round. The initial development of the evaluation departments, evaluation items, and evaluation criteria reflects the results of the validity measurements and the suggestions for evaluation indicators.
Validation was performed three times and the main contents of each step of verification are summarised below.
First Verification Overview: The validity of the evaluation criteria listed in Table 4 was verified for each I. evaluation category, II. evaluation item, and III. evaluation criterion. After deriving the indicators satisfying the conditions of CV ≤ 0.7, CVR ≥ 0.59, CSD ≥ 0.75, and CGD ≤ 0.5, six indicators were identified as appropriate, as shown in Table 6. The column headings a, b, c, and d in Table 6 indicate the conformity of each value, where '0' represents 'suitable' and '1' represents 'unsuitable'.
On closer examination, the evaluation categories as a whole, as well as the 'Architecture' evaluation criteria and the security evaluation items of the 'Technical Architecture' category, were considered suitable. These criteria are the most stable standards for evaluation.
Next, the opinions from the first expert evaluation were incorporated. The RPA solution selection evaluation benchmark index was improved by reflecting the experts' 'Proposal of Evaluation Criteria' for each questionnaire item. As a result, the names associated with the evaluation departments were consolidated from 'customer introduction strategy', 'functionality', 'development and operability', and 'operation management' into 'operation management system'. Regarding the evaluation items, the real-time decision support and strategy integrity evaluation items of the introduction strategy evaluation department, as well as the analysis/classification/prediction and process evaluation criteria of the functional evaluation department, were rearranged.
Evaluation criteria for AI technology collaboration and expansion were added to the functional evaluation department, including the real-time decision support and analysis/category/prediction items from the introduction strategy evaluation department. To include the revised evaluation criteria, the names of the evaluation items were revised as deemed necessary. Consequently, the introduction strategy evaluation department revised its evaluation item names by adding 'solution supplier capacity' and changing 'technical integrity' to 'technical policy integrity', 'security policy' to 'security policy conformity', and 'process' to 'methodology'.
In the functional evaluation department, the names of the evaluation items were revised by adding 'robot management and operability' and changing 'automation' to 'automation process development ease'. In the management evaluation department, 'management' was changed to 'management policy' and 'standardised asset management' was changed to 'information asset management policy'. The criteria for selecting RPA solutions that reflect the results of the validity evaluation and expert questionnaires are summarised in Table 7. In this table, N1 denotes drafts developed via literature research, N2 denotes revised drafts, N3 denotes deleted drafts, and N4 denotes movement between evaluation items. Many of the items marked with 'O' in the validity evaluation were also corrected. An excerpt of the Table 7 evaluation criteria reads as follows: reference customer case holder [8,26], solution provider capability [26], product vision [24], dealer market awareness [8], existing results in the same field [8], terms and conditions [8], product and service support [24], support capabilities, education and customer service [8], partner ecosystem [24]; Technology policy conformity (N2): purpose of application of RPA introduction (N4), hardware/software requirements [8], technical elements (OS/hardware requirements and technical capabilities required for RPA deployment and operation) [8], technical compatibility [26], performance [24], system interaction and integration [8], RPA program application consistency [8], scalability (N1), relatedness to other technologies (N1), portfolio [24] (N4), revolutionary roadmap [24] (N4), risk management strategy based on risk analysis evaluation [8,26] (N4), mission and objectives of companies and agencies [8] (N4), leadership priority and strategy, initiative [8] (N4), licensing structure [25] (N4).

Technical architecture
Security [8]: compliance with legal systems such as personal information protection, account and personal identification management, data encryption/protection, application security, risk/security evaluation, authentication, and process traceability. Architecture [8]: on-premise/cloud, virtualisation support using VM/container technology (N2), availability/disaster recovery capabilities, permissions, network capacity, review of performance management capabilities, RPA program technical policy/architecture consistency (N4), availability of duplex configuration (N1), and collaboration structure with the customer's internal system (N1). Code sharing method, technical policy update, RPA lifecycle management, licence management, and standardised operating models (N1). N1: added to reflect opinions from the first expert survey. N2: modified to reflect opinions from the first expert survey. N3: deleted to reflect opinions from the first expert survey. N4: moved between evaluation items to reflect opinions from the first expert survey.
To the best of my knowledge, this was the first expert questionnaire on RPA solution selection criteria, and the conditions of CV ≤ 0.5 and CVR ≥ 0.99 were applied in consideration of the large number of relocations of evaluation items and criteria, in order to actively incorporate expert opinions. To improve the accuracy of this study, the results in Table 6 were not used in their initial form, and the final amendments were applied to a more stringent standard. To this end, the results of survey item I.1 were adopted in the first Delphi survey. Additionally, the four evaluation items II.5, II.6, II.9, and II.11 were deleted, and the evaluation item II.16 was added. The evaluation criteria III.5, III.6, III.9, and III.11 included in the four deleted evaluation items were deleted or rearranged to form other evaluation items.
Second Verification Overview: For the development of the second verification questionnaire, the evaluation target index was selected based on whether CV ≤ 0.5 and CVR < 0.99 were satisfied by the verification result criteria of the first questionnaire. Additionally, all conforming indicators, deleted evaluation items, and deleted evaluation criteria were excluded. When completing the second questionnaire, the experts could easily correct the results because the mean, standard deviation, stability, validity, consensus, convergence, and final judgement values were provided. A total of 27 questionnaire items were presented. Considering that the evaluation departments, evaluation items, and evaluation criteria were based on the opinions collected in the first round, the stability index was CV ≤ 0.75 and validity was determined according to CVR ≥ 0.59 (p = 0.05) (Table 8).
The results of the second Delphi questionnaire yielded 11 items identified as appropriate. A detailed examination of each indicator identified 'I.2. Development and operability' and 'I.3. Technical architecture' as appropriate evaluation departments. Among the evaluation items, 'II.2. Solution supplier capabilities', 'II.3. Technical policy consistency', 'II.8. Robot management and operability', and 'II.13. Architecture' were identified as appropriate. Among the evaluation criteria, III.1, III.3, III.8, III.10, and III.13 were identified as conforming.
Next, the opinions from the second expert evaluations were reflected. The RPA solution selection evaluation standard index was refined again by reflecting the contents of the 'Evaluation Standard Opinion Proposal' presented in the second questionnaire. No additional opinions were expressed regarding the evaluation departments according to the table breakdown. Regarding the evaluation items, it was suggested that development and evaluation methods are necessary for the evaluation items considered by the customers' introduction strategy evaluation departments. Furthermore, 'methodology' was revised to 'discovery and appropriateness evaluation of automation work objects', and 'automation process development and evaluation methodology' was added to the evaluation criteria. Additionally, for the development and operability evaluation department, 'methodology' was revised to 'automation process development and convenience' because 'ease of automation process development' did not include evaluation criteria. The evaluation items for the management system evaluation department and the information asset management policy are ambiguous, so no differences appeared. However, 'operation policy' emphasises that RPA falls under the information service operation management policy. Therefore, 'individual information service operation system' and 'information asset management policy' were changed to 'company-wide information asset management system'. Other changes, including changes to the evaluation criteria, are presented in Table 9. The contents of this table are described in the form of 'Evaluation Item Number: Evaluation Criteria Elements'. For example, '4: Consistency with customer security architecture' indicates that the evaluation criterion 'Consistency with customer security architecture' was added to item 'II.4 Security policy consistency assessment'.

Third Verification Overview: To develop the questionnaire for the third round of verification, only the 16 indicators that were not selected in the
results of the second round of verification were included. The final adoption criteria were indicators satisfying Equation (5), and the results are presented in Table 10. Of the 16 included indicators, 12 were confirmed to be valid and four were found to be inappropriate. The CVRs of evaluation items II.7 and II.15 were −0.09 and 0.45, respectively. The CVR of the evaluation criteria in III.7 was −0.09, and that of the evaluation criteria in III.15 was −0.27. Although it failed to satisfy the CVR requirement, evaluation item II.15 satisfied the requirements for stability, agreement, and convergence. Paradoxically, most experts judged the associated evaluation criteria to be inappropriate. In the end, evaluation items II.7 and II.15, and evaluation criteria III.7 and III.15, were eliminated.
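The CVR values reported above are consistent with Lawshe's content validity ratio computed over this study's 11-expert panel. The sketch below assumes the conventional Lawshe formula (Equation (5) itself is not reproduced in this excerpt, so this is an inference, not the paper's definition); the function name is illustrative.

```python
# Hedged sketch: Lawshe's content validity ratio (CVR), assumed to be the
# index behind the values reported in the text. n_e is the number of
# experts rating an indicator as valid; n is the panel size (11 here).
def cvr(n_e: int, n: int) -> float:
    """CVR = (n_e - n/2) / (n/2); ranges from -1 to +1."""
    half = n / 2
    return (n_e - half) / half

# With an 11-expert panel, the reported values correspond to:
#   5 of 11 experts rating the indicator valid -> CVR ~ -0.09
#   8 of 11 -> CVR ~ 0.45
#   4 of 11 -> CVR ~ -0.27
```

Under this reading, II.15's CVR of 0.45 means eight of the eleven experts rated it valid, which still falls short of the critical CVR value commonly tabulated for an 11-member panel.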
Next, the opinions from the third expert survey were incorporated. First, the evaluation criteria for III.7 and III.15 reflect the opinions of experts from the first and third surveys, and the evaluation criteria for automation process development and convenience are contained in II.10. The criteria for other calculations [8] and for process engineering and evaluation [25], which had met the initial evaluation criteria derived from our literature search in terms of departmental unit objectives and enterprise unit objectives, were deleted. The original evaluation criteria of III.15 were also deleted. III.15 defines the evaluation criteria corresponding to the company-wide operational management system to be observed during the operation of the 'III.14 individual automated processes'. Therefore, the III.15 evaluation criteria were replaced with expressions suitable for individual automated operating systems, such as code sharing [8], RPA lifecycle management [8], licence management [8], common module standardisation, and product repository management.
The name of evaluation item II.12 was revised from 'supplementary management' to 'security management' because it was suggested that this would enhance the 'consistency of security policies' and the 'differentiation of the introduction strategy'. The evaluation criterion of 'character recognition ability to handle the specificity of native languages without exception [2,26], OCR' was revised to 'OCR (printed), OCR (written)'. The results of the other detailed opinions are presented in Table 11. The final RPA solution evaluation criteria were divided into four evaluation departments and 10 evaluation items, each of which contains several evaluation criteria. The customer introduction strategy evaluation department contains the 'economic feasibility', 'solution supplier capabilities', 'technical policy consistency', and 'security policy consistency' evaluation items. In the development and operability evaluation department, the 'robot management and operability', 'automation process development and convenience', 'AI technology connection and extension', 'security management', and 'architecture evaluation' items were defined. The final RPA solution selection evaluation criteria for each evaluation item are presented in Table 12. In this table, the evaluation item numbers for the evaluation categories defined in this study and the flow of results from the previous surveys are presented to help readers understand the development of the evaluation criteria.
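The hierarchical breakdown described above (evaluation departments containing evaluation items, which in turn contain evaluation criteria) lends itself to a simple checklist representation. The sketch below is purely illustrative: the department and item names are taken from the text, but the data structure and counting function are assumptions for practical use, not artefacts of the study.

```python
# Hedged sketch of the evaluation breakdown structure as nested data,
# usable as a selection checklist. Names follow the text; the structure
# itself is an illustrative assumption, not the study's Table 12.
breakdown = {
    "Customer introduction strategy": [
        "Economic feasibility",
        "Solution supplier capabilities",
        "Technical policy consistency",
        "Security policy consistency",
    ],
    "Development and operability": [
        "Robot management and operability",
        "Automation process development and convenience",
        "AI technology connection and extension",
        "Security management",
        "Architecture evaluation",
    ],
}

def total_items(structure):
    """Count the evaluation items across all departments."""
    return sum(len(items) for items in structure.values())
```

In practice, each item would carry its own list of evaluation criteria and, as the follow-up study proposes, a weight reflecting the adopting company's work characteristics.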
Through the survey process, some of the criteria derived from our literature review were deleted or revised, and various additional opinions were incorporated based on expert knowledge and experience.
In the evaluation of customer deployment strategies, the cost items were divided into solution prices, deployment costs, licensing costs, operating costs, etc. For the evaluation of technical policy consistency, 'automation process development and evaluation methodology' was added, and for security policy consistency, conformity with the customer's security architecture was added as an evaluation standard.
In the development and operability evaluation department, the 'Business Operations Analysis Function [25]' was modified into the more intuitive and concrete form of the 'Robot Operational Status Summary Function [25]'. The development and convenience evaluation items for automated processes were revised to provide a better understanding of the existing indicators, including customer performance procedures (manual, automation).

Implications
RPA is limited to specific business tasks and is often accompanied by robot operations. Although RPA is a software tool, it has limitations in that the technical evaluation criteria used in software construction cannot be applied to it directly. Therefore, organisations that wish to introduce RPA must establish appropriate criteria for selecting solutions. By standardising and selecting applications appropriate for RPA, this minimises the time and effort required to modify and standardise subsequent maintenance and operational processes to match the selected solution. Because RPA can be adopted easily even without IT specialisation, so-called shadow IT may increase. Therefore, RPA management should also be considered at the enterprise architecture standardisation and integration level within an enterprise-wide resource management system.
One expert participating in our Delphi surveys suggested that, at the organisational level, the evaluation criteria of RPA solutions should consider enterprise and agency missions and objectives [8], leadership priorities, and strategies. Another expert from a vendor stated that 'RPA's information management and business data management are often independent of customer companies, and the involvement of suppliers is limited, so it is necessary to manage data standards'. This company-wide issue is one of the evaluation items associated with the operation management system evaluation department defined in this study and is the main reason why this evaluation department and its corresponding evaluation items were retained.
The AI technology collaboration and scalability evaluation items were established in terms of development and operability, which is consistent with the current trend of RPA solution selection beginning with the introduction of AI. In particular, 'robot management and operation' and 'automation process development and operation', which are not typically considered in the software field, emerge as elements unique to these evaluation criteria.
Even if RPA is introduced based on these solution selection evaluation criteria, further company-level efforts are essential for an organisation to recognise and utilise RPA in the early stages of its adoption. Because RPA aims to automate repetitive business processes that have been standardised by companies, it is necessary to change the structure of an organisation to one that can further enhance and add value to existing human resources. Furthermore, the operational efficiency achieved through RPA should be linked to the performance evaluation of individuals and their organisations, and the results should be shared as best practices at the company level to induce the spread of operational efficiency. Vendors and designers of RPA solutions should strive not only to support companies that wish to introduce RPA, but also to form active partnerships that can promote the development of solutions suited to their partners. Additionally, RPA lacks a consistent vocabulary. Therefore, a vendor-independent conceptualisation of RPA and the relationships between its vocabularies is required [39].

Potential Threats to the Validity of this Study
The evaluation criteria used for selecting RPA solutions should be carefully selected according to their level of importance and the characteristics of the target industry group. Recently, several studies on various applications of RPA [40][41][42][43][44][45][46] have been published. These studies emphasise that priority should be given to improving process models through implementations, adaptations, changes, and tracking that meet the needs of the target business environment. This indicates that the problem of the productivity paradox [47], in which productivity decreases with increasing IT investment, including investment in RPA solutions, may otherwise appear.
The current practice for developing RPA is to observe how routines are executed and then implement the executable RPA scripts necessary to automate those routines using software robots, based on evaluations by skilled human experts [48]. However, process optimisation through intelligent automation that can interpret the UI logs of routine executions and support changes in automation routines for intermediate inputs is still at the research stage. Regardless, some studies [49,50] have indicated the emergence of new trends in IT that will pave the way for new methods of achieving sustainability, which are noteworthy for RPA adoption and selection. This implies that, when introducing RPA, one should not overlook the fact that RPA is continuing to develop and evolve. Therefore, it must be clarified whether the goal of introducing RPA is simply automation, or process integration, intelligent automation, and autonomous intelligent work that enables decision making, in order to minimise the potential risks and threats of investing in and constructing IT, including RPA.

Conclusions
In recent years, RPA has been rapidly adopted by commercial organisations to automate repetitive business processes [19,20]. However, with the various RPA solutions available on the market, it is difficult for companies to select those that suit their business characteristics and processes. To date, no formal evaluation criteria for RPA solution selection have been developed to alleviate this issue.
In this study, I developed evaluation indicators that can be used to select an optimal RPA solution for a specific enterprise. Based on a literature review, the evaluation indicators were subdivided into evaluation departments, evaluation items, and evaluation criteria, and organised hierarchically. Eleven experts rated the validity of the derived evaluation indicators through three Delphi surveys. As a result, ten evaluation items were assigned to four evaluation departments, and the evaluation criteria to be considered for each item were presented in detail. The customer deployment strategy evaluation department focuses on the 'economic feasibility', 'solution supplier capabilities', 'technical policy consistency', and 'security policy consistency' evaluation items. The development and operability evaluation department considers the 'robot management and operability', 'automation process development and convenience', and 'AI technology collaboration and extension' evaluation items.
The technical architecture evaluation department considers the 'security management' and 'architecture' evaluation items, and the operation management system evaluation department considers the 'individual automated process operating system' evaluation item.
This study is of great significance as a development of evaluation indicators for RPA solution selection. Additionally, the evaluation criteria for each evaluation item presented in the developed evaluation index can be used as a checklist in practice. This should allow organisations that are introducing RPA, including those that lack an understanding of RPA, to select RPA solutions that are optimised for their enterprise and business characteristics. Finally, the presented evaluation standard can provide a theoretical reference for revising technical evaluation laws and regulations related to national software projects, such as Korea's software technology evaluation standard.
However, because this study did not consider the selection of RPA solutions for a specific company, the feasibility of the derived RPA solution evaluation criteria must be verified through additional studies. As a follow-up study, I intend to conduct further research on the calculation of weights for each indicator so that the work characteristics of each company are optimally reflected.

Figure 3. Comparison of technologies and goals associated with AI. Adaptation based on Ref. [4].


Figure 5. Overview of research process and methodology.


Table 1. Evaluation factors by dimensional classification of RPA product functions according to the US Federal RPA Community of Practice. Adaptation based on Ref. [25].


Table 3. Numbers of key criteria for the evaluation of RPA selection collected through preliminary research.

Table 4. Derived results for evaluation criteria for RPA solution selection.

Table 5. Information regarding experts who participated in this study.
1 Solution: AO (Automate One), AA (Automation Anywhere), BR (Brity RPA), BP (Blue Prism), UP (UiPath), AW (A.Works). a Number of constructions. b Construction experience solution. c Number of years of RPA construction experience. d RPA operation solution. e Number of RPA systems introduced. f Introduction and operation period. g Field of introduction operations.

Table 6. First conformity assessment results.

Table 7. Updated selection criteria based on the first round of Delphi evaluation opinions.

Table 8. Second conformity assessment results.

Table 9. Updated selection criteria following the second Delphi evaluation survey.

Table 10. Third conformity assessment results.

Table 11. Updated selection criteria based on the third Delphi evaluation survey.