Managing Engineering Change within the Paradigm of Product Lifecycle Management

Managing change in organizations is a laborious task that consumes value-added time in various segments of the product lifecycle, including design and development, production, delivery, and product disposition. Product lifecycle management plays an important role in minimizing the time required for managing engineering changes. This research aims to perform an extensive survey of the literature in this area. No consolidated review is available summarizing advances in engineering change management vis-à-vis product lifecycle management. Thus, the paper gives an overview of product lifecycle management-based thinking and change management. This review puts forward the most relevant research regarding the practices and frameworks developed for managing engineering change in an organization. These include model-based definition, digital twin, process-based semantic approach, service-oriented architecture, Unified Modeling Language, and unified feature modeling. The gaps between the extents of conformance to success factors have been identified as extent of integration, standardization, versatility of application, support of existing systems, and the extent of product lifecycle support.


Introduction
The growth of Product Lifecycle Management (PLM) in today's businesses has been rapid. This growth has been attributed to the ever-growing demand for novel product design, the shortening of product lifecycles, and the increased customization of consumer products [1,2]. Product definition can change rapidly within an organization, and systems must be robust enough to adapt to these changes. Establishing support systems to manage the rapidly changing and volatile global landscape requires adaptable techniques for requisite change management. Historically, this need was met with Product Data Management (PDM), which slowly evolved into PLM; PLM encompasses broader aspects of the product lifecycle than PDM. The evolution of PLM-based thinking still needs to mature; in this regard, one of the major barriers to implementing PLM thinking-based change management is the lack of standardization among different platforms [3].
In this perspective, for the development of a support structure in terms of Model-Based Definition (MBD), Product Manufacturing Information (PMI) has been formalized by using various neutral file formats such as IGES, JT, and PRC. These were used to integrate design and production management [4]; however, the business changes occurring during and after these phases were not managed. STEP was the answer to this gap, and the standardization of STEP (AP242) gave businesses the opportunity to include business changes in files shared across different functions of the organization. Then came the Digital Twin (DT) concept, which integrated real-time data with virtual management environments seamlessly [5]. These change management tools help organizations overcome time wastage in the process of change and help them take quick and informed decisions.
There are multiple domains of research taking place in the area of Change Management (CM) in the context of PLM. These domains include managing change through semantic manipulation [6], establishment of unified modeling languages for managing change [7], engineering change management via interoperable file formats between various platforms [8], development of unified modeling techniques including PMI [9], software-based change management architecture, and generation of virtual models based on real-time data [5]. The goal of research in these areas is to standardize any specific aspect of CM. However, a holistic review outlining the major aspects of all these domains is absent in the literature to date. The present endeavor is to fill this gap by bringing together highlights of all these areas and to put forth the future avenues of research vis-à-vis engineering change management and CM throughout Product Lifecycle (CMPL).
The aims of this paper are to provide: RQ1: an overview of PLM and a generic summary of CM methodology as described in the literature.
RQ2: a brief review of techniques chosen by the authors along with the highlights of the important literature.
RQ3: highlighting of the gaps found between the interfaces of various techniques. The literature review is laid out in Section 2, and the methodology is presented in Section 3. Finally, the results, discussion, and conclusions are presented in Sections 4-6.

Historical Development of PLM
PLM gives the opportunity to control the complete lifespan of a product and its related information. PDM is a subdivision of PLM and was its predecessor. In the early 1980s, with the introduction of CAD/CAPP/CAE, product designers were able to create digital designs of products easily. These were easy to modify and reuse compared to designs on paper. However, the ever-growing volume of designs soon became unmanageable. As a result, first Engineering Data Management (EDM), and then PDM, emerged [10].
The first version of PDM only addressed the engineering realm of the product's lifecycle. It failed to reach non-technical activities such as supply chain, sales, and marketing, and external entities such as end customers and vendors. After the onset of web-based PDM systems, engineering information became open and accessible to all stakeholders, both internal and external. PLM appeared in the late 1990s with the target of covering all aspects of a product's lifespan, i.e., from initial idea to disposal [11]. PLM has a broader approach to the definition, organization, and analysis of product-related processes. It includes products, data, applications, processes, people, work methods, and equipment. It combines the activities of PDM and Business Process Management (BPM). BPM is another effective tool, concerned with managing business processes for optimization [12]; PLM brings the two together seamlessly [13]. PLM incorporates a product-centric vision enabled by the adoption of advanced ICT solutions, fostering collaboration among many actors and organizations [10]. In 2000, the concept of PLM 1.0 was introduced and dominated until 2018, when the improved version, PLM 2.0, emerged [7].

Benefits and Impact of PLM
The grand idea of PLM is to create a "single version of truth" about a product, including all aspects of information: requirements, design, engineering, and manufacturing [8]. It defines all the processes related to the product in an organization and declares this single version the basis of all resources. It helps increase profitability; reduce costs associated with time, quality, and value-added services; maximize the marketing portfolio of the product; and increase new product developments in reduced time. PLM takes control of a product across its lifecycle. It improves the process of product development, allowing companies to bring out novel products in a short time. It helps nurture creativity in organizations, reducing time to market for new products and providing unparalleled after-sales services for current products. It helps companies develop, produce, and assemble components at different sites around the world. PLM facilitates teamwork between the design chain and the supply chain [12].
PLM's core is the efficient distribution, retrieval, and storage of data relating to all the aspects of a product lifecycle. It enables quick finding, distilling, spreading across various domains, and reuse of data on a routine basis. The work that has been carried out is sharable. PLM provides the basic gadgets and information to minimize the product-related cost. It also cuts recall, warranty, and recycling costs. It also helps companies seize the many market opportunities for new products. It enables the value of a product to be maximized [14].
The benefits of PLM are specific, measurable, achievable, and realistic. Typical objectives for PLM are to increase profitability by 30% and decrease related sustainable costs by 50% [14].
As this century has moved forward, the environment of product management has become increasingly complex and undergone frequent changes. Some of these changes can be termed as macroeconomic, geopolitical, environmental, cultural, and social. Others are due to new technology revolutions happening all around the world. Changes create opportunities, some create problems, and they also have associated risk as change in one area may lead to enhanced risks in another area [12].

PLM's Role in Change Management
PLM presents a developing approach and an integrated and distributed software tool to foster the process of change management. It accomplishes this by streamlining the processes of idea/comment collection, tracking entries, central data collection, data arrangement and analysis, recurring processes of approval, and documentation changes. After documentation changes, PLM also helps in up-to-date data accessibility for implementation and references [14].
Knowledge linkages that come out of a PLM collective database are very helpful in establishing correlations between different operations and data from different domains of knowledge. This improves the originality of change processes. Distributed and comprehensible knowledge comprising a complete product lifecycle in appropriate detail that is easily accessible is a source of competitive advantage in today's global markets and dispersed organizational presence. Contributors from different domains can efficiently make use of and contribute to their collective knowledge base by reducing design lead times, easy standardized information retrieval, and inspecting the product lifecycle data for further insights [14,15].

General Overview of Change Management
Change management is referred to in various forms in the literature [17].
The focus of this paper is on ECM and CMPL. CM can further be categorized into various phases of PL including the development and introduction phase, maturity and growth stage, and finally the decline phase.
It can be seen that change management activity is inversely proportional to the succession of product lifecycle phases: it accounts for a high percentage of effort during the design and development phase and contributes very little to the overall lifecycle of a product during the decline phase [18]. Below is an overview of CM along with the tools and techniques utilized by industry to implement CM throughout the organization.

Change Management Process
The general change process sequence, as discussed by Tavcar and Duhovnik [15], gives a basic idea of how changes are handled, or should be handled, systematically. This process is illustrated in Figure 1.

As seen from Figure 1, first, employees, users, and stakeholders are encouraged to participate in drafting a change idea. This change can involve a manufacturing process parameter change, or a specific dimension, or addition of an alternate material grade in BOM, etc. The reasons for these sorts of changes vary widely from global to local perspectives (including availability of resources, and process optimizations). The change request is documented and takes the form of a proposal. The proposal is rigorously tested and evaluated from different angles by competent decision-making professionals in the change team. If the proposal is approved, then product documentation is changed and circulated throughout the dispersed enterprise for ground implementation.
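As an illustration, the proposal-evaluation-approval sequence described above can be sketched as a minimal state machine. All class, state, and field names below are hypothetical and are not drawn from any particular PLM system or from the cited works:

```python
# Minimal sketch of the change process in Figure 1: a change request moves
# from draft to proposal, is evaluated, and (if approved) is implemented.
ALLOWED = {
    "DRAFT": {"PROPOSED"},                  # idea is documented as a proposal
    "PROPOSED": {"APPROVED", "REJECTED"},   # change team evaluates it
    "APPROVED": {"IMPLEMENTED"},            # documentation updated and circulated
}

class ChangeRequest:
    def __init__(self, description):
        self.description = description
        self.state = "DRAFT"
        self.history = ["DRAFT"]            # audit trail of state transitions

    def transition(self, new_state):
        if new_state not in ALLOWED.get(self.state, set()):
            raise ValueError(f"illegal transition {self.state} -> {new_state}")
        self.state = new_state
        self.history.append(new_state)

cr = ChangeRequest("Add alternate material grade to BOM")
cr.transition("PROPOSED")
cr.transition("APPROVED")
cr.transition("IMPLEMENTED")
```

The explicit transition table and history list mirror two requirements of the process: changes cannot skip evaluation, and every state change remains traceable for later audit.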
It is observed [15] that communication, decision-making, process definition, organization, and information systems play a pivotal role in the change process discussed above. Therefore, any tool that ensures improvements in these areas will eventually ease the change process, which is essential for innovation and new product development [19].

ECM Tools and Techniques
There are various tools and accompanying software used for change management in the industry to manage the aforementioned change process. Some of these are the subject of discussion of this paper:



• The Model-Based Definition (MBD) technique is utilized for the creation of 3D CAD models during the design phase of products. An MBD contains Product Manufacturing Information (PMI), and once an MBD is created it is utilized in all downstream domains of the organization, e.g., manufacturing, quality control and assurance, supply chain, marketing, finance, etc.

• The digital twin technique is a modern way of integrating real-time data into a digital model of the product, continuously updating the virtual model to enable effective communication, decision making, and change management during the complete lifecycle of the product.
• Various other algorithms and software-based solutions have been developed to seamlessly integrate and manage change throughout the lifecycle of the product. The solutions identified in research are both semantic and quantitative/annotated. Service-Oriented Architecture (SOA), Process-Oriented Semantic Knowledge Management (POSKM) towards integration during the lifecycle, unified feature modeling, and unified modeling languages utilized for CMPL are some of the techniques reviewed by the authors.

Model-Based Definition (MBD)
Traditionally, 2D drawings have been the center of engineering communication throughout the history of humankind, from the classical pyramids of Egypt to the modern Burj Khalifa in Dubai. The industry was aligned in managing PMI through the use of 2D drawings annotated with geometric dimensioning and tolerancing information and manufacturing-related information [16]. However, with the advent of solid models in the 1980s, the landscape changed, with CAD solid models replacing drawings. Since then, there has been a transformation of industrial practices to utilize more robust 3D models for representing design information to downstream departments within organizations. In modern-day engineering practices of CM and ECM, there is an ever-growing need to utilize design information effectively and efficiently during and after the design and development (D&D) phase of products in order to decrease the time to market of diverse products. Therefore, a need was felt to establish neutral file exchange formats that could translate design information for different stakeholders, enabling changes as and when required. Development of IGES (1978), STEP (1980), STL (1987), JT (1990), and PRC (2002) was started [8] with the aim of developing data exchange formats for different software applications. These formats are today extensively utilized for the communication of PMI throughout the PL.

History
In 2003, ASME released Y14.41, "Digital Product Definition Data Practices" [20], or, more broadly speaking, the basis of an MBD. This release fortified the idea of presenting PMI through 3D annotated models [21]. With MBD combined with the upgrading of neutral file formats to their latest Application Protocols (APs), it became possible for industries to adopt virtual product development, thereby decreasing the time required for these activities.
An MBD is the foundation of Model-Based Enterprise (MBE), a methodology to utilize digital models to execute all the technical activities throughout the lifecycle of a product. It allows for the detailed definition of the product in its design phase containing all the desired information in a single representation.

Application
Application of MBD in different domains is presented in Table 1. The applications point out the diversity of MBD in various fields of CM. In this approach, first, a 3D model based on MBD is generated during the design phase of product development, containing all the PMI data; the generated file is usually in a native CAD format. Second, it is converted into a distribution file format (a neutral file format) on the basis of which all stakeholders perform the various activities of PLM. The distribution file is termed a "container" of the information; it contains all the ECM-based information, e.g., title, owner, version control, changes that have occurred, etc. Third, the distribution file is integrated into the PLM system deployed by the organization and becomes the basis for managing all the activities of the organization [8].
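To make the "container" notion concrete, the sketch below shows one possible in-memory structure for a distribution file carrying both PMI and the ECM metadata listed above (title, owner, version control, changes). The field names, part identifiers, and versioning scheme are all hypothetical, chosen only for illustration:

```python
# Illustrative MBD distribution "container": PMI plus ECM metadata travel
# together, so every downstream consumer sees the same change history.
container = {
    "title": "Bracket-100",
    "owner": "Design Engineering",
    "version": "B.2",
    "pmi": {
        "dimensions": {"hole_dia_mm": 10.0},
        "tolerances": {"hole_dia_mm": "+0.1/-0.0"},
        "material": "AlSi10Mg",
    },
    "changes": [],   # ECM history travels with the model
}

def record_change(container, field, new_value, reason):
    """Append a change entry and bump the numeric part of the version."""
    container["changes"].append(
        {"field": field, "new_value": new_value, "reason": reason}
    )
    rev, num = container["version"].split(".")
    container["version"] = f"{rev}.{int(num) + 1}"

record_change(container, "pmi.dimensions.hole_dia_mm", 10.5,
              "clearance issue found in assembly")
```

Because the change log and version are part of the container itself, importing the file into a PLM system imports the engineering change history along with the geometry-related data.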

Digital Twin (DT) History and Introduction
Michael Grieves at the University of Michigan introduced the term Digital Twin (DT) in 2002. A DT can be defined as a digital replica of something in the physical world in which data flow is fully integrated in both directions. It is more than blueprints or schematics. It is the virtual representation of all elements and the dynamics of a system [23].
A DT fetches data from various resources, including CAD models, product lifecycles, manufacturing engineering, and sensors, to depict a virtual model of the product, allowing us to forecast performance, maintenance, and failures; most importantly, it can be utilized to manage ECM [24,25].
Incorporating change in any part of a product lifecycle requires resources and time. The predictive nature of a DT allows us to understand the impact of changes in the virtual world. Only when the virtual model performs according to our requirement will the change be implemented physically. This is an important implication of DTs that needs exploration.
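The gating idea above, that a change is promoted to the physical asset only if the virtual model predicts acceptable performance, can be sketched as follows. The "model" here is a deliberately trivial stand-in (a linear motor-temperature estimate); the function names, parameters, and threshold are hypothetical:

```python
# Sketch of DT-based change evaluation: apply the proposed change to the
# virtual model first, and approve it only if the prediction meets the
# requirement. The temperature model is a toy placeholder, not a real twin.
def simulate_temperature(load_pct, coolant_flow):
    """Toy virtual model: steady-state temperature estimate in deg C."""
    return 25 + 0.6 * load_pct - 2.0 * coolant_flow

def evaluate_change(params, max_temp=80.0):
    """Return True if the virtual twin predicts the change is safe."""
    predicted = simulate_temperature(**params)
    return predicted <= max_temp

safe = evaluate_change({"load_pct": 90, "coolant_flow": 5.0})     # 25+54-10 = 69
unsafe = evaluate_change({"load_pct": 100, "coolant_flow": 1.0})  # 25+60-2 = 83
```

In a real DT, `simulate_temperature` would be replaced by a model continuously calibrated with sensor data, but the decision structure, simulate first, implement only on predicted success, is the same.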

Applications
Recently, DT technology was adopted in Formula 1 racing, in which every second matters. A simulation can help the car team know what adjustments can improve the performance of the vehicle [26]. DTs are also being used in the fields of health care services and factory layout planning to study the impact of proposed changes virtually.

Future Trends
In the era of IoT, as DTs advance, we will observe the integration of these models into the manufacturing process itself. The future of manufacturing will have an advantage as order sizes reduce to one and design changes become a daily event. With the use of advanced smart materials and nanomaterials, the possibility of changing the physical form and behavior of the product remotely would be the ultimate DT application in PLM [25].

Unified Feature Modeling (UFM)
Changes are not easy, especially when dealing with the whole lifecycle of a product. Changes are interrelated, and a change in a later stage is linked to, or requires, other changes in earlier stages. So, in order to change an entity somewhere in the product lifecycle, one needs to be informed about the dependency tree of that change and needs a method or algorithm that explains how to carry out a certain change in a controlled manner.
A systematic approach to deal with this change propagation with the help of an algorithm for unified feature modeling of PLM has been proposed by the authors [27,28]. The algorithm first searches for intrastage dependencies and then searches for interstage associations. Once they are identified, the associations are linked, and a dependency tree is formed which will carry out the necessary chain of changes. In this way, model consistency and validity are improved and models can be used in collaborative and concurrent engineering.
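The dependency-tree idea can be illustrated with a small sketch: features are nodes, intrastage and interstage associations are directed edges, and a change propagates breadth-first from the changed feature. The stages, feature names, and edges below are invented for illustration and do not reproduce the algorithm of [27,28]:

```python
# Sketch of change propagation over a feature dependency tree: once intrastage
# and interstage associations are linked, a walk from the changed feature
# yields the chain of changes to carry out. All names are illustrative.
from collections import deque

# (stage, feature) -> dependent (stage, feature) nodes
DEPENDENCIES = {
    ("design", "hole_dia"): [("design", "boss_dia"),      # intrastage link
                             ("process", "drill_tool")],  # interstage link
    ("process", "drill_tool"): [("inspection", "gauge_spec")],
    ("design", "boss_dia"): [],
    ("inspection", "gauge_spec"): [],
}

def propagation_order(changed_node):
    """Breadth-first walk of the dependency tree from the changed feature."""
    order, seen = [], {changed_node}
    queue = deque([changed_node])
    while queue:
        node = queue.popleft()
        order.append(node)
        for dep in DEPENDENCIES.get(node, []):
            if dep not in seen:
                seen.add(dep)
                queue.append(dep)
    return order

chain = propagation_order(("design", "hole_dia"))
```

The `seen` set prevents a feature from being changed twice even when dependency paths converge, which is one way model consistency is preserved during propagation.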

Unified Modeling Language (UML)
The tools employed to implement PLM are not as universal as its idea. Different tools are still used for different lifecycle stages, with some common standards between them to ease their communication. Some other approaches propose a unified modeling mechanism for all the stages of the product lifecycle. One such approach uses Unified Modeling Language (UML) [7]. The authors advocate their case by identifying a unified modeling approach as the need of the time, presenting UML's competence as a candidate for unified PLM implementation across all stages, and proposing a constraint management algorithm tied to UML to augment its capability and complete the modeling job.
UML is a graphical object-oriented language that was previously used in industry for business and technical purposes [29]. Its natural flow and detail-capturing ability make it suitable for PLM unified modeling. Along with UML, the authors also present a constraint management algorithm that relates system states with all the different types of constraints.
Constraint management reflects across the whole system, along with history checking and user control; this means that the change history can be viewed, and changes can be reverted, with this algorithm. The authors also presented a vacuum cleaner example to illustrate the use of UML as a unified modeling technique for PLM. The case study aptly captures UML's ability to manage change effectively during different stages of development.
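The interplay of constraint checking, change history, and reversion can be sketched as below. This is not the algorithm of [7]; the constraints, parameter names (loosely echoing the vacuum cleaner example), and class are hypothetical:

```python
# Sketch of constraint-managed state changes: every change is validated
# against the constraint set, recorded in a history stack, and revertible.
CONSTRAINTS = [
    ("power_w", lambda v: 200 <= v <= 1500),   # motor power range
    ("mass_kg", lambda v: v <= 8.0),           # handheld mass limit
]

class ConstrainedModel:
    def __init__(self, **state):
        self.state = dict(state)
        self.history = []                      # stack of previous states

    def change(self, key, value):
        candidate = dict(self.state, **{key: value})
        for name, check in CONSTRAINTS:
            if name in candidate and not check(candidate[name]):
                raise ValueError(f"constraint violated: {name}={candidate[name]}")
        self.history.append(dict(self.state))  # enables history check / revert
        self.state = candidate

    def revert(self):
        """Undo the most recent change by restoring the saved state."""
        self.state = self.history.pop()

m = ConstrainedModel(power_w=800, mass_kg=5.0)
m.change("power_w", 1200)   # passes the 200-1500 W constraint
m.revert()                  # back to power_w=800
```

Validating the candidate state before committing it means an illegal change never enters the history, so every recorded state is known to satisfy all constraints.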

Process-Oriented Semantic Knowledge Management (POSKM)
Different stakeholders in an organization have their own ways of representing and interpreting knowledge. Changing the way these stakeholders view, interpret, and change product-related information is time consuming. These tasks must be executed through a systematic approach in order to keep a record of the changes and of the distribution of knowledge [6].
Knowledge Management (KM) is utilized for methodical knowledge generation, sharing, and reuse. Ontologies can be used in this matter, with the shortcoming that if there is any change in technology whatsoever, changing the whole product information base and its models can be difficult [30].
In this technique, an object-oriented modular approach is adopted to form modular data that is produced during the product lifecycle, along with the data generated in the process chain itself. It is a new methodology of process-oriented semantic knowledge management that allows dispersion and circulation of appropriate knowledge through the lifecycle of the product and thus avoids the collaborators needing to change their methodologies. The basic architecture contains different layers: an ontology layer (contains a semantic description of the product), process chain layer (extends the shared knowledge base), and data layer (data representation of the underlying ontology layer).
Through this layering, effective controls on CM can be sought to manage PLM in industry.
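A minimal sketch of this layering is shown below: the ontology layer holds the semantic description, the process chain layer contributes step-specific knowledge, and the data layer holds concrete instances. The concepts, process steps, and field names are hypothetical and only illustrate the separation of layers:

```python
# Sketch of the POSKM layering: three independent layers are joined on demand,
# so collaborators can consume a combined view without changing any layer.
ontology = {  # ontology layer: semantic description of the product
    "Gear": {"is_a": "Component", "properties": ["module_mm", "teeth"]},
}

process_chain = {  # process chain layer: knowledge contributed per step
    "hobbing": {"acts_on": "Gear", "adds": ["surface_finish"]},
}

data = {  # data layer: concrete instances of the ontology concepts
    "gear-17": {"type": "Gear", "module_mm": 2.0, "teeth": 34},
}

def knowledge_for(instance_id):
    """Join the three layers for one instance without modifying any layer."""
    record = dict(data[instance_id])
    concept = ontology[record["type"]]
    steps = [name for name, step in process_chain.items()
             if step["acts_on"] == record["type"]]
    return {"instance": record, "concept": concept, "process_steps": steps}

view = knowledge_for("gear-17")
```

Because the join happens at query time, a technology change confined to one layer (say, a new process step) does not force the other layers, or the collaborators reading them, to change.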

Service-Oriented Architecture (SOA)
This is an integration framework in the domain of PLM that allows for the modular design of engineering, business, and process objects in the form of services that can coordinate and interact across different domains of PLM. This integration framework has become possible because of developments in the standardization of product data and metadata, the discovery and maturity of SOA for sharing organizational knowledge, and the availability of dynamic middleware for integrating data across the board [4].
The application of SOA to the PLM domain has helped PLM reach its full potential, allowing it to dynamically align various functions of the organization on a single platform. This alignment helps both the technical and business sides of product development to share, modify, and update related information on an as-needed basis, rapidly and efficiently. It can be seen from the most recent research that SOA is receiving attention from academia and industry as a solution for an integrated approach to PLM. The research has shown promising advances in terms of SOA practices and SOA adoption in various fields [31].
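The core SOA pattern, capabilities exposed as named services behind a common registry rather than as point-to-point links, can be sketched as follows. The service names and payloads are invented for illustration and do not correspond to any real PLM product:

```python
# Sketch of SOA in a PLM context: engineering and business capabilities are
# registered as named services and invoked through one common entry point.
registry = {}

def service(name):
    """Decorator that registers a callable as a named service."""
    def wrap(fn):
        registry[name] = fn
        return fn
    return wrap

@service("bom.cost")
def bom_cost(items):
    """Business-side service: total cost of (quantity, unit_price) pairs."""
    return sum(qty * unit for qty, unit in items)

@service("ecm.impact")
def change_impact(affected_parts):
    """Engineering-side service: flag changes touching many parts for review."""
    return {"parts": affected_parts, "review_required": len(affected_parts) > 2}

def invoke(name, *args):
    """Single entry point used by any PLM domain to call a service."""
    return registry[name](*args)

cost = invoke("bom.cost", [(2, 4.5), (10, 0.3)])
impact = invoke("ecm.impact", ["housing", "seal", "shaft", "bolt"])
```

Consumers depend only on the service name and contract, not on the implementing module, which is what lets the framework realign services across domains without rewiring every caller.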

Methodology
A Systematic Literature Review (SLR) was utilized to meet the research objective. SLR studies aim to identify relevant primary papers, extract the required data, and analyze and synthesize the results to gain further and broader insight into the investigated domain [32,33]. Systematic reviews adopt a replicable, scientific, and transparent process [33,34] through exhaustive literature searches of published and unpublished studies, and they provide an audit trail of the reviewer's decisions, procedures, and conclusions [35].
A systematic search was carried out by the authors for articles relating to the subject under study. Articles published from 2000 up to and including 2022 were retrieved using the academic databases SCOPUS and Web of Science. The search used a combination of query strings applied to the titles, abstracts, and keywords of studies. The search strings applied to the databases were: "PLM", "CM", and "PLM AND CM". Table 2 summarizes the literature review process with a summary of the inclusion/exclusion criteria [36,37]. Each researcher checked the citations and bibliographies of the selected studies to identify any additional relevant studies that were missed in the database search. Finally, gray literature, predatory journals, conference papers, magazine articles, workshops, books, editorials, and prefaces were excluded [36,37].

1. Describe an overview of PLM and CM methodology as described in the literature.
2. Review PLM/CM techniques along with the highlights of the important works.
3. Ascertain the gaps found between the interfaces of various techniques.
4. Discuss key findings from the reviewed literature, coupled with future trends and conclusions.

Validation of Search Results
Search results were validated through cross comparison of articles among the researchers and by revisiting articles to confirm acceptance or exclusion, ensuring interrater reliability. A flowchart was utilized to ensure and improve review transparency [38] and to draw out and map the steps within the SLR process (Table 2). The flowchart allows future researchers to follow, replicate, and draw implications from the research findings.
The authors carried out reviews of the articles found to assess the eligibility for inclusion based on the search criteria [39]. Articles that matched the research criteria and research objectives were included.
At this final stage, utilizing the review methodology and after screening for themes related to the research, 16 relevant research papers were selected for more exploratory analysis of the subthemes of the PLM and CM areas. The findings were summarized by reviewing patterns of publication and the emerging themes in these publications.

Results
The results of the literature review highlighted the following themes (Table 3) in terms of the benefits of the methodologies and technologies. There were five main broad interdependent and interrelated themes by which the benefits could be categorized. Firstly, the level to which the technology can be standardized was a theme highlighted by [8,10,23,24,40]. Secondly, the extent of integration [6,18,22] was deemed a critical factor in the techniques/methods chosen. Thirdly, versatility of the chosen application [7,31,41] is important in both integration and standardization of existing platforms and the choice of technology. The final themes highlighted in relation to the technology benefits were support of existing systems [3,7,23] and the extent of PLM support [28]. The disadvantages of the technologies demonstrated were aligned with the previous themes and criteria, including a poor level of standardization [23,24,27], a lack of integration [5,6,8,18], a lack of versatility of the chosen application, a lack of support of existing systems [5,23,24], and a lack of PLM support [18,26,27]. Table 3 also summarizes representative contributions: for example, an approach that integrates 3D annotations into the digital thread of the product lifecycle through a software element on a PLM server, overcoming existing barriers in collaborative product development and enabling concurrent design with increased engineering flexibility; and a comprehensive Unified Modeling Language framework based on XML integration that utilizes the standard formats of ASME Y14.41, whose software element still needs industry testing and validation in a production environment. The proposed research areas arising from the literature were many: several technologies need further validation [4,23,28] and further evidence [45,46], as well as integration work and case studies [7,31,40].

Discussion
The literature survey shows promising research in the field of ECM integration throughout PLM. The various techniques highlighted above are some of the most extensively employed in both the technical and business aspects of the product lifecycle. The following success factors have been identified for analyzing the integration of processes in an organization: a. Extent of Integration; b. Standardization; c. Versatility of Application; d. Support of Existing Systems (SES); e. Extent of PLM Support (EPS). Based on the merits and demerits highlighted in Table 3, gaps between the extents of conformance to these success factors have been identified and are presented in Table 4.
The referenced literature in Table 3 was analyzed in view of the aforementioned factors. Table 4 highlights the conformance of the surveyed techniques to the success factors, where fully shaded circles depict complete conformance and half-shaded circles show partial conformance. It can be seen from the table that the frameworks discussed thus far have the potential to support the PLM environment by providing a platform to integrate the technical and business processes of the organization.

As illustrated in Table 4, the MBD framework has been extensively used to integrate BOM, PMI, and CAD/CAE/CAPP-related product data. PRC and STEP file schemas are already used in industry to streamline the design, manufacturing, and inspection activities of PLM. They also help support the logistics, procurement, and supply chain aspects of the organization, including supplier management. The approach is prevalent in the manufacturing sector; however, applications in the service and construction industries are limited.
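As a concrete illustration of how tooling can interrogate the STEP data that MBD workflows exchange, the sketch below extracts the declared schema from the HEADER section of an ISO 10303-21 (STEP Part 21) file. The sample header and its AP242-style schema identifier are illustrative, not taken from the surveyed works:

```python
import re

# Minimal sketch: pull the declared schema out of the HEADER section
# of an ISO 10303-21 (STEP Part 21) file. The sample below is fabricated.
SAMPLE_STEP_HEADER = """ISO-10303-21;
HEADER;
FILE_DESCRIPTION(('bracket, rev B'),'2;1');
FILE_NAME('bracket.stp','2023-01-01T00:00:00',('author'),(''),'','','');
FILE_SCHEMA(('AP242_MANAGED_MODEL_BASED_3D_ENGINEERING_MIM_LF'));
ENDSEC;
"""

def step_schema(text: str) -> str:
    """Return the schema name declared in FILE_SCHEMA, or '' if absent."""
    match = re.search(r"FILE_SCHEMA\s*\(\s*\(\s*'([^']+)'", text)
    return match.group(1) if match else ""
```

Knowing the declared schema lets downstream PLM services decide whether a file carries the MBD payload (geometry plus PMI) they expect before attempting a full parse.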
The DT, by contrast, is an emerging concept that still requires both empirical and normative evidence to prove its potential across varied industrial applications. It has also been utilized in health care systems, but the lack of standardization in its architecture calls for further research.
UML and UFM have already become part of the PLM infrastructure. Both can be based on XML schemas and have been employed in various enterprise software solutions for the purpose of standardization. Both frameworks have served as starting points for standards organizations, including the Organization for the Advancement of Structured Information Standards (OASIS) and the Object Management Group (OMG). These developments are key to integrating the software side of ECM and PLM.
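To illustrate how an XML schema can carry change information between tools, the hypothetical sketch below serializes and reads back an engineering change order. The element names are invented for illustration and do not follow any specific OASIS or OMG vocabulary:

```python
import xml.etree.ElementTree as ET

# Hypothetical sketch of an XML-serialized engineering change order (ECO);
# the element and attribute names are illustrative, not a standard schema.
def eco_to_xml(eco_id: str, parts: list[str], reason: str) -> str:
    """Serialize an ECO with its affected part numbers to an XML string."""
    root = ET.Element("EngineeringChangeOrder", id=eco_id)
    ET.SubElement(root, "Reason").text = reason
    affected = ET.SubElement(root, "AffectedItems")
    for part in parts:
        ET.SubElement(affected, "Part", number=part)
    return ET.tostring(root, encoding="unicode")

def affected_parts(xml_text: str) -> list[str]:
    """Recover the affected part numbers from serialized ECO XML."""
    root = ET.fromstring(xml_text)
    return [p.get("number") for p in root.iter("Part")]
```

Round-tripping a change object through a shared schema like this is the mechanism that lets independently developed ECM and PLM tools stay consistent.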
POSKM considers the qualitative aspect of integration, namely, semantics. It has successfully standardized the usage of nouns, verbs, and related grammar to exclusively represent PLM-based thinking in an organization, and it effectively addresses the business aspects of PLM.
SOA is another extensive framework used in industry to translate different aspects of PLM integration throughout the organization. It employs MBD in the product design and development phases and uses XML-based schemas developed by OASIS and OMG to integrate business processes. Recently, it has been the center of attention at various academic and industrial consortia as a potential candidate for adoption as an organization-wide integration framework. However, developments in this area need testing, and implementations must be assessed in order to prove its potential as a universal framework.
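The decoupling that SOA brings can be sketched with a minimal publish/subscribe broker: a released engineering change is announced once and consumed independently by downstream services, rather than the design system calling each consumer directly. The topic and service names below are hypothetical, not from any vendor's PLM platform:

```python
# Illustrative sketch of SOA-style decoupling via a message broker.
class ChangeBroker:
    """Routes published change notices to every subscribed handler."""

    def __init__(self):
        self._subscribers = {}

    def subscribe(self, topic, handler):
        self._subscribers.setdefault(topic, []).append(handler)

    def publish(self, topic, message):
        for handler in self._subscribers.get(topic, []):
            handler(message)

received = []
broker = ChangeBroker()
# Manufacturing and procurement services subscribe independently.
broker.subscribe("eco.released", lambda m: received.append(("mfg", m)))
broker.subscribe("eco.released", lambda m: received.append(("procurement", m)))
# The design service publishes one notice; both consumers receive it.
broker.publish("eco.released", {"id": "ECO-042", "part": "P-100"})
```

Because publishers and subscribers only share the message contract, new services (for example, supplier management) can be added without modifying the design system, which is the core integration argument for SOA.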
The scope of this paper has been deliberately broad, with the purpose of highlighting the key frameworks established in recent years to address the growing demand for dynamic ECM throughout PLM. This research has shown that standard languages and schemas have been developed in recent years to streamline and address the needs of PLM-based systems. The benefits of adopting PLM-based thinking are far-reaching, and once these frameworks and schemas mature, there will be rapid advances in the adoption of an integrated approach to knowledge management in both technical and business aspects.
These frameworks have already been deployed in aviation industries across the world, owing to their diverse and global manufacturing requirements. The outlook for wider adoption is promising once further development matures their standardization.
For future research in PLM, several directions can be studied further. First, accessible and extensive information is not necessarily comprehensible; it must be organized, structured, and presented in an intuitive way without losing any resolution of the information [6]. Second, today's web-based PLM systems, which contain almost everything about a product, can pose serious cybersecurity risks; web-based technologies are relatively more prone to cyberattacks [15], which can lead to intellectual property theft. Lastly, merely arranging data will not suffice to reap the full benefits of PLM: new algorithms and techniques such as data mining, AI, and machine learning can be employed to extract useful knowledge linkages, which will enhance our understanding of the data and ultimately improve the business.
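As a small example of the kind of knowledge linkage data mining can extract, the sketch below counts how often parts co-occur in historical change records; pairs that are frequently changed together hint at change-propagation links between components. The ECO data is fabricated for illustration:

```python
from collections import Counter
from itertools import combinations

# Fabricated historical change records: each ECO lists the parts it touched.
HISTORICAL_ECOS = [
    {"bracket", "fastener"},
    {"bracket", "fastener", "housing"},
    {"housing", "gasket"},
    {"bracket", "fastener"},
]

def cochange_counts(ecos):
    """Count how often each unordered pair of parts appears in the same ECO."""
    counts = Counter()
    for parts in ecos:
        for pair in combinations(sorted(parts), 2):
            counts[pair] += 1
    return counts

def strongest_link(ecos):
    """Return the (pair, count) of parts most often changed together."""
    return cochange_counts(ecos).most_common(1)[0]
```

Even this simple co-occurrence count is a first step toward the change-impact prediction that more sophisticated mining and machine learning techniques aim at.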

Conclusions
This research has answered the research objectives by providing an overview of PLM and a summary of CM methods from the literature (RQ1). A review of model-based definition, digital twin, the process-based semantic approach, service-oriented architecture, the Unified Modeling Language, and unified feature modeling has been presented (RQ2). The gaps between the extents of conformance to success factors have been identified as the extent of integration, standardization, versatility of application, support of existing systems, and the extent of product lifecycle management support (RQ3). Data sciences, including machine learning, neural networks, genetic algorithms, Industry 4.0, and Big Data, call for more dynamic and robust engineering change management schemes. Prognostics and diagnostics will continue to become more interlinked, and the methodologies discussed in this paper will continue to evolve to meet the needs of industry. Future work should standardize these languages and methodologies according to the new demands of data science; this will require standards organizations to be more proactive in proposing new industry standards, which will be necessary to shape the future. As highlighted in this review, the industry currently lacks standardization; to move forward, frameworks need to be established that are homogeneous and can be utilized across a range of industries. How PLM systems and these frameworks assimilate the required changes is an opportunity for further study.