Article

Managing the Performance of Asset Acquisition and Operation with Decision Support Tools

1 Department of Mechanical Engineering, The University of Bath, Bath BA2 7AY, UK
2 National Grid, Warwick CV34 6DA, UK
* Author to whom correspondence should be addressed.
CivilEng 2020, 1(1), 10-25; https://doi.org/10.3390/civileng1010002
Submission received: 2 March 2020 / Revised: 3 June 2020 / Accepted: 3 June 2020 / Published: 9 June 2020
(This article belongs to the Special Issue Addressing Risk in Engineering Asset Management)

Abstract
Decision support tools (DSTs) are increasingly being used to assist with asset acquisition and management decisions. Whether these tools are “fit for purpose” will have both economic and non-economic implications. Despite this, the on-going governance of DST performance receives only limited attention within both the academic and industry literature. This work addresses that research gap. Within this paper, a conceptual process for managing the operational performance of decision support tools is presented. The novelty of the approach is that it aligns with the ISO 5500x:2014 Asset Management Standard, thereby introducing consistency between the governance of DSTs and that of physical engineered assets. A case study of the UK’s National Grid Electricity Transmission (NGET) is used to inform the approach design. The evaluation demonstrates the approach to be both logical and useable within the context of NGET, and NGET have expressed an intention to implement it within their business. A key finding of the research was that DSTs transcend functions and organisations. This is significant and can lead to a situation whereby performance and criticality are interpreted and measured differently. The introduction of a common approach for managing DST performance works towards improving consistency and creating a shared understanding.

1. Introduction

Improvements in economic performance and human well-being are intrinsically linked to strategic infrastructure. To meet objectives in these areas, global spending on infrastructure will significantly increase, rising from USD 4 trillion per year in 2012 to an estimated USD 9 trillion per year by 2025 [1]. One way to reduce the amount needing to be spent is to make the money invested deliver greater returns—to improve asset investment productivity. Estimates suggest that improvements in the selection, build, operation, and maintenance of infrastructure have the potential to save USD 1 trillion per annum [2]. To realise these efficiencies, infrastructure organisations are making use of manual and computerised tools to assist with asset acquisition and operational decisions. Within the sector these are commonly referred to as decision support tools (DSTs).
DSTs have been introduced to improve the efficiency, effectiveness, and consistency of asset decisions. However, benefits are not guaranteed. Across both the business process and information systems areas, there are examples of new business initiatives where expected gains have either not been realised, or initially realised but not sustained [3,4,5,6,7,8,9,10,11,12]. Despite this challenge, the literature shows only limited attention given to the governance of DST performance. This is significant. If DSTs are making sub-optimal asset decisions, this will adversely affect asset investment productivity.
The research presented within this paper was conducted in collaboration with the Engineering and Physical Sciences Research Council (EPSRC) and National Grid Electricity Transmission (NGET) under an Industrial Case Award (iCASE). Within this work, we assert that under the wording of the International Standard for Asset Management (ISO 5500x: 2014), DSTs can be considered to be organisational assets. Therefore, for consistency, organisations should manage them in the same way that they manage their physical engineered assets.
The output of this 42-month project was a conceptual approach for managing the performance of DSTs used within an asset management context. The approach consists of a DST performance management process and DST performance assessment techniques—the methods for applying the process in an industry setting [13]. The focus of this paper is the DST performance management process.
The paper is structured as follows. Section 2 introduces the practice of asset management and the use of DSTs within this context; the literature provides evidence of the creation and use of DSTs, but also identifies a gap in the on-going governance of their performance. Section 3 describes the research approach, whereby a case study of National Grid Electricity Transmission (NGET) is used to inform the design of the approach. Section 4 details the DST performance management approach. Finally, the outcome of an evaluation by NGET subject matter experts is presented (Section 5), conclusions are formulated (Section 6), and future work is identified (Section 7).

2. Background

Managing assets and asset management (AM) are not the same thing [14]. Although there is a long history of caring for assets (managing assets), the structured approach for deriving value from assets (asset management) is an emerging discipline [14,15,16]. The main contributions to the field are considered to come from government and practitioners, with academic contributions being moderate [17]. Despite increasing contributions to the field, criticism of AM practice persists. Asset-owning organisations are accused of being unsystematic in their approach to projects, with many of their processes unchanged for decades [2]. ISO 5500x:2014 works towards addressing these criticisms through the introduction of an International Standard for AM practice.
Within the UK energy sector, the use of quality standards within AM operations has been strongly encouraged by the UK regulator, Ofgem [18]. The current ISO AM Standard adopts a top-down approach in which the objectives of the organisation are translated into a coordinated system of asset plans, policies, and processes. The principal ambition of AM is to realise the value of assets and, in doing so, support the achievement of organisational objectives.
Traditionally, the focus of AM systems was physical engineered assets. Replacement of the British Standard [19] by the International Standard [20] in 2014 redefined assets as anything which contributes potential or actual value to an organisation achieving its organisational objectives [20]. Innovative thinking in this regard has seen proposals to increase the scope of AM systems to include physical (non-engineered) assets, e.g., environmental assets [21] and people [22], as well as non-physical assets, e.g., data [23] and software [24]. Ultimately, the scope of the assets which are managed within an AM system is defined by the organisation and will be dependent upon external and internal factors including resources, legislative or licensing requirements, and the maturity of an organisation’s AM practice.

Decision Support Tools

The overarching aim of AM is to realise value from assets. Achieving this is not merely a case of maximising asset performance, but of achieving the optimum balance of performance, cost, opportunity, and risk throughout the lifecycle of the asset [20]. To support organisations in achieving this optimum balance, a range of DSTs are utilised.
A review of the academic literature shows that DSTs are being created to assist in making asset decisions. These tools can be broadly categorised into two groups: those which support decisions about strategy and new acquisitions [25,26,27,28], and those supporting decisions within an existing system, e.g., operation [29,30,31,32] and maintenance [33,34,35,36].
Although the academic literature provides evidence of DSTs being created, this does not necessarily mean that DSTs are being used in practice. Articles within the Institute of Asset Management (IAM) members’ magazine [37,38,39,40,41,42] and the Subject Specific Guidelines [43] provide an insight into DST use within the field.
Subject Specific Guidelines (SSGs) are a core element of the IAM’s asset management knowledge base. Created through peer review and IAM Expert Panel assessment, they represent the professional body’s current thinking on a particular subject. Although published in 2015, Subject 8: Life Cycle Value Realisation [43] provides the most recent consolidation of DST “good practice”. Within the SSGs, ten exemplars of DSTs used within the UK’s core infrastructure organisations (e.g., Network Rail, National Grid, Severn Trent Water, and London Underground) are provided. Unlike academic DSTs, which are strongly focused towards customised computer-based solutions, the ten exemplars show DSTs to include manual processes, computer-based tools utilising standard software (e.g., Access, Excel), and configured software solutions. Similar to the academic literature, they are seen to have a variety of attributes and are applied to a range of asset decision problems.
The evolutionary nature of DSTs underpins academic theory in this area [44,45,46,47,48]. Despite this, the focus of both the academic and industry literature is on proposing and presenting DST approaches with little consideration given to the management of their operational performance. Specifically, there were no performance governance approaches that aligned the management of DSTs with the requirements of the ISO 5500x:2014 Asset Management Standard.
Section 3 details the approach taken in creating the DST performance management process.

3. Research Approach

Case studies are particularly suited to understanding complex relationships in the context of social settings [49]. This research uses a case study of National Grid Electricity Transmission (NGET) to inform the design of an approach for managing DST performance. Within this section, the case study is presented (Section 3.1) and the method used in defining the approach requirements is detailed (Section 3.2).

3.1. Case Study—National Grid Electricity Transmission (NGET)

National Grid Electricity Transmission (NGET) own and operate the high-voltage electricity transmission network within England and Wales and operate (but do not own) the Scottish transmission network. As such, they play a central role in the UK power infrastructure sector. The nature of their business means that NGET is rich in physical assets (e.g., overhead lines, underground cables, substations), with an asset portfolio valued in excess of GBP 40 billion [50]. Over the eight years from 2013, a spending programme of GBP 13.6 billion capital expenditure (Capex) and GBP 16.4 billion total expenditure (Totex) is scheduled. Within NGET, this extensive asset portfolio is managed by way of an ISO 5500x:2014-aligned AM system.
To understand the use and governance of DSTs within NGET, an in-depth study was undertaken. Figure 1 identifies the inputs used in conducting this case study.
The case study showed that NGET was making extensive use of DSTs. These encompassed manual processes [51], computer-based databases and spreadsheets [52], and customised software solutions [53]. DST theory asserts that the nature of DSTs makes them cross-functional [46,47,48,54]. This was supported by the case study. Indeed, one example transcended not only functions but also organisations: it was built and revised by an external supplier, used data from both inside and outside of the organisation, was governed by NGET’s IT department, was used by NGET asset management engineers, and provided information to NGET decision-makers [53,55,56].
Gathering a full picture of the extent of DST use within NGET proved challenging, as no central register was held. Although no complete register was available, an inventory of “non-trivial” end-user computing (EUC) reports and databases was maintained. This provided a list of DSTs held locally, utilising standard computer software, i.e., Excel and Access [57]. Analysis of the inventory identified 195 different DSTs, with a variety of attributes and purposes, being used across seven different business functions.
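A central register of this kind is straightforward to represent. The sketch below is a minimal illustration of what one register entry might capture, assuming a register that records a tool’s category, owning function, and purpose; the field names and example values are hypothetical and are not drawn from the NGET inventory.

```python
from dataclasses import dataclass
from enum import Enum

class DstType(Enum):
    """The three DST categories observed in the NGET case study."""
    MANUAL = "manual process"
    STANDARD_SOFTWARE = "standard software (e.g., Excel, Access)"
    CONFIGURED = "configured software solution"

@dataclass
class DstRegisterEntry:
    """One row of a hypothetical central DST register.

    Field names are illustrative assumptions; the NGET EUC inventory
    is not published and may record different attributes.
    """
    name: str
    dst_type: DstType
    business_function: str  # e.g., one of the seven functions identified
    purpose: str
    owner: str              # accountable team or role

# Hypothetical example of registering a locally held spreadsheet tool.
entry = DstRegisterEntry(
    name="Spares forecasting workbook",
    dst_type=DstType.STANDARD_SOFTWARE,
    business_function="Asset Policy",
    purpose="Forecast strategic spares requirements",
    owner="Asset Management Development Engineer",
)
print(f"{entry.name}: {entry.dst_type.value}")
```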
The case study provided evidence that NGET was undertaking control and governance activities to ensure DST performance. However, these tended to be ad hoc, rather than coordinated through an embedded central process, and differing measures of DST criticality and performance were observed [51,55,57,58].

3.2. Defining the Approach Requirements

The goal of this research was to create an approach for managing DST performance which was rigorous, but also logical and useable within AM practice. In this regard, understanding the requirements of the stakeholders was key. Requirements engineering (RE) is concerned with managing the wants and needs of the stakeholders and translating these into a statement of what the design should be. The RE method used within this research was based on the cyclical approach proposed by Callele et al. [59]. Figure 2 presents the requirements engineering method and the activities undertaken at each of the four stages: elicitation, analysis, documentation, and validation.
During the elicitation (Stage 1), an NGET subject matter expert was asked to identify the approach stakeholders and their requirements. To mitigate the risk that stakeholders were missed, a stakeholder map was used to guide this activity (Figure 3). Four stakeholders were identified: owner, customers, employees, and government.
Next, the expert was asked to brainstorm the requirements of each. Brainstorming is a recognised technique that encourages freethinking [61]. This resulted in the generation of fourteen requirement statements (see Table 1).
Amongst the fourteen was the requirement that the approach must conform to the ISO AM Standard. The analysis (Stage 2) looked to identify any conflicts or omissions between the remaining thirteen stakeholder requirements and the Standard.
An exploration of the Standard identified that there were no specific requirements for a DST performance management approach. Rather, there were key concepts (principles, general requirements, and guidelines) which had to be met. The first step in conducting the analysis was to identify these key concepts.
Identification of the key concepts was undertaken using thematic analysis [62], supported by the NVivo computer-assisted qualitative data analysis software (CAQDAS). Table 2 provides details of the nine key concepts identified and an example excerpt coded to each theme.
Next, the NGET subject matter expert was asked to assess whether the stakeholder requirements were satisfied through the key concepts. The results are presented in Table 1.
Although no stakeholder requirement mapped directly onto a single key concept, the subject matter expert confirmed that each of the stakeholder requirements was satisfied by key concepts within the Standard. No conflicts or omissions were identified.
With the nine key concepts meeting the needs and wants of the stakeholders, these were documented as the approach requirements (Stage 3).
Validation (Stage 4) was undertaken by way of a questionnaire completed by two NGET subject matter experts. The case study had shown DST creation and operation to involve both NGET IT and engineering functions. Consequently, one participant was recruited from each area.
The questionnaire focused on validation of three areas: the stakeholders, the stakeholder requirements, and the approach requirements. The results are presented in Table 3.
The responses showed complete alignment across the two participants. Both agreed that there were no stakeholders who they considered should not be included (1A). Although both identified that suppliers had been omitted as a stakeholder (1B), neither considered that suppliers had any additional requirements beyond those which had already been identified (2B). The literature review, case study, and requirements engineering activity suggested there would be a benefit (or need) in aligning the DST management approach with the ISO AM Standard. The responses provided further support, with both respondents agreeing that the approach should conform to both the asset management and risk management standards (3A). Both respondents confirmed that the ten approach requirements were valid, with none unnecessary (4A) or omitted (4B).
The DST performance management approach which was informed by the requirements defined within this section is now presented.

4. The DST Performance Management Approach

The DST performance management approach encompasses two parts: the DST performance management process and DST performance assessment techniques (the methods for applying the process in an industry setting). Figure 4 shows how these are positioned within the AM hierarchy.
Defining the requirements for an asset management system, the International Standard for Asset Management (ISO 5500x:2014) occupies the highest level within the hierarchy.
An AM system comprises the policies and procedures for applying AM within an organisation [20].
Forming part of the AM system, the policy for DST control and governance defines how DSTs will be managed through life. The DST performance management process manages performance during the operational stage of a DST. Wider measures will be necessary to cover the complete lifecycle (i.e., creation, in-use, and end-of-life).
The DST performance management process provides the means through which DST operational performance is managed. It defines the steps for conducting a DST performance assessment and incorporates elements that ensure it meets the requirements for a quality-managed process as defined within ISO 9001:2015 Quality Management Systems [63].
The DST performance management techniques are the “how to” methods through which to conduct the steps contained within the process. As stated, the focus of this paper is on presenting the DST performance management process.

The DST Performance Management Process

Figure 5 presents the DST performance management process. The design of the process echoes that of the risk management process described within the ISO 31000:2009 Risk Management Standard [64]. Aligning the two processes was considered advantageous for four main reasons: (1) When defining risk and how to undertake risk management, the Asset Management Standard references the Risk Management Standard [20]. (2) ISO asserts that, for consistency and control, the risk management framework and processes should be integrated within other management systems [65]. (3) A stakeholder requirement was that the approach should comply with the Risk Management Standard. (4) The inclusion of the risk management process as part of the Risk Management Standard assures that it satisfies the requirements for a process as defined within the ISO Quality Management Standard [66].
Although similar in design, within the DST performance management process, the risk assessment is replaced by the DST performance assessment. Furthermore, the risk management process is visualised as isolated from other processes, whereas the DST performance management process is integrated with other management systems and/or governance processes at the treatment stage. Table 4 presents a comparison of the elements within the two processes.
The comparison shows three elements that are alike: “Communication and consultation”, “Monitoring and review”, and “Establishing the context”.
“Communication and consultation” and “Monitoring and review” are the elements through which continual improvement is embedded within the process. Communication and consultation ensure that stakeholder views are taken into consideration, that stakeholders are informed about how decisions are made, and that they are advised of any action taken.
“Monitoring and review” acts to deliver assurance of the quality (e.g., efficiency and effectiveness) of the process, embedding the ongoing monitoring and periodic review of a process and its outcomes.
“Establishing the context” (renamed “Scope, Context and Criteria” in the 2018 revision of the risk management process [65]) is the means through which a generic process is tailored to meet the requirements of an individual organisation. Within the risk management process, “Establishing the context” defines the objectives and the internal and external context, and sets the risk criteria for the remaining process. The DST performance management process mirrors this, defining the internal, external, and process context, and the rules for applying the rest of the process, i.e., the rules that an organisation will follow when carrying out the activities within the DST performance assessment. These include the rules for (see the sketch after this list):
  • Establishing how critical a DST is;
  • Determining which DSTs shall have their performance measured;
  • Evaluation of DST performance assessments;
  • The treatment applied given a certain performance assessment outcome.
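To make the role of these rules concrete, the sketch below encodes them as a small configuration object. It is an illustration only: the rule signatures, criticality labels, and the 0.8 threshold are assumptions rather than part of the published process, and a real organisation would substitute its own criteria when establishing the context.

```python
from dataclasses import dataclass
from typing import Any, Callable, Dict

@dataclass
class AssessmentContext:
    """Hypothetical encoding of the 'Establishing the context' rules.

    Each field corresponds to one rule category from the list above;
    the concrete scales and thresholds are organisation-specific.
    """
    criticality_of: Callable[[Dict[str, Any]], str]  # how critical is a DST?
    requires_measurement: Callable[[str], bool]      # which DSTs are measured?
    evaluate: Callable[[float], str]                 # evaluate an assessment result
    treatment_for: Callable[[str], str]              # treatment per evaluation outcome

# One possible tailoring: measure every 'critical' DST and refer
# any performance score below 0.8 to the wider governance process.
context = AssessmentContext(
    criticality_of=lambda dst: "critical" if dst.get("supports_regulatory_reporting") else "core",
    requires_measurement=lambda criticality: criticality == "critical",
    evaluate=lambda score: "acceptable" if score >= 0.8 else "deficient",
    treatment_for=lambda outcome: "no action" if outcome == "acceptable" else "refer to governance process",
)
```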
At the “Assessment” element, the processes differ. Although the DST performance assessment contains the same three steps as the risk assessment (identification, analysis, and evaluation), the activities undertaken within each are changed. Within the DST performance assessment, identification captures information about the DSTs used within the organisation. It delivers a complete list of all DSTs within scope (determined by the rules defined within “Establishing the context”), their business criticality, and information reporting their performance management activities and outcomes. Performance management reporting information is generated when the DST performance management process is applied.
Analysis applies the rules defined within “Establishing the context” to identify the DSTs that require their performance to be measured. Once identified, the performance of the DST is measured.
Evaluation applies the rules defined within the “Establishing the context” element to the results of the performance assessment. The outcome determines the treatment which is applied.
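Bringing the three steps together, the sketch below shows one pass through the assessment, reusing the AssessmentContext from the previous sketch. The measure_performance stub is a placeholder: the “how to” of measurement belongs to the companion assessment techniques, which this paper does not detail.

```python
def measure_performance(dst):
    """Placeholder for a DST performance assessment technique.

    A real implementation would apply one of the companion
    techniques; here we simply return a recorded score.
    """
    return dst.get("last_score", 1.0)

def run_performance_assessment(register, context):
    """Sketch of one pass through the DST performance assessment."""
    outcomes = {}
    for dst in register:
        # Identification: record the DST and establish its business criticality.
        criticality = context.criticality_of(dst)
        # Analysis: apply the context rules to decide whether to measure,
        # then measure the performance of the selected DSTs.
        if not context.requires_measurement(criticality):
            continue
        score = measure_performance(dst)
        # Evaluation: apply the context rules to the result and determine
        # the treatment to hand to the integrated governance processes.
        outcome = context.evaluate(score)
        outcomes[dst["name"]] = (outcome, context.treatment_for(outcome))
    return outcomes

# Hypothetical register entries, reusing the attributes assumed earlier.
register = [
    {"name": "Transformer condition ranking", "supports_regulatory_reporting": True, "last_score": 0.72},
    {"name": "Spares forecasting workbook", "supports_regulatory_reporting": False, "last_score": 0.95},
]
print(run_performance_assessment(register, context))
# -> {'Transformer condition ranking': ('deficient', 'refer to governance process')}
```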
The 2018 revision of the risk management process saw recording and reporting made explicit through its inclusion as a separate element [65]. Non-prescriptive, the element requires the activities and outcomes of the process to be reported “through appropriate mechanisms”. Decisions around the creation, retention, and handling of documented information are determined by the organisation and its context. Within the DST performance management process, “Recording and Reporting” is undertaken within the identification element, with the register containing both the detail of the asset and the performance management activities pertaining to it.

5. Evaluation

Evaluation was undertaken by way of a focus group comprised of five NGET subject matter experts. A focus group is a commonly used qualitative research approach that seeks to bring participants together to explore an idea [49,67,68].
The evaluation encompassed whether the experts felt that a research challenge existed, together with the strengths, weaknesses, opportunities, and threats of the DST performance management approach. Specific to the DST performance management process, it systematically explored:
The outputs at each step of defining the approach requirements:
  • Are there any stakeholders who have been missed/should not be included?
  • Are there any stakeholder requirements that have been missed/should not be included?
  • Are there any approach requirements which are missing/should not be included?
Whether the proposed conceptual DST performance management process was logical and usable:
  • Does the process appear to satisfy the ten approach requirements?
  • Does the process appear logical/usable for each of the three exemplar DSTs?
Note: logic and usability were considered in the context of three NGET DSTs [51,52,53]. These DSTs represent each of the categories of DST identified within NGET (manual, computer-based, and configured solutions).
The underpinning decision system theory identifies that a range of functions will be involved in the creation, build, maintenance, and operation of DSTs [46,47,48,54]. The NGET case study provided support for this in practice. Keen [47] considered the cross-functional nature of DSTs from the perspective of how they evolve. Within his work, he created a model of decision tool adaptation which identifies three actors (user, system, and builder). Alternatively, Alter [54] focused on decision tool use, identifying two categories of user: the intermediary (who uses the system) and the decision-maker (who applies the information to decision making).
Given the range of DST types, scopes, and attributes, it would not be possible to create a definitive list of participants who should be involved in the evaluation of a process for managing DST performance. In this study, the selection was guided by the literature, the case study, and the inputs of NGET senior management to ensure a comprehensive range of roles and an appropriate level of expertise. Table 5 presents details of the participants involved in the evaluation: their department, job title, and responsibilities in relation to DSTs. Although the case study provided evidence that DSTs can involve different organisations as well as functions, at this early stage of the research the involvement of external participants was not considered desirable.
To mitigate researcher bias, the focus group was facilitated, and responses were captured, by one of the participants. During the analysis, the captured comments and an audio recording of the focus group were analysed [13].
Approach requirements: The participants agreed that the stakeholders identified (owner, customers, employees, and government) represented the core stakeholders and that there were no stakeholder requirements identified which should not be included. Identifying whether there were missing requirements proved more challenging. This led to discussions of what a requirement statement actually meant and, consequently, whether further requirements were needed or were encompassed by an existing statement.
Logic and usability: Despite the discussion which arose around the approach requirements, the participants agreed that the process met the requirements. Additionally, it was agreed that it was logical and useable when considered in the context of each of the three NGET DSTs.
A key insight from the case study was that differences between functions have the potential to lead to inconsistency in governance. In addition to differences in how performance and criticality were measured and rated across DSTs, the focus group identified that other metrics for these two areas existed: “from a data perspective we follow the business continuity categories which are operationally critical, critical, core, and efficiency and performance”. Comments captured during the focus group also identified that “there will be different perceptions of performance”.

6. Conclusions

The academic theory recognises that unless decision tools adapt to changes in the environment, they will become less effective. Despite this, little attention has been given to managing DST performance following implementation. The research presented within this paper addresses that gap through the creation of a DST performance management process.
A case study of NGET shows them to be making extensive use of DSTs. Unlike the academic literature, where the focus is on customised computer-based solutions, the majority of tools used by NGET are held locally and utilise standard software packages such as Excel and Access. This is significant and suggests that without proper governance, there would be a lack of transparency around how asset decisions were being made.
The academic literature asserts that the nature of DSTs means they will transcend organisational functions. This is supported by the NGET case study which sees DSTs cross both functions and organisations. The evaluation of the DST performance management process identified measures and ratings of performance and criticality as two areas where differences arose. Potentially, this could introduce inefficiencies, miscommunication, and lead to difficulties in establishing where responsibility resides.
AM standards have moved to defining an asset as anything that is considered to contribute value towards achieving organisational objectives [20]. Within this work, we propose that DSTs are assets, and for consistency organisations should manage them in the same way as their physical engineered assets. In achieving this, the DST performance management process was designed to align with the ISO 5500x:2014 AM Standard. The evaluation of the process by NGET subject matter experts found it to be both logical and useable within the context of NGET.
Indications are that this research has the potential to create impact outside of academia. NGET have indicated their intention to implement the approach within their business [69]. Interest from the practitioner community is high with requests made to present the approach at the IAM Conference (November 2017), and for a special interest group of the British Computer Society (April 2018). Additionally, an overview of the approach has been published within the members’ magazines of both the IAM [70] and Chartered Quality Institute [71].

7. Future Work

Conducting quantitative research in a laboratory setting is relatively straightforward. There is one “reality”, which can be validated using recognised, statistical methods. The challenge of creating a DST performance approach which is both rigorous and practical within an industry context is complex. This is especially true when the research is innovative and the requirements for the approach have yet to emerge or crystallise. This research created a conceptual design. It is accepted that in the course of progressing through experimental and implementation stages, perceptions of what is possible, wanted, and needed will evolve. Consequently, requirements must be continually revisited and adjusted.
Future research efforts should be focused on three key areas:
  • The aim of this research was to create a conceptual approach for measuring DST performance which is both rigorous and practical. The logic and usability of the process have been validated by NGET subject matter experts. However, it is accepted that what may be logical and useable in theory, may not be so in practice. NGET have declared an intention to implement the process within their organisation [69]. Research efforts should look to conduct field test validation studies of both the process and underpinning techniques within both NGET and other organisations;
  • The transferability of the approach to the water sector was evaluated as part of the programme of work [13]. Although providing some evidence, further studies across a wider sample of sectors and organisations are required;
  • Key to the uptake of the approach by industry is being able to demonstrate that it has value. In this regard, methods and studies to assess the whole life cost versus the benefit of DST performance management are required.

Author Contributions

Conceptualization, S.L.; methodology, S.L.; software, n/a; validation, S.L.; formal analysis, S.L.; investigation, S.L.; resources, S.L. and D.D.; data curation, S.L.; writing—original draft preparation, S.L.; writing—review and editing, S.L.; visualization, S.L.; supervision, L.N., M.M. and D.D.; project administration, L.N. and D.D.; funding acquisition, L.N. and D.D. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by The Engineering and Physical Sciences Research Council (EPSRC) under an Industrial CASE award voucher number 13220126.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. PricewaterhouseCoopers LLP; Oxford Economics. Capital Project and Infrastructure Spending Outlook to 2025. 2015. Available online: https://www.gihub.org/resources/publications/capital-project-and-infrastructure-spending-outlook-to-2025/ (accessed on 7 June 2020).
  2. McKinsey Global Institute. Infrastructure Productivity: How to Save $1 Trillion a Year; McKinsey Global Institute: New York, NY, USA, 2013. [Google Scholar]
  3. Alavi, M.; Joachimsthaler, E.A. Revisiting DSS implementation research: A meta-analysis of the literature and suggestions for researchers. MIS Q. Manag. Inf. Syst. 1992, 16, 95–113. [Google Scholar] [CrossRef] [Green Version]
  4. Finlay, P.N.; Forghani, M. A classification of success factors for decision support systems. J. Strateg. Inf. Syst. 1998, 7, 52–70. [Google Scholar] [CrossRef]
  5. Hicks, B.J.; Matthews, J. The barriers to realising sustainable process improvement: A root cause analysis of paradigms for manufacturing systems improvement. Int. J. Comput. Integr. Manuf. 2010, 23, 585–602. [Google Scholar] [CrossRef] [Green Version]
  6. JISC. Change Management. Available online: https://www.jisc.ac.uk/guides/change-management (accessed on 28 January 2016).
  7. Salazar, A.J.; Sawyer, S. Information Technology in Organizations and Electronic Markets; World Scientific Publishing Co: Singapore, 2007. [Google Scholar]
  8. Sauer, C. Why Information Systems Fail: A Case Study Approach; Alfred Waller Ltd.: Henley-on-Thames, UK, 1993. [Google Scholar]
  9. Streit, J.; Pizka, M. Why Software Quality Improvement Fails (and How to Succeed Nevertheless). In Proceedings of the ICSE ’11: Proceedings of the 33rd International Conference on Software Engineering, Honolulu, HI, USA, 21–28 May 2011; ACM: New York, NY, USA, 2011; pp. 726–735. [Google Scholar]
  10. Studer, Q. Making process improvement stick. Healthc. Financ. Manag. 2014, 68, 90–96. [Google Scholar]
  11. The Standish Group Report. The Chaos Report. Available online: http://www.standishgroup.com/outline (accessed on 26 January 2018).
  12. Van Dyk, D.J.; Pretorius, L. A Systems Thinking Approach to the Sustainability of Quality Improvement Programmes. S. Afr. J. Ind. Eng. 2014, 25, 71. [Google Scholar] [CrossRef] [Green Version]
  13. Lattanzio, S. Asset Management Decision Support Tools: A Conceptual Approach for Managing their Performance; The University of Bath: Bath, UK, 2018. [Google Scholar]
  14. ISO. ISO/TC 251 Managing Assets in the Context of Asset Management; International Organization for Standardization: Geneva, Switzerland, 2017. [Google Scholar]
  15. Van Der Lei, T.; Herder, P.; Wijnia, Y. Asset Management: The State of the Art in Europe from A Life Cycle Perspective; Springer Netherlands: Dordrecht, The Netherlands, 2012; ISBN 9789400727243. [Google Scholar]
  16. Zuashkiani, A.; Schoenmaker, R.; Parlikad, A.; Jafari, M. A critical examination of asset management curriculum in Europe, North America and Australia. In Proceedings of the Asset Management Conference 2014, London, UK, 27–28 November 2014. [Google Scholar]
  17. Too, E.G. A Framework for Strategic Infrastructure Asset Management. In Definitions, Concepts and Scope of Engineering Asset Management Engineering Asset Management Review; Springer: London, UK, 2010; Volume 1, pp. 31–62. [Google Scholar]
  18. Ofgem. Refocusing Ofgem’s ARM 2002 Survey to BSI-PAS 55 Certification; 2005. Available online: https://www.ofgem.gov.uk/publications-and-updates/refocusing-ofgems-arm-2002-survey-bsi-pas-55-certification (accessed on 7 June 2020).
  19. British Standards Institute. PAS 55; British Standards Institute: London, UK, 2008. [Google Scholar]
  20. BS ISO 55000 Series: 2014 Asset Management; BSI Group: London, UK, 2014.
  21. Papacharalampou, C.; McManus, M.; Newnes, L.B.; Green, D. Catchment metabolism: Integrating natural capital in the asset management portfolio of the water sector. J. Clean. Prod. 2017, 142, 1994–2005. [Google Scholar] [CrossRef] [Green Version]
  22. Gibbons, P. People as Assets. In Assets; Institute of Asset Management: London, UK, 2019; pp. 9–11. [Google Scholar]
  23. Kaney, M. Data, data everywhere—But how do we use it? In Assets; Institute of Asset Management: London, UK, 2018; p. 8. [Google Scholar]
  24. BS ISO/IEC 19770-1:2017 Information Technology—IT Asset Management—Part 1: IT Asset Management Systems—Requirements. 2017. Available online: https://www.iso.org/obp/ui/#iso:std:iso-iec:19770:-1:ed-3:v1:en (accessed on 7 June 2020).
  25. Bhamidipati, S. Simulation framework for asset management in climate-change adaptation of transportation infrastructure. Transp. Res. Procedia 2015, 8, 17–28. [Google Scholar] [CrossRef]
  26. Doorley, R.; Pakrashi, V.; Szeto, W.Y.; Ghosh, B. Designing cycle networks to maximize health, environmental, and travel time impacts: An optimization-based approach. Int. J. Sustain. Transp. 2020, 14, 361–374. [Google Scholar] [CrossRef]
  27. Havelaar, M.; Jaspers, W.; Wolfert, A.; Van Nederveen, G.A.; Auping, W.L. Long-term planning within complex and dynamic infrastructure systems. In Proceedings of the Life-Cycle Analysis and Assessment in Civil Engineering: Towards an Integrated Vision: 6th International Symposium on Life-Cycle Civil Engineering (IALCCE 2018), Ghent, Belgium, 28–31 October 2018; pp. 993–998. [Google Scholar]
  28. Shahtaheri, Y.; Flint, M.M.; de la Garza, J.M. Sustainable Infrastructure Multi-Criteria Preference Assessment of Alternatives for Early Design. Autom. Constr. 2018, 96, 16–28. [Google Scholar] [CrossRef]
  29. Ouammi, A.; Achour, Y.; Zejli, D.; Dagdougui, H. Supervisory Model Predictive Control for Optimal Energy Management of Networked Smart Greenhouses Integrated Microgrid. IEEE Trans. Autom. Sci. Eng. 2020, 17, 117–128. [Google Scholar] [CrossRef]
  30. Rahmawati, S.D.; Whitson, C.H.; Foss, B.; Kuntadi, A. Integrated field operation and optimization. J. Pet. Sci. Eng. 2012, 81, 161–170. [Google Scholar] [CrossRef]
  31. Roberts, K.P.; Turner, D.A.; Coello, J.; Stringfellow, A.M.; Bello, I.A.; Powrie, W.; Watson, G.V.R. SWIMS: A dynamic life cycle-based optimisation and decision support tool for solid waste management. J. Clean. Prod. 2018, 196, 547–563. [Google Scholar] [CrossRef] [Green Version]
  32. Smith, J.S.; Safferman, S.I.; Saffron, C.M. Development and application of a decision support tool for biomass co-firing in existing coal-fired power plants. J. Clean. Prod. 2019, 236. [Google Scholar] [CrossRef]
  33. Dunn, R.; Harwood, K. Bridge Asset Management in Hertfordshire-Now and in the future. In Proceedings of the Asset Management Conference 2015, London, UK, 25–26 November 2015. [Google Scholar]
  34. Elsawah, H.; Bakry, I.; Moselhi, O. Decision support model for integrated risk assessment and prioritization of intervention plans of municipal infrastructure. J. Pipeline Syst. Eng. Pract. 2016, 7. [Google Scholar] [CrossRef]
  35. Geelen, C.V.C.; Yntema, D.R.; Molenaar, J.; Keesman, K.J. Monitoring Support for Water Distribution Systems based on Pressure Sensor Data. Water Resour. Manag. 2019, 33, 3339–3353. [Google Scholar] [CrossRef] [Green Version]
  36. Maniatis, G.; Williams, R.D.; Hoey, T.B.; Hicks, J.; Carroll, W. A decision support tool for assessing risks to above-ground river pipeline crossings. Proc. Inst. Civ. Eng. Water Manag. 2020, 173, 87–100. [Google Scholar] [CrossRef] [Green Version]
  37. Alian, A. Processing the past, predicting the future. In Assets; Institute of Asset Management: London, UK, 2018; pp. 12–15. [Google Scholar]
  38. Black, M. Navigating Data. In Assets; Institute of Asset Management: London, UK, 2018; pp. 14–16. [Google Scholar]
  39. DiMatteo, S. From manual to automatic. In Assets; Institute of Asset Management: London, UK, 2019; pp. 16–17. [Google Scholar]
  40. Herrin, C. Towering Achievements. In Assets; Institute of Asset Management: London, UK, 2019; p. 22. [Google Scholar]
  41. Hobbs, R. A new model for railways. In Assets; Institute of Asset Management: London, UK, 2020; pp. 24–26. [Google Scholar]
  42. Jacobs. Sifting the data lake. In Assets; Institute of Asset Management: London, UK, 2020; p. 5. [Google Scholar]
  43. IAM. Subject 8: Life Cycle Value Realisation; Institute of Asset Management: London, UK, 2015. [Google Scholar]
  44. Institute of Asset Management. Available online: https://theiam.org/knowledge/subject-8-life-cycle-value-realisation/#:~:text=The%20Life%20Cycle%20Value%20Realisation,or%20renewal%2Fdisposal%20of%20assets (accessed on 8 June 2020).
  45. Arnott, D. Decision support systems evolution: Framework, case study and research agenda. Eur. J. Inf. Syst. 2004, 13, 247–259. [Google Scholar] [CrossRef]
  46. Courbon, J.-C. User-Centred DSS Design and Implementation. In Implementing Systems for Supporting Management Decisions: Concepts, Methods and Experiences; Chapman & Hall: London, UK, 1996. [Google Scholar]
  47. Keen, P.G.W. Adaptive Design for Decision Support Systems. ACM Sigmis Database 1980, 1, 15–25. [Google Scholar] [CrossRef]
  48. Sprague, R.H. A framework for the development of decision support systems. MIS Q. 1980, 4, 1–26. [Google Scholar] [CrossRef]
  49. Denscombe, M. The Good Research Guide, 4th ed.; Open University Press: Berkshire, UK, 2010. [Google Scholar]
  50. National Grid. National Grid Annual Report and Accounts 2016/17; 2017. Available online: https://investors.nationalgrid.com/news-and-reports/reports/2016-17/plc (accessed on 7 June 2020).
  51. National Grid. Policy Statement (Transmission), WLVF PS(T); National Grid: Warwick, UK, 2013. [Google Scholar]
  52. National Grid. Network Output Measures Methodology; National Grid: Warwick, UK, 2010. [Google Scholar]
  53. National Grid. National Grid Analytics and Our SAM Journey So Far; National Grid: Warwick, UK, 2015. [Google Scholar]
  54. Alter, S. A taxonomy of Decision Support Systems. Sloan Manag. Rev. 1977, 19, 39. [Google Scholar]
  55. National Grid. SAM Benefits Update <Restricted Access>; National Grid: Warwick, UK, 2017. [Google Scholar]
  56. National Grid. Strategy Enabler: The Strategic Asset Management (SAM) Programme; National Grid: Warwick, UK, 2015. [Google Scholar]
  57. National Grid. Electricity Transmission EUC output <Restricted Output>; National Grid: Warwick, UK, 2017. [Google Scholar]
  58. Dunkley, D. Transcription of Semi-Structured Interview between Susan Lattanzio (SL), University of Bath, and Derrick Dunkley (DD), National Grid plc. Conducted 04/02/2015 <Restricted Access>; Lattanzio, S., Ed.; National Grid: Warwick, UK, 2015. [Google Scholar]
  59. Callele, D.; Wnuk, K.; Penzenstadler, B. New Frontiers for Requirements Engineering. In Proceedings of the 2017 IEEE 25th International Requirements Engineering Conference (RE), Lisbon, Portugal, 4–8 September 2017; pp. 184–193. [Google Scholar] [CrossRef]
  60. Freeman, R.E. Strategic Management a Stakeholder Approach; Pitman: Boston, MA, USA, 1984. [Google Scholar]
  61. Zowghi, D.; Coulin, C. Requirements Elicitation: A survey of Techniques, Approaches, and Tools. In Engineering and Managing Software Requirements; Springer: Berlin/Heidelberg, Germany, 2005. [Google Scholar]
  62. Braun, V.; Clarke, V. Using thematic analysis in psychology. Qual. Res. Psychol. 2006, 3, 77–101. [Google Scholar] [CrossRef] [Green Version]
  63. BS EN ISO 9001: 2015 Quality Management System Requirements; BSI Group: London, UK, 2015.
  64. BS ISO 31000: 2009 Risk Management-Principles and Guidelines; BSI Group: London, UK, 2009.
  65. BS ISO 31000 Series: 2018 Risk Management—Guidelines; BSI Group: London, UK, 2018.
  66. BS EN ISO 9000: 2015 Quality Management Systems; BSI Group: London, UK, 2015.
  67. Bell, J. Doing Your Research Project; Open University Press: Maidenhead, UK, 2010. [Google Scholar]
  68. Gray, D.E. Doing Research in the Real World; SAGE Publications: London, UK, 2014. [Google Scholar]
  69. ENA. Smarter Networks Portal. Available online: https://www.smarternetworks.org/project/nia_nget0153/documents (accessed on 4 May 2020).
  70. Lattanzio, S. Making the Right Choice. In Assets; Institute of Asset Management: London, UK, 2017; p. 8. [Google Scholar]
  71. Lattanzio, S. Decision Governance. In Qual. World; Chartered Institute of Quality Management: London, UK, 2019. [Google Scholar]
Figure 1. Case study input data.
Figure 2. Requirements engineering method.
Figure 3. Stakeholder map. Freeman [60].
Figure 4. Decision support tool (DST) performance management within an asset management hierarchy.
Figure 5. DST performance management process.
Table 1. Asset Management Standard’s key concepts mapped against stakeholder requirements.
Key concepts (columns): Process Based; Process Integration; Communication & Consultation; Evolving; Monitoring and Continual Improvement; Life Cycle Approach; Defined Leadership; Contextual; Risk Based.
Stakeholder requirements (rows):
  • Life-cycle value: achieving customer requirements and over-delivery of regulatory performance;
  • Industry compliant: conforms to ISO 55000 and ISO 31000;
  • Adaptable to asset base; satisfies data requirements and organisation systems;
  • Performance: to be agile; accurate tool that produces validated results;
  • Technical competence reflecting asset position and network risk;
  • Life-cycle management: safe, credible, economic and efficient;
  • Value: safe environmentally; adhering to the International Standard;
  • Delivers credible results;
  • Agile: can be upgraded;
  • We know how it works;
  • Safe, reliable, and efficient outputs which are understood;
  • Consistent with consumer value: mechanistic approach, easy to understand, translating inputs, process, outputs;
  • Transparent, consistent with Scots TOs;
  • Stable: repeatable and reproducible.
Table 2. ISO 5500x:2014 key concepts, each with an example excerpt from the Standard coded to that theme.
Process based: “The organization should develop processes to provide for the systematic measurement, monitoring, analysis and evaluation of the organization’s assets”.
Process integration: “A factor of successful asset management is the ability to integrate asset management processes, activities and data with those of other organizational functions, e.g. quality, accounting, safety, risk and human resources”.
Consultation and communication: “Failure to both communicate and consult in an appropriate way about asset management activities can in itself constitute a risk, because it could later prevent an organization from fulfilling its objectives”.
Evolving: “The organization should outline how it will establish, implement, maintain and improve the system”.
Monitoring and continual improvement: It is a “concept that is applicable to the assets, the asset management activities and the asset management system, including those activities or processes which are outsourced”. The identification of “Opportunities for improvement can be determined directly through monitoring the performance of the asset management system, and through monitoring asset performance”.
Life cycle approach: The stages of an asset’s life are undefined but “can start with the conception of the need for the asset, through to its disposal, and includes the managing of any potential post disposal liabilities”.
Defined leadership: The success in establishing, operating, and improving AM is dependent on the “leadership and commitment from all managerial levels”.
Contextual: What constitutes value is contextual; it “will depend on these objectives, the nature and purpose of the organization and the needs and expectations of its stakeholders”.
Risk based: “Asset management translates the organization’s objectives into asset-related decisions, plans and activities, using a risk-based approach”.
Table 3. Results of the approach requirements validation. Respondent 1: Information Quality Officer; Respondent 2: Analytics Development Leader.
1A. Are there any stakeholders identified who you feel should not be included? If so, provide detail and reasoning. Respondent 1: No. Respondent 2: No.
1B. Are there any approach stakeholders who you feel have not been identified? If so, provide detail and reasoning. Respondent 1: Suppliers, i.e., IBM, Wipro, who provide services to build the DST (SAM) platform. Respondent 2: Possibly suppliers, as they would have their own input to the process.
2A. Are there any stakeholder requirements you feel should not be included? If so, provide the detail and reasoning. Respondent 1: No. Respondent 2: No.
2B. Are there any stakeholder requirements which you feel have not been identified? If so, provide the detail and reasoning. Respondent 1: No. Respondent 2: No.
3A. Within the stakeholder requirements it was identified that the approach should conform to ISO 55000 and ISO 31000. Do you agree with that statement? YES/NO. If “No”, provide your reasoning. Respondent 1: Yes. Respondent 2: Yes.
4A. Are there any approach requirements you feel should not be included? If so, provide the detail and justification. Respondent 1: No. Respondent 2: No.
4B. Are there any approach requirements which you feel have not been identified? If so, provide the detail and justification. Respondent 1: No. Respondent 2: No.
Table 4. Comparison of the elements within the risk management and DST performance management processes.
Communication and consultation: common to both processes.
Monitoring and review: common to both processes.
Establishing the context *: common to both processes.
Assessment: risk assessment (risk management process); DST performance assessment (DST performance management process).
Recording and Reporting (added in the 2018 revision of the risk management process [65]): a separate element within the risk management process; incorporated within the identification stage of the DST performance assessment in the DST performance management process.
* Renamed Scope, Context and Criteria in the 2018 revision of the risk management process [65].
Table 5. Participants in the evaluation focus group (department; job title; responsibilities in relation to DSTs).
Participant 1: Asset Policy; Asset Management Development Engineer; manager of DST users (including tools used within regulatory reporting).
Participant 2: Process and Enablement; Information Quality Manager; assurance of asset data; governance of asset data and information.
Participant 3: Asset Policy; Asset Management Development Engineer; FMEA (failure mode and effects analysis) and risk modelling.
Participant 4: Asset Policy; Asset Management Development Engineer; DST modeller.
Participant 5: Asset Policy; Asset Management Development Engineer; asset risk modeller.
