IT strategy formulation, also known as Strategic Information System Planning (SISP), is experiencing a period of far-reaching renewal that affects both the content and the processes of strategy making [1]. This is due to two reasons of a different nature: the convergence of business and IT strategies in what has been called Digital Transformation [2] and, from a research standpoint, the reinforcement of the “Strategy as Practice” school [3]. Strategy is now considered an ongoing social process, and the literature has turned to “the realities of strategy formation” [5] (p. 372), with an increased focus on incremental planning, program implementation and organizational learning. However, along this evolution, not much attention has been paid to IS strategy implementation by itself, let alone the assessment of the implementation process and outcomes [6].
On our part, following Earl [8] and Peppard [9], we work from the assumption that evaluating the implementation of IT strategic plans on a periodical basis is key: (a) to attain the results of the intended strategies; (b) to adapt and update them to emerging threats and opportunities; (c) to facilitate common understanding and ownership of the information projects between the business and the IT leaders; and (d) to ensure organizational learning and transformation. The latter has been considered one of the most challenging issues facing academic institutions [10]. This kind of exercise needs to be carefully adapted to each individual context, taking into account the design criteria of the original Plan and the conditions of execution over time, and it needs to be properly conducted, executed and communicated.
This article is part of a long-range, practice-oriented research effort on the process of IT strategy making at the Universitat Oberta de Catalunya (UOC), a leading European online university. We participated in the preparation of its SISP (named the Information Systems Master Plan, ISMP) [11] in 2014, in the evaluation of its execution, conducted through the Summer of 2017, and in its update in 2018. The first author of this paper (hereafter, “the researcher”) was a member of the lead team of the project, working in a cooperative Action Design Research mode [12]. With this approach, we intend both to conduct research while being involved in a specific customer practice (“action”) and to create artifacts that may prove useful for practitioners and academicians (“design”). The piece we present here collects the process, methods and results of the evaluation or assessment phase and the major consequences for the update and further development of the Plan.
Our aim is to validate existing models for assessing the implementation of IT strategies in complex organizations and to explore a novel framework that may prove useful both for practitioners and researchers. Therefore, our goal is not to provide an external oversight or audit of the institutional practices, or to compare the results of the assessment with an existing construct or with other results, but to analyze the assessment process itself, in order to improve the existing methods and explore avenues of generalization of the proposed scheme in different contexts.
For that purpose, following the suggestion of “cycles and choices” by Salmela and Spil [13], we designed a tentative analytical model made up of three dimensions and two related concepts for each dimension. These categories are: (1) Strategy (meaning alignment and ability to respond); (2) Performance (which includes benefits realization and program execution); and (3) Governance (covering satisfaction of the major stakeholders and the mechanisms of program governance). We interpreted each dimension in the light of the relevant literature, both professional and academic, from very different fields. We found that, when dealing with execution, the researcher is drawn beyond strategic planning to studies on impact, success, satisfaction, architecture and program management, among others.
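The three dimensions and their paired key concepts can be sketched, purely as an illustration, as a small data structure with a toy scoring rule. The dimension and concept names follow the model above; the 1–5 rating scale and the averaging rule are our own assumptions for the sketch, not part of the framework itself.

```python
# Hypothetical sketch of the assessment scheme as a data structure.
# Dimension/concept names follow the paper; the 1-5 scale and the
# averaging rule are illustrative assumptions, not the paper's method.

ASSESSMENT_MODEL = {
    "Strategy":    ["alignment", "ability to respond"],
    "Performance": ["benefits realization", "program execution"],
    "Governance":  ["stakeholder satisfaction", "program governance"],
}

def dimension_score(ratings: dict) -> float:
    """Average the 1-5 ratings given to the key concepts of one dimension."""
    if not ratings:
        raise ValueError("no ratings provided")
    for concept, value in ratings.items():
        if not 1 <= value <= 5:
            raise ValueError(f"rating out of range for {concept!r}: {value}")
    return sum(ratings.values()) / len(ratings)

# Example: rating the Performance dimension of a fictitious plan.
performance = dimension_score({
    "benefits realization": 3,
    "program execution": 4,
})
print(performance)  # 3.5
```

In practice, as discussed below, each concept was assessed through documents, workshops and interviews rather than a single numeric score; the sketch only fixes the shape of the model.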
For the “alignment” dimension, we primarily used the canonical Strategic Alignment Model (SAM) by Henderson and Venkatraman [14] and some of its updates. For “intended vs. realized strategies”, we took the conceptual approach of Chan et al. [15] and its methodological applications by Kopmann et al. [16] and Spil and Salmela [13]. For “benefits realization”, we chose standard benefits libraries, such as the one provided by Hunter et al. [17]. For “program management”, we considered those authors who show a more strategic orientation, e.g., Dye and Pennypacker [18] and Thiry [19]. For “satisfaction” of stakeholders, we followed the guidelines from Galliers [20] and some classical references on the impact and success of IS. For “governance”, we chose professional sources, such as ISACA’s COBIT [21], among others. Section 2.1 presents a complete depiction of our main references.
This manuscript is an extension of an EMCIS 2018 paper [22]. We provide a more in-depth analysis of the state of the art, a detailed layout of the methods that make up the analytical framework and a further discussion of the case. This includes additional materials about its planning, budget and execution and a new section on the consequences for the updating of the Plan and its governance. We have also incorporated some quotations showing the point of view of the leading participants, as well as our own observations.
In the remainder of this paper, Section 2 summarizes relevant research in the field of IT strategy, with a special focus on the evaluation of IT strategy execution; it also presents and discusses the main references that ground our proposed analytical model. Section 3 provides basic information on the context and setting of the research. Section 4 presents the research approach, methods and tools and the project organization and planning. Section 5 highlights the main results of the evaluation process, presents some samples of the artifacts and outcomes of the exercise and summarizes the consequences and further developments that are being applied by the institution after the evaluation project. Finally, Section 6 concludes with a discussion and proposals for academicians and practitioners.
2. Related Research
The study of Strategic Information Systems Planning (SISP) as a formal exercise “to provide an organization with a comprehensive portfolio of IT assets to support its major business processes and to enable its transformation, achieving durable competitive advantage” [23] has been the object of substantial scholarly attention since the 1980s. The works by Earl [8], refined over time, and by Lederer [23], with different co-authors, remain the reference works of the classical school of IT strategic planning. In Earl’s view, plans should be checked and updated every year. Taking into account the reported lack of implementation, or severe implementation issues, the focus was put on the quality of the design and formulation of the IT strategy, and a series of articles was published intending to identify critical success factors (CSF) and prescriptions for better strategy planning [23]. The major empirical basis on CSFs for better planning is the extensive research by Doherty et al. [29], following Earl’s model.
Nevertheless, much less interest has been paid to IS strategy implementation, let alone the evaluation of the implementation process and results, which is the focus of this work [6]. In 2008, Teubner and Mocker [6] reviewed a sample of 434 papers published in major MIS journals between 1977 and 2001; of those, only 21 were related to implementation. With a different methodology, in 2013, Amrollahi et al. [7] found nine papers on implementation and eight on evaluation, out of 102 papers on SISP published between 2000 and 2009. After reviewing those papers and some more recent ones, we chose the comprehensive approach to continuous planning by Salmela and Spil [13], who proposed a framework of “cycles” and “choices” of planning that can be flexibly adapted to the needs, context and maturity of each organization and can be improved and refined over time.
Nowadays, the prevalent school in the field of IT strategy (as was the case shortly before for business strategy) advocates the consideration of strategic planning as a contingent [38] and ongoing social process in organizations [39], aimed at building up new capabilities and blurring the dichotomy between the MIS and business domains [41]. This has been called strategizing, or strategy as practice. Subsequently, a more comprehensive “practice turn” in strategy research has been proposed [3]. The literature has shifted towards “the realities of strategy formation” [5], such as incremental planning and cultural and organizational transformation. “Agile” strategic planning [45], or even, in some cases, the lack of an explicit and formal IT strategy plan, have been identified as predictors of better IT strategy execution [46]. A summary of these trends is shown in Table 1.
As regards higher education, SISP and IT strategy formulation as instruments for industry transformation and the acquisition of competitive advantage are not rare in the literature [10]. This has been the object of single or multiple case studies in different contexts [53]. Sabherwal [48] and Clarke [56] analyzed the combination of different methods in strategy planning. Kirinic [52] introduced the concept of maturity levels. Effective use and adoption of IT, together with social and organizational issues such as governance, management of stakeholders, technology adoption and change management (e.g., “strategies for winning faculty support for teaching with Technology” [49]), have been the major topics of interest [10].
2.1. Main References
Following Spil and Salmela’s [13] approach, we selected from the analysis of the academic and professional literature, and discussed with the customer (“customer” being the usual term in Action Research [70]), a model of assessment aimed at examining the main achievements and pitfalls over the execution of the Plan. In our conceptual research and practical intervention, we found that IT strategy execution is a space where strategy planning, IT governance, enterprise architecture, project portfolio management and the “success” of information systems, among other disciplines, necessarily meet.
From these suggestions and others of a practical nature (information availability, coordination costs and time-frame), we designed a scheme based on three major conceptual dimensions of analysis and two categories of key concepts for each dimension (Table 2): (a) the contribution of the IS strategy to the business strategy; (b) the effectiveness of the strategy implementation; and (c) the quality of governance. Next, we reflect on the main sources used for each analytical dimension.
The first dimension, Strategy, represents strategic alignment over time. We borrow first the Strategic Alignment Model (SAM) as proposed by Henderson and Venkatraman [14], taking into account some more recent discussions, e.g., those of Chan and Reich [71], Coltman et al. [72] and Juiz and Toomey [73]. As for the ability to cope with new threats and opportunities, we took the classical approach by Mintzberg and Waters [74] for business strategy and by Chan et al. [15] for IT strategy. Kopmann et al. [16] and Salmela and Spil [13] provide further practical considerations for the method of application.
The second dimension, Performance, relates to the effectiveness of IS strategy application, either from an external point of view (the acquisition of business benefits) or an internal one (the quality and efficiency of program execution). For benefits realization, we took the repertoire of methods and libraries of examples provided by different academic and professional authors, mainly Parker et al. [76], Ashurst et al. [77], Ward and Daniel [78] and Hunter et al. [17] from Gartner. As to program execution, we chose authors who provide a more strategically driven approach to project portfolio management, such as Dye and Pennypacker [18], Thiry [19], Meskendahl [79] and, again, Kopmann et al. [16]. Under the third dimension, Governance, we include both the feedback provided by key participants about the implementation of the strategy and those issues related to program management and governance mechanisms. As to the first part, we took advantage of the guidelines provided by Galliers [20]; we also drew on some classical references of the literature on impact, as in Gable et al. [82], and on the success of IS, as in the works by DeLone and McLean [80] and Petter et al. [83]. Program governance and management, from a strategic standpoint, is well represented in the books edited by Dye and Pennypacker [18] and by Thiry [19] and in the article by Bartenschlager et al. [84]. Finally, ISACA’s COBIT [21] is a professional standard that is attracting increased academic interest, especially in the higher education industry.
The application of these concepts into specific methods is shown in Section 4 and Section 5.
3. Research Context
The UOC is the oldest fully online university in the world. Founded in 1995, it now enrolls 70,000 students, has 400 full-time professors and 4300 associate part-time professors, provides 77 official graduate programs and operates on a budget of 108.7 M€ (millions of euros). It operates within a public–private funding and governance regime, in a highly regulated environment. The current governing body, appointed in 2013, designed an ambitious growth and transformation strategy [85], of which the ISMP for the period 2014–2018 was an instrumental part. The annual budget allocated to the Plan is about 3 M€, out of a total IT budget of 8.7 M€. The IS department (led by an IT Director or “CIO”, who reports to the Chief Operations Officer) has 53 internal and 85 external full-time employees.
The ratio of IT expenditure to revenue and the weight of the strategic or “transformational” projects within the portfolio of IT assets are substantial and can be compared with the figures of digital industries [86], such as software and Internet services. Being a pure digital player makes the effective exploitation of information technologies paramount for the UOC in the global and rapidly evolving market of higher education and life-long learning [87].
The ISMP was structured in ten “strategic initiatives” (meaning groups of programs and projects aimed at a single business objective) and 42 individual projects to be deployed over a period of four years (2015–2018). Table 3 shows the major strategic initiatives that form the ISMP.
The themes of the original ISMP were oriented to the transformation of business processes and IT infrastructure. It is worth noting the weight of the cross-organizational programs covering the core processes of the value chain (CRM, SIS, LMS and BI), which amount to 29.3% of the total, and of those considered “infrastructural” (IT Architecture and Infrastructure, Mobile, UX and Security), representing 43.1% of the planned budget. Also noticeable is the small proportion allocated to the Learning Management System, as compared to other management and infrastructure processes. When discussing this issue, the former COO and main sponsor of the ISMP cited the lack of a clear view of how the LMS should evolve, the difficulty of articulating strong leadership within the academy and the absence of robust, mature technology solutions in the marketplace as the main reasons for this outcome.
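As a quick sanity check on the budget shares quoted above (the 29.3% and 43.1% figures come from the text; everything else is simple arithmetic), the share left for all remaining initiatives, including the LMS, can be computed as follows:

```python
# Budget shares of the original ISMP, as quoted in the text.
cross_organizational = 29.3  # CRM, SIS, LMS, BI: core value-chain programs
infrastructural = 43.1       # IT Architecture and Infrastructure, Mobile, UX, Security

combined = cross_organizational + infrastructural
remainder = 100.0 - combined  # share left for all other initiatives

print(f"combined:  {combined:.1f}%")   # combined:  72.4%
print(f"remainder: {remainder:.1f}%")  # remainder: 27.6%
```

That is, the two largest groups of programs concentrate roughly three quarters of the planned budget, leaving about a quarter for everything else.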
Since its original conception, the ISMP was designed as: (a) a top-down transformation program; (b) addressed to renewing the core business applications and the technology infrastructure base; (c) ruled by the top management; (d) led and executed by the CIO (Chief Information Officer); and (e) supported by a program office [11].
As to the assessment, which is the object of this paper, it is the initial part of a broader exercise to update the ISMP, that is: (a) to re-align the Plan with the strategic assumptions of the business (as expressed in the update of the institution’s Strategic Plan); (b) to identify new IT strategic initiatives for the years to come; (c) to obtain new funding for them; and (d) to set up new governance arrangements. In the words of the current COO and former CIO:
“We learned that Strategic Planning was the proper way to focus and preserve transformation and that transformation was not still complete... and maybe never will.”
The whole project of updating the Plan was executed from June 2017 to March 2018. The assessment phase, which is the object of this article, took place between June and September 2017.
4. Research Model: Methods and Arrangements
4.1. Research Methods
The overall approach of this research has been Action Design Research [12]. A toolkit has been proposed for the deployment of the assessment. It is grounded on the analytical dimensions presented in the related research section and combines different methods, tools and techniques. For instance, a case study stance was taken to understand the original ISMP and the changes produced over time, and an independent quantitative and qualitative survey was commissioned to capture the satisfaction and feedback of the major stakeholders. These different work-streams are correlated, and the process works through a number of iterations. The timing, content and setting of individual and group interactions were critical, as was their preparation through previous analysis of the bulk of materials produced by the program office and the project leaders. A summary of this toolkit is presented in Table 4. The assessment was completed in ten weeks. Forty-two people of different ranks (mainly top and middle managers) took part, with an estimated effort of 800 man-hours (for a better understanding of the different participants and their roles, please see Section 4.2).
To complete our research purposes, an additional round of in-depth reflective interviews with members of the project team, the sponsors and the Steering Committee was conducted between October and December 2017. Some of their insights are reflected in this paper.
4.2. Project Organization
To conduct this effort and to prepare a proposal for the Executive Board of the University, a Steering Committee (SC) and a project team (PT) were established. The researcher was commissioned by the university as the project co-leader. He took part in most of the workshops and meetings and personally carried out individual interviews with prominent members of the management and the faculty. This commission was made explicit, both as a support to the management and as an Action Research exercise, and a Memorandum of Understanding between the institution and the researcher was signed. The researcher could work with scientific rigor, freedom of action and independence, but his proposals regarding the method had to be adapted to the available information and the organizational context, within a demanding time-frame. He also intervened in the preparation and discussion of the conclusions with the PT. An organization chart of the project is shown in Table 5.
Let us consider the governance arrangements for the assessment project. First, it is worth recalling that the main object of the assessment was to build up a case for the updating and renewal of the ISMP. This explains the leading role of the CIO, who reports to the COO, and why both took the status of sponsors.
During the execution of the ISMP over the past years, the governance focus had been put on monitoring and controlling the program and approving budgets and major changes; a small committee formed by the CEO, two deputy Managing Directors (the COO and CFO) and the CIO was in charge. In contrast, for the evaluation, the Steering Committee was extended with the involvement of the Vice-Chancellor of Teaching and the dean of one of the schools. This sent an early signal to the academy of the intention to re-balance the power of decision and resource allocation and to increase the weight of those projects related to learning and research. Finally, this new scheme facilitated the involvement of a large number of participants in the evaluation process.
The project team included not only the members of the Program Management Office in charge of the ISMP, but also the IT Demand Manager and her teams, i.e., the group of IT managers with the best knowledge of, and closest relationships with, the leaders of the business and the academy and their needs. This combination should also allow better reflection on the value provided by the Plan to the business outcomes and, to some extent, break the flawed customer–provider relationship between IT and users.
In our view, those arrangements were vital not only for the success of the project but also for the further governance of the whole program, i.e., the updated ISMP. The assessment acted as a trial of, and an advance towards, subsequent schemes of planning, organization and governance.
4.3. Project Plan
A more detailed outline of the project plan is shown in Table 6. Four work-streams were set up, based on the conceptual dimensions of analysis and adapted to the practical working context: (1) Strategic Alignment; (2) Program Execution; (3) Benefits Realization; and (4) Program Management. The work did not proceed in a strictly sequential manner. The working plan was as follows. First, the IT teams put together the internal information and elaborated their own estimates. Later, the Project Management Office, also constituted as the project office of the assessment exercise, normalized these inputs. Finally, the project team, with the support of the Head of the IT Business Partnership group, prepared the sessions with the business units.
The latter was an unusual event for both parties: working out program execution or business benefits from IT projects (work-stream 2), eliciting success metrics (work-stream 3) and taking a common stance was not easy. Often, the business units’ approach was to focus on pending demands and to take advantage of the event to formulate new claims; sometimes the IT leaders took a defensive stance. However, in most cases, the exercise was perceived as positive and established a common ground for the phases to come: building up a shared view of IT initiatives in terms of business benefits and a joint commitment to determine new demands for the updated program. It was also an opportunity for growth for the IT teams, who were more accustomed to executing well-defined projects with contractors than to reflecting on business value and interacting with business leaders and managers. The researcher participated in most of these workshops.
For work-streams 1 (strategic alignment) and 4 (program management), the approach was quite different. The project office prepared draft documents to compare the initial outline, the actual results and the alignment with the recently updated Strategic Plan of the UOC. The researcher, acting as project co-director, held individual discussions with the top management. These differences in approach are worth noting: when dealing with business strategy issues, the project leaders needed to evaluate carefully, for every section of the assignment, which parties should be involved and how to work with them.
Finally, it is worth mentioning the importance of providing ongoing feedback to the sponsors and of preparing “professional” (not academic) materials and presentations for the SC.
Although under new forms and labels, IT strategy making is still a major concern for IT and business executives and managers. The current paradigm advocates an ongoing, contingent social process of strategy formation, or “strategy as practice”; this paper adheres to that stance. However, the academic and professional literature has paid less attention to the evaluation of the implementation of IT strategies, notwithstanding the reported lack of implementation and the failures and pitfalls of implementation attempts. In our view, plans should be assessed on a periodical basis to secure continuous business alignment, review priorities, reinforce organizational commitment to major IT plans, improve the conversation between business and IT and capture the benefits of investments. This article, after an extended review of the state of the art and an Action Design Research exercise, aims to contribute towards a dynamic framework for the assessment of the execution of IT strategic plans.
For this purpose, we have suggested selecting three main dimensions of analysis: (1) Strategy (which observes strategic alignment and the response to emergent business strategies); (2) Performance (in terms of benefits realization and program execution); and (3) Governance (including the perception of major stakeholders and the mechanisms of decision making). It is worth mentioning that, when studying execution, this work goes beyond the field of classical Strategic Planning into other domains, such as IT Governance, Project Portfolio Management, Enterprise Architecture and the concepts of impact and success of IS. No dimension can be considered in isolation: each is related to the rest and forms part of a comprehensive view. Of all these, the concept most difficult to capture and to work through with the teams is benefits realization, i.e., the way that the selection, management and execution of IT investments make a difference and create value for the business. In our view, it is also the most critical one, since it improves the quality and effectiveness of the dialogue and collaborative work between IT and business, helps to set priorities and secures the return on IT investments.
With respect to the assessment model we have proposed, the selection of variables and metrics and their measurement should be reviewed through further research and actual implementation. We also suggest that a specific dimension related to organizational learning and business transformation should be developed and integrated within the model. Moreover, the variables related to benefits realization need to be refined within each specific context. To create better choices of analysis and intervention, we recommend a thorough examination of various contexts of application and the creation of improved maturity models.
Judging from the process and its results, the proposed method seems to be a quick, effective and efficient approach, in agreement with our initial working hypothesis and the literature. It also seems to be a practical tool to renew the conversation among the IT leaders, top management and major constituencies and to prompt actions for improvement.
Regarding future work, we plan to repeat the exercise periodically to validate and improve this approach. We will also compare the specific results of the analysis of our case with other reported cases.
6.2. Practical Implications
The assessment occurs in a short time-frame, through document analysis and intensive individual and group interactions. The governance, preparation, content, setting and selection of participants are all crucial. Decisions about who participates, in which role and how actively they engage over the process affect not only the effectiveness and quality of the exercise, but also the involvement of the different stakeholders, the visibility and credibility of the effort and, even more, the implications and further developments for the organization. The exercise is complemented with a number of reflective interviews to better understand the process, results and consequences for research purposes.
It may be said that the process is part of the product: the overall outcome seems to be an improved understanding of, and commitment to (a buy-in), the Plan among top and middle managers. The exercise is not an audit, but an effective lever for the understanding and improvement of the organization’s performance. Planning and execution occur in a specific and evolving context; thus, the methods need to be customized and the conclusions interpreted under these conditions. A deep involvement of the researcher in the project team is relevant, without diminishing his independence and academic space.
As to the results of the assessment, in general terms, it may be stated that the closer the results are to the design criteria of the Plan (or to the formally accepted change criteria over its execution), the more successful the final results may be considered. Consequently, the “success” of the implementation of a strategic information system plan will mainly be measured by its ability to effectively achieve the objectives the organization wanted to achieve, as perceived by its major stakeholders. This perception may appear misled or biased in the eyes of the researcher for different reasons; however, the objective of the assessment is not to find the truth, but to confront those perceptions with evidence, create a helpful room for discussion and assist the management in reacting to the results. On the other hand, execution appears to be mainly related to the ability of IT and the business to “mature” the strategic initiatives defined in the SISP into actionable projects, the certainty of the technical solutions at hand and the strength of leadership. A tension appears between strategic impact and plain execution. Finally, strategy implementation seems closely linked with project portfolio management, and the results show the need to develop further capabilities in this area.
The results of the assessment have been taken into consideration for the update of the Plan and its governance mechanisms, which are currently being worked on. Our research includes the support of the governing and management teams, and new results will be submitted for publication shortly.