Article

Overcoming Barriers in Supply Chain Analytics—Investigating Measures in LSCM Organizations

Chair of Logistics, Technische Universität Berlin, Straße des 17. Juni 135, 10623 Berlin, Germany
* Author to whom correspondence should be addressed.
Logistics 2020, 4(1), 5; https://doi.org/10.3390/logistics4010005
Submission received: 23 December 2019 / Revised: 16 February 2020 / Accepted: 17 February 2020 / Published: 26 February 2020
(This article belongs to the Section Artificial Intelligence, Logistics Analytics, and Automation)

Abstract

While supply chain analytics shows promise regarding value, benefits, and increased performance for logistics and supply chain management (LSCM) organizations, those organizations are often either reluctant to invest or unable to achieve the returns they aspire to. This article systematically explores the barriers LSCM organizations experience in employing supply chain analytics, which contribute to such reluctance and unachieved returns, as well as measures to overcome these barriers. The article therefore aims to systemize the barriers and measures and to allocate measures to barriers in order to provide organizations with directions on how to cope with their individual barriers. Using Grounded Theory based on 12 in-depth interviews and the Q-Methodology to synthesize the intended results, this article derives core categories for the barriers and measures, and their impacts and relationships are mapped based on empirical evidence from various actors along the supply chain. As a result, the article presents the core categories of barriers and measures, including their effect on different phases of the analytics solution life cycle, the explanation of these effects, and accompanying examples. Finally, to address the intended aim of providing directions to organizations, the article offers recommendations for overcoming the identified barriers in organizations.

1. Introduction

The business of logistics and supply chain management (LSCM) is changing rapidly based on new technologies and consumer trends that demand new and broad digital capabilities [1]. The phenomenon that is taking hold of this business, transforming it, and that demands action from the organizations in the market, is digitalization [2]. One of the inherent effects of digitalization is the constant growth of data, describing all sorts of aspects of the world and, thus, possessing potential insights about it and opportunities to act on those insights. This growth is not expected to halt or decelerate but to accelerate exponentially—a recent study estimates a growth of the global datasphere from 33 zettabytes in 2018 to 175 zettabytes in 2025 [3].
To leverage the growth of data and the opportunities of digitalization, the studies mentioned above emphasize the use of analytics. Analytics is the use of various quantitative, explanatory, and statistical methods on extensive amounts of data [4]. When focused on the context of logistics and supply chain management, it is specified as supply chain analytics (SCA) [5]. Scholars have established that SCA helps to improve organizational and supply chain performance, increase efficiency, reduce supply chain costs, and contribute to competitive advantage [6,7,8,9,10].
The implementation and application of SCA are, however, an immense challenge for organizations. Scholars and organizational reports alike have demonstrated a variety of barriers organizations face and the resulting effect of low implementation rates. Scholars have presented barriers ranging from management failures and lack of analytical knowledge to unwillingness to commit IT resources [9,10,11,12,13,14]. Organizational reports highlight the barriers of cost, insufficient data, and not achieving the aspired-to benefits [15,16,17].
However, these studies did not explicitly intend to explore such barriers but identified them in addition to their research objectives. Despite these studies clearly implying the existence of various barriers, scholars in the field of LSCM have rarely addressed the topic of measures to overcome these barriers. An exception is the study of Schoenherr and Speier-Pero [12], who provide some insight on factors contributing to the successful implementation of analytics in LSCM. However, research on practices and measures has concentrated on the research field of analytics in general. As a result, there has been no investigation of whether LSCM employs specific measures, which measures are most relevant for LSCM managers, and what impact they have. Thus, this study takes a specific look at the barriers that occur for organizations investing in analytics in the field of LSCM, with the additional intent of identifying which measures and practices are being applied to overcome these barriers. This research is intended to support logistics and supply chain managers with directions in which to go forward with SCA in a successful and impactful manner and to support the transformation of their supply chains into the digital age.
In particular, this study seeks to contribute to the following research objectives:
  • RO1: Identify and systematize the organizational barriers that appear when companies initiate, perform, or deploy SCA initiatives within the organization.
  • RO2: Identify and allocate organizational measures that seek to cope with the depicted barriers.
In pursuit of these objectives, 12 in-depth interviews were conducted with different actors along the supply chain, including suppliers, original equipment manufacturers (OEMs), retailers, and logistics service providers, as well as additional analytics providers with strong experience in LSCM. From the exploratory and extensive interviews, the study identified the barriers to initiating SCA initiatives, applying SCA, and deploying SCA solutions, and derived the organizations’ measures for coping with such barriers. The collected data was analyzed using the Grounded Theory approach [18]. This approach was supplemented by the Q-Methodology [19,20] to derive a theoretical framework of measures and barriers in SCA, mapping barriers and measures of SCA initiatives and their relationships. This contribution, in terms of systemizing barriers and measures, extracting core categories, and identifying proposed relationships, provides an important basis for further deductive research.
The remainder of the article is structured as follows. Section 2 provides an overview of the relevant theoretical background to this study. Section 3 presents and details the methodology. Section 4 contains the results and their discussion. The conclusion, including implications, further research directions, and limitations, is presented in Section 5.

2. Theoretical Background

As a basis for overcoming barriers that prevent the successful use of SCA solutions, this section introduces SCA with examples of its effects and reviews its previously documented barriers. Further, this section presents the recommended practices discussed in the non-domain-specific analytics literature as a preliminary analysis of available measures.

2.1. Supply Chain Analytics

SCA is understood as the domain-specific application of business analytics (or analytics), which is “concerned with evidence-based problem recognition and solving that happen within the context of business situations” [21]. In organizations, it is applied by analysts (“data scientists”) using the most advanced and complex tools and methods [4] or by process experts in the form of self-service analytics exploiting the accessibility of analytics in respective software for self-service [22]. Analytics is an essential component of generating value from “Big Data” [23]. Analytics has been reported to increase decision-making effectiveness, can generate organizational agility (given a fit between analytics tools, data, people, and tasks), and can contribute to providing competitive advantage [24,25,26].
SCA is specific to the domain of LSCM. LSCM is concerned with efficiently integrating the actors of the supply chain (suppliers, manufacturers, logistics service providers such as warehouses and transportation, retail) such that products are manufactured and distributed to the customer in the right amount, at the right time, at the right quality, and with system-wide minimal cost [27]. As a pivotal perspective to the definition based on actors and objectives, Christopher’s definition [28] introduces the various activities and managed entities of LSCM. These include management of procurement and movement and storage of products in the different stages of their life cycles as materials, parts, and finished inventory. While the multi-objective trade-off under the influence of various actors, tasks, and entities already indicates the need for advanced methods to support decision-making, further aspects increase this need. Dynamic vertical, horizontal, and diagonal interactions among actors create hard to control complexity [29], complexity increases the occurrence of disruptive events and, thus, operational and financial risks [30], and due to the focus on the customer, common methods for managing disruptive risks from other domains are potentially a poor fit to LSCM [31]. Further, besides disruptive events, volatility results in further mismatch of supply and demand [32], adding to the need for advanced methods to support decision-making.
SCA caters to the various decision-support needs of LSCM as it exploits the variety of analytics methods for the multitude of issues [5]. Thus, it is concerned with applying quantitative and qualitative analytical methods to recognize and solve problems in an evidence-based manner in the context of LSCM [5,33,34]. In this regard, SCA helps to measure and monitor performance, understand and control poor performance, and improve performance [6,7,8]. However, with the variety of tasks requiring support comes a variety of applications that are hyper-specialized and also pose a challenge for organizations in LSCM [10]. Further, adoption of SCA in organizations in LSCM is slow, unwillingness to share data with supply chain partners is widespread, and organizations experience barriers [12,13,35].

2.2. Barriers of Supply Chain Analytics

In this section, barriers to adopting and employing analytics are reviewed. Several studies have discussed individual barriers in the domain of LSCM. However, these barriers are widely spread across studies with no study systemizing them or exploring barriers explicitly to create a comprehensive catalog. The presented literature overview is intended as a basis for the data collection. This literature overview is limited to the field of LSCM.
Following the cycle of an adoption process, an early barrier to adopting analytics is the approachability of analytics. Efforts to adopt and employ analytics may be slowed down by a lack of consensus regarding the terms surrounding the concept of analytics and the subsequent confusion [13,14] as well as the perception of complexity and the difficulty of its management [11,12]. This is accompanied by a constant change of technologies and methods, which organizations would need to be able to keep up with [6,10]. Further, managers see the lack of LSCM specific solutions as a barrier [12]. Sanders [10] describes this inability to adopt analytics due to overwhelming complexity as “analysis paralysis.”
When approaching analytics despite the perceived complexity, organizations in LSCM encounter a lack of experience of employees in analytics and utilizing data [10,12]. As opposed to the paralyzing complexity described above, the missing literacy concerning data and analytics is expressed in missing knowledge and creativity. The lack of knowledge about how to approach and utilize data has been repeatedly stressed in the literature [10,12,13,14,35,36]. This includes the identification of data most suitable for analytics, understanding which data is useful and which is useless, experience with relevant technologies, ideas about what to do with the available data, knowledge on how to transform data into information for decision-making, and how to drive the supply chain with data. Oliveira et al. [36] emphasize organizations’ lack of roadmaps on using data and information even after systems for data collection have been set up. A further hindrance in this context is the inability to clearly point out how value in terms of the business objectives is generated from analytics, and thus to sell analytics initiatives to key stakeholders [13].
To adopt and employ analytics, resources are needed. Labor resources comprise the first form of resource identified in the literature as creating challenges for organizations in LSCM. One study reports time constraints as a primary barrier [12], indicating the lack of time to get familiar with analytics and relevant technologies as well as to execute initiatives in addition to daily business. Zhu et al. [37] emphasize the time-consuming effort of getting analytics solutions into production (meaning implemented and used in the value creating process), which might consume time from a variety of employees and hinder parallel initiatives. In addition, some labor requirements for analytics present a challenge for LSCM organizations in themselves. The literature emphasizes a lack of employees in LSCM organizations to handle and understand data, analytics software, and IT systems, as well as being able to interpret the analytics results [38]. However, Kache and Seuring [13] report that LSCM managers lack an understanding of what skills the required employees need to possess, leading to problems in recruiting these employees.
The second form of resources needed is monetary investment. The cost of solutions available on the market, the cost of getting the necessary raw data, and systems integration (of the focal organization and its partners) are named as barriers to adopting analytics [10,12,14,39]. In addition, the returns from the investments and the benefits are hard to predict and unclear in advance. Thus, it is hard to estimate the breakeven point [10,14,36]. Furthermore, scholars highlight the problem of attribution of any performance increase to the investment in analytics. They present issues of quantifying benefits, attributing benefits to analytics or process changes, and time lags between implementation and identifiable performance increase (ramp-up) [36,38,40]. In this regard, scholars have referenced the “IT productivity paradox,” as discussed by Brynjolfsson [41], which presents the paradoxical situation of researchers not being able to measure a productivity increase from IT while organizations were increasingly investing in it (e.g., because performance increase is not measurable with their usual performance measurement or because of time lags).
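To make the break-even difficulty concrete, the following minimal sketch (with purely hypothetical cost and benefit figures, not drawn from the cited studies) shows how an adoption ramp-up pushes the break-even point of an analytics investment outward, which is one reason the point is hard to estimate in advance.

```python
# Hypothetical illustration of an analytics investment's break-even point.
# All figures are assumptions for demonstration, not data from the cited studies.

def breakeven_month(upfront_cost, monthly_cost, full_monthly_benefit, ramp_up_months):
    """Return the first month in which the cumulative net benefit turns positive."""
    cumulative = -upfront_cost
    for month in range(1, 121):  # look at most ten years ahead
        # Benefits grow linearly until the solution is fully adopted (ramp-up).
        ramp_factor = min(month / ramp_up_months, 1.0)
        cumulative += full_monthly_benefit * ramp_factor - monthly_cost
        if cumulative >= 0:
            return month
    return None  # no break-even within the horizon

# The same investment breaks even almost a year later if adoption ramps up slowly.
print(breakeven_month(200_000, 10_000, 30_000, ramp_up_months=3))   # 12
print(breakeven_month(200_000, 10_000, 30_000, ramp_up_months=18))  # 23
```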
To employ analytics, several data sources have to be (continuously) combined to solve complex business questions. However, the issue of integrating systems in a diverse and incompatible IT landscape impedes the integration of data sources. LSCM managers have reported to scholars a lack of integration of existing systems due to evolved (rather than designed) environments with specialized systems for the various business units or due to legacy systems not intended for data exchange [12,13,14,38]. Scholars have emphasized that the issue is particularly present when supply chain partners with unaligned systems (fragmented systems with varying maturity) are expected to combine data sources [13]. Mistakes in this integration of systems can lead to inaccurate results and wrong decisions [36].
The supply of data is another major barrier to the adoption and employment of analytics in LSCM. First of all, a lack of data or a lack of timeliness in providing the data has been reported as hindering [10,12,42]. Hazen et al. [42] have particularly stressed the issue of insufficient data quality in general, which other scholars have confirmed, expanding the issue to having different quality levels controlled by different employees [7,14,39]. Hazen et al. [42] further stressed lack of means for measuring data quality, controlling data quality, and the persistence of bad quality data as input to analytics until they are actively removed (since they are not consumed as opposed to other inputs). Further scholars added inexperience in assuring data quality to the list [43].
Having access to data comes with the issue of responsibility for that data, which can limit its usability for analytics. Handling customer data alongside the organization’s own data brings the responsibility for organizations in LSCM to ensure data privacy and data security [13]. Moreover, access does not equal ownership. Data ownership and the associated rights to data can be missing or unclear [14,38].
To gain access to relevant data from supply chain partners or the right to use the data from partners for analytics, their collaboration on data and analytics is needed. However, several scholars present the issue of unwillingness of supply chain partners to participate in data sharing or its dependency on the provision of incentives and security [13,43]. Partners may want to, and should, keep certain data confidential (e.g., for legal reasons), but they also keep their data to themselves because of lack of trust and fear of losing control over their data [13,14].
If all the barriers above are somehow overcome, the use and subsequent benefits of analytics solutions in LSCM organizations are not assured. Scholars report issues with the mindset of employees regarding analytics solutions. The employees may not want to use new, analytics-based systems, do not show openness toward them, and are not eager to change their culture [9,12,38]. Richey et al. [14] specifically discuss how employees in LSCM value relationships and trust-based business generation, which they do not want to sacrifice for data-driven solutions.
Another barrier to the adoption and employment of analytics in LSCM can be the physical process assumed to be supported by the analytics solution. Scholars discuss the reduced impact of analytics if the process has a low level of uncertainty (or volatility) [37,39]. While this takes only a certain range of application areas into account and ignores benefits for complex information inputs or benefits for faster decision-making, it presents the potential flaw of employing analytics to processes that may offer little value in terms of return on the investment. Reversing the perspective, Oliveira et al. [36] emphasize the need for a certain level of process maturity before employing analytics and gaining value from it, and Srinivasan and Swink [39] present the need for flexibility in the physical process such that reactions to analytical results can be applied. If there is no flexibility, the opportunities to gain value from analytics are foregone since they cannot be executed. As a result, the value and benefits from analytics depend on the status of the physical process, whose weaknesses can become barriers to utilizing the full potential of analytics.
Assuming that all the above barriers are overcome, the absence of governance of analytics can nevertheless hinder organizations from gaining the expected value from analytics. A lack of top management commitment and understanding of advantages from analytics will of course hinder the adoption of analytics in the first place [11]. When employing analytics, the lack of governance can result in a lack of focus and missing relevance of the results produced (“measurement minutiae”), reducing the effect of analytics [10]. However, scholars have stressed that the result of the absence of strategically and managerially controlled analytics is isolated and fragmented analytics efforts, resulting in only small benefits (even in disadvantages for other functions) and unaligned analytics maturity across business functions [10,38]. To conclude, analytics efforts need to be guided by an organizational understanding of the purpose of analytics and incorporated into the business strategy. Otherwise, gaining the hoped-for value and benefits can be impeded [13].

2.3. Measures to Fully Utilize the Benefits of Analytics

As opposed to barriers, measures are rarely addressed in the LSCM literature. Even in the field of analytics, countering barriers is usually a side note. In particular, no study could be found systemizing measures, deriving core categories, or identifying effects and relationships. In this section, practices and measures are explored considering literature on analytics in general and the few studies from the LSCM domain.
Schoenherr and Speier-Pero [12] investigated several contemporary aspects related to the adoption level of analytics in LSCM organizations and created a list of circumstances that lead to a higher motivation for organizations to adopt analytics. Their list does not intend to show how organizations created the circumstances intentionally or unintentionally but explores the status quo in these organizations. Thus, the list does not represent executable practices. The motivators include the presence of aspects whose absence was noted, in the discussion above, as forming a barrier, such as senior leadership promoting analytics and displaying commitment. Further, the use of analytics is encouraged by competitors and colleagues, and to a lesser degree by customers as well. The strongest motivator for using analytics has been reported as the user’s conviction about the value of analytics, implying the need to experience the value from analytics, probably in support of the user’s own processes.
Scholars have presented several practices to make the use of analytics more appealing, which could lead to increased conviction concerning the value of analytics and motivate its continued use. This includes visually appealing software interfaces, which would also help to make the results easier to understand and comprehend; mobile availability via tablet computers and smartphones, which have been observed to increase the frequency of using analytics; and user-engaging approaches that strive to stimulate interaction, like gamification or self-service [44]. Further, showing the outcome of decisions on individual scorecards or assessments after (minor) decisions is supposed to build trust in analytics and show its value, while scholars report this application in a supportive way as opposed to a blaming or judging way [25,45]. Additionally, for stakeholders, who are needed to provide funding, analytics can be made more appealing by articulating relevant business cases [4].
In contrast, scholars have also reported more instructive practices. Recommended supportive change management efforts include promotional activities, training courses, and coaching concerning analytics and its goals and benefits [45,46,47]. Further practices include underlining the necessity to discontinue non-data-driven methods, the demand for data-based explanations in decision-making, and incentives for data-driven decision making [46,48]. One recommendation even suggests that managers allow themselves to be overruled by analytics to promote their trust in it [48]. However, the replacement of employees who are unwilling to change has also been suggested [46].
For advancing analytics in the organization, several practices have been presented. To create more relevant business cases requires managers to build a certain understanding of analytics—not for application, but for understanding execution and requirements [25]. Another recommendation for managers is to ask second order questions, which are, roughly described, questions not about how to improve the solution to a problem, but questions around finding another solution to the problem [49]. However, efforts are also suggested from the analysts’ side, either by getting “translators” to explain analytics in business terms or requiring analytics experts to build business understanding [4,25,50]. Overall, the LSCM-specific and general analytics literature suggests an incremental and evolving approach, potentially starting in high impact areas [10,50].
One relatively specific practice, as compared to the other practices presented above, is the collaborative approach between analytics experts and business experts, which helps to build analytics expertise on the business side, and vice versa, and supposedly results in better use cases and outcomes. This has been suggested by several scholars, named a “field and forum” approach, “shadowing,” or simply described as the analytics expert joining the business expert in their real-world value creation and decision-making processes, including the related data usage, to learn and discuss opportunities for improvements afterwards [44,49,51]. Another related practice is the use of “data labs”, in which experts from the different areas are co-located to work collaboratively free from the distractions of daily business [44,49]. These practices are complemented by agile development methods, in which the progress of shorter periods (“sprints”) is discussed, e.g., by assessing prototypes [44].
In contrast to these practices oriented toward individuals, some practices to align and integrate the analytics efforts of an organization are suggested. Scholars suggest setting up an enterprise-wide information agenda and strategic directions for data and analytics [46,52]. A cross-functional integration of several business units and collaboration in the organization (e.g., in a newly created center of excellence for analytics) and with key partners has been suggested specifically for LSCM, since analytics initiatives quickly influence partners [10,13,46]. Additionally, researchers have suggested creating a “single-source-of-truth,” which demands data exchange between all business units, as an organizational measure [45], although this also touches on technological issues.
Concerning technological aspects, one issue addressed in the literature is data. Scholars have recommended creating data standards and automating data onboarding, integration, and quality processes as much as possible to accelerate the creation of insight from data [44]. It has further been observed that analytically mature organizations employ centralized groups responsible for ensuring data quality and availability [4]. In the context of LSCM, the structuring of data in a manner described as “scrubbing” has been recommended as a first step in maturing in analytics [10]. The important point here is that errors occurring during data generation are eradicated, such that data is already clean, structured, and organized for insight creation.
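As a minimal sketch of what such automated onboarding and quality checks could look like, the following Python example computes a few routinely recommended indicators (completeness, uniqueness, validity, timeliness); the table and column names are illustrative assumptions, not prescriptions from the cited literature.

```python
import pandas as pd

def quality_report(df: pd.DataFrame) -> dict:
    """Basic, automatable quality checks on an incoming shipment table.
    Column names ('order_id', 'ship_date', 'quantity') are illustrative assumptions."""
    return {
        # Completeness: share of missing values per column.
        "missing_share": df.isna().mean().round(2).to_dict(),
        # Uniqueness: duplicated order identifiers indicate onboarding errors.
        "duplicate_order_ids": int(df["order_id"].duplicated().sum()),
        # Validity: quantities should be positive.
        "nonpositive_quantities": int((df["quantity"] <= 0).sum()),
        # Timeliness: age of the most recent record in days.
        "days_since_last_record": int(
            (pd.Timestamp.today() - pd.to_datetime(df["ship_date"]).max()).days
        ),
    }

# Small artificial table to demonstrate the report.
sample = pd.DataFrame({
    "order_id": [1, 2, 2, 4],
    "ship_date": ["2020-01-03", "2020-01-05", "2020-01-05", None],
    "quantity": [10, 0, 5, 7],
})
print(quality_report(sample))
```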
Another technological aspect comprises the IT systems. By analyzing the challenges and opportunities of analytics in LSCM, scholars stress the status of existing IT systems and their role in making the use of analytics challenging [13]. Thus, they recommend the prioritization of continuous IT investments. Concerning data integration issues in and between organizations (e.g., with supply chain partners), researchers have presented the example of digital platforms, which act as a hub-and-spoke system with interfaces to several partners and customers instead of creating a point-to-point connection [10]. Such a platform would level different formats and standards.
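The hub-and-spoke idea can be sketched as a hub that translates each partner's format into one canonical format through a dedicated adapter, so that adding a partner means adding one adapter rather than point-to-point connections to every other party; the partner names and record fields below are hypothetical.

```python
from typing import Callable, Dict, List

# Canonical record format used inside the hub (illustrative fields).
Canonical = dict  # e.g., {"sku": str, "qty": int, "eta": str}

# One adapter per partner translates that partner's format into the canonical one.
ADAPTERS: Dict[str, Callable[[dict], Canonical]] = {
    "supplier_a": lambda r: {"sku": r["item"], "qty": int(r["units"]), "eta": r["arrival"]},
    "carrier_b": lambda r: {"sku": r["sku_code"], "qty": int(r["quantity"]), "eta": r["eta_date"]},
}

def ingest(partner: str, records: List[dict]) -> List[Canonical]:
    """Normalize partner records at the hub; the spokes never connect to each other directly."""
    adapter = ADAPTERS[partner]
    return [adapter(r) for r in records]

print(ingest("supplier_a", [{"item": "A-100", "units": "25", "arrival": "2020-03-01"}]))
print(ingest("carrier_b", [{"sku_code": "A-100", "quantity": 25, "eta_date": "2020-03-02"}]))
```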

3. Methodology

For this study, a mixed-methods approach has been chosen to support the sub-tasks of the two formulated research objectives, identification and systematization. The approach is set out in the following sections. Data collection and data analysis are explained. This section concludes with reliability considerations.

3.1. Research Design

A mixed-methods approach has been chosen, combining Grounded Theory [18] and the Q-methodology [19,20,53] to exploit synergies between them. Explicitly, after open coding in which phenomena are identified, Grounded Theory directs researchers to use axial coding to relate the identified phenomena to each other and selective coding to extract core categories [18]. This search for core categories is executed via the Q-methodology, which performs a perception-based sorting that creates categorizations while additionally unveiling the patterns of perception forming the categorizations and providing insights on them [19,20]. This mixed-methods approach provides a more robust framework for the relationships compared to either individual approach. Each of the methods has been used in the context of LSCM to identify barriers and measures.
Researchers use Grounded Theory with the intention of creating deeper knowledge on a research phenomenon and have repeatedly done so to identify barriers and measures in LSCM. For example, studies have identified and categorized barriers and coping mechanisms on implementing sustainable practices in LSCM [54], the implementation of supply chain technologies in emerging markets [55], and the transition towards a supply chain orientation [56]. These studies investigated barriers to transition to a rationally more beneficial position, usually strongly influenced by human behavior, and the systematization of measures and their effects. They emphasize the capability of Grounded Theory to identify and aggregate knowledge as well as to build a consensus [18].
Specifically focusing on extracting consistent aggregation and groups of concepts, the Q-methodology has been used as the method of choice in LSCM research. The method has been used to form a mutually exclusive and exhaustive taxonomy of biased behavior in supplier selection [57], to conceptualize moderators and sources of supply chain volatility [32], and to aggregate attitudes and levels of acceptance towards different innovations and practices in low-input food supply chains across multiple cultures and multiple stages of the supply chain [58]. These articles exemplify the utility of the Q-Methodology for aggregating measures and extracting conceptualizations in LSCM research and thus support the systematization of identified knowledge.

3.2. Data Collection

The data collection method was based on established practices. To sample relevant experts as interviewees, boundaries have to be defined [59]. First, to gain a broad overview of barriers and measures in LSCM, the boundary of relevant organizations for data collection was set to include various supply chain actors, as introduced in Section 2.1: manufacturing (OEM and suppliers), retail, logistics service providers, and, in addition, analytics providers with a focus on LSCM. Second, experts had to have relevant job functions related to analytics. Third, experts had to have experience with analytics in the context of LSCM, and thus experience with SCA. Potential interviewees were identified based on the first two criteria and invited to the study. Potential interviewees who expressed interest in response to the invitation were sent an overview of the study [60]. Based on the overview, the potential interviewees were asked to evaluate their experience with SCA—the third boundary criterion—and, in case of negative evaluation, asked to help to contact the most knowledgeable informants on the subject matter. If their experience satisfied the criteria, the experts were included in the study. After 12 interviews with 13 interviewees, the data collection was concluded. At this point, the identification of new barriers and measures through additional interviews had attenuated, while the effort to recruit further interviewees had increased substantially. This was evaluated as fulfilling the condition of saturation [18] and is comparable to similar studies in LSCM [61,62,63]. Table 1 lists the interviewees.
The interviews were conducted in a semi-structured format using an interview protocol for reliability, which was initiated with a grand tour open-ended question [64,65]. Interviewees were asked about barriers regarding analytics as applied to LSCM that they were currently experiencing or had experienced in the past. Based on their responses, they were subsequently asked about two aspects: first, to provide detailed information on the barriers for increased understanding of the barriers’ effect, and second, how these barriers were being/had been addressed, and how the measures used affected the barriers. Subsequently, interviewees were asked about their experience with the barriers presented in Section 2.2 and, depending on the response, asked about measures taken to overcome these barriers. The second part of the interview on measures was again initiated with a grand tour open-ended question on best practices used to increase the success rate of analytics initiatives, and interviewees were asked to give justifications for their use. These justifications were interrogated to understand whether these practices were used to solve or overcome particular problems, and whether these practices qualify as measures. The interviews were concluded with questions on the use of the measures presented in Section 2.3 and, depending on the response, the justification for their use.
Since none of the interviewees was located near the researchers, most interviews were conducted by phone. Previous studies have used telephone interviews for interview research in LSCM as a measure to overcome the restrictions from different locations without any reported bias [61,62,63,66,67,68]. Subject to the approval of the interviewees, the interviews were audio-recorded and transcribed verbatim, providing the qualitative data for analysis. The interviews lasted between 45 and 105 min, with an average of 65 min. Interviewers took handwritten notes during the interviews for the purpose of recording and guiding the interview. The analysis was initiated after the first interviews to allow preliminary interpretations and insights, which were expanded in subsequent interviews. All data were documented in a structured database for further reliability. Audio records were deleted after transcription, as promised to the interviewees.

3.3. Data Analysis

In the first step, the data was rigorously analyzed according to Grounded Theory guidelines [18]. The analysis was initiated after the third interview to allow continuous contrasting of developing theory and data collected from the ongoing interviews [18,69]. The analysis steps, starting with open coding, were executed using ATLAS.ti (Version 8.4). During open coding, the interviews were coded to identify recurring themes. The coding was focused on identifying the underlying nature of challenging conditions during the application of SCA in the organization and extracting the actions taken to cope with those challenging conditions, their original intent, and their actual impact. Due to the intent of this research, the codes created were mainly abstractions and in vivo codes of the interviewees’ statements. The interviews were continuously read and reread to establish similarities and disparities. As a result, codes and the categories they represent were restructured.
After concluding the data collection and finalizing the open coding for the data, axial coding was conducted to lift the level of abstraction as well as further restructure and aggregate existing categories [18]. During this process, short descriptions of categories of barriers and measures were created as input for the Q-Methodology based selective coding process.
In this regard, the Q-methodology is used as a supplement to the selective coding steps and, thus, to the Grounded Theory approach. Scholars have emphasized the adaptability of the Q-methodology to the interest of researchers, such as using differences of sorting as input for discussions [70]. As a result, the Q-methodology was adapted for this study to identify different patterns of thought about sorting the aggregated codes that were the output result from the axial coding step. These sorting results were used as input for consensus-building on the core categories, which the selective coding intends to derive. In this regard, participants developed their individual sorting and contributed their underlying patterns of thought to the consensus building in an unbiased manner by performing the steps of the Q-methodology and commenting on their results without the other participants criticizing the sorting. The value of the participants’ comments on their sorting process has been highlighted by Ellingsen et al. [53].
In detail, the Q-methodology starts with two steps that set up the sorting process. First, a concourse is created, which is the collection of statements pertaining to a research area of interest derived from empirical and secondary data sources [19,20]. This step is equivalent to the open coding step that identified 154 codes (in this case, on barriers in applying SCA and measures to overcome them) from the semi-structured interviews. The subsequent step is the creation of a Q-sample, which represents a comprehensive, balanced, and representative subset from the concourse [19,20]. This step is intended to limit the number of statements by grouping similar statements and refining statements that are too specific or too general [20,53]. This step is equivalent to the axial coding step that related similar codes in the data, yielding 30 barriers and 59 measures.
The sorting procedure, named Q-sort, is performed on the Q-sample as a third step [53]. As participants, three researchers with high expertise in LSCM and different viewpoints on analytics were chosen. In accordance with Ellingsen et al. [53], the participants received a sorting instruction, were reminded that there is no “right” sorting, and were given the task of sorting the Q-sample freely. The sorting instruction was to sort the items based on the participant’s perception of similarity. The free mode of the Q-sort was chosen such that the participants had to create their own groups. This approach intended to promote the strength of the Q-methodology, as repeatedly emphasized, of uncovering different patterns of thought, perception, and opinions, which can be used to assess areas of consensus and friction [19,20,70]. Here, consensus on the core categories of barriers to SCA and measures to overcome these barriers was assessed.
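In this study, consensus was built through discussion rather than computation, but the degree of agreement between two free sorts can be illustrated with a simple pairwise co-assignment measure (a Rand-index-style score); the sketch below uses hypothetical groupings and is not part of the study's procedure.

```python
from itertools import combinations

def sort_agreement(sort_a: dict, sort_b: dict) -> float:
    """Share of item pairs on which two free sorts agree, i.e., both sorts place the
    pair in the same group or both place it in different groups (Rand index)."""
    items = sorted(set(sort_a) & set(sort_b))
    pairs = list(combinations(items, 2))
    agree = sum(
        (sort_a[i] == sort_a[j]) == (sort_b[i] == sort_b[j])
        for i, j in pairs
    )
    return agree / len(pairs)

# Hypothetical groupings of four barrier statements by two participants.
participant_1 = {"b1": "knowledge", "b2": "knowledge", "b3": "resources", "b4": "culture"}
participant_2 = {"b1": "knowledge", "b2": "resources", "b3": "resources", "b4": "culture"}
print(sort_agreement(participant_1, participant_2))  # ≈ 0.67: disagreeing pairs mark areas to discuss
```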

3.4. Reliability

The reliability of this study was judged based on criteria regularly used by scholars evaluating the reliability of similar studies [54,56,71,72]. The criteria of credibility, transferability, dependability, confirmability, and integrity in these studies emerged from the recommendations of Hirschmann [73] as well as Lincoln and Guba [74]. The criteria of fit, understanding, generality, and control emerged from Strauss and Corbin [18].
Credibility describes the extent to which the results are acceptable representations of the data. It was addressed by the timeframe of four months in which the interviews were conducted, the review of the research summary by the interviewees, and the Q-methodology that led to a review of the codes by three researchers [56,71,72].
The extent to which the findings can be applied from one context to another is covered in the criterion of transferability. This was addressed by the theoretical sampling, which led to a diverse set of organizations, business sizes, and roles in the supply chain [56,71,72].
The criterion of dependability refers to the dependence of the results on time and place, and as a result the extent of the results’ stability and consistency. To ensure dependability, interviewees were asked to reflect on the decisions, actions, and changes over time that were related to the phenomenon and, thus, their past and present experience [56,71,72].
Confirmability measures the extent to which interpretations are the result of interviewees’ experience as opposed to researcher bias. To create confirmability, the research protocol was used before and during the interviews to ensure that interviewees could prepare and answer without researcher bias [60,65] as well as by providing the results to the interviewees for review [56,71,72].
Integrity of this study, representing the extent of influence by misinformation and evasion by participants, was ensured by conducting non-threatening, professional, and confidential interviews [56,72].
The extent to which the findings were a fit to the area under investigation, inherent in the criteria of fit, was addressed by the credibility, dependability, and confirmability criteria [56,72].
The results’ reflection of the “own world” of the interviewees, which is covered in the criterion of understanding, was addressed by providing the results to the interviewees with a request for comment and review [18,54,72].
Generality describes the extent to which the findings discover multiple aspects of the phenomenon under investigation. It was addressed by applying the Q-methodology to integrate different patterns of thought into the study [20] and by conducting interviews of a sufficient length (45–105 min), with extensive openness by the interviewees, strongly focused on the area under investigation and covering multiple facets related to that area [18,54,56,72].
Finally, control describes the extent to which organizations can influence aspects of the emerging theory. By focusing on measures, the study strongly focuses on aspects over which organizations have control, which was discussed with the interviewees during the study to establish the existence of control [18,54,72].

4. Results and Discussion

From the analysis of the data, one framework regarding the impact of barriers on SCA initiatives and another framework regarding the measures to counter these barriers have been derived. Through the analysis, core categories have been extracted and abstracted, and these assemble the frameworks. This section presents and explains the derived frameworks.
A major inference made from the analysis of the collected evidence has been that barriers and measures affect different steps along the cycle of applying analytics to an LSCM organization. During the Q-Methodology, the researchers built consensus on distinguishing four phases:
(1)
“Orientation about Analytics” describes actions, circumstances, and events before specific analytics initiatives are planned. During the orientation, the necessary conditions for applying analytics are created and employees are motivated to invent analytics initiatives to be executed.
(2)
“Planning of analytics initiative” describes actions, circumstances, and events during the set-up of a specific analytics initiative, in which the addressed business problem/business case is specified, the approach designed, resources and budget committed, and relevant people are invited to participate.
(3)
“Execution of analytics initiative” describes actions, circumstances, and events during the development and creation of analytics solutions in specific analytics initiatives. For example, this includes interactions with data, applications of analytical methods, the use of technology, and interaction of analysts with business experts.
(4)
“Use of analytics solution” refers to actions, circumstances, and events after the solution development, which include the deployment of the analytics solution to users and their interaction with the solution in the short and long term.
Allocating the effects to project models such as CRISP-DM was rejected as unsuitable. Several barriers and measures occur outside the usual scope of such models. Further, barriers and measures show similar effects in most solution development phases, which are differentiated in too much detail in these project models.
In addition, as extracted from the evidence, a distinction in the impact of barriers and measures is made into capabilities and culture. Capabilities refers to the capabilities of the organizations, the organizations’ processual and technological infrastructure and standards, as well as the knowledge and skills of organizational members. On the other hand, culture in the context of this study refers to the attitudes and behaviors of single individuals or groups of individuals in the organization, such as their motivation, solution-orientation, feelings, openness, and willingness, as well as their critical reflection on their attitude and behavior. In this sense, culture does not have to reflect the general culture of the organization, but only groups of individuals of the organization.
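Purely as an illustrative encoding of this framework (not a substitute for Figure 1), the sketch below represents the four life-cycle phases and the capability/culture distinction as Python types; the two example allocations anticipate barrier categories described in Section 4.1 and are simplifications.

```python
from dataclasses import dataclass
from enum import Enum
from typing import FrozenSet

class Phase(Enum):
    ORIENTATION = "Orientation about analytics"
    PLANNING = "Planning of analytics initiative"
    EXECUTION = "Execution of analytics initiative"
    USE = "Use of analytics solution"

class Dimension(Enum):
    CAPABILITY = "capability"  # infrastructure, standards, knowledge, skills
    CULTURE = "culture"        # attitudes and behaviors of individuals or groups

@dataclass(frozen=True)
class BarrierCategory:
    name: str
    dimension: Dimension
    phases: FrozenSet[Phase]  # phases in which the barrier takes effect

# Simplified example allocations (see Section 4.1 for the full descriptions).
missing_knowledge = BarrierCategory(
    "Missing knowledge", Dimension.CAPABILITY,
    frozenset({Phase.ORIENTATION, Phase.PLANNING, Phase.EXECUTION, Phase.USE}),
)
emotion = BarrierCategory(
    "Emotion", Dimension.CULTURE,
    frozenset({Phase.ORIENTATION, Phase.USE}),
)
print(emotion.dimension.value, sorted(p.name for p in emotion.phases))
```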

4.1. Barriers

This article has identified and systematized barriers that occur when companies initiate, perform, or deploy SCA initiatives within the organization. These barriers can be allocated with high confidence to either capability issues or cultural issues. Ambiguous cases, which the researchers had to discuss extensively, were found only where actors omitted critical reflection on their decisions, creating problems in subsequent process steps. Eventually, a consensus was reached that such behavior corresponds to the attitudes of the actors, and it has, thus, been allocated to the cultural side. Below, the identified categories of barriers and their impacts are described, with examples. The framework of barriers to applying analytics in LSCM is presented in Figure 1.
From an aggregated point of view, the evidence shows that cultural barriers occur more frequently during the phases of orientation and use, whereas capability barriers are more frequent during the phases of planning and execution of analytics initiatives. This result is not surprising, since the phases of orientation and use demand the engagement and motivation of, as well as interactions among, individuals who have their primary focus on tasks and activities unrelated to analytics. Unfamiliarity with and distance from the topic create an attitude and behavior towards analytics different from that of analysts. Interviewees repeatedly emphasized that this unfamiliarity and distance must be overcome by clearly showing these individuals the benefits of analytics in their specific areas of work. Capability barriers in these phases either cannot be determined specifically, since the specific problem for an analytics initiative is not yet determined in the orientation phase, or have already been addressed in the solution development prior to use. Conversely, planning and execution require individuals familiar with and close to analytics, resulting in reduced barriers of attitude or behavior in terms of unwillingness. However, discrepancies between needed and available skills, technology, and data are impactful since they hinder the tasks and activities necessary for solution development.

4.1.1. Capability Barriers due to Unfitting Conditions

Barriers due to unfitting conditions refer to circumstances in the organization’s processes, infrastructure, environment, or reputation that strongly impede or prevent the execution or completion of the development of an analytics solution. The results show that such barriers have been observed in all phases except for the use phase. An early obstacle for organizations is the absence of the talent needed for analytics initiatives. Such talent does not perceive LSCM as an attractive or challenging domain, as opposed to technology organizations at the methodological forefront. In the planning of initiatives, efforts can be halted due to various missing tools, technologies, or data needed for the initiative. These foundational conditions can be missing since organizations—especially smaller organizations—lack the budget for continuous investment. Further, such conditions can become apparent during the execution of the initiative, when activities have already begun. An example is the realization that the target process cannot be supported by analytics as intended, either because it is not standardized and stable enough to create a beneficial solution or because it has process limits and restrictions that prohibit creating improved solutions compared to those that exist. Such conditions usually demand fundamental changes in the organization outside the realm of analytics.

4.1.2. Capability Barriers due to Missing Responsibility

Barriers due to missing responsibility occur along all phases and usually result in extra effort that would be avoidable with clearly assigned responsibilities. Missing responsibility on ownership of data sources and the necessary actions and requirements connected to these responsibilities leads to situations in which knowledge about existing data sources and their location is unevenly distributed and the data and their flow are not well documented. Thus, individuals in the organization lack the understanding of data and their importance—an important component for the invention of potential initiatives. In specific initiatives, this complicates execution due to unnecessary effort to access data owners and their data, effortful coordination between several owners, and missing traceability of data usage through the organization. Imprecise responsibilities on data may result in changes to data by employees unaware of their relevance to analytics solutions in use, compromising the solution without any communication of changes. In addition, missing oversight of analytics leads to the development of heterogeneous tools, methods, and solution approaches, likely to produce redundant solutions for similar problems.

4.1.3. Capability Barriers due to Missing Knowledge

Missing knowledge on analytics is a substantial barrier to using analytics and can affect the application of analytics in LSCM organizations in a variety of ways. Without individuals in business processes who understand analytics and have experienced how analytics can create value, the search for potential initiatives in an orientation phase is less likely to result in meaningful initiatives for these business processes. Since the value is usually generated indirectly from actions taken due to the analytics solution, this barrier is inherent to analytics. During the planning phase, this indirect value generation and also the inherent uncertainty of the generated benefits, which are highly dependent on the data and context, can lead to underestimation of the resulting value and reluctance towards investments. Conversely, missing knowledge can result in high and unworkable expectations, impractical or absurd demands on a solution’s abilities, and ignorance of the limitations of analytics. Combined with a missing ability to translate the business need into an analytical need, these barriers will result in initiatives that have no way of achieving the unrealistic or mis-communicated objectives. In the execution, missing knowledge on the functionality of analytics can result in the inability to let the business experts participate in the solution process, or missing understanding of performance evaluation in analytics can result in the inability to gain acceptance for progress. Lastly, missing knowledge can result in the users’ inability to use the developed analytics solutions in their business processes, denying the actual gain from the solution’s value.

4.1.4. Capability Barriers due to Unfitting Resources

Unfitting resources (data and systems) are barriers that become present during the planning or execution of analytics initiatives, but usually affect the execution. Unfitting resources can prolong the development of the solutions, yield subpar success, or even halt the solution. These issues are usually technical: the relevant data is missing, especially for machine learning or time series methods; the data is incorrect, requiring intensive cleaning and feedback on correct values from the process experts; or the accessible data does not fully reflect the business rules, requiring additional preparation. While such issues are foremost time-intensive but solvable once identified, the identification of incorrectness or lack of fit to business rules might occur at a late stage, such that the effort is doubled or the solution developed from the data does not gain acknowledgement from the users. Further, technical issues related to the heterogeneity of systems and the resulting incompatibility of the data with the systems prevent data access or the integration of the data for more comprehensive analysis; these can either be overcome with additional time and effort, or development of the solution may ultimately be abandoned.
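One lightweight way to surface such mismatches before late development stages is to encode the relevant business rules as explicit checks run against the accessible data; the rules, table, and column names below are hypothetical.

```python
import pandas as pd

# Hypothetical business rules for an order table; violations indicate data that does
# not reflect the business rules and would need preparation or correction.
RULES = {
    "shipped_not_more_than_ordered": lambda df: df["shipped_qty"] <= df["ordered_qty"],
    "delivery_not_before_order": lambda df: pd.to_datetime(df["delivery_date"]) >= pd.to_datetime(df["order_date"]),
}

def rule_violations(df: pd.DataFrame) -> dict:
    """Count rows violating each rule; rows where a comparison cannot hold
    (e.g., due to missing values) are counted as violations as well."""
    return {name: int((~check(df)).sum()) for name, check in RULES.items()}

orders = pd.DataFrame({
    "ordered_qty": [10, 5, 8],
    "shipped_qty": [10, 7, 8],
    "order_date": ["2020-02-01", "2020-02-03", "2020-02-04"],
    "delivery_date": ["2020-02-05", "2020-02-02", "2020-02-08"],
})
print(rule_violations(orders))  # {'shipped_not_more_than_ordered': 1, 'delivery_not_before_order': 1}
```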

4.1.5. Culture Barriers due to Unwillingness

As mentioned above, the reference to culture considered in this article does not have to reflect the culture of the general organization, but can apply to small groups of individuals. In this case, small groups that behave unwillingly in their areas of responsibility can seriously impact the success of analytics in an organization, including specific initiatives or the ability to perform analytics at all. Considering the barriers of unwillingness, they become relevant when individuals motivated for analytics become dependent on the input and cooperation of others. During orientation, the dependency on IT departments to create the foundation for analytics in data and analytical software can block the efforts of motivated individuals, when the IT department is unwilling. Interdependent with this, employees unwilling to commit to analytics will suppress management’s motivation for analytics, while missing commitment from management can suppress employees’ motivation (and necessary resources such as budget) for it. During the planning phase, when ideas are expressed, the motivated individuals bring together their ideas and business problems but are less dependent on third parties, assuming this phase is entered after management expressed support. In the subsequent phases, in particular the unwillingness of potential users to provide their data or to use the solution can have a strong negative effect on the benefits realized with deployment of the analytics initiative. This unwillingness can be traced to an unwillingness to change any business customs that are already established, because of the effort to change (or rather the ease to remain with the familiar) and the implication of changes, such as different responsibilities (including reduced control), modified behavior, and altered processes. This is often accompanied by an expressed disbelief in the potential and value of anything that is not the habit.

4.1.6. Culture Barriers due to Emotion

Emotions, especially of fear and perceived unfair treatment, are strong barriers to employees accepting the organization’s pursuit of analytics or use of solutions. In the data, they have been primarily observed as a reaction to information asymmetries of individuals not actively working on analytics or not involved in solution processes, who fill the asymmetries with assumptions. Involving employees such as potential users or supply chain partners in the planning and collaborating with them in the execution can create some level of transparency, such that emotion is a lesser barrier in these phases. However, in the phase of orientation, during which employees and supply chain partners might be left without well-communicated intentions regarding the usefulness of analytics or might not fully believe the communicated intentions due to events in the past (e.g., something that builds general mistrust unrelated to the individuals eager to use analytics), they might react emotionally to the information asymmetries. In detail, employees fear losing their jobs and supply chain partners fear repercussions from sharing data, building resistance against analytics initiatives. Similarly, developing solutions such that the solution process is not transparent to the potential user can lead to mistrust and skepticism towards the solution by the users. This emotion of mistrust is a reaction to the information asymmetries between requirements and the adherence to the requirements by the developed solution. This is particularly nurtured by analysts who over-sell their solutions.

4.1.7. Culture Barriers due to Missing Critical Thinking

Missing critical thinking in the context of this paper refers to decisions made without the necessary reflection on these decisions. While this could be a result of missing knowledge and, thus, be a capability barrier, it can also result from not taking enough time and effort for consideration, or from enthusiasm about a critical problem finally being solved, which is not to be impaired by critical considerations. This can occur as bandwagon behavior from management directing the budget towards some hyped technology or some hyped use cases unfitting to the business need, while the budget would be needed for foundational technologies or use cases that are boring but beneficial. The missing business impact of hyped but failed initiatives can generate mistrust of employees in new technologies, as discussed above. On a user level, the continued use of business-owned tools/shadow IT, which usually lack adherence to standards and documentation, or may even contain errors, is often not critically reflected on regarding the impact of the creator leaving the organization, compatibility with other systems, or fit with business strategy and rules. Thus, the development of a standardized and compliant substitute in an analytics initiative is usually not triggered, despite being highly necessary. During solution development, one recurring issue is the creation of workarounds—technical debt—by analysts without reflection on the impact on later development stages. The issues usually surface later and create additional effort or can even prevent the success of the analytics solution and its benefits.

4.2. Measures

This article has identified measures that organizations deploy to cope with the barriers above. Overall, an allocation of measures directly assigned to specific barriers could not be derived from the data, since interviewees expressed that some measures address several barriers, and vice versa, whether by intention or not. Further, interviewees explained that measures are adjusted and advanced over time as a reaction to the specific needs and capabilities of the organization and its employees. Thus, measures on barriers are path and context dependent. However, the measures have been categorized and systematized into core categories and allocated to barrier categories. While several measures are presented in the following section, it must be emphasized that the presentation is reduced to these core categories as derived from the Grounded Theory process. Examples are presented to improve comprehensibility and to further illustrate the core categories. It cannot be generalized that certain specific measures will help all organizations to overcome certain specific barriers, especially in their generic form.
Reduced to the core categories, measures generally address capability barriers but contribute to overcoming cultural barriers at the same time. Acceptance is gained by building knowledge and creating processes for more transparent development and information exchange, whereby information asymmetries are reduced. Thus, the cause (missing capabilities) is addressed, but not the symptoms (unwillingness). Comparable to the barriers, the identified measures affect the different stages of the analytics initiative lifecycle differently. While demystification is essential to aid the orientation phase, the creation of specific capabilities affects the project development phases, and communication and involvement become vital for the actual analytical work in the execution and use phases. The links are illustrated in Figure 2, with examples of measures in the respective core categories.

4.2.1. Measures Contributing to Demystification

The measures contributing to demystification are intended to communicate the value and benefits of analytics, fitted to the intended receivers and their context. The most widely used measures to reach the mass of employees are formal and informal communications presenting achieved value and benefits, such as workshops, presentations, internal conferences, or communication material. Interviewees stated that showing achieved value, especially accompanied by individuals who benefitted from the executed initiatives, was effective in convincing others and provided the opportunity to address questions and lack of clarity in live formats. However, critics may not attend and, thus, may not be convinced. Successful initiatives therefore go on tour as roadshows, disseminating the evident benefits of analytics across the organization. Similarly, pro-active training on analytics creates an understanding of analytics and allows questions to be answered. Even when applied to employees less likely to use the methods, this resource-intensive approach creates acceptance and realistic expectations. Another form of demystification is the creation of use cases together with process experts for their respective domains (e.g., the different units involved in the LSCM activities). The created use cases represent a more tangible and, thus, comprehensible form of value from analytics. Similarly, this approach is more likely to convince supply chain partners to cooperate. Use cases can be identified through guided ideation workshops, by evaluating existing data sources, as part of pro-active training, or by offering to formalize business-owned tools/shadow IT. However, time for conferences and training events meant to demystify analytics for top management is rarely available. Investments are instead motivated by the value of tangible and clear use cases or by a consistently voiced need for investment harmonized across several business units.
While the value created by analytics initiatives is mostly uncertain, different approaches for value estimation exist. Estimations based on comparable solutions (e.g., other organizations or business units, previous initiatives) or on the inefficiencies that the analytics initiative is expected to eliminate are preferable, but such tangible reference points are frequently missing. Thus, the creation of proofs-of-concept or prototypes is needed, which demands initial investment to allow a value estimation. If these initial investments are too high (e.g., cost-intensive data collection), a rare approach is the use of hypothetical prototypes, for which the data are simulated. However, this kind of prototype only tests ideas and requires serious changes to develop into a production-ready solution.
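To make the idea of a hypothetical prototype more concrete, the following minimal Python sketch simulates shipment lead times instead of collecting them and compares a baseline process with a hypothetically improved, analytics-supported one to arrive at a rough value estimate. All figures, parameters, and the simple penalty logic are illustrative assumptions and not results reported by the interviewees.

```python
import random

random.seed(42)  # reproducible illustration

# Hypothetical parameters -- not empirical values from the study.
N_SHIPMENTS = 1_000
LATE_PENALTY_PER_DAY = 50.0   # assumed penalty cost per day of delay
PROMISED_LEAD_TIME = 5        # promised lead time in days

def simulate_lead_times(mean_days, sd_days, n=N_SHIPMENTS):
    """Simulate lead times (in days) for n shipments."""
    return [max(1.0, random.gauss(mean_days, sd_days)) for _ in range(n)]

def delay_cost(lead_times):
    """Penalty cost for shipments arriving after the promised lead time."""
    return sum(LATE_PENALTY_PER_DAY * max(0.0, t - PROMISED_LEAD_TIME)
               for t in lead_times)

# Baseline process vs. a hypothetically improved, analytics-supported process.
baseline = simulate_lead_times(mean_days=5.5, sd_days=2.0)
improved = simulate_lead_times(mean_days=5.0, sd_days=1.2)

estimated_saving = delay_cost(baseline) - delay_cost(improved)
print(f"Estimated saving (simulated): {estimated_saving:,.0f} EUR")
```

Such a sketch only tests whether the idea is worth pursuing; as stated above, a production-ready solution would require real data and substantial rework.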

4.2.2. Measures Contributing to Obtaining Capabilities—Human

Enabling humans by building their analytics capabilities involves several forms of training, depending on whether they are expected to contribute to analytics initiatives as domain experts or to apply analytical methods themselves. To enable contribution, either training events for employees are carried out or data savviness becomes a recruitment criterion for business roles and top management. The approach of enabling contribution accepts that not every employee needs to execute analytical methods; however, this portion of employees is expected to decline, even though not every employee can be upskilled. To improve application capabilities, self-service analytics is introduced and citizen data scientists are developed. In self-service analytics, employees are provided with role-specific tools, which allow for data-driven decision making and dedicated analyses for their roles; analysts should regularly review and provide feedback on this use. The citizen data scientist concept acknowledges that certain employees already act, in practice, as analysts for their respective units. These are identified and provided with additional training and the respective title to act as analysts in the business while retaining strong domain knowledge, thereby providing the first point of contact for analytical inquiries. Both concepts improve access to analytics, allowing employees—the potential users of analytics solutions—to interact with analytics more easily.
To gain access to analytical talent, organizations may alter their image to appeal to potential candidates outside the company. This includes portraying analytically complex and innovative initiatives and actively engaging with talent at universities and on platforms relevant to that talent (conventions, internet forums). As talent is understood to be a limited resource necessary for superior competitive performance, organizations must take active steps to attract it.

4.2.3. Measures Contributing to Obtaining Capabilities—Data

Given that data was repeatedly described as an unavoidable issue in any analytics initiative in the discussions on barriers, the multitude of measures described was not surprising. However, these measures are often peripheral to other core categories. Considering the measures solely dedicated to data, centralizing data and establishing open data policies are supposed to create access to the data needed for individual analyses and analytics initiatives. Further, designing data collection processes such that the eventual collection of sensitive personal data is factored in beforehand reduces side effects on the accessibility of related non-sensitive data. Emphasized for data quality and availability are automation (e.g., automated data collection, validation rules for manual data collection, or the automated interpretation of free text with machine learning) as well as the increasing datafication of processes and products in consultation with analysts. In conclusion, these measures emphasize designing data collection and storage with the future use of the data for analytics in mind.
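As an illustration of what validation rules for manual data collection could look like, the following minimal Python sketch checks a manually entered shipment record against a few plausibility rules before it is stored. The field names, value ranges, and rules are hypothetical and would have to be defined together with the respective business unit.

```python
from datetime import date

# Hypothetical plausibility rules for a manually entered shipment record.
VALID_INCOTERMS = {"EXW", "FCA", "FOB", "CIF", "DAP", "DDP"}
MAX_WEIGHT_KG = 30_000  # assumed upper bound for a single shipment

def validate_shipment_record(record: dict) -> list:
    """Return a list of validation errors; an empty list means the record passes."""
    errors = []
    if not record.get("shipment_id"):
        errors.append("shipment_id is missing")
    if record.get("incoterm") not in VALID_INCOTERMS:
        errors.append(f"unknown incoterm: {record.get('incoterm')!r}")
    weight = record.get("weight_kg", 0)
    if not (0 < weight <= MAX_WEIGHT_KG):
        errors.append(f"implausible weight_kg: {weight}")
    if record.get("ship_date") and record["ship_date"] > date.today():
        errors.append("ship_date lies in the future")
    return errors

# Usage example with an intentionally faulty record.
errors = validate_shipment_record(
    {"shipment_id": "S-1001", "incoterm": "XXX", "weight_kg": -5,
     "ship_date": date(2019, 11, 2)}
)
print(errors)  # -> ["unknown incoterm: 'XXX'", 'implausible weight_kg: -5']
```

Rejecting or flagging implausible records at the point of entry is one concrete way of creating the awareness of data quality at the point of data creation discussed below.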
Measures peripheral to the organizational core category include organizational measures to improve data quality and availability. The core of these collected measures is to create awareness of data quality at the point of data creation, which is the business unit. This includes allocating data ownership and responsibility to the business units and providing feedback and training on data quality, but also, in return, establishing data quality business units as points of contact for issues and guidance. Measures peripheral to technology will be discussed below.

4.2.4. Measures Contributing to Obtaining Capabilities—Technology

The capability-building measures for technology address the creation of an IT ecosystem that supports analytics. This means, on the one hand, creating accessibility and consistency of data and, on the other, enriching the options for analytics solutions and enhancing their deployment. The essence of the technology capability measures taken by organizations is to develop the IT ecosystem towards a single-source-of-truth. However, this vision might not be achieved—or even aspired to—since it entails a level of complexity that is hard to handle and might not be cost-efficient. In particular, the idea of leapfrogging to this vision was explained by interviewees to be unreasonable. Instead, organizations move in the direction of this vision in different forms depending on their individual needs: by consolidating IT systems into smaller numbers and integrating obsolete systems into newer ones; by setting up IT platforms as integration layers, which allow interchange between systems; or by replacing systems with more modular systems to develop a plug-and-play style IT ecosystem. The platform solution is the preferred way to overcome technical integration issues with supply chain partners. In conclusion, the measures taken to create IT ecosystems for analytics need a few years of foresight to keep complexity controllable and investments reasonable.
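To illustrate the integration-layer idea, the following minimal Python sketch assumes two hypothetical source systems with different field names and maps both into one common record format that analytics solutions could consume. The systems, fields, and mappings are assumptions for illustration, not systems described by the interviewees.

```python
from dataclasses import dataclass

@dataclass
class OrderRecord:
    """Common record format exposed by the integration layer (illustrative)."""
    order_id: str
    customer: str
    quantity: int

# Adapters translate each (hypothetical) source system into the common format.
def from_legacy_erp(row: dict) -> OrderRecord:
    return OrderRecord(order_id=row["ORDNR"], customer=row["KUNDE"],
                       quantity=int(row["MENGE"]))

def from_new_tms(payload: dict) -> OrderRecord:
    return OrderRecord(order_id=payload["orderId"], customer=payload["customerName"],
                       quantity=payload["qty"])

# The platform simply routes each source through its adapter.
ADAPTERS = {"legacy_erp": from_legacy_erp, "new_tms": from_new_tms}

def integrate(source: str, raw: dict) -> OrderRecord:
    return ADAPTERS[source](raw)

print(integrate("legacy_erp", {"ORDNR": "4711", "KUNDE": "ACME", "MENGE": "12"}))
print(integrate("new_tms", {"orderId": "A-88", "customerName": "ACME", "qty": 3}))
```

The design point is that connecting an additional system only requires an additional adapter, which reflects the plug-and-play character of the platform approach described above.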

4.2.5. Measures Contributing to Obtaining Capabilities—Organization

To clarify, organizational capabilities address changes to organizational structures and processes, while procedural capabilities address changes to the analytical solution development process. The organizational capabilities have several focus areas, including structures that reduce the organizational distance between analysts and domains, as well as creating a better understanding of each other’s tasks and issues. This includes locating analysts in a hybrid format, with centralized analysts in a center of excellence and decentralized analysts in the business units, to exploit the advantages of both forms. Depending on the existing structures, a recurring measure is to enhance process improvement units with analytics in order to make use of the existing closeness between these established units and the business domains. Further, initiatives are executed in cross-functional teams. One organization advanced the latter to permanent cross-functional analytics teams with their own backlog of projects.
Another focus area is the development of, and commitment to, rules and codes of conduct. These can be employee-focused to build acceptance of, and trust in, analytics, such as a code of conduct on what restrictions exist on accessing and analyzing data, or a commitment that automation solutions must benefit employees rather than replace them. Transparency and communication of these commitments are key. The rules can further be organization-focused and shift control over analytics-related decisions. Examples are the centralization of decisions on sharing data with partners or even making data sharing part of contracts with partners, which is becoming more frequent due to service level agreements and performance evaluations.
Finally, designated budgets for analytics as an organizational change in the budgeting process are used to stimulate analytics initiatives. However, the appropriateness of the resulting initiatives needs to be checked (e.g., by consultation with analysts).

4.2.6. Measures Contributing to Obtaining Capabilities—Procedures

The collected measures to enhance the development process of analytics solutions are most notable in that they represent reactions to issues previously encountered. Some of these formalized procedures address information and knowledge gaps that are time- and resource-consuming to close, while the measures themselves are merely cost-efficient fixes. As described by the interviewees, the real solution for these gaps is to gain experience with analytics, which is achieved by participating in analytics initiatives; the fixes are used to avoid these issues until the experience gained makes them obsolete. In this sense, business experts’ inability to translate business needs into analytical needs can be fixed through review and feedback loops with analysts. Analysts’ misinterpretation of users’ demands, which results in unneeded and unused analytics solutions, is fixed by formal agreements establishing commitment and willingness. Business experts’ unwillingness to cooperate is fixed by analysts assuming responsibility for the focal decision-making processes of the analytics initiative. Other formalized measures schedule activities early to avoid impairing issues in later stages. During project execution, this is supported by pre-checks of available data and of process stability and flexibility, as well as by cloud-hosted containerization of the solution development. During the phase of using the solutions, their long-term success is facilitated through documentation, including incorporating solutions into standard operating procedures and scheduling long-term evaluations of the solutions as part of deployment.
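As a sketch of what a pre-check of available data could look like in its simplest form, the following Python fragment verifies completeness and date coverage of a dataset before a project is committed to. The column names, file format, and thresholds are assumptions for illustration only.

```python
import csv
from datetime import datetime

REQUIRED_COLUMNS = {"order_id", "ship_date", "carrier", "cost"}   # assumed schema
MAX_MISSING_SHARE = 0.05                                          # assumed tolerance

def pre_check(path: str) -> dict:
    """Rough data-availability pre-check before committing to an initiative."""
    with open(path, newline="") as f:
        rows = list(csv.DictReader(f))
    missing_cols = REQUIRED_COLUMNS - set(rows[0].keys()) if rows else REQUIRED_COLUMNS
    missing_vals = sum(1 for r in rows for c in REQUIRED_COLUMNS & set(r) if not r[c])
    dates = [datetime.strptime(r["ship_date"], "%Y-%m-%d")
             for r in rows if r.get("ship_date")]
    return {
        "rows": len(rows),
        "missing_columns": sorted(missing_cols),
        "missing_value_share": missing_vals / max(1, len(rows) * len(REQUIRED_COLUMNS)),
        "date_range": (min(dates), max(dates)) if dates else None,
        "passes": not missing_cols
                  and missing_vals <= MAX_MISSING_SHARE * len(rows) * len(REQUIRED_COLUMNS),
    }

# Usage: report = pre_check("shipments.csv"); abort or rescope planning if not report["passes"].
```

Running such a check before the planning phase is closed is one way of scheduling activities early to avoid impairing issues in later stages, as described above.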
As pointed out by one researcher during the data analysis phase of this article, some measures listed under this core category are merely practices for successful project management. However, humans might have difficulties in transferring practices from one domain of application to another while still collecting experience with the new domain.

4.2.7. Measures Contributing to Involvement and Communication

The final core category of measures identified comprises measures contributing to the creation of involvement and communication in order to improve each side’s understanding of the other’s actions and needs and, thus, to improve decision-making. These measures can benefit an executed analytics initiative both directly and indirectly. Direct measures address a high level of involvement of the intended users in the development process, such as early and constant inclusion in an agile solution development format with regular sprint meetings, co-location of analysts and users, or a collaborative field-and-forum approach with analysts directly observing and taking part in the activities and decision-making of the users. Further, deployment processes can focus extensively on the users’ experience with the solution by presenting the solution in a user-oriented form, by gradually introducing the solution into the process, or by providing accessible and user-oriented training events. Interviewees emphasized that the focus on the user during all stages of development is vital for the initiative’s success. Besides the importance of user involvement, it was further acknowledged that the users’ mindset is occupied with their business responsibility; their obligation to an analytics initiative is the advocacy of business interests, not the understanding of analytical methods.
However, to advance in analytics, the organization’s employees need to develop an understanding of how to interpret the results of analytics solutions and their implications and become familiar with them. Thus, a repeatedly highlighted indirect measure is to constantly request justification for employees’ decisions in the form of data-based results. Another indirect measure is to promote and support exchange among analysts. These indirect measures create an environment that enhances collaboration in potential upcoming analytics initiatives.

4.3. Discussion on Applying Measures and Handling Barriers

As introduced above, the impact of measures on an organization could not be generalized from the data and is highly dependent on the context of the organization. Due to this and to the dynamic nature of measures, which exist at different stages of development, allocating measures to specific barriers was not supported by the collected data. Below, further points regarding the core categories derived from the Grounded Theory methodology are discussed.
Considering the capability barriers, the variety of requirements for more complex analytics initiatives is underlined. Scholars have highlighted the need for a technical foundation [52] as well as the required talent [10], the co-dependency of technical capacity and skills [75], the need for data quality [42], and the required knowledge of employees and managers [76]. This study identified a multitude of requirements: analytical talent, the IT systems landscape, the data to be analyzed, organizational structures, the knowledge of the people expected to cooperate with analysts, and the condition of the unit of analysis. Moreover, this study identified that data interacts with all other requirements, with its quality and accessibility decreasing where these other requirements are not met. In particular, a lack of responsibility affects data negatively, which indicates a missing understanding of data as an organizational asset—a paradigm scholars have emphasized in the past [75]. However, interviewees reported impactful results achieved with analyses on a small scale—small enough to be run on personal laptops—demanding lower levels of commitment.
Discussing cultural barriers, this study uses this label for the behaviors and beliefs of individuals in the organization, which may not reflect the general culture of the organization. The barriers allocated to this category highlight the cross-functional nature of analytics in LSCM, since they result from treating analytics as an isolated technology topic. The need for collaboration and exchange has been discussed in the literature [77], but this study emphasizes that information asymmetries resulting from a lack of cooperation foster individual beliefs that negatively impact the acceptance of analytics in the organization. While unwillingness at the employee level is the more directly observable effect, this study concurs with Hayes’s [78] argument that unrealistic expectations and visions have negative effects as well; these are individual beliefs of overestimation that likewise result from information asymmetries. However, relating the findings to the literature and interviews, this issue is not specific to analytics but one of change management and human resistance to change, as observed with many technologies.
Concerning measures, the concept of demystification was of notable importance to the interviewees as part of their work. Scholars have addressed convincing employees and management and, to enable employees to experience the value of analytics [12], recommend starting with smaller initiatives to obtain presentable successes [52]; they discuss further measures to show the benefit of a data-driven organization, as examined in the theoretical background. However, this study has identified a multitude of additional approaches to achieve this, which are used in parallel, displaying the practical relevance of this core category of measures. Their intent is to close the information asymmetries and set realistic expectations while building on objective and presentable benefits. Ideally, the cases presented are as close as possible to the tasks and processes of the individuals to be convinced. Neither is indoctrination involved, nor an insistence on believing the benefits without evidence.
The notion of a “data-driven organization” to address human capability is broadly present in the literature [21,45,51]. This study underlines that this idea requires employees who are able to include data in their decision-making processes, rather than an organization composed only of analysts; analysts complement the organization’s roles, they do not substitute for them. Moreover, the interviews underlined that organizations already have analytics-savvy people amongst their employees—the topic has grown on organizations and did not appear spontaneously, as the rise of interest in recent years might suggest. These employees need to be deployed intelligently and have their skills developed further. External talent can nevertheless provide outside stimulation and access to the latest methods, although attracting it might be cumbersome. However, a growing supply of talent and the increased use of existing resources result in a decrease of scarcity and of its estimated impact [76].
Concerning the creation of data capabilities, this study has repeatedly emphasized the non-technical measures that need to be taken. Beyond that, an extraordinary point raised in one interview is the reuse of unstructured data in free-text fields, because the usual approach of dismissing the data and replacing the collection with selectable options was not possible. While data quality is an issue and there are obvious data collection errors [42], free-text data collection might be beneficial for a variety of reasons. Hence, the collection approach should be selected in accordance with the information needs of the organization and not solely with the analytical methods it intends to apply. Additionally, while analysts must usually cooperate with the users, for new product developments whose benefits to the organization are to be analyzed only after release, the analysts themselves assume the user role and the scenario is reversed: the user-orientation applies vice versa, and analysts need to be involved in the new product development.
Measures for building technology capabilities were not found to be a regular focus of the analytics literature. Platform concepts or general investments in IT have been suggested [10,13], while the technology measures collected in this study show a certain modesty. The single-source-of-truth is a vision to converge towards, but leapfrogging to it is not considered necessary for most LSCM organizations due to its complexity. First, established organizations experience a high heterogeneity of IT systems, which are challenging to integrate into one in a single action. Second, the resulting IT system will likely be challenging to manage. For this reason, even digital business models, which have an advantage in this area, must decide how much single-source-of-truth they can afford so that the benefits outweigh the effort of keeping the system running. Concerning the modesty mentioned above, interviewees explained that they currently have enough use cases and therefore do not require a single-source-of-truth right away. They can keep themselves busy, gain benefits from analytics, and transform their organizations into data-driven ones while the IT systems converge towards a single-source-of-truth in stages.
The measures to build organizational capabilities intend to reduce the organizational distance between analysts and business process experts. Reducing this distance has been discussed previously, with beneficial measures such as a hybrid model of centralized and decentralized localization of analysts [79,80] or the implementation of data governance [75,81]. Beyond that, this study has presented further benefits of such measures: accessibility of analytics to employees is created, oversight of analytics and the exploitation of synergies from similar use cases are enabled, and trust in the analytical activities and the handling of data is built. Further, this study emphasizes that realizing the value and benefit of an analytics initiative requires users who actually utilize its solution. To achieve this, measures must be taken such that the users—the employees intended to use the developed solution—contribute their requirements for the solution to the development process. In addition, for strategic purposes in LSCM organizations, analytics was emphasized as supporting the organizational strategy, not being it; analytics capabilities should not be developed for the sole purpose of developing them.
The measures to create procedural capabilities represent project management experience with analytics. As reported in the literature, analytics is approached along different paths by different organizations [82]—or rather, the path is not clear in the beginning and organizations have to test what is best for them. Above, some measures were discussed as good project management practices detached from analytics. However, an organization with little experience coping with the complexity of larger analytics initiatives might be distracted from applying its known good practices immediately and therefore make fundamental project management mistakes. Hence, the literature has recommended implementing analytics technical policy committees that collect good practices and set standards [79]. Further, distributing the practices amongst analysts requires coaching, since they might be unaware of the hazards of certain practices [83]. Thus, practices may evolve as a reaction to issues, because it takes time to recognize issues as such. Due to this evolving character, and because such collections, where they exist, are likely to be considered intellectual property not to be shared in interviews, additional practices relevant to this category may not have been collected.
The importance of presenting a product in such a way that the intended users best comprehend the product’s benefits and features is not surprising; therefore, the measures contributing to involvement and communication are not particularly innovative. Nor is it surprising that frequent involvement of users in the development process results in improved adherence to needs and requirements. The measures in this category are mostly of such a type but were explained as the results of gaining experience with analytics. The literature occasionally illustrates an outdated picture of “back office” analysts with few user interactions [4]; for such analysts, adopting collaborative approaches has been an effortful learning process. Transferring these approaches to analytics improves the mutual understanding of users’ domain requirements and analysts’ analytical requirements [34]. Several measures building on these approaches have been presented, including, but not limited to, recommendations from the literature such as visually appealing solutions and the translation of solutions into business terms [44,50]. In addition, indirect measures of collaboration were presented, which are not part of the solution development process and may at first appear annoying or distracting but can provide benefits later on.

5. Conclusions

This study investigated barriers to implementing and successfully applying analytics in LSCM and measures to overcome these barriers. The primary focus was to provide managers in LSCM organizations interested in using analytics with indications of which barriers to pay attention to and to avoid, as well as with measures to improve their actions and analytics initiatives, including the data-driven decision-making of their employees. As a result, this study has pursued an exploratory objective. The secondary focus of this study is the systematization and extraction of core categories from the identified barriers and measures. With regard to the systematization of conceptual contributions by MacInnis [84], this research is “delineating,” with the researchers taking the metaphorical role of cartographers, mapping the entities and their relationships. To achieve this objective, this study used the Grounded Theory approach supplemented with the Q-Methodology.
The results of this study in this regard are as follows. Barriers to SCA have been mapped into two core categories, capabilities and culture, whereby culture addresses the behavior and beliefs of single individuals and not the culture of the organization. In detail, the barriers have been mapped to four subcategories in the capabilities category: unfitting conditions, unfitting resources, missing responsibility, and missing knowledge. Similarly, subcategories of the culture category have been derived: unwillingness, emotion, and missing critical thinking. The identified measures have been mapped to derive general core categories of measures. However, the data collected did not allow generalization of the impact of certain specific measures on specific barriers. Comparable to a metaphorical ship sailing through mapped barriers blocking its journey across the sea, the specific measures to overcome the barriers depend on a variety of contextual particularities, such as the type of ship, the skill of the crew, or the weather. However, general core categories of measures to apply and adapt to the individual particularities can be provided to ease the journey. Seven such core categories have been derived and mapped onto groups of barriers: demystification, building human capabilities, building data capabilities, building technological capabilities, building organizational capabilities, building procedural capabilities, and involvement and communication. Condensed to these core categories, the measures present a pattern of addressing capability barriers directly while having an indirect effect on cultural barriers; addressing cultural barriers directly was rarely observed. In line with the general idea of analytics seeking objective evidence for decision making, this fits the approach of creating convincing evidence of benefits and building the skills to experience these benefits in order to influence beliefs about analytics, as opposed to addressing beliefs directly.

5.1. Managerial Implications

Managers in LSCM organizations can use the results of this study in a variety of ways. The systematization of barriers can be used to investigate individual analytics processes and initiatives. The barriers might not be visible to managers, even as they fail to realize benefits and value from their initiatives. Considering the barriers identified, managers are able to critically reflect on and analyze their initiatives and take action accordingly. In regard to the discussion above, managers can use these results to gain awareness regarding their analytics initiatives.
In addition, measures can be derived to support analytics initiatives in individual organizations. While this study refrains from recommending specific measures for certain barriers, managers can review the collected measures, evaluate their fit to the individual context of their organization, and apply them with appropriate adaptations. The core categories of measures, which this study puts into focus while recommending that individually adapted measures be derived from them, can further provide starting points for managers to address barriers in their own organizations.
Managers should certainly become attentive to measures that can be used outside of specific analytics initiatives. Organizational measures such as establishing codes of conduct, allocating responsibilities, or organizing all affected business units to voice a harmonized request for investment and budget allocation directly impact the quality of analytics initiatives through better resources. Furthermore, indirect measures such as the creation of informal exchanges among analysts (or between analysts and non-analytics employees) can lead to informal relationships and an overview of available skills and knowledge. This overview of available skills, and the relationships to access them, can shorten reaction times or improve reaction quality when issues requiring skills outside the project team are encountered during analytics initiatives. If managers combine this with incentives to learn and test new approaches together, such as in communities of interest, the available range of skills in the organization may also improve.
Finally, while analysts may frequently not have the personality traits to relish presenting themselves and their results, this study emphasizes the importance of their results being presented to other peer groups. Thus, managers may need to identify presentable results, motivate analysts to present the achieved benefits, and provide them with occasions to do so. These occasions are not intended for “showing off” but to motivate the creation of more and new use cases for analytics that are beneficial and valuable to the organization.

5.2. Limitations and Further Research

As a limitation of this study, the low impact of the particularities of LSCM on the results must be considered. It had been assumed that the LSCM domain would have a larger impact on the experienced barriers; in particular, deeper insights into the integration of several business units along the flow of materials, the complexity of data collection on moving assets, and the cooperation with supply chain partners had been expected beforehand. While the choice of participants was influenced by the maturity of the organization with analytics in LSCM in order to specifically identify LSCM-inherent barriers, the participants predominantly reported a lack of such barriers because their current SCA use cases do not focus on analytics collaborations. In the words of one interviewee, the organizations “are not there yet” and have barely had the chance to experience these kinds of barriers. Collaborations on use cases along the supply chain currently play a negligible role for LSCM organizations. As a result, a future study on barriers in SCA collaborations is needed once they occur more regularly. Related to this, no implications from use cases crossing internal organizational lines have been observed. While such crossing is common in LSCM, it does not seem to increase the complexity of analytics initiatives as long as effective stakeholder management is in place. However, the lack of barriers due to crossing organizational lines could originate from LSCM functions being used to cooperating with other functions in the organization, putting SCA initiatives in a favorable position. Since this study is not comparative, this interpretation must be checked in additional research. Regarding the expected and assumed technical issues, especially in data collection, participants showed strong confidence in solutions bypassing such barriers with proxies and reasonable means already at hand. However, whether these proxy solutions work reliably and whether this kind of “problem fixing” is a reasonable measure for SCA requires long-term observation and is beyond the focus of this research.
Another limitation is the exploratory nature of this research, which aims at identifying a wide range of effects and their relationships, but not their size. These sizes must be tested in confirmatory research with quantitative methods. This study, in its exploratory manner, is intended to provide a foundation for such quantitative research.
The sample of participants was compiled with great effort to identify suitable experts in relevant organizations, and this effort was halted only when the data collected through the interviews had become repetitive, with strongly decreasing marginal insights. This approach corresponds to the methodological recommendations but does not guarantee the non-existence of further barriers or measures. In addition, this study concludes that the barriers and measures are dynamically evolving and developing. Thus, the identification of barriers in executing SCA initiatives, and of measures to cope with these barriers, needs to be incorporated into further research. An additional limitation regarding the participants is their refusal to compare their barriers and measures to other domains, since they lack knowledge of those domains. However, this study has focused on LSCM and did not intend such a comparison; in fact, the researchers appreciate the self-awareness of the participants and their restraint from reporting assumptions or hearsay. As a result, this study can only present the state of LSCM and has no data to support a comparison with other domains.
Further research is required to identify contextual factors that moderate the impact of measures. Since organizations are currently limited in ideas, multi-actor use cases in supply chain analytics need to be identified and their value investigated. As opposed to the limitation of collecting barriers from an a posteriori perspective, as discussed above, this demands research into the creation of ideas, possibly based on LSCM needs and challenges such as increasing supply chain agility or resilience. Concerning the field of analytics, further research into the single-source-of-truth is needed, including investigating trade-offs between the availability of data and the speed of processing it, as well as pathways towards the convergence of IT systems. This study has collected a user perspective on these issues, which argues for a cost-benefit paradigm; however, a deeper investigation of the single-source-of-truth is beyond the scope of this study. In addition, the scarcity of analytical talent needs to be reevaluated. While the observable demand for talent has seemingly created a surge in the quantity of analytical talent in recent years, the quality of that talent may not yet be satisfactory for organizations, which represents an issue beyond the scope of this study.

Author Contributions

Conception and design of study, T.T.H.; Recruiting of Interviewees, T.T.H. and B.N.; Collection of data, T.T.H.; Analysis and interpretation of data, T.T.H., B.N. and B.G.; Critical revision of analysis and interpretation, T.T.H., B.N. and B.G.; Draft of the manuscript, T.T.H.; Critical revising the manuscript, T.T.H., B.N. and B.G. All authors have read and agreed to the published version of the manuscript.

Funding

This research received no external funding.

Conflicts of Interest

The authors declare no conflict of interest.

References

  1. Monahan, S.; Ward, J.; Zimmermann, M.; Sonthalia, B.; Vanhencxthoven, M. Accelerating into Uncertainty; CSCMP: Lombard, IL, USA, 2017. [Google Scholar]
  2. Chung, G.; Gesing, B.; Chaturvedi, K.; Bodenbrenner, P. DHL Logistics Trend Radar; DHL Customer Solutions & Innovation: Bonn, Germany, 2018. [Google Scholar]
  3. Reinsel, D.; Gantz, J.; Rydning, J. Data Age 2025: The Digitization of the World From Edge to Core; Seagate: Cupertino, CA, USA, 2018. [Google Scholar]
  4. Davenport, T.H.; Harris, J.G. Competing on Analytics: The New Science of Winning, 2nd ed.; Harvard Business School Press: Boston, MA, USA, 2017; ISBN 9781633693722. [Google Scholar]
  5. Souza, G.C. Supply chain analytics. Bus. Horiz. 2014, 57, 595–605. [Google Scholar] [CrossRef]
  6. Wang, G.; Gunasekaran, A.; Ngai, E.W.T.; Papadopoulos, T. Big data analytics in logistics and supply chain management: Certain investigations for research and applications. Int. J. Prod. Econ. 2016, 176, 98–110. [Google Scholar] [CrossRef]
  7. Chae, B.K.; Yang, C.; Olson, D.; Sheu, C. The impact of advanced analytics and data accuracy on operational performance: A contingent resource based theory (RBT) perspective. Decis. Support Syst. 2014, 59, 119–126. [Google Scholar] [CrossRef] [Green Version]
  8. Trkman, P. The critical success factors of business process management. Int. J. Inf. Manage. 2010, 30, 125–134. [Google Scholar] [CrossRef]
  9. Dutta, D.; Bose, I. Managing a big data project: The case of Ramco cements limited. Int. J. Prod. Econ. 2015, 165, 293–306. [Google Scholar] [CrossRef]
  10. Sanders, N.R. How to Use Big Data to Drive Your Supply Chain. Calif. Manage. Rev. 2016, 58, 26–48. [Google Scholar] [CrossRef]
  11. Lai, Y.; Sun, H.; Ren, J. Understanding the determinants of big data analytics (BDA) adoption in logistics and supply chain management. Int. J. Logist. Manag. 2018, 29, 676–703. [Google Scholar] [CrossRef]
  12. Schoenherr, T.; Speier-Pero, C. Data Science, Predictive Analytics, and Big Data in Supply Chain Management: Current State and Future Potential. J. Bus. Logist. 2015, 36, 120–132. [Google Scholar] [CrossRef]
  13. Kache, F.; Seuring, S. Challenges and opportunities of digital information at the intersection of Big Data Analytics and supply chain management. Int. J. Oper. Prod. Manag. 2017, 37, 10–36. [Google Scholar] [CrossRef]
  14. Richey, R.G.; Morgan, T.R.; Lindsey-Hall, K.; Adams, F.G. A global exploration of Big Data in the supply chain. Int. J. Phys. Distrib. Logist. Manag. 2016, 46, 710–739. [Google Scholar] [CrossRef]
  15. APICS. Exploring the Big Data Revolution; APICS: Chicago, IL, USA, 2015. [Google Scholar]
  16. Pearson, M.; Gjendem, F.H.; Kaltenbach, P.; Schatteman, O.; Hanifan, G. Big Data Analytics in Supply Chain: Hype or Here to Stay? Accenture: Munich, Germany, 2014. [Google Scholar]
  17. Thieullent, A.-L.; Colas, M.; Buvat, J.; KVJ, S.; Bisht, A. Going Big: Why Companies Need to Focus on Operational Analytics; Capgemini: Paris, France, 2016. [Google Scholar]
  18. Strauss, A.L.; Corbin, J.M. Basics of Qualitative Research: Techniques and Procedures for Developing Grounded Theory; SAGE Publications: Thousand Oaks, CA, USA, 1998. [Google Scholar]
  19. Brown, M. Illuminating Patterns of Perception: An Overview of Q Methodology; Carnegie-Mellon Univ: Pittsburgh, PA, USA, 2004. [Google Scholar]
  20. Valenta, A.L.; Wigger, U. Q-methodology: Definition and Application in Health Care Informatics. J. Am. Med. Informatics Assoc. 1997, 4, 501–510. [Google Scholar] [CrossRef] [PubMed] [Green Version]
  21. Holsapple, C.; Lee-Post, A.; Pakath, R. A unified foundation for business analytics. Decis. Support Syst. 2014, 64, 130–141. [Google Scholar] [CrossRef]
  22. Beer, D. Envisioning the power of data analytics. Inf. Commun. Soc. 2018, 21, 465–479. [Google Scholar] [CrossRef]
  23. Gandomi, A.; Haider, M. Beyond the hype: Big data concepts, methods, and analytics. Int. J. Inf. Manage. 2015, 35, 137–144. [Google Scholar] [CrossRef] [Green Version]
  24. Cao, G.; Duan, Y.; Li, G. Linking Business Analytics to Decision Making Effectiveness: A Path Model Analysis. IEEE Trans. Eng. Manag. 2015, 62, 384–395. [Google Scholar] [CrossRef] [Green Version]
  25. Ransbotham, S.; Kiron, D.; Prentice, P.K. Minding the Analytics Gap. MIT Sloan Manag. Rev. 2015, 56, 63–68. [Google Scholar]
  26. Ghasemaghaei, M.; Hassanein, K.; Turel, O. Increasing firm agility through the use of data analytics: The role of fit. Decis. Support Syst. 2017, 101, 95–105. [Google Scholar] [CrossRef]
  27. Simchi-Levi, D.; Simchi-Levi, E.; Kaminsky, P. Designing and Managing the Supply Chain: Concepts, Strategies, and Cases, 3rd ed.; McGraw-Hill New York: Boston, MA, USA, 2003. [Google Scholar]
  28. Christopher, M. Logistics & Supply Chain Management; Financial Times Prentice Hall: New York, NY, USA, 2011. [Google Scholar]
  29. Dittfeld, H.; Scholten, K.; Van Donk, D.P. Burden or blessing in disguise: interactions in supply chain complexity. Int. J. Oper. Prod. Manag. 2018, 38, 314–332. [Google Scholar] [CrossRef]
  30. Bode, C.; Wagner, S.M. Structural drivers of upstream supply chain complexity and the frequency of supply chain disruptions. J. Oper. Manag. 2015, 36, 215–228. [Google Scholar] [CrossRef]
  31. Heckmann, I.; Comes, T.; Nickel, S. A critical review on supply chain risk – Definition, measure and modeling. Omega 2015, 52, 119–132. [Google Scholar] [CrossRef] [Green Version]
  32. Nitsche, B.; Durach, C.F. Much discussed, little conceptualized: supply chain volatility. Int. J. Phys. Distrib. Logist. Manag. 2018, 48, 866–886. [Google Scholar] [CrossRef]
  33. Herden, T.T.; Bunzel, S. Archetypes of Supply Chain Analytics Initiatives—An Exploratory Study. Logistics 2018, 2, 10. [Google Scholar] [CrossRef] [Green Version]
  34. Waller, M.A.; Fawcett, S.E. Data Science, Predictive Analytics, and Big Data: A Revolution That Will Transform Supply Chain Design and Management. J. Bus. Logist. 2013, 34, 77–84. [Google Scholar] [CrossRef]
  35. Brinch, M.; Stentoft, J.; Jensen, J.K.; Rajkumar, C. Practitioners understanding of big data and its applications in supply chain management. Int. J. Logist. Manag. 2018, 29, 555–574. [Google Scholar] [CrossRef]
  36. Oliveira, M.P.V.D.; McCormack, K.; Trkman, P. Business analytics in supply chains – The contingent effect of business process maturity. Expert Syst. Appl. 2012, 39, 5488–5498. [Google Scholar] [CrossRef]
  37. Zhu, S.; Song, J.; Hazen, B.T.; Lee, K.; Cegielski, C. How supply chain analytics enables operational supply chain transparency. Int. J. Phys. Distrib. Logist. Manag. 2018, 48, 47–68. [Google Scholar] [CrossRef]
  38. Ramanathan, R.; Philpott, E.; Duan, Y.; Cao, G. Adoption of business analytics and impact on performance: a qualitative study in retail. Prod. Plan. Control 2017, 28, 985–998. [Google Scholar] [CrossRef]
  39. Srinivasan, R.; Swink, M. An Investigation of Visibility and Flexibility as Complements to Supply Chain Analytics: An Organizational Information Processing Theory Perspective. Prod. Oper. Manag. 2018, 27, 1849–1867. [Google Scholar] [CrossRef]
  40. Trkman, P.; McCormack, K.; de Oliveira, M.P.V.; Ladeira, M.B. The impact of business analytics on supply chain performance. Decis. Support Syst. 2010, 49, 318–327. [Google Scholar] [CrossRef] [Green Version]
  41. Brynjolfsson, E. The productivity paradox of information technology. Commun. ACM 1993, 36, 66–77. [Google Scholar] [CrossRef]
  42. Hazen, B.T.; Boone, C.A.; Ezell, J.D.; Jones-Farmer, L.A. Data quality for data science, predictive analytics, and big data in supply chain management: An introduction to the problem and suggestions for research and applications. Int. J. Prod. Econ. 2014, 154, 72–80. [Google Scholar] [CrossRef]
  43. Roßmann, B.; Canzaniello, A.; von der Gracht, H.; Hartmann, E. The future and social impact of Big Data Analytics in Supply Chain Management: Results from a Delphi study. Technol. Forecast. Soc. Change 2018, 130, 135–149. [Google Scholar] [CrossRef]
  44. Wixom, B.H.; Yen, B.; Relich, M. Maximizing Value from Business Analytics. MIS Q. Exec. 2013, 12, 111–123. [Google Scholar]
  45. Ross, J.W.; Beath, C.M.; Quaadgras, A. You May Not Need Big Data After All. Harv. Bus. Rev. 2013, 91, 90–98. [Google Scholar]
  46. Watson, H.J. Tutorial: Big Data Analytics: Concepts, Technologies, and Applications. Commun. Assoc. Inf. Syst. 2014, 34, 1247–1268. [Google Scholar] [CrossRef]
  47. Seddon, P.B.; Constantinidis, D.; Tamm, T.; Dod, H. How does business analytics contribute to business value? Inf. Syst. J. 2017, 27, 237–269. [Google Scholar] [CrossRef]
  48. McAfee, A.; Brynjolfsson, E. Big data: the management revolution. Harv. Bus. Rev. 2012, 90, 60–68. [Google Scholar]
  49. Marchand, D.A.; Peppard, J. Why IT fumbles analytics. Harv. Bus. Rev. 2013, 91, 104–112. [Google Scholar]
  50. Bose, R. Advanced analytics: opportunities and challenges. Ind. Manag. Data Syst. 2009, 109, 155–172. [Google Scholar] [CrossRef] [Green Version]
  51. Barton, D.; Court, D. Making advanced analytics work for you. Harv. Bus. Rev. 2012, 90, 78–83. [Google Scholar]
  52. Lavalle, S.; Lesser, E.; Shockley, R.; Hopkins, M.S.; Kruschwitz, N. Big Data, Analytics and the Path From Insights to Value. MIT Sloan Manag. Rev. 2011, 52, 21–32. [Google Scholar]
  53. Ellingsen, I.T.; Størksen, I.; Stephens, P. Q methodology in social work research. Int. J. Soc. Res. Methodol. 2010, 13, 395–409. [Google Scholar] [CrossRef]
  54. Rauer, J.; Kaufmann, L. Mitigating External Barriers to Implementing Green Supply Chain Management: A Grounded Theory Investigation of Green-Tech Companies’ Rare Earth Metals Supply Chains. J. Supply Chain Manag. 2015, 51, 65–88. [Google Scholar] [CrossRef]
  55. Saldanha, J.P.; Mello, J.E.; Knemeyer, A.M.; Vijayaraghavan, T.A.S. Implementing Supply Chain Technologies in Emerging Markets: An Institutional Theory Perspective. J. Supply Chain Manag. 2015, 51, 5–26. [Google Scholar] [CrossRef]
  56. Omar, A.; Davis-Sramek, B.; Fugate, B.S.; Mentzer, J.T. Exploring the Complex Social Processes of Organizational Change: Supply Chain Orientation From a Manager’s Perspective. J. Bus. Logist. 2012, 33, 4–19. [Google Scholar] [CrossRef]
  57. Kaufmann, L.; Carter, C.R.; Buhrmann, C. Debiasing the supplier selection decision: a taxonomy and conceptualization. Int. J. Phys. Distrib. Logist. Manag. 2010, 40, 792–821. [Google Scholar] [CrossRef]
  58. Nicholas, P.K.; Mandolesi, S.; Naspetti, S.; Zanoli, R. Innovations in low input and organic dairy supply chains—What is acceptable in Europe? J. Dairy Sci. 2014, 97, 1157–1167. [Google Scholar] [CrossRef] [Green Version]
  59. Miles, M.B.; Huberman, A.M.; Saldana, J. Qualitative Data Analysis: A Methods Sourcebook, 3rd ed.; Sage: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  60. Yin, R.K. Case Study Research: Design and Methods; SAGE Publications: Thousand Oaks, CA, USA, 2014. [Google Scholar]
  61. Bode, C.; Hübner, D.; Wagner, S.M. Managing Financially Distressed Suppliers: An Exploratory Study. J. Supply Chain Manag. 2014, 50, 24–43. [Google Scholar] [CrossRef]
  62. Macdonald, J.R.; Corsi, T.M. Supply Chain Disruption Management: Severe Events, Recovery, and Performance. J. Bus. Logist. 2013, 34, 270–288. [Google Scholar] [CrossRef]
  63. Rivera, L.; Gligor, D.; Sheffi, Y. The benefits of logistics clustering. Int. J. Phys. Distrib. Logist. Manag. 2016, 46, 242–268. [Google Scholar] [CrossRef] [Green Version]
  64. McCracken, G. The Long Interview; Sage: Newbury Park, CA, USA, 1988; ISBN 0803933533. [Google Scholar]
  65. Eisenhardt, K.M. Building theories from case study research. Acad. Manag. Rev. 1989, 14, 532–550. [Google Scholar] [CrossRef]
  66. Gligor, D.M.; Autry, C.W. The role of personal relationships in facilitating supply chain communications: a qualitative study. J. Supply Chain Manag. 2012, 48, 24–43. [Google Scholar] [CrossRef]
  67. Manuj, I.; Sahin, F. A model of supply chain and supply chain decision-making complexity. Int. J. Phys. Distrib. Logist. Manag. 2011, 41, 511–549. [Google Scholar] [CrossRef]
  68. De Leeuw, S.; Minguela-Rata, B.; Sabet, E.; Boter, J.; Sigurðardóttir, R. Trade-offs in managing commercial consumer returns for online apparel retail. Int. J. Oper. Prod. Manag. 2016, 36, 710–731. [Google Scholar] [CrossRef]
  69. Kaufmann, L.; Denk, N. How to demonstrate rigor when presenting grounded theory research in the supply chain management literature. J. Supply Chain Manag. 2011, 47, 64–72. [Google Scholar] [CrossRef]
  70. Amin, Z. Q methodology - A journey into the subjectivity of human mind. Singapore Med. J. 2000, 41, 410–414. [Google Scholar]
  71. Thornton, L.M.; Esper, T.L.; Morris, M.L. Exploring the impact of supply chain counterproductive work behaviors on supply chain relationships. Int. J. Phys. Distrib. Logist. Manag. 2013, 43, 786–804. [Google Scholar] [CrossRef]
  72. Flint, D.J.; Woodruff, R.B.; Gardial, S.F. Exploring the Phenomenon of Customers’ Desired Value Change in a Business-to-Business Context. J. Mark. 2002, 66, 102–117. [Google Scholar] [CrossRef]
  73. Hirschman, E.C. Humanistic Inquiry in Marketing Research: Philosophy, Method, and Criteria. J. Mark. Res. 1986, 23, 237–249. [Google Scholar] [CrossRef] [Green Version]
  74. Lincoln, Y.S.; Guba, E.G. Establishing trustworthiness. In Naturalistic Inquiry; Sage: Newbury Park, CA, USA, 1985; pp. 289–331. [Google Scholar]
  75. Kiron, D.; Prentice, P.K.; Ferguson, R.B. Raising the Bar With Analytics. MIT Sloan Manag. Rev. 2014, 55, 28–33. [Google Scholar]
  76. Manyika, J.; Chui, M.; Brown, B.; Bughin, J.; Dobbs, R.; Roxburgh, C.; Byers, A.H. Big data: The next frontier for innovation, competition, and productivity. McKinsey Glob. Inst. 2011, 156. [Google Scholar]
  77. Janssen, M.; van der Voort, H.; Wahyudi, A. Factors influencing big data decision-making quality. J. Bus. Res. 2017, 70, 338–345. [Google Scholar] [CrossRef]
  78. Hayes, J. The Theory and Practice of Change Management, 5th ed.; Palgrave Macmillan: London, UK, 2018. [Google Scholar]
  79. Grossman, R.L.; Siegel, K.P. Organizational Models for Big Data and Analytics. J. Organ. Des. 2014, 3, 20. [Google Scholar] [CrossRef] [Green Version]
  80. Díaz, A.; Rowshankish, K.; Saleh, T. Why data culture matters. McKinsey Q. 2018, 3, 36–53. [Google Scholar]
  81. Ransbotham, S.; Kiron, D.; Prentice, P.K. Beyond the Hype: The Hard Work behind Analytics Success; MIT Sloan Management Review; Massachusetts Institute of Technology: Cambridge, MA, USA, 2016. [Google Scholar]
  82. Kiron, D.; Shockley, R.; Kruschwitz, N.; Finch, G.; Haydock, M. Analytics: The Widening Divide. MIT Sloan Manag. Rev. 2012, 53, 1–23. [Google Scholar]
  83. Harris, J.G.; Craig, E. Developing analytical leadership. Strateg. HR Rev. 2011, 11, 25–30. [Google Scholar] [CrossRef]
  84. MacInnis, D.J. A Framework for Conceptual Contributions in Marketing. J. Mark. 2011, 75, 136–154. [Google Scholar] [CrossRef] [Green Version]
Figure 1. Barriers of supply chain analytics.
Figure 2. Measures to support supply chain analytics.
Table 1. Interview Participants.
Participant | Position (Anonymized) | Actor | Analytics Exp. [yrs]
A | Manager (functional) Analytics | OEM | 8
B | Data Scientist | LSP | 12
C | Data Scientist | Supplier | 3
D | Head of Analytics | OEM | 19
E | Manager Analytics | Retail | 3
F | Director Analytics | OEM | 8
G | Manager Analytics | Supplier | 2
H | Head of (functional) Analytics | LSP | 6
I | Manager Analytics | Retail | 14
J | Sen. Data Scientist | LSP | 4
K | Data Scientist | Analytics Provider | 3
L | Data Scientist | Analytics Provider | 3
M | Head of Analytics | LSP | 3
