Article

Conceptualizing Corporate Digital Responsibility: A Digital Technology Development Perspective

1 China Institute for Small and Medium Enterprise, Zhejiang University of Technology, Hangzhou 310023, China
2 School of Management, Zhejiang University of Technology, Hangzhou 310023, China
* Author to whom correspondence should be addressed.
Sustainability 2023, 15(3), 2319; https://doi.org/10.3390/su15032319
Submission received: 31 October 2022 / Revised: 15 January 2023 / Accepted: 21 January 2023 / Published: 27 January 2023

Abstract

Managers and scholars require an appropriate conceptualization, as well as reliable and valid measures, of corporate digital responsibility (CDR) to better understand and tackle issues involving CDR. Therefore, by combining insights from extant research on corporate responsibility in the digital realm, this article proposes distinguishing CDR into corporate digitized responsibility and corporate digitalized responsibility. Specifically, corporate digitized responsibility includes unbiased data acquisition, data protection, and data maintenance; corporate digitalized responsibility involves appropriate data interpretation, objective predicted results, and tackling value conflicts in data-driven decision-making. Moreover, we provide a valid measurement of CDR, and the findings demonstrate a positive relationship between CDR and corporate digital performance. Finally, this article offers suggestions for managers on how to tackle CDR issues and utilize digital technologies appropriately.

1. Introduction

In the context of the continuous development of digital technology, corporate digital responsibility (CDR) is becoming a crucial issue for both practitioners and researchers. From a managerial perspective, digital risks have surfaced alongside the application of digital technology [1]. These risks derived from digital technology can not only damage a company’s reputation but also hinder the growth of the digital industry [2]. This is particularly relevant to digital-native companies because they are pioneers in processing and using cutting-edge digital technologies [3]. In recent years, incidents involving deficiencies in CDR have occurred repeatedly. For example, with the help of big data acquisition technology, Facebook affected the 2016 US presidential election. Another example is the now widespread phenomenon of travel applications offering different prices for the same travel product or service to different consumers (i.e., digital price discrimination).
CDR is also important from an academic point of view. As early as the 1940s, MIT professor Norbert Wiener [4] was already thinking about computer ethics and inaugurated it as a field of scholarly research. Over the decades that followed, there has been an endless stream of research on computer ethics [5]. In recent years, with the rise of digital technology, the emerging literature on corporate responsibility in the digital realm has also become deeply concerned with ethical challenges [6,7]. These studies indicate that the challenges of corporate digital ethics are a significant issue in the digital age. However, the concept of CDR was not formally proposed until 2015, by Cooper, Siu, and Wei [8]. Since then, more and more scholars have studied the concept of CDR. For example, some studies argue that CDR is an extension of CSR [9,10]; in other words, CDR should be studied on the basis of CSR. Other scholars reckon that CDR should be considered separately because it focuses explicitly on creating and using digital technology [11,12]. Unfortunately, research on CDR is still in its infancy; there is no consensus on how to define, conceptualize, and assess it. Moreover, there is little research on the influence of CDR on enterprise development [13].
Hence, our objectives are twofold: to develop a new conceptualization of CDR with respect to data-related practices from a digital technology development perspective, building on the previous literature, and to create measures of CDR that test its importance for corporate digital performance. We focus on data because data, as a production factor, are essential for creating value in the digital realm, yet we still lack a detailed and structured conceptualization of CDR with regard to ethical and responsible data-related practices.
Despite the practical and academic importance of this issue, previous attempts to conceptualize CDR have had limited success. Some scholars in the information management/operations management fields have tended to illustrate corporate digital responsibilities through concrete forms of digitalization such as artificial intelligence, machine learning, robotics, etc. [10,14,15]. These research streams focus on the corporate responsibility issues that occur in a specific digital technology scenario (e.g., online banking services, business digital transformation) and neglect the dynamic function of digital technology. However, corporate responsibility in the digital realm evolves along with the development of digital technology [16]. Through a literature review, we found that the influence of digital technology development on corporate responsibility is mainly reflected in the following three aspects: (1) The first function of digital technology (i.e., the descriptive function) stems from data storage and processing technologies, which help organizations reflect reality [17]. For example, the sales databases of online shopping platforms capture information about customers and their purchases [18]. The ethical issues that arise in this process, such as those related to privacy and fair information practices, need to be considered by enterprises. (2) The second function, the normalized function of digital technology, came into being with the development of big data technology. It means using digital technology to convert information in different formats into a unified format, allowing companies to perceive the world more efficiently. For instance, computer programs and the internet have made real-time transmission of information possible, which may result in ethical issues such as data selling and computer crime [19]. (3) The third function (i.e., the shaped function of digital technology) has emerged with the development of new digital technologies such as AI, cloud computing, and the IoT. This function is exhibited through the reshaping capability of digital technology to change external physical conditions [20]. For example, AI imitates human thinking and 3D printing changes mechanical design and manufacturing [21]. These new digital technologies introduce new uncertainties into the world and can sometimes even substitute for human decision-making.
Considering that corporate responsibility in the digital realm evolves along with digital technology development, we conceptualize CDR by dividing it into corporate digitized responsibility and corporate digitalized responsibility. On the one hand, corporate digitized responsibility refers to the ethical issues enterprises face in the digitization stage. Specifically, we map corporate digitized responsibility onto the phases of a firm’s digitization process (i.e., creating, transferring, storing, and analyzing digital data) [16]. In this stage, the main CDR issues include unbiased data acquisition, data protection, and data maintenance, which stem from the descriptive and normalized functions of digital technology. On the other hand, corporate digitalized responsibility represents the ethical issues enterprises face in the digitalization stage. By classifying the digitalization stages, we embed corporate digitalized responsibility into three stages (i.e., the descriptive stage, the predictive stage, and the prescriptive stage) of the application of digital technologies [22]. The key CDR issues in this stage are appropriate data interpretation, objective predicted results, and tackling value conflicts in data-driven decision-making, which are related to the shaped function of digital technology.
Our study contributes to the existing literature in the following ways: first, we illustrate CDR from the perspective of the dynamic process of digital technology development, especially in terms of data-related practices. Our research combines ethical issues related to the descriptive, normalized, and shaped functions of digital technology and develops a general and accurate concept of CDR to explore the intrinsic factors in digital technology development leading to CDR. Second, as a reflection of the process of digital technology development, digitization and digitalization are adopted to describe CDR issues in a holistic manner. Third, to address the lack of empirical research on this topic, we create a series of measurements of CDR. We put forward 15 CDR indicators corresponding with the digitization and digitalization stages. Furthermore, we use corporate digital performance to test the importance of CDR.
We organize the structure of this study as follows: Section 2 reviews the existing literature on corporate responsibility in the digital realm and CDR; Section 3 proposes a new conceptualization of CDR; Section 4 introduces the methodology; Section 5 presents the empirical results; and Section 6 discusses the conclusions, managerial implications, and limitations.

2. Literature Review

To deal with digital ethics and digital challenges, researchers have continuously introduced concepts such as computer ethics [23], information ethics [24], robot ethics [15], digital ethics [25], and the ethics of emerging information and communication technology [26] (see Table 1 for details), most of which are important concepts with specific definitions and meanings related to the development of digital technology. We summarize them as corporate responsibility in the digital realm. The newly proposed concept of corporate digital responsibility (CDR) is probably the most accurate, general, and timely concept in today’s digital environment [11], because it encapsulates the different ethical issues that companies face in the development of digital technology. Thus, we reviewed the literature related to this key concept (see Table 2 for details); our main observations are summarized below.

2.1. Corporate Responsibility in the Digital Realm

This subsection discusses concepts related to corporate responsibility in the digital realm. These concepts concern corporate responsibility in the context of digital technology development. Hence, we classify them according to the three digital technology functions (i.e., the descriptive, normalized, and shaped functions) produced by the development of digital technology. The first category, including big data ethics, digital ethics, and information ethics, represents ethical issues related to the descriptive function of digital technology. The second category (i.e., computer ethics) is related to ethical issues arising from the normalized function of digital technology. The third category, comprising robot, ML, and AI ethics, concerns ethical issues caused by the shaped function of digital technology.

2.1.1. Ethical Issues Related to the Descriptive Function of Digital Technology

The descriptive function refers to collecting and combining data to form results that reflect reality. Under this function of digital technology, ethical issues include digital privacy, digital security, biased data collection, etc. [2,25,27]. For example, Kshetri [2] argued that consumer welfare prejudice is the most important digital ethical issue in collecting consumer data through big data technology, and pointed out that consumers’ decisions to withhold information may hinder the development of modern society. Zwitter [27] explored big data ethics in terms of the “privacy”, “propensity”, and “research ethics” aspects of digital technology. Likewise, Capurro [25] addressed digital privacy from a broader perspective and described the privacy issues that arise when digital technology is used to obtain data, such as information overload, the digital divide, and robotics.

2.1.2. Ethical Issues Related to the Normalized Function of Digital Technology

The normalized function means that companies use digital technology to convert information in different formats into a unified format, allowing them to perceive the world more efficiently. For example, trading and communication efficiency have been greatly improved by e-business and instant messaging software [35]. However, ethical issues such as computer crime and intellectual property infringement arise during this normalization of the real world [28]. A vivid example is that the normalized function of digital technology makes standardization possible, allowing repetitive work to be easily replaced by computers [28]; for instance, junior accountants are increasingly being replaced by computerized accounting software. These studies describe corporate responsibilities related to the normalized function, such as unbiased data acquisition and the protection of human rights in a digital society.

2.1.3. Ethical Issues Related to the Shaped Function of Digital Technology

The shaped function of digital technology is exhibited through its reshaping capability to change external physical conditions. With the development of the new generation of digital technologies (e.g., AI, ML, and 3D printing), the shaped function of digital technology has been enhanced. For example, digital navigation algorithms decide where we go, and video recommendation algorithms determine which videos we watch. Behind these “accurate” recommendations, however, lies a loss of discretionary power [36]. To better address these problems, the extant literature places emphasis on how to present objective results and how to tackle value conflicts between humans and robots/machines [6,15,29,30]. For instance, Asaro [29] argues that robot ethics should address three things: the ethical systems built into robots; the ethics of the people who design and use robots; and the ethics of how people treat robots.

2.2. Corporate Digital Responsibility

Current studies on CDR can be divided into two research streams according to the relationship between CDR and CSR. One stream of literature argues that CDR is an extension of CSR. For instance, Herden et al. [9] propose that CDR is an extension of CSR that takes into account the ethical opportunities and challenges of digitalization. In line with the CSR pyramid, they constructed a CDR pyramid model with four levels and illustrated how companies should shoulder different levels of digital responsibility. Likewise, Thelisson, Morin, and Rochel [10] and Wade [34] regarded CDR as a kind of digital CSR. They mainly elaborated CDR issues in terms of security, autonomy, and privacy; respecting equality; dealing with data; dealing with algorithms; taking the impact on the environment into account; and ensuring a fair transition. In addition, some other CDR literature focuses on the new digital technology challenges that companies are facing. For example, Suchacka [32] explored the impact of AI on the labor force and business operations.
The other stream of literature prefers to regard CDR as an independent concept. For example, Lobschat et al. [11] emphasize that CDR should be considered explicitly and separately from CSR, because CSR focuses on the broader social impacts of firms whereas CDR focuses explicitly on creating and using digital technology. They defined CDR as a set of shared values and norms guiding an organization’s operations. Furthermore, they built a framework for CDR involving four stakeholders and four key stages: creation of technology and data capture; operations and decision-making; inspection and impact assessment; and refinement of technology. In addition, Mihale-Wilson et al. [12] proposed that CDR addresses challenges to organizations’ ethical behavior that are unique to the digital world and go far beyond CSR. They suggested ranking the various CDR dimensions by using consumers’ valuation of CDR norms and implementations. Wirtz et al. [33] further suggest that digital technology characteristics such as malleability are the reasons why CDR should be considered separately from CSR. They provide insights into the causes of CDR issues from the perspective of service organizations and propose a set of strategies and tools to address these issues. Elliott et al. [14] consider CDR a voluntary commitment by companies to address the social, economic, and ecological impacts of digital technology. They further illustrate CDR issues across these three areas through principles such as purpose and trust, fair and equitable access for all of society, investing in the new eco-economy, promoting a sustainable planet, promoting societal wellbeing, promoting economic transparency, and reducing technology’s impact on the environment.
To sum up the above two research streams, and in line with Mihale-Wilson [37], we believe that the two schools should be combined. On the one hand, CDR is an important part of CSR; on the other hand, in the context of the development of digital technology, CDR has its unique features.
Scholars have explained the connotation of CDR and offered some insightful perspectives (e.g., Herden et al. [9]; Lobschat et al. [11]). However, several limitations remain. First, many studies have highlighted CDR issues by focusing on data protection while neglecting the role of digital technology. Second, many studies have ignored the dynamic development of digital technology; most research has explored CDR topics in a static digital technology scenario. Third, prior studies lack effective measurements and evaluation methods for CDR from the perspective of digital technology development. Given these shortcomings and the importance of the impact of digital technology development on corporate responsibility, we propose a framework for understanding CDR from both the digitization and digitalization perspectives, as the two phases of digital technology development, to identify CDR issues. In addition, we create a series of measurements of CDR to test the importance of CDR in terms of corporate digital performance.

3. Conceptualization of CDR from a Digital Technology Development Perspective

In the literature review section, we classified ethical issues according to the functions of digital technology (i.e., descriptive function, normalized function, and shaped function) arising from the development of digital technology. To summarize, it can be seen that digital ethical issues occur at different stages of digital technology development. Additionally, there are still limitations of current CDR studies, such as focusing solely on the corporate responsibility issues that occur in a specific digital technology scenario. Therefore, we propose to conceptualize CDR from the perspective of digital technology development. To better depict CDR in the process of the development of digital technology, we believe that analyzing CDR from the perspectives of digitization and digitalization is essential. According to Ritter and Pedersen [38], digital firms mainly focus on dealing with digitizing capabilities (digitization) and digital value propositions (digitalization). According to the definition put forward by Yoo, Henfridsson, and Lyytinen [39], digitization is the encoding of analog information into a digital format. Similar to digitization, the descriptive function and normalized function of digital technology focus on data storage, processing, and conversion [17,19]. Digitalization emphasizes the application of digital technology [38]. It reflects the shaped function of digital technology to reshape external physical conditions [20]. Hence, these two stages correspond to the three functions of digital technology and are crucial to organizations in implementing a digital strategy. Exploring digital ethics from the perspectives of digitization and digitalization can help us tackle CDR issues and clarify the meanings and characteristics of CDR. We adopt corporate digitized responsibility and corporate digitalized responsibility as two frames for exploring the content of CDR.

3.1. Corporate Digitized Responsibility

In line with Brennen and Kreiss [16], digitization is the process of transformation from analog to digital data (i.e., creating, transferring, storing, and analyzing). In this series of processes of transformation, companies face many ethical challenges, such as those involving fair information practices, privacy, and data protection. Hence, corresponding to the stages of the digitization process, we propose three main components of corporate digitized responsibility: (1) unbiased data acquisition; (2) data protection; and (3) data maintenance.

3.1.1. Unbiased Data Acquisition

In the data creation stage, unbiased data acquisition means acquiring data of sufficient size, covering all possible variations, and free of discrimination in terms of gender, race, age, etc. [40]. Although there is a dilemma between open data and data ownership, organizations still have a responsibility to strike the correct balance to allow for data that are sufficiently comprehensive to represent reality [41]. Various sources of data, such as crowdsourcing, “data donors”, and the “quantified self” movement (where citizens share data through mobile device-connected technologies), are shared with organizations [42,43]. Avoiding data discrimination is crucial. For instance, if faces with light skin tones greatly outnumber faces with dark skin tones in the original data for a facial recognition system, the system will inevitably perform badly in recognizing people with darker skin tones. Consequently, unbiased data acquisition is a primary responsibility of the company.
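As a concrete illustration of how a firm might check collected data for the kind of sampling skew described above, the following minimal sketch audits a dataset for demographic balance. The column name, the example data, and the tolerance threshold are illustrative assumptions, not part of the original study.

```python
# Minimal sketch: auditing a collected dataset for demographic balance before use.
# The column name ("skin_tone") and the tolerance threshold are illustrative assumptions.
import pandas as pd

def audit_balance(df: pd.DataFrame, attribute: str, tolerance: float = 0.10) -> bool:
    """Return True if every group's share is within `tolerance` of an even split."""
    shares = df[attribute].value_counts(normalize=True)
    expected = 1.0 / len(shares)                      # even split across observed groups
    imbalanced = shares[(shares - expected).abs() > tolerance]
    if not imbalanced.empty:
        print(f"Imbalanced groups in '{attribute}':\n{imbalanced}")
        return False
    return True

faces = pd.DataFrame({"skin_tone": ["light"] * 80 + ["dark"] * 20})
audit_balance(faces, "skin_tone")                     # flags the 80/20 skew
```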

3.1.2. Data Protection

Data protection is the duty of every company, especially in the data transfer stage. Data protection has two components: data privacy and data security. Data privacy is related to the proper handling of data, which must be authorized [42]. Due to the complexity of original data, especially unstructured data such as emails and blogs, which may contain personally identifiable information and intellectual property [44], companies must have the permission of the data owners before transferring data. In fact, many pieces of legislation, such as “The European Union’s General Data Protection Regulation, 2018”, have already been passed to protect individual digital privacy. Companies that ignore digital privacy will lose the public’s trust. Data security—a concept distinct from data privacy—means protecting data from compromise by external attackers and malicious insiders [45]. In reality, threats such as malware, intrusions, and accidental or intentional data loss occur occasionally. Companies have a responsibility to ensure data security through data governance and technology improvements.
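To make the data-privacy side of this duty more tangible, the sketch below pseudonymizes personally identifiable fields before a record is transferred. The field list, salt handling, and truncated digests are simplifying assumptions for illustration, not a complete security design.

```python
# Minimal sketch: pseudonymizing personally identifiable fields before data transfer.
# The field list and salt handling are illustrative assumptions, not a full security design.
import hashlib

PII_FIELDS = {"name", "email", "home_address"}

def pseudonymize(record: dict, salt: str) -> dict:
    """Replace PII values with salted SHA-256 digests; leave other fields unchanged."""
    out = {}
    for key, value in record.items():
        if key in PII_FIELDS:
            digest = hashlib.sha256((salt + str(value)).encode("utf-8")).hexdigest()
            out[key] = digest[:16]                    # truncated digest as a stable pseudonym
        else:
            out[key] = value
    return out

record = {"name": "Alice", "email": "a@example.com", "purchase": "laptop"}
print(pseudonymize(record, salt="rotate-this-salt-periodically"))
```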

3.1.3. Data Maintenance

Data maintenance describes ongoing correction and verification—the process of continual improvement and regular checks. Properly maintaining and caring for data is essential to ensuring that data remain accessible and usable during the data-storage period. Another reason for data maintenance is that high-quality data are vital to companies for value creation. Poor-quality data may lead to bad consequences, such as wrong decision-making. There are various reasons behind poor data quality, such as faulty data-collection methods, failures to update data, missing records, etc. [45]. By using proper data maintenance techniques, it is possible to avoid these mistakes to a great extent. Furthermore, keeping data quality at a high level is the prerequisite for getting robust digital analysis results [6]. Therefore, maintaining high data quality through continuous updates and periodic cleaning is one of the corporate digitized responsibilities.
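The regular checks described above can be operationalized as a small periodic audit. The sketch below reports missing values, duplicates, and stale records; the column names, staleness threshold, and example data are assumptions made for illustration.

```python
# Minimal sketch: a periodic data-quality audit covering missing values, duplicates,
# and stale records. Column names, thresholds, and naive timestamps are assumptions.
import pandas as pd

def quality_report(df: pd.DataFrame, updated_col: str = "last_updated",
                   max_age_days: int = 365) -> dict:
    age_days = (pd.Timestamp.now() - pd.to_datetime(df[updated_col])).dt.days
    return {
        "missing_ratio": df.isna().mean().round(3).to_dict(),   # per-column missingness
        "duplicate_rows": int(df.duplicated().sum()),
        "stale_rows": int((age_days > max_age_days).sum()),     # not refreshed within a year
    }

customers = pd.DataFrame({
    "id": [1, 2, 2, 4],
    "email": ["a@x.com", None, None, "d@x.com"],
    "last_updated": ["2023-01-05", "2020-06-01", "2020-06-01", "2023-01-10"],
})
print(quality_report(customers))
```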

3.2. Corporate Digitalized Responsibility

Digitalization is the application of digital technology [38]. Essentially, it is a way to express value propositions through digital methods. Firms are devoted to data mining and analysis to provide better digital value propositions [46]. Building on the digitalization concept and the three key types of big data analytical methods, we categorize digitalization into three stages: the descriptive stage, the predictive stage, and the prescriptive stage [22]. In these three stages, many CDR issues such as algorithmic regulation and human–computer interaction should be considered by companies. Thus, we identified three key components of corporate digitalized responsibility: (1) appropriate data interpretation; (2) objective predicted results; and (3) tackling value conflicts in data-driven decision-making.

3.2.1. Appropriate Data Interpretation

The descriptive stage involves the summarization and description of knowledge through all kinds of statistical methods [47] and the presentation of one or more processes across time using digital technologies such as dashboards and data visualization [48]. The purpose of appropriate data interpretation is to help enterprises make sense of the numerical data that have been collected, analyzed, and presented. Thus, implementing data interpretation reasonably and appropriately is vital in the descriptive stage. On the one hand, selecting an appropriate data analysis technology means avoiding certain data interpretation problems, such as mistaking correlation for causation, confirmation bias, and irrelevant data [49]. No matter what statistical methods (e.g., mean, median, mode, standard deviation, or variance) are adopted, companies have a responsibility to read facts from figures to ensure a reasonable and objective interpretation. On the other hand, choosing an appropriate technology to present data or data processes accurately is also important. As the graphic representation of data, data visualization contributes a great deal by producing images that communicate the relationships among the data to viewers, making it possible for audiences to understand data more intuitively, clearly, and easily. Applications such as dashboards using visualization methods (e.g., heat maps, bubble clouds) can be useful for understanding how a set of data compares with other data or with itself across different points in time [50]. For instance, COVID-19 data visualizations help viewers form an impression of the diffusion of the virus quickly and directly.
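The descriptive work discussed above can be supported with very simple tooling. The sketch below, using hypothetical columns and figures, produces a summary table and a correlation matrix for internal review before anything is visualized or published; the inline comment is a reminder that correlation alone does not establish causation.

```python
# Minimal sketch: descriptive summary plus a correlation matrix for internal review
# before results are visualized or published. Columns and figures are hypothetical.
import pandas as pd

sales = pd.DataFrame({
    "ad_spend": [10, 12, 15, 18, 22, 25],
    "revenue":  [100, 108, 120, 135, 150, 158],
})

summary = sales.describe()     # mean, std, quartiles per column
corr = sales.corr()            # measures association only, not causation

print(summary)
print(corr)
```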

3.2.2. Objective Predicted Results

The predictive stage is concerned with forecasting and statistical modeling to determine future possibilities [47,51,52]. When companies implement predictive analysis technology, they must concern themselves with how to guarantee that their predicted results are objective. Objective predicted results depend on both subjective and objective efforts. On the subjective side, in environments characterized by intense competition [53], some companies create statistical models through specific digital technologies to serve particular purposes. For example, in the banking industry, agencies may choose machine-learning technology that leads to biased investment predictions in order to maximize profits [6]. Obviously, such predictions are made for business reasons other than fairness. On the objective side, there may be unintentional biases (existing prejudices) present in algorithmic technology that influence the predictive result [54,55]. For example, Amazon had to scrap an automated recruiting tool that was shown to discriminate unfairly against female candidates because the algorithm had been trained on historical patterns in which men were primarily those hired [56]. To better fulfill corporate digitalized responsibility, firms should carry out digital forecasting responsibly and foster research on better algorithms.
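One way to catch the kind of unintentional bias described above is a routine group-level check on model outputs. The sketch below, with invented data and the informal “four-fifths” rule of thumb as assumed choices, compares recommendation rates across a protected attribute.

```python
# Minimal sketch: comparing selection rates of model recommendations across a
# protected attribute. The data and the 0.8 ("four-fifths") threshold are illustrative.
import pandas as pd

preds = pd.DataFrame({
    "gender":      ["F", "F", "F", "M", "M", "M", "M", "M"],
    "recommended": [0,    0,   1,   1,   1,   0,   1,   1],
})

rates = preds.groupby("gender")["recommended"].mean()
ratio = rates.min() / rates.max()          # disparate-impact style ratio
print(rates)
if ratio < 0.8:
    print(f"Warning: selection-rate ratio {ratio:.2f} suggests biased data or a biased model.")
```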

3.2.3. Tackling Value Conflicts in Data-Driven Decision-Making

The prescriptive stage focuses on optimization and randomized testing to assess how businesses can enhance their service levels while decreasing expenses [51]. During this stage, when digital technology is powerful enough to manipulate the world that we see, value conflicts arise [57]. To some extent, digital techniques such as algorithms or AI can control our autonomous value judgments and deprive us of individual cognition [58]. Nowadays, consumers are used to enjoying the algorithmic recommendation services provided by digital platforms, such as reading books through Amazon and driving with Google Maps [21]. Furthermore, a value conflict regarding “who takes digital responsibility?” has arisen. For example, the emergence of self-driving cars has given rise to a discussion about accident liability judgments, as both humans and automated cars are participants in the driving activity [15,59]. Moreover, some digital techniques can help you decide which university you should apply to based on accurate matching (e.g., ranking and reputation), meaning that your personal preferences may be the least important consideration [21]. In addition, because data-driven decision-making technology is often more efficient and convenient, companies will eliminate many jobs [60], and human beings will face greater occupational pressures in the digital era [61]. The role of the citizen may ultimately be redefined; for example, robots will replace most repetitive tasks and perform most service work [62].

4. Methods

In order to test our proposed framework, we created a series of measurements of corporate digitized responsibility and corporate digitalized responsibility. We then used corporate digital performance to test the importance of CDR. This empirical effort enriches the theoretical research on CDR and fills the gap in the literature with respect to empirical evidence in CDR research.

4.1. Sample and Data

We collected information about enterprises listed on the Shanghai and Shenzhen stock exchanges, the two main stock markets in China. Among these enterprises, we selected high-technology enterprises because they have more experience in digitization and digitalization. First, we contacted the Departments/Bureaus of Industry and Information of Zhejiang Province, Guangdong Province, and Shanghai because these three regions are relatively developed areas of China with more high-tech enterprises. With their help, we got in touch with our target listed companies in these provinces and cities. In the end, 352 enterprises expressed a preliminary intention to cooperate with us, and we started the first round of the survey by email.
First, we sent invitation emails to the executives in charge of digital business in the 352 potential respondent companies. After receiving emails about recommended respondents over the following two weeks, we sent out questionnaires to these respondents via email. We re-sent the questionnaires two weeks after the first mailing to those who did not respond, and we contacted those who had not replied a month after the first mailing to interview them by telephone. The survey took about six months, from May to October 2019. The participation rate was approximately 73% of the respondents contacted (256 out of 352). However, 54 respondents started but did not complete the questionnaire, leaving a final valid sample of 202 respondents. The mean time to answer the questionnaire was 14.2 min, slightly less than the 15 min indicated in the pretest. Digital or information managers accounted for 65% (131) of the respondents, and general managers accounted for 35% (71).
This response rate is satisfactory for a multi-region survey of listed companies [63]. We checked for non-response bias by comparing the founding time, type of ownership, and size of responding and non-responding companies. The results showed that, in terms of these characteristics, the non-responding companies were not statistically different from the responding companies, indicating that our sample can represent the population. Moreover, to check for common method bias, we used Harman’s single-factor test [64]. No single factor accounted for the majority of the variance, indicating that common method bias was not an issue in our data. Therefore, we have confidence in the validity and reliability of our study.
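For readers unfamiliar with Harman’s single-factor test, the following sketch illustrates the idea using PCA as a common proxy on placeholder data: if one unrotated factor captured the majority of the variance across all items, common method bias would be a concern. The 202 x 15 random matrix and the 50% cut-off stand in for the real survey data and the usual informal threshold.

```python
# Minimal sketch of Harman's single-factor test (PCA used as a common proxy).
# The random 202 x 15 matrix is a placeholder for the actual respondent-by-item data.
import numpy as np
from sklearn.decomposition import PCA

rng = np.random.default_rng(0)
items = rng.normal(size=(202, 15))                 # 202 respondents, 15 CDR items

share = PCA().fit(items).explained_variance_ratio_[0]
print(f"First factor explains {share:.1%} of total variance")
if share > 0.5:                                    # informal rule of thumb
    print("Warning: one factor dominates; common method bias may be an issue.")
```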
The empirical study focused on firms with different founding times, ownership, and sizes, as shown in Table 3. Specifically, most of the firms (167, or 82.7%) had been established for more than ten years, whereas 17 firms and 18 firms had been established for five to ten years and for less than five years, respectively. In terms of ownership, most of the firms (129, or 63.9%) were private, 51 were state-owned, and 22 were of other types. About half of the firms (103, or 51.0%) were small (fewer than 300 employees), whereas 58 firms had between 300 and 1000 employees and 41 had more than 1000. Finally, 71 firms were from the software industry, 63 from the electrical equipment industry, 43 from the computer industry, 21 from the pharmaceuticals industry, and 4 from other industries.

4.2. Questionnaire

In line with Chetty, Johanson, and Martín [65], the questionnaire content and design were pretested for face validity in two stages. First, five digital strategy experts reviewed an initial draft. Then, after minor modifications, a revised draft was tested on 50 firms (randomly selected from the sample) through personal interviews with the executives in charge of digital business. The purpose of the test was to find problems in the questionnaire, such as logical problems, inappropriate wording, or ambiguities. According to the test results, we adjusted and improved the questionnaire. Finally, after two rounds of testing, the final version was composed and presented to the respondents. It included 15 questions distributed across six blocks corresponding to the six measures of CDR, each rated on a five-point interval scale. Given the standardized procedures and content, the questionnaire was assumed to meet the criteria of validity and reliability.

4.3. Measures

4.3.1. Measurement of CDR

The constructs and measures are shown in Table 4. First, we explored CDR in terms of digitization and digitalization, as they are two important phases of digital technology development. According to Ritter [66], digitization and digitalization take place under different scenarios in a firm’s digital operations: the former is related to digitizing capabilities, whereas the latter is concerned with digital value propositions. Hence, embedding CDR into these two phases of a firm’s digital operations to demonstrate the various conceptualizations of key constructs can help clarify the meaning of CDR. We applied the four decision rules developed by Jarvis, Mackenzie, and Podsakoff [67] to judge the measurement model of CDR and build a structural model of CDR and firms’ digital performance.
Corporate digitized responsibility concerns CDR issues in the process of encoding analog information into a digital format. This process covers data creation, transfer, storage, and analysis [16]. Correspondingly, corporate digitized responsibility is reflected in unbiased data acquisition, data protection, and data maintenance. Unbiased data acquisition is reflected in collecting sufficiently sized and nondiscriminatory data, including indicators such as the normality of encoding analog information into a digital format, data addressability, and data programmability [40]. Data protection is reflected in data privacy and data security, including indicators such as the publicity of data sources, anonymized data, and data protection systems [42,45]. Data maintenance is the process of continual data improvement and regular checks; it is reflected in indicators such as updating the database regularly and following strict rules for data storage and utilization [45]. Hence, we measure corporate digitized responsibility through these three measures and their eight indicators.
Corporate digitalized responsibility refers to CDR issues arising during a company’s application of digital technology. As proposed by Sivarajah et al. [22], digitalization can be categorized into three functions: descriptive, predictive, and prescriptive. Accordingly, we use appropriate data interpretation, objective predicted results, and tackling value conflicts in data-driven decision-making, respectively, as measures [49]. Appropriate data interpretation is reflected by three indicators, i.e., data communicability, data traceability, and data associability. We measured objective predicted results by whether a firm uses processed data to promote business analysis and/or guide operational decision-making [35,68]. For the third measure, we chose a firm’s human concern in data-driven business operations and/or its willingness to integrate data-driven analytics and social value in business decisions as indicators [57]. In sum, we used three measures and seven indicators to measure corporate digitalized responsibility.
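To make the measurement structure above easier to follow, the sketch below lays it out as a nested mapping of the two dimensions, six measures, and 15 indicators. The indicator identifiers are paraphrases of the text, not the exact questionnaire wording.

```python
# Minimal sketch: the CDR measurement structure as a nested mapping.
# Indicator names paraphrase the text; the actual questionnaire wording differs.
CDR_MEASURES = {
    "corporate_digitized_responsibility": {
        "unbiased_data_acquisition": ["encoding_normality", "data_addressability",
                                      "data_programmability"],
        "data_protection": ["data_source_publicity", "data_anonymization",
                            "data_protection_system"],
        "data_maintenance": ["regular_database_updates", "strict_storage_and_use_rules"],
    },
    "corporate_digitalized_responsibility": {
        "appropriate_data_interpretation": ["data_communicability", "data_traceability",
                                            "data_associability"],
        "objective_predicted_results": ["data_driven_business_analysis",
                                        "data_guided_operational_decisions"],
        "tackling_value_conflicts": ["human_concern_in_operations",
                                     "integration_of_analytics_and_social_value"],
    },
}

n_indicators = sum(len(v) for dim in CDR_MEASURES.values() for v in dim.values())
assert n_indicators == 15                           # matches the 15 questionnaire items
```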

4.3.2. Measurement of Firms’ Digital Performance

In terms of the measurement of firms’ digital performance, we referred to the studies of Eller et al. [69] and Zhang et al. [70] and collected data via questionnaire. More specifically, digital performance was measured with a five-point Likert scale ranging from 1 “strongly disagree” to 5 “strongly agree” (see Appendix A, Table A1). This scale measures the perception of executives in charge of digital business regarding the digital performance of their companies.

4.3.3. Data Analysis Technique

We estimated the model using SPSS 24.0 and SmartPLS 3.0, the latter implementing variance-based structural equation modeling (PLS-SEM) [71]. SPSS was used to perform an exploratory factor analysis of the CDR indicators for both corporate digitized responsibility and corporate digitalized responsibility. PLS-SEM was chosen because it was the most suitable data analysis technique for this study in terms of the research objectives and the nature of the study (the conceptualization of CDR, its two dimensions, and its novelty). According to Chin and Newsted [72] and Ringle, Wende, and Becker [73], this method requires neither the reproduction of the covariance or correlation matrix nor normally distributed data. It treats latent variables as linear combinations of their indicators, so the factor scores can be explicitly estimated.

5. Results

5.1. Exploratory Factor Analysis

We examined the relevance and reliability of the underlying variables of the 15 measures of CDR in the questionnaire through exploratory factor analysis (EFA). As shown in Table 5 and Table 6, the results show that the 15 indicators represent six categories of CDR indicators across digitization and digitalization.
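The original analysis was run in SPSS; as an assumed, library-neutral illustration of the same step, the sketch below extracts six rotated factors from a 15-indicator matrix with scikit-learn, using random placeholder data in place of the survey responses.

```python
# Minimal sketch of an EFA extracting six rotated factors from 15 indicators.
# Random placeholder data stand in for the 202 survey responses (SPSS was used originally).
import numpy as np
from sklearn.decomposition import FactorAnalysis

rng = np.random.default_rng(1)
items = rng.normal(size=(202, 15))                  # respondents x indicators

efa = FactorAnalysis(n_components=6, rotation="varimax").fit(items)
loadings = efa.components_.T                        # 15 x 6 loading matrix
print(np.round(loadings, 2))
```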

5.2. Measurement Model

Regarding the measurement model, all scales were assessed in terms of reliability and convergent validity. First, as shown in Table 7, all item loadings were significantly above the suggested acceptance threshold of 0.7, except for one item of the corporate digitized responsibility construct and two items of the corporate digitalized responsibility construct. Because these constructs had excellent loadings in general, and their average variance extracted was over the recommended thresholds (as we explain below), we retained these items in the model; in some specific situations, loadings between 0.5 and 0.7 are acceptable [74,75]. Second, according to Fornell and Larcker [76], the average variance extracted should be higher than 0.5; the results in Table 7 show that all values were above 0.5, so convergent validity was satisfied. Third, Cronbach’s Alpha was used to verify reliability. In line with Nunnally and Bernstein [77], Cronbach’s Alpha should be higher than 0.8, and our results were above this criterion. Fourth, as shown in Table 8, the weights for corporate digitized responsibility and corporate digitalized responsibility were significant (β = 0.650, p < 0.001 and β = 0.442, p < 0.001, respectively), meaning the former made a higher contribution to CDR than the latter. Furthermore, we tested for multicollinearity and found that the highest variance inflation factor was 1.786, indicating that multicollinearity is not a concern [78]. Consequently, we can accept this measure as a valuable instrument built from reliable and valid constructs.
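For reference, two of the reliability and validity statistics relied on above can be computed as follows. This is a minimal sketch with generic, synthetic inputs rather than the study's actual data.

```python
# Minimal sketch of two checks used above: Cronbach's alpha from item scores and
# average variance extracted (AVE) from standardized loadings. Inputs are synthetic.
import numpy as np

def cronbach_alpha(items: np.ndarray) -> float:
    """items: respondents x items matrix for a single construct."""
    k = items.shape[1]
    item_var = items.var(axis=0, ddof=1).sum()
    total_var = items.sum(axis=1).var(ddof=1)
    return k / (k - 1) * (1 - item_var / total_var)

def average_variance_extracted(loadings: np.ndarray) -> float:
    """loadings: standardized outer loadings of one construct's indicators."""
    return float(np.mean(np.square(loadings)))

rng = np.random.default_rng(3)
base = rng.normal(size=(202, 1))
items = base + 0.5 * rng.normal(size=(202, 3))       # three correlated items, one construct
print(round(cronbach_alpha(items), 2))               # high alpha for internally consistent items
print(round(average_variance_extracted(np.array([0.82, 0.76, 0.71])), 2))  # 0.58 > 0.5
```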

5.3. Structural Equation Model

To further establish the reliability and validity of our multidimensional measurement, we examined the impact of CDR on firms’ digital business performance. The structural model demonstrates the relationship between CDR and firms’ digital performance, and this relationship was tested via the bootstrap method. The results in Figure 1 show that the effect of CDR on firms’ digital performance is significant (β = 0.743, p < 0.001). Moreover, the results indicate that the fit of the structural model is acceptable (R2 = 0.55 and SRMR = 0.078). This result improves the external validity of the measure of CDR.
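The bootstrap logic behind the path test can be illustrated with a simplified sketch: resample respondents with replacement, re-estimate a standardized one-predictor coefficient between composite scores, and read off a percentile interval. The synthetic composites below are stand-ins for the actual PLS construct scores, which were estimated in SmartPLS.

```python
# Minimal sketch of bootstrapping a single structural path between composite scores.
# Synthetic data stand in for the PLS construct scores; SmartPLS was used originally.
import numpy as np

rng = np.random.default_rng(2)
n = 202
cdr = rng.normal(size=n)
performance = 0.7 * cdr + rng.normal(scale=0.7, size=n)   # synthetic positive relationship

def std_beta(x: np.ndarray, y: np.ndarray) -> float:
    return float(np.corrcoef(x, y)[0, 1])   # standardized slope in a one-predictor model

boot = []
for _ in range(5000):
    idx = rng.integers(0, n, size=n)         # resample respondents with replacement
    boot.append(std_beta(cdr[idx], performance[idx]))

low, high = np.percentile(boot, [2.5, 97.5])
print(f"beta = {std_beta(cdr, performance):.3f}, 95% bootstrap CI [{low:.3f}, {high:.3f}]")
```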

6. Discussion

6.1. Findings

We created a comprehensive conceptualization of CDR which includes corporate digitized responsibility and corporate digitalized responsibility in the context of the development of digital technology. Based on the empirical analysis of 202 high-technology enterprises in China, we have provided three vital measures of corporate digitized responsibility and corporate digitalized responsibility, respectively, and have demonstrated that a positive relationship exists between CDR and corporate digital performance. These findings enrich the theoretical research on CDR and fill the gap with respect to empirical evidence in CDR research.
First, we described CDR from the perspective of the dynamic process of digital technology development. This process can be understood in terms of digitization and digitalization, and the findings show that the former is slightly more important than the latter (contribution of 0.650 vs. 0.442). This is consistent with previous studies (e.g., Lobschat et al. [11]; Thelisson, Morin, and Rochel [10]), which demonstrate the significance of elements such as data privacy, digital security, equal access, and data protection in CDR.
Second, according to the results of the EFA, in the digitization stage, we found that all three components (unbiased data acquisition, data protection, and data maintenance) make a significant contribution to the content of corporate digitized responsibility. The results confirm that ethical problems usually arise from biased data acquisition during firms’ data collection. Accessing data equitably and protecting data with digital technology are important for avoiding problems of corporate digital ethics. Although the existing literature has pointed out the significant position of privacy in the data transfer process [2,6,25], our findings further suggest that preventing malware, intrusions, and accidental or intentional data loss is core work in companies’ digital responsibilities. In addition, to maintain high data quality, firms should perform regular audits of their data; maintaining data quality at a high level helps firms obtain robust results from digital analysis and create value.
Third, in the digitalization stage, the results show that two measures, appropriate data interpretation and tackling value conflicts in data-driven decision-making, make a vital contribution to corporate digitalized responsibility. In terms of appropriate data interpretation, our findings support the argument by Attaran and Gunasekaran [49] that avoiding data interpretation problems is a company’s responsibility in digital times. Additionally, the results support the idea that it is crucial for companies to choose a suitable technique to describe data in a manner that helps audiences understand them. As for tackling value conflicts in data-driven decision-making, our results also indicate that treating value conflicts in data-driven decision-making appropriately is another important factor of corporate digitalized responsibility. However, the measure regarding objective predicted results is not significant. As the last step in digitalization, predicted results are the ultimate presentation of a series of preceding operations [53]; obtaining an objective predicted result therefore requires strict preconditions to be met, which may explain why this measure did not reach significance.
Finally, we examined the relationship between CDR and firms’ digital performance. The result is in line with many existing studies indicating that value-creation activities, including digital value creation and efficiency promotion, have a positive effect on corporate digital performance [38]. Furthermore, digital-related problems such as privacy disclosure and data discrimination arising from the incorrect use of digital technologies will have negative effects on a company’s performance and reputation [2,11,79]. Hence, the importance of CDR to companies’ development is obvious.

6.2. Theoretical Contributions

This article contributes to the existing literature in the following ways: first, we illustrate CDR from the perspective of the dynamic process of digital technology development. Our research combines ethical issues related to descriptive, normalized, and shaped functions of digital technology and develops a general and accurate concept of CDR to explore the intrinsic factors in digital technology development leading to CDR. Our findings go beyond the limitations of the extant literature [6,15,30], which has tended to treat CDR from a static digital technology point of view.
Second, we developed a multidimensional view of CDR. Most previous studies have focused on one aspect of CDR. For instance, digital ethics and information ethics are concerned with digital security and digital privacy issues; computer ethics examines human rights in human–computer interactions; and robot/ML/AI ethics places emphasis on how to present objective results and tackle conflicts between humans and these digital artifacts. We argue that CDR issues exist in the different stages of digitization and digitalization. To capture the nature of CDR, we propose that digitization and digitalization are its two components. Furthermore, we considered unbiased data acquisition, data protection, and data maintenance as the three main components of CDR in digitization. Similarly, we identified three key components of digitalization: appropriate data interpretation, objective predicted results, and tackling value conflicts in data-driven decision-making. Meanwhile, our study broadens the research directions in digital ethics.
Third, we created a reliable series of measurements of CDR. In this study, based on the comprehensive discussion of CDR in both the digitization and digitalization stages, we proposed 15 CDR indicators to measure the complicated content of CDR. The empirical results support our explanation of CDR from the perspective of digital technology development. Moreover, we associated CDR with firms’ digital performance and found a positive relationship between them [38]. Furthermore, we confirm that digitization better promotes a company’s digital performance than digitalization does. We provided implications for researchers exploring the relationships between different aspects of CDR and corporate digital performance.

6.3. Managerial Implications

Our results also offer some managerial implications. According to our empirical research, managers should place emphasis on CDR issues because CDR strongly influences a firm’s digital performance in the digital era. In particular, managers should pay attention to the ethical challenges that arise during the stages through which data pass as they move from analog to digital form. Specifically, in the data creation process, unbiased data acquisition should serve as one of the criteria for judging data quality. In practice, firms, especially digital firms, need to use digital technology to collect data as widely as possible while avoiding discrimination. For instance, managers should formulate strict data collection rules to avoid issues of privacy and fairness. In the data transfer process, managers should take preventive measures to ensure data protection, for example, by guarding against malware and intrusions as well as accidental or intentional data loss. They can also address issues of data ethics by defining data usage conditions. Specifically, companies can set conditions to hide consumers’ private information, such as birth dates or home addresses, or refuse to share detailed personal data from their systems with other parties. Another measure to protect data is to develop a data security system; for instance, companies can prevent data leakage through data detection. In addition, many value conflicts will need rethinking, and a great deal of business conduct should be regulated. At present, appropriate data interpretation and prediction are not just technical issues but also challenges of values. Companies need to develop appropriate tools such as data visualization to meet various audiences’ needs for understanding data. More importantly, an ethical company should remain objective in its data presentation; for example, companies should make customers feel digitally equitable by presenting objective consumption data. Furthermore, it is a company’s duty to give objective predicted results to audiences rather than only focus on its financial interests, especially when the true predicted results conflict with those interests. Finally, with the development of digital technology, the algorithmic era has arrived. As Mackenzie [80] observes, the capability to manipulate algorithms implies a capability to manipulate the world that we see. Value conflicts inevitably occur when AI and machine learning are prevalent. As shapers of digital culture [21,81], companies have a responsibility to guide the remolding of digital values.

6.4. Limitations

There are several limitations to our study. First, with the rapid development of digital technology, new CDR issues are continually cropping up. Although we try our best to identify CDR issues in the digital context, it is not possible for our conceptualization of CDR to cover all issues. Second, although our samples are drawn randomly, our data still only represent a specific region in China. More evidence from different regions or other countries will help to enhance the validity of the measurement and our research findings. Third, with the rapid development of digital technology, reality will, to some extent, be shaped by digital technologies [21]. CDR will become not only a management issue but also a topic relating to sociology, epistemology, and philosophy. Consequently, research topics such as digital culture should be explored in the future.

Author Contributions

Conceptualization, C.C. and M.Z.; methodology, C.C. and M.Z.; software, C.C.; validation, C.C. and M.Z.; formal analysis, C.C.; investigation, C.C. and M.Z.; resources, C.C.; data curation, C.C. and M.Z.; writing—original draft preparation, C.C. and M.Z.; writing—review and editing, M.Z.; visualization, C.C. and M.Z.; supervision, C.C. and M.Z.; project administration, C.C.; funding acquisition, C.C. All authors have read and agreed to the published version of the manuscript.

Funding

This research was funded by the National Natural Science Foundation of China (Grant number 42271179); the Teacher’s Professional Development Program of Zhejiang Province for Domestic Visitor in Universities of China (Grant number FX2021179); the Major Project of the National Social Science Fund of China (NSSFC) for investigating and interpreting the principles laid out at the Fifth Session of the 19th CPC Central Committee (Grant number 21ZDA013).

Institutional Review Board Statement

Not applicable.

Informed Consent Statement

Not applicable.

Data Availability Statement

The dataset generated and analyzed in this study is not publicly available but can be obtained from the corresponding author on reasonable request.

Acknowledgments

The authors would like to thank Yingying Zhang, Jiayang Hu and Hongming Xie for their assistance with sample acquisition. The authors, however, bear full responsibility for the paper.

Conflicts of Interest

We declare that we have no financial or personal relationships with other people or organizations that could inappropriately influence our work, and no professional or other personal interest of any nature or kind in any product, service, and/or company that could be construed as influencing the position presented in, or the review of, this manuscript.

Appendix A

Table A1. Questionnaire on a firm’s digital performance.
Digital performance items (rated from 1 = Strongly Disagree to 5 = Strongly Agree):
Q3-1. The quality of our digital solutions is superior compared to our competitors.
Q3-2. The features of our digital solutions are superior compared to our competitors.
Q3-3. The applications of our digital solutions are totally different from our competitors.
Q3-4. Our digital solutions are different from those of our competitors in terms of product platform.
Q3-5. Our new digital solutions are major improvements of existing products.
Q3-6. Some of our digital solutions are new to the market at the time of launching.
Q3-7. Our solutions have superior digital technology.
Q3-8. New digital technology is readily accepted in our organization.

References

1. Moorthy, J.; Lahiri, R.; Biswas, N.; Sanyal, D.; Ranjan, J.; Nanath, K.; Ghosh, P. Big data: Prospects and challenges. Vikalpa J. Decis. Mak. 2015, 40, 74–96.
2. Kshetri, N. Big data’s impact on privacy, security and consumer welfare. Telecommun. Policy 2014, 38, 1134–1145.
3. Bharadwaj, A.; El Sawy, O.A.; Pavlou, P.A.; Venkatraman, N. Digital business strategy: Toward a next generation of insights. MIS Q. 2013, 37, 471–482.
4. Wiener, N. Cybernetics: Or Control and Communication in the Animal and the Machine; Technology Press: Cambridge, MA, USA, 1948.
5. Bynum, T.W. Computer ethics: Its birth and its future. Ethics Inf. Technol. 2001, 3, 109–112.
6. Lee, I.; Shin, Y.J. Machine learning for enterprises: Applications, algorithm selection, and challenges. Bus. Horiz. 2020, 63, 157–170.
7. Van Doorn, J.; Mende, M.; Noble, S.M.; Hulland, J.; Ostrom, A.L.; Grewal, D.; Petersen, J.A. Domo arigato Mr. Roboto: Emergence of automated social presence in organizational frontlines and customers’ service experiences. J. Serv. Res. 2017, 20, 43–58.
8. Cooper, T.; Siu, J.; Wei, K. Corporate Digital Responsibility—Doing Well by Doing Good. 2015. Available online: https://www.criticaleye.com/inspiring/insights-servfile.cfm?id=4431 (accessed on 28 December 2022).
9. Herden, C.J.; Alliu, E.; Cakici, A.; Cormier, T.; Deguelle, C.; Gambhir, S.; Griffiths, C.; Gupta, S.; Kamani, S.R.; Kiratli, Y.S.; et al. “Corporate Digital Responsibility”: New corporate responsibilities in the digital age. Nachhalt. Manag. Forum 2021, 29, 13–29.
10. Thelisson, E.; Morin, J.H.; Rochel, J. AI governance: Digital responsibility as a building block: Towards an index of digital responsibility. Delphi 2019, 2, 167.
11. Lobschat, L.; Mueller, B.; Eggers, F.; Brandimarte, L.; Diefenbach, S.; Kroschke, M.; Wirtz, J. Corporate digital responsibility. J. Bus. Res. 2019, 122, 875–888.
12. Mihale-Wilson, C.A.; Zibuschka, J.; Carl, K.V.; Hinz, O. Corporate digital responsibility—Extended conceptualization and empirical assessment. In Proceedings of the 29th European Conference on Information Systems, Marrakech, Morocco, 14–16 June 2021.
13. Mueller, B. Corporate digital responsibility. Bus. Inf. Syst. Eng. 2022, 64, 689–700.
14. Elliott, K.; Price, R.; Shaw, P.; Spiliotopoulos, T.; Ng, M.; Coopamootoo, K.; van Moorsel, A. Towards an equitable digital society: Artificial Intelligence (AI) and corporate digital responsibility (CDR). Society 2021, 58, 1–10.
15. Lin, P.; Abney, K.; Bekey, G. Robot ethics: Mapping the issues for a mechanized world. Artif. Intell. 2011, 175, 942–949.
16. Brennen, S.J.; Kreiss, D. Digitalization. In The International Encyclopedia of Communication Theory and Philosophy; Wiley: Hoboken, NJ, USA, 2016; pp. 1–11.
17. Dourish, P. Where the Action Is: The Foundations of Embodied Interaction; MIT Press: Cambridge, MA, USA, 2001.
18. Ryals, L.; Payne, A. Customer relationship management in financial services: Towards information-enabled relationship marketing. J. Strateg. Mark. 2001, 9, 3–27.
19. Nevo, S.; Wade, M.R. The formation and value of IT-enabled resources: Antecedents and consequences of synergistic relationships. MIS Q. 2010, 34, 163–183.
20. Yoo, Y.; Boland, R.J., Jr.; Lyytinen, K.; Majchrzak, A. Organizing for innovation in the digitized world. Organ. Sci. 2012, 23, 1398–1408.
21. Baskerville, R.L.; Myers, M.D.; Yoo, Y. Digital first: The ontological reversal and new challenges for IS research. MIS Q. 2019, 44, 509–523.
22. Sivarajah, U.; Kamal, M.M.; Irani, Z.; Weerakkody, V. Critical analysis of Big Data challenges and analytical methods. J. Bus. Res. 2017, 70, 263–286.
23. Johnson, D.G. Computer Ethics; Prentice-Hall: Englewood Cliffs, NJ, USA, 1985.
24. Parker, D.B. Rules of ethics in information processing. Commun. ACM 1968, 11, 198–201.
25. Capurro, R. Why information ethics? Int. J. Appl. Res. Inf. Technol. Comput. 2018, 9, 50–52.
26. Stahl, B.C.; Timmermans, J.; Flick, C. Ethics of emerging information and communication technologies: On the implementation of responsible research and innovation. Sci. Public Policy 2017, 44, 369–381.
27. Zwitter, A. Big Data ethics. Big Data Soc. 2014, 1, 1–6.
28. Taherdoost, H.; Sahibuddin, S.; Namayandeh, M.; Jalaliyoon, N. Propose an educational plan for computer ethics and information security. Procedia Soc. Behav. Sci. 2011, 28, 815–819.
29. Asaro, P.M. What should we want from a robot ethic? Int. Rev. Inf. Ethics 2006, 6, 9–16.
30. Malle, B.F. Integrating robot ethics and machine morality: The study and design of moral competence in robots. Ethics Inf. Technol. 2016, 18, 243–256.
31. Hagendorff, T. The ethics of AI ethics: An evaluation of guidelines. Minds Mach. 2020, 30, 1–22.
32. Suchacka, M. Corporate digital responsibility: New challenges to the social sciences. Int. J. Res. E-Learn. 2019, 5, 5–20.
33. Wirtz, J.; Hartley, N.; Kunz, W.H.; Tarbit, J.; Ford, J. Corporate Digital Responsibility at the Dawn of the Digital Service Revolution. 2021. Available online: https://papers.ssrn.com/sol3/papers.cfm?abstract_id=3806235 (accessed on 10 March 2022).
34. Wade, M. Corporate responsibility in the digital era. MIT Sloan Management Review, 28 April 2020; p. 28.
35. Nambisan, S.; Lyytinen, K.; Majchrzak, A.; Song, M. Digital innovation management: Reinventing innovation management research in a digital world. MIS Q. 2017, 41, 223–238.
36. Moustaka, V.; Theodosiou, Z.; Vakali, A.; Kounoudes, A.; Anthopoulos, L.G. Enhancing social networking in smart cities: Privacy and security borderlines. Technol. Forecast. Soc. Chang. 2019, 142, 285–300.
37. Mihale-Wilson, C.; Hinz, O.; van der Aalst, W.; Weinhardt, C. Corporate digital responsibility. Bus. Inf. Syst. Eng. 2022, 64, 127–132.
38. Ritter, T.; Pedersen, C.L. Digitization capability and the digitalization of business models in business-to-business firms: Past, present, and future. Ind. Mark. Manag. 2020, 86, 180–190.
39. Yoo, Y.; Henfridsson, O.; Lyytinen, K. The new organizing logic of digital innovation: An agenda for information systems research. Inf. Syst. Res. 2010, 21, 724–735.
40. Ranschaert, E.R.; Morozov, S.; Algra, P.R. Artificial Intelligence in Medical Imaging: Opportunities, Applications and Risks; Springer: Berlin, Germany, 2019.
41. Shakina, E.; Parshakov, P.; Alsufiev, A. Rethinking the corporate digital divide: The complementarity of technologies and the demand for digital skills. Technol. Forecast. Soc. Chang. 2021, 162, 120405.
42. Sestino, A.; Prete, M.I.; Piper, L.; Guido, G. Internet of Things and Big Data as enablers for business digitalization strategies. Technovation 2020, 98, 102173.
43. Liu, R.; Gupta, S.; Patel, P. The application of the principles of responsible AI on social media marketing for digital health. Inf. Syst. Front. 2021, 13, 1–25.
44. Kelley, D. Addressing the Unstructured Data Protection Challenge. 2008. Available online: https://www.xlsoft.com/en/services/materials/files/AddressingTheUnstructuredDataProtectionChallenge.pdf (accessed on 10 March 2022).
45. Talha, M.; Abou El Kalam, A.; Elmarzouqi, N. Big Data: Trade-off between data quality and data security. Procedia Comput. Sci. 2019, 151, 916–922.
46. Lepenioti, K.; Bousdekis, A.; Apostolou, D.; Mentzas, G. Prescriptive analytics: Literature review and research challenges. Int. J. Inf. Manag. 2020, 50, 57–70.
47. Rehman, M.H.; Chang, V.; Batool, A.; Teh, Y.W. Big data reduction framework for value creation in sustainable enterprises. Int. J. Inf. Manag. 2016, 36, 917–928.
48. Banerjee, A.; Bandyopadhyay, T.; Acharya, P. Data analytics: Hyped up aspirations or true potential? Vikalpa 2013, 38, 1–12.
49. Attaran, M.; Gunasekaran, A. Applications of Blockchain Technology in Business: Challenges and Opportunities; Springer Nature: Berlin, Germany, 2019.
50. Watson, H.J. Tutorial: Big data analytics: Concepts, technologies, and applications. Commun. Assoc. Inf. Syst. 2014, 34, 1247–1268.
51. Joseph, R.C.; Johnson, N.A. Big data and transformational government. IT Prof. 2013, 15, 43–48.
52. Waller, M.A.; Fawcett, S.E. Data science, predictive analytics, and big data: A revolution that will transform supply chain design and management. J. Bus. Logist. 2013, 34, 77–84.
53. Stedham, Y.; Yamamura, J.H.; Beekun, R.I. Gender differences in business ethics: Justice and relativist perspectives. Bus. Ethics A Eur. Rev. 2007, 16, 163–174.
54. Orlikowski, W.J.; Scott, S.V. The algorithm and the crowd: Considering the materiality of service innovation. MIS Q. 2015, 39, 201–216.
55. Setzke, D.S.; Riasanow, T.; Böhm, M.; Krcmar, H. Pathways to digital service innovation: The role of digital transformation strategies in established organizations. Inf. Syst. Front. 2021, 1–21.
56. Rachinger, M.; Rauter, R.; Müller, C.; Vorraber, W.; Schirgi, E. Digitalization and its influence on business model innovation. J. Manuf. Technol. Manag. 2019, 30, 1143–1160.
57. Echeverría, J.; Tabarés, R. Artificial intelligence, cybercities and technosocieties. Minds Mach. 2017, 27, 473–493.
58. Newell, S.; Marabelli, M. Strategic opportunities (and challenges) of algorithmic decision-making: A call for action on the long-term societal effects of ‘datification’. J. Strateg. Inf. Syst. 2015, 24, 3–14.
59. Bonnefon, J.F.; Shariff, A.; Rahwan, I. The social dilemma of autonomous vehicles. Science 2016, 352, 1573–1576.
60. Balsmeier, B.; Woerter, M. Is this time different? How digitalization influences job creation and destruction. Res. Policy 2019, 48, 103765.
61. Lobera, J.; Fernández Rodríguez, C.J.; Torres-Albero, C. Privacy, values and machines: Predicting opposition to artificial intelligence. Commun. Stud. 2020, 71, 448–465.
62. Howcroft, D.; Bergvall-Kåreborn, B. A typology of crowdwork platforms. Work. Employ. Soc. 2019, 33, 21–38.
63. Brock, D.M.; Shenkar, O.; Shoham, A.; Siscovick, I.C. National culture and expatriate deployment. J. Int. Bus. Stud. 2008, 39, 1293–1309.
64. Podsakoff, P.M.; MacKenzie, S.B.; Lee, J.Y.; Podsakoff, N.P. Common method biases in behavioral research: A critical review of the literature and recommended remedies. J. Appl. Psychol. 2003, 88, 879–903.
65. Chetty, S.; Johanson, M.; Martín, O.M. Speed of internationalization: Conceptualization, measurement and validation. J. World Bus. 2014, 49, 633–650.
66. Ritter, T. Alignment Squared: Driving Competitiveness and Growth through Business Model Excellence; CBS Competitiveness Platform: Frederiksberg, Denmark, 2014.
67. Jarvis, C.B.; Mackenzie, S.B.; Podsakoff, P.M. A critical review of construct indicators and measurement model misspecification in marketing and consumer research. J. Consum. Res. 2003, 30, 199–219.
68. Nambisan, S.; Wright, M.; Feldman, M. The digital transformation of innovation and entrepreneurship: Progress, challenges and key themes. Res. Policy 2019, 48, 103773.
69. Eller, R.; Alford, P.; Kallmünzer, A.; Peters, M. Antecedents, consequences, and challenges of small and medium-sized enterprise digitalization. J. Bus. Res. 2020, 112, 119–127.
70. Zhang, X.; Xu, Y.; Ma, L. Research on successful factors and influencing mechanism of the digital transformation in SMEs. Sustainability 2022, 14, 2549.
71. Hair, J.F.; Sarstedt, M.; Pieper, T.M.; Ringle, C.M. The use of partial least squares structural equation modeling in strategic management research: A review of past practices and recommendations for future applications. Long Range Plan. 2012, 45, 320–340.
72. Chin, W.W.; Newsted, P.R. Structural equation modeling analysis with small samples using partial least squares. Stat. Strateg. Small Sample Res. 1999, 1, 307–341.
73. Ringle, C.M.; Wende, S.; Becker, J.M. SmartPLS 3; SmartPLS GmbH: Boenningstedt, Germany, 2015.
74. Chin, W.W. The partial least squares approach to structural equation modeling. Mod. Methods Bus. Res. 1998, 295, 295–336.
75. Hulland, J. Use of partial least squares (PLS) in strategic management research: A review of four recent studies. Strateg. Manag. J. 1999, 20, 195–204.
76. Fornell, C.; Larcker, D. Evaluating structural equation models with unobservable variables and measurement error. J. Mark. Res. 1981, 18, 39–50.
77. Nunnally, J.C.; Bernstein, I.H. Psychometric Theory, 3rd ed.; McGraw-Hill: New York, NY, USA, 1994.
78. Hair, J.F.; Hult, G.T.M.; Ringle, C.; Sarstedt, M. A Primer on Partial Least Squares Structural Equation Modeling (PLS-SEM); Sage Publications: Thousand Oaks, CA, USA, 2016.
79. Saeidi, S.P.; Sofian, S.; Saeidi, P.; Saeidi, S.P.; Saaeidi, S.A. How does corporate social responsibility contribute to firm financial performance? The mediating role of competitive advantage, reputation, and customer satisfaction. J. Bus. Res. 2015, 68, 341–350.
80. Mackenzie, A. Cutting Code: Software and Sociality; Peter Lang Publishing: New York, NY, USA, 2006.
81. Nasiri, M.; Saunila, M.; Ukko, J.; Rantala, T.; Rantanen, H. Shaping digital innovation via digital-related capabilities. Inf. Syst. Front. 2020, 1–18.
Figure 1. Conceptualization model.
Table 1. Recent studies of concepts related to corporate responsibility in the digital realm.

Information ethics
Definition: Information ethics deals with the impact of digital ICT on society and the environment at large, as well as with ethical questions concerning the internet, digital information, and communications media in particular.
Conceptualization: Issues of information ethics include privacy, information overload, internet addiction, the digital divide, surveillance, and robotics, particularly from an intercultural perspective. The conceptualization can be described in five aspects: (1) impact on individuals (privacy, autonomy, treatment of humans, identity, security); (2) consequences for society (digital divides, collective human identity, and the good life); (3) uncertainty of outcomes; (4) perceptions of technology; and (5) the role of humans.
Contributions: These articles go beyond the ethical analysis of individual technologies and offer an array of ethical issues that are likely to be relevant across different ICTs, delivering the message that information ethics can and should contribute to addressing the challenges of the digital age.
Authors: [25,26]

Big data ethics
Definition: No explicit definition.
Conceptualization: Big data ethics includes privacy, security, consumer welfare, propensity, etc.
Contributions: The authors investigate the relation between characteristics of big data and privacy, security, and consumer welfare issues from the standpoints of data collection, storage, sharing, and accessibility.
Authors: [2,27]

Computer ethics
Definition: Computer ethics is the analysis of the nature and social impact of computer technology and the corresponding formulation and justification of policies for the ethical use of such technology.
Conceptualization: Computer ethics includes intellectual property, privacy and data protection, equal access, responsibility and information, replacement of people in the workplace, computer crime, accuracy, accessibility, morality, and awareness.
Contributions: The authors emphasize the importance of computer ethics, develop a typology that highlights different combinations of automated and human social presence, and indicate literature gaps, thereby pointing to avenues for future research.
Authors: [28]

Machine learning challenges
Definition: No explicit definition.
Conceptualization: Machine learning challenges relate to data privacy, protection rules, biases, equitable accessibility, and nondiscrimination.
Contributions: This paper points out the challenges of machine learning, which include ethical challenges, the shortage of machine-learning engineers, the data quality challenge, and the cost–benefit challenge.
Authors: [6]

Robot ethics
Definition: Robot ethics encompasses ethical questions about how humans should design, deploy, and treat robots.
Conceptualization: Robot ethics can be divided into three broad categories: safety and errors, law and ethics, and social impact. It can also be illustrated by three aspects: the ethical systems built into robots, the ethics of people who design and use robots, and the ethics of how people treat robots.
Contributions: The authors conclude that robot ethics is most closely related to the ethical level of the people who design and use robots. Moreover, we should consider both ethical questions about how humans should design, deploy, and treat robots and questions about what moral capabilities a robot could and should have.
Authors: [15,29,30]

AI ethics
Definition: AI ethics deals less with AI as such than with ways of deviating or distancing oneself from problematic routines of action, with uncovering blind spots in knowledge, and with gaining individual self-responsibility.
Conceptualization: The author summarizes eight AI ethical principles: privacy protection, accountability, fairness, transparency, safety, common good, explainability, and human oversight.
Contributions: The author gives a detailed overview of the field of AI ethics, examines to what extent the respective ethical principles and values are implemented in the practice of research, development, and application of AI systems, and discusses how the effectiveness of the demands of AI ethics can be improved.
Authors: [31]
Table 2. Recent studies of CDR.

CDR as an extension of CSR
Definition: CDR is an extension of a firm’s responsibilities which takes into account the ethical opportunities and challenges of digitalization.
Conceptualization: This paper classifies CDR into three aspects: environmental CDR, social CDR, and governance CDR.
Contributions: This paper expands on existing knowledge of CDR by clearly articulating how CDR relates to the general responsibilities of companies and by discussing which new topics might arise in these dimensions due to emerging technologies.
Authors: [9]

Digital corporate social responsibility
Definition: Corporate digital responsibility is a kind of digital corporate social responsibility.
Conceptualization: The connotation of CDR includes securing autonomy and privacy, respecting equality, dealing with data, dealing with algorithms, taking the impact on the environment into account, and ensuring a fair transition.
Contributions: This paper provides an outline of what digital responsibility is and proposes a Digital Responsibility Index to assess corporate behavior.
Authors: [10]

CDR as an awareness of duties
Definition: CDR is the awareness of duties binding the organizations active in the field of technological development and using technologies to provide services.
Conceptualization: CDR issues relate to the threat of AI and automation to the workforce and business operations.
Contributions: The author identifies certain theoretical aspects and potential consequences related to threats posed by the development of new technologies, artificial intelligence, automation, and the large-scale digitalization of social environments.
Authors: [32]

CDR as a set of shared values
Definition: CDR is a set of shared values and norms guiding an organization’s operations with respect to the creation and operation of digital technology and data.
Conceptualization: The set of shared values and norms of an organization during the process of its creation of technology; data capture, operations, and decision-making; inspection and impact assessment; and refinement of technology and data.
Contributions: This article illustrates how an organization’s shared values and norms regarding CDR can be translated into actionable guidelines for users, providing grounds for future discussions related to CDR readiness, implementation, and success.
Authors: [11]

CDR in the service context
Definition: No explicit definition.
Conceptualization: In the service context, CDR encompasses the ethical responsibility inherent in the creation and operation of service technologies and customer data across all functions of a service organization to ensure that customers are treated in an ethical and fair manner while their privacy rights are protected.
Contributions: This article uses a life-cycle stage perspective of data and technologies to understand digital risks, examines an organization’s business model and business partner ecosystem to identify where risks originate, and proposes a set of strategies and tools for managers to choose from.
Authors: [33]

CDR as a voluntary commitment
Definition: CDR is a voluntary commitment by organizations fulfilling the corporate rationalizers’ role in representing community interests to inform “good” digital corporate actions and digital sustainability via collaborative guidance on addressing social, economic, and ecological impacts on digital society.
Conceptualization: The contents of CDR include promoting economic transparency, promoting societal wellbeing, reducing the impact of technology on the environment, fair and equitable access for all of society, investing in the new economy, promoting a sustainable planet to live on, and purpose and trust.
Contributions: This paper uses harmonized and aligned approaches, illustrating the opportunities and threats of AI, while raising awareness of CDR as a potential collaborative mechanism to demystify governance complexity and to establish an equitable digital society.
Authors: [14]

CDR should be considered separately from CSR
Definition: No explicit definition.
Conceptualization: CDR includes technology access and technological literacy, information transparency, customers’ economic interests, product safety and liability, privacy and data security, dispute resolution and redress, and governance and participation mechanisms.
Contributions: This paper conceptualizes CDR and gives an empirical analysis of consumers’ valuation of CDR norms and implementations.
Authors: [12]

CDR
Definition: No explicit definition.
Conceptualization: CDR is a set of practices and behaviors that help an organization use data and digital technologies in a way that is socially, economically, technologically, and environmentally responsible.
Contributions: CDR includes four categories: social, economic, technological, and environmental.
Authors: [34]
Table 3. Description of the sample (N = 202; frequency and percentage).

Founding: more than 10 years, 167 (82.7%); 5 to 10 years, 17 (8.4%); less than 5 years, 18 (8.9%).
Ownership: privately owned, 129 (63.9%); state-owned, 51 (25.2%); others, 22 (10.9%).
Size: small (fewer than 300 employees), 103 (51.0%); medium (300–1000 employees), 58 (28.7%); large (1000 or more employees), 41 (20.3%).
Industry: software, 71 (35.1%); electrical equipment, 63 (31.2%); computers, 43 (21.3%); pharmaceuticals, 21 (10.4%); others, 4 (2.0%).
Table 4. Constructs and measures. Higher-order constructs are measured with latent variable scores; all items are rated on a scale from 1 (low) to 5 (high).

Corporate digital responsibility
Indicators: corporate digitized responsibility; corporate digitalized responsibility (latent variable scores).

Corporate digitized responsibility
Indicators: unbiased data acquisition; data protection; data maintenance (latent variable scores).

Unbiased data acquisition
We emphasize the encoding of analog information into digital format.
We emphasize the collection of data that are addressable.
We emphasize the collection of data that are programmable.

Data protection
We collect the data primarily from public sources.
We use anonymized data in our business practices.
We have established a data protection system.

Data maintenance
We update the database regularly.
We have strict rules for data storage and utilization.

Corporate digitalized responsibility
Indicators: appropriate data interpretation; objective predicted results; tackling value conflicts in data-driven decision-making (latent variable scores).

Appropriate data interpretation
We use digital technology to make data communicable.
We use digital technology to make data traceable.
We use digital technology to make data associable.

Objective predicted results
We use the processed data to promote business analysis.
We use the processed data to guide operational decision-making.

Tackling value conflicts in data-driven decision-making
We focus on human concern in data-driven business operations.
We usually integrate data-driven analytics and social value in business decisions.
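
To make the measurement hierarchy in Table 4 easier to scan, the sketch below restates it as a plain Python mapping. This is purely an illustrative representation that we add here, not part of the authors’ estimation procedure; item codes follow the Appendix and Tables 5 and 6.

```python
# Hierarchical CDR measurement model from Table 4: the overall construct is
# built from two dimensions, each measured by sub-dimensions whose indicators
# are five-point Likert items.
CDR_MEASUREMENT_MODEL = {
    "corporate_digitized_responsibility": {
        "unbiased_data_acquisition": ["Q1-1", "Q1-2", "Q1-3"],
        "data_protection": ["Q1-4", "Q1-5", "Q1-6"],
        "data_maintenance": ["Q1-7", "Q1-8"],
    },
    "corporate_digitalized_responsibility": {
        "appropriate_data_interpretation": ["Q2-1", "Q2-2", "Q2-3"],
        "objective_predicted_results": ["Q2-4", "Q2-5"],
        "tackling_value_conflicts_in_decision_making": ["Q2-6", "Q2-7"],
    },
}
```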
Table 5. Exploratory factor analysis of corporate digitized responsibility. Loadings are listed in the order: unbiased data acquisition / data protection / data maintenance.

Q1-1. We emphasize the encoding of analog information into a digital format. — 0.884 / 0.236 / 0.183
Q1-2. We emphasize the collection of data that are addressable. — 0.902 / 0.236 / 0.184
Q1-3. We emphasize the collection of data that are programmable. — 0.847 / 0.340 / 0.198
Q1-4. We collect the data primarily from public sources. — 0.319 / 0.850 / 0.191
Q1-5. We use anonymized data in our business practices. — 0.255 / 0.886 / 0.218
Q1-6. We have established a data protection system. — 0.240 / 0.877 / 0.226
Q1-7. We update the database regularly. — 0.205 / 0.255 / 0.909
Q1-8. We have strict rules for data storage and utilization. — 0.209 / 0.209 / 0.922
Table 6. Exploratory factor analysis of corporate digitalized responsibility. Loadings are listed in the order: appropriate data interpretation / objective predicted results / tackling value conflicts in data-driven decision-making.

Q2-1. We use digital technology to make data communicable. — 0.929 / 0.090 / 0.167
Q2-2. We use digital technology to make data traceable. — 0.909 / 0.111 / 0.182
Q2-3. We use digital technology to make data associable. — 0.882 / 0.126 / 0.221
Q2-4. We use the processed data to promote business analysis. — 0.103 / 0.942 / 0.141
Q2-5. We use the processed data to guide operational decision-making. — 0.133 / 0.931 / 0.179
Q2-6. We focus on human concern in data-driven business operations. — 0.227 / 0.196 / 0.891
Q2-7. We usually integrate data-driven analytics and social value in business decisions. — 0.214 / 0.141 / 0.909
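
The factor structures in Tables 5 and 6 come from exploratory factor analysis with rotated loadings. As an illustration only, the sketch below shows how a three-factor, varimax-rotated solution of this kind could be obtained in Python. The response matrix is randomly generated (so its loadings will not reproduce those reported above), and scikit-learn's FactorAnalysis is our assumed tool rather than the authors' software.

```python
import numpy as np
import pandas as pd
from sklearn.decomposition import FactorAnalysis

# Hypothetical item-level responses for the eight corporate digitized
# responsibility items (Q1-1 ... Q1-8); in the study these would be the
# five-point Likert ratings from the 202 sampled firms.
rng = np.random.default_rng(1)
X = pd.DataFrame(
    rng.integers(1, 6, size=(202, 8)).astype(float),
    columns=[f"Q1-{i}" for i in range(1, 9)],
)

# Three-factor solution with varimax rotation, mirroring the layout of
# Table 5 (unbiased data acquisition, data protection, data maintenance).
fa = FactorAnalysis(n_components=3, rotation="varimax")
fa.fit(X)

loadings = pd.DataFrame(
    fa.components_.T,               # rows = items, columns = factors
    index=X.columns,
    columns=["Factor 1", "Factor 2", "Factor 3"],
)
print(loadings.round(3))
```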
Table 7. Reliability and average variance extracted for constructs with reflective indicators.

Corporate digitized responsibility: Cronbach’s alpha = 0.910; composite reliability = 0.927; AVE = 0.615.
Unbiased data acquisition
Q1-1: mean = 3.63, SD = 1.051, loading = 0.785
Q1-2: mean = 3.72, SD = 1.082, loading = 0.797
Q1-3: mean = 3.70, SD = 1.099, loading = 0.835
Data protection
Q1-4: mean = 3.93, SD = 0.853, loading = 0.824
Q1-5: mean = 4.03, SD = 0.795, loading = 0.819
Q1-6: mean = 4.03, SD = 0.799, loading = 0.808
Data maintenance
Q1-7: mean = 3.86, SD = 0.930, loading = 0.705
Q1-8: mean = 3.78, SD = 0.966, loading = 0.684

Corporate digitalized responsibility: Cronbach’s alpha = 0.843; composite reliability = 0.881; AVE = 0.519.
Appropriate data interpretation
Q2-1: mean = 3.70, SD = 1.073, loading = 0.783
Q2-2: mean = 3.65, SD = 1.029, loading = 0.787
Q2-3: mean = 3.61, SD = 1.057, loading = 0.793
Objective predicted results
Q2-4: mean = 3.56, SD = 0.974, loading = 0.559
Q2-5: mean = 3.63, SD = 1.051, loading = 0.597
Tackling value conflicts in data-driven decision-making
Q2-6: mean = 3.92, SD = 0.999, loading = 0.754
Q2-7: mean = 4.01, SD = 0.923, loading = 0.730

Digital performance: Cronbach’s alpha = 0.920; composite reliability = 0.935; AVE = 0.642.
Q3-1: mean = 3.95, SD = 1.135, loading = 0.771
Q3-2: mean = 3.63, SD = 1.056, loading = 0.731
Q3-3: mean = 3.93, SD = 0.998, loading = 0.846
Q3-4: mean = 4.00, SD = 1.002, loading = 0.834
Q3-5: mean = 4.27, SD = 0.856, loading = 0.791
Q3-6: mean = 4.38, SD = 0.819, loading = 0.813
Q3-7: mean = 4.16, SD = 0.881, loading = 0.805
Q3-8: mean = 4.23, SD = 0.827, loading = 0.812
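
As a check on Table 7, the composite reliability (CR) and AVE values are consistent with the standard formulas of Fornell and Larcker [76] applied to the reported loadings \( \lambda_i \) of each construct’s \( n \) items:

\[
\mathrm{CR}=\frac{\left(\sum_{i=1}^{n}\lambda_i\right)^{2}}{\left(\sum_{i=1}^{n}\lambda_i\right)^{2}+\sum_{i=1}^{n}\left(1-\lambda_i^{2}\right)},
\qquad
\mathrm{AVE}=\frac{1}{n}\sum_{i=1}^{n}\lambda_i^{2}.
\]

For corporate digitized responsibility, for example, the eight loadings give \( \sum_i \lambda_i \approx 6.26 \) and \( \sum_i \lambda_i^{2} \approx 4.92 \), so \( \mathrm{CR} \approx 39.15/(39.15+3.08) \approx 0.927 \) and \( \mathrm{AVE} \approx 4.92/8 \approx 0.615 \), matching the tabled values; the same holds for the other two constructs.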
Table 8. Item weights and multicollinearity tests for constructs with formative indicators.

Corporate digitized responsibility: weight = 0.650 ***, t-value = 22.862, VIF = 1.786.
Corporate digitalized responsibility: weight = 0.442 ***, t-value = 15.462, VIF = 1.786.
*** p < 0.001.
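
For the formative indicators in Table 8, collinearity is assessed with the variance inflation factor; assuming the usual definition,

\[
\mathrm{VIF}_j=\frac{1}{1-R_j^{2}},
\]

where \( R_j^{2} \) is the variance in indicator \( j \) explained by the other formative indicator(s). With only two indicators, \( R_j^{2} \) reduces to their squared correlation, which is why both rows report the same VIF; a VIF of 1.786 corresponds to \( R^{2}\approx 0.44 \) (a correlation of roughly 0.66), well below conventional collinearity thresholds.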
Disclaimer/Publisher’s Note: The statements, opinions and data contained in all publications are solely those of the individual author(s) and contributor(s) and not of MDPI and/or the editor(s). MDPI and/or the editor(s) disclaim responsibility for any injury to people or property resulting from any ideas, methods, instructions or products referred to in the content.
