4.1. Evolutionary Stability Strategies of the Government
According to Assumptions 1–5 and Tables 3 and 4, the expected return for the government from choosing the “Supervise” strategy, the expected return from the “Not supervise” strategy, and the average expected return in the game are:
The replicator dynamic equation of the government is obtained from the Malthusian equation as follows; differentiating with respect to x yields:
Based on the stability theorem, x is an evolutionarily stable strategy (ESS) when F(x) = 0 and F′(x) &lt; 0. Setting the bracketed term of F(x) to zero yields a threshold combination of the other two players’ strategies. On this threshold, F(x) = 0 for any x, so every strategy of the government is stable. Off the threshold, the candidate equilibria are x = 0 and x = 1: when the platform–user profile (y, z) lies on the side of the threshold that makes supervision profitable, x = 1 (“Supervise”) is the ESS of the government; conversely, x = 0 (“Not supervise”) is the ESS.
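To make the stability argument concrete, the following sketch integrates a single-population replicator dynamic numerically. The payoff gap is a hypothetical stand-in — F_fine, C_g, and H are illustrative parameters, not the exact payoff expressions from Tables 3 and 4.

```python
# Minimal sketch of the government's replicator dynamic. The payoff gap
# below is HYPOTHETICAL: F_fine, C_g, and H stand in for the fine,
# regulatory cost, and user incentive, not the paper's exact payoffs.
def payoff_gap(y, z, F_fine=5.0, C_g=2.0, H=1.0):
    # Supervision pays when expected fine revenue (platform abuses with
    # probability y) exceeds the regulatory cost plus incentives paid
    # to the participating share z of users.
    return y * F_fine - C_g - z * H

def replicator(x, y, z):
    # Government replicator dynamic: F(x) = x(1 - x) * (payoff gap).
    return x * (1.0 - x) * payoff_gap(y, z)

def evolve(x0, y, z, dt=0.01, steps=20000):
    # Forward-Euler integration of dx/dt = F(x).
    x = x0
    for _ in range(steps):
        x += dt * replicator(x, y, z)
    return x

print(evolve(0.5, y=0.9, z=0.2))  # gap > 0: x converges toward 1 ("Supervise")
print(evolve(0.5, y=0.1, z=0.2))  # gap < 0: x converges toward 0 ("Not supervise")
```

Choosing (y, z) so that the payoff gap vanishes makes F(x) = 0 for every x, reproducing the case in which any government strategy is stable.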
Based on the above analysis, a phase diagram can be constructed to depict the dynamic change process of the government’s strategy, as shown in Figure 2. In the figure, the threshold forms a surface on which any strategy chosen by the government is stable, and this surface divides the cubic strategy space into two regions. In one region the government’s strategy stabilizes at “Supervise”, while in the other it stabilizes at “Not supervise”. The analysis shows that the government’s regulatory strategy is reinforced by the fine F and by user participation z, but discouraged by the regulatory cost and the user incentive H. Current fines for platform enterprises are ineffective because of the platforms’ high market value; regulations should therefore be revised to link fines to turnover and revenue, while innovative technology and user incentives should be used to reduce regulatory costs and increase participation.
4.2. Evolutionary Stability Strategies of Digital Platforms
The expected benefit for the digital platform from choosing the “Abuse data” strategy, the expected benefit from the “Not abuse data” strategy, and the average expected benefit in the game are:
The replicator dynamic equation of the digital platform is then given as follows; taking the derivative with respect to y gives:
Setting F(y) = 0 gives the stationary points y = 0 and y = 1, together with a threshold combination of the other two players’ strategies. On this threshold, F(y) = 0 for any y, so any strategy of the digital platform is stable. Off the threshold, when the government–user profile (x, z) makes data abuse profitable, y = 1 (“Abuse data”) is the ESS; conversely, y = 0 (“Not abuse data”) is the ESS.
Based on the above analysis, we can draw a phase diagram to illustrate the dynamic change process of the digital platforms’ strategies, as shown in Figure 3. In the figure, the threshold forms a surface on which any strategy chosen by the platform is stable, and this surface divides the cubic strategy space into two regions. In one region the platforms’ strategy stabilizes at “Abuse data”, while in the other it stabilizes at “Not abuse data”. It can be deduced that the possibility of platforms abusing data is positively correlated with the profit margin the platform gains from such abuse, and negatively correlated with the probability of user claims s, the fine F, and the resulting loss of privacy. Therefore, it is imperative to increase the cost of data abuse, either by raising user awareness of the importance of privacy and data protection and empowering users to safeguard their rights, or by levying targeted data taxes aimed at reducing platform companies’ profits derived from the abuse of personal data. However, platform companies also tend to invest more in data security technologies in order to commit more insidious data abuse.
4.3. Evolutionary Stability Strategies of Users
The expected payoff for users who choose the “Participate in supervision” strategy, the expected payoff for users who choose the “Not participate in supervision” strategy, and the average expected payoff in the game are:
The replicator dynamic equation of the users is given as follows; taking the derivative with respect to z, we get:
Setting F(z) = 0 gives the stationary points z = 0 and z = 1, together with a threshold combination of the other two players’ strategies. On this threshold, F(z) = 0 for any z, so any strategy of the users is stable. Off the threshold, when the government–platform profile (x, y) makes participation worthwhile, z = 1 (“Participate in supervision”) is the ESS; otherwise, z = 0 (“Not participate in supervision”) is the ESS.
Based on the above analysis, we can draw a phase diagram to illustrate the dynamic change process of the users’ strategies, as shown in Figure 4. In the figure, the threshold forms a surface on which any strategy chosen by the users is stable, and this surface divides the cubic strategy space into two regions. In one region the users’ strategy stabilizes at “Participate in supervision”, while in the other it stabilizes at “Not participate in supervision”. Our analysis reveals that users’ likelihood of participating in supervision is positively associated with the government incentive H, the compensation s, the government’s regulation probability x, and self-satisfaction A, but negatively related to information costs. Users weigh the benefits of supervision before deciding to participate and are more likely to participate when government regulation is more prevalent. Therefore, the government should play a mandatory and guiding role in regulation to effectively stimulate users’ enthusiasm for participating in supervision.
4.4. Tripartite ESS Analysis
Combining Equations (6), (11) and (16), the replicator dynamic system of the government, digital platforms and users is obtained. Setting F(x) = F(y) = F(z) = 0 yields 14 equilibrium points, comprising the 8 pure strategies (0, 0, 0), (1, 0, 0), (0, 1, 0), (0, 0, 1), (1, 1, 0), (1, 0, 1), (0, 1, 1), (1, 1, 1) and 6 mixed strategies. The stability of an equilibrium point of the replicator dynamic system can be judged by the Lyapunov stability criterion: an equilibrium point at which all eigenvalues of the Jacobian matrix have negative real parts is an ESS of the system. The Jacobian matrix is constructed using Equation (18).
Substituting the eight pure-strategy points into the Jacobian matrix J gives the eigenvalues at the different equilibrium points, as shown in Table 5.
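The eigenvalue check behind Table 5 can be reproduced numerically. The sketch below is a hypothetical instantiation: the payoff-gap functions gx, gy, gz and all parameter values are illustrative stand-ins, and central finite differences replace the symbolic Jacobian of Equation (18).

```python
import itertools
import numpy as np

# HYPOTHETICAL payoff gaps for the three populations; the parameters
# (fine, regulatory cost CG, incentive H, abuse profit R, claim
# compensation S, participation cost CU, satisfaction A) are
# illustrative, not the paper's calibrated values.
F_FINE, CG, H, R, S, CU, A = 5.0, 2.0, 1.0, 3.0, 4.0, 0.5, 1.0

def gaps(v):
    x, y, z = v
    gx = y * F_FINE - CG - z * H        # government: supervise vs. not
    gy = R - x * F_FINE - z * S         # platform: abuse vs. not
    gz = x * H + y * S + A - CU         # users: participate vs. not
    return np.array([x * (1 - x) * gx, y * (1 - y) * gy, z * (1 - z) * gz])

def jacobian(v, eps=1e-6):
    # Central finite differences stand in for the symbolic Equation (18).
    J = np.zeros((3, 3))
    for j in range(3):
        e = np.zeros(3)
        e[j] = eps
        J[:, j] = (gaps(v + e) - gaps(v - e)) / (2 * eps)
    return J

def ess_points():
    # Lyapunov criterion: a vertex is an ESS when every Jacobian
    # eigenvalue has a negative real part.
    stable = []
    for vertex in itertools.product([0.0, 1.0], repeat=3):
        eig = np.linalg.eigvals(jacobian(np.array(vertex)))
        if np.all(eig.real < 0):
            stable.append(vertex)
    return stable

print(ess_points())  # with these illustrative numbers: [(0.0, 0.0, 1.0)]
```

With these illustrative numbers only (0, 0, 1) is stable; different parameter choices select the other scenarios discussed below.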
It is evident that numerous parameters and complex relationships influence the evolutionary trends of data abuse and its regulation on digital platforms. Hence, to concentrate on the key issues, this study focuses exclusively on the equilibrium points at which digital platforms do not abuse data (y = 0), which fall into two scenarios, as shown in Table 6.
Based on the above analysis, Propositions 1 and 2 can be posited.
Proposition 1. When the costs of regulation outweigh its benefits, there exist two possible ESS points, (0, 0, 0) and (0, 0, 1). When users’ net payoff from participating in supervision is negative, (0, 0, 0) is the only ESS point: the government does not regulate digital platforms, platforms do not abuse data, and users do not participate in supervision. When users’ net payoff from participating is positive, (0, 0, 1) is the only ESS: the government does not supervise digital platforms, platforms do not abuse data, and users participate in supervision.
Proof. Under the first set of conditions, the Jacobian eigenvalues at the other candidate equilibria each include at least one positive value, so those points are not ESSs, while all eigenvalues at (0, 0, 0) are negative; hence (0, 0, 0) is the only ESS. Under the second set of conditions, the same argument shows that (0, 0, 1) is the only ESS. □
Proposition 1 indicates that when the government has no incentive to supervise (i.e., the costs of regulation outweigh its benefits), there are two ways to prevent data abuse. One is to adjust the tax rate. According to the general principles of economics, an increase in the tax rate inhibits enterprises from expanding their business, and under a high tax rate, if platforms still insist on abusing data, most of the resulting revenue is collected by the government. When the costs outweigh the benefits, platforms have no incentive to abuse data. For example, France passed a set of digital services tax (DST) rules in 2019 that impose a 3% tax on digital platforms with over €25 million in annual taxable income [49]. This approach increases the tax burden on platform businesses that abuse user data and promotes a fairer distribution of tax responsibilities, serving as an example for other governments and regulators seeking to curb data abuse by digital platforms through tax policy. The other way is to leverage user self-efficacy and enhance the perceived value of private data. A “free-rider” condition can be achieved when users increase the probability of refusing to provide data to the platform. Under this stabilization strategy, the government’s optimal decision is not to supervise.
Proposition 2. When the benefits of regulation outweigh its costs, there are two possible ESS points, (1, 0, 0) and (1, 0, 1). When users’ net payoff from participating in supervision is negative, (1, 0, 0) is the only ESS: the government supervises the digital platforms, platforms do not commit data abuse, and users do not participate in supervision. When that payoff is positive, (1, 0, 1) is the only ESS: the government supervises the digital platforms, platforms do not commit data abuse, and users participate in supervision.
Proof. Under the first set of conditions, the Jacobian eigenvalues at the other candidate equilibria each include at least one positive value, so those points are not ESSs, while all eigenvalues at (1, 0, 0) are negative; hence (1, 0, 0) is the only ESS. Under the second set of conditions, the same argument shows that (1, 0, 1) is the only ESS. □
Proposition 2 shows that when the government is motivated to supervise (i.e., the benefits of regulation outweigh its costs), it can maintain the existing tax rate on the one hand, and on the other can effectively deter digital platform companies from committing data abuse by increasing the penalty F so that it exceeds the net increase in profits from the platform’s data abuse and covers the additional loss from a privacy breach. This is well documented in practice. For example, in the case of Facebook’s abuse of user data, the U.S. Department of Justice, the Federal Trade Commission and Facebook reached a 20-year settlement agreement on protecting user privacy, the main elements of which include Facebook paying a 5 billion dollar fine and accepting further regulation by the Federal Trade Commission. In addition, the SEC reached an administrative settlement over allegations that Facebook failed to adequately disclose the risk of user data misuse and required the company to pay a 100 million dollar fine. These penalties are a striking testament to the importance of strict and rigorous oversight, as well as to the vital role that fines play in incentivizing compliance with established best practices in data management. Proposition 2 further reveals a difference between the stable points of user participation (1, 0, 1) and non-participation (1, 0, 0). On the one hand, a larger number of users participating in supervision leaves the platform with fewer additional benefits from data abuse because of negative network externalities, which decreases the platform’s incentive for abuse. On the other hand, user participation in supervision can effectively address information asymmetry and improve the efficiency of government regulation. To implement this framework in practice, we suggest creating additional user supervision channels to lower costs and enhancing users’ privacy awareness to increase their satisfaction with supervision.
This will encourage more active participation in supervision and ultimately achieve a successful dual regulatory system.
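The deterrence condition in Proposition 2 can be sketched as a toy calculation. All numbers are hypothetical, and the detection probability is an added simplifying assumption rather than a parameter of the model.

```python
# Toy illustration of Proposition 2's deterrence logic with HYPOTHETICAL
# numbers: abuse stops paying once the expected fine exceeds the extra
# profit from abuse, and a fine set above EXTRA_PROFIT + PRIVACY_LOSS
# additionally covers the loss from a privacy breach.
EXTRA_PROFIT = 4.0   # hypothetical net profit increase from data abuse
PRIVACY_LOSS = 1.5   # hypothetical additional loss from a privacy breach

def abuse_pays(fine, p_detect=1.0):
    # Platform's expected net gain from abuse under supervision (x = 1);
    # p_detect is an assumed certainty of detection in this sketch.
    return EXTRA_PROFIT - p_detect * fine > 0

print(abuse_pays(fine=3.0))                          # True: fine too low
print(abuse_pays(fine=EXTRA_PROFIT + PRIVACY_LOSS))  # False: abuse deterred
```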
Summarizing Propositions 1 and 2, this study yields two corollaries.
Corollary 1. Both governments and users have measures with which to curb data abuse by digital platforms.
Whether or not the government chooses to regulate, it can prevent the abuse of data by digital platforms through effective policy controls: high penalties in a regulatory scenario, or high tax rates in a non-regulatory scenario. High penalties put enormous pressure on digital platforms and can force them to handle user data more cautiously and defend user privacy. Users can weaken the incentive for digital platforms to abuse data by refusing to provide data in a supervised context, and can increase their incentive to claim compensation so as to compress the platforms’ profit margin. Backlash from users forces platform companies to seek more sustainable business models, reduce their reliance on user data, and thereby reduce instances of data abuse. These findings can help governments make flexible choices based on how fines are actually implemented, how easily tax rates can be adjusted, and users’ perceived value of privacy.
Corollary 2. The risk of privacy breaches gives digital platforms a recurring incentive to abuse data.
With the improvement of data security technology, there is a possibility that data abuse by digital platforms will recur. Because the risk of privacy leakage exists, platform companies may commit data abuse if they increase their investment in privacy protection enough to reduce the platform’s additional privacy-leakage risk. The platform must therefore weigh the benefits of data abuse against the investment cost of privacy protection: when technology advances to the point that only a small cost is needed to significantly reduce the privacy-leakage risk, and the government tax rate r and penalty F remain unchanged, platforms have an incentive to engage in data abuse again. Companies such as Google, Facebook, and Alibaba have faced numerous allegations of data abuse. Based on the findings of this study, it is plausible that they are continuously reducing the privacy risks caused by data breaches, evading standard monitoring by users and governments, and resorting to more covert means of carrying out data misuse. Therefore, government regulatory policies should track the efficiency and cost of data security technologies and be adjusted in a timely manner, so as to eliminate repeated violations.