1. Introduction
E-commerce, search engines, and social media have driven the rapid development of online platforms, offering users a convenient life while collecting a great deal of their private information. When logging in, searching, or trading on online platforms, users disclose multiple types of data that, to a certain extent, reveal personal characteristics such as product preferences. Once platforms obtain users’ information, they can provide personalized services such as targeted advertising [
1] and product recommendations [
2,
3]. Furthermore, platforms are able to charge different prices for different users, which is called personalized pricing [
4]. One form of personalized pricing in reality is differentiated mobile coupons. For example, Lee and Choeh [
5] find that the immediacy of mobile devices has made mobile coupons one of the fastest-growing promotional channels used by retailers and other companies. Mobile coupons are economic discounts that consumers receive electronically via their mobile devices.
Advances in information technologies have made it easier for platforms to collect, store, and analyze users’ personal data. Digital technology enables platforms to learn about users’ willingness to pay, effectively improving their ability to personalize prices and enhance profits [
6]. Shiller [
7] shows that, compared with using traditional demographic data, firms can increase their profits by about 12.2% by analyzing users’ browsing data. Based on users’ browsing records and purchase histories, platforms can easily distinguish old users from new ones and provide differentiated strategies for each segment. A typical example of a platform setting lower prices for new users occurred in 2000, when a user found that the price of a DVD on Amazon was lower after clearing the cookies on their computer [
8]. When users clear their cookies, the platform no longer has their information, so they can only be identified as new users and are thus charged the lower price. In addition, Dish, a TV service provider, assures new users that it can save them
$250, while offering new product features to old users, such as voice remote control, streaming applications, high-definition channels, and advanced search, and providing them with personalized suggestions [
9]. These examples show that platforms can not only offer price discounts to new users, but also offer higher product quality to old users. Moreover, higher product quality is manifested not only in measurable product attributes, but also in perceived service quality. For example, Airbnb, a housing rental platform, can learn the preferences of old users by analyzing their information. When a user makes a second purchase, Airbnb gives priority to presenting houses that are well aligned with that user’s needs, providing a high-quality experience.
Although platforms can offer certain benefits to users by accessing their specific information, users are concerned about their privacy. As platforms collect more and more user information, users are increasingly aware that they are being monitored, which raises privacy concerns [
10,
11]. Additionally, Acxiom, one of the largest information providers, possesses more than 200 million pieces of information about Americans, and Facebook, Google, and Amazon collect more than 1 billion discrete units of user information every month [
9]. Data collection and usage on this scale directly exacerbate users’ privacy concerns. Some scholars who study firms’ personalized services also admit that such practices may raise users’ privacy concerns [
12,
13]. However, digital services need users’ data to improve service quality and generate revenues. In an era when users pay ever more attention to privacy, how to balance the advantages of information technology against users’ privacy concerns is a question worth considering.
Previous studies primarily focus on two approaches to alleviating users’ privacy concerns. The first is proposing privacy regulations or other methods to protect users’ privacy [
14,
15,
16]. The second is providing monetary benefits to users in exchange for their information: information-collecting companies often offer monetary rewards to alleviate privacy concerns and ease the collection of personal information [
17]. Although users claim to pay great attention to their privacy, their economic behavior often suggests otherwise. This phenomenon of contradictory privacy-related decisions is referred to as the privacy paradox [
9,
18]. If users perceive benefits from providing private information, their privacy concerns may be alleviated [
19]. Quite a few papers have acknowledged the existence of the privacy paradox [
18,
20,
21,
22]. Since existing work has shown that monetary rewards are an effective means of mitigating users’ privacy concerns, this paper evaluates whether other benefits can play a similar role. Specifically, we consider another feature of the product and provide a new solution for easing users’ privacy concerns from the perspective of product quality.
To summarize, we consider users’ privacy concerns and examine the influence of platforms’ behavior-based pricing and quality discrimination on market participants. We focus on how industry profits, user surplus, and social welfare are affected after platforms implement quality discrimination, and discuss the impacts of market competition intensity and users’ privacy concerns on the results. Specifically, we address the following research questions: (1) With quality discrimination, how does the decision-making of platforms change? (2) Considering users’ privacy concerns, what product quality should platforms provide, and what prices should they charge, for new and old users? (3) What influence does market competition intensity have on platforms’ quality strategy? (4) How do industry profits, user surplus, and social welfare change under quality discrimination? To address these questions, we construct a two-period price-quality duopoly Hotelling competition game model and analyze the equilibrium results with quality discrimination. Our contributions to the strands of literature described above are as follows. First, our study extends the work on BBPD by focusing not only on behavior-based price discrimination, but also on quality discrimination. Second, it considers consumers’ privacy concerns and introduces a privacy cost into the model in order to study the impact of those concerns on BBPD. Last, it provides a new idea for alleviating consumers’ privacy concerns from the perspective of product quality. Our results not only offer new directions for scholars exploring how to effectively mitigate users’ privacy concerns, but also provide guidance for platforms in general to devise more effective strategies in an era when users pay increasing attention to privacy.
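To fix ideas, the demand structure of a Hotelling competition model can be sketched as follows. This is a minimal illustrative sketch under textbook assumptions — two platforms located at the endpoints of a unit line, uniformly distributed users, and linear transport cost t — not the exact specification of our model; the function name and parameters are ours for illustration only.

```python
# Illustrative Hotelling demand: consumers are uniformly located on
# [0, 1]; platform A sits at 0, platform B at 1. A consumer at x buys
# from A if v - p_a - t*x >= v - p_b - t*(1 - x). Solving for the
# indifferent consumer x* gives A's market share.

def hotelling_share_a(p_a: float, p_b: float, t: float) -> float:
    """Share of consumers buying from platform A (clamped to [0, 1])."""
    x_star = 0.5 + (p_b - p_a) / (2 * t)
    return min(1.0, max(0.0, x_star))

# Symmetric prices split the market evenly; undercutting gains share.
print(hotelling_share_a(1.0, 1.0, t=0.5))   # 0.5
print(hotelling_share_a(0.9, 1.0, t=0.5))   # 0.6
```

In this standard setup, a larger t corresponds to stronger product differentiation, so price cuts win over fewer marginal consumers; this is the mechanism through which competition intensity enters models of this kind.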
The paper is structured as follows. In Section 2, we review the related literature. In Section 3, we describe the problem, explain the notation, and present the model. In Section 4, we show the equilibrium results of the benchmark model and the main model. In Section 5, we analyze the social welfare impact of quality discrimination. In Section 6, we provide our conclusions and implications. Appendix A contains proofs not given in the main text.
2. Literature Review
This study is primarily related to the economic literature on behavior-based price discrimination (BBPD), in which platforms offer different prices to different users according to their purchase histories. Zhang and Wang [
23] consider both online and offline e-commerce practices, while our study focuses on online platform practices. With the development of information technology, this pricing method is increasingly used in data-driven industries. Most papers on BBPD explore how platforms price-discriminate between new and old users. The conclusion of Fudenberg and Tirole [
19] is that firms will reward new users with low prices. Shaffer and Zhang [
24] extend BBPD and challenge this view. They argue that, when users face lower switching costs, firms may instead reward old users on price, i.e., the price offered to old users is lower. Belleflamme et al. [
6] demonstrate that, if firms do not practice personalized pricing, they will eventually set prices at marginal cost, which leads to the Bertrand paradox. Most scholars pay attention to how this pricing method is implemented but ignore its impact on consumers’ purchase intention. In an empirical model, Akram et al. [
25] hypothesize that consumers’ trust in network providers affects their purchasing behavior, using whether users’ privacy is protected as one item measuring that trust; the hypothesis is validated. Since BBPD sparks privacy concerns among users, our study contributes to this stream by explicitly considering users’ privacy concerns.
There is also extensive literature about BBPD which involves quality discrimination. Pazgal and Soberman [
26], based on the research of Fudenberg and Tirole [
19], assume that firms provide higher-quality products to old users, and show that, when firms adopt BBPD, they will charge lower prices to new users. They presuppose that old users receive higher quality, while Li [
27] treats the quality decision as endogenous. The author uses a two-period dynamic game-theoretic model to reveal the unique role of quality discrimination and finds an essential difference between quality discrimination and BBPD: BBPD intensifies competition in the second period but weakens it in the first, whereas quality discrimination reduces competition in the second period and intensifies it in the first. Laussel and Resende [
28] explore the influence of firms’ personalized product and price practices for old customers on firms’ profits, focusing on how the size of firms’ old turfs and firms’ initial products affect the results. Li [
27] investigates the impact of quality discrimination on market competition. By contrast, this study focuses on how firms make price-quality decisions under different market competition intensities. Li [
27] points out that firms should reward new users in the price dimension and old users in the quality dimension, which is consistent with our work.
Finally, our research is also related to the literature on how to alleviate users’ privacy concerns. A vast amount of literature considers the establishment of privacy protection regulations to alleviate users’ privacy concerns [
14,
15,
16]. However, views on the effectiveness of privacy regulation diverge widely. Taylor [
14] examines the impact of two regimes, i.e., the confidential regime and the disclosure regime, on social welfare. He shows that, under the disclosure regime, when users do not anticipate the sale of their information, users are worse off while firms fare well. Stigler [
29] argues that privacy protection causes inefficiency, i.e., policies protecting user privacy have not improved social welfare. Lee et al. [
16] report that whether privacy regulation increases social welfare is contingent on the specific circumstances, suggesting that regulation should be tailored accordingly. Loertscher and Marx [
30] explore both sides of intervention in an environment in which a digital monopoly can use data either only to improve matching or to improve matching as well as adjust pricing. They conclude that privacy protection should protect users’ information rents rather than their privacy per se. In addition, many privacy advocates believe that privacy regulation can positively shape tech giants’ data practices, while some critics worry that such restrictions could reduce firms’ investment in quality [
31]. Since there are a number of open questions about the effectiveness of privacy regulation, this study examines how to make users better off from another perspective. We find that, under certain conditions of market competition intensity and user privacy cost, platforms’ quality discrimination may benefit users.
Kummer and Schulte [
32] reveal a money-for-privacy transaction in the smartphone app market: developers offer their apps at lower prices in exchange for more access to personal information, and users weigh lower prices against more privacy. Xu et al. [
33] examine the usage of location-based services and find that monetary incentives make users more willing to be located by the operator. Since the price dimension, i.e., monetary rewards, has thus been shown to ease users’ privacy concerns, this study puts forward a new idea: platforms can use higher product quality in exchange for users’ privacy. In other words, if platforms provide users with higher-quality products, users’ privacy concerns may be alleviated to a certain extent.
In conclusion, we can find that the research on pricing decisions between platforms is relatively common, and some scholars consider other strategies based on product pricing [
26,
34], but there are still some limitations. First, the research on BBPD is relatively mature, but privacy concerns are often overlooked: BBPD raises users’ privacy concerns because it uses their specific information, yet many models do not account for this. This study extends the BBPD literature and focuses on the impact of users’ privacy cost on BBPD. Second, research on price-quality decision-making between competing platforms is still relatively rare. Given BBPD, what impact does quality discrimination have on the competitive equilibrium? In addition, the interaction between price and quality needs further study. Finally, the literature on alleviating users’ privacy concerns focuses either on privacy protection measures or on monetary rewards; few scholars have studied another product feature, i.e., product quality, as a means of alleviating privacy concerns. This paper addresses this gap.
Therefore, based on the research of Li [
27], we construct a two-period duopoly competition game model considering users’ privacy cost and switching cost, obtain the equilibrium solutions under two scenarios, i.e., with and without quality discrimination, and explore the influence of quality discrimination on industry profits, user surplus, and social welfare. Furthermore, this study also discusses the impact of quality discrimination on users’ privacy concerns. Our findings can help platforms make better decisions in an environment where users are increasingly concerned about privacy, and offer guidance for relevant government personnel in putting forward appropriate initiatives to guide the effective operation of the market. For users, our findings shed light on how platforms formulate their strategies, enabling more informed consumption decisions.
5. Social Welfare Impact of Quality Discrimination
From the equilibrium results in Section 4, we can further obtain the total profits, user surplus, and social welfare of the two platforms with and without quality discrimination, as shown in Table 3. Proposition 3 can be obtained from Table 3:
Proposition 3. When the competition intensity is weak, the total platform profit is higher with quality discrimination than without; when the competition intensity is strong, the total platform profit is higher without quality discrimination.
Figure 5 presents our numerical results for Proposition 3 and shows how the platforms’ total profit is affected by the privacy cost and the competition intensity.
Figure 5a demonstrates that, when the competition intensity is weak, the total platform profit decreases first and then increases with the privacy cost under quality discrimination. In addition, compared with the scenario without quality discrimination, the platform profit is higher under quality discrimination.
Figure 5b demonstrates that, when the competition intensity is strong, the total platform profit increases with the privacy cost, and the total profit is greater without quality discrimination. Proposition 3 thus shows that, when the market competition intensity is relatively weak, quality discrimination brings more considerable profits as users’ privacy cost grows. The equilibrium results in Section 4 show that platforms will reward old users with higher quality and attract new users with lower prices when the competition intensity is weak and users’ privacy cost is high. These two strategies enable platforms to expand their turf and achieve higher profits.
Proposition 3 also shows that, when the market competition is strong, the profit with quality discrimination is always lower than that without quality discrimination as users’ privacy cost increases. This is because, as shown in
Figure 4, the uniform quality set by the platforms for all users without quality discrimination is slightly higher than the quality levels they set for new and old users with quality discrimination. As users’ privacy cost increases, higher-quality products can attract more users. Therefore, platforms obtain more profit without quality discrimination than with it.
Proposition 4. When the competition intensity is weak and the privacy cost is high, user surplus is higher with quality discrimination than without; when the competition intensity is strong, user surplus is always higher with quality discrimination.
Figure 6 presents our numerical results for Proposition 4 and shows how the user surplus is affected by the privacy cost and the competition intensity.
Figure 6a depicts that, when competition intensity is weak, user surplus decreases first and then increases with the privacy cost under quality discrimination, while the user surplus continues to decrease without quality discrimination.
Figure 6b depicts that, when competition intensity is strong, user surplus decreases with the privacy cost under both scenarios, but the user surplus under quality discrimination is always higher. Proposition 4 shows that, when competition intensity is weak and privacy cost is high, user surplus increases as users’ privacy cost grows under quality discrimination, and is higher than in the scenario without quality discrimination. Moreover, when the competition intensity is strong, users are always better off with quality discrimination.
This is because, when the competition intensity is weak, as users’ privacy cost increases, platforms will reward old users with higher quality and attract new users with lower prices. This benefits users more, so the user surplus increases with users’ privacy cost.
On the other hand, when the competition intensity is strong, the products of the two platforms are highly differentiated, and users’ first-period choices indicate that the original platform’s products better match their preferences. Users are therefore reluctant to switch to a new platform in the second period; however, as users’ privacy cost increases, staying with the original platform makes them worse off, so user surplus decreases as users’ privacy cost increases. Furthermore, with quality discrimination, platforms also set lower prices for new users and higher quality for old users. Therefore, in this case, users are always better off with quality discrimination.
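The stay-or-switch trade-off described above can be sketched numerically. This is a stylized illustration under assumed parameters — a switching cost s, a privacy cost c borne when staying with the platform that holds the user’s data, and a quality premium q for old users — all hypothetical and not the paper’s exact utility function:

```python
# Stylized second-period decision of an old user of platform A located
# at x on a unit Hotelling line: stay with A (pay the old-user price,
# incur privacy cost c since A holds the user's data, enjoy quality
# premium q) or switch to B (pay the new-user price plus switching cost s).

def stays_with_a(x, p_old, p_new, t, c, s, q):
    """True if the old user's utility from staying weakly exceeds switching."""
    u_stay   = q - p_old - t * x - c        # loyalty quality, privacy cost
    u_switch = -p_new - t * (1 - x) - s     # poaching price, switching cost
    return u_stay >= u_switch

# With a quality reward and a moderate privacy cost, a nearby user stays...
print(stays_with_a(x=0.3, p_old=1.0, p_new=0.8, t=0.5, c=0.1, s=0.2, q=0.3))
# ...but a high privacy cost can push the same user to switch.
print(stays_with_a(x=0.3, p_old=1.0, p_new=0.8, t=0.5, c=0.8, s=0.2, q=0.3))
```

The sketch captures the comparative statics in the text: raising c lowers the utility of staying only, so a sufficiently high privacy cost erodes the retention effect of the quality premium q.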
Proposition 5. When the competition intensity is weak, social welfare is higher with quality discrimination than without; when the competition intensity is strong, social welfare is higher without quality discrimination.
Figure 7 presents our numerical results for Proposition 5 and shows how the social welfare is affected by the privacy cost and the competition intensity.
Figure 7a demonstrates that, when the competition intensity is weak, the social welfare under quality discrimination first declines and then increases with the privacy cost, while the social welfare without quality discrimination shows a slightly upward trend. However, social welfare is always higher under quality discrimination.
Figure 7b depicts that, when competition intensity is strong, social welfare decreases with the privacy cost, and the social welfare is greater without quality discrimination. Proposition 5 shows that, when competition intensity is weak and there is quality discrimination, social welfare rises as users’ privacy cost increases. However, when the competition intensity is strong, whether there is quality discrimination or not, social welfare always declines as users’ privacy cost increases.
This is because, when the competition intensity is weak, both the platforms’ total profit and the user surplus increase as users’ privacy cost grows, so social welfare also rises. However, when the competition intensity is strong, platforms’ total profit is higher without quality discrimination, while user surplus is lower; comparing the two scenarios shows that the gap in total platform profit is larger than the gap in user surplus. Therefore, when the competition intensity is strong, social welfare without quality discrimination is higher than that with quality discrimination.
6. Conclusions
This paper establishes a two-period dynamic competition game model. In the first period, each platform sets the same price for all users, while, in the second period, it sets differentiated prices and different levels of product quality for new and old users. First, the user’s privacy cost is incorporated into the user’s utility function, and the platforms’ demand functions in the two periods are derived based on the Hotelling model. The optimal price and quality strategies of the competing platforms are then studied with and without quality discrimination. In addition, the impact of market competition intensity and user privacy cost on industry profits, user surplus, and social welfare is also investigated.
The research shows that, without quality discrimination, platforms reward new users with low prices when users’ privacy cost is low. With quality discrimination, however, there are two situations in which platforms set lower prices for new users and higher quality for old users: when users’ privacy cost is high and competition intensity is weak, or when users’ privacy cost is low and competition intensity is strong. In these two scenarios, platforms reward new and old users along the price and quality dimensions, respectively, in equilibrium. Furthermore, when the market competition intensity is weak, as users’ privacy cost increases, platforms implementing quality discrimination achieve higher profits than those that do not.
Regarding user surplus and social welfare, with weak competition intensity and quality discrimination, both increase as users’ privacy cost rises. When the competition intensity is strong, whether there is quality discrimination or not, user surplus and social welfare always decrease as users’ privacy cost increases.
Our analysis provides deep theoretical insights into the effects of quality discrimination on platforms’ profits, user surplus, and social welfare. Extant research on BBPD acknowledges that BBPD can raise consumer privacy concerns but offers few solutions, and our analysis makes a theoretical contribution in this respect. We show that, when the market competition intensity is weak, quality discrimination brings platforms more considerable profits as users’ privacy cost increases. As for user surplus and social welfare, when the competition intensity is weak and quality discrimination is in place, both increase with users’ privacy cost. However, when the competition intensity is strong, whether there is quality discrimination or not, user surplus and social welfare always decrease as users’ privacy cost increases. Part of our conclusion is similar to Li [
27]’s study to some extent. For example, we agree that platforms should reward new customers on the price dimension and old customers on the quality dimension in some situations. However, different from Li [
27]’s study, which focuses on the impact of quality discrimination on BBPD, we focus on how quality discrimination alleviates consumer privacy concerns; we therefore incorporate the privacy cost into the model and explore the conditions under which quality discrimination can ease these concerns.
The above conclusions also have important managerial implications for platforms seeking to enhance profits. With the development of modern technology, incidents of user privacy leakage emerge one after another, and users’ privacy concerns are growing. Therefore, how to effectively use big data while reducing users’ privacy concerns becomes extremely significant. Currently, privacy regulation is mainly relied on to alleviate consumers’ privacy concerns, but its effectiveness is controversial. This study provides a new way to address the problem: platforms can provide higher product quality to old users in exchange for their privacy. For example, platforms can innovate their products, designing offerings that better match the preferences of old users. Platforms can also provide users with a higher level of service quality, including more diversified and accurate content pushes or product recommendations, so as to alleviate privacy concerns to a certain extent.
Additionally, the conclusions of this paper provide some practical implications and explain Airbnb’s strategy in the Chinese market to a certain extent. Airbnb is an online platform on which registered users and third-party service providers can communicate and trade directly with each other. In China, Airbnb mainly targets Chinese users who want to travel abroad, while its competitors Tujia and Xiaozhu target domestic users, so the products of Airbnb and the other two platforms are quite different. These platforms collect a large amount of user data and release coupons to new users. According to Airbnb’s platform policy, the display position or ranking of products in search results may depend on many factors, including but not limited to the preferences of guests and homeowners, ratings, and convenience of booking. For old users, however, Airbnb shows more diversified, characteristic houses based on “embedded technology”. Furthermore, Airbnb provides not only accommodation but also a “home” experience. To satisfy users’ pursuit of an authentic and interactive accommodation experience, Airbnb has built a virtual platform community for old users, in which they can speak freely and share their housing demands and experiences. Airbnb also launched the “Story” function in 2017, allowing old users to share stories about their travels. These services greatly improve old users’ perceived product quality, thus alleviating privacy concerns to a certain extent.
This study still has some limitations, such as considering only users’ average privacy cost and not exploring more complex market competition among platforms. These aspects can be considered in future research. In addition, future research could also verify the conclusions of this study from an empirical perspective.