1.1. Theoretical Framework
1.2. Aims and Research Questions
- RQ1 What are the results of product attribute acquisition and useful review analysis?
- RQ2 Does the evaluation indicator system work for product evaluation, and how accurate is its priority assessment?
- RQ3 What are the results of the product improvement strategy analysis, i.e., which valuable improvement ideas can be refined?
2. Materials and Methods
- Useful review acquisition. We apply a web crawler to collect online reviews and preprocess them with Latent Dirichlet Allocation (LDA) to obtain product attributes and user emotions. Reviews that contain both attribute words and emotion words are considered useful reviews.
- Product evaluation indicator system establishment. With the product attributes and user emotions, we establish multi-dimensional indicators (e.g., users’ satisfaction and attention) to evaluate the priority of product attributes for improvement.
- Improvement strategy analysis. By selecting negative reviews for the target product attributes and applying text mining, we identify sources of user dissatisfaction and propose improvement strategies.
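The useful-review filter in the first step can be sketched as a simple lexicon intersection: a review is kept only if it mentions at least one attribute word (e.g., from the LDA topics) and at least one emotion word. The word sets and reviews below are illustrative placeholders, not the actual lexicons used in this study:

```python
# Illustrative attribute and emotion lexicons (placeholders, not the paper's lexicons).
ATTRIBUTE_WORDS = {"battery", "screen", "camera", "price"}
EMOTION_WORDS = {"great", "poor", "slow", "love", "terrible"}

def is_useful(review: str) -> bool:
    """A review is 'useful' if it contains both an attribute word and an emotion word."""
    tokens = set(review.lower().split())
    return bool(tokens & ATTRIBUTE_WORDS) and bool(tokens & EMOTION_WORDS)

reviews = [
    "The battery life is great",
    "Arrived on Tuesday",
    "terrible screen",
]
useful = [r for r in reviews if is_useful(r)]
```

In practice, the attribute vocabulary would come from the LDA topic-word distributions and the emotion vocabulary from a sentiment lexicon, with proper tokenization instead of whitespace splitting.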
2.1. Useful Review Acquisition
2.1.1. Product Attribute Acquisition
2.1.2. Useful Review Analysis
2.2. Product Evaluation Indicator System
- To measure users’ satisfaction (S_j) with the j-th product attribute, the average emotion value is calculated as Equation (2), where n_j denotes the number of reviews in the j-th product attribute group. A high S_j indicates that users are generally satisfied with the j-th product attribute.
- The second indicator for product evaluation is users’ attention to individual product attributes. If a review comments on a product attribute, the reviewer is considered to be concerned about that attribute. To measure user attention (A_j) on the j-th product attribute, we calculate the proportion of reviews in the corresponding product attribute group as Equation (3), where N is the total number of useful reviews. A high A_j indicates that the j-th product attribute is hotly discussed in user reviews and receives high user attention.
- For each product attribute, the manufacturer aims to make more users more satisfied, which is measured by maximizing S_j. However, because enterprise resources are limited, the optimal decision is the one that maximizes the benefits of the enterprise. It is therefore important to first select several high-priority product attributes for improvement. The priority (P_j) is calculated as the evaluation promotion space in Equation (4): lower satisfaction and higher user attention yield a larger evaluation promotion space.
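A minimal sketch of the three indicators, assuming each useful review carries an emotion value in [0, 1] and has been assigned to one attribute group (the sample data are illustrative). Here satisfaction is the group's average emotion value, attention is the group's share of all useful reviews, and priority is computed as (1 − S) × A, which is consistent with the S, A, P values reported in the case study tables:

```python
from collections import defaultdict

# Illustrative data: (attribute group, emotion value in [0, 1]) per useful review.
reviews = [
    ("battery", 0.2), ("battery", 0.4),
    ("screen", 0.9), ("screen", 0.8),
]

groups = defaultdict(list)
for attr, emotion in reviews:
    groups[attr].append(emotion)

N = len(reviews)  # total number of useful reviews
indicators = {}
for attr, emotions in groups.items():
    S = sum(emotions) / len(emotions)  # satisfaction: average emotion value (Eq. 2)
    A = len(emotions) / N              # attention: share of reviews in the group (Eq. 3)
    P = (1 - S) * A                    # priority: evaluation promotion space (Eq. 4)
    indicators[attr] = (round(S, 3), round(A, 3), round(P, 3))
```

With this formulation, a heavily discussed attribute with low satisfaction (low S, high A) ranks highest for improvement, matching the stated intuition.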
2.3. Improvement Strategy Analysis
- Obtain related reviews in the target product attribute groups.
- Select negative reviews, i.e., reviews with negative emotion values.
- Apply text mining methods to extract users’ opinions and then manually adjust the results.
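The steps above can be sketched as follows, assuming negative reviews are those with emotion values below zero and using simple word frequencies as a stand-in for the text mining step (the stopword list and reviews are illustrative):

```python
from collections import Counter

STOPWORDS = {"the", "is", "so", "and", "a"}

# Illustrative data: (emotion value, review text) for one attribute group.
negative_reviews = [
    (-0.6, "system crash and reboot"),
    (-0.3, "the system is so laggy"),
    (0.4, "screen is fine"),
]

# Step 2: keep reviews with negative emotion values.
selected = [text for value, text in negative_reviews if value < 0]

# Step 3: count frequent complaint words as candidate user opinions,
# to be manually adjusted afterwards.
opinions = Counter(
    word for text in selected for word in text.split() if word not in STOPWORDS
)
top = opinions.most_common(3)
```

The most frequent complaint terms then become candidate improvement targets for manual review.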
3. Case Study and Result Discussion
3.1. Data Collection and Preprocessing
3.2. Results for Useful Review Acquisition (RQ1)
3.3. Product Evaluation Indicator System Establishment and Validation (RQ2)
3.3.1. Principal Component Analysis
3.3.2. Multiple Linear Regression Analysis
3.4. Improvement Strategy Analysis for Smartphones (RQ3)
3.4.1. The Analysis of Evaluation Indicators for Overall Smartphones
3.4.2. The Analysis of Evaluation Indicators for Individual Smartphones
3.4.3. Improvement Idea Refining
- Enhance the user experience of the operating system: improve the stability, fluency, and compatibility of various App versions; optimize the speed and responsiveness of the operating system and accelerate booting; add features to improve the temperature control system; enhance the gaming experience by improving the efficiency and fluency of games; and expand domestic App Store servers to speed up App updating and downloading.
- Enhance customer service professionalism: improve the customer service attitude, accelerate the handling of user issues, simplify the customer service process, and enlarge the customer service staff. Improve delivery services: speed up product delivery and transport, and improve couriers’ service attitude.
- Enhance the personality of the smartphone: enrich product colors to provide a wealth of choices; improve smartphone styling design in line with current consumer aesthetics, and reduce or eliminate the notch screen; provide large-size models to give consumers a variety of choices. Improve product workmanship: improve the production quality of OEM factories and reduce smartphone defects.
4. Conclusions and Future Work
- Improve the automation of review analysis so that the method can be quickly migrated to other products.
- Consider the tradeoff among user opinions, innovation, and cost.
- Apply the improvement strategies to real production practice to validate the methods.
Institutional Review Board Statement
Informed Consent Statement
Data Availability Statement
Conflicts of Interest
- Eisenhardt, K.M.; Tabrizi, B.N. Accelerating Adaptive Processes: Product Innovation in the Global Computer Industry. Adm. Sci. Q. 1995, 40, 84–110.
- Kessler, E.H.; Chakrabarti, A.K. Speeding up the pace of new product development. J. Prod. Innov. Manag. 1999, 16, 231–247.
- Spanos, Y.E.; Vonortas, N.S.; Voudouris, I. Antecedents of innovation impacts in publicly funded collaborative R&D projects. Technovation 2015, 36, 53–64.
- Tao, X.; Xianqiang, Z. Iterative innovation design methods of internet products in the era of big data. Packag. Eng. 2016, 37, 1–5.
- Dou, R.; Zhang, Y.; Nan, G. Customer-oriented product collaborative customization based on design iteration for tablet personal computer configuration. Comput. Ind. Eng. 2016, 99, 474–486.
- Wang, C.; Zhao, W.; Wang, H.J.; Chen, L. Multidimensional customer requirements acquisition based on ontology. Comput. Integr. Manuf. Syst. 2016, 22, 908–916.
- Lv, Z.; Song, H.; Basanta-Val, P.; Steed, A.; Jo, M. Next-generation big data analytics: State of the art, challenges, and future research topics. IEEE Trans. Ind. Inform. 2017, 13, 1891–1899.
- Erevelles, S.; Fukawa, N.; Swayne, L. Big Data consumer analytics and the transformation of marketing. J. Bus. Res. 2016, 69, 897–904.
- Gharajeh, M.S. Biological Big Data Analytics. Adv. Comput. 2017, 1–35.
- Roh, S. Big Data Analysis of Public Acceptance of Nuclear Power in Korea. Nucl. Eng. Technol. 2017, 49, 850–854.
- Jin, J.; Liu, Y.; Ji, P.; Kwong, C.K. Review on recent advances in information mining from big consumer opinion data for product design. J. Comput. Inf. Sci. Eng. 2019, 19, 010801.
- Hu, N.; Zhang, J.; Pavlou, P.A. Overcoming the J-shaped distribution of product reviews. Commun. ACM 2009, 52, 144–147.
- Korfiatis, N.; García-Bariocanal, E.; Sánchez-Alonso, S. Evaluating content quality and helpfulness of online product reviews: The interplay of review helpfulness vs. review content. Electron. Commer. Res. Appl. 2012, 11, 205–217.
- Lim, S.; Henriksson, A.; Zdravkovic, J. Data-Driven Requirements Elicitation: A Systematic Literature Review. SN Comput. Sci. 2021, 2, 1–35.
- Baizhang, M.; Zhijun, Y. Product features extraction of online reviews based on LDA model. Comput. Integr. Manuf. Syst. 2014, 20, 96–103.
- Lycett, M. ‘Datafication’: Making sense of (big) data in a complex world. Eur. J. Inf. Syst. 2013, 22, 381–386.
- Banerjee, S.; Bhattacharyya, S.; Bose, I. Whose online reviews to trust? Understanding reviewer trustworthiness and its impact on business. Decis. Support Syst. 2017, 96, 17–26.
- Karimi, S.; Wang, F. Online review helpfulness: Impact of reviewer profile image. Decis. Support Syst. 2017, 96, 39–48.
- Forman, C.; Ghose, A.; Wiesenfeld, B. Examining the relationship between reviews and sales: The role of reviewer identity disclosure in electronic markets. Inf. Syst. Res. 2008, 19, 291–313.
- Dave, K.; Lawrence, S.; Pennock, D.M. Mining the peanut gallery: Opinion extraction and semantic classification of product reviews. In Proceedings of the 12th International Conference on World Wide Web (WWW 2003), Budapest, Hungary, 20–24 May 2003; pp. 519–528.
- Zhang, W.; Xu, H.; Wan, W. Weakness Finder: Find product weakness from Chinese reviews by using aspects based sentiment analysis. Expert Syst. Appl. 2012, 39, 10283–10291.
- Novgorodov, S.; Guy, I.; Elad, G.; Radinsky, K. Generating product descriptions from user reviews. In Proceedings of the World Wide Web Conference, San Francisco, CA, USA, 13–17 May 2019; pp. 1354–1364.
- Jing, R.; Yu, Y.; Lin, Z. How Service-Related Factors Affect the Survival of B2T Providers: A Sentiment Analysis Approach. J. Organ. Comput. Electron. Commer. 2015, 25, 316–336.
- Kang, D.; Park, Y. Review-based measurement of customer satisfaction in mobile service: Sentiment analysis and VIKOR approach. Expert Syst. Appl. 2014, 41, 1041–1050.
- Yang, L.; Li, Y.; Wang, J.; Sherratt, R.S. Sentiment analysis for E-commerce product reviews in Chinese based on sentiment lexicon and deep learning. IEEE Access 2020, 8, 23522–23530.
- Blei, D.M.; Ng, A.Y.; Jordan, M.I. Latent Dirichlet allocation. J. Mach. Learn. Res. 2003, 3, 993–1022.
- Juan, C. A Method of Adaptively Selecting Best LDA Model Based on Density. Chin. J. Comput. 2008, 31, 1781–1787.
- Dong, Z.; Dong, Q. HowNet and the Computation of Meaning (with CD-ROM); World Scientific: Beijing, China, 2006.
- TextBlob Documentation. Available online: https://buildmedia.readthedocs.org/media/pdf/textblob/latest/textblob.pdf (accessed on 13 May 2021).
- iPhone—Apple. Available online: https://www.apple.com/iphone/ (accessed on 20 March 2021).
- HUAWEI Phones. Available online: https://consumer.huawei.com/cn/phones/?ic_medium=hwdc&ic_source=corp_header_consumer (accessed on 20 March 2021).
- Wold, S.; Esbensen, K.; Geladi, P. Principal component analysis. Chemom. Intell. Lab. Syst. 1987, 2, 37–52.
- Benesty, J.; Chen, J.; Huang, Y.; Cohen, I. Pearson correlation coefficient. In Noise Reduction in Speech Processing; Springer: Berlin/Heidelberg, Germany, 2009; pp. 1–4.
- Moore, D.S.; Kirkland, S. The Basic Practice of Statistics; WH Freeman: New York, NY, USA, 2007; Volume 2.
- Montaño, J.; Palmer, A.; Sesé, A.; Cajal, B. Using the R-MAPE index as a resistant measure of forecast accuracy. Psicothema 2013, 25, 500–506.
- Ho-Dac, N.N. The value of online user generated content in product development. J. Bus. Res. 2020, 112, 136–146.
- Fan, Z.P.; Li, G.M.; Liu, Y. Processes and methods of information fusion for ranking products based on online reviews: An overview. Inf. Fusion 2020, 60, 87–97.
- Wang, J.N.; Du, J.; Chiu, Y.L. Can online user reviews be more helpful? Evaluating and improving ranking approaches. Inf. Manag. 2020, 57, 103281.
- Zhao, Y.; Xu, X.; Wang, M. Predicting overall customer satisfaction: Big data evidence from hotel online textual reviews. Int. J. Hosp. Manag. 2019, 76, 111–121.
- Zhang, H.; Rao, H.; Feng, J. Product innovation based on online review data mining: A case study of Huawei phones. Electron. Commer. Res. 2018, 18, 3–22.
- Ibrahim, N.F.; Wang, X. A text analytics approach for online retailing service improvement: Evidence from Twitter. Decis. Support Syst. 2019, 121, 37–50.
| Product Attribute | Description |
|---|---|
| packaging list | List of items included in the smartphone packaging. |
| size & weight | The length, width, height, and weight of the smartphone. |
| processor | The CPU and GPU of the smartphone. |
| storage | Memory capacity, read/write speed, etc. |
| battery | Capacity and endurance of the battery. |
| service | Pre-sale consultation, after-sales service, etc. |
| sensor | Sensors, such as gyroscopes, light sensors, barometric pressure sensors, humidity sensors, etc. |
| price | The price of the smartphone. |
| durability | Product durability and product quality. |
| shoot | Number of cameras, image accuracy, aperture size, image stabilization performance, etc. |
| screen | Screen size, resolution, color gamut, and so on. |
| data connection | The network performance of the smartphone. |
| appearance | The shape, color, and material of the smartphone. |
| operating system | iOS or Android, and some applications and functions in the system. |
| audio & video | Audio and video playback quality. |
| Product Attribute | #Attribute Words | Product Attribute | #Attribute Words |
|---|---|---|---|
| size & weight | 12 | shoot | 24 |
| sensor | 10 | audio & video | 22 |
| Product Attribute | Pearson Correlation Coefficient | p-Value |
|---|---|---|
| size & weight | 0.081 | 0.539 |
| battery | 0.542 (>0.3) | 0.000 (<0.05) |
| service | 0.401 (>0.3) | 0.001 (<0.05) |
| price | 0.344 (>0.3) | 0.007 (<0.05) |
| durability | 0.329 (>0.3) | 0.010 (<0.05) |
| screen | 0.397 (>0.3) | 0.002 (<0.05) |
| data connection | 0.412 (>0.3) | 0.001 (<0.05) |
| operating system | 0.434 (>0.3) | 0.001 (<0.05) |
| audio & video | 0.254 | 0.051 |
| No. PC | Eigenvalue | Variance Contribution (%) | Cumulative Variance Contribution (%) |
|---|---|---|---|
| No. | True Bad Review Rate | Predicted Bad Review Rate | Error (%) |
|---|---|---|---|
| Product Attribute | S (Satisfaction) | A (Attention) | P (Priority) |
|---|---|---|---|
| size & weight | 0.682 | 0.047 | 0.015 |
| audio & video | 0.421 | 0.072 | 0.041 |
| Product Attribute | S (Satisfaction) | A (Attention) | P (Priority) |
|---|---|---|---|
| size & weight | 0.637 | 0.039 | 0.014 |
| audio & video | 0.057 | 0.047 | 0.044 |
| Product Attribute | Aspect | Negative Opinions | Count |
|---|---|---|---|
| operating system | system | crash, unstable, poor compatibility, reboot, bug, frozen, etc. | 207 |
| | response | laggy, slow, insensitive, etc. | 48 |
| | game | laggy, overheating, bad experience, etc. | 36 |
| | function | few, poor, etc. | 31 |
| | App | slow download speed, poor compatibility, etc. | 23 |
| service | customer service | poor attitude, slight over, many excuses, passive, etc. | 155 |
| | service | slow, bad, barely satisfactory, etc. | 62 |
| | delivery | slow, wordless, poor packaging, etc. | 37 |
| | after-sale | poor service, cumbersome, etc. | 25 |
| | attitude | poor, scurviness, etc. | 18 |
| appearance | notch screen | does not look good, etc. | 83 |
| | shape | ugly, small, not good-looking, etc. | 42 |
| | color | dull, less, no personality, etc. | 25 |
| | design | ugly, unreasonable, etc. | 17 |
© 2021 by the authors. Licensee MDPI, Basel, Switzerland. This article is an open access article distributed under the terms and conditions of the Creative Commons Attribution (CC BY) license (https://creativecommons.org/licenses/by/4.0/).