Article

DATLMedQA: A Data Augmentation and Transfer Learning Based Solution for Medical Question Answering

by Shuohua Zhou and Yanping Zhang
1 Department of Informatics, King’s College London, Strand, London WC2R 2LS, UK
2 Department of Computer Science, School of Engineering and Applied Science, Gonzaga University, Spokane, WA 99258, USA
* Author to whom correspondence should be addressed.
Academic Editor: Johann Eder
Appl. Sci. 2021, 11(23), 11251; https://doi.org/10.3390/app112311251
Received: 16 October 2021 / Revised: 19 November 2021 / Accepted: 20 November 2021 / Published: 26 November 2021
Abstract
The outbreak of COVID-19 has prompted an increased focus on self-care, and more and more people now hope to obtain disease knowledge from the Internet. In response to this demand, medical question answering and question generation have become an important part of natural language processing (NLP). However, samples of medical questions and answers are limited, and existing question generation systems cannot fully meet the needs of non-professionals asking medical questions. In this research, we propose a medically pretrained BERT model that uses GPT-2 for question augmentation and T5-Small for topic extraction, computes the cosine similarity between extracted topics, and applies XGBoost for prediction. With GPT-2 augmentation, the prediction accuracy of our model exceeds that of the state-of-the-art (SOTA) models. Our experimental results demonstrate the strong performance of our model on medical question answering and question generation tasks, and its potential to address other biomedical question answering challenges.
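This page does not include the authors' implementation. As a rough illustration of the pipeline the abstract outlines, the minimal Python sketch below chains the named components: GPT-2 for question augmentation, T5-Small for topic extraction (approximated here by its summarization task), cosine similarity between topics, and an XGBoost classifier for answer scoring. The model checkpoints (`gpt2`, `t5-small`), the TF-IDF topic representation, the toy candidate answers, and the training labels are all assumptions made for demonstration, not the paper's actual data, features, or code.

```python
# Illustrative sketch of a DATLMedQA-style pipeline (NOT the authors'
# released code). Requires: transformers, torch, scikit-learn, xgboost.
import numpy as np
import xgboost as xgb
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.metrics.pairwise import cosine_similarity
from transformers import pipeline

# 1) Question augmentation with GPT-2: sample continuations of a seed
#    question to create paraphrase-like variants (stand-in for the
#    paper's augmentation step).
generator = pipeline("text-generation", model="gpt2")
seed_question = "What are the early symptoms of diabetes?"
augmented = [o["generated_text"]
             for o in generator(seed_question, max_length=30,
                                num_return_sequences=3, do_sample=True)]

# 2) Topic extraction with T5-Small, approximated here by its
#    summarization task (T5 casts such tasks as text-to-text).
summarizer = pipeline("summarization", model="t5-small")

def extract_topic(text: str) -> str:
    return summarizer(text, max_length=12, min_length=3)[0]["summary_text"]

question_topic = extract_topic(seed_question)
candidate_answers = [  # toy stand-ins for a retrieved answer pool
    "Early diabetes often causes thirst, frequent urination, and fatigue.",
    "A sprained ankle should be rested, iced, compressed, and elevated.",
]
answer_topics = [extract_topic(a) for a in candidate_answers]

# 3) Cosine similarity between the question topic and each answer topic,
#    computed over TF-IDF vectors as a simple illustrative representation.
vec = TfidfVectorizer().fit([question_topic] + answer_topics)
q = vec.transform([question_topic])
sims = cosine_similarity(q, vec.transform(answer_topics))[0]

# 4) XGBoost prediction: a tiny classifier mapping similarity features
#    to a relevant/irrelevant label (toy labels for demonstration only).
X_train = np.array([[0.9], [0.8], [0.2], [0.1]])
y_train = np.array([1, 1, 0, 0])
clf = xgb.XGBClassifier(eval_metric="logloss").fit(X_train, y_train)
scores = clf.predict_proba(sims.reshape(-1, 1))[:, 1]
print(candidate_answers[int(np.argmax(scores))])
```

Note that the paper's pipeline rests on a BERT model pretrained on medical text to represent questions and answers; the TF-IDF vectors above are a deliberate simplification so the sketch stays self-contained.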
Keywords: BERT; GPT-2; XGBoost; T5-Small; medical question answering; transfer learning
MDPI and ACS Style

Zhou, S.; Zhang, Y. DATLMedQA: A Data Augmentation and Transfer Learning Based Solution for Medical Question Answering. Appl. Sci. 2021, 11, 11251. https://doi.org/10.3390/app112311251
