Natural Language Processing: Emerging Neural Approaches and Applications

Edited by
March 2022
544 pages
  • ISBN 978-3-0365-2271-5 (Hardback)
  • ISBN 978-3-0365-2272-2 (PDF)

This book is a reprint of the Special Issue Natural Language Processing: Emerging Neural Approaches and Applications that was published in


This Special Issue highlights the most recent research in the NLP field and discusses its open issues. It focuses on emerging approaches for language learning, understanding, production, and grounding, whether acquired interactively or autonomously from data in cognitive and neural systems, as well as on their potential and real-world applications in different domains.

License and Copyright
© 2022 by the authors; CC BY-NC-ND license
Keywords: tourism big data; text mining; NLP; deep learning; clinical named entity recognition; information extraction; multitask model; long short-term memory (LSTM); conditional random field; relation extraction; entity recognition; multi-turn chatbot; dialogue context encoding; WGAN-based response generation; BERT word embedding; text summary; reinforcement learning; FAQ classification; encoder-decoder neural network; multi-level word embeddings; BERT; bidirectional RNN; cloze test; Korean dataset; machine comprehension; neural language model; sentence completion; primary healthcare; chief complaint; virtual medical assistant; spoken natural language; disease diagnosis; medical specialist; protein–protein interactions; convolutional neural networks (CNN); bidirectional LSTM; dialogue management; user simulation; reward shaping; conversation knowledge; multi-agent reinforcement learning; language modeling; classification; error probability; error assessment; logic error; neural network; attention mechanism; programming education; neural architecture search; word ordering; Korean syntax; adversarial attack; adversarial example; sentiment classification; dual pointer network; context-to-entity attention; text classification; rule-based; word embedding; Doc2vec; paraphrase identification; encodings; R-GCNs; contextual features; sentence retrieval; TF-ISF; BM25; partial match; sequence similarity; word2vec; word embeddings; antonymy detection; polarity; text normalization; deep neural networks; causal encoder; question classification; multilingual; transfer learning; open information extraction; recurrent neural networks; bilingual translation; speech-to-text; LaTeX decompilation; word representation learning; sememes; structural information; sentiment analysis; zero-shot learning; news analysis; cross-lingual classification; multilingual transformers; knowledge base; commonsense; sememe prediction; attention model; ontologies; fixing ontologies; quick fix; quality metrics; online social networks; rumor detection; Cantonese; XGA model; delayed combination; CNN dictionary; named entity recognition; deep learning NER; bidirectional LSTM CRF; CoNLL; OntoNotes; toxic comments