Article

Jokes or Gibberish? Humor Retention in Translation with Neural Machine Translation vs. Large Language Model

by Mondheera Pituxcoosuvarn * and Yohei Murakami
College of Information Science and Engineering, Ritsumeikan University, Ibaraki, Osaka 567-8570, Japan
* Author to whom correspondence should be addressed.
Digital 2025, 5(4), 49; https://doi.org/10.3390/digital5040049
Submission received: 5 August 2025 / Revised: 17 September 2025 / Accepted: 29 September 2025 / Published: 2 October 2025

Abstract

Humor translation remains a significant challenge due to its reliance on wordplay, cultural context, and nuance. This study compares a Neural Machine Translation (NMT) system (hereafter referred to as MT) with a Large Language Model (GPT-based translation using three different prompts) for translating jokes from English to Thai. Results show that GPT-based models significantly outperform MT in humor retention, with the explanation-enhanced prompt (GPT-Ex) achieving the highest joke preservation rate (62.94%) compared to 50.12% in MT. Additionally, humor loss was more frequent in MT, while GPT-based models, particularly GPT-Ex, better retained jokes. A McNemar test confirmed significant differences in annotation distributions across models. Beyond evaluation, we propose using GPT-based models with optimized prompt engineering to enhance humor translation. Our refined prompts improved joke retention by guiding the model’s understanding of humor and cultural nuances.
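The abstract reports a McNemar test to compare how often paired translation systems retain or lose the same jokes. As a minimal sketch of that kind of analysis (not the paper's actual code or data), the exact binomial form of McNemar's test can be computed from the two discordant counts alone; the counts below are hypothetical, chosen only for illustration:

```python
from math import comb

def mcnemar_exact(b: int, c: int) -> float:
    """Exact (binomial) two-sided McNemar test.

    b: jokes retained by system A but lost by system B;
    c: the reverse. Only these discordant pairs matter.
    Returns the two-sided p-value.
    """
    n = b + c
    if n == 0:
        return 1.0  # no discordant pairs: no evidence of a difference
    k = min(b, c)
    # Two-sided binomial tail probability under H0: P(discordance) = 0.5
    p = 2 * sum(comb(n, i) for i in range(k + 1)) / 2 ** n
    return min(p, 1.0)

# Hypothetical annotation counts: 40 jokes retained only by the
# GPT-based system vs. 12 retained only by MT.
p_value = mcnemar_exact(40, 12)
```

With a heavily lopsided split such as 40 vs. 12, the p-value falls well below 0.05, which is the pattern of result the abstract describes; with balanced discordant counts the test finds no difference.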
Keywords: humor translation; neural machine translation; large language models application
