Article

Feedback-Aware Inference for Iterative Multi-Sample Text Generation

by Andreea Dutulescu 1,2, Stefan Ruseti 1,2,*, Mihai Dascalu 1,2 and Danielle S. McNamara 3

1 Computer Science and Engineering Department, National University of Science and Technology POLITEHNICA Bucharest, 313 Splaiul Independentei, 060042 Bucharest, Romania
2 Academy of Romanian Scientists, Str. Ilfov 3, 050044 Bucharest, Romania
3 Learning Engineering Institute, Arizona State University, Tempe, AZ 85287, USA
* Author to whom correspondence should be addressed.
AI 2026, 7(5), 171; https://doi.org/10.3390/ai7050171
Submission received: 1 April 2026 / Revised: 8 May 2026 / Accepted: 12 May 2026 / Published: 15 May 2026

Abstract

Generating multiple text sequences and refining them through feedback is essential for improving output quality in many NLP tasks. While Large Language Models can leverage iterative feedback during inference, smaller models often lack this capability due to limited capacity and the absence of suitable training paradigms. In this paper, we propose a novel Feedback-Aware Inference approach that enables iterative sequence generation with integrated feedback signals. Our method allows models to generate multiple sequences, incorporate feedback from previous iterations, and refine their outputs accordingly. The approach adjusts dynamically to different quality metrics, making it adaptable to various contexts and objectives. We evaluate it on two distinct tasks, Answer Selection for Question Generation and Keyword Generation, demonstrating its generalizability and effectiveness. Results show that our method outperforms strong baselines, maintaining high performance across iterations and achieving superior results even with smaller, open-source models.
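The abstract describes a loop that samples several candidates, scores them against a quality metric, and feeds the best candidates back into the next round of generation. The sketch below illustrates that generic pattern only; the function names (`feedback_aware_generate`, `toy_generate`, `toy_score`), the number of samples and iterations, and the toy numeric task are all illustrative assumptions, not the paper's actual models, prompts, or metrics.

```python
import random


def feedback_aware_generate(generate, score, n_samples=4, n_iterations=3):
    """Iteratively sample candidates, score them with an arbitrary quality
    metric, and pass the top-scoring (candidate, score) pairs back to the
    generator as feedback for the next iteration."""
    feedback = []   # (candidate, score) pairs carried over from earlier rounds
    best = None     # best (candidate, score) seen across all iterations
    for _ in range(n_iterations):
        candidates = [generate(feedback) for _ in range(n_samples)]
        scored = sorted(((c, score(c)) for c in candidates),
                        key=lambda cs: cs[1], reverse=True)
        feedback = scored[:2]  # keep the top candidates as the feedback signal
        if best is None or scored[0][1] > best[1]:
            best = scored[0]
    return best


# Toy stand-ins for a generator and a quality metric (hypothetical example:
# candidates are integers, and quality is closeness to a target value).
random.seed(0)
TARGET = 10


def toy_generate(feedback):
    # Without feedback, sample freely; with feedback, perturb the best candidate.
    base = feedback[0][0] if feedback else random.randint(0, 20)
    return max(0, base + random.randint(-3, 3))


def toy_score(x):
    return -abs(x - TARGET)  # higher is better, 0 is perfect


best_value, best_score = feedback_aware_generate(toy_generate, toy_score)
```

Because the scorer is passed in as an argument, the same loop can be pointed at different quality metrics, which mirrors the adaptability the abstract claims for the method.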
Keywords: feedback-aware; multi-sample generation; Question Generation; keyword generation

Share and Cite

MDPI and ACS Style

Dutulescu, A.; Ruseti, S.; Dascalu, M.; McNamara, D.S. Feedback-Aware Inference for Iterative Multi-Sample Text Generation. AI 2026, 7, 171. https://doi.org/10.3390/ai7050171


