Article

Feature Selection Using Nearest Neighbor Gaussian Processes

by
Konstantin Posch
1,
Maximilian Arbeiter
1,2,
Christian Truden
3,*,
Martin Pleschberger
4 and
Jürgen Pilz
1,*
1
Department of Statistics, University of Klagenfurt, 9020 Klagenfurt, Austria
2
Infineon Technologies Austria AG, 9500 Villach, Austria
3
Department of Economics, Analytics and Operations Research, University of Klagenfurt, 9020 Klagenfurt, Austria
4
KAI Kompetenzzentrum Automobil- und Industrieelektronik GmbH, 9524 Villach, Austria
*
Authors to whom correspondence should be addressed.
Mathematics 2026, 14(3), 476; https://doi.org/10.3390/math14030476
Submission received: 23 November 2025 / Revised: 26 January 2026 / Accepted: 28 January 2026 / Published: 29 January 2026

Abstract

We introduce a novel Bayesian approach for feature (variable) selection using Gaussian process regression, which is crucial for enhancing interpretability and model regularization. Our method employs nearest neighbor Gaussian processes as scalable approximations to classical Gaussian processes. Feature selection is performed by conditioning the process mean and covariance function on a random set representing the indices of relevant variables. A priori beliefs regarding this set control the feature selection, while reference priors are assigned to the remaining model parameters, ensuring numerical robustness in the process covariance matrix. For model inference, we propose a Metropolis-within-Gibbs algorithm. The effectiveness of the proposed feature selection approach is demonstrated through evaluation on simulated data, a computer experiment approximation, and two real-world data sets.
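The abstract describes sampling a random set of relevant feature indices with a Metropolis-within-Gibbs scheme. As an illustrative sketch only (not the authors' implementation, which uses nearest neighbor Gaussian process approximations and reference priors), the following toy example runs a Metropolis-within-Gibbs sweep over binary inclusion indicators for an exact GP regression model; the kernel, prior inclusion probability, and all hyperparameter values are assumptions chosen for the demonstration:

```python
import numpy as np

def gp_log_marginal(X, y, active, noise=0.1, length=1.0):
    """Log marginal likelihood of a zero-mean GP with an RBF kernel
    restricted to the currently active feature subset."""
    n = len(y)
    if active.any():
        Xa = X[:, active]
        d2 = ((Xa[:, None, :] - Xa[None, :, :]) ** 2).sum(-1)
        K = np.exp(-0.5 * d2 / length**2)
    else:
        K = np.zeros((n, n))  # no features: pure-noise model
    K = K + noise**2 * np.eye(n)
    L = np.linalg.cholesky(K)
    alpha = np.linalg.solve(L.T, np.linalg.solve(L, y))
    return -0.5 * y @ alpha - np.log(np.diag(L)).sum()

def mwg_feature_selection(X, y, n_iter=200, prior_incl=0.2, rng=None):
    """Metropolis-within-Gibbs: in each sweep, propose flipping every
    inclusion indicator in turn and accept with the MH probability."""
    rng = np.random.default_rng(rng)
    p = X.shape[1]
    gamma = rng.random(p) < prior_incl  # random initial subset

    def logpost(g):  # marginal likelihood + independent Bernoulli prior
        return (gp_log_marginal(X, y, g)
                + g.sum() * np.log(prior_incl)
                + (p - g.sum()) * np.log(1 - prior_incl))

    lp = logpost(gamma)
    counts = np.zeros(p)
    for _ in range(n_iter):
        for j in range(p):
            prop = gamma.copy()
            prop[j] = ~prop[j]
            lp_prop = logpost(prop)
            if np.log(rng.random()) < lp_prop - lp:  # MH accept/reject
                gamma, lp = prop, lp_prop
        counts += gamma
    return counts / n_iter  # posterior inclusion frequencies

# Toy data: the response depends only on the first of four features.
rng = np.random.default_rng(0)
X = rng.normal(size=(40, 4))
y = np.sin(2 * X[:, 0]) + 0.1 * rng.normal(size=40)
freq = mwg_feature_selection(X, y, rng=1)
```

On such data, the inclusion frequency of the relevant feature should dominate those of the irrelevant ones. The paper's actual method replaces the exact GP marginal with a scalable nearest neighbor approximation and places reference priors on the remaining parameters.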
Keywords: feature selection; dimensional reduction; hierarchical Bayes; nearest neighbor Gaussian process; model uncertainty; Metropolis–Hastings algorithm; variable selection

Share and Cite

MDPI and ACS Style

Posch, K.; Arbeiter, M.; Truden, C.; Pleschberger, M.; Pilz, J. Feature Selection Using Nearest Neighbor Gaussian Processes. Mathematics 2026, 14, 476. https://doi.org/10.3390/math14030476

AMA Style

Posch K, Arbeiter M, Truden C, Pleschberger M, Pilz J. Feature Selection Using Nearest Neighbor Gaussian Processes. Mathematics. 2026; 14(3):476. https://doi.org/10.3390/math14030476

Chicago/Turabian Style

Posch, Konstantin, Maximilian Arbeiter, Christian Truden, Martin Pleschberger, and Jürgen Pilz. 2026. "Feature Selection Using Nearest Neighbor Gaussian Processes" Mathematics 14, no. 3: 476. https://doi.org/10.3390/math14030476

APA Style

Posch, K., Arbeiter, M., Truden, C., Pleschberger, M., & Pilz, J. (2026). Feature Selection Using Nearest Neighbor Gaussian Processes. Mathematics, 14(3), 476. https://doi.org/10.3390/math14030476

Note that from the first issue of 2016, this journal uses article numbers instead of page numbers.
