Open Access | Feature Paper | Article

A General Framework for Fair Regression

1 Department of Engineering Science, University of Oxford, Oxford OX1 3PJ, UK
2 Faculty of Business and Law, Northampton University, Northampton NN1 5PH, UK
* Author to whom correspondence should be addressed.
Entropy 2019, 21(8), 741; https://doi.org/10.3390/e21080741
Received: 29 April 2019 / Revised: 17 July 2019 / Accepted: 22 July 2019 / Published: 29 July 2019
(This article belongs to the Special Issue Entropy Based Inference and Optimization in Machine Learning)
PDF [1607 KB, uploaded 29 July 2019]

Abstract

Fairness, through its many forms and definitions, has become an important issue facing the machine learning community. In this work, we consider how to incorporate group fairness constraints into kernel regression methods, applicable to Gaussian processes, support vector machines, neural network regression and decision tree regression. Further, we focus on examining the effect of incorporating these constraints in decision tree regression, with direct applications to random forests and boosted trees, among other widely used inference techniques. We show that the order of complexity of memory and computation is preserved for such models, and we tightly bound the expected perturbations to the model in terms of the number of leaves of the trees. Importantly, the approach works on trained models, and hence can easily be applied to models currently in use; group labels are required only on the training data.
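The idea of adding a group fairness constraint to kernel regression can be illustrated with a minimal sketch. This is not the paper's actual formulation: the function `fair_kernel_ridge`, the RBF kernel choice, and the penalty weight `mu` are illustrative assumptions. The sketch adds a quadratic penalty on the difference between the mean predictions of two groups to ordinary kernel ridge regression, which still admits a closed-form solution.

```python
import numpy as np

def rbf_kernel(X, Z, gamma=1.0):
    # Standard RBF (Gaussian) kernel between two sets of points.
    d2 = ((X[:, None, :] - Z[None, :, :]) ** 2).sum(-1)
    return np.exp(-gamma * d2)

def fair_kernel_ridge(X, y, groups, lam=1e-2, mu=10.0, gamma=1.0):
    """Kernel ridge regression with a penalty mu * (d^T K a)^2 that
    pushes the mean predictions of the two groups together."""
    K = rbf_kernel(X, X, gamma)
    n = len(y)
    # d^T (K a) is the difference between group-mean predictions.
    d = np.where(groups == 0, 1.0 / (groups == 0).sum(),
                 -1.0 / (groups == 1).sum())
    # Setting the gradient of ||K a - y||^2 + lam a^T K a + mu (d^T K a)^2
    # to zero (assuming K is positive definite) gives the linear system:
    A = K + lam * np.eye(n) + mu * np.outer(d, d) @ K
    alpha = np.linalg.solve(A, y)
    return alpha, K

# usage: a synthetic target with a deliberate group-dependent shift
rng = np.random.default_rng(0)
X = rng.normal(size=(60, 2))
groups = (rng.random(60) < 0.5).astype(int)
y = X[:, 0] + 2.0 * groups + 0.1 * rng.normal(size=60)
alpha, K = fair_kernel_ridge(X, y, groups, mu=100.0)
f = K @ alpha
gap = abs(f[groups == 0].mean() - f[groups == 1].mean())
```

As `mu` grows, the gap between group-mean predictions shrinks toward zero; with `mu = 0` the solver reduces to plain kernel ridge regression, so the effect of the constraint can be measured directly by comparing the two fits.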
Keywords: machine learning; algorithmic fairness; kernel methods; constrained learning; Gaussian process; decision tree; neural network
This is an open access article distributed under the Creative Commons Attribution License (CC BY 4.0), which permits unrestricted use, distribution, and reproduction in any medium, provided the original work is properly cited.

Cite This Article

MDPI and ACS Style

Fitzsimons, J.; Al Ali, A.; Osborne, M.; Roberts, S. A General Framework for Fair Regression. Entropy 2019, 21, 741.


Note that from the first issue of 2016, MDPI journals use article numbers instead of page numbers.

Entropy EISSN 1099-4300, published by MDPI AG, Basel, Switzerland.