Abstract
This study proposes a unified stochastic framework for approximating and computing the gradient of any smooth function evaluated at non-independent variables, using -spherical distributions on with . The upper bounds on the bias of the gradient surrogates do not suffer from the curse of dimensionality for any . Additionally, the mean squared errors (MSEs) of the gradient estimators are bounded by for any , and by when , with N the sample size and some constants. Taking yields dimension-free upper bounds on the MSEs. In the case where , the upper bound is attained with a constant. These results lead to dimension-free MSEs for the proposed estimators, which reduce to estimators of the traditional gradient when the variables are independent. Numerical comparisons show the efficiency of the proposed approach.
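The abstract does not reproduce the estimator's formula (the inline mathematics was lost in extraction), but the general idea of a stochastic gradient surrogate built from spherical sampling can be illustrated with a generic central-difference smoothing estimator using directions drawn uniformly on the unit sphere. This is a sketch of the well-known randomized technique that the paper's -spherical construction generalizes, not the paper's own estimator; the function `spherical_gradient_estimate`, its sample size `n_samples`, and the step `h` are illustrative choices.

```python
import numpy as np

def spherical_gradient_estimate(f, x, n_samples=40000, h=1e-3, rng=None):
    """Randomized gradient estimate of f at x.

    Averages central differences of f along random directions drawn
    uniformly on the unit sphere; the factor d makes the estimator
    unbiased for linear functions (generic smoothing estimator,
    illustrative of the spherical-sampling idea only).
    """
    rng = np.random.default_rng() if rng is None else rng
    d = x.size
    # Uniform directions on the unit sphere: normalized Gaussian draws.
    u = rng.standard_normal((n_samples, d))
    u /= np.linalg.norm(u, axis=1, keepdims=True)
    # Central finite differences of f along each random direction.
    diffs = np.array([(f(x + h * ui) - f(x - h * ui)) / (2 * h) for ui in u])
    # Monte Carlo average of d * (directional derivative) * direction.
    return d * (diffs[:, None] * u).mean(axis=0)

# Illustrative check on a smooth function with a known gradient:
# f(z) = ||z||^2 has gradient 2z.
rng = np.random.default_rng(0)
x = np.array([1.0, -2.0, 0.5])
g = spherical_gradient_estimate(lambda z: z @ z, x, n_samples=50000, rng=rng)
```

For this quadratic test function the central difference is exact in `h`, so the only error is Monte Carlo noise, which shrinks at the usual rate in the sample size.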