Higher-order Neyman orthogonality: a general way to reduce bias from nuisance parameters in moment-condition models
The paper shows how to build estimating equations that are insensitive, up to a chosen order, to errors in nuisance-parameter estimates. In plain terms, the authors give a recipe for changing the equations you solve so that small mistakes in parts of the model that you are not primarily interested in have a much smaller effect on the final estimate. This can reduce bias and make inference more reliable when those nuisance parts are difficult to estimate accurately.
The authors work in the familiar moment-condition framework used across econometrics. A researcher observes independent data and has two sets of estimating equations: one set identifies the nuisance parameters and another identifies the parameter of interest. The paper constructs a replacement for the target equations whose expected derivatives with respect to the nuisance parameters vanish up to order q. That vanishing of derivatives is what the authors call q-th order Neyman orthogonality. The new formula is given in closed form; it uses independent copies of the data and a matrix that inverts the Jacobian (the matrix of derivatives) of the nuisance equations.
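In standard notation (the symbols below are illustrative, not necessarily the paper's exact ones), with target moment function g, parameter of interest θ with true value θ0, and nuisance parameter η with true value η0, the q-th order orthogonality condition can be written as a directional-derivative requirement:

```latex
% q-th order Neyman orthogonality: expected directional derivatives of the
% moment function in any nuisance direction vanish up to order q.
\frac{\partial^{k}}{\partial r^{k}}
  \,\mathbb{E}\!\left[\, g\bigl(W;\,\theta_{0},\,\eta_{0} + r(\eta - \eta_{0})\bigr) \right]
  \Big|_{r=0} = 0,
\qquad k = 1, \dots, q,
```

for all nuisance values η in a neighborhood of η0. With q = 1 this reduces to the usual (first-order) Neyman orthogonality condition familiar from debiased/double machine learning.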
Why does higher-order orthogonality help? If the estimating equations are orthogonal to order q, the estimator is less sensitive to errors in the nuisance estimates, which relaxes the rate at which those estimates must converge for valid inference. For example, first-order orthogonality typically requires the nuisance estimates to converge faster than n^(−1/4), while q-th order orthogonality relaxes that threshold to roughly n^(−1/(2(q+1))). The paper gives explicit, simple formulas when the nuisance moments are affine (linear in the nuisance), a case that covers linear regression and instrumental variables, and shows how to add correction terms in genuinely nonlinear settings.
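The rate requirement above is easy to tabulate. The helper below is a hypothetical illustration (not from the paper); it simply evaluates the exponent in the n^(−1/(2(q+1))) threshold for a few orders q:

```python
# Hypothetical illustration of the rate requirement quoted above:
# q-th order orthogonality tolerates nuisance estimates that converge
# at rate n^(-r) for any exponent r exceeding 1 / (2 * (q + 1)).

def required_rate_exponent(q: int) -> float:
    """Threshold convergence-rate exponent for order-q orthogonality."""
    return 1.0 / (2 * (q + 1))

if __name__ == "__main__":
    for q in (1, 2, 3):
        r = required_rate_exponent(q)
        print(f"q = {q}: nuisance rate must beat n^(-{r:.4g})")
```

For q = 1 this recovers the familiar n^(−1/4) threshold; q = 2 relaxes it to n^(−1/6), and so on, so each extra order of orthogonality tolerates slower-converging nuisance estimators.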