In a number of recent developments, second- and higher-order methods of statistical inference have been proposed for certain once-differentiable problems. We consider two specific examples. In the first problem, a high-order unbiased statistical expansion is developed for the estimation of a general functional of a high-dimensional mean vector. The formula systematically and explicitly solves the associated de-biasing problem to all orders. In particular, the method directly yields optimal convergence rates for estimating the absolute and fractional norms of a high-dimensional mean vector, as well as other non-smooth additive functionals, in the Gaussian sequence model and under low-moment conditions on the noise with independent and identically distributed observations. In the second problem, a second-order Stein formula, a special form of applying Gaussian integration by parts twice, corrects the bias of the Lasso and other convex regularized estimators in high-dimensional linear regression via an unbiased estimating equation. Combined with an associated central limit theorem, a special form of the Poincaré inequality applied to the once-differentiable function in Stein's formula, the de-biased estimator is proven to achieve asymptotic normality and efficiency for regular statistical inference of linear functionals of the regression coefficient vector. In a related problem, second-order Stein methods justify the use of a scaled Mallows' Cp to select an estimator along the Lasso solution path, achieving the performance of the oracle minimizer of the prediction loss within a regret of smaller order than the minimax convergence rate.
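For reference, the Stein identities alluded to above can be sketched in the univariate case, for $Z \sim N(0,1)$ and a sufficiently smooth $g$ with integrable derivatives (the results above use multivariate versions):

```latex
% Stein's identity (one Gaussian integration by parts):
\mathbb{E}\bigl[Z\, g(Z)\bigr] = \mathbb{E}\bigl[g'(Z)\bigr].
% Second-order Stein formula (integrating by parts twice):
\mathbb{E}\bigl[(Z^2 - 1)\, g(Z)\bigr] = \mathbb{E}\bigl[g''(Z)\bigr].
% Gaussian Poincaré inequality, bounding the fluctuation of g(Z):
\operatorname{Var}\bigl(g(Z)\bigr) \le \mathbb{E}\bigl[g'(Z)^2\bigr].
```

The first identity underlies the unbiased estimating equation, the second its bias correction, and the Poincaré inequality controls the variance term in the associated central limit theorem.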
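To make the idea of an unbiased expansion concrete in one dimension (this is an illustration of the general principle, not the paper's exact construction): for $X \sim N(\mu, 1)$, the probabilists' Hermite polynomials satisfy $\mathbb{E}[\mathrm{He}_k(X)] = \mu^k$, so a polynomial approximation of a functional $f(\mu)$ can be estimated term by term without bias. A minimal Monte Carlo check of the degree-two case:

```python
import numpy as np

rng = np.random.default_rng(2)
mu = 1.5
x = mu + rng.standard_normal(200_000)  # X ~ N(mu, 1), i.i.d. draws

# He_2(x) = x^2 - 1 is an unbiased estimator of mu^2 when X ~ N(mu, 1):
# E[X^2] = mu^2 + 1, so subtracting 1 removes the bias exactly.
est = np.mean(x ** 2 - 1)
```

Here `est` concentrates around `mu ** 2 = 2.25`; the naive plug-in `np.mean(x) ** 2` carries a bias of order $1/n$ that the Hermite correction removes at every order.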
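As a numerical illustration of the bias correction for the Lasso, here is a minimal sketch of a one-step de-biased Lasso, under the simplifying assumption of an i.i.d. standard Gaussian design (so the population design covariance is the identity and the correction matrix can be taken as the identity); the solver, penalty level, and dimensions are illustrative choices, not the paper's exact construction:

```python
import numpy as np

rng = np.random.default_rng(0)
n, p = 400, 100
X = rng.standard_normal((n, p))          # i.i.d. Gaussian design (assumption)
beta = np.zeros(p)
beta[:5] = 1.0                           # sparse ground truth
y = X @ beta + rng.standard_normal(n)

lam = np.sqrt(2 * np.log(p) / n)         # standard universal penalty level

def soft(v, t):
    """Soft-thresholding, the proximal map of the l1 penalty."""
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

# Solve the Lasso by ISTA (proximal gradient descent).
L = np.linalg.norm(X, 2) ** 2 / n        # Lipschitz constant of the gradient
b = np.zeros(p)
for _ in range(500):
    grad = X.T @ (X @ b - y) / n
    b = soft(b - grad / L, lam / L)

# One-step de-biasing: add back a correction built from the residual.
# With identity population covariance, the correction matrix is taken as I.
beta_db = b + X.T @ (y - X @ b) / n
```

Coordinate-wise, `beta_db` is approximately unbiased with fluctuations of order $n^{-1/2}$, which is what makes normal-based confidence intervals for linear functionals of the coefficient vector possible.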
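The Cp-based selection along the Lasso path can likewise be sketched under the same simplifying assumptions: compute the Lasso at a grid of penalty levels, use the number of nonzero coefficients as the degrees of freedom (the known degrees-of-freedom formula for the Lasso), and pick the penalty minimizing Mallows' Cp with known noise level. The grid and solver below are illustrative assumptions:

```python
import numpy as np

rng = np.random.default_rng(1)
n, p, sigma = 400, 100, 1.0
X = rng.standard_normal((n, p))
beta = np.zeros(p)
beta[:5] = 1.0
y = X @ beta + sigma * rng.standard_normal(n)

def soft(v, t):
    return np.sign(v) * np.maximum(np.abs(v) - t, 0.0)

def lasso_ista(X, y, lam, iters=500):
    """Lasso via proximal gradient descent (ISTA)."""
    n, p = X.shape
    L = np.linalg.norm(X, 2) ** 2 / n
    b = np.zeros(p)
    for _ in range(iters):
        grad = X.T @ (X @ b - y) / n
        b = soft(b - grad / L, lam / L)
    return b

lams = np.linspace(0.05, 0.5, 10)
cps = []
for lam in lams:
    b = lasso_ista(X, y, lam)
    rss = np.sum((y - X @ b) ** 2)
    df = np.count_nonzero(b)                # Lasso degrees of freedom = support size
    cps.append(rss + 2 * sigma ** 2 * df)   # Mallows' Cp with known sigma

best = int(np.argmin(cps))
b_best = lasso_ista(X, y, lams[best])
```

The result referenced above is stronger than this heuristic suggests: with the second-order Stein correction, the scaled Cp selector tracks the oracle minimizer of the prediction loss up to a regret of smaller order than the minimax rate.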