11 Mean squared error of the local polynomial estimator
Here we simply state a result from Tsybakov (2008). The assumptions serve the same purposes as those made in Chapter 9, and the result is established in a similar way; there is nothing conceptually new here.
Proposition 11.1 (MSE of local polynomial estimator under Hölder smoothness) Suppose \(m \in \text{Hölder}(C,k,\alpha,[0,1])\) and let \(\beta = k + \alpha\). Then under assumptions given in Section 1.6 of Tsybakov (2008) we have \[ \operatorname{MSE}\hat m^{\operatorname{LP}(k)}_{n,h}(x) \leq h^{2\beta} C_1 + \frac{1}{nh} C_2 \tag{11.1}\] for each \(x \in [0,1]\) with \(\operatorname{MSE}\)-optimal bandwidth \(h\) given by \(h_\operatorname{opt} = c^*n^{-1/(2\beta + 1)}\) such that \[ \operatorname{MSE}\hat m^{\operatorname{LP}(k)}_{n,h_{\operatorname{opt}}}(x) \leq C^*n^{-2\beta/(2\beta + 1)}, \tag{11.2}\] for all \(x \in [0,1]\), for some constants \(C_1,C_2,c^*,C^*>0\).
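The tradeoff in (11.1) can be checked numerically: minimizing \(h \mapsto C_1 h^{2\beta} + C_2/(nh)\) in \(h\) gives \(h_{\operatorname{opt}} \propto n^{-1/(2\beta+1)}\), and plugging \(h_{\operatorname{opt}}\) back in yields the rate in (11.2). The sketch below uses arbitrary placeholder values for \(C_1\), \(C_2\), and \(\beta\) (the proposition does not specify them) and recovers the exponent \(-2\beta/(2\beta+1)\) from a log-log regression.

```python
import numpy as np

# Placeholder constants: C1, C2 are unspecified in Proposition 11.1;
# beta = k + alpha is the Holder smoothness parameter.
C1, C2, beta = 1.0, 1.0, 2.0

def mse_bound(h, n):
    """Right-hand side of (11.1): squared-bias term + variance term."""
    return C1 * h ** (2 * beta) + C2 / (n * h)

def h_opt(n):
    # Setting the derivative of the bound to zero:
    #   2*beta*C1*h**(2*beta - 1) - C2/(n*h**2) = 0
    # solves to h_opt = (C2 / (2*beta*C1*n))**(1/(2*beta + 1)),
    # i.e. h_opt is proportional to n**(-1/(2*beta + 1)).
    return (C2 / (2 * beta * C1 * n)) ** (1 / (2 * beta + 1))

ns = np.array([1e3, 1e4, 1e5, 1e6])
bounds = np.array([mse_bound(h_opt(n), n) for n in ns])

# The bound at h_opt decays like n**(-2*beta/(2*beta + 1)),
# so a log-log fit recovers that exponent (-0.8 for beta = 2).
slope = np.polyfit(np.log(ns), np.log(bounds), 1)[0]
print(slope)
```

With \(\beta = 2\) the fitted slope is \(-4/5\), matching the rate \(n^{-2\beta/(2\beta+1)}\) in (11.2).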
Next we consider how to choose the bandwidth via cross-validation for the local polynomial and Nadaraya-Watson estimators.