Question

I am trying to find a reference that explains how standard errors are computed for local polynomial regression. Specifically, in R one can use the loess function to obtain a model object and then use the predict function to retrieve standard errors. Is there a reference somewhere for what is actually happening? Also, in the case where there may be serial correlation in the residuals, one would need to adjust for this using Newey-West type methods; is there a way to use the sandwich package to do that, as one would for an ordinary OLS fit with lm?
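For concreteness, this is the workflow I mean (toy data of my own invention; the se = TRUE argument to predict is what returns the standard errors):

    set.seed(1)
    x <- sort(runif(100))
    y <- sin(2 * pi * x) + rnorm(100, sd = 0.3)

    fit  <- loess(y ~ x)              # local quadratic fit by default
    pred <- predict(fit, se = TRUE)   # se = TRUE gives pointwise standard errors

    str(pred)  # list with $fit, $se.fit, $residual.scale, $df

    # e.g. a pointwise 95% confidence band around the fit
    upper <- pred$fit + qt(0.975, pred$df) * pred$se.fit
    lower <- pred$fit - qt(0.975, pred$df) * pred$se.fit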

I tried looking at the source, but the standard error computation calls a C function.


Solution

The "Source" section of ?loess tells you that the underlying C-code comes from the cloess package of Cleveland et al., and points you to its web home:

Source: The 1998 version of ‘cloess’ package of Cleveland, Grosse and Shyu. A later version is available as ‘dloess’ at http://www.netlib.org/a/.

Going there, you will find a link to a 50-page document (warning: PostScript) that should tell you everything you need to know about this implementation of loess. In Cleveland's words:

This guide describes crucial steps in the proper analysis of data using loess. Please read it.

Of particular interest will be the first couple of pages of "Section 4: Statistical and Computational Methods".
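As a rough illustration of the idea documented there: a loess fit is linear in the response, yhat = L y, where the smoothing matrix L depends only on the x values, and the pointwise standard error at a design point is the residual scale times the Euclidean norm of the corresponding row of L. The sketch below (my own toy example, not Cleveland's C routine) recovers L by smoothing unit vectors and checks it against predict; surface = "direct" forces the exact computation rather than the default interpolation, so the two results should agree up to numerical error:

    set.seed(1)
    n <- 60
    x <- sort(runif(n))
    y <- sin(2 * pi * x) + rnorm(n, sd = 0.3)

    ctl <- loess.control(surface = "direct")   # exact fit, no interpolation
    fit <- loess(y ~ x, span = 0.5, control = ctl)

    # Column j of the smoothing matrix L is the loess smoother applied to
    # the j-th unit vector (L depends on x only, not on y)
    L <- sapply(seq_len(n), function(j) {
      e <- numeric(n); e[j] <- 1
      fitted(loess(e ~ x, span = 0.5, control = ctl))
    })

    pred <- predict(fit, se = TRUE)
    se_manual <- pred$residual.scale * sqrt(rowSums(L^2))

    max(abs(pred$se.fit - se_manual))   # should be near zero

As for the serial-correlation part of the question: predict.loess assumes independent errors, and as far as I know the sandwich package has no methods for loess objects, so a Newey-West style correction would have to be built by hand from the L matrix, via Var(yhat) = L Omega L' with Omega an estimated autocovariance matrix of the errors.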
