You can use the ncdf package to read the 10 files into R, combine them into one big nlon x nlat x time x nfiles array with abind from the abind package, and then use apply to average out the file dimension. This all assumes you have enough RAM to hold all 10 datasets in memory at once, i.e. they cannot be too big.
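A minimal sketch of this approach, using the newer ncdf4 package (the successor to ncdf on CRAN). The file names and the variable name "tas" are placeholders; substitute your own:

```r
library(ncdf4)  # successor to the older ncdf package
library(abind)

files <- sprintf("file%d.nc", 1:10)  # hypothetical file names

# Read the variable from each file into a list of nlon x nlat x time arrays
arrays <- lapply(files, function(f) {
  nc <- nc_open(f)
  a  <- ncvar_get(nc, "tas")  # "tas" is a placeholder variable name
  nc_close(nc)
  a
})

# Stack the arrays along a new 4th dimension: nlon x nlat x time x nfiles
big <- do.call(abind, c(arrays, list(along = 4)))

# Average out the file dimension, keeping lon, lat and time
avg <- apply(big, c(1, 2, 3), mean)
```

Note that apply over a large 4-D array can be slow; rowMeans(big, dims = 3) computes the same mean over the last dimension much faster if the array has no missing values you need to handle specially.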
Alternatively, I would have a look at CDO, a command-line tool for manipulating NetCDF files. There might be a way to get what you want from that tool as well, and it would be a more memory-friendly option, since it streams the data rather than loading everything at once.
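For the CDO route, the ensmean operator computes exactly this kind of across-file average: each grid point in the output is the mean of the corresponding grid points over all input files. A sketch, with placeholder file names:

```shell
# Ensemble mean across 10 NetCDF files; output goes to mean.nc
cdo ensmean file1.nc file2.nc file3.nc file4.nc file5.nc \
            file6.nc file7.nc file8.nc file9.nc file10.nc mean.nc
```

This processes the files timestep by timestep, so it never needs to hold all 10 datasets in memory.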