I've written a variation of the cumsum
function, where I multiply the previous sum by a decay factor before adding the current value:
cumsum_decay <- function(x, decay = 0.5) {
  for (i in seq_along(x)[-1]) {  # seq_along avoids the 2:1 trap on length-1 input
    x[i] <- x[i] + decay * x[i - 1]
  }
  return(x)
}
Here's a demo, using a binary variable to make the effect clear:
set.seed(42)
Events <- sample(0:1, 50, replace=TRUE, prob=c(.7, .3))
plot(cumsum_decay(Events), type='l')
points(Events)
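Unrolling the recursion makes explicit what the loop computes: each output element is a geometrically weighted sum of all inputs up to that point, y[i] = sum over j <= i of decay^(i - j) * x[j]. A quick sanity check (cumsum_decay_explicit is a made-up name for this O(n^2) expansion, not part of the function above):

```r
# Each output is a geometric-weighted sum of all earlier inputs:
#   y[i] = sum_{j <= i} decay^(i - j) * x[j]
cumsum_decay_explicit <- function(x, decay = 0.5) {
  sapply(seq_along(x), function(i) {
    j <- seq_len(i)
    sum(decay^(i - j) * x[j])
  })
}
cumsum_decay_explicit(c(1, 0, 1))  # 1.00 0.50 1.25 -- same as the loop version
```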
Byte-compiling the function with compiler::cmpfun speeds it up considerably:
# Benchmark
library(compiler)
library(rbenchmark)
cumsum_decayCOMP <- cmpfun(cumsum_decay)
Events <- sample(0:1, 10000, replace=TRUE, prob=c(.7, .3))
benchmark(replications=100,
          cumsum_decay(Events),
          cumsum_decayCOMP(Events),
          columns=c('test', 'elapsed', 'replications', 'relative'))
                      test elapsed replications relative
1     cumsum_decay(Events)    3.28          100    6.979
2 cumsum_decayCOMP(Events)    0.47          100    1.000
But I suspect vectorizing would improve it even more. Any ideas?
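For reference, one direction I've considered: the loop is the first-order recursion y[i] = x[i] + decay * y[i - 1], which looks like exactly what stats::filter computes with method = "recursive". A sketch, assuming I have the equivalence right (cumsum_decay_filter is just a name I picked):

```r
cumsum_decay_filter <- function(x, decay = 0.5) {
  # stats::filter with method = "recursive" computes
  #   y[i] = x[i] + decay * y[i - 1]
  # with zero initial conditions, so y[1] = x[1], matching the loop.
  # as.numeric() drops the "ts" class that filter() returns.
  as.numeric(stats::filter(x, filter = decay, method = "recursive"))
}
```

The loop itself stays sequential either way (each element depends on the previous one), so the usual element-wise vectorization doesn't apply directly; filter() just pushes that loop down into compiled code.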