Question

I wonder if there is a fast way to keep summing elements until a threshold is reached.

For example

threshold = 100;
a = 0; MAX = 1000;
for i = 1:MAX
    a = a + normrnd(2,1,1,1);
    if (a > threshold)
        index = i; break;
    end
end

This works fine, but it is quite slow. Cumsum would be really useful here. However, the only way I know to use cumsum in this situation is like this:

a=cumsum(normrnd(2,1, MAX,1)); 
index=find(a>threshold,1);

which becomes progressively more inefficient as MAX increases.

So basically I am looking for a cumulative summation method that preserves the speed of cumsum but allows me to set a threshold. Any ideas? Thank you


Solution 2

Basically, this solution combines both of your approaches: it iterates over segments of size 1000 and uses your cumsum solution to process 1000 elements at a time.

threshold = 100;
a = 0; MAX = 9999999;
% segment size
S = 1000;
for segment_start = 1:S:MAX
    % process segment from _start to _end ...
    segment_end = min(segment_start + S - 1, MAX);
    % ... which contains _count elements
    segment_count = segment_end - segment_start + 1;
    % now use the cumsum solution to process the segment
    c = cumsum(normrnd(2, 1, segment_count, 1));
    index = find(c > threshold - a, 1, 'first');
    if isempty(index)
        % need to continue
        a = a + c(end);
    else
        % threshold reached, found solution
        a = a + c(index);
        index = segment_start + index - 1;  % convert to a global index
        break;
    end
end

OTHER TIPS

You can use a binary search. If your elements are random variables with a known distribution, or at least a known expected value, you can make a good initial guess.

In the case you have posted, we have E[normrnd(2,1,1,1)] = 2, and since threshold = 100 you start by generating 50 numbers and summing them. If you overestimate, you search for the correct index within that batch. If you underestimate, you use the size of the shortfall to make a new guess, and continue doing so until you overestimate.
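Here is a sketch of that guess-and-extend idea in Python (standard library only), with `random.gauss` standing in for `normrnd` and `itertools.accumulate` for `cumsum`; the function name and batching details are my own, not from the answer. One caveat: because individual draws can be negative, the prefix sums are not monotone, so the final search within a batch is a linear scan rather than a strict binary search (a binary search is only safe when every increment is non-negative).

```python
import random
from itertools import accumulate

def first_crossing(threshold, mean, sd, max_n=10**6):
    """Return (index, running_sum) for the first prefix sum above threshold.

    Draws batches whose size is guessed from the expected increment,
    then scans each batch's prefix sums for the first crossing.
    """
    total, drawn = 0.0, 0
    while drawn < max_n:
        # guess how many more samples are needed from the expected increment
        guess = max(1, int(round((threshold - total) / mean)))
        prefix = list(accumulate(random.gauss(mean, sd) for _ in range(guess)))
        # linear scan for the first crossing within the batch
        hit = next((i for i, s in enumerate(prefix) if total + s > threshold), None)
        if hit is not None:
            return drawn + hit + 1, total + prefix[hit]
        drawn += guess
        total += prefix[-1]
    return None, total

random.seed(1)
index, value = first_crossing(100, 2, 1)
```

With mean 2 and threshold 100, the first batch is roughly 50 draws, so typically only one or two batches are needed.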

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow