Problem

I'm looking to minimize a function with potentially random outputs. Traditionally, I would use something from the scipy.optimize library, but I'm not sure if it'll still work if the outputs are not deterministic.

Here's a minimal example of the problem I'm working with:

import random

def myfunction(a):
    noise = random.gauss(0, 1)
    return abs(a + noise)

Any thoughts on how to algorithmically minimize its expected (or average) value?

A numerical approximation would be fine, as long as it can get "relatively" close to the actual value.

We already reduced noise by averaging over many possible runs, but the function is a bit computationally expensive and we don't want to do more averaging if we can help it.
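As a sanity check on the averaging idea, here is a minimal sketch (the `averaged` helper is hypothetical, not part of any library): since the noise is N(0, 1), the expected value E[|a + noise|] is minimized at a = 0, and averaging enough samples per point makes that visible even with a crude comparison:

```python
import random

def myfunction(a):
    # noisy objective from the question: |a + noise|, noise ~ N(0, 1)
    noise = random.gauss(0, 1)
    return abs(a + noise)

def averaged(f, a, n=100):
    # hypothetical helper: estimate E[f(a)] by averaging n samples;
    # the standard error of the estimate shrinks like 1/sqrt(n)
    return sum(f(a) for _ in range(n)) / n

random.seed(0)
# E[|a + N(0, 1)|] is minimized at a = 0 (where it equals sqrt(2/pi) ~ 0.8)
vals = {a: averaged(myfunction, a, n=500) for a in (-2.0, 0.0, 2.0)}
best = min(vals, key=vals.get)
```

The trade-off is exactly the one described above: more averaging per point means fewer points you can afford to evaluate, which is why noise-tolerant optimizers are attractive here.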


Solution

It turns out that for our application, the scipy.optimize anneal algorithm provided a good enough estimate of the local minimum.
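Note that `scipy.optimize.anneal` was deprecated and later removed from SciPy (`scipy.optimize.dual_annealing` is the modern replacement). To illustrate the idea without a SciPy dependency, here is a minimal pure-Python simulated-annealing sketch; the `anneal` helper is a hypothetical sketch of the technique, not scipy's API, and it averages a few evaluations per candidate point to tame the noise:

```python
import math
import random

def myfunction(a):
    # noisy objective from the question
    noise = random.gauss(0, 1)
    return abs(a + noise)

def anneal(f, x0, n_iter=5000, temp0=1.0, step=0.5, n_avg=20):
    # hypothetical simulated-annealing sketch for a noisy 1-D objective;
    # each point is scored by averaging n_avg evaluations
    def avg(x):
        return sum(f(x) for _ in range(n_avg)) / n_avg

    x, fx = x0, avg(x0)
    best_x, best_f = x, fx
    for i in range(1, n_iter + 1):
        t = temp0 / i                      # simple cooling schedule
        cand = x + random.gauss(0, step)   # random neighbour
        fc = avg(cand)
        # always accept downhill moves; accept uphill moves with
        # Boltzmann probability exp(-(fc - fx) / t)
        if fc < fx or random.random() < math.exp(-(fc - fx) / t):
            x, fx = cand, fc
        if fx < best_f:
            best_x, best_f = x, fx
    return best_x, best_f

random.seed(1)
x_min, f_min = anneal(myfunction, x0=3.0)
```

With the noise averaged down, the run should land near a = 0, where the expected value is minimized. The same averaged objective could equally be handed to a derivative-free scipy routine such as Nelder-Mead via `scipy.optimize.minimize`.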

For more complex problems, pjs points out that Waeber, Frazier and Henderson (2011) provide a better solution.

License: CC-BY-SA with attribution
Not affiliated with Stack Overflow