Question

I'm looking to minimize a function with potentially random outputs. Traditionally, I would use something from the scipy.optimize library, but I'm not sure if it'll still work if the outputs are not deterministic.

Here's a minimal example of the problem I'm working with:

import random

def myfunction(a):
    # Add unit Gaussian noise, so the output is random even for a fixed input a.
    noise = random.gauss(0, 1)
    return abs(a + noise)

Any thoughts on how to algorithmically minimize its expected (or average) value?

A numerical approximation would be fine, as long as it can get "relatively" close to the actual value.

We've already reduced the noise by averaging over many runs, but the function is somewhat computationally expensive and we'd rather not do more averaging than necessary.
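For reference, the averaging described above can be wrapped around the noisy objective before handing it to an optimizer. This is only a sketch; the helper name `averaged` and the sample counts are illustrative:

```python
import random

def myfunction(a):
    # Noisy objective from the question: |a + noise| with unit Gaussian noise.
    noise = random.gauss(0, 1)
    return abs(a + noise)

def averaged(f, n_samples=100):
    """Smooth f by averaging n_samples independent noisy evaluations."""
    def g(a):
        return sum(f(a) for _ in range(n_samples)) / n_samples
    return g

smooth = averaged(myfunction, n_samples=200)
```

Each call to `smooth` costs `n_samples` evaluations of the underlying function, which is exactly the expense the question is trying to avoid, so `n_samples` is the knob for trading accuracy against cost.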


Solution

It turns out that for our application, the scipy.optimize anneal algorithm provided a good enough estimate of the local minimum.
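Note that `scipy.optimize.anneal` was later removed from SciPy (deprecated in 0.14, removed in 0.15); `dual_annealing` is the closest current replacement. A minimal sketch combining it with the averaging trick, under the assumption that a 100-sample average and the bounds shown are acceptable:

```python
import random
from scipy.optimize import dual_annealing

def myfunction(a):
    # Noisy objective from the question.
    return abs(a + random.gauss(0, 1))

def smoothed(x):
    # dual_annealing passes a 1-D array; average 100 noisy calls to tame the noise.
    return sum(myfunction(x[0]) for _ in range(100)) / 100

random.seed(0)
result = dual_annealing(smoothed, bounds=[(-10, 10)], seed=0, maxiter=100)
# result.x should land near 0, where the expected value of |a + noise| is smallest
```

Seeding both `random` and the optimizer makes the run reproducible, which helps when comparing averaging budgets.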

For more complex problems, pjs points out that Waeber, Frazier and Henderson (2011) offer a better approach.

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow