Question

In "regret" styled analysis over $T$ steps of an iterative algorithm $\{x_i \in F \}_{i=1}^T$ (where $F$ is some feasible set) being given the sequence of loss functions $\{ f_i\}_{i=1}^T$ one defines the regret $R_T = \sum_{i=1}^Tf_t(x_i) - \min_{x \in F} \sum_{i=1}^T f_t(x)$ Most typical analyses assume that the $f$s are convex.

Is this notion of "regret" lower bounded? Or under what conditions is it lower bounded?

I guess "minimizing the regret" does not make sense because one can always have an "Oracle" access to the sequence of points $x^*_i$ such that $x^*_i = \min_{x \in F} f_i(x)$. Then for this sequence for $x^*$ points the regret is only at most $0$.
