Question

Given a sequence of positive reals $a_1, a_2, \dots, a_n$ and a positive integer $m$, assign each $a_j$ to a machine $i$, $1 \le i \le m$, so as to minimize the maximum, over all $i$, of the sum of the reals assigned to machine $i$.

Theorem: There is no randomized $m$-machine scheduling algorithm with a competitive ratio less than $4/3$ for any $m \geq 2$.

The following is the Proof Sketch given in the paper:

Consider the job sequences $m \times 1$ and $m \times 1, 2$. (Here $m \times 1$ denotes a sequence of $m$ 1's.) If the algorithm schedules the $m$ 1's on $m$ different machines with probability $p$, the worst-case ratio between its cost and the optimal cost is at least $\max\{2 - p, 1 + p/2\} \geq 4/3$.

I have trouble understanding this proof. I would really appreciate it if anyone could explain it in detail, especially how to arrive at the worst-case ratio of $\max\{2-p, 1+p/2\}$.
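For reference, here is one way the two bounds in the sketch can be unpacked. This is a sketch of the standard adversary argument, assuming the adversary may either stop the sequence after the $m$ 1's or continue with the job of size 2:

```latex
% Let p = Pr[the algorithm places the m unit jobs on m different machines].
% The algorithm must commit to this (randomized) placement before knowing
% whether the job of size 2 will arrive.

% Case 1: the sequence stops after the m 1's.
% OPT = 1 (one unit job per machine).
% With prob. p every machine has load 1 (cost 1);
% with prob. 1-p some machine holds at least two unit jobs (cost >= 2).
\mathbb{E}[\mathrm{cost}] \;\ge\; p \cdot 1 + (1-p)\cdot 2 \;=\; 2 - p,
\qquad
\frac{\mathbb{E}[\mathrm{cost}]}{\mathrm{OPT}} \;\ge\; 2 - p.

% Case 2: the job of size 2 arrives.
% OPT = 2 (put the size-2 job on its own machine, pair two 1's).
% With prob. p every machine already has load 1, so placing the size-2
% job anywhere yields makespan 1 + 2 = 3; otherwise the makespan is
% already at least 2.
\mathbb{E}[\mathrm{cost}] \;\ge\; p \cdot 3 + (1-p)\cdot 2 \;=\; 2 + p,
\qquad
\frac{\mathbb{E}[\mathrm{cost}]}{\mathrm{OPT}} \;\ge\; \frac{2+p}{2} \;=\; 1 + \frac{p}{2}.

% The adversary picks whichever case is worse for the algorithm, so the
% competitive ratio is at least max{2-p, 1+p/2}. The two expressions are
% equal at p = 2/3, where both take the value 4/3, and that is the
% minimum of the max over p in [0,1].
```

Since $2-p$ decreases in $p$ and $1+p/2$ increases in $p$, no choice of $p$ brings the maximum below the crossing point $4/3$.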

No correct solution

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange