The large values of n and m limit your choices. If you model it as n*m boolean decision variables, that's on the order of 10^10 variables, which might be too big to load a single instance into RAM. If you instead model it as 20*n variables of type M, you cap the number of connections per variable at an arbitrary number (20 in this case) and need to add a hard constraint that no two variables for the same n share the same M value, but the instance would fit in RAM.
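A quick back-of-the-envelope comparison shows why the second model fits in RAM and the first might not. The concrete values n = m = 100,000 below are illustrative assumptions, not numbers from the question:

```python
# Hypothetical problem sizes chosen so that n * m ~ 10^10.
n = 100_000
m = 100_000
MAX_CONNECTIONS = 20  # the arbitrary cap from the second model

# Model 1: one boolean per (n, m) pair -> n * m variables.
bool_vars = n * m                # 10^10 variables
bool_bytes = bool_vars // 8      # ~1.25 GB even as a tightly packed bitset

# Model 2: 20 slots per n, each holding an M index (say a 32-bit int).
slot_vars = MAX_CONNECTIONS * n  # 2 * 10^6 variables
slot_bytes = slot_vars * 4       # ~8 MB

print(bool_vars, bool_bytes)   # 10000000000 1250000000
print(slot_vars, slot_bytes)   # 2000000 8000000
```

And that's only the variables themselves: most solvers keep extra bookkeeping per variable, so the real footprint of the boolean model would be several times larger.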
Presuming that your constraints (performance score) aren't trivial: the only optimization algorithms I've seen scale to such a high number of variables are Local Search variants (Tabu Search, Late Acceptance, ...), for example in the Roadef2012 competition.
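To make the Local Search idea concrete, here's a minimal sketch of Late Acceptance Hill Climbing on a toy assignment problem. The toy problem, the function names, and the parameters (`steps`, `history_len`) are all illustrative, not a reference implementation:

```python
import random

def late_acceptance(score, move, solution, steps=10_000, history_len=50):
    """Late Acceptance Hill Climbing: accept a move if the new score is
    no worse than the current score OR the score from history_len
    iterations ago. The move operator returns an undo closure."""
    random.seed(0)  # deterministic for the sake of the example
    current = score(solution)
    history = [current] * history_len
    best, best_sol = current, list(solution)
    for i in range(steps):
        undo = move(solution)          # apply a random move in place
        new = score(solution)
        if new <= current or new <= history[i % history_len]:
            current = new              # accept
            if new < best:
                best, best_sol = new, list(solution)
        else:
            undo()                     # reject: revert the move
        history[i % history_len] = current
    return best, best_sol

# Toy problem: pick one value in [0, 9] per slot,
# minimise total distance to a hidden target assignment.
target = [3, 7, 1, 9, 4]

def score(sol):
    return sum(abs(a - b) for a, b in zip(sol, target))

def move(sol):
    i = random.randrange(len(sol))
    old = sol[i]
    sol[i] = random.randrange(10)
    return lambda: sol.__setitem__(i, old)

best_score, best_sol = late_acceptance(score, move, [0] * len(target))
print(best_score, best_sol)
```

The key property that makes this scale is incrementality: each move touches one variable, so in a real solver you'd recompute only the affected part of the score instead of rescoring the whole solution as the toy `score` does here.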