Question

The 3-SAT problem is known to be NP-complete. As far as I understand, this means that unless $P = NP$, every algorithm $A$ which decides 3-SAT runs in super-polynomial time in the worst case (I know that this is not precisely stated). A stronger assumption is the Strong Exponential Time Hypothesis (SETH), which roughly states that every SAT algorithm has to run in exponential time in the worst case.
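
For reference, the formal statements I have in mind (with $n$ the number of variables; this is my paraphrase, so please correct it if it is off):

$$\text{ETH: } \exists\, \delta > 0 \text{ such that no algorithm decides 3-SAT in time } O(2^{\delta n}),$$
$$\text{SETH: } \forall\, \varepsilon > 0 \ \exists\, k \text{ such that no algorithm decides } k\text{-SAT in time } O(2^{(1-\varepsilon)n}).$$
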
Does that mean, for example, that every algorithm $A$ (here I refer to a "real" or practical implementation, such as Glucose, DPLL-based solvers, Z3, etc.) has an instance of size, say, 200, which $A$ will not be able to solve in a reasonable time? I see many SAT solvers solving formulas with millions of variables in very short time, and I am aware of subsets of 3-SAT which are easy, such as Horn clauses, or random instances with a known density, but it seems that these solvers work amazingly fast without any assumptions on the input.
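
(To be concrete about what I mean by "solving": here is a minimal sketch of feeding a tiny 3-CNF formula to one of these solvers, using Z3's Python bindings; the formula itself is just a placeholder, not a hard instance.)

```python
from z3 import Bool, Or, Not, Solver, sat

# Placeholder 3-CNF formula: (x1 v x2 v ~x3) & (~x1 v x2 v x3) & (x1 v ~x2 v ~x3)
x1, x2, x3 = Bool("x1"), Bool("x2"), Bool("x3")

s = Solver()
s.add(Or(x1, x2, Not(x3)))
s.add(Or(Not(x1), x2, x3))
s.add(Or(x1, Not(x2), Not(x3)))

if s.check() == sat:
    print("satisfiable:", s.model())
else:
    print("unsatisfiable")
```
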
I have searched for SAT benchmarks, or instances which are supposed to be hard, but when I run them on a modern SAT solver, they do not seem to be a problem.
It is known that complexity assumptions (or claims) are about asymptotic behavior. My question is: do we know for which $n_0$ these assumptions "kick in" in the case of 3-SAT? Can we generate, or at least be confident of the existence of, a relatively small formula (fewer than $200$ variables, say) such that every algorithm requires $\sim 2^{200}$ operations to decide it?
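
By "generate" I mean something like the following sketch: a random 3-SAT instance at a clause-to-variable ratio around 4.27 (near the reported satisfiability phase transition, where random formulas tend to be hardest empirically), written in DIMACS CNF so it can be fed to any of the solvers above. The ratio and the generator here are just illustrative choices on my part, not a guaranteed source of hard instances.

```python
import random

def random_3sat_dimacs(n_vars, ratio=4.27, seed=0):
    """Random 3-SAT formula in DIMACS CNF at a given clause/variable ratio.

    ratio ~ 4.27 is near the empirically observed phase transition for
    random 3-SAT, where instances tend to be hardest on average.
    """
    rng = random.Random(seed)
    n_clauses = int(round(ratio * n_vars))
    lines = [f"p cnf {n_vars} {n_clauses}"]
    for _ in range(n_clauses):
        variables = rng.sample(range(1, n_vars + 1), 3)                # 3 distinct variables
        clause = [v if rng.random() < 0.5 else -v for v in variables]  # random polarities
        lines.append(" ".join(map(str, clause)) + " 0")
    return "\n".join(lines) + "\n"

# Example: a 200-variable candidate instance for Glucose / Z3 / any DIMACS solver.
with open("candidate.cnf", "w") as f:
    f.write(random_3sat_dimacs(200))
```
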

No correct solution

Licensed under: CC-BY-SA with attribution
Not affiliated with cs.stackexchange