Question

I'm doing a knapsack optimization problem involving dynamic programming and branch & bound. I noticed that when the capacity and the number of items get large, filling in the 2D table for the dynamic programming algorithm gets dramatically slower. At some point, am I supposed to switch algorithms depending on the size of the problem (since the lectures covered two optimization approaches)?

I've tried to google at what point (at what problem size) I should switch from dynamic programming to branch & bound, but I couldn't find the answer I was looking for.

Or is there another way of looking at the knapsack problem in which I can combine dynamic programming and branch & bound into one algorithm, instead of switching algorithms depending on the size of the problem?

Thanks.


Solution

Often when you have several algorithms that solve a problem but whose runtimes have different characteristics, you determine (empirically, not theoretically) when one algorithm becomes faster than the other. This is highly implementation- and machine-dependent. So measure both the DP algorithm and the B&B algorithm and figure out which one is better when.
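For example, a minimal timing harness along these lines might look like the Python sketch below. The solver names `knapsack_dp` and `knapsack_bb` are assumptions (they refer to the sketches after the hint list further down), and the instance sizes are arbitrary:

```python
import random
import time

def benchmark(solver, n, capacity, trials=5):
    """Average wall-clock time of `solver` over random instances with n items."""
    total = 0.0
    for _ in range(trials):
        values = [random.randint(1, 100) for _ in range(n)]
        weights = [random.randint(1, capacity) for _ in range(n)]
        start = time.perf_counter()
        solver(values, weights, capacity)
        total += time.perf_counter() - start
    return total / trials

# Sweep problem sizes to find the empirical crossover point on your machine.
for n, cap in [(20, 1_000), (40, 10_000), (60, 100_000)]:
    t_dp = benchmark(knapsack_dp, n, cap)   # hypothetical DP solver (sketched below)
    t_bb = benchmark(knapsack_bb, n, cap)   # hypothetical B&B solver (sketched below)
    print(f"n={n:3d}  W={cap:7d}  DP {t_dp:.4f}s  B&B {t_bb:.4f}s")
```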

A couple of hints:

  • You know that DP's runtime is proportional to the number of objects times the size of the knapsack (a minimal sketch follows this list).
  • You know that B&B's runtime can be as bad as 2^(# objects), but it's typically much better (a best-first sketch also follows this list). Try to figure out the worst case.
  • Caches and stuff are going to matter for DP, so you'll want to estimate its runtime piecewise. Figure out where the breakpoints are.
  • DP takes up exorbitant amounts of memory. If it's going to take up too much, you don't really have a choice between DP and B&B.
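To make the first hint concrete, here is a minimal sketch of the classic table-filling DP (function and variable names are illustrative, not from the original post). The second function shows the standard one-row trick, which is relevant to the memory concern in the last bullet: it cuts memory from O(n·W) down to O(W):

```python
def knapsack_dp(values, weights, capacity):
    """0/1 knapsack via the classic (n+1) x (W+1) table: O(n*W) time and memory."""
    n = len(values)
    # table[i][w] = best value achievable with the first i items and capacity w
    table = [[0] * (capacity + 1) for _ in range(n + 1)]
    for i in range(1, n + 1):
        v, wt = values[i - 1], weights[i - 1]
        for w in range(capacity + 1):
            table[i][w] = table[i - 1][w]          # skip item i
            if wt <= w:                            # or take it, if it fits
                table[i][w] = max(table[i][w], table[i - 1][w - wt] + v)
    return table[n][capacity]

def knapsack_dp_low_mem(values, weights, capacity):
    """Same DP, but only one row is kept: O(W) memory instead of O(n*W)."""
    row = [0] * (capacity + 1)
    for v, wt in zip(values, weights):
        # Iterate right-to-left so each item is used at most once.
        for w in range(capacity, wt - 1, -1):
            row[w] = max(row[w], row[w - wt] + v)
    return row[capacity]
```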
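And here is a hedged sketch of one common branch & bound formulation: best-first search using the fractional (LP-relaxation) bound. This is one standard way to do it, not necessarily the variant from your lecture; in the worst case it still explores on the order of 2^(# objects) nodes, but the bound usually prunes most of them. It assumes all weights are positive:

```python
import heapq

def knapsack_bb(values, weights, capacity):
    """0/1 knapsack via best-first branch & bound with a fractional bound."""
    n = len(values)
    # Sort items by value density; the fractional bound relies on this order.
    order = sorted(range(n), key=lambda i: values[i] / weights[i], reverse=True)
    vals = [values[i] for i in order]
    wts = [weights[i] for i in order]

    def bound(idx, value, room):
        # Optimistic estimate: greedily add whole items, then a fraction of the next.
        b = value
        while idx < n and wts[idx] <= room:
            room -= wts[idx]
            b += vals[idx]
            idx += 1
        if idx < n:
            b += vals[idx] * room / wts[idx]
        return b

    best = 0
    # Max-heap on the bound (negated, since heapq is a min-heap).
    heap = [(-bound(0, 0, capacity), 0, 0, capacity)]
    while heap:
        neg_b, idx, value, room = heapq.heappop(heap)
        if -neg_b <= best or idx == n:
            continue  # prune: this subtree can't beat the incumbent
        if wts[idx] <= room:
            # Branch 1: take item idx.
            new_val = value + vals[idx]
            best = max(best, new_val)
            heapq.heappush(heap, (-bound(idx + 1, new_val, room - wts[idx]),
                                  idx + 1, new_val, room - wts[idx]))
        # Branch 2: skip item idx.
        heapq.heappush(heap, (-bound(idx + 1, value, room),
                              idx + 1, value, room))
    return best
```

With both sketches in hand, the timing harness above will show you where the crossover sits on your machine.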
Licensed under: CC-BY-SA with attribution