Question

I'm trying to solve this problem: given a number of 0s and 1s, generate the list of all possible arrangements of those 0s and 1s.

Example: with one 1 and two 0s, the algorithm should return

 001
 010
 100

The problem is that classic combination algorithms are not suited to this purpose. This is what a non-optimized combination algorithm would return:

001
001
010
010
100
100

As you can see, every combination appears twice, because the two 0s are treated as distinct elements.

How can an algorithm generate the list of possible combinations from the number of 0s and 1s, without repeating any combination?

PS: This will be used in JavaScript.

EDIT: Solved! Using @Batch's method.

function combin(o, i)
{
    if (o == 0 && i > 0)
    {
        // Base case: only 1s left, a single string of i 1s.
        for (var j = 0, s = ''; j < i; j++)
        {
            s += '1';
        }
        return [s];
    }
    else if (i == 0 && o > 0)
    {
        // Base case: only 0s left, a single string of o 0s.
        for (var j = 0, s = ''; j < o; j++)
        {
            s += '0';
        }
        return [s];
    }
    else if (i == 0 && o == 0)
    {
        // Base case: nothing left, the empty string.
        return [''];
    }
    else if (i > 0 && o > 0)
    {
        // Prefix '0' to every combination of (o-1, i)...
        var l = combin(o - 1, i);
        for (var j = 0; j < l.length; j++)
        {
            l[j] = '0' + l[j];
        }
        // ...and '1' to every combination of (o, i-1).
        var k = combin(o, i - 1);
        for (var j = 0; j < k.length; j++)
        {
            k[j] = '1' + k[j];
        }
        return l.concat(k);
    }
}

Solution

Here's one way to do it: start with the string that has all 0s at the front. Shift the rightmost 0 step by step through to the end. Then move the second-rightmost 0 one position to the right, place the last 0 directly after it, and again shift the rightmost 0 through to the end. Keep moving the second-rightmost 0 right by one and sweeping the rightmost 0 behind it; once that pair of 0s reaches the end, start moving the third-rightmost 0, and so on. You get the picture.

Example: three 0s, three 1s:

000111
001011
001101
001110
010011
010101
010110
011001
011010
011100
100011
100101
100110
101001
101010
101100
110001
110010
110100
111000

Not sure how to write a nice loop for that, but once you grasp the idea you can play around and try to figure one out if you'd like. Maybe there is a nifty recursion for it too, though I can't think of one for this method right now.
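One loop that produces exactly the order shown above is the standard "next lexicographic permutation" step applied to the character array (with '0' < '1'). A sketch, with helper names of my own choosing:

```javascript
// Return the lexicographically next arrangement of s, or null if s is the last one.
function nextArrangement(s) {
  const a = s.split('');
  // Find the rightmost position where a[i] < a[i+1].
  let i = a.length - 2;
  while (i >= 0 && a[i] >= a[i + 1]) i--;
  if (i < 0) return null;                  // already at '1..10..0'
  // Find the rightmost character greater than a[i] and swap.
  let j = a.length - 1;
  while (a[j] <= a[i]) j--;
  [a[i], a[j]] = [a[j], a[i]];
  // Reverse the suffix so it is again sorted ascending.
  let l = i + 1, r = a.length - 1;
  while (l < r) { [a[l], a[r]] = [a[r], a[l]]; l++; r--; }
  return a.join('');
}

function allArrangements(zeros, ones) {
  let s = '0'.repeat(zeros) + '1'.repeat(ones);  // smallest arrangement
  const out = [s];
  while ((s = nextArrangement(s)) !== null) out.push(s);
  return out;
}
```

For example, `allArrangements(3, 3)` yields the twenty strings listed above, starting at `000111` and ending at `111000`.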


A more elegant approach would be the following; notice that every string with n 0s and m 1s can be generated like this:

Either start the string with a 0 and append every string generated from n-1 0s and m 1s, or start it with a 1 and append every string generated from n 0s and m-1 1s. The two sets of generated strings are disjoint, so a simple recursion will do; there is no need to worry about strings being generated multiple times. If you prefer, you can do it iteratively as well (iterate over the length of the generated string; for iteration i, keep sets for no 0 used, one 0 used, ..., i 0s used, and so on).

All in all, recursion seems the way to go to me. The base cases are simple: if n = 0, the only string you can get is 1^m; if m = 0, the only string you can get is 0^n (where ^ denotes repetition).
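A compact sketch of this recursion in JavaScript (essentially the asker's accepted function, rewritten with array helpers; `combos` is a name of my choosing):

```javascript
// All strings with `zeros` 0s and `ones` 1s, without duplicates.
function combos(zeros, ones) {
  if (zeros === 0) return ['1'.repeat(ones)];  // base case; also handles (0, 0) -> ['']
  if (ones === 0) return ['0'.repeat(zeros)];  // base case
  // Prefix '0' to one subproblem, '1' to the other; the sets are disjoint.
  return combos(zeros - 1, ones).map(s => '0' + s)
    .concat(combos(zeros, ones - 1).map(s => '1' + s));
}
```

For example, `combos(2, 1)` returns `['001', '010', '100']`.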


In case you want to implement that recursion and also a way to test it (to some degree), notice that the number of strings you can produce is (n + m choose n) = (n + m choose m), so counting the strings you get will give you a hint whether what you're doing works as intended. Not sure whether JavaScript offers easy access to binomial coefficients.
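JavaScript has no built-in binomial coefficient, but for the small values involved here a multiplicative helper is easy to sketch:

```javascript
// Binomial coefficient C(n, k), computed multiplicatively.
// Each intermediate value is itself a binomial coefficient, so it stays integral.
function choose(n, k) {
  if (k < 0 || k > n) return 0;
  let result = 1;
  for (let i = 1; i <= k; i++) {
    result = result * (n - k + i) / i;
  }
  return result;
}
```

So for three 0s and three 1s you would expect `choose(6, 3)`, i.e. 20 strings, matching the list above.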

OTHER TIPS

I assume you want to do this fast. If you represent the 0/1 pattern as the bits of an integer, the smallest such integer is "0..01..1" (e.g. 0001111 if you have 3 0-bits and 4 1-bits) and the largest is "1..10..0" (e.g. 1111000).

Your problem is then a known problem of generating the lexicographically next bit permutation (i.e. the next integer with same number of 1 bits).

A simple implementation using code from http://graphics.stanford.edu/~seander/bithacks.html#NextBitPermutation is the following (in pseudocode):

v = '0..01..1'b  // initialize v as the smallest integer with the pattern
print v
while ( v < '1..10..0'b ):
    t = (v | (v - 1)) + 1
    v = t | ((((t & -t) / (v & -v)) >> 1) - 1)
    print v

see http://www.geeksforgeeks.org/next-higher-number-with-same-number-of-set-bits/ and the references therein for a more detailed explanation of how the algorithm works.
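The pseudocode above translates directly to JavaScript, assuming the pattern fits in a 32-bit integer (JavaScript bitwise operators work on 32 bits; the function name is my own):

```javascript
// All strings with `zeros` 0s and `ones` 1s, via the next-bit-permutation trick.
// Assumes zeros + ones < 32 so the pattern fits in a 32-bit integer.
function bitCombinations(zeros, ones) {
  const n = zeros + ones;
  let v = (1 << ones) - 1;          // smallest pattern: 0..01..1
  const last = v << zeros;          // largest pattern:  1..10..0
  const out = [];
  for (;;) {
    out.push(v.toString(2).padStart(n, '0'));
    if (v >= last) break;
    const t = (v | (v - 1)) + 1;    // set lowest bit above the trailing 1s
    v = t | ((((t & -t) / (v & -v)) >> 1) - 1);  // refill the low 1s
  }
  return out;
}
```

For example, `bitCombinations(2, 1)` returns `['001', '010', '100']`, and `bitCombinations(3, 3)` yields the 20 strings listed in the solution.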

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow