Question

So I'm attempting to take a string of 1s and 0s and convert it to its decimal equivalent, treating the string as a bit string. I'm relatively unfamiliar with Java, so I wrote the function in Python first, shown below. It works perfectly.

def stringToBitString(bs):
    #
    # bs = "10101"
    #
    ans = 0       # result accumulator

    for bit in bs:
        ans = (ans << 1) | (ord(bit) - ord('0'))
    return ans

However, upon trying to translate it to Java, I came up with this.

public int toInt(String path) {

    int answer = 0;

    for(int i = 0; i < path.length(); i++) {

        int bit = path.charAt(i);
        answer = (answer << 1) | (bit - 0);
    }
    return answer;        
}

This method does give me an int, but it's the character's ASCII value rather than the numeric one. For example, stringToBitString("1") produces 1, whereas toInt("1") yields 49. Can anyone tell me what I'm doing wrong at this point?


Solution

You are subtracting the integer 0 from the characters '1' or '0', which have ASCII values 49 and 48, so the result stays 49 or 48. Subtract the character '0' instead, so the ASCII offset cancels out:

char bit = path.charAt(i);
answer = (answer << 1) | (bit - '0');
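
Folding that one change back into the original method gives the following (a minimal sketch, keeping the toInt name from the question):

public int toInt(String path) {

    int answer = 0;

    for (int i = 0; i < path.length(); i++) {

        // '0' is ASCII 48 and '1' is 49; subtracting '0' maps them to 0 and 1
        char bit = path.charAt(i);
        answer = (answer << 1) | (bit - '0');
    }
    return answer;
}

With this fix, toInt("1") returns 1 and toInt("10101") returns 21, matching the Python version. Note that for strings whose value fits in a signed int, the standard library call Integer.parseInt(path, 2) performs the same conversion.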