Question

So I'm attempting to take a string of 1s and 0s and convert it to its decimal equivalent, treating the string as a bit string. I'm relatively unfamiliar with Java, so I wrote the function in Python first, shown below. It works perfectly.

def stringToBitString(bs):
    # e.g. bs = "10101"
    ans = 0
    for bit in bs:
        # shift the accumulator left and OR in the next bit (0 or 1)
        ans = (ans << 1) | (ord(bit) - ord('0'))
    return ans

However, upon trying to translate it to Java, I came up with this.

public int toInt(String path) {
    int answer = 0;
    for (int i = 0; i < path.length(); i++) {
        int bit = path.charAt(i);
        answer = (answer << 1) | (bit - 0);
    }
    return answer;
}

This method does give me an int, but it's the ASCII value. For example, stringToBitString("1") produces 1, whereas toInt("1") yields 49. Can anyone tell me what I'm doing wrong at this point?

Solution

What you are doing is subtracting 0 from the character '1' or '0', which have the ASCII values 49 and 48, so the result is 49 or 48. Subtract the character '0' instead:

char bit = path.charAt(i);
answer = (answer << 1) | (bit - '0');  // '1' - '0' == 1, '0' - '0' == 0
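
For reference, here is a minimal runnable sketch of the corrected method (the class name BitStringDemo and the main method are just illustrative scaffolding, not part of the original code):

public class BitStringDemo {

    // Convert a string of '0'/'1' characters to its decimal value.
    public static int toInt(String path) {
        int answer = 0;
        for (int i = 0; i < path.length(); i++) {
            char bit = path.charAt(i);
            // Subtracting the character '0' (ASCII 48) maps '0'/'1' to 0/1.
            answer = (answer << 1) | (bit - '0');
        }
        return answer;
    }

    public static void main(String[] args) {
        System.out.println(toInt("1"));     // prints 1
        System.out.println(toInt("10101")); // prints 21
    }
}

As a side note, the standard library can do this directly: Integer.parseInt(path, 2) parses a binary string and throws a NumberFormatException on any character that is not a binary digit.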