This is because `s.charAt(i)` returns the character code of the digit, not its numeric value. The Unicode code point for '0' is U+0030; for '1', it is U+0031.
Change to
int ch = s.charAt(i) - '0';
to fix the problem (demo on ideone). This works because the code points of the decimal digits '0' through '9' are arranged consecutively. Subtracting the code point of '0' from the code point of any other decimal digit yields the "distance" between the two code points, which is exactly the digit's numeric value.
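A minimal, self-contained sketch of the fix (the string `"4711"` and the digit-summing loop are just illustrative assumptions):

```java
public class DigitValue {
    public static void main(String[] args) {
        String s = "4711";
        int sum = 0;
        for (int i = 0; i < s.length(); i++) {
            // s.charAt(i) alone would give the code point, e.g. 52 for '4'.
            // Subtracting '0' (U+0030) converts it to the numeric value,
            // since '0'..'9' occupy consecutive code points U+0030..U+0039.
            int ch = s.charAt(i) - '0';
            sum += ch;
        }
        System.out.println(sum); // 4 + 7 + 1 + 1 = 13
    }
}
```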