Question

Concerning primitives: when I convert from a smaller to a bigger type, the conversion is implicit, and when I convert from a bigger to a smaller type, I need an explicit cast; that is clear to me because data can be lost. But there is something I don't get: when converting between char and byte or short, I always need an explicit cast in both directions, even though a byte (8 bit) should fit into a char (16 bit). Why is that?

(see also http://docs.oracle.com/javase/tutorial/java/nutsandbolts/datatypes.html)

See my examples...

public class CastingTest
{
    public static void main(String[] args)
    {
        //casting from smaller to bigger types
        short c = 13;
        int d = c;

        byte f = 34;
        short g = f;

        byte h = 20;
        long i = h;

        byte var03 = 6;
        double var04 = var03;   

        //casting from bigger to smaller types
        int j = 12;
        short k = (short)j;

        long m = 56;
        int n = (int)m;

        double o = 19;
        short p = (short)o;

        //not possible without explicit cast, but why?
        byte var01 = 3;
        char var02 = (char)var01;

        short var05 = 5;
        char var06 = (char)var05;

        char var07 = 'k';
        short var08 = (short)var07;
    }
}

Solution

char is Java's only unsigned type: its value range is 0 to 65535, so it cannot represent the negative values of byte and short, and byte and short in turn cannot represent char's larger positive values. In other words, char's range does not fully contain any other Java type's range, and neither byte's nor short's range fully contains char's.

You must use an explicit cast for any conversion where the target type's range does not fully cover the source type's range, and between char and byte or short that is the case in both directions.
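Here is a minimal sketch (the class name CharRangeDemo is mine, not from the question) that makes those ranges visible: a negative byte does not survive a round trip through char, and a large char does not survive a round trip through short, which is why the compiler insists on an explicit cast in both directions.

public class CharRangeDemo
{
    public static void main(String[] args)
    {
        // char is unsigned: 0 to 65535
        System.out.println((int) Character.MIN_VALUE);    // 0
        System.out.println((int) Character.MAX_VALUE);    // 65535

        // byte and short are signed, so their ranges and char's range
        // only partially overlap: a negative byte cannot be represented
        // as a char...
        byte negative = -1;
        char fromByte = (char) negative;
        System.out.println((int) fromByte);               // 65535, not -1

        // ...and a large char cannot be represented as a short
        char big = 0xFFFF;                                 // 65535
        short fromChar = (short) big;
        System.out.println(fromChar);                      // -1, not 65535
    }
}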

OTHER TIPS

I've taken this from the Java Language Specification (§5.1.4, Widening and Narrowing Primitive Conversion):

"First, the byte is converted to an int via widening primitive conversion (§5.1.2), and then the resulting int is converted to a char by narrowing primitive conversion (§5.1.3)."

Licensed under: CC-BY-SA with attribution
Not affiliated with StackOverflow