Java: Subtract ‘0’ from char to get an int… why does this work?

That’s a clever trick. A char in Java is a 16-bit value, the same size as a short (though unlike short, char is unsigned). When you have a char that represents an ASCII/Unicode decimal digit (like ‘1’) and you subtract the char ‘0’ from it, you’re left with the digit’s numeric value (here, 1). This works because the code points for ‘0’ through ‘9’ are consecutive (‘0’ is 48, ‘1’ is 49, …, ‘9’ is 57), so the difference is exactly the digit.

Because char is an unsigned 16-bit type, every char value fits losslessly into an int, so the conversion is always safe. And you rarely need an explicit cast: whenever arithmetic is involved, Java’s binary numeric promotion converts both operands to int before the subtraction, so the result of char minus char is already an int.
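A minimal sketch of the trick (class and variable names are my own, just for illustration):

```java
public class CharDigitDemo {
    public static void main(String[] args) {
        char c = '7';              // code point 55
        // Both operands are promoted to int, so 55 - 48 = 7
        int digit = c - '0';
        System.out.println(digit); // prints 7

        // The same idea works for any digit character:
        for (char ch = '0'; ch <= '9'; ch++) {
            System.out.println(ch + " -> " + (ch - '0'));
        }
    }
}
```

For anything beyond plain ASCII digits (e.g. other scripts’ digit characters), `Character.getNumericValue(char)` is the more general, explicit alternative.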
