In C, why is sizeof(char) 1, when ‘a’ is an int?

In C, ‘a’ is a character constant, and character constants have type int (surprising, but true) — so sizeof(‘a’) is sizeof(int), which is 4 on your architecture. In an assignment like char c = ‘a’; the value is implicitly converted to char. sizeof(char) is always 1 by definition: the standard defines sizeof as yielding a size in bytes, and a byte in C is exactly the storage occupied by a char. Note that a C byte need not be 8 bits; it has at least 8 (see CHAR_BIT in &lt;limits.h&gt;).
