Using int for character types when comparing with EOF

getchar's result is the input character converted to unsigned char and then to int, or the negative value EOF; that is, it lies in the range -1 to 255, which is 257 distinct values. You cannot store 257 values in an 8-bit char without merging two of them. In practice, if char is unsigned you will mistake EOF for a valid character (EOF converts to 255), and if char is signed you will mistake a valid character for EOF (the byte 0xFF converts to -1).

Note: I’m assuming an 8-bit char type. I know this assumption isn’t backed by the standard; it is just by far the most common implementation choice.
