Why does #define not require a semicolon?

#define MAX_STRING 256;

means:

whenever the preprocessor finds MAX_STRING, it replaces it with 256; (semicolon included). In your case it turns method 2 into:

#include <stdio.h>
#include <stdlib.h>
#define MAX_STRING 256;

int main(void) {
    char buffer [256;];
}

which isn’t valid syntax. Replace

#define MAX_STRING 256;

with

#define MAX_STRING 256
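
With the semicolon removed, method 2 expands to valid C:

#include <stdio.h>
#include <stdlib.h>
#define MAX_STRING 256

int main(void) {
    char buffer[MAX_STRING];   /* now expands to: char buffer[256]; */
    (void)buffer;              /* suppress the unused-variable warning */
    return 0;
}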

The difference between your two versions is that the first method declares a constant object equal to 256, while the second defines MAX_STRING to stand for the text 256; in your source file.
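
Here is a minimal sketch of the distinction, assuming method 1 used a const int (the name max_string is illustrative, not from your code):

#include <stdio.h>

const int max_string = 256;   /* method 1 (sketch): a typed object with its own storage */
#define MAX_STRING 256        /* method 2: pure text substitution before compilation */

int main(void) {
    printf("%d %d\n", max_string, MAX_STRING);   /* both print 256 */
    return 0;
}

Note that in C a const int is not a compile-time constant, so char buffer[max_string] would be a variable-length array, whereas char buffer[MAX_STRING] is an ordinary fixed-size array.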

The #define directive is used to define values or macros that are used by the preprocessor to manipulate the program source code before it is compiled. Because preprocessor definitions are substituted before the compiler acts on the source code, any errors that are introduced by #define are difficult to trace.

The syntax is:

#define CONST_NAME VALUE

If there is a ; at the end, it is treated as part of VALUE.
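
For example, with a hypothetical SIZE macro that keeps the trailing semicolon:

#define SIZE 10;   /* the ; becomes part of the replacement text */

int main(void) {
    int x = SIZE               /* expands to: int x = 10;  (compiles by accident) */
    /* int y = SIZE * 2; */    /* would expand to: int y = 10; * 2;  (a syntax error) */
    (void)x;
    return 0;
}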

To understand exactly how #define works, try defining:

#define FOREVER for(;;)
...
    FOREVER {
         /* perform something forever */
    }
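
After preprocessing, FOREVER { ... } becomes for(;;) { ... }: the macro name is replaced verbatim by its definition, producing an infinite loop.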

Interesting remark by John Hascall:

Most compilers will give you a way to see the output after the preprocessor phase, this can aid with debugging issues like this.

In gcc this can be done with the -E flag.
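
For example, assuming the broken version above is saved as main.c:

gcc -E main.c

This prints the translation unit after preprocessing; the body of main appears as char buffer [256;];, which makes the stray semicolon easy to spot.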
