printf of a size_t variable with %lld, %ld and %u format specifiers

That’s because what you’ve pushed on the stack is three 32-bit values, while your format string tries to consume four of them: more accurately, one 64-bit value and two 32-bit values. (All of this assumes a platform where size_t is 32 bits wide and arguments are passed on the stack; mismatching format specifiers and argument types like this is undefined behaviour, so any result is possible.)

In the first case, the %lld sucks up two of the 32-bit values, the %ld sucks up the third, and the %u gets whatever happens to be on the stack after that, which could really be anything.

When you change the order of the format specifiers in the string, it works differently: the %ld sucks up the first 32-bit value, the %u sucks up the second, and the %lld sucks up the third plus whatever happens to be on the stack after it. That’s why you’re getting different values; it’s a data alignment/availability issue.
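For concreteness, here is a minimal sketch of the situation being described, assuming a 32-bit platform where size_t is 32 bits wide and arguments are passed on the stack (the variable name temp is taken from your snippets):

#include <stdio.h>

int main(void)
{
    size_t temp = 100;  /* assumed to be 32 bits wide on this platform */

    /* Both calls are undefined behaviour: the specifiers do not match
       the argument types, so the output can vary by platform. */
    printf("lld=%lld, ld=%ld, u=%u\n", temp, temp, temp);
    printf("ld=%ld, u=%u, lld=%lld\n", temp, temp, temp);
    return 0;
}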

You can see this in action with the first value. 429496729700 is equal to (4294967296 + 1) * 100, i.e., (2^32 + 1) * 100: the two stacked 100s are read together as one 64-bit value, 100 * 2^32 + 100. Your code snippet

printf("lld=%lld, ld=%ld, u=%u\n", temp, temp, temp);

actually has the following effect:

What you pass     Stack     What printf() uses
-------------     -----     ------------------
                 +-----+
100              | 100 | \
                 +-----+  = 64-bit value for %lld.
100              | 100 | /
                 +-----+
100              | 100 |    32-bit value for %ld.
                 +-----+
                 | ?   |    32-bit value for %u (could be anything).
                 +-----+
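If you want to see how two adjacent 32-bit slots each holding 100 combine into 429496729700, here is a small sketch that does the same reinterpretation explicitly (assuming a little-endian machine, where the lower-addressed slot becomes the low half of the 64-bit value):

#include <stdio.h>
#include <stdint.h>
#include <string.h>

int main(void)
{
    uint32_t slots[2] = { 100, 100 };  /* two stack slots, each holding 100 */
    long long combined;

    /* Reinterpret the two 32-bit slots as one 64-bit value, which is
       effectively what %lld does here. */
    memcpy(&combined, slots, sizeof combined);
    printf("%lld\n", combined);  /* prints 429496729700 on little-endian */
    return 0;
}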

In the second case

printf("ld=%ld, u=%u, lld=%lld\n", temp, temp, temp);

the following occurs:

What you pass     Stack     What printf() uses
-------------     -----     ------------------
                 +-----+
100              | 100 |    32-bit value for %ld.
                 +-----+
100              | 100 |    32-bit value for %u.
                 +-----+
100              | 100 | \
                 +-----+  = 64-bit value for %lld (half of it
                 | ?   | /  could be anything).
                 +-----+
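The portable fix, for what it’s worth, is to use a specifier that actually matches size_t, or to cast the argument to the type the specifier expects. A sketch, assuming C99 or later:

#include <stdio.h>

int main(void)
{
    size_t temp = 100;

    printf("zu=%zu\n", temp);               /* %zu matches size_t (C99) */
    printf("lld=%lld\n", (long long)temp);  /* or cast to the expected type */
    printf("lu=%lu\n", (unsigned long)temp);
    return 0;
}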
