How does *(&arr + 1) - arr give the length in elements of array arr?

This is a minefield, but I’ll give it a try:

  • &arr returns a pointer to an int[5], i.e. an int(*)[5]
  • + 1 advances that pointer by one whole int[5], i.e. by sizeof(int[5]) bytes (see the sketch after this list)
  • *(&arr + 1) dereferences the result back to an array lvalue, an int(&)[5] when bound to a reference
    I don’t know if this causes undefined behavior, but if it doesn’t, the next step will be:
  • *(&arr + 1) - arr does pointer arithmetic after the two int[5]s have decayed to int pointers, returning the difference between the two int pointers, which is 5.
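
To see the step size concretely, here is a minimal sketch that prints the raw addresses; it assumes a typical flat address space where pointers can be inspected as integers:

#include <cstdint>
#include <iostream>

int main()
{
    int arr[5] = {5, 8, 1, 3, 6};

    auto a = reinterpret_cast<std::uintptr_t>(&arr);      // address of the whole array
    auto b = reinterpret_cast<std::uintptr_t>(&arr + 1);  // one int[5] further on

    std::cout << b - a << '\n';        // sizeof(int[5]), typically 20
    std::cout << sizeof(arr) << '\n';  // the same value: + 1 stepped one whole array
    return 0;
}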

Rewritten to make it a bit clearer:

#include <iostream>

int main()
{
    int arr[5] = {5, 8, 1, 3, 6};

    int (*begin_ptr)[5] = &arr + 0;     // begin_ptr is an int(*)[5]
    int (*end_ptr)[5]   = &arr + 1;     // end_ptr is an   int(*)[5]

    // Note:
    //       begin_ptr + 1        ==  end_ptr
    //       end_ptr - begin_ptr  ==  1

    int (&begin_ref)[5] = *begin_ptr;   // begin_ref is an int(&)[5]
    int (&end_ref)[5]   = *end_ptr;     // end_ref is an   int(&)[5]   UB here?

    auto len = end_ref - begin_ref;     // the array references decay into int*
    std::cout << "The length of the array is: " << len << '\n'; // 5
    return 0;
}

I’ll leave the question of whether this is UB open, but taking a reference to an object whose storage was never allocated does look a bit suspicious.
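
For what it’s worth, here is a sketch of two standard alternatives that compute the same length without the questionable dereference (std::size requires C++17):

#include <iterator>

int arr[5] = {5, 8, 1, 3, 6};

auto len1 = sizeof(arr) / sizeof(arr[0]);  // 5, the classic C idiom, no dereference involved
auto len2 = std::size(arr);                // 5, std::size from <iterator>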
