Why don’t structs support inheritance?

The reason value types can’t support inheritance is arrays.

The problem is that, for performance and GC reasons, arrays of value types are stored “inline”. For example, given new FooType[10] {...}, if FooType is a reference type, 11 objects will be created on the managed heap: one for the array, and one for each of the 10 element instances. If FooType is instead a value type, only one object is created on the managed heap — the array itself — because each element is stored “inline” within the array.
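A minimal sketch of the two layouts, using hypothetical RefPoint and ValPoint types (my names, not from the original post):

class RefPoint  { public int X, Y; }    // reference type
struct ValPoint { public int X, Y; }    // value type

// 11 heap objects: the array, plus one RefPoint per element.
RefPoint[] refs = new RefPoint[10];
for (int i = 0; i < refs.Length; ++i)
    refs[i] = new RefPoint();

// 1 heap object: the array itself; all 10 ValPoint values live inline in it.
ValPoint[] vals = new ValPoint[10];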

Now, suppose we had inheritance with value types. When combined with the above “inline storage” behavior of arrays, Bad Things happen, as can be seen in C++.

Consider this pseudo-C# code:

struct Base
{
    public int A;
}

struct Derived : Base
{
    public int B;
}

void Square(Base[] values)
{
    for (int i = 0; i < values.Length; ++i)
        values[i].A *= 2;
}

Derived[] v = new Derived[2];
Square(v);

By the normal array-covariance rules, a Derived[] is convertible to a Base[] (for better or worse), so if you s/struct/class/g in the above example, it’ll compile and run as expected, with no problems — see the sketch below. But if Base and Derived are value types, and arrays store their values inline, then we have a problem.
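For reference, here is roughly what that class-based version looks like as a complete, runnable program (the initial field values are mine):

using System;

class Base    { public int A; }
class Derived : Base { public int B; }

static class Program
{
    static void Square(Base[] values)
    {
        for (int i = 0; i < values.Length; ++i)
            values[i].A *= 2;
    }

    static void Main()
    {
        // Array covariance: Derived[] converts to Base[] at the call site.
        Derived[] v = { new Derived { A = 1, B = 10 }, new Derived { A = 2, B = 20 } };
        Square(v);
        Console.WriteLine(v[1].A);   // 4
        Console.WriteLine(v[1].B);   // 20, untouched
    }
}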

We have a problem because Square() doesn’t know anything about Derived; it uses only pointer arithmetic to access each element of the array, incrementing by a constant amount (sizeof(Base)). The generated code would be vaguely like:

for (int i = 0; i < values.Length; ++i)
{
    Base* value = (Base*) (((byte*) values) + i * sizeof(Base));
    value->A *= 2;
}

(Yes, that’s abominable pseudo-assembly, but the point is that we step through the array in strides of a compile-time constant size, without any knowledge that a derived type is being used.)

So, if this actually happened, we’d have memory corruption issues. The array lays its elements out sizeof(Derived) bytes apart, but Square() strides through it sizeof(Base) bytes at a time — so within Square(), values[1].A *= 2 would actually be modifying values[0].B!
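C# won’t let us write the broken Square() directly, but an unsafe sketch can simulate the bad stride arithmetic by hand (the field values and variable names are mine; this needs to be compiled with unsafe code enabled):

using System;

struct Derived { public int A; public int B; }    // pretend A came from Base

static class Program
{
    unsafe static void Main()
    {
        var values = new Derived[2];
        values[0] = new Derived { A = 1, B = 100 };
        values[1] = new Derived { A = 2, B = 200 };

        fixed (Derived* p = values)
        {
            // Stride by sizeof(int), i.e. the size of "Base", instead of
            // sizeof(Derived): "element 1" lands on values[0].B.
            int* element1 = (int*) ((byte*) p + 1 * sizeof(int));
            *element1 *= 2;
        }

        Console.WriteLine(values[0].B);   // 200 -- values[0].B was corrupted
        Console.WriteLine(values[1].A);   // 2   -- values[1].A was never touched
    }
}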

Try to debug THAT!
