Why not to inherit from std::allocator

A lot of people are going to post in this thread that you should not inherit from std::allocator because it doesn’t have a virtual destructor. They’ll talk about polymorphism and slicing and deleting through a pointer to the base class, none of which are permitted by the allocator requirements as detailed in section 17.6.3.5 [allocator.requirements] of the standard. Until someone demonstrates that a class derived from std::allocator fails to meet one of those requirements, it’s simply cargo-cult mentality.
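
For the record, here is a sketch (mine, not part of the original answer) of the kind of class people typically derive from std::allocator to reuse its boilerplate typedefs and members. The one thing that must be redefined is rebind, so that rebinding containers such as std::list don’t silently fall back to plain std::allocator; with that in place, the derived class meets the allocator requirements just as std::allocator does:

#include <memory>

template <typename T>
struct derived_allocator : std::allocator<T> {
  // rebind must name *this* template; the inherited rebind would name
  // std::allocator and break the round-trip the requirements demand.
  template <class U>
  struct rebind { using other = derived_allocator<U>; };

  derived_allocator() = default;
  // Converting constructor so a rebound copy (e.g. a node allocator
  // inside std::list) can be constructed from this one.
  template <class U>
  derived_allocator(const derived_allocator<U>&) {}
};

Containers hold their allocator by value and never delete it through a base-class pointer, which is why the missing virtual destructor is a non-issue.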

That said, there is little reason to derive from std::allocator in C++11. C++11’s overhaul of allocators introduced the traits template std::allocator_traits to sit between an allocator and its users and provide reasonable defaults for many of the required features via template metaprogramming. A minimal allocator in C++11 can be as simple as:

#include <cstddef>
#include <cstdlib>
#include <iostream>
#include <limits>
#include <new>

template <typename T>
struct mallocator {
  using value_type = T;

  mallocator() = default;
  template <class U>
  mallocator(const mallocator<U>&) {}

  T* allocate(std::size_t n) {
    // Reject requests whose byte count would overflow std::size_t.
    if (n <= std::numeric_limits<std::size_t>::max() / sizeof(T)) {
      if (auto ptr = std::malloc(n * sizeof(T))) {
        std::cout << "allocate(" << n << ") = " << ptr << '\n';
        return static_cast<T*>(ptr);
      }
    }
    throw std::bad_alloc();
  }
  void deallocate(T* ptr, std::size_t) {
    // The element count is not needed; std::free tracks the block size.
    std::free(ptr);
  }
};

// mallocator is stateless, so any instance can deallocate memory obtained
// from any other instance; all instances therefore compare equal.
template <typename T, typename U>
inline bool operator == (const mallocator<T>&, const mallocator<U>&) {
  return true;
}

template <typename T, typename U>
inline bool operator != (const mallocator<T>& a, const mallocator<U>& b) {
  return !(a == b);
}
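
As a usage sketch (my addition, not in the original answer), the allocator simply plugs into any standard container’s allocator template parameter:

#include <vector>

int main() {
  // The vector allocates through mallocator instead of std::allocator,
  // so the allocate() trace above prints for each growth step.
  std::vector<int, mallocator<int>> v;
  v.push_back(1);
  v.push_back(2);
}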

EDIT: Proper use of std::allocator_traits isn’t fully present in all standard libraries yet. For example, the sample allocator above doesn’t work correctly with std::list when compiled with GCC 4.8.1: that std::list implementation hasn’t yet been updated to go through std::allocator_traits, so it complains about the minimal allocator’s missing members.
