If you must define your own string type, don't inherit from std::string — it has no virtual destructor and is not designed to be a base class. Instead, define your own character-traits class and do something like
typedef std::basic_string<unsigned char, utf8_traits> utf8string;
See also Herb Sutter’s website.
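The typedef above only compiles if a suitable traits class actually exists, since the standard library provides no std::char_traits specialization for unsigned char. Below is a minimal sketch of what such a utf8_traits class could look like; the member set shown is the subset std::basic_string typically requires, and the byte-wise semantics (memcmp-style comparison, etc.) are an assumption — a real UTF-8 traits class might add more policy on top.

```cpp
#include <cstddef>   // std::size_t
#include <cstring>   // std::memcmp, std::memchr, std::memmove, std::memcpy, std::memset
#include <cwchar>    // std::mbstate_t
#include <ios>       // std::streamoff, std::streampos
#include <string>    // std::basic_string

// Minimal character-traits sketch for unsigned char code units.
// Each member mirrors the std::char_traits interface; operations are
// plain byte-wise, which is sufficient for storing UTF-8 code units.
struct utf8_traits {
    using char_type  = unsigned char;
    using int_type   = int;
    using off_type   = std::streamoff;
    using pos_type   = std::streampos;
    using state_type = std::mbstate_t;

    static void assign(char_type& r, const char_type& a) { r = a; }
    static bool eq(char_type a, char_type b) { return a == b; }
    static bool lt(char_type a, char_type b) { return a < b; }

    static int compare(const char_type* s1, const char_type* s2, std::size_t n) {
        return std::memcmp(s1, s2, n);           // unsigned byte comparison
    }
    static std::size_t length(const char_type* s) {
        std::size_t n = 0;
        while (s[n]) ++n;                        // NUL-terminated, like strlen
        return n;
    }
    static const char_type* find(const char_type* s, std::size_t n, const char_type& c) {
        return static_cast<const char_type*>(std::memchr(s, c, n));
    }
    static char_type* move(char_type* d, const char_type* s, std::size_t n) {
        return static_cast<char_type*>(std::memmove(d, s, n));
    }
    static char_type* copy(char_type* d, const char_type* s, std::size_t n) {
        return static_cast<char_type*>(std::memcpy(d, s, n));
    }
    static char_type* assign(char_type* d, std::size_t n, char_type c) {
        return static_cast<char_type*>(std::memset(d, c, n));
    }

    static int_type  to_int_type(char_type c)  { return c; }
    static char_type to_char_type(int_type i)  { return static_cast<char_type>(i); }
    static bool eq_int_type(int_type a, int_type b) { return a == b; }
    static int_type eof()                { return -1; }
    static int_type not_eof(int_type i)  { return i == eof() ? 0 : i; }
};

typedef std::basic_string<unsigned char, utf8_traits> utf8string;
```

Note that utf8string still measures length in bytes, not code points; the traits class only makes the byte storage type-safe and distinct from std::string, so UTF-8-aware operations (code-point iteration, validation) would live in free functions on top of it.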