Urgent: Initialize unicode STL string

Discussion in 'Mac Programming' started by goyalpk, Aug 15, 2009.

  1. goyalpk macrumors newbie

    Joined:
    Aug 7, 2009
    #1
    Hi there,
    I have declared a Unicode STL string as shown below. How can I initialize this string with a hard-coded literal, the way a wide-character string is initialized?

    typedef std::basic_string<unichar> UnicodeString;

    // This line does not compile: L"SampleApp" is a const wchar_t*,
    // not a const unichar*.
    UnicodeString myString = L"SampleApp";

    Thanks so much in advance.
    Pankaj
     
  2. lee1210 macrumors 68040

    lee1210

    Joined:
    Jan 10, 2005
    Location:
    Dallas, TX
    #2
    You might need to be more specific about your intent. It seems like you're trying to do some C++ shenanigans using NSString's unichar, defined here:
    http://developer.apple.com/document...e/NSString.html#//apple_ref/doc/c_ref/unichar

    If so, what is so bad about an NSString? If you're writing straight C++, why not wchar_t?

    With all of that said, using L'w' isn't always going to give you a 16-bit value that's the same thing as a unichar. This just seems like a whole lot of trouble. You'd need to write a char_traits for unichar, then, I guess, initialize using an array of unichars, with each character cast to unichar (see the sketch below).
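    If you go the basic_string<unichar> route anyway, a minimal sketch of that array-of-unichars idea might look like the following. It assumes unichar is a 16-bit unsigned type (as in Cocoa's headers; my_unichar below is a stand-in, since the Cocoa header isn't included) and that your library's generic std::char_traits template happens to work for it; strictly speaking you'd still want to supply your own traits class:

    #include <string>

    typedef unsigned short my_unichar;  // stand-in for Cocoa's 16-bit unichar
    typedef std::basic_string<my_unichar> UnicodeString;

    int main()
    {
        // There is no string literal of the right type, so build the
        // string from an explicit array of unichar values instead.
        const my_unichar chars[] = { 'S', 'a', 'm', 'p', 'l', 'e' };
        UnicodeString s(chars, sizeof(chars) / sizeof(chars[0]));
        return 0;
    }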

    Short and long of it: why aren't you using NSString or wchar_t?

    -Lee
     
  3. goyalpk thread starter macrumors newbie

    Joined:
    Aug 7, 2009
    #3
    Thanks so much for the reply. I need to use unichar: wchar_t is 4 bytes on the Mac while on Windows it is 2 bytes, and all our algorithms were written assuming wchar_t is 2 bytes.
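    For what it's worth, here's a quick check of that difference; it should print 4 when built with GCC on the Mac and 2 with MSVC on Windows:

    #include <cstdio>

    int main()
    {
        // Expected: 4 with GCC on Mac OS X, 2 with MSVC on Windows.
        std::printf("sizeof(wchar_t) = %u\n", (unsigned)sizeof(wchar_t));
        return 0;
    }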
    Thanks.
    Pankaj

     
  4. gnasher729 macrumors P6

    gnasher729

    Joined:
    Nov 25, 2005
    #4
    Why not use plain std::string with UTF-8 encoding? No incompatibility between Windows and MacOS X, no byte ordering problems.
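    A minimal sketch of that approach: the literal here is plain ASCII, which is already valid UTF-8, and non-ASCII text would simply be stored as multi-byte sequences in the same std::string.

    #include <cstdio>
    #include <string>

    int main()
    {
        // A plain std::string holding UTF-8 bytes: the same byte
        // sequence on Windows and Mac OS X, no endianness to worry about.
        std::string name = "SampleApp";
        std::printf("%s (%u bytes)\n", name.c_str(), (unsigned)name.size());
        return 0;
    }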
     
