How to convert a Unicode character to an escaped character in Objective-C?

Discussion in 'iOS Programming' started by madcat, Feb 3, 2011.

  1. madcat macrumors newbie

    Feb 3, 2011
    How do I convert a Unicode character into an escaped-character string?
  2. chown33 macrumors 604

    Aug 9, 2009
    Open the NSString class reference.
    Search for: NSNonLossyASCIIStringEncoding
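    For reference, a minimal sketch of what that encoding does (manual retain/release style, since this predates ARC; the example string is illustrative). Encoding with NSNonLossyASCIIStringEncoding turns non-ASCII characters into backslash escapes, and decoding reverses it:

    ```objc
    #import <Foundation/Foundation.h>

    int main(void) {
        NSAutoreleasePool *pool = [[NSAutoreleasePool alloc] init];

        // A string containing a non-ASCII character (é, U+00E9).
        NSString *unicode = @"caf\u00e9";

        // Encode: non-ASCII characters become backslash escapes
        // (\ddd octal for U+0080..U+00FF, \uXXXX above that).
        NSData *escapedBytes = [unicode dataUsingEncoding:NSNonLossyASCIIStringEncoding];
        NSString *escaped = [[[NSString alloc] initWithData:escapedBytes
                                                   encoding:NSASCIIStringEncoding] autorelease];
        NSLog(@"escaped: %@", escaped);      // caf\351

        // Decode: the same encoding turns the escapes back into Unicode.
        NSString *roundTrip = [[[NSString alloc] initWithData:escapedBytes
                                                     encoding:NSNonLossyASCIIStringEncoding] autorelease];
        NSLog(@"round trip: %@", roundTrip); // café

        [pool drain];
        return 0;
    }
    ```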
  3. madcat, Feb 3, 2011
    Last edited by a moderator: Feb 3, 2011

    madcat thread starter macrumors newbie

    Feb 3, 2011
    Thanks. Below is the code I'm using. Is this correct?

    NSData *xmlData = [NSData dataWithContentsOfFile:pathXMLSkill];
    //NSString *skillData = [[[NSString alloc] initWithData:xmlData encoding:NSUTF8StringEncoding] autorelease];
    NSString *skillData = [[[NSString alloc] initWithData:xmlData encoding:NSNonLossyASCIIStringEncoding] autorelease];
    //xmlDocPtr doc = xmlParseMemory([skillData UTF8String], [skillData lengthOfBytesUsingEncoding:NSUTF8StringEncoding]);
    xmlDocPtr doc = xmlParseMemory([skillData UTF8String], [skillData lengthOfBytesUsingEncoding:NSNonLossyASCIIStringEncoding]);
    stringSkill = [NSString stringWithUTF8String:(const char *)keywordSkill];
  4. chown33 macrumors 604

    Aug 9, 2009
    It's probably wrong. The degree of wrongness, and the consequences, are data-dependent: some input data would happen to work, while other data would fail.

    NSData *xmlData = [NSData dataWithContentsOfFile:pathXMLSkill];
    NSString *skillData = [[[NSString alloc] initWithData:xmlData encoding:NSNonLossyASCIIStringEncoding] autorelease];  // green
    xmlDocPtr doc = xmlParseMemory([skillData UTF8String], [skillData lengthOfBytesUsingEncoding:NSNonLossyASCIIStringEncoding]);  // red: the length argument
    stringSkill = [NSString stringWithUTF8String:(const char *)keywordSkill];  // blue
    The green-highlighted statement might be wrong. XML has a small number of canonical encodings, and any other encoding should be declared in the data itself. NonLossyASCII is not one of the canonical encodings, so converting the bytes to a string under the assumption that they contain no invalid NonLossyASCII sequences might be wrong. Furthermore, backslash has a special meaning in NonLossyASCII, so unless you can guarantee that backslash is always and only used to signify a NonLossyASCII escape, the conversion could misinterpret the data.
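    To make the backslash hazard concrete, here's a sketch (the sample bytes are illustrative): if the raw data happens to contain a literal backslash followed by octal digits, NonLossyASCII decoding consumes them as an escape:

    ```objc
    // Ten literal ASCII bytes: 5 0 \ 3 5 1 space o f f
    NSData *raw = [@"50\\351 off" dataUsingEncoding:NSASCIIStringEncoding];

    // Decoding as NonLossyASCII treats "\351" as the octal escape for é,
    // so the literal backslash sequence is silently rewritten.
    NSString *decoded = [[[NSString alloc] initWithData:raw
                                               encoding:NSNonLossyASCIIStringEncoding] autorelease];
    // decoded is "50é off", not the original "50\351 off"
    ```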

    The red-highlighted fragment is almost certainly wrong. The UTF-8 representation of a string is almost certainly shorter than its NonLossyASCII-encoded representation, so you're passing xmlParseMemory() a byte count that is almost certainly greater than the actual number of UTF-8 bytes.
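    The size mismatch is easy to demonstrate; in this sketch the exact counts assume the sample string shown:

    ```objc
    NSString *s = @"caf\u00e9";  // 4 characters, the last one non-ASCII

    // UTF-8: 3 ASCII bytes + 2 bytes for é = 5 bytes.
    NSUInteger utf8Len = [s lengthOfBytesUsingEncoding:NSUTF8StringEncoding];

    // NonLossyASCII: 3 ASCII bytes + 4 bytes for "\351" = 7 bytes.
    NSUInteger escapedLen = [s lengthOfBytesUsingEncoding:NSNonLossyASCIIStringEncoding];

    // Passing escapedLen (7) alongside the UTF8String pointer (5 valid bytes)
    // tells the parser to read 2 bytes past the real data.
    ```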

    There's no way to assess the correctness of the blue-highlighted statement, since none of its variables appear in any other code.

    Please provide an example of the XML data with NonLossyASCII. You should probably be decoding it as part of the XML parse operation, instead of before the XML parse.
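    For example, a sketch of the decode-after-parse approach (DecodeNonLossy is a hypothetical helper, and keywordSkill is assumed to be the C string your parse produced): parse the file as ordinary UTF-8 XML first, then undo the escapes only in the extracted text values:

    ```objc
    // Hypothetical helper: turn a NonLossyASCII-escaped string back into Unicode.
    static NSString *DecodeNonLossy(NSString *escaped) {
        NSData *bytes = [escaped dataUsingEncoding:NSASCIIStringEncoding];
        return [[[NSString alloc] initWithData:bytes
                                      encoding:NSNonLossyASCIIStringEncoding] autorelease];
    }

    // After the XML parse has produced a node's text (e.g. keywordSkill),
    // decode it as a separate step:
    NSString *stringSkill =
        DecodeNonLossy([NSString stringWithUTF8String:(const char *)keywordSkill]);
    ```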
