Dealing with Hex

Discussion in 'iOS Programming' started by blubyu, Jun 6, 2013.

  1. blubyu macrumors member

    Joined:
    Feb 10, 2010
    #1
Please bear with me as I am a total noob when it comes to programming. I am writing a small app that sends out multicast traffic. I have the very basic app working and can multicast out the string "hello world". I have verified this by capturing the data with Wireshark. What I need to do now is send out hex values instead of a string. For some reason I am totally baffled as to how to put hex into my variable. Any help here would be greatly appreciated.

    This is what my variable looks like now:

    Code:
    NSData *data = [@"hello world" dataUsingEncoding:NSUTF8StringEncoding];
     
  2. ArtOfWarfare, Jun 6, 2013
    Last edited by a moderator: Jun 6, 2013
  3. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #3
    Thank you for the link. After I read it a couple times I think it is exactly what I need.
     
  4. PhoneyDeveloper macrumors 68030

    PhoneyDeveloper

    Joined:
    Sep 2, 2008
    #4
    I assume you mean binary data. You can create an NSMutableData and use appendBytes to append ints and any kind of binary data you like. Or you can create a buffer that holds your binary data and use dataWithBytes:length: to create your NSData object.
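    In plain C terms (the Foundation calls can't be run outside Objective-C), the appendBytes: pattern amounts to copying raw bytes into a growing buffer; a minimal sketch, with the field values invented purely for illustration:

    ```c
    #include <stdint.h>
    #include <string.h>
    #include <stdio.h>

    int main(void) {
        // Fixed-size buffer standing in for an NSMutableData object.
        uint8_t packet[64];
        size_t len = 0;

        // Append a 16-bit field byte by byte, as appendBytes: would.
        uint16_t preamble = 0x0010;              // example field
        packet[len++] = (preamble >> 8) & 0xFF;  // high byte first (network order)
        packet[len++] = preamble & 0xFF;

        // Append a run of raw bytes in one shot.
        const uint8_t id[] = { 0x41, 0x53, 0x43 };  // "ASC"
        memcpy(packet + len, id, sizeof id);
        len += sizeof id;

        printf("%zu\n", len);                        // 5 bytes appended so far
        printf("%02x %02x\n", packet[0], packet[1]); // 00 10
        return 0;
    }
    ```

    The finished buffer would then go to dataWithBytes:length: to make the NSData object.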
     
  5. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #5
    I was able to take the example from the link above and make it work. I have a feeling that as I get more into my project I might need something a little more flexible. Every little bit I learn shows me how much more I want to know :)
     
  6. PhoneyDeveloper macrumors 68030

    PhoneyDeveloper

    Joined:
    Sep 2, 2008
    #6
    OK. That example was different from what I thought you meant (which is fine). Why do you want a hex version of an ascii string?

    Anyway, I would use a mutable string and appendFormat rather than the sprintf and malloc shown in that example. Up to you.
     
  7. lastcall macrumors member

    lastcall

    Joined:
    Jan 10, 2013
    #7
    Maybe he wants to encode the plain-text message to hex format so it's not easily read by prying eyes.
    Otherwise I see no point in doing it. If you encode it, then you need to decode it too.
     
  8. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #8
    Well, to be honest, I want to deal with hex only. I am very new to programming and for the life of me I couldn't figure out how to get hex values into my variable. The example from the link above allows me to create a string with the hex values that I want and then convert that to straight hex in a variable. For now that will get me started. Eventually, though, I would like to cut the conversion out and deal straight with the hex values.

    Does that make sense?
     
  9. chown33, Jun 7, 2013
    Last edited: Jun 7, 2013

    chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #9
    No, not really.

    Some of the places where you say "hex" are clearly wrong, and the only conceivably correct word is "binary".

    Also "in a variable" is vague. A variable can be anything whose value varies. It might be a number or an object. The type of the object might be NSData or NSString or something else. So when you say "convert that to straight hex in a variable" it's not at all clear what you're trying to accomplish.


    Are you learning from a book or tutorial? Exactly which one? Please post the exact URL of an online tutorial, or the complete title, author, and edition of a book.

    EDIT
    Be specific about exactly what you want to happen. Don't describe how, just what.

    Example:
    I have the input string @"Greetings, earthoids and humanlings." I want to send this out in a multicast message, where each ASCII character in the original message is converted into two hex digits in the ASCII character set. For example, the letter 'G' would appear as the two letters "47" in the multicast message.​
    Note that in addition to describing exactly what input and output will be, I have specified a character encoding (ASCII) for interpreting the input string as binary character codes. I have also specified the output character set for the hex data.

    Because ASCII is a limited character set, and Unicode is larger, you have to be specific about what each character of an input string will be interpreted as. You also have to specify what character set the hex output will be.

    If you really mean "binary" instead of "hex", then you have to say "binary" at the correct place in the description.
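    The spec given above ('G' becomes the two ASCII characters "47") can be sketched in plain C; the helper name hex_encode is mine, not something from the thread:

    ```c
    #include <stdio.h>
    #include <string.h>

    // Write two ASCII hex digits per input character into out
    // (caller must size out to at least 2 * strlen(in) + 1).
    static void hex_encode(const char *in, char *out) {
        static const char digits[] = "0123456789abcdef";
        while (*in) {
            unsigned char c = (unsigned char)*in++;
            *out++ = digits[c >> 4];    // high nibble
            *out++ = digits[c & 0x0F];  // low nibble
        }
        *out = '\0';
    }

    int main(void) {
        char out[128];
        hex_encode("G", out);
        printf("%s\n", out);  // 'G' is ASCII 0x47, so this prints 47
        return 0;
    }
    ```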
     
  10. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #10
    OK, I get that what I'm saying isn't exactly what I mean. But I am trying :)

    What my app will do is multicast out a 638 byte packet. The first 126 bytes are fairly static (I have to increment the sequence number with every packet sent). The final 512 bytes are the data that I need the "receivers" to see.

    I know what the entire packet needs to look like in hex which is why I keep referring to hex in my posts. I'm not currently in front of my iMac or I would post the hex values so that you could see what I'm trying to send.

    Currently what I'm doing is creating an NSString with the value that I want to send (for example: aaaaaaaabbbbbbbbcccccccc), then converting that to an NSData in the format I need to send (aaaaaaaa bbbbbbbb cccccccc).

    What I'm doing is working fine. I am capturing the multicast packet with Wireshark, and the "receiver" on my network is seeing the data and reacting the way it should (turning lights on and off and changing colors).

    For now I am going to continue using what I have but eventually I would like to learn a better (more efficient, flexible) way of doing it.
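    The one per-packet change described above (bumping a sequence number somewhere in the 126 header bytes) is just an in-place increment of one byte; a sketch in plain C, where the offset is purely hypothetical since the real one comes from the protocol spec, not this thread:

    ```c
    #include <stdint.h>
    #include <stdio.h>

    int main(void) {
        // 638-byte packet: 126 bytes of headers, then 512 bytes of channel data.
        uint8_t packet[638] = { 0 };

        // Hypothetical offset of the sequence-number byte within the headers;
        // look up the actual offset in the protocol document.
        const size_t SEQ_OFFSET = 111;

        for (int i = 0; i < 3; i++) {
            packet[SEQ_OFFSET]++;  // bump the sequence number; wraps past 255
            // ...send all 638 bytes here...
        }
        printf("%u\n", packet[SEQ_OFFSET]);  // 3 after three sends
        return 0;
    }
    ```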
     
  11. xArtx, Jun 7, 2013
    Last edited: Jun 7, 2013

    xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #11
    Then it sounds like you're dealing with an ASCII representation of hex data.

    If what you are doing is working why are you bothering with the NSString first?

    Could you not do this in the first place:
    Code:
    NSData *data = [@"123456789ABCDEF" dataUsingEncoding:NSUTF8StringEncoding];
    or does the initial entry of the data require you to receive it as a string?
     
  12. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #12
    At this point I think what I need to do is test a few things out with my code so that I can better describe what I'm trying to do. It is very possible that I'm making this much harder than it needs to be :)
     
  13. chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #13
    Please do that.

    This is another example of an ambiguous description. You're using hex characters for both the NSString and the NSData representation.

    Accuracy is important in programming. Post a real example of real data when you get to your Mac. The fake data isn't clarifying anything.

    This suggests you're controlling some Hue lighting, or maybe some other devices. If that's so, then say it. We can look up the protocol for talking to Hue lights. Same for Insteon or other devices. Without knowing the device, your question has no context.
     
  14. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #14
    Ok, hopefully I can explain a little better by showing the code I am using :)

    This is part of the 638 byte packet that I am trying to send:

    Code:
    NSString *rootLayer=@"001000004153432d45312e3137000000726e00000004c0de0080c69b11e095720800200c9a66";
    If I send that out as a string without interpreting it as hex values then I get this:
    Code:
    30 30 31 30 30 30 30 30   34 31 35 33 34 33 32 64    001000004153432d
    34 35 33 31 32 65 33 31   33 37 30 30 30 30 30 30    45312e3137000000
    37 32 36 65 30 30 30 30   30 30 30 34 63 30 64 65    726e00000004c0de
    30 30 38 30 63 36 39 62   31 31 65 30 39 35 37 32    0080c69b11e09572
    30 38 30 30 32 30 30 63   39 61 36 36                     0800200c9a66
    
    As you can see, it is sending the exact string that I told it to. I don't want this. I need to send that string as hex values, not ASCII characters.

    When I take the string above and run it through my code to interpret the string as hex, I get this:

    Code:
    00 10 00 00 41 53 43 2d   45 31 2e 31 37 00 00 00   ....ASC-E1.17...
    72 6e 00 00 00 04 c0 de   00 80 c6 9b 11 e0 95 72   rn..............
    08 00 20 0c 9a 66                                               .. ..f  
    
    As you can see, this time it sent the actual hex codes that I want to send on the wire.

    I just reread the post from the link that was posted earlier (which is where I got my code from). Here is probably an easier way of trying to describe what I am doing:

    I have an NSString of hexadecimal digits like this:

    Code:
    NSString *rootLayer=@"001000004153432d45312e3137000000726e00000004c0de0080c69b11e095720800200c9a66";
    And I want to change it to NSData like this:

    Code:
    NSData *rootLayer=<00100000 4153432d 45312e31 37000000 726e0000 0004c0de 0080c69b 11e09572 0800200c 9a66>;
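    That conversion (two hex digits in, one raw byte out) can be sketched in plain C; the Objective-C wrapper would just hand the resulting buffer to dataWithBytes:length:. The helper name hex_decode is mine, not from the linked post:

    ```c
    #include <stdio.h>

    // Parse a string of hex-digit pairs into raw bytes; returns the byte count.
    static size_t hex_decode(const char *hex, unsigned char *out) {
        size_t n = 0;
        while (hex[0] && hex[1]) {
            unsigned int byte;
            sscanf(hex, "%2x", &byte);  // consume one two-digit pair
            out[n++] = (unsigned char)byte;
            hex += 2;
        }
        return n;
    }

    int main(void) {
        const char *rootLayer = "001000004153432d45312e3137000000"
                                "726e00000004c0de0080c69b11e09572"
                                "0800200c9a66";
        unsigned char bytes[64];
        size_t len = hex_decode(rootLayer, bytes);
        printf("%zu\n", len);  // 38 bytes, matching the dump above
        printf("%02x %02x %02x\n", bytes[0], bytes[1], bytes[4]);  // 00 10 41
        return 0;
    }
    ```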
    I see that while I was typing this chown33 replied and asked me some questions. I will do my best to answer them :)

    I have posted some real data above. The lights that I am trying to control are Christmas lights. Each pixel has three separate RGB LEDs in it. I have the specs on the data that I need to send to them to turn them on and off and change color and intensity. I can provide the full breakdown of the packet, but to be honest, at this point that is the one thing about my project that I do understand :)

    The above data that I posted is what is called the Root Layer for my packet. Here is the full breakdown of it:

    Code:
    00 10 - Define RLP Preamble Size
    00 00 - RLP Post Amble Size
    41 53 43 2d 45 31 2e 31 37 00 00 00 - Identifies packet as E1.17
    72 6e - Protocol flags and Length
    00 00 00 04 - Identifies RLP Data as 1.31. Protocol PDU
    c0 de 00 80 c6 9b 11 e0 95 72 08 00 20 0c 9a 66 - Senders SID
    
    So, a little clearer?
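    As a plain-C sketch, that breakdown can be assembled field by field instead of decoded from one long hex string; the field values and sizes below come straight from the posted breakdown, while the variable names are mine:

    ```c
    #include <stdint.h>
    #include <string.h>
    #include <stdio.h>

    int main(void) {
        // Root layer laid out per the breakdown above.
        uint8_t root[38];
        size_t off = 0;

        const uint8_t preamble[]  = { 0x00, 0x10 };             // RLP preamble size
        const uint8_t postamble[] = { 0x00, 0x00 };             // RLP postamble size
        const uint8_t acn_id[]    = { 0x41, 0x53, 0x43, 0x2d,   // "ASC-E1.17" + padding
                                      0x45, 0x31, 0x2e, 0x31,
                                      0x37, 0x00, 0x00, 0x00 };
        const uint8_t flags_len[] = { 0x72, 0x6e };             // protocol flags and length
        const uint8_t vector[]    = { 0x00, 0x00, 0x00, 0x04 }; // RLP data is 1.31 PDU
        const uint8_t sid[]       = { 0xc0, 0xde, 0x00, 0x80,   // sender's SID
                                      0xc6, 0x9b, 0x11, 0xe0,
                                      0x95, 0x72, 0x08, 0x00,
                                      0x20, 0x0c, 0x9a, 0x66 };

        memcpy(root + off, preamble,  sizeof preamble);  off += sizeof preamble;
        memcpy(root + off, postamble, sizeof postamble); off += sizeof postamble;
        memcpy(root + off, acn_id,    sizeof acn_id);    off += sizeof acn_id;
        memcpy(root + off, flags_len, sizeof flags_len); off += sizeof flags_len;
        memcpy(root + off, vector,    sizeof vector);    off += sizeof vector;
        memcpy(root + off, sid,       sizeof sid);       off += sizeof sid;

        printf("%zu\n", off);               // 38 bytes, matching the hex dumps above
        printf("%.9s\n", (char *)root + 4); // the ACN identifier reads ASC-E1.17
        return 0;
    }
    ```

    Keeping each field as a named buffer makes it easy to change just the parts that vary per packet before sending.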
     
  15. ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
  16. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #16
    I guess because at this point it's the only way I know how to do it. I am open to learning a better way to build my packet before I send it.
     
  17. firewood macrumors 604

    Joined:
    Jul 29, 2003
    Location:
    Silicon Valley
    #17
    Sounds like you want to send raw binary data, which you currently have in the form of an NSString containing a hexadecimal representation of a sequence of bytes (perhaps for printability).
     
  18. subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #18
    If you want to send the number as is, do not use a string! Strings are encoded in some form or other; it's that encoding you see in your hex dumps. You do not need to convert it to hexadecimal first.
     
  19. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #19
    You both are correct :)

    I have looked at the code I am using from the link posted near the beginning of this thread. I see that it is taking my NSString, taking each ASCII character, converting it to its hexadecimal equivalent, appending two of them together, and then putting them into an array. Now that I know it is in an array, I can go ahead and start manipulating just the parts of the array that need changing before I send each packet out onto the network.
     
  20. chown33 macrumors 604

    Joined:
    Aug 9, 2009
    #20
    Above, in red, I have corrected the most obvious incorrect uses of the word "hex".

    You haven't explained how the hex string came to be. Is there a function or method that built it? From what data? Why doesn't that function or method return binary?

    An NSData object is a container for a sequence of binary bytes. That is, its contents are binary. When the contents are displayed, as in <00100000 ..., the values are shown as hexadecimal (i.e. hex), even though the contents remain in binary.

    What is the manufacturer or brand of the lights? What is the model number (if any)? What country were they sold in? What year?

    Or did you make them yourself?

    Or did someone else make a custom design just for you?


    Is the spec available online? What's the URL?

    Which doesn't really help anyone else gain an understanding. To do that, you'd have to clearly identify the product, the protocol, where to get detailed information, etc. Please do so.

    Is that called the Root Layer because that's what the actual protocol spec calls it, or is that just your name for it? Again, not having access to the actual protocol spec impedes everyone else's understanding.

    Not as much as you might think.

    Does RLP means "Root Layer Packet"?

    I have no idea what E1.17 means, nor how that relates to the protocol or the device. In other words, why would anyone care that the packet is identified as E1.17?

    The values of 72 6e as protocol "length" suggests a non-obvious encoding of length. Again, a protocol doc would explain how length is encoded, so someone could simply read it rather than posting queries.

    I can't tell what RLP or Protocol PDU means. No idea what 1.31 means either.

    What's a SID, and why does a sender need one?
     
  21. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #21
    chown33, thank you for taking the time to point out the lack of information in my posts. I am trying to give you everything I can :) I will try to answer all of your questions.


    To me, hex is just a shorthand form of binary. I get what you are saying, though, so I will try to use "binary" in any of my future posts.

    The hex string came to be by reading the docs that I will share later in this post. They detail the 638-byte packet that I need to send. It was fairly simple to go through them and craft the first packet I needed, simply to test whether my code could actually send the packet so that my receiver could see it. Nothing more, nothing less.

    This I did not know. Please remember that I am very new at this (which should be obvious), so I do not have a very firm grasp on Objective-C.


    The manufacturer, model number, country, and year don't matter. The spec for the protocol that I am using defines a sender (my software) and a receiver (a hardware interface that connects your Christmas lights to the network). As long as the lights are supported by the receiver, they will work. My software doesn't have to care what lights are connected.

    Here is the URL to the PDF file. When I first viewed it, I had to sign in using an email address. I don't know whether this direct URL will make you do the same or not.

    Link to specs


    See link above

    That is what it is called in the spec


    Yes

    E1.17 is the ANSI standard for Entertainment Technology - Architecture for Control Networks (ACN). The E1.31 protocol sends E1.17 over Ethernet.

    The spec document can explain this far better than I can :)

    Again, thank you for taking the time to respond to all my posts. I really do appreciate it.
     
  22. firewood macrumors 604

    Joined:
    Jul 29, 2003
    Location:
    Silicon Valley
    #22
    Incorrect. You are converting each character from its hexadecimal representation to 4 bits of binary data. "From", not "to".
     
  23. blubyu thread starter macrumors member

    Joined:
    Feb 10, 2010
    #23
    Thank you for this. In my head I was seeing what it was doing but I obviously wasn't describing it correctly.
     
