Binary & Xcode

Discussion in 'iOS Programming' started by americanGTA, Aug 13, 2013.

  1. americanGTA macrumors member

    Joined:
    Sep 25, 2011
    #1
    Do you need to learn binary if you want to make iPhone apps using Xcode?
     
  2. Tander macrumors 6502a

    Tander

    Joined:
    Oct 21, 2011
    Location:
    Johannesburg, South Africa
    #2
    No, you don't.

    You will need to learn some C and a lot of Objective-C though.
     
  3. ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #3
    0
     
  4. Tander macrumors 6502a

    Tander

    Joined:
    Oct 21, 2011
    Location:
    Johannesburg, South Africa
    #4
    Haha - I see what you did there, nice! :D
     
  5. xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #6
    No, but you will have to learn to start counting from zero instead of one.
    And in the end, you will need to understand both hex and binary to go anywhere.
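    A tiny C sketch (hypothetical values) of what counting from zero means in practice: the first element of an array lives at index 0, and the last at count - 1.

    Code:
    #include <stdio.h>

    int main(void)
    {
        int scores[3] = { 10, 20, 30 };

        printf("%d\n", scores[0]);   /* first element is index 0 */
        printf("%d\n", scores[2]);   /* last element is index count - 1, not count */
        return 0;
    }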
     
  6. firewood macrumors 604

    Joined:
    Jul 29, 2003
    Location:
    Silicon Valley
    #7
    If you don't know some binary, you will have problems (gnarly bugs) with basic logic correctness, maybe arithmetic, and with figuring out whether your data will fit in a given type without corruption or worse.
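    A minimal C sketch (hypothetical values) of that kind of bug: values that don't fit in the type chosen for them get silently mangled.

    Code:
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        int16_t samples = 32767;         /* largest value a signed 16-bit type can hold */
        samples += 1;                    /* wraps to -32768 on typical platforms */

        uint8_t small = (uint8_t)300;    /* 300 doesn't fit in 8 bits: stored as 44 */

        printf("%d %u\n", samples, (unsigned)small);
        return 0;
    }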
     
  7. xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #8
    I think it's better if it's the program you're writing that pushes you to
    learn something like that, rather than going out with zero motivation to learn something you're not interested in at the time.
     
  8. Duncan C macrumors 6502a

    Duncan C

    Joined:
    Jan 21, 2008
    Location:
    Northern Virginia
    #9
    You can get started without knowing how binary works, but if you get far enough into it you really ought to understand how computers represent numbers.

    It's like playing guitar. You can play guitar using tabs and don't need to read music, but at some point not knowing how to read music is going to hold you back. The same goes for understanding binary (and hexadecimal as well).
     
  9. xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #10
    It seems kind of sad, doesn't it? They are still correct.
    But 20-30 years ago, the answer would have been absolutely different.
     
  10. ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #11
    ... I'm confused why you all say knowing binary and/or hex is ever necessary. You may not understand exactly how or why OR'ing flags together works, but other than that I can't think of any common reasons binary/hex come up.

    Knowing how various data types are implemented also seems like it's rarely important.

    If I make a language, I'm planning on removing any remote reason to know binary or how any data types are implemented (other than something along the lines of writing a decimal <-> binary conversion application). The emphasis would be on reading/writing the code quickly, not necessarily having it compile/run quickly. I'm planning on simply not having the bitwise operators.
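    A minimal C sketch (the flag names are hypothetical, not from any real framework) of why OR'ing flags together works: each option occupies its own bit, so | combines them and & tests them.

    Code:
    #include <stdio.h>

    enum {
        OptionNone      = 0,
        OptionLogging   = 1 << 0,   /* binary 0001 */
        OptionCaching   = 1 << 1,   /* binary 0010 */
        OptionEncrypted = 1 << 2    /* binary 0100 */
    };

    int main(void)
    {
        unsigned options = OptionLogging | OptionEncrypted;   /* binary 0101 */

        if (options & OptionEncrypted)
            printf("encryption enabled\n");
        if (!(options & OptionCaching))
            printf("caching disabled\n");
        return 0;
    }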
     
  11. TouchMint.com macrumors 68000

    TouchMint.com

    Joined:
    May 25, 2012
    Location:
    Phoenix
    #12
    Binary might be a stretch unless you are doing some crazy stuff, but hex is used in the design world, so if you are creating certain animations for games and whatnot you might be working with those hex digits (colour codes, for example).
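    A minimal C sketch (illustrative colour value only) of the sort of hex a designer hands over: a web-style colour code split into its red, green and blue bytes.

    Code:
    #include <stdio.h>

    int main(void)
    {
        unsigned rgb = 0xFF8800;                  /* the colour code "#FF8800" */
        unsigned red   = (rgb >> 16) & 0xFF;      /* 0xFF = 255 */
        unsigned green = (rgb >> 8)  & 0xFF;      /* 0x88 = 136 */
        unsigned blue  =  rgb        & 0xFF;      /* 0x00 = 0   */

        printf("r=%u g=%u b=%u\n", red, green, blue);
        return 0;
    }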
     
  12. ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #13
    That sounds pretty specialized to me. I've made games without touching hex.
     
  13. subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #14
    You need to know binary if you ever plan on using the bitwise operators in C, using bit fields, or bit vectors. It's typically something you only see in lower-level frameworks, though, and not something you need to know to get started at all. But in any event it's pretty easy to learn if you want to.
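    A minimal C sketch (hypothetical struct and values) of those features: a bit field packing several small values into one word, and a bit vector tracking membership one bit per item.

    Code:
    #include <stdint.h>
    #include <stdio.h>

    struct PacketHeader {            /* bit field: three values packed together */
        unsigned version : 3;
        unsigned ack     : 1;
        unsigned length  : 12;
    };

    int main(void)
    {
        struct PacketHeader h = { .version = 4, .ack = 1, .length = 1500 };
        printf("version=%u ack=%u length=%u\n",
               (unsigned)h.version, (unsigned)h.ack, (unsigned)h.length);

        uint32_t seen = 0;           /* bit vector for item IDs 0..31 */
        seen |= 1u << 7;             /* mark item 7 */
        printf("item 7 seen: %d\n", (int)((seen >> 7) & 1));
        return 0;
    }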
     
  14. xArtx, Aug 15, 2013
    Last edited: Aug 15, 2013

    xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #15
    It depends on whether or not the answer is limited to this platform or era,
    which, for the purposes of this thread, it is.

    There are plenty of applications today that would certainly require knowledge
    of how to convert between all bases.

    iOS does a lot for you. You wouldn't wonder why if you had to write the routine
    that scans the keyboard you're typing on for button presses.

    What would be the most efficient method of storing and accessing a large monochrome bitmap (one approach is sketched at the end of this post)?
    Or what would the screen buffer look like for a device with a monochrome display (e.g., an LCD)?
    How would data driven PWM work?
    What if you had to look at a row of buttons read from a single port,
    or access serial or parallel ports with your own libraries?
    How would you go about writing your own cipher routines?

    It sounds like the kind of knowledge that matters only below very high-level platforms,
    but those platforms aren't the beginning and the end, even in the modern world.
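    A minimal C sketch (assuming row-major, MSB-first packing; names are hypothetical) of one answer to the monochrome-bitmap question above: store one bit per pixel, eight pixels per byte, and use shifts and masks to set or read a pixel.

    Code:
    #include <stdint.h>
    #include <stddef.h>
    #include <stdio.h>
    #include <stdlib.h>

    typedef struct {
        int width;          /* pixels per row */
        int height;
        uint8_t *bits;      /* ((width + 7) / 8) * height bytes, one bit per pixel */
    } MonoBitmap;

    void mono_set_pixel(MonoBitmap *bm, int x, int y, int on)
    {
        size_t row_bytes = (size_t)(bm->width + 7) / 8;
        uint8_t mask = (uint8_t)(0x80u >> (x % 8));      /* MSB-first in each byte */
        if (on)
            bm->bits[y * row_bytes + x / 8] |= mask;
        else
            bm->bits[y * row_bytes + x / 8] &= (uint8_t)~mask;
    }

    int mono_get_pixel(const MonoBitmap *bm, int x, int y)
    {
        size_t row_bytes = (size_t)(bm->width + 7) / 8;
        return (bm->bits[y * row_bytes + x / 8] >> (7 - x % 8)) & 1;
    }

    int main(void)
    {
        MonoBitmap bm = { 320, 240, NULL };
        size_t size = (size_t)((bm.width + 7) / 8) * bm.height;
        bm.bits = calloc(size, 1);                        /* 9,600 bytes for 320x240 */
        if (!bm.bits)
            return 1;

        mono_set_pixel(&bm, 100, 50, 1);
        printf("%d\n", mono_get_pixel(&bm, 100, 50));     /* prints 1 */

        free(bm.bits);
        return 0;
    }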
     
  15. lloyddean macrumors 6502a

    Joined:
    May 10, 2009
    Location:
    Des Moines, WA
    #16
    If you skip learning this simple base subject, what else will you skip learning?
     
  16. ArtOfWarfare, Aug 15, 2013
    Last edited: Aug 15, 2013

    ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #17
    Everything else that isn't necessary?

    Do you understand transistors? N doping and P doping? It's vital to having the computer work, but I'd bet extremely few programmers will ever face a task requiring them to understand it.

    Yes, the language would be very high level. I drew up a list of what seemed important to me, and most of them would never be feasible in a low-level language. My language, as I've currently written it, has very, very few reserved characters, and I've intentionally made it so that anyone can implement binary operators.

    Further, I don't see a need for more low-level languages, nor do I see myself as well positioned to write one. C has been the indisputable champion of low-level programming for quite a while, and I don't expect that to change anytime soon, unless maybe Go can unseat it. I actually was inspired a lot by Go.
     
  17. xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #18
    That, again, depends on the platform and/or era.
    But even in modern times (i.e., right now) it still only depends on the platform.
    Even today, in proprietary devices where anything that conserves memory or storage is a go,
    fonts and non-colour-dependent image data are still stored as binary arrays:
    [image: a bitmap font stored as a binary array]
    But that's only one example; the examples are endless.

    asm is still pretty much a prerequisite for commercial applications involving microcontrollers.
    You can do anything in a high-level language, but it will require a more expensive chip.
    That's not the idea when you aim to produce tens of thousands of units.
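    A minimal C sketch (the glyph data here is hypothetical) of a font stored as a binary array: one byte per row, one bit per pixel, with a shift and mask to test each pixel.

    Code:
    #include <stdint.h>
    #include <stdio.h>

    /* 8x8 glyph for the digit '1'; each byte is one row, MSB = leftmost pixel */
    static const uint8_t glyph_one[8] = {
        0x18, 0x38, 0x18, 0x18, 0x18, 0x18, 0x7E, 0x00
    };

    int main(void)
    {
        for (int row = 0; row < 8; row++) {
            for (int col = 0; col < 8; col++) {
                int bit = (glyph_one[row] >> (7 - col)) & 1;   /* test one pixel */
                putchar(bit ? '#' : ' ');
            }
            putchar('\n');
        }
        return 0;
    }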
     
  18. ArtOfWarfare macrumors 604

    ArtOfWarfare

    Joined:
    Nov 26, 2007
    #19
    This topic is about Xcode, though. Does anyone actually use Xcode for asm? (I know you could, but does anyone actually?) I'd imagine you would use MPLAB or some other IDE if you're writing code for microcontrollers.
     
  19. xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #20
    Yes :)
    You probably wouldn't begin a project for Xcode with asm in mind,
    but you might pick up a project that is already C with inline assembler.
    (I am currently involved with one of those.)

    Again, it depends on the platform, and I've been careful to qualify that every step of the way.
    You have mentioned writing a language. That doesn't necessarily tie the topic to Xcode.
    In fact, I can't imagine why anyone would write a language for use only with Apple devices,
    other than to take more work out of the programmer's hands and come up with a product quicker.
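    A rough C sketch (the function and instruction choice are made up for illustration) of what C with inline assembler can look like, with a plain-C fallback for other architectures.

    Code:
    #include <stdint.h>

    uint32_t add_with_inline_asm(uint32_t a, uint32_t b)
    {
        uint32_t result;
    #if defined(__arm64__) || defined(__aarch64__)
        __asm__ volatile("add %w0, %w1, %w2"       /* 32-bit add on AArch64 */
                         : "=r"(result)
                         : "r"(a), "r"(b));
    #else
        result = a + b;                            /* plain C on other targets */
    #endif
        return result;
    }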
     
  20. subsonix macrumors 68040

    Joined:
    Feb 2, 2008
    #21
    In my opinion it's like having a toolbox and ignoring some of your available tools. Choosing not to use them is OK, but refusing to learn them altogether? Why do that? If you don't know them, you are in no position to decide whether they are useful or not. (I'm referring to the C bitwise operators, bitmaps and so on here, by the way, because it's not like binary numbers are used directly; it's just that using these requires an understanding of how binary numbers work.)

    However, the correct answer to the OP is no, because it's in no way a requirement for starting to use Xcode and learning Obj-C and Cocoa.
     
  21. lloyddean macrumors 6502a

    Joined:
    May 10, 2009
    Location:
    Des Moines, WA
    #22
    Well, actually, yes, although it's been a while, about 42 years I'd say, as I started from the electronics side of things. I've always found that knowing how things work, as well as when to use that knowledge, has proved very useful in some project or other.

    I'm not saying you need to know everything "NOW", but it sure helps to learn it at some point.
     
  22. firewood macrumors 604

    Joined:
    Jul 29, 2003
    Location:
    Silicon Valley
    #23
    Long ago, when I worked with many, many engineers and software types, I learned that the engineers who designed the best hardware knew a lot about the software that was to run on it, and that the programmers who wrote the best code knew a lot about the hardware their code was running on and how to utilize it to best effect.

    So, "required"? No. (For a C- grade.) But if you want to be really good, or one of the best...

    Keep learning what's under the basics.
     
  23. lloyddean macrumors 6502a

    Joined:
    May 10, 2009
    Location:
    Des Moines, WA
    #24
    Since you're somewhat reluctant about the subject, I recommend the following book, which makes it a little more fun and entertaining -

    "Bebop to the Boolean Boogie, Third Edition: An Unconventional Guide to Electronics"
     
  24. xArtx, Aug 17, 2013
    Last edited: Aug 18, 2013

    xArtx macrumors 6502a

    Joined:
    Mar 30, 2012
    #25
    Since the digits in that image up there are already an array of single-bit (monochrome) pixels,
    the data is small enough to store in a header, where each byte is eight pixels.
    Then, to get colour out of it, XOR a full RGB pixel with 0xFF.
    You might hear that called inverting a pixel, but you might not really know why
    without understanding that the XOR toggles every bit.
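    A minimal C sketch (assuming 8-bit colour channels) of that inversion: XOR with 0xFF toggles every bit of a byte, so applying it to each channel of an RGB pixel inverts the colour.

    Code:
    #include <stdint.h>
    #include <stdio.h>

    int main(void)
    {
        uint8_t r = 0x20, g = 0x80, b = 0xF0;

        r ^= 0xFF;   /* 0x20 -> 0xDF: every bit toggled */
        g ^= 0xFF;   /* 0x80 -> 0x7F */
        b ^= 0xFF;   /* 0xF0 -> 0x0F */

        printf("inverted pixel: %02X %02X %02X\n", r, g, b);
        return 0;
    }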

     
