No, but you will have to learn to start counting from zero instead of one. And in the end, you will need to understand both hex and binary to go anywhere.
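For example, a minimal C sketch of that zero-based counting:

    #include <stdio.h>

    int main(void) {
        /* Zero-based indexing: the first element lives at index 0,
           and an array of 4 elements ends at index 3. */
        int primes[4] = {2, 3, 5, 7};
        printf("first = %d, last = %d\n", primes[0], primes[3]);
        return 0;
    }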
 
If you don't know some binary, you will have problems (gnarly bugs) with basic logic correctness, possibly with arithmetic, and with figuring out where any of your data will fit (types) without corruption or worse.
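As a small illustration of the "where will my data fit" problem (the value and variable name here are made up):

    #include <stdio.h>
    #include <stdint.h>

    int main(void) {
        /* 300 needs 9 bits (0b100101100), so it cannot fit in 8:
           the high bit is silently dropped and only 44 survives. */
        uint8_t small = 300;
        printf("%d\n", small);   /* prints 44, not 300 */
        return 0;
    }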
 
I think it's better to let the program you're writing push you to learn something like that, rather than going out with zero motivation to learn something you're not interested in at the time.
 
> Do you need to learn binary if you want to make iPhone apps using Xcode?

You can get started without knowing how binary works, but if you get far enough into it you really ought to understand how computers represent numbers.

It's like playing guitar. You can play guitar by using tabs, and don't need to read music, but at some point not knowing how to read music is going to hold you back. Same with understanding binary (and hexadecimal as well).
 
> You can get started without knowing how binary works, but if you get far enough into it you really ought to understand how computers represent numbers.
>
> It's like playing guitar. You can play guitar by using tabs, and don't need to read music, but at some point not knowing how to read music is going to hold you back. Same with understanding binary (and hexadecimal as well).

It seems kind of sad, doesn't it? They are still correct.
But 20-30 years ago, the answer would have been absolutely different.
 
... I'm confused why you all say knowing binary and/or hex is ever necessary. You may not understand exactly how or why OR'ing flags together works, but other than that I can't think of any common reasons binary/hex come up.

Knowing how various data types are implemented also seems like it's rarely important.

If I make a language, I'm planning on removing any remote reason to know binary or how any data types are implemented (other than something along the lines of writing a decimal <-> binary conversion application). The emphasis would be on reading/writing the code quickly, not necessarily having it compile/run quickly. I'm planning on simply not having the bitwise operators.
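For anyone wondering what "OR'ing flags together" looks like in practice, here is a minimal C sketch (the flag names are invented):

    #include <stdio.h>

    /* Each option occupies its own bit, so several can be packed
       into one integer with | (OR). */
    enum {
        OPT_BOLD      = 1 << 0,  /* 0b001 */
        OPT_ITALIC    = 1 << 1,  /* 0b010 */
        OPT_UNDERLINE = 1 << 2   /* 0b100 */
    };

    int main(void) {
        unsigned style = OPT_BOLD | OPT_UNDERLINE;  /* combine: 0b101 */
        if (style & OPT_UNDERLINE)                  /* & picks out one bit */
            printf("underlined\n");
        return 0;
    }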
 
Binary might be a stretch unless you are doing some crazy stuff, but hex is used in the design world, so if you are creating certain animations for games and whatnot you might be working with those characters.
 
> Binary might be a stretch unless you are doing some crazy stuff, but hex is used in the design world, so if you are creating certain animations for games and whatnot you might be working with those characters.

That sounds pretty specialized to me. I've made games without touching hex.
 
You need to know binary if you ever plan on using the bitwise operators in C, bit fields, or bit vectors. It's typically something you mostly see in lower-level frameworks, though, and not something you need to know to get started at all. But in any event, it's pretty easy to learn if you should want to.
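A self-contained sketch of the bit fields and bit vectors mentioned above (the struct and its names are invented for illustration):

    #include <stdio.h>
    #include <stdint.h>

    /* A C bit field: struct members narrower than a full int. */
    struct packet_header {
        unsigned int version : 4;   /* 4 bits, values 0..15 */
        unsigned int has_ack : 1;   /* a 1-bit flag */
        unsigned int reserved : 3;
    };

    int main(void) {
        struct packet_header h = { .version = 4, .has_ack = 1, .reserved = 0 };

        /* A bit vector: 64 booleans packed into one 64-bit word. */
        uint64_t seen = 0;
        seen |= 1ULL << 10;   /* set bit 10: mark item 10 as seen */
        printf("%d %d\n", (int)h.version, (int)((seen >> 10) & 1));
        return 0;
    }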
 
It depends on whether or not the answer is limited to this platform and era,
which, for the purposes of this thread, it is.

There are plenty of applications today that would certainly require knowledge
of how to convert between all bases.

iOS does a lot for you. You wouldn't wonder why binary matters if you had to write the routine
that scans the keyboard you're typing on for button presses.

> If I make a language, I'm planning on removing any remote reason to know binary or how any data types are implemented (other than something along the lines of writing a decimal <-> binary conversion application). The emphasis would be on reading/writing the code quickly, not necessarily having it compile/run quickly. I'm planning on simply not having the bitwise operators.
What would be the most efficient method of storing and accessing a large monochrome bitmap? (A sketch appears after this post.)
Or what would the screen buffer look like for a device that had a monochrome display (i.e. an LCD)?
How would data-driven PWM work?
What if you had to look at a row of buttons read from a single port,
or access serial or parallel ports with your own libraries?
How would you go about writing your own cipher routines?

It sounds like the kind of language that is of use only on very high-level platforms,
but that's not the be-all and end-all, even in the modern world.
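A minimal sketch of one common answer to the first question above, assuming a hypothetical 128x64 display packed 8 pixels per byte:

    #include <stdio.h>
    #include <stdint.h>

    #define WIDTH  128
    #define HEIGHT 64

    /* 1 bit per pixel: the whole screen fits in 1 KB. */
    static uint8_t framebuffer[WIDTH * HEIGHT / 8];

    static void set_pixel(int x, int y, int on) {
        int byte = (y * WIDTH + x) / 8;   /* which byte holds the pixel */
        uint8_t mask = 0x80 >> (x % 8);   /* which bit inside that byte */
        if (on)
            framebuffer[byte] |= mask;
        else
            framebuffer[byte] &= (uint8_t)~mask;
    }

    int main(void) {
        set_pixel(10, 3, 1);
        printf("%zu bytes for %d x %d pixels\n",
               sizeof framebuffer, WIDTH, HEIGHT);
        return 0;
    }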
 
If you skip learning this simple base subject, what else will you skip learning?

Everything else that isn't necessary?

Do you understand transistors? N doping and P doping? It's vital to making the computer work, but I'd bet extremely few programmers will ever face a task requiring them to understand it.

> What would be the most efficient method of storing and accessing a large monochrome bitmap?
> Or what would the screen buffer look like for a device that had a monochrome display (i.e. an LCD)?
> How would data-driven PWM work?
> What if you had to look at a row of buttons read from a single port,
> or access serial or parallel ports with your own libraries?
> How would you go about writing your own cipher routines?
>
> It sounds like the kind of language that is of use only on very high-level platforms,
> but that's not the be-all and end-all, even in the modern world.

Yes, the language would be very high level. I drew up a list of what seemed important to me, and most of them could never be feasible in a low-level language. My language, as I've currently written it, has very, very few reserved characters, and I've intentionally made it so that anyone can implement binary operators.

Further, I don't see a need for more low-level languages, nor do I see myself as well positioned to write one. C has been the undisputed champion of low-level programming for quite a while, and I don't expect that to change anytime soon, unless maybe Go can unseat it. I was actually inspired a lot by Go.
 
> Further, I don't see a need for more low-level languages, nor do I see myself as well positioned to write one. C has been the undisputed champion of low-level programming for quite a while, and I don't expect that to change anytime soon, unless maybe Go can unseat it. I was actually inspired a lot by Go.

That, again, depends on the platform and/or era.
But even in modern times (i.e. right now) it still only depends on the platform.
Even today, in proprietary devices where anything that conserves memory or storage is a go,
fonts and non-colour-dependent image data are still binary arrays:
[Image: BH4_SnapshotB.png, digits stored as an array of bits]

But that's only one example; they are endless.
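As an invented illustration of such a binary array, one 8x8 glyph can be packed one row per byte (a byte is eight pixels):

    #include <stdio.h>
    #include <stdint.h>

    /* A made-up 8x8 glyph for 'A', one row per byte. */
    static const uint8_t glyph_A[8] = {
        0x18, 0x24, 0x42, 0x7E, 0x42, 0x42, 0x42, 0x00
    };

    int main(void) {
        for (int y = 0; y < 8; y++) {
            for (int x = 0; x < 8; x++)
                putchar((glyph_A[y] >> (7 - x)) & 1 ? 'X' : '.');
            putchar('\n');
        }
        return 0;
    }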

asm is still pretty much a prerequisite for commercial applications involving microcontrollers.
You can do anything in a high-level language, but it will require a more expensive chip.
That's not the idea when you aim to produce tens of thousands of units.
 
> That, again, depends on the platform and/or era.
> But even in modern times (i.e. right now) it still only depends on the platform.
> Even today, in proprietary devices where anything that conserves memory or storage is a go, fonts and non-colour-dependent image data are still binary arrays:
> [Image]
> But that's only one example; they are endless.
>
> asm is still pretty much a prerequisite for commercial applications involving microcontrollers.
> You can do anything in a high-level language, but it will require a more expensive chip.
> That's not the idea when you aim to produce tens of thousands of units.

This topic is about Xcode, though. Does anyone actually use Xcode for asm? (I know you could, but does anyone actually?) I'd imagine you would use MPLAB or some other IDE if you're writing code for microcontrollers.
 
> This topic is about Xcode, though. Does anyone actually use Xcode for asm? (I know you could, but does anyone actually?) I'd imagine you would use MPLAB or some other IDE if you're writing code for microcontrollers.

Yes :)
You probably wouldn't begin a project in Xcode with asm in mind,
but you might pick up a project that is already C with inline assembler.
(I am currently involved with one of those.)

Again, it depends on the platform, and I've been careful to qualify that every step of the way.
You have mentioned writing a language. That doesn't necessarily tie the topic to Xcode.
In fact, I can't imagine why anyone would write a language for use only with Apple devices
other than to take more work out of the programmer's hands and come up with a product more quickly.
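For reference, "C with inline assembler" looks something like this hypothetical fragment (GCC/Clang extended-asm syntax, x86-64 only):

    #include <stdio.h>
    #include <stdint.h>

    /* Read the CPU's timestamp counter; rdtsc leaves it in edx:eax. */
    static uint64_t read_tsc(void) {
        uint32_t lo, hi;
        __asm__ volatile ("rdtsc" : "=a"(lo), "=d"(hi));
        return ((uint64_t)hi << 32) | lo;
    }

    int main(void) {
        printf("%llu\n", (unsigned long long)read_tsc());
        return 0;
    }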
 
In my opinion it's like having a toolbox and ignoring some of your available tools. Not ignoring using them on a given job, that's OK, but refusing to learn them altogether. Why do that? If you do not know them, then you are in no position to decide whether they are useful or not. (I'm referring to the C bitwise operators, bitmaps and so on here, by the way, because it's not like binary numbers are used directly; it's just that these require an understanding of how binary numbers work.)

However, the correct answer to the OP is no, because it's in no way a requirement to start using Xcode and learning Obj-C and Cocoa.
 
> Everything else that isn't necessary?
>
> Do you understand transistors? N doping and P doping? It's vital to making the computer work, but I'd bet extremely few programmers will ever face a task requiring them to understand it.

Well, actually, yes, although it's been a while, about 42 years I'd say, as I started from the electronics side of things. I've always found knowing how things work, as well as when to use that knowledge, to be very useful in some project or other.

I'm not saying you need to know everything "NOW", but it sure helps to learn it at some point.
 
> Do you understand transistors? N doping and P doping? It's vital to making the computer work, but I'd bet extremely few programmers will ever face a task requiring them to understand it.

Long ago, when I worked with many, many engineers and software types, I learned that the engineers who designed the best hardware knew a lot about the software that was to run on it, and that the programmers who wrote the best code knew a lot about the hardware their code was running on and how to utilize it to best effect.

So, "required", no. (For a C- grade.) But if you want to be really good, or one of the best...

Keep learning what's under the basics.
 
Since you're somewhat reluctant about the subject, I recommend the following book, which makes it a little more fun and entertaining:

"Bebop to the Boolean Boogie, Third Edition: An Unconventional Guide to Electronics"
 
Since the digits in that image up there are already an array of monochrome bits,
it's small enough to store in a header where a byte is eight pixels.
Then to get colour out of it, XOR each channel of a full RGB pixel with 0xFF.
You might hear that described as inverting a pixel, but you might not really know why
without understanding that it toggles every bit.
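A minimal sketch of that inversion, assuming a packed 24-bit RGB pixel (0xFF per channel, so 0xFFFFFF for all three):

    #include <stdio.h>
    #include <stdint.h>

    /* XOR with all-ones toggles every bit, which is exactly what
       "inverting a pixel" means at the bit level. */
    static uint32_t invert_rgb(uint32_t pixel) {
        return pixel ^ 0xFFFFFFu;   /* flip R, G, B; leave any alpha byte alone */
    }

    int main(void) {
        printf("%06X\n", invert_rgb(0x112233));   /* prints EEDDCC */
        return 0;
    }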

 