I believe it's like math. Everyone can do it but you need to be taught in the right way. Many people are convinced that they can't be taught math or science or programming because they haven't had the right teacher who could engage them the right way. Some people find a certain style of book is just right for them - others need to have a more conversational kind of learning experience, and still other people learn other ways.

I agree. I was taught the very basics of programming by a teacher who went out of his way to make sure I understood the basics as a college freshman. If it weren't for that one-on-one help... I don't know where I would be these days.
 
That's almost like saying I can't see what C can do that I can't do in Assembly language. For me, the main value of objects is in code reuse. Using objects makes code far more reusable.

I am curious... Why can't you reuse functions? Don't get me wrong... I am just trying to learn what makes objects more suitable for that. I can't find it!
 

You can absolutely do it that way, but objects exist to help group related variables and functions together. If you find yourself writing a lot of functions that all take a struct parameter and manipulate it in some way, then chances are that those functions should all become methods on an object and the members of that struct should become ivars for that same object.
 
I started developing iOS apps 2 years ago with zero coding experience. I turn 54 in August.
 
C is the language you want to learn first.

I know this won't be a popular choice, but I disagree. I think some sort of assembly or machine language is the best first choice. It will give you some understanding of what is really going on under the hood from the get-go.

I was lucky in this regard, as the first language I learned was IBM 1620 machine code. (NOT assembly language.) I had to punch instructions onto punch cards, one instruction per card. (The IBM 1620 was a decimal - actually, BCD - machine, so the instructions were just strings of decimal digits. A simple loader program read the cards. How did the loader program get loaded? You flipped some switches on the front panel...)

After we learned machine code, then they let us learn Fortran. (C had just been invented, and nobody had heard of it.)

This was 1971, and I was in high school - obviously very lucky to be in a high school - one of a small handful in the country - that had a computer.

In college, again, the first language we learned was MIX - a simulated machine with a simple assembly language. Then they let us learn PL/1. (Still nobody had heard of C. At least if you were running IBM hardware.)

I think it's tremendously useful to first understand how a computer works - and then later you can better appreciate higher-level languages.

20? When I started, you had to be 30 before they'd let you near a computer. Except in a few rare places, and I was lucky enough to be in one of those places.
 

I learned MIPS assembly using the MARS simulator earlier this year. While I appreciate knowing it and I feel it completes my understanding of how a computer works, I think knowing assembly is only slightly more useful than knowing how to make a half adder, or a full adder, or an AND gate, or any other digital circuit, when it comes to programming.

C is actually necessary. It's the common denominator of most modern programming languages. I don't think I've ever found looking at the assembled or compiled code even remotely useful in a real, non-school programming task.
 
IMO ASM of some sort is most useful in the real world in embedded environments. There are plenty of situations when you need to make use of every bit of processing power your microcontroller has.

B
 
Most people who can 'program' do a rubbish job of it - their code is utter rubbish - really inefficient, etc. If you want to learn to program - not 'program' - your best bet is to start off learning Python. It will teach you all the basic data structures, what classes and functions are, recursion, iteration, etc. You can also learn about object-oriented programming and modular programming in Python.

From there, these skills are easily transferable to other languages like Objective-C - give it a day max before you have some kind of interactive iPhone app from scratch.

For me, I did start when I was 9 (now 19). But I didn't learn proper programming structure for my first few years. My first real language was Objective-C (other than stuff like JavaScript and PHP, which I don't really count), and I didn't find it too bad. But then I went to University this year, got forced to learn Python, as well as good programming structure. I wish I had known this stuff earlier, because it would have made the last 4 years so much easier for me.

Summary: There is no language where it is easier to pick up programming theory than Python. And best of all - it's pre-installed on Mac. Download something like TextWrangler as an editor - it's free.
 
No experience at ALL with coding.
I am 57.
Did RB, made a nice app that sold very well.
Moved to iOS with some cool ideas.
Meh.
Got bored with it, moving on.
Specialization is for insects.

So no, you are not. But don't be surprised if one day you look at it all and ask why.
 
I'm 20 years old and feel like it's too late...

Hello WitheIphone5.

I'm 40.

I never did serious programming.
I don't have a clue of what the future will be for me in this area.

I just know this: I do want to learn programming and make apps!

Apart from the related knowledge to this, it's all I need to know.

Best of Luck!
 
I'm a programmer with 30 years experience (I started when I was 8 :)

You do *not* want to start with C if your goal is to learn Objective-C. It will just get in your way.

You need to learn how to think methodically about problems before attempting to write anything very complex, and most iPhone apps, while they don't involve a lot of actual code, can be quite complex in how all the pieces of the frameworks go together.

I would highly recommend taking this course on Coursera.org: https://www.coursera.org/course/programdesign -- it will give you the understanding of how to write complex programs without getting hung up on the syntax of a particular language. Use what you learn in the course to bootstrap yourself into writing code in Objective-C, learn the fiddly bits (Objective-C has a lot of them, although they've slowly been stripping out the cruft in more recent versions of Xcode), and then move on to learning how to program using the iOS frameworks.
 
I highly recommend starting with C. If you want to be a good programmer, you need a solid foundation to build from. And C is incredibly solid. Even Objective-C requires knowledge of C. C is used pretty much everywhere and in everything (well, not literally). Once you've got C nailed down, then shift to Objective-C.
 
I think the last two posts highlight that there is no "one good way" to learn Objective-C 2.0.

The two major books illustrate this too. Kochan takes the approach that Objective-C can and should be learned as a first language. Hillegass teaches the fundamentals of C in a few pages.

The best way is the one that works for you.

B
 

Interestingly enough, I have both books. ;)
 
Which resonates better with you?

B

Both were useful for me as I was already a seasoned programmer before I learned from those books.

Starting with C is better for the long term. Almost every "popular" modern programming language is based on C in some way. And that is not likely to change anytime soon. A (somewhat) good analogy is like learning Latin roots (or Roman, Greek, etc.) instead of just one particular language. Latin roots persist across many languages in the world. You can pretty much grasp the basic meanings of words in many languages without specifically learning each and every one. That doesn't mean it's enough. It just means your understanding of languages is a bit broader.

If you're just tinkering around short term, not really "serious" about programming as a career, then by all means learn whatever language you are interested in. But if you're thinking about this as a potential career or something "serious", then I highly recommend C.
 
Personally, I found that knowing too much C was a hindrance to embracing OOP, as my brain always wanted to go back to the non-OOP way of doing things.

I also recently ported some C++ code to use the STL instead of homegrown classes, and found lots of C-style inefficiency in the code. (Not my code, but code I have worked with for a long time.)

I agree that learning C is a necessity for anyone who intends to make a living writing code. I'm just not sure how deep one should really go before making the leap to some higher level object language.

B
 
A (somewhat) good analogy is like learning Latin roots (or Roman, Greek, etc.) instead of just one particular language. Latin roots persist across many languages in the world. You can pretty much grasp the basic meanings of words in many languages without specifically learning each and every one.

I think this is a reason for NOT starting with C. I would never study Latin, a dead language, if my goal were to learn any of the languages still in use today.

You can reach the same level of knowledge simply by directly learning one of C's descendants, and more directly reach your goal. For instance, I speak Portuguese and Spanish fluently. I never studied Latin, but I am able to read Italian, French, Catalan and even Romanian text without much effort.

I think a person that learns Java, for instance, would not have much problem to read C code either.
 
I have Zero programming/coding experience, no HTML, php, or java.

Then I think you should go straight to Obj-C.

In this thread, many recommended either Java or C first.

IMHO, you definitely do NOT want to learn Java first. You wouldn't be able to use it in Mac (/iOS) programming, except for the OOP. Syntax- and logic-wise it differs a LOT from Obj-C / Cocoa - almost all major method names are different, for example (see e.g. count vs. length / size). While I do recommend Java for would-be C# programmers, as those two languages (environments) are far closer to each other, with two languages this distinct you would gain little from learning Java first as an Obj-C programmer.

C has a lot of stuff you simply won't need 99.99% of the time. For example, string manipulation (strcpy etc.) belongs here, which you won't need in most cases. Heck, most Obj-C apps don't even need malloc() and the like. That is, while it's certainly good to know C, in your case, as a complete beginner, I wouldn't bother with it. It's rather hard to learn and understand - Obj-C is WAAAY easier to learn.

That is, start right with Obj-C. Get a beginner's book and follow its tutorials. Only AFTER you have a working knowledge of Obj-C should you even think of learning C, so that you can also access libraries that don't have an Obj-C wrapper - but in no way the other way round. Again, do NOT start with plain C!




----------

I think this is a reason for NOT starting with C.

Indeed - many of C's unique functions aren't widely used in Mac / iOS programming. Why bother learning them, then? It's indeed like learning Latin as a first language.

I think a person that learns Java, for instance, would not have much problem to read C code either.

I don't think so. Java is much, much easier to grasp (if one understands OOP), and it is therefore very hard for a Java programmer (who has never programmed in, say, assembly) to understand what malloc(), strcpy(), strcmp(), heap vs. stack etc. are all about. This is true even with the latest Java versions, where for example enums are already supported, making it possible for a Java programmer to recognize them in C apps.
 
Personally, I found that knowing too much C was a hindrance to embracing OOP, as my brain always wanted to go back to the non-OOP way of doing things.

Yup, this is a common problem.

HOWEVER! Knowing a procedural, close-to-the-machine language like C indeed helps a LOT when learning OOP. Actually, this is how I teach OOP on both my Java and Obj-C (iOS) courses for people who know C.

Most OOP books / tutorials (including those for all three major languages / class libraries I've mentioned: Java (incl. Android), Obj-C (Mac/iOS) and C# (Win)) are highly theoretical. What I've found is that it's much better to be direct: "Hey, look at how malloc()'ed structs look on the heap! How do you think a global function could access their fields? Yes, via a pointer, typically passed as the first parameter to the function. Why isn't there any pointer / reference passed to Obj-C / C# / Java methods, then? Because it's passed implicitly. This is how a method in Obj-C / Java / C# translates to pseudo-machine C: <here I show a function with a reference / pointer type as the first parameter, the other parameters the same as in the OOP original>. See? Everything you do in OOP can be emulated in C - albeit, in some cases (e.g., polymorphism), in a very convoluted way." And so on - I explain polymorphism, inheritance, type compatibility during inheritance etc. the same way, showing malloc()'ed structs on the heap for everything. It's much, much easier to learn and understand OOP this way if one already knows, for example, C. No abstract "a car has wheels" or "a house has windows" crap - those are far harder to grasp.

This doesn't mean the OP should learn C as the first language, though. It's just that he should find somebody / a tutorial that explains OOP keeping the "behind-the-scenes" working in mind.
 
I'm surprised people here are equating C to Latin. One is a dead language and the other is still very much in use all over the place. Kernels, device drivers, libraries, embedded applications, older applications still in active development... C is still one of the major players in those areas and always will be. You want that fancy high-level language to interface with a system library? That could mean having to write C or debug someone else's implementation. It's not going anywhere.

Plus, everything is always compared to C. Go on an interview and it's almost assumed you know C and are good with it. I'm shocked whenever I run into an interview candidate who doesn't know C well. It's a huge strike against them, even if C isn't the primary language they would be using most on the job.
 
1. But if you want to learn C++, for example, is it better to go straight to C++ or to start with C and progress to C++?

2. If someone wants to learn web development, is it still better to learn C/C++ first for a better understanding?

Are there any well-known good books for C/C++? (besides Stroustrup's C++ book)
 