
The Guy

macrumors newbie
Original poster
Dec 18, 2010
9
0
Hey guys,

I've been wanting to learn to program for a while, but I can't decide which language to learn. I've seen arguments for C and for Python. Most people who say C is the best first language say it teaches you more about programming and computers. Most people who say Python is best say it's more powerful and more practical than a lower-level language like C.
 

chown33

Moderator
Staff member
Aug 9, 2009
10,750
8,422
A sea of green
I see no question here. I see no background, either; you haven't said anything about what you already know about computers, programming, logic, etc.

What kind of programs do you want to write? If you had to prioritize the top 3 features of your choice, what would they be, in order?

From the meager information you've provided so far, I'd say it doesn't matter which one you do, as long as you do something. Try learning each one for 2 weeks, then pick the one you like better after 4 weeks.

Accept that you will make mistakes. It's intrinsic to all programming. Think, create, test, debug, fix; that's the basic iteration, and it applies to the meta-levels as well (the choosing of languages, features, architecture, etc.). The only sure way to avoid failure is to do nothing. Paradoxically, that's also the one way to ensure failure: when nothing happens, nothing happens.
 
Last edited:

SidBala

macrumors 6502a
Jun 27, 2010
533
0
I don't know what your overall objective is, but I will recommend C anyway.

C++, more specifically. It teaches you all the important concepts and will give you a very good sense of how the hardware works.
 

Cromulent

macrumors 604
Oct 2, 2006
6,802
1,096
The Land of Hope and Glory
C++, more specifically. It teaches you all the important concepts and will give you a very good sense of how the hardware works.

Why do people keep saying this? C++ gives you no indication of how modern computers work.

C++ sits at a pretty high level of abstraction from the way the hardware works. Sure, there are languages like Python that sit much higher still, but that doesn't make C or C++ any less high-level. I mean, what does C++ actually tell you at the hardware level? The register keyword? That's not really showing you much about the real inner workings of the computer.
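To put that in perspective, this is about the full extent of the core language's "hardware access", and it's only a hint (a trivial sketch):

Code:
int main(void)
{
    /* A hint to keep i in a CPU register. The compiler is free to
       ignore it, and modern compilers generally do. You can't even
       take a register variable's address with &i. */
    register int i;

    for (i = 0; i < 10; i++)
        ;

    return 0;
}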

If you really want to understand how the hardware works, get yourself an 8-bit embedded device, read the datasheet and do some assembly programming on it.
 

jiminaus

macrumors 65816
Dec 16, 2010
1,449
1
Sydney
If you really want to understand how the hardware works, get yourself an 8-bit embedded device, read the datasheet and do some assembly programming on it.

Or do some retro computing, like I'm prone to doing.

This afternoon I was playing with an Altair 8800 on SIMH. I read the following valuable advice:
Care must be taken not to POKE data into the storage area occupied by Altair BASIC or the system may be POKEd to death, and BASIC will have to be loaded again.

-- Altair 8800 basic reference manual, MITS 1977, p. 27

It tickled every nerd funny bone in my being! :D
 

firewood

macrumors G3
Jul 29, 2003
8,108
1,345
Silicon Valley
Learn both, starting with Python.

Both C and Python have something to offer a beginning programmer, and the variety will more quickly give you a broader understanding of programming (as opposed to thinking that the syntax and tricks of one particular language are the only way to do things).

Start with Python because there are fewer details you will have to memorize before your first few programs work.
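To give you an idea of the difference (a minimal sketch): even the classic first program needs this much ceremony in C, while the entire Python equivalent is a single print statement.

Code:
/* C version: boilerplate you must get right before anything runs. */
#include <stdio.h>

int main(void)
{
    printf("Hello, world!\n");
    return 0;
}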
 

SidBala

macrumors 6502a
Jun 27, 2010
533
0
Why do people keep saying this? C++ gives you no indication of how modern computers work.

C++ sits at a pretty high level of abstraction from the way the hardware works. Sure, there are languages like Python that sit much higher still, but that doesn't make C or C++ any less high-level. I mean, what does C++ actually tell you at the hardware level? The register keyword? That's not really showing you much about the real inner workings of the computer.

If you really want to understand how the hardware works, get yourself an 8-bit embedded device, read the datasheet and do some assembly programming on it.

I would have to disagree. Coding in C/C++ does give you a sense of how the hardware works. I have done a lot of C/C++ programming and a bit of assembly too. Certainly the memory management gives you a sense of how the stack and the heap work. Function calls and return values map directly to operations on the stack. Moreover, the performance of code written in C/C++ very nearly matches that of similar code written in assembly. The fact that you can write code that you know will behave a certain way on certain hardware itself gives you a good sense of what the hardware is doing.

Compare all this with a bytecode or interpreted language. Everything is done through an abstraction layer, which really prevents the coder from knowing what the real hardware is doing. That is a great thing for actual coding, but not so much for learning.
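For example (a quick sketch, nothing exotic), the stack/heap split is right there in the language:

Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int on_stack = 42;                      /* automatic storage: the stack */
    int *on_heap = malloc(sizeof *on_heap); /* dynamic storage: the heap */

    if (on_heap == NULL)                    /* malloc can fail; you check */
        return EXIT_FAILURE;

    *on_heap = 42;
    printf("stack: %d, heap: %d\n", on_stack, *on_heap);

    free(on_heap);                          /* your job, not a garbage collector's */
    return EXIT_SUCCESS;
}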
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
I would have to disagree. Coding in C/C++ does give you a sense of how the hardware works. I have done a lot of C/C++ programming and a bit of assembly too. Certainly the memory management gives you a sense of how the stack and the heap work. Function calls and return values map directly to operations on the stack.

But how does that give you a sense of how the hardware works? For the average C programmer, calling a function and storing its return value doesn't actually require him to do anything differently than in, say, Perl or any other interpreted language. He calls the function by its name and stores the return value in a variable.

Heck, even memory management doesn't quite tell you how the stack and heap work or what the difference between them is. The concept is abstracted so that it's all dynamic or static allocation depending on the data type. For all we know, the compiler doesn't even have to use a stack-and-heap paradigm. Neither does the underlying hardware.

In fact, the ANSI C standard makes sure to abstract all of that away so that the language works across varying hardware. Even I/O is abstracted; notice how there's no direct manipulation of input/output devices in C. You get 3 standard buffers, stdin, stdout and stderr, and you read from and write to them. You cannot control things like whether characters are echoed, or read input character by character as it is typed, etc., because all of that is hardware-dependent. Same for output: there's no color, there are no graphics, there's nothing but strings of characters stored in a buffer and sent to the output device through either the output or the error buffer.
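To illustrate, the whole portable I/O story really is just those three buffers (a trivial sketch):

Code:
#include <stdio.h>

int main(void)
{
    char line[80];

    /* No device control, no echo toggling, no colors: just buffered
       character streams, whatever the hardware underneath may be. */
    fputs("Type something: ", stdout);

    if (fgets(line, sizeof line, stdin) != NULL)
        printf("You typed: %s", line);
    else
        fputs("Read error.\n", stderr);  /* the third standard buffer */

    return 0;
}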

ANSI C is quite detached from the hardware. Now, if you want to argue that many hardware-manipulation libraries are written in C, or that it has the register keyword (which the standard says the compiler is free to ignore), then yes, C can teach you a thing or two about the underlying hardware. Otherwise, it's just a high-level language.
 

balamw

Moderator emeritus
Aug 16, 2005
19,366
979
New England
Most people who say C is the best first language say it teaches you more about programming and computers.

Just to bring it back to the OP's question.

I would argue that the main reason C is a good language to learn (and "teaches you more about programming and computers") is similar to why it is good to learn Latin: many other languages derive from it, so knowing C can lead you to other things more easily. In that way it does give you a better "foundation" than Python.

However, Python or other higher-level, C-derived languages can give you standard tools that make many tasks easy which are difficult in straight C.

e.g. if learning C++, a new programmer should probably learn to use the STL early on instead of spending all of their initial learning time duplicating what the STL already implements, and I'm still not convinced that spending a lot of time learning how to do console I/O with printf and scanf is really useful if your goal is ultimately to develop mostly GUI apps.

Ultimately, I agree with chown33: your choice will depend on what it is you want to do with your code. Some things will be easier in Python; others may be easier in C or a C-derived language like Objective-C, or C++, ...

B
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
and I'm still not convinced that spending a lot of time learning how to do console I/O with printf and scanf is really useful if your goal is ultimately to develop mostly GUI apps.

Actually, if you're spending a lot of time learning how to use console I/O, you should probably get out of the field altogether. The point of not going GUI while learning is that it is much quicker to use printf()/fgets() (I hate scanf()) to make a quick interface and get input into your program than it is to learn the metaphors behind a GUI toolkit and get something displayed on screen, much less get user input back into your program.

You can then spend a lot more time actually learning the language rather than learning a GUI library. Once you have the language down, it's time to implement GUIs.

Just take Win32. Let's say you're trying to learn C on Windows to write Win32 programs (aside from being a masochist). To get to the point where you can display something on screen with a simple MessageBox(), you'll have to learn what a window handle is, how to jam enumerated flags into a single value using OR operations, and how to check return values for simple "button" input (like Yes, No or Cancel on your MessageBox()). InputBox()? Not a Win32 function. So to input some numbers or strings, you have to learn all about creating a custom dialog, the message loop, receiving and processing Windows messages, the structure of a Windows message... etc.
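Just that first MessageBox() step already looks something like this (a sketch from memory; the strings and flag choices are made up, but MessageBoxA(), MB_YESNO and IDYES are the real Win32 names):

Code:
#include <windows.h>

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nShow)
{
    /* Flags get OR'd together into a single value... */
    int answer = MessageBoxA(NULL, "Continue?", "Demo",
                             MB_YESNO | MB_ICONQUESTION);

    /* ...and "input" means checking the return value. */
    if (answer == IDYES)
        MessageBoxA(NULL, "You said yes.", "Demo", MB_OK);

    return 0;
}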

By the time you've learned all of that (Petzold is quite the large volume), you still haven't learned about looping, memory management, data types, conditional statements, code blocks, etc. You didn't actually learn C at all, or you got so confused trying to make heads or tails of Petzold that you quit altogether.

GUI toolkits are most often complicated beasts (even the relatively simple ones like GTK+ or ncurses), and that is why most people suggest sticking to the ANSI standard printf()/fgets() functions while learning the C language itself, before moving on to GUIs.
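For contrast, here's the entire "user interface" of a typical learning exercise in plain ANSI C (a minimal sketch):

Code:
#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char buffer[32];
    long n;

    printf("Enter a number: ");  /* that's the whole interface */

    if (fgets(buffer, sizeof buffer, stdin) == NULL)
        return EXIT_FAILURE;

    n = strtol(buffer, NULL, 10);  /* and the whole input handling */
    printf("Twice that is %ld.\n", 2 * n);

    return EXIT_SUCCESS;
}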
 

jiminaus

macrumors 65816
Dec 16, 2010
1,449
1
Sydney
Actually, if you're spending a lot of time learning how to use console I/O, you should probably get out of the field altogether. The point of not going GUI while learning is that it is much quicker to use printf()/fgets() (I hate scanf()) to make a quick interface and get input into your program than it is to learn the metaphors behind a GUI toolkit and get something displayed on screen, much less get user input back into your program.

You can then spend a lot more time actually learning the language rather than learning a GUI library. Once you have the language down, it's time to implement GUIs.

It depends on the audience and their objectives.

I've seen classes of Bachelor of Business students learn to program quite successfully using Visual Basic 6, and they started off learning GUIs. I think your argument is for why not to start learning to program with Win32 programming, not an argument against starting to learn programming by creating GUIs. If we'd started the business students off with the command line, the majority would have switched off. The fact that they could create a simple GUI program by drag and drop by the end of the first class seemed to excite them. And by the end, once they were comfortable and confident with the idea of a program, they knew looping, conditionals, data types, and even object orientation (well, VB6's style of OOP).

But this was a class of Business students taking a programming minor. A class of Computer Science students is a different story: CS students should know all the minute details. In my CS degree we did indeed start with MIPS assembly before moving on to Ada. Unfortunately they've changed it, and now they start with and stick to Java, with assembly-language programming stuffed into a computer architecture course. :(
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
I've seen classes of Bachelor of Business students learn to program quite successfully using Visual Basic 6

I actually hesitated to mention Visual Basic in my post. In my 2nd college semester, we had just come off a pure C curriculum in the 1st semester, and they "taught us GUIs" using Visual Basic. The old pre-.NET Visual Basic was as much out of the way as you could get. There was simply no "GUI metaphor" to learn from it: drag and drop controls, double-click one to set an "action". You very much spent all your time actually doing Basic and learning the control structures, blocks, conditionals, looping and storage (on top of that, it did have an InputBox() function, unlike plain Win32...), which in our case was useless, as we had learned all of that in the 1st semester. It was the easiest programming class I ever took, especially considering how limited the Basic language is compared to C. About the only new concept we learned there was sharing code between events (using globals... err, thanks, but I could have figured that one out in 5 minutes; I didn't need 45 hours of class for it).

However, even with the new Visual Studio tools, learning to code GUIs on Windows still requires a lot of "learning the GUI metaphors". On OS X or Linux or Unix? It's the same. No GUI toolkit has that "out of the way" feel that Visual Basic had in those days (no, I haven't tried REALbasic on the Mac, so I can't comment there).

Even Interface Builder requires that you know what you are doing on a certain level, and it requires learning a GUI metaphor for connecting classes to the controls you dragged into your view. It requires learning about outlets and actions, how to properly define them in your code, how to extend a view controller (so you're learning inheritance before conditionals and looping?), etc.

But that is beside the point, since we're on a Mac forum discussing learning C or Python as a first language. I guess you were just nitpicking, and yes, your nitpick was justified. But I still stand by my point: if you don't know the basics (pun not intended) of programming, it's better to stick to printf()/fgets() while learning the ropes before getting into more complicated I/O scenarios.
 

balamw

Moderator emeritus
Aug 16, 2005
19,366
979
New England
I think your argument is for why not to start learning to program with Win32 programming, not an argument against starting to learn programming by creating GUIs.

Exactly.

Referring to Petzold dates you, KnightWRX, and the fact that I get the reference dates me. Today the beginning Windows programmer would be working with .NET managed code, and probably C# instead of C or C++.

In fact, many C# versions of "Hello World" are form- or dialog-based, because that is what the audience expects an application to be, and it's dead simple (unlike Petzold-era C/Win32 code). Those who want to go deeper can learn how to use the console, but they don't have to in order to build useful tools in C# or VBA.

B
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
Exactly.

Referring to Petzold dates you, KnightWRX, and the fact that I get the reference dates me. Today the beginning Windows programmer would be working with .NET managed code, and probably C# instead of C or C++.

In fact, many C# versions of "Hello World" are form- or dialog-based, because that is what the audience expects an application to be, and it's dead simple (unlike Petzold-era C/Win32 code). Those who want to go deeper can learn how to use the console, but they don't have to in order to build useful tools in C# or VBA.

B

OK, so I used the most complicated GUI there is to program as my example, just to prove a point :D (though pure Xlib is probably as hellish; I should give that a spin someday for kicks. Thank god GTK+ was around when I switched to Linux in the late 90s).

If anything, though, I'd suggest ncurses if you really want a "GUI". You can get some colors printed out quite simply, if only to make your text interface look better than gray on black, plus you get access to the great getch():

Code:
$ cat main.c
#include <stdlib.h>
#include <ncurses.h>

int main(int argc, char **argv)
{
	initscr();                   /* enter curses mode */

	if (has_colors())            /* set up colors only if the terminal supports them */
	{
		start_color();
		init_pair(1, COLOR_CYAN, COLOR_BLACK);
		attron(A_BOLD | COLOR_PAIR(1));
		printw("Hello World!");
		attroff(A_BOLD | COLOR_PAIR(1));
		refresh();           /* push the output to the screen */
	}
	getch();                     /* wait for a single keypress, no Enter needed */
	endwin();                    /* leave curses mode and restore the terminal */

	return EXIT_SUCCESS;
}
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
Why pick curses if your end goal is to write OS X GUI applications? Interface Builder shares nothing, philosophically or conceptually, with curses. As a consequence, what you pick up from curses will have little bearing on what you're about to tackle in IB.
 

Jaimi

macrumors regular
Jul 22, 2009
135
2
Hey guys,

I've been wanting to learn to program for a while, but I can't decide which language to learn. I've seen arguments for C and for Python. Most people who say C is the best first language say it teaches you more about programming and computers. Most people who say Python is best say it's more powerful and more practical than a lower-level language like C.

C has been nearly abandoned for over a decade. C++ is the language of choice for systems programmers, though it can be quite difficult to learn its nuances. C# is the language of choice for business programmers on Windows, and somewhat on other platforms (through Mono). It's similar to Java, another good choice, but more powerful.

I would recommend starting with C#. It's easy, it's fast, it's well supported, and a lot of people use it. There are a ton of resources for it as well.

If you are developing just for Mac, then you're kind of stuck with Objective-C.
 

balamw

Moderator emeritus
Aug 16, 2005
19,366
979
New England
OK, so I used the most complicated GUI there is to program as my example, just to prove a point :D

I think the point I was trying to make is that to many who have grown up in the mouse-and-GUI age, it is the console metaphor that is completely foreign. Thus, bringing in the GUI early on, even in the form of a dialog box like this: http://msdn.microsoft.com/en-us/library/aa984463(v=vs.71).aspx, keeps it relatable. Focusing on console I/O makes it seem academic rather than practical and makes the goal seem distant.

It's just like how, for those of us who grew up in the VT100 days, many of the things that had arisen around punch cards (like FORTRAN's funky column restrictions http://en.wikipedia.org/wiki/Fortran#FORTRAN) were alien concepts.

This is one of the reasons I like Stevenson's "Cocoa and Objective-C: Up and Running". It cuts to the chase quickly, though surely if you want to be serious about it, you'll need to go back and fill in with something like Hillegass or Kochan.

That said, learning is a very personal thing, and what works for one person won't necessarily work for someone else.

B
 

Rodimus Prime

macrumors G4
Oct 9, 2006
10,136
4
Personally, I would say C is a good starting language for getting down the basics. That or VB.

The GUI part of programming is a pretty limited slice of what you will really be doing in code, and honestly I find coding GUIs more annoying than anything else: they can be very time-consuming to fine-tune, and they have very little real meat to them in terms of what you are actually trying to figure out how to do in your code. Hence most people just use some kind of IDE to generate the GUI code for them.

Back to the original point: I say C is the best to get started in, as most of the major programming languages used in industry are some type of C-based language. Once you have the basics of C programming down, you can jump ship and figure out which one you like best.

For me, I know Java a heck of a lot better than C. I got the basics of C down, then started learning Java, and that's when I started doing more object-oriented programming.

C will teach you the basics of function calls, methods, loops, etc. After that it just takes time to learn things.
In one semester of school I was programming in Java, C, and C# for different classes and really did not find it that hard to jump between them; the worst part was writing Java-style calls in C and C-style calls in Java, but that was a syntax issue, and an easy one to fix.
Python, from my understanding, is pretty different from C and does not transfer as well to other common languages; it is more of a scripting language than a programming one. That is just my limited understanding of it, and I have never used it. I had a roommate who bitched to high heaven about it when a lab partner programmed their project in it and he had to make it work with his C code.
 

LordCalvert

macrumors newbie
Apr 24, 2011
1
0
Fortran 2008

Fortran 2008 is an excellent programming language for a beginner. It is a modern, high-level, object-oriented language with a much more natural syntax than any C-type language. Also, it doesn't give you nearly as much rope to hang yourself with as C does. Some references can be found here and here.
 

ulbador

macrumors 68000
Feb 11, 2010
1,554
0
People always recommend C as a good foundation. As someone above mentioned, it's like learning Latin as a precursor to learning other languages.

While you are busy beating your head against the wall trying to figure out basic memory management in C, someone else would be diving into C#, Java or any number of higher level/managed or scripting languages such as PHP, Python or Ruby and actually accomplishing something that will keep them engaged.

I mean, come on, just compare:

Code:
char str[15];
strcpy(str, "Hello world!");

To

Code:
$str = "Hello World!";

or 

String str = "Hello world!";

In the first example, if you change the text to "Hello world, my name is Joe the hotdog maker!", weird things will happen. And while simple assignment may not be the best example, once you start digging into changing, replacing or copying strings, it's a whole other issue. It becomes pretty clear, especially to an incipient software engineer, that it would be much better to learn the core concepts than the language idiosyncrasies.
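To make the pitfall concrete (a sketch; the exact symptom is unpredictable, because this is undefined behavior):

Code:
#include <string.h>

int main(void)
{
    char str[15];

    strcpy(str, "Hello world!");  /* 12 chars + '\0' = 13 bytes: fits in 15 */

    /* 45 chars + '\0' = 46 bytes: writes far past the end of str.
       Undefined behavior: it may seem to work, crash, or corrupt memory. */
    strcpy(str, "Hello world, my name is Joe the hotdog maker!");

    return 0;
}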

Honestly, it would be best to use whatever programming language keeps you interested in learning the basics: loops, variables, functions, etc. I was talking to my brother today, and we got to talking about how many newer programmers don't even understand the use of a simple loop.

Once you understand these core concepts, you can almost pick up any language you want; it's mostly just syntactic differences.
 
Last edited:

firewood

macrumors G3
Jul 29, 2003
8,108
1,345
Silicon Valley
C has been nearly abandoned for over a decade.

Pure nonsense, given that there are 100X more embedded systems than Windows PCs currently running your life, most likely programmed in C. (The typical Windows laptop also has about a half dozen embedded processors in the box keeping it going. Sorry, no C# there.)
 

balamw

Moderator emeritus
Aug 16, 2005
19,366
979
New England
Honestly, it would be best to use whatever programming language keeps you interested in learning the basics: loops, variables, functions, etc. I was talking to my brother today, and we got to talking about how many newer programmers don't even understand the use of a simple loop.

And, as chown33 pointed out in the first reply, what will keep you interested in learning will depend heavily on what you actually want to do. As several have already pointed out, if you ultimately want to play with embedded processors or move on to Objective-C, stick with C.

One of my favorite programming environments is Agilent VEE. It's kind of like putting your program together with Lego bricks, and it is focused on the flow of data between the blocks. This allows it to do things like parallel execution of threads, etc., without being explicitly told to do so. It also does away with many loops by handling vectors and matrices in single functions.

B
 

KnightWRX

macrumors Pentium
Jan 28, 2009
15,046
4
Quebec, Canada
While you are busy beating your head against the wall trying to figure out basic memory management in C, someone else would be diving into C#, Java or any number of higher level/managed or scripting languages such as PHP, Python or Ruby and actually accomplishing something that will keep them engaged.

I mean, come on, just compare:

Code:
char str[15];
strcpy(str, "Hello world!");

To

Code:
$str = "Hello World!";

or 

String str = "Hello world!";

Wait, what's wrong with:

Code:
char str[] = "Hello world!";

which works the same way and avoids the pitfall you mentioned? Maybe you should go back to learning C before you try to make examples showing how it's "lesser" than other languages. ;) Do hashes next! :D

C lacks a String type and a hash type. That's hardly a showstopper to learning the language. Also, no one is asking anyone to learn all the intricate details of C and the ANSI C library before jumping to another language. C is a fine language for learning the basics of programming that carry over to other languages like Java, Objective-C, C++, etc.:

- looping
- conditionals
- blocks
- variable scopes
- data types
- arrays
- functions, their arguments, returns
- pointers and their management

All of these are used in other languages, and learning them in C means you're a syntax change away from doing them just fine in another language. An if is an if; it's just how you place the parentheses and define the block under it that changes. And in languages like Java, and heck even Perl, which borrow a lot of syntax from C, it will look almost the same.
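For instance (a trivial sketch), this C loop is, give or take punctuation, exactly what you'd write in Java, C++ or Objective-C:

Code:
#include <stdio.h>

int main(void)
{
    int i;

    /* The same shape in Java, C++ and Objective-C; very close in Perl too. */
    for (i = 0; i < 5; i++)
    {
        if (i % 2 == 0)
            printf("%d is even\n", i);
        else
            printf("%d is odd\n", i);
    }

    return 0;
}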

What you say C makes hard (the lack of certain complex types) is exactly what makes C the best platform to learn on. Let's say our new guy learns Objective-C right away, with NSString and NSDictionary managing all of his string and hashing needs. What happens when he goes to Java? [string isEqualToString: otherstring]; isn't there. He has to relearn it all. If instead he went the C route and learned what a string really is to the computer (an array with some kind of terminating character), he knows that the "String" data type is not a primitive data type but usually a complex object provided by the framework. He's not under the impression that it works the same across languages/platforms.
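A quick sketch of what I mean by a string really being an array:

Code:
#include <stdio.h>
#include <string.h>

int main(void)
{
    /* These two declarations build the same four bytes in memory. */
    char a[] = "Hi!";
    char b[] = { 'H', 'i', '!', '\0' };  /* the '\0' terminator is what
                                            makes the array a "string" */

    printf("%zu %zu\n", strlen(a), strlen(b));  /* prints: 3 3 */

    return 0;
}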

Pure nonsense, given that there are 100X more embedded systems than Windows PCs currently running your life, most likely programmed in C. (The typical Windows laptop also has about a half dozen embedded processors in the box keeping it going. Sorry, no C# there.)

Not to mention most system software. On your typical Unix server, almost everything running in the background besides the application is written in C: the clustering software, the RDBMS, the web server, heck, the kernel and all its drivers... C is still very much alive.
 
Last edited: