
View Full Version : First programming language: C, Python or something else




The Guy
Apr 24, 2011, 12:06 AM
Hey guys,

I've been wanting to learn to program for a while, but I can't decide which language to learn. I've seen arguments for C and for Python. Most people saying C is the best first language say it teaches you more about programming and computers. Most people saying Python is the best say that it is more powerful and more practical than lower-level languages like C.



chown33
Apr 24, 2011, 01:12 AM
I see no question here. I see no background, either; you haven't said anything about what you already know about computers, programming, logic, etc.

What kind of programs do you want to write? If you had to prioritize the top 3 features of your choice, what would they be, in order?

From the meager information you've provided so far, I'd say it doesn't matter which one you do, as long as you do something. Try learning each one for 2 weeks, then pick the one you like better after 4 weeks.

Accept that you will make mistakes. It's intrinsic to all programming. Think, create, test, debug, fix; that's the basic iteration, and it applies to the meta-levels as well (the choosing of languages, features, architecture, etc.). The only sure way to avoid failure is to do nothing. Paradoxically, that's also the one way to ensure failure: when nothing happens, nothing happens.

SidBala
Apr 24, 2011, 04:05 AM
I don't know what your overall objective is but I will recommend C anyway.

C++ more specifically. It teaches you all the important concepts and it will give you a very good sense of how the hardware works.

Cromulent
Apr 24, 2011, 04:27 AM
C++ more specifically. It teaches you all the important concepts and it will give you a very good sense of how the hardware works.

Why do people keep saying this? C++ gives you no indication of how modern computers work.

C++ is a pretty high level of abstraction from the way the hardware works. Sure, there are languages like Python that are much higher still, but that doesn't make C or C++ any less high-level. I mean, what does C++ actually tell you at a hardware level? The register keyword? That's not really showing you much about the real inner workings of the computer.

If you really want to understand how the hardware works get yourself an 8 bit embedded device, read the datasheet and do some assembly programming on it.

jiminaus
Apr 24, 2011, 05:12 AM
If you really want to understand how the hardware works get yourself an 8 bit embedded device, read the datasheet and do some assembly programming on it.

Or do some retro computing, like I'm prone to doing.

This afternoon I was playing with an Altair 8800 on SIMH. I read the following valuable advice.

Care must be taken not to POKE data into the storage area occupied by Altair BASIC or the system may be POKEd to death, and BASIC will have to be loaded again.

-- Altair 8800 BASIC Reference Manual, MITS, 1977, p. 27


It tickled every nerd funny bone in my being! :D

firewood
Apr 24, 2011, 09:49 AM
Learn both, starting with Python.

Both C and Python have something to offer a beginning programmer, and the variety will more quickly give you a broader understanding of programming (as opposed to thinking the syntax and tricks of only one language are the only way to do things).

Start with Python because there are fewer details you will have to memorize for your first few programs to work.
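
Even hello world shows the gap. In Python it's a single print() line; in C, the minimum you have to get right looks something like this (a minimal sketch):


#include <stdio.h>              /* you must know to pull in the I/O header */

int main(void)                  /* you must know the entry-point signature */
{
    printf("Hello, world!\n");  /* you must know the newline escape */
    return 0;                   /* you must know to return an exit status */
}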

SidBala
Apr 24, 2011, 04:14 PM
Why do people keep saying this? C++ gives you no indication of how modern computers work.

C++ is a pretty high level of abstraction from the way the hardware works. Sure, there are languages like Python that are much higher still, but that doesn't make C or C++ any less high-level. I mean, what does C++ actually tell you at a hardware level? The register keyword? That's not really showing you much about the real inner workings of the computer.

If you really want to understand how the hardware works get yourself an 8 bit embedded device, read the datasheet and do some assembly programming on it.

I would have to disagree. Coding in C/C++ does give you a sense of how the hardware works. I have done a lot of C/C++ programming and a bit of assembly too. Certainly the memory management gives you a sense of how the stack and the heap work. Function calls and return values directly map to operations on the stack. Moreover, the performance of code written in C/C++ very nearly matches similar code written in assembly. The fact that you can write code that you know will behave a certain way on certain hardware itself gives you a good sense of what the hardware is doing.
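
For example, a minimal sketch (the function names are made up, and this assumes a typical stack/heap implementation, which the language standard doesn't strictly require):


#include <stdlib.h>

int on_the_stack(void)
{
    int local = 42;   /* automatic storage: lives on the stack, gone on return */
    return local;     /* the value travels back via the calling convention */
}

int *on_the_heap(void)
{
    int *p = malloc(sizeof *p);  /* dynamic storage: lives on the heap */
    if (p != NULL)
        *p = 42;
    return p;         /* still valid after the return; the caller must free() it */
}

int main(void)
{
    int s = on_the_stack();
    int *h = on_the_heap();
    int ok = (h != NULL && *h == s);
    free(h);          /* heap memory must be given back by hand */
    return ok ? 0 : 1;
}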

Compare all this with a bytecode or interpreted language. Everything is done over an abstraction layer which really prevents the coder from knowing what the real hardware is doing. That is a great thing for actual coding, but not so much for learning.

KnightWRX
Apr 24, 2011, 05:03 PM
I would have to disagree. Coding in C/C++ does give you a sense of how the hardware works. I have done a lot of C/C++ programming and a bit of assembly too. Certainly the memory management gives you a sense of how the stack and the heap work. Function calls and return values directly map to operations on the stack.

But how does that give you a sense of how the hardware works ? For the average C programmer, calling a function and storing its return value doesn't actually require him to do anything differently than in, let's say, Perl or any other interpreted language. He calls the function by its name and stores the return value in a variable.

Heck, even memory management doesn't quite tell you how the stack and heap function and what the difference between the two is. The concept is abstracted so that it's all dynamic or static allocation depending on the data type. For all we know, the compiler doesn't even have to use a stack-and-heap paradigm. Neither does the underlying hardware.

In fact, the ANSI C standard makes sure to abstract all of that away so that the language will work across varying hardware. Even I/O is abstracted; notice how there's no direct manipulation of input/output devices in C. You get 3 standard buffers, stdin, stdout and stderr, and you read from them/write to them. You cannot control things like echoing chars or not, reading character by character directly as it is input, etc., because all of those are hardware-dependent. Same for output. There's no color, there's no graphics, there's nothing but strings of characters stored in a buffer and sent to the output device via either the output or the error buffer.
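
Portable ANSI C I/O boils down to something like this (a minimal sketch):


#include <stdio.h>

int main(void)
{
    char line[128];

    fputs("Type something: ", stdout);            /* write to the output buffer */
    fflush(stdout);                               /* make sure the prompt appears */
    if (fgets(line, sizeof line, stdin) != NULL)  /* read a line from the input buffer */
        fputs(line, stdout);                      /* echo it back */
    else
        fputs("read error\n", stderr);            /* complain on the error buffer */

    return 0;
}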

ANSI C is quite detached from the hardware. Now if you want to argue that many hardware manipulation libraries are written in C or that it has the "register" keyword (which the standard says the compiler is free to ignore), then yes, C can teach you a thing or two about the underlying hardware. Otherwise, it's just a high-level language.

balamw
Apr 24, 2011, 05:44 PM
Most people saying C is the best first language say it teaches you more about programming and computers.

Just to bring it back to the OP's question.

I would argue that the main reason C is a good language to learn (and "teaches you more about programming and computers") is similar to why it is good to learn Latin: many other languages derive from it, so knowing C can lead you to other things more easily. In that way it does give you a better "foundation" than Python.

However, Python or another higher-level, C-derived language can give you standard tools to facilitate many tasks that may be difficult in straight C.

e.g. if learning C++, a new programmer should probably learn to use the STL early on instead of spending all of their initial learning time duplicating what is already implemented in the STL, and I'm still not convinced that spending a lot of time learning how to do console I/O using printf and scanf is really useful if your goal is ultimately to develop mostly GUI apps.

Ultimately, I agree with chown33, your choice will depend on what it is that you want to do with your code. Some things will be easier in python, others may be easier in C or a C derived language like Objective-C, or C++, ...

B

KnightWRX
Apr 24, 2011, 06:07 PM
and I'm still not convinced that spending a lot of time learning how to do console I/O using printf and scanf is really useful if your goal is ultimately to develop mostly GUI apps.

Actually, if you're spending a lot of time learning how to use console I/O, you should probably get out of the field altogether. The point of not going GUI to learn programming is that it is much quicker to simply printf()/fgets() (I hate scanf()) to make a quick interface and get input to your program than it is to learn the metaphors behind a GUI toolkit and get something displayed on screen, much less get user input back to your program.

You then can spend a lot more time actually learning the language rather than learning a GUI library. Once you have the language down, it's time to implement GUIs.
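
Something like this is all the "interface" you need while studying the language itself (a minimal sketch, using fgets() instead of the scanf() I hate):


#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char line[64];
    long n;

    printf("Enter a number: ");                   /* the whole "interface" */
    fflush(stdout);                               /* show the prompt before reading */
    if (fgets(line, sizeof line, stdin) == NULL)  /* one line of input */
        return EXIT_FAILURE;

    n = strtol(line, NULL, 10);                   /* parse it without scanf() */
    printf("Twice that is %ld\n", 2 * n);

    return EXIT_SUCCESS;
}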

Just take Win32. Let's say you're trying to learn C on Windows to write Win32 programs (aside from being a masochist). To get to a point where you can display something on screen using a simple MessageBox(), you'll have to learn what a Window Handle is, how to jam enumerator types into a single value using OR operations, and how to check return values for simple "button" input (like YES or NO or Cancel on your MessageBox()). InputBox() ? Not a Win32 function. So now to input some numbers or strings, you have to learn all about creating a custom dialog, the message loop, receiving and processing Windows messages, the structure of a Windows message... etc..
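
Even just the MessageBox() step looks something like this (a minimal sketch; a real program still needs the window class, the message loop and the rest):


#include <windows.h>

int WINAPI WinMain(HINSTANCE hInst, HINSTANCE hPrev, LPSTR cmdLine, int nShow)
{
    /* Style flags get OR'd together into a single argument... */
    int choice = MessageBox(NULL, TEXT("Keep going?"), TEXT("Demo"),
                            MB_YESNOCANCEL | MB_ICONQUESTION);

    /* ...and the "button input" comes back as a plain integer return value. */
    if (choice == IDYES)
        MessageBox(NULL, TEXT("You said yes."), TEXT("Demo"), MB_OK);

    return 0;
}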

By the time you've learned all of that (Petzold is quite the large volume), you've failed to learn about looping, memory management, data types, conditional statements, code blocks, etc.. You didn't actually learn C at all, or you got so confused trying to make heads or tails of Petzold that you quit altogether.

GUI toolkits are most often complicated beasts (even the simpler ones like GTK+ or ncurses), and that is why most people suggest just using the ANSI standard printf()/fgets() functions to actually learn the C language before moving on to GUIs.

jiminaus
Apr 24, 2011, 06:36 PM
Actually, if you're spending a lot of time learning how to use console I/O, you should probably get out of the field altogether. The point of not going GUI to learn programming is that it is much quicker to simply printf()/fgets() (I hate scanf()) to make a quick interface and get input to your program than it is to learn the metaphors behind a GUI toolkit and get something displayed on screen, much less get user input back to your program.

You then can spend a lot more time actually learning the language rather than learning a GUI library. Once you have the language down, it's time to implement GUIs.


It depends on the audience and their objectives.

I've seen classes of Bachelor of Business students learn to program quite successfully using Visual Basic 6, and they started off learning GUIs. I think your argument is why not to start learning to program by learning Win32 programming, not an argument why not to start learning programming by creating GUIs. If we'd started the business students off with the command line, the majority would have switched off. The fact that they could create a simple GUI program by drag'n'drop by the end of the first class seemed to excite them. And by the end, once they were comfortable and confident with the idea of a program, they knew looping, conditionals, data types, and even object-orientation (well, VB6's style of OOP).

But this is a class of Business students taking a programming minor. A class of Computer Science students, though, is a different story. CS students should know all the minute details. In my CS degree we did indeed start with MIPS assembly before moving on to Ada. Unfortunately they changed it and now they start and stick with Java, and assembly language programming has been stuffed into a computer architecture course. :(

KnightWRX
Apr 24, 2011, 07:23 PM
I've seen classes of Bachelor of Business students learn to program quite successfully using Visual Basic 6

I actually hesitated to talk about Visual Basic in my post. In my 2nd college semester, we had just come off of a pure C curriculum in the 1st semester and they "taught us GUIs" using Visual Basic. The old pre-.NET Visual Basic was as much out of the way as you could get. There was simply no "GUI metaphor" to learn from it. Drag and drop controls, double-click one to set an "action". You very much spent all your time actually doing Basic and learning the control structures, blocks, conditionals, looping and storage (on top of it, it did have an InputBox() function, compared to plain Win32...), which in our case was useless, as we had learned all that in 1st semester. The easiest programming class I ever took, especially considering how limited the Basic language is compared to C. About the only concept we learned there was sharing code between events (using Globals... err... thanks, but I could have figured that one out in 5 minutes... don't need 45 hours of class for it).

However, even with the new Visual Studio tools, learning to code GUIs in Windows still requires a lot of "learning the GUI metaphors". On OS X or Linux or Unix ? It's the same. No GUI toolkit has that "out of the way" feel that Visual Basic brought in those days (no, I didn't try Real Basic on Mac, I can't comment there).

Even Interface Builder requires that you actually know what you are doing on a certain level and requires learning a GUI metaphor for connecting classes and the controls you dragged to your view. It requires learning about Outlets and Actions, how to properly define those in your code, extending a view controller (so you're learning inheritance before conditionals and looping ?), etc..

But that is beside the point, since we're on a Mac forum and discussing learning C or Python as a first language. I guess you were just nitpicking, and yes, your nitpick was justified. But I still stand by my point. If you don't know the basics (pun not intended) of programming, it's better to stick to printf()/fgets() while learning the ropes before getting into more complicated I/O scenarios.

balamw
Apr 24, 2011, 07:23 PM
I think your argument is why not to start learning to program by learning Win32 programming, not an argument why not to start learning programming by creating GUIs.

Exactly.

Referring to Petzold dates you, KnightWRX, and the fact I get the reference dates me. Today the beginning Windows programmer would be working with .NET managed code and probably C# instead of C or C++.

In fact many C# versions of "Hello World" are form or dialog based, because that is what the audience expects an application to be, and it's dead simple. (Unlike Petzold-era C/Win32 code.) Those who want to go deeper can learn how to use the console, but they don't have to in order to build useful tools in C# or VBA.

B

KnightWRX
Apr 24, 2011, 07:28 PM
Exactly.

Referring to Petzold dates you, KnightWRX, and the fact I get the reference dates me. Today the beginning Windows programmer would be working with .NET managed code and probably C# instead of C or C++.

In fact many C# versions of "Hello World" are form or dialog based, because that is what the audience expects an application to be, and it's dead simple. (Unlike Petzold-era C/Win32 code.) Those who want to go deeper can learn how to use the console, but they don't have to in order to build useful tools in C# or VBA.

B

Ok, so I used the most complicated GUI toolkit there is to prove a point :D (though pure Xlib is probably as hellish, I should give that a spin someday for kicks; thank god GTK+ was around when I switched to Linux in the late 90s).

If anything though, I'd suggest using ncurses if you really want a "GUI". You can get some colors printed out in quite a simple manner, if only to at least make your text interface look better than just gray on black, plus you get access to the great getch():



$ cat main.m
#import <Foundation/Foundation.h>
#include <stdlib.h>
#include <ncurses.h>

int main(int argc, char **argv)
{
    initscr();                 /* enter curses mode */
    start_color();             /* enable color support */

    if (has_colors())
    {
        init_pair(1, COLOR_CYAN, COLOR_BLACK);  /* color pair 1: cyan on black */
        attron(A_BOLD | COLOR_PAIR(1));
        printw("Hello World!");
        attroff(A_BOLD | COLOR_PAIR(1));
        refresh();             /* flush the output to the screen */
    }
    getch();                   /* wait for a keypress */
    endwin();                  /* leave curses mode */

    return EXIT_SUCCESS;
}

subsonix
Apr 24, 2011, 07:50 PM
Why pick curses if your end goal is to write OS X GUI applications? Interface Builder shares nothing, philosophically or conceptually, with curses. And as a consequence, what you have picked up from curses will have little bearing on what you're about to tackle in IB.

Jaimi
Apr 24, 2011, 07:58 PM
Hey guys,

I've been wanting to learn to program for a while, but I can't decide which language to learn. I've seen arguments for C and for Python. Most people saying C is the best first language say it teaches you more about programming and computers. Most people saying Python is the best say that it is more powerful and more practical than lower-level languages like C.

C has been nearly abandoned for over a decade. C++ is the language of choice for systems programmers, though it can be quite difficult to learn the nuances. C# is the language of choice for business programmers on Windows, and somewhat on other platforms (through Mono). It's similar to Java, another good choice, but more powerful.

I would recommend starting with C#. It's easy, it's fast, well supported, and a lot of people use it. There are a ton of resources for it as well.

If you are developing just for Mac, then you're kind of stuck with Objective-C.

balamw
Apr 24, 2011, 08:00 PM
Ok, so I used the most complicated GUI toolkit there is to prove a point :D

I think the point I was trying to make is that to many who have grown up in the mouse and GUI age, it is the console metaphor that is completely foreign. Thus somehow bringing in the GUI early on even in the form of a dialog box like this: http://msdn.microsoft.com/en-us/library/aa984463(v=vs.71).aspx keeps it relatable. Focusing on console I/O makes it seem academic rather than practical and makes the goal seem distant.

It's just like for those of us growing up in the VT100 days, many of the things that had arisen around punchcards (like FORTRAN's funky column restrictions http://en.wikipedia.org/wiki/Fortran#FORTRAN) were just alien concepts.

This is one of the reasons I like Stevenson's "Cocoa and Objective-C Up and Running (http://www.amazon.com/Cocoa-Objective-C-Running-Scott-Stevenson/dp/0596804792)" book. It cuts to the chase quickly, though surely if you want to be serious about it you need to go back and fill in with something like Hillegass or Kochan.

That said, learning is a very personal thing, and what works for one person won't necessarily work for someone else.

B

Rodimus Prime
Apr 24, 2011, 08:23 PM
Personally, I would say C is a good starting language for getting down the basics. That or VB.

The GUI part of programming is a pretty limited slice of what you will really be doing in code, and honestly I find coding GUIs more annoying than anything else: they can be very time-consuming to fine-tune, and they have very little real meat to them in terms of what you are actually trying to figure out how to do in your code. Hence the reason most people just use some type of IDE to generate the GUI code for them.

Back to the original point. I say C is the best to get started in, as most of the major programming languages used in industry are some type of C-based language. Once you have the basics of C programming down, you can jump ship and figure out which one you like best.

For me, I know Java a heck of a lot better than C. I got the basics of C down, then started learning Java and doing more object-oriented programming.

C will teach you the basics of function calls, methods, loops, etc. After that it just takes time to learn things.
In one semester of school I was programming in Java, C, and C# for different classes and really did not find it that hard to jump between them; the worst part was doing Java calls in C and C calls in Java, but that was a syntax issue and an easy one to fix.
Python, from my understanding, is pretty different from C and does not transfer as well into other common languages, and it is more of a scripting language than a programming one. That is just my limited understanding of it, and I have never used it. I had a roommate who bitched to high heaven when a lab partner programmed their project in it and he had to make it work with his C code.

LordCalvert
Apr 24, 2011, 10:46 PM
Fortran 2008 is an excellent programming language for a beginner. It is a modern, high-level, object-oriented language which has a much more natural syntax than any C-type language. Also, it doesn't give you nearly as much rope with which to hang yourself (http://www.literateprogramming.com/ctraps.pdf) as C does. Some references can be found here (ftp://ftp.nag.co.uk/sc22wg5/N1701-N1750/N1729.pdf) and here (http://www.nag.co.uk/IndustryArticles/Fortran_Matters_Cohen.pdf).

ulbador
Apr 24, 2011, 11:16 PM
People always recommend C as a good foundation. As someone above mentioned, it's like learning Latin as a precursor to learning other languages.

While you are busy beating your head against the wall trying to figure out basic memory management in C, someone else would be diving into C#, Java or any number of higher level/managed or scripting languages such as PHP, Python or Ruby and actually accomplishing something that will keep them engaged.

I mean, come on, just compare:



char str[15];
strcpy(str, "Hello world!");



To


$str = "Hello World!";

or

String str = "Hello world!";


In the first example, if you change the text to "Hello world, my name is Joe the hotdog maker!", you are going to overflow the buffer and have weird things happen. And while the simple assignment may not be the best example, when you start digging into changing, replacing or copying a string, it's a whole other issue. It becomes pretty clear, especially to an incipient software engineer, that it would be much better to learn the core concepts instead of the language idiosyncrasies.

It would honestly be best to use whatever programming language that can keep you interested in learning the basics: loops, variables, functions, etc. I was talking to my brother today, and we got to talking about how many newer programmers don't even understand the use for a simple loop.

Once you understand these core concepts, you can almost pick up any language you want; it's mostly just syntactic differences.

Bill McEnaney
Apr 24, 2011, 11:51 PM
Here's a link to an online edition of an astoundingly readable book about Haskell, my favorite programming language (http://learnyouahaskell.com/). Here's one to an online book about Erlang (http://learnyousomeerlang.com/).

firewood
Apr 25, 2011, 01:31 AM
C has been nearly abandoned for over a decade.

Pure nonsense, given that there are 100X more embedded systems than Windows PCs currently running your life, most likely programmed in C. (The typical Windows laptop also has about a half dozen embedded processors in the box keeping it going. Sorry, no C# there.)

MorphingDragon
Apr 25, 2011, 03:34 AM
That or VB.

I think you were meant to say Python. ;)

balamw
Apr 25, 2011, 06:18 AM
It would honestly be best to use whatever programming language that can keep you interested in learning the basics: loops, variables, functions, etc. I was talking to my brother today, and we got to talking about how many newer programmers don't even understand the use for a simple loop.

And, as chown33 pointed out in the first reply, what will keep you interested in learning will depend heavily on what you actually want to do. As several have already pointed out, if you ultimately want to play with embedded processors or want to move on to Objective-C, stick with C.

One of my favorite programming environments is Agilent VEE (http://en.wikipedia.org/wiki/Agilent_VEE). It's kind of like putting together your program with Lego bricks and is focused on the flow of data between the blocks. This allows it to do things like parallel execution of threads, etc., without being explicitly told to do so. It also does away with many loops by dealing with vectors and matrices in single functions.

B

KnightWRX
Apr 25, 2011, 07:13 AM
While you are busy beating your head against the wall trying to figure out basic memory management in C, someone else would be diving into C#, Java or any number of higher level/managed or scripting languages such as PHP, Python or Ruby and actually accomplishing something that will keep them engaged.

I mean, come on, just compare:



char str[15];
strcpy(str, "Hello world!");



To


$str = "Hello World!";

or

String str = "Hello world!";


Wait, what's wrong with :


char str[] = "Hello world!";


Which works the same way and avoids the pitfall you mentioned ? Maybe you should go back to learning C before you try to make examples showing how it's "lesser" than other languages. ;) Do hashes next! :D

C lacks a String type and a hash type. That's hardly a showstopper to learning the language. Also, no one is asking someone to learn all the intricate details of C and of the ANSI C library before jumping to another language. C is a fine language for learning the basics of programming that will carry over to other languages like Java, Objective-C, C++, etc.:

- looping
- conditionals
- blocks
- variable scopes
- data types
- arrays
- functions, their arguments, returns
- pointers and their management

All of those are used in other languages, and learning them in C means you're only syntax away from doing them in other languages just fine. An if is an if; it's just how you place parentheses and define the block under it that changes. And for languages like Java, and heck even Perl, that borrow a lot of syntax from C, it will look almost the same.

What you say makes C hard (the lack of certain complex types) is what makes C the best platform to learn on. Let's say our new guy learns straight in Objective-C, having NSString and NSDictionary to manage all of his string and hashing needs. What happens when he goes to Java ? [string isEqualToString: otherstring]; isn't there. He needs to relearn it all. If instead he went the C route and learned what a string really is to the computer (an array with some kind of terminating character), he knows that the "String" data type is not a basic data type, but usually a complex object provided by the framework. He's not under the impression that it works the same across languages/platforms.
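
That's really all a C string is (a minimal sketch):


#include <stdio.h>

int main(void)
{
    char str[] = "Hi!";   /* really {'H', 'i', '!', '\0'}: four bytes */
    int i;

    /* Walk the array until the terminating character. */
    for (i = 0; str[i] != '\0'; i++)
        printf("str[%d] = '%c'\n", i, str[i]);

    return 0;
}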

Pure nonsense, given that there are 100X more embedded systems than Windows PCs currently running your life, most likely programmed in C. (The typical Windows laptop also has about a half dozen embedded processors in the box keeping it going. Sorry, no C# there.)

Not to mention most system software. On your typical Unix server, almost everything running in the background besides the application is written in C. The clustering software, the RDBMS, the Web server, heck, the kernel and all its drivers... C is still very much alive.

ehoui
Apr 25, 2011, 07:47 AM
Just choose one and start. Either is fine and there is no "right" answer here.

ulbador
Apr 25, 2011, 11:11 AM
Which works the same way and avoids the pitfall you mentioned ? Maybe you should go back to learning C before you try to make examples showing how it's "lesser" than other languages. ;) Do hashes next! :D


Congrats on missing the entire point of the example. Read the next paragraph.



What you say makes C hard (the lack of certain complex types) is what makes C the best platform to learn on. Let's say our new guy learns straight in Objective-C, having NSString and NSDictionary to manage all of his string and hashing needs. What happens when he goes to Java ? [string isEqualToString: otherstring]; isn't there. He needs to relearn it all.



Again, it's not about learning the exact way to do things in this language or that language. You are 100% correct that there is no "isEqualToString" in Java. At the same time, there is a ".equals()" method that accomplishes the same thing. The core concept, an object method that you can call to compare one sequence of characters to another, is EXACTLY the same; there isn't anything to "relearn". As you said, it's just a syntactic difference.

I've trained and watched over dozens of new programmers during the last 12 years of my professional career. The ones that do the best are the ones who can actually see returns on their work immediately instead of fighting with cryptic memory management and often obscure error messages and warnings (not to say Java stack traces are 100 percent clear to a newbie or even anyone).

Edit:

To be fair, I noticed I used strcmp instead of strcpy in the original post. One of the pitfalls of typing way after my bedtime. My apologies for the confusion.

Dr Kevorkian94
Apr 25, 2011, 03:23 PM
Personally, I am going to learn C++, only because that is the only language they teach at my school, but someone told me it was the best place to start. I'm an Apple fanatic as well, and I thought it would be a good base to start with if I ever decide to start with iOS, then go on to Objective-C.

GorillaPaws
Apr 25, 2011, 08:48 PM
Both are excellent options, and the language that's best for you to start with depends on how you learn best. An analogy might be that learning C first is a bit like learning how to do art by first starting with a pencil and paper and learning the fundamentals, slowly adding one color at a time, learning color theory, then progressing to paints, etc. Starting with Python is a bit more analogous to going straight to the acrylic paints and making a big beautiful mess, learning how to make things without necessarily having absorbed all of the nitty-gritty art theory knowledge underpinning the work you're doing.

To be a great artist it's important to learn the theory eventually, but some will never discover the joys of painting if they can't make it through the months of charcoal sketches of fruit bowls, whereas others thrive on a ground-up approach. You will almost certainly want to learn C eventually, but the direction you take comes down to your learning style.

Amerabian
Apr 26, 2011, 03:48 PM
Start with C/C++.

For a start, get one of those Sams Teach Yourself C/C++ books.

They're really good to start with.

Sydde
Apr 26, 2011, 06:52 PM
HyperTalk

;)

balamw
Apr 26, 2011, 06:58 PM
HyperTalk

Where is the iPad's equivalent of HyperCard?

B

jiminaus
Apr 26, 2011, 07:04 PM
Where is the iPad's equivalent of HyperCard?

B

FileMaker Go. ;) (I do speak in jest)

macsmurf
Apr 26, 2011, 10:50 PM
I recommend Java, as always. It lets you focus on the essentials and is relatively easy to learn. There's also a lot of learning material on Java. A lot of colleges use Java in Programming 101. There is a reason for that.

Python would be my second choice, but I think Python would have been very confusing for me as a first language. I have considerable doubt that multi-paradigm languages are optimal learning languages for beginners.

firewood
Apr 27, 2011, 01:06 AM
Where is the iPad's equivalent of HyperCard?



Safari.

HyperCard was a web browser before there was a WWW to browse.

Now you can author "cards" for pretty much everybody/everything in HTML/JavaScript instead.

weerez1214
Apr 27, 2011, 08:28 AM
Hey guys,

I've been wanting to learn to program for a while, but I can't decide which language to learn. I've seen arguments for C and for Python. Most people saying C is a best first language say it teaches you more about programming and computers. Most people saying Python is the best say that it is more powerful and more practical than lower level languages like C.

Learning C as your first ever programming language is not going to be easy unless you already think like a programmer.

Pointers will confuse you, and details like linking and header files will get in the way of actual learning.
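
A small taste of why (a minimal sketch):


#include <stdio.h>

int main(void)
{
    int x = 5;
    int *p = &x;            /* '*' in a declaration: p is a pointer to int */

    *p = 6;                 /* '*' as a dereference: writes through p into x */

    printf("%d\n", x * *p); /* '*' as multiplication beside a dereference: prints 36 */

    return 0;
}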

Start with something like Python, or for a bigger, more complex language, try Java. Java is EXCEEDINGLY easy to learn and has a gorgeous syntax. Not to mention, there are examples galore. Find websites from intro CS classes taught in Java/Python etc. and do the labs.

Here's somewhere to start with Java. If you go through all the labs on this page, you'll be well on your way. http://www.cse.wustl.edu/~cytron/101Pages/f09/