
View Full Version : Java or C? Which is better?




pchipchip
Jun 10, 2012, 04:31 PM
So I am thinking of taking an online course for either Java or C, but I can't decide which one would be more useful. I am eventually going to learn Objective-C, but I heard that is harder to learn as a first language. So out of Java and C, which has the most uses, what are they, and which is easier as a sort of gateway language for learning others?



lee1210
Jun 10, 2012, 05:07 PM
C if Objective-C is next. Both are good, and you should eventually learn some of each, so I'm not going to vote.

-Lee

ender land
Jun 10, 2012, 05:13 PM
Hard to say, I guess. Depends on a million factors.

grapes911
Jun 10, 2012, 05:20 PM
If you learn the concepts of object-oriented programming, then any OOP language is relatively easy to pick up. Java or C, I don't think it matters. Concepts are more important than APIs and syntax.

pmau
Jun 10, 2012, 05:25 PM
C might make it a little hard to learn programming concepts.
Java is a great learning language for OOP.

Some other interpreted languages like Ruby, Python or Perl might give you other ideas about what to do or learn.

Languages are designed for a purpose, there's really no better or worse.
It mostly depends on what you are aiming for.

I prefer compiled languages that result in machine code, but of course Java is used widely in enterprise environments where cross-platform availability is more important than special features.

Keep an open mind and don't dismiss other people's choices of tools to solve their problems.

talmy
Jun 10, 2012, 05:27 PM
C provides a good background in programming basics; it's been called a glorified assembler. Java provides a great introduction to object-oriented programming. I use both: C for embedded programming (microcontrollers) and Java for everything else I can use it for. Also consider: if you are interested only in Mac programming, Objective-C is the obvious choice, and C++ might be a good compromise, especially when combined with Qt.

wrldwzrd89
Jun 10, 2012, 05:28 PM
I'm a die-hard Java fan... but that does not mean that I eschew C. I have used it in the past - the biggest problem with it is that it's not as newbie-friendly as it could be. This is why I throw my support behind D as a C alternative - all the power of C, all the user-friendliness and object-oriented flexibility of Java in one language.

ArtOfWarfare
Jun 10, 2012, 07:59 PM
C

Obj-C is just an extension of C, so any C code can be mixed in with Obj-C. If your goal is ultimately Obj-C, you'll need to learn C sooner or later if you want a comprehensive knowledge of Obj-C... May as well just learn it now.
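
For example (just a sketch), a file of plain C is already valid Objective-C; this compiles unchanged whether you feed it to a C compiler or drop it into a .m file:

#include <stdio.h>

/* Plain C, but also legal Objective-C, since Objective-C accepts all of C. */
static int sum(const int *values, int count)
{
    int total = 0;
    for (int i = 0; i < count; i++) {
        total += values[i];
    }
    return total;
}

int main(void)
{
    int data[] = {1, 2, 3};
    printf("%d\n", sum(data, 3)); /* prints 6 */
    return 0;
}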

Mac_Max
Jun 10, 2012, 08:11 PM
It likely doesn't matter. Find out which instructor is better and decide based on that. Java uses C-style syntax, just like C++, C#, Objective-C, JavaScript, and many others. Your introductory programming class is almost always the same when dealing with strongly, statically typed languages (you'll understand what that means later), and most of the concepts in an Intro to CS/Programming class are universal amongst all languages and computers.

You'll learn:

What types are, e.g. int, char, float.

Arrays and possibly other containers.

A little bit about references (pointers in C & ref type v.s. value type in Java).

File IO... because everyone loves writing to text files.

How to create functions.

All of that is pretty much the same basic idea in C, Java, C++, etc. Your second programming course is where things will diverge.
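
To make that concrete, here is a tiny C sketch touching most of those topics; collect_scores and scores.txt are invented for illustration:

#include <stdio.h>

/* A made-up helper: fills the array and reports how many values it wrote. */
static int collect_scores(int *scores, int max)
{
    for (int i = 0; i < max; i++) {
        scores[i] = i * 10; /* int: one of the basic types */
    }
    return max;
}

int main(void)
{
    int scores[5];                         /* an array */
    int count = collect_scores(scores, 5); /* calling a function */

    FILE *out = fopen("scores.txt", "w");  /* file IO */
    if (out == NULL) {
        return 1;
    }
    for (int i = 0; i < count; i++) {
        fprintf(out, "%d\n", scores[i]);
    }
    fclose(out);
    return 0;
}

The same exercise in Java would swap the pointer-and-length pair for an array object and the FILE* for a writer class, but the flow is identical.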

If your class is pushed harder than your average intro to CS class (or does everything from a 10,000 ft overview) you might end up doing object-oriented programming and simple GUI programming in the Java class.

I can't really speak to what a more advanced C class will do for you, since I started with Java, skipped the basic intro to C course, and went directly into mid- to upper-level C++ & C# classes in school (and read and experimented A LOT at home). My guess would be learning how to do more with pointers, like pointer arithmetic and building custom data structures.

balamw
Jun 10, 2012, 08:15 PM
I am eventually going to learn with Objective C, but I heard that is harder to learn as a first language.

Only you can tell if Objective C is harder to learn as a first language. Kochan's fine book doesn't think so, and Hillegass' book thinks that ~70 pages of C is enough to give you the basics.

If your goal is Objective-C, I still suggest: try Kochan first, fall back to Hillegass, and then and only then switch to C if that isn't working for you.

B

Sydde
Jun 10, 2012, 09:06 PM
It is kind of like comparing grapefruits and pomegranates. C is useful for a broad range of work, though going from Objective-C to procedural C can be a little disorienting if the former is what you cut your teeth on. Objective-C is essential if you want to write for iOS, but Java is what you need for Android. From what I have seen, the verbosity of Objective-C makes code a little easier to understand than Java, so maybe Java is the better place to start: it might be the harder climb, making Cocoa more of a downhill run afterward.

I would say, if at all possible, try to learn both at the same time.

throAU
Jun 10, 2012, 09:43 PM
Even though I hate the language, I voted for Java.

Why? Because it is more general purpose and cross platform. You can do webserver code with it. You can build apps with it. You can build applets with it. Whilst yes, technically you can do web server side stuff with C, security is a very difficult thing to get right.

Also, you are somewhat protected from needing to learn the ins and outs of memory management and pointers.

Daveoc64
Jun 10, 2012, 10:36 PM
Might as well ask which religion is the best next!

Java for me though!

Sydde
Jun 10, 2012, 11:24 PM
Might as well ask which religion is the best next!

You mean, like, go over to a bike forum and start a discussion on helmets?

larswik
Jun 10, 2012, 11:36 PM
If you are just starting out I would start with C. Both Java and Objective-C are built from C.

You can figure out the tools pretty easily. The hard part is knowing how to use them to solve problems. I could put a tool chest in front of you, and you'd know what the wrenches are and what they do. But you have to use those tools to put together a car engine. That is the part that takes time to get used to, IMO.

tomozj
Jun 11, 2012, 02:21 PM
In the last year I've learnt Java and am now learning C, and I'm glad of the order I did it in. Java will teach you important OOP concepts (remember, C isn't object oriented!) and improve your general programming skills. Java is much easier to work with and debug than C, which I'm finding extremely hard. You don't want to spend a long time hunting a really small bug when you're learning to program.

Don't stress too much about learning a lot of C before diving in. I coded an app and published it to the App Store before learning either Java or C, just from searching the web and from my experience with web scripting languages. In hindsight, I struggled because I didn't know programming concepts (such as OOP), not because I didn't know C specifically.

Get Eclipse (Java IDE, works on most platforms) and go from there. The IDE will help you write code (autocompletion, giving you errors and warnings as you write code) and will make debugging your code really easy. Good luck!

surma884
Jun 12, 2012, 09:35 AM
In C there is something called a "pointer". In C you also have to allocate and deallocate memory yourself (memory management). Java does that for you, and you also don't have pointers. I like both C and Java; you should learn both. Also, you should learn on your own first and then take the online course. That way it will be easier.
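
Here is a minimal sketch of what that looks like in C (error handling trimmed to the essentials):

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    int *numbers = malloc(10 * sizeof *numbers); /* allocate memory yourself */
    if (numbers == NULL) {
        return 1; /* allocation can fail, and you have to check */
    }
    numbers[3] = 42;                /* a pointer indexed like an array */
    printf("%d\n", *(numbers + 3)); /* the same element via pointer arithmetic */
    free(numbers);                  /* and deallocate it yourself */
    return 0;
}

In Java the equivalent array would simply be garbage-collected once nothing references it.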

Once you learn C and Java you will know the syntax and be able to program in C#, C++, and any other language with C-based syntax.

Setmose
Jun 12, 2012, 12:11 PM
Are you kidding? Java is for GUI wimps. There's a reason that operating systems are written in C and C++. :apple:

hchung
Jun 12, 2012, 02:15 PM
Like Mac_Max said above, the quality of instructor at a school will likely matter more than your choice of language.

If you're planning on working hard at it and becoming a good developer, start with C. You're better off knowing how the computer interprets your code.
Once you're accustomed to how the computer works, then build on top of that by adding objects.

If you're planning on being last decade's equivalent of a web "programmer", then go dive right into Java as a start.

Note: I'm not saying Java's bad. I'm saying, if you don't have an idea of what you're actually telling the CPU to do, you'll always be second rate.

If you've learned Java (or VB or any scripting language), feel comfortable with it, and are looking to decide what to do next, then do yourself a favor and go back to learn C. If you become comfortable with C, you'll be a better engineer even if you don't have to use it.

ytk
Jun 12, 2012, 03:50 PM
all the user-friendliness… of Java

That's like saying all of the trustworthiness of a used-car salesman.

To the OP: Learn Ruby. You'll get much better with it much more quickly, you'll be able to use it for far more than C or Java (particularly on the Mac), and most importantly, you'll actually enjoy using it way more than C or Java. And when you get the basics down in Ruby, you can go off and learn other languages. And as you learn the "features" of other languages, your reaction will likely be something along the lines of, "Oh, I get it. That's just the hard way of doing something that's straightforward in Ruby!"

That said, there's a value to learning C, in that there's a value in knowing how to take apart an engine and put it back together. But you're best off learning how to drive before you tackle that task, and guess what—C makes you build the car before you can actually start driving. Java's a bit better, in that it's more like assembling a car from a kit, but the downside to that is that you're still just following directions because "that's the way it's done", but you won't have a clue why it's done that way for a very long time, if ever.

Ruby, on the other hand, puts you in the driver's seat from the get-go. And as you learn the language, you'll realize you have way more control over the car than you knew at first, and you've actually learned way more in the process than you would have with C or Java.

wrldwzrd89
Jun 12, 2012, 04:42 PM
That's like saying all of the trustworthiness of a used-car salesman.

To the OP: Learn Ruby. You'll get much better with it much more quickly, you'll be able to use it for far more than C or Java (particularly on the Mac), and most importantly, you'll actually enjoy using it way more than C or Java. And when you get the basics down in Ruby, you can go off and learn other languages. And as you learn the "features" of other languages, your reaction will likely be something along the lines of, "Oh, I get it. That's just the hard way of doing something that's straightforward in Ruby!"

That said, there's a value to learning C, in that there's a value in knowing how to take apart an engine and put it back together. But you're best off learning how to drive before you tackle that task, and guess what—C makes you build the car before you can actually start driving. Java's a bit better, in that it's more like assembling a car from a kit, but the downside to that is that you're still just following directions because "that's the way it's done", but you won't have a clue why it's done that way for a very long time, if ever.

Ruby, on the other hand, puts you in the driver's seat from the get-go. And as you learn the language, you'll realize you have way more control over the car than you knew at first, and you've actually learned way more in the process than you would have with C or Java.
Thanks, I'll have to add Ruby to my list of languages to learn then ;)

ytk
Jun 12, 2012, 07:31 PM
Thanks, I'll have to add Ruby to my list of languages to learn then ;)

Do yourself a favor and move it to the top of the list. ;)

Seriously, you'll never want to write
for(int i=0; i<30; i++) {
...
}
/* Does this execute 29, 30, or 31 times? */

once you get used to just writing
30.times do
...
end

In fact, Ruby has no ++ or -- operators, and I've never missed them (+= still works). That may seem insignificant, but it actually says a lot about the way Ruby handles control structures. Having used a variety of languages over the years, starting with BASIC and going through C, C++, Perl, Objective-C, and Java (with a couple of other more obscure ones thrown in here and there), I can honestly say that Ruby is the one language that fundamentally changed the way I think about programming more than any other.

As I've said above, C has its uses, and having at least a working understanding of it is a must for any serious programmer. For that matter, if you want to write software for the Mac, it pays to at least be able to understand Objective-C (although you can write full-fledged Cocoa applications on the Mac in Ruby, with probably 95% or more of the functionality of Objective-C available to you and the addition of all of the cool Ruby features). But I'd gladly toss all of the rest of the languages I've learned in the dustbin and never touch them again in favor of Ruby.

Sorry if I sound like a raving Ruby fanatic, but I'm just trying to convey how truly awesome this language is. Granted, I'm largely a hobbyist programmer (although I do a fair amount of programming in my job, it's not a part of my job—it's mostly automation of tasks that would otherwise have to be done manually). But after four years of using Ruby, I still regularly have "Oh, cool!" moments when using it, and I can't say that about any of the other languages I've ever used.

Kenndac
Jun 12, 2012, 11:36 PM
for(int i=0; i<30; i++) {
...
}
/* Does this execute 29, 30, or 31 times? */


Easy — 30. 0-based math isn't hard.

Anyway. If you're moving to Objective-C, C is best. It's a lot harder than Java, especially at the beginning, and isn't object oriented like Java is, but Objective-C is a strict superset of C and you'll be using C stuff all the time in it.

sigma8
Jun 13, 2012, 11:47 AM
I'd start with C. And I mean plain old C. If you're truly a learner, it makes sense to focus on syntax and basic concepts first--without all the OO stuff. I would almost say C, Java, then Objective-C. And actually, you could probably do better to replace Java with Ruby. The problem with Java is that it's a very heavy language. It's very secure and goes to great lengths to improve maintainability, largely by increasing verbosity and process. Those would help you a lot if you were writing an enterprise-level accounting module, but I think they would only slow and obscure your learning of concepts.

Plus, both Java and Objective-C get stuff done largely through frameworks, which each have their own learning curves. C and Ruby (without Rails) aren't quite as framework-focused. If your goal is to end up in Obj-C only, it would save some sanity and unlearning if you skipped Java's framework and OO peculiarities and just focused on Objective-C. Consider that learning Java practically requires you to learn Eclipse, its ubiquitous and substantial IDE. That knowledge will be mostly thrown out the door when you have to learn Xcode.

Totally huge "on the other hand": if you want to position yourself to do both iOS and Android development, you really need to learn Java. So if there is a well taught course, I'd take it. Java is certainly worth learning, but mostly on its own merits. I wouldn't rate it highly as a starter language.

splitpea
Jun 13, 2012, 12:22 PM
They're both better for different things. For instance, it's easier to build cross-platform software in Java, but for embedded device firmware you get far better performance from C.

In terms of learning, C (and I really mean C, not C++ or Objective-C), as a comparatively low-level language, will require you to learn a lot about how your computer interprets programs, how primitive data types are represented and processed, how memory is managed, what a pointer really is, etc.
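
A few lines of C make "what a pointer really is" concrete (just a sketch):

#include <stdio.h>

int main(void)
{
    int x = 7;
    int *p = &x;               /* p holds the memory address of x */
    printf("%p\n", (void *)p); /* the address itself */
    printf("%d\n", *p);        /* dereferencing: the value stored there, 7 */
    *p = 8;                    /* writing through the pointer changes x */
    printf("%d\n", x);         /* prints 8 */
    return 0;
}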

All of that will serve you well when you work in higher level languages, from Java to Objective-C to scripting languages like Javascript or PHP.

All that low-level stuff can mean it's more challenging to learn, though, as in addition to learning to debug basic control-flow logic (conditionals, loops, function calls), you'll also have to juggle memory management and data types.

Honestly, I'd recommend learning procedural control flow first with a dynamically-typed scripting language that handles your memory management for you (Javascript, PHP, Python, Ruby, etc).

After that, C to understand the underlying fundamentals (or even a toy assembler if you want to dig deeper); then an object-oriented language (whether that's Java, Objective-C, or going back to Ruby or Python) to learn OOP. Functional programming is another paradigm worth learning for the heck of it (Lisp is the ultimate functional language, but functional techniques are also among the most common idioms in Javascript, among others).

sigma8
Jun 13, 2012, 02:12 PM
Referring back to the fact that the choice you make may be determined by supply (only Java or C classes offered) versus demand, I'd still say C, all things being equal. However, like someone else said earlier, if the instructor for Java is clearly superior, go for that. In a classroom setting, the instructor is everything.

If the hands-down best instructor is teaching COBOL, stick a fork in your hand and sign up for that (don't worry, this won't happen).

r.harris1
Jun 13, 2012, 04:33 PM
I'm a huge fan of both, and others. I started in C lo these many years ago and then picked up C++, Java, Objective C, etc. What worked for me was learning one language really well (in my case C) and this in turn made moving between other languages a piece of cake (well sort of :D).

C is great, as others have pointed out, because you're forced to get into the nitty-gritty of memory management, file and string manipulation, and other components that get hidden behind the higher-level frameworks available in, for instance, Java. You'll always come out on top if you have a grasp of the "magic" that goes on behind ARC, garbage collection, and the higher-level string packages and libraries.

ytk
Jun 13, 2012, 04:50 PM
Easy — 30. 0-based math isn't hard.

Sure, but you had to think about it. And if you had accidentally typed i<=30 instead of i<30, you're now executing 31 times. On the other hand, a Ruby statement like "30.times" will execute, well, 30 times.

This is a trivial example, though. Let's say you have a function that collects sensor data and stuffs it into an array of indeterminate length. After collecting the data, you then want to print out the results, one per line.

In C, your code would look something like this:

float data_array[ARRAY_SIZE]; /* The array that will hold our sensor data */
int sensor_measurements; /* Holds the return value of collect_sensor_data(), which indicates how many measurements were collected */
int i;

sensor_measurements = collect_sensor_data(data_array, ARRAY_SIZE); /* collect_sensor_data() takes the maximum number of measurements so we don't overrun our buffer */
for (i = 0; i < sensor_measurements; i++) {
    printf("Measurement %i: %f\n", i + 1, data_array[i]);
}

The equivalent code in Ruby would be:

sensor_data().each_with_index{ |value, index| puts "Measurement #{index + 1}: #{value}" }

The Ruby code is not only simpler and more compact, it's far easier to conceptualize. Why should a beginner need to know what a pointer is to complete a task like this, let alone understand how pointers and arrays are related in C? Making an absolute beginner learn stuff like this right off the bat is like teaching someone to fly on a 747. As splitpea said, you're best off learning the basics like flow control with a high-level language where you don't have to worry about managing memory, buffer overruns, pointers, and so on. There's time to learn about those things later, when you're ready to find out what's actually going on under the hood.

However, I don't think there's a problem with learning procedural programming, then object-oriented programming, and THEN digging down into the nitty gritty of the hardware. Many OOP languages so heavily abstract the underlying system that you don't really need to know anything about memory management and such in order to grasp the concepts. That said, I don't think you'll ever really understand object-orientation without a grounding in the fundamentals of memory management and so on. A good programmer will need to know all of it, of course. But if you throw a beginner headlong into C right off the bat, that person may well end up not a programmer at all.

hchung
Jun 14, 2012, 01:06 PM
sensor_data().each_with_index{ |value, index| puts "Measurement #{index + 1}: #{value}" }

The Ruby code is not only simpler and more compact, it's far easier to conceptualize. Why should a beginner need to know what a pointer is to complete a task like this, let alone understand how pointers and arrays are related in C? Making an absolute beginner learn stuff like this right off the bat is like teaching someone to fly on a 747.

I'm sorry, but I'd disagree. Telling somebody to learn Ruby instead of C is like telling somebody to learn to fly a 747 instead of a Cessna. That code hides so much complexity that you're not able to conceptualize how to fly (understand your code), but rather tweak knobs on a fancy control panel (call opaque APIs and hope for the best).

From reading the Ruby code, I understand what that is supposed to do, but I don't have a clue what it actually does.

That's my main concern. There are two major goals that people desire:
1) engineering for production
2) rapid prototyping

They're both good things. But they're opposed to each other. One's goal is to make something that's architecturally sound and designed to be actually used by a massive audience. The other is to get something to the point of working to prove a concept.

Ruby, JS, Python, etc. are biased towards the rapid prototyping camp because it's so much easier to get something up and running. But it's so much harder to get something performant.

In C, you can see exactly what's going on because it's practically a human readable assembler. It'll take you a lot longer to build something large in C, but dive right into your debugger and profiler and you'll know what's going on whenever you want.

When you're starting out, you need to know the basics.

As an aside:
How many people here know what Cocoa Bindings are? How many people here have used them? It's a perfect example of how something that sounds conceptually awesome and should make life easier for the programmer can turn into absolute hell once you need to debug it.

TwinMonkeys
Jun 14, 2012, 04:35 PM
C or Java?

The answer depends on what you're trying to accomplish. Both can be used to create quality, large scale systems.

If you're interested in Mac Programming, C would be better.

If you're interested in getting a programming job with a big company, Java would be better most likely.

But it really depends on your goal.

ytk
Jun 14, 2012, 09:11 PM
I'm sorry, but I'd disagree. Telling somebody to learn Ruby instead of C is like telling somebody to learn to fly a 747 instead of a Cessna. That code hides so much complexity that you're not able to conceptualize how to fly (understand your code), but rather tweak knobs on a fancy control panel (call opaque APIs and hope for the best).

You're not hoping for the best. You're telling the computer what you want it to do, and it's doing it. These are the fundamentals of programming. With a higher-level language, you don't need to be as explicit about each step. Which is a Good Thing, because you're not trying to learn to think like a computer. You're trying to learn to think like a programmer first. A good programmer can easily learn to think like a computer. It's very hard to teach a computer to think like a programmer.

From reading the Ruby code, I understand what that is supposed to do, but I don't have a clue what it actually does.

That's nonsense. You know exactly what the code does from reading it (assuming you understand the syntax of the language, of course). You may not be aware exactly how the computer is carrying your instructions out, but so what? By that logic, you can't learn C until you've taken sufficient courses in physics, electronics, and microprocessor design to understand exactly what's happening inside the wiring.

That's my main concern. There are two major goals that people desire:
1) engineering for production
2) rapid prototyping

They're both good things. But they're opposed to each other. One's goal is to make something that's architecturally sound and designed to be actually used by a massive audience. The other is to get something to the point of working to prove a concept.

Ruby, JS, Python, etc. are biased towards the rapid prototyping camp because it's so much easier to get something up and running. But it's so much harder to get something performant.


This argument has been done to death, and it's obviously disproved by the large number of high-traffic websites out there running code based on Ruby, Javascript, Python, PHP, and so on. How many sites out there are running on C?

Performance for the sake of performance is a pointless goal, given that processors are fast enough now that we can legitimately place more value on the programmer's time and effort than the computer's processing capability.

In C, you can see exactly what's going on because it's practically a human readable assembler. It'll take you a lot longer to build something large in C, but dive right into your debugger and profiler and you'll know what's going on whenever you want.

If your code isn't working the way you intend it to, you've made a mistake. Find it and fix it. That's true in any language. In C, it's just a lot easier to make mistakes, because there are more mistakes to make. The only thing you're learning is what hoops you have to jump through. I guarantee you that if your code doesn't work as intended, it's not Ruby's fault. It's yours.

And performance is a non-issue for people learning to program. Most of the time, it's a non-issue period. In the event that it becomes crucial for some reason, well, that's the time to optimize your code, or drop down into a lower level programming language like C.

In fact, there's an argument to be made that you're better off learning to program on a language that offers less in terms of raw performance. Let's say you write some code that's horribly inefficient for some reason. If you write that code in C, given how fast processors are nowadays you may not even realize it. Write it in a high-level language like Ruby or Python, though, and it might start to drag. Guess what? That's a great opportunity to learn to optimize your code! The novice C programmer, however, is likely to continue blithely onward, remaining blissfully unaware that he's just picked up a horrible programming technique. Just because you can make a bubble sort go really really fast in C doesn't make it an efficient sorting algorithm.

When you're starting out, you need to know the basics.

You're confused about what constitutes "the basics". Flow control, variables, functions, logic—these are the basics. Debugging a buffer overrun and casting pointers is the advanced stuff. Would you forbid people from driving until they can repair an engine?

r.harris1
Jun 14, 2012, 09:42 PM
You're confused about what constitutes "the basics". Flow control, variables, functions, logic—these are the basics. Debugging a buffer overrun and casting pointers is the advanced stuff. Would you forbid people from driving until they can repair an engine?

I sort of agree - I'm OK with people learning something like Ruby or Python to get their feet wet with flow control, etc. But get them into a lower level language class sooner rather than later. Interestingly, I've found over the years that folks I've worked with or hired who have a solid grounding in a lower-level language are well placed to pick up the Rubies and Pythons of the world but less so the other way around. They can do it, just seems more of a struggle. And the lessons learned in debugging something like C play well in troubleshooting really complex issues in large systems they may one day help write.

Just my observations and 2 cents.

chown33
Jun 14, 2012, 09:43 PM
How many sites out there are running on C?

At a minimum, everything running Apache web server (http://en.wikipedia.org/wiki/Apache_HTTP_Server) or Nginx (http://en.wikipedia.org/wiki/Nginx), both written in C. This seems to be a solid majority (http://en.wikipedia.org/wiki/Web_server#Market_share).

None of which really means anything one way or the other about the rest of your arguments. Mainly because most beginners don't start by designing and coding the backend of a website.

softwareguy256
Jun 14, 2012, 10:45 PM
C++ is by far the most powerful language, with proven concepts decades old. Once a sufficient level has been reached it is virtually IMPOSSIBLE to write bad C++ code. Now, people who can't understand things like the fencepost concept are just low-quality programmers and cannot be trusted to work on mission-critical software (this is software where the programmers are paid well and treated like rockstars). Hence you have many dumbed-down languages where developers work in cubes and are essentially commodity resources.

Java has its purposes, mainly because of its VM feature. Other than that, it is a horrid language with many, many annoying restrictions. Its non-deterministic running time makes it a no-go for anything mission-critical with a time component.

softwareguy256
Jun 14, 2012, 11:01 PM
Anyone who can't write a for loop shouldn't be working as a programmer. There needs to be a minimum level of competence. Anyway, when that 128-byte-cache-line Haswell comes out, SOMEONE is going to have to write the optimized code... very likely me.


Sure, but you had to think about it. And if you had accidentally typed i<=30 instead of i<30, you're now executing 31 times. On the other hand, a Ruby statement like "30.times" will execute, well, 30 times.

This is a trivial example, though. Let's say you have a function that collects sensor data and stuffs it into an array of indeterminate length. After collecting the data, you then want to print out the results, one per line.

In C, your code would look something like this:

float data_array[ARRAY_SIZE] ; /* The array that will hold our sensor data */
int sensor_measurements ; /* Holds the return value of collect_sensor_data(), which indicates how many measurements were collected */
int i;
sensor_measurements = collect_sensor_data(data_array, ARRAY_SIZE) ; /* collect_sensor_data() takes the maximum number of measurements so we don't overrun our buffer */
for(i=0 ; i < sensor_measurements; i++) {
printf("Measurement %i: %f\n", i+1, data_array[i]);
}

The equivalent code in Ruby would be:

sensor_data().each_with_index{ |value, index| puts "Measurement #{index + 1}: #{value}" }

The Ruby code is not only simpler and more compact, it's far easier to conceptualize. Why should a beginner need to know what a pointer is to complete a task like this, let alone understand how pointers and arrays are related in C? Making an absolute beginner learn stuff like this right off the bat is like teaching someone to fly on a 747. As splitpea said, you're best off learning the basics like flow control with a high-level language where you don't have to worry about managing memory, buffer overruns, pointers, and so on. There's time to learn about those things later, when you're ready to find out what's actually going on under the hood.

However, I don't think there's a problem with learning procedural programming, then object-oriented programming, and THEN digging down into the nitty gritty of the hardware. Many OOP languages so heavily abstract the underlying system that you don't really need to know anything about memory management and such in order to grasp the concepts. That said, I don't think you'll ever really understand object-orientation without a grounding in the fundamentals of memory management and so on. A good programmer will need to know all of it, of course. But if you throw a beginner headlong into C right off the bat, that person may well end up not a programmer at all.

Kenndac
Jun 14, 2012, 11:24 PM
Sure, but you had to think about it. And if you had accidentally typed i<=30 instead of i<30, you're now executing 31 times. On the other hand, a Ruby statement like "30.times" will execute, well, 30 times.

No I didn't, because I'm a good programmer. ;)

Look, I like Ruby as much as the next guy and use it quite frequently, but the point you're making here isn't really a good one.

Sure, if I typed i<=30 instead of i<30, the loop will run 31 times. In Ruby, if you type "31.times" instead of "30.times", the loop will run 31 times. Both are 1-character typos, and to me, both are as obvious as each other. Just because you're not fluent enough at C to be able to read and write it without thinking doesn't suddenly mean Ruby is better.

lee1210
Jun 14, 2012, 11:32 PM
The Internet: Where the answer to "Should I take a C class or Java class?" is an argument about Ruby.

-Lee

Sydde
Jun 14, 2012, 11:40 PM
The Internet: Where the answer to "Should I take a C class or Java class?" is an argument about Ruby.

-Lee

Everyone has an opinion and everyone has a cowboy hat.

lloyddean
Jun 14, 2012, 11:48 PM
Damn! Has anyone seen my cowboy hat?

macsmurf
Jun 15, 2012, 03:06 AM
I'm going to restrict myself to which is the better learning language. Asking which language is better overall is like asking whether a hammer is better than a screwdriver. People who don't understand this are not to be trusted.

I would say that Java is easier to learn, mainly because it protects you from "strange" errors. That is, programming errors will give you more sensible error messages than in C. Errors in C can sometimes lead to weird behavior because you're operating fairly close to the OS, as opposed to Java.

Some people seem to think that starting with C will put hair on your chest. I happen to think one should go with the stuff that is easiest to learn first, and once you have a solid foundation, you can move on to other languages. When teaching programming, most universities start you out with Java, which seems to support that notion.

I think my progression was something like Pascal -> C -> Assembler -> Python -> Java -> Perl (Yikes!) -> Ruby -> Scala (plus messing around in a handful of other languages). That means I went from fairly high-level to very low-level and then back up. I feel that is an excellent way to learn.

BTW, some people recommend Python or Ruby as a first programming language. It depends on how you learn. I personally feel that these are difficult languages to learn (properly) because of their multiparadigm nature. You are not expected to understand the previous sentence :)

If you ask me what is my favorite language I would say Scala currently. Would I recommend it as a first programming language? No.

ytk
Jun 15, 2012, 03:10 AM
No I didn't, because I'm a good programmer. ;)

So, you've never made a fencepost error then? Not even once?

Sure, if I typed i<=30 instead of i<30, the loop will run 31 times. In Ruby, if you type "31.times" instead of "30.times", the loop will run 31 times. Both are 1-character typos, and to me, both are as obvious as each other. Just because you're not fluent enough at C to be able to read and write it without thinking doesn't suddenly mean Ruby is better.

Typing 31 instead of 30 would stick out like a sore thumb, and would be noticed immediately. Typing <= instead of < is something you might do without even realizing it, and the mistake might not be caught until it causes a problem, if even then. Or say you were iterating over a subsection of an array, from j to k. So you'd type "for(i=j; i<=k; i++)". But later, you say to yourself, "Aha, what if k exceeds the size of the array?" Well, the array size is set via #define. So just prior to your for loop, you insert "if(k>ARRAY_SIZE) k = ARRAY_SIZE;". Oops: the last valid index is ARRAY_SIZE - 1, so you've just traded one overrun for another.

Think it never happens? Think again. The fact is that this sort of error is common enough that there's an entire Wikipedia article dedicated to it. It doesn't matter how fluent you are at C; if you think you're immune to making such errors, you're not a good programmer, just an arrogant one.

macsmurf
Jun 15, 2012, 04:55 AM
Think it never happens? Think again. The fact is that this sort of error is common enough that there's an entire Wikipedia article dedicated to it. It doesn't matter how fluent you are at C; if you think you're immune to making such errors, you're not a good programmer, just an arrogant one.

I'm sorry. I can't resist.

I agree that off-by-one errors happen even to experienced programmers, which is why the foreach construct exists in many languages. The basic "iterating through a list" for loop is, however, not that difficult once you've done it 1000 times.

I don't think one example, however elegant, is enough to judge a language as a good learning language. You need to take a more general look.

On that note: In Ruby you have lambda expressions in the form of blocks. You also have functions in the form of Procs, which are pretty much the same thing, although Procs are actually objects. Then you have methods, which are the OOP approach, but then you can have methods within methods which, while extremely useful, sort of breaks the analogy. So at least four interrelated concepts for "a chunk of code you can execute repeatedly". Also, you get closures, which are IMHO a difficult concept to grasp the first time around.

In Ruby you have different methods for the same thing and for almost the same thing. What exactly is the difference between map and collect? There isn't one. However, what about the difference between each and map? There is a significant difference. Lots of sugar can also obscure what is going on, and when is sugar actually stuff with different behaviour that just looks like sugar? What about truly magic stuff like the no-such-method trick? If you look at other people's code (and you should) you're bound to run into it.

In Ruby you have the functional, side-effect-free programming concept mixed with OOP programming, which promotes the idea of mutable objects. By the way, you also have Perl-like implicit variables, if that's your thing. A beginner would have no idea when to use one and when to use the other.

Powerful, fun, and sometimes elegant? Yes. My first language of choice? No. It depends on how you learn, but I think Ruby would have confused me a great deal.

Finally, a pet peeve of mine: In your example you use sensor_data()... implying that this is a function, but since no-args parentheses are optional in Ruby you could just as easily have written sensor_data... which, to the untrained eye, looks like a variable. This is confusing and can lead to weird errors.

cytomatrix
Jun 15, 2012, 06:19 AM
HTML is better than both.

ytk
Jun 15, 2012, 07:06 AM
I agree that off-by-one errors happen even to experienced programmers, which is why the foreach construct exists in many languages. The basic "iterating through a list" for loop is, however, not that difficult once you've done it 1000 times.

Of course it's not that difficult. It's a trivial example to demonstrate the point.

On that note: In Ruby you have lambda expressions in the form of blocks. You also have functions in the form of Procs, which are pretty much the same thing, although Procs are actually objects.

Blocks are objects. Specifically, they're Procs. A Proc can also be explicitly generated and passed around, but you can pass a block to a method and it'll show up as a Proc.

Then you have methods, which are the OOP approach, but then you can have methods within methods which, while extremely useful, sort of breaks the analogy.

Why? Methods are objects just like everything else in Ruby. Ergo, they can contain other objects. It's actually quite elegant in its simplicity.

So at least four interrelated concepts for "a chunk of code you can execute repeatedly". Also, you get closures, which are IMHO a difficult concept to grasp the first time around.

I only count two: Methods and Procs. Methods are tied to an object, Procs are not.

In Ruby you have different methods for the same thing and for almost the same thing. What exactly is the difference between map and collect? There isn't one.

Yes, it's called aliasing. It's very clear from the documentation that collect and map are synonyms. I'm not sure what the big deal is here.

However, what about the difference between each and map? There is a significant difference.

Okay, so? Again, yes, some methods have aliases, and some methods actually do different things. A quick glance at the documentation will clear this up if you're confused.

Lots of sugar can also obscure what is going on, and when is sugar actually stuff with different behaviour that just looks like sugar? What about truly magic stuff like the no-such-method trick? If you look at other people's code (and you should) you're bound to run into it.

There's nothing magic about method_missing. It works exactly like you'd expect in an object oriented language. Again, if you're confused about how a specific language function works, the documentation is all right there. I guarantee you I could explain how method_missing works to a new programmer more easily than I could explain pointers.

In Ruby you have the functional, side-effect-free programming concept mixed with OOP programming, which promotes the idea of mutable objects. By the way, you also have Perl-like implicit variables, if that's your thing. A beginner would have no idea when to use one and when to use the other.

Unless the beginner were to, say, read a tutorial maybe?

Finally, a pet peeve of mine: In your example you use sensor_data()... implying that this is a function, but since no-args parentheses are optional in Ruby you could just as easily have written sensor_data... which, to the untrained eye, looks like a variable. This is confusing and can lead to weird errors.

That's a feature, not a bug, and it's a very cool one once you understand it. Yes, in this case I added the parentheses to indicate it was a method for this specific example (I didn't want anyone thinking I was cheating and just accessing a variable rather than calling a function), but I'd normally leave them out. The reason for this is that it doesn't matter whether sensor_data is a variable or a method. This is quite powerful once you grok it.

Consider a Price object that can store and retrieve a value in dollars or euros. Rather than having a variable called "dollars" and methods called "dollars()", "euros()", "set_dollars(x)", and "set_euros(x)", you can just have a variable called "dollars" and methods called "euros" and "euros=(x)". You have the euros method convert the dollars value to euros and return it, and the euros= method convert the value to dollars and store it. Now, users of your class can treat dollars and euros as "virtual variables" that are simply different abstractions of the same thing. And what's more, if you later change your class to be euro-centric (or even pound-centric) instead of dollar-centric, you don't need to modify any of the code that refers to your class.

Don't get me wrong. Ruby can and will give you more than enough rope to hang yourself. That's a side effect of being powerful and useful. But even though it's a fairly complicated language under the surface, from the standpoint of a novice you can write very clear and effective procedural code, and understand what exactly it is you're doing from a strictly logical viewpoint. You can also learn the fundamentals of OOP in a language that's designed around the concepts. In my opinion, those are far more important things to learn than pointer arithmetic and making sure you null-terminate your strings. And a new programmer who is seeing results more quickly because he's not trying to figure out why his program is spitting out garbage for some inexplicable reason is more likely to stick with it.

macsmurf
Jun 15, 2012, 08:08 AM
First of all: Did you learn Ruby as your first programming language? Just wondering.

Of course it's not that difficult. It's a trivial example to demonstrate the point.


Except that a trivial example doesn't actually demonstrate the point. But I get it :)


Blocks are objects. Specifically, they're Procs. A Proc can also be explicitly generated and passed around, but you can pass a block to a method and it'll show up as a Proc.

Why? Methods are objects just like everything else in Ruby. Ergo, they can contain other objects.


If you don't know what an object is then that is difficult to grasp: "An object is a structure that has state and behavior. Except that in Ruby the behavior is actually objects. Which is also the case with the state. So what exactly is the difference between state and behavior? Well," ... etc.

Compare that with Java: "An object is a structure that has state and behavior". Done. No fancy stuff. My experience is that beginners have difficulties distinguishing between classes and objects. That should be the focus in the beginning.


I only count two: Methods and Procs. Methods are tied to an object, Procs are not.


Conceptually there are more than two, at least in the eyes of the beginner.


Yes, it's called aliasing. It's very clear from the documentation that collect and map are synonyms. I'm not sure what the big deal is here.


The problem is bloat. There is simply no good reason to have more than one method that does the same thing as another method. In my opinion there is no good reason for each to exist, or each_with_index. What if you really needed a map_with_index? Well, it's not there. Actually, what you probably really needed was a zip_with_index, since that would be usable for both each and map. It's not there either.


Okay, so? Again, yes, some methods have aliases, and some methods actually do different things. A quick glance at the documentation will clear this up if you're confused.


But will the beginner realize why the difference is significant? In fact, a beginner who says "I'm confused. Why is that method even necessary? Why not just use map?" would be especially promising.


There's nothing magic about method_missing. It works exactly like you'd expect in an object oriented language. Again, if you're confused about how a specific language function works, the documentation is all right there. I guarantee you I could explain how method_missing works to a new programmer more easily than I could explain pointers.


I'm not confused. I'm saying that I believe a beginner would be confused and the documentation, although very good in Ruby, is not necessarily the best teaching tool.


Unless the beginner were to, say, read a tutorial maybe?


No. Are you going to explain to a beginner the difference between declarative and imperative style, what programming paradigm promotes what style, and when you should use one thing over the other? Having two ways of doing the same thing leads to confusion. Note that I'm not arguing against having two ways of doing the same thing; I'm arguing against having it in a teaching language. What about implicit variables? I would argue that they are mostly useful in the case of regular expressions but should generally be avoided in most other situations.


That's a feature, not a bug, and it's a very cool one once you understand it. Yes, in this case I added the parentheses to indicate it was a method for this specific example (I didn't want anyone thinking I was cheating and just accessing a variable rather than calling a function), but I'd normally leave them out. The reason for this is that it doesn't matter whether sensor_data is a variable or a method. This is quite powerful once you grok it.

Consider a Price object that can store and retrieve a value in dollars or euros. Rather than having a variable called "dollars" and methods called "dollars()", "euros()", "set_dollars(x)", and "set_euros(x)", you can just have a variable called "dollars" and methods called "euros" and "euros=(x)". You have the euros method convert the dollars value to euros and return it, and the euros= method convert the value to dollars and store it. Now, users of your class can treat dollars and euros as "virtual variables" that are simply different abstractions of the same thing. And what's more, if you later change your class to be euro-centric (or even pound-centric) instead of dollar-centric, you don't need to modify any of the code that refers to your class.


So what you're saying is that in the case of properties, it is a feature, but I would argue that it's confusing in other respects.


Don't get me wrong. Ruby can and will give you more than enough rope to hang yourself. That's a side effect of being powerful and useful. But even though it's a fairly complicated language under the surface, from the standpoint of a novice you can write very clear and effective procedural code, and understand what exactly it is you're doing from a strictly logical viewpoint. You can also learn the fundamentals of OOP in a language that's designed around the concepts. In my opinion, those are far more important things to learn than pointer arithmetic and making sure you null-terminate your strings. And a new programmer who is seeing results more quickly because he's not trying to figure out why his program is spitting out garbage for some inexplicable reason is more likely to stick with it.

I think a simple learning language is better than a complicated learning language that, as you say, gives you more than enough rope to hang yourself with. I don't consider C the best alternative. Java is a rather verbose OO language that forces you to realize what is actually going on at the language level. Not a lot of features. That's unfortunate for the rest of us, and I would personally choose Ruby over Java any day, all else being equal, but I'm not a beginner.

Sydde
Jun 15, 2012, 04:11 PM
If you ask me what is my favorite language I would say Scala currently. Would I recommend it as a first programming language? No.

Well, my favorite language is AL, but for most applications, it is highly impractical these days, and not really even faster than what you get with a good compiler. Perhaps a good place to start might be classic MS Basic, complete with numbered lines and all. Teach about a month of that, then move on to a more modern language, at which point the student will find the change in structure and syntax tremendously empowering. Kind of like holding back on the reins at first, then letting the horse run.

sicn
Jun 16, 2012, 07:16 AM
It depends largely on what type of applications you want to write. Server-side business applications are more often than not written in Java (and Java EE).

However, for you to better understand the basics of what is going on in your computer when your code runs, I would say C (or maybe even assembler ;-)). You will learn what a piece of text actually looks like in memory, how you create and dispose of used memory, et cetera.
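
For instance, a few lines of C will show you exactly what a piece of text looks like in memory, byte by byte (just a sketch):

#include <stdio.h>
#include <string.h>

int main(void)
{
    const char *text = "Hi!";
    /* strlen() doesn't count the terminating NUL, so go one byte past it. */
    for (size_t i = 0; i <= strlen(text); i++) {
        printf("byte %zu: 0x%02x\n", i, (unsigned char)text[i]);
    }
    /* prints 0x48 0x69 0x21 0x00: the three characters plus the NUL terminator */
    return 0;
}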

Also, if you get to work with Objective-C (IMHO a horrible language, but I guess if you like a mix of Smalltalk and C you may have a different opinion) you will eventually have to learn some concepts of C anyway.

zapbranighan
Jun 16, 2012, 12:05 PM
If you are a beginner, then I would suggest Java. It is usually the first programming language taught by universities; I think that's because it's conceptually easier to learn. Once you've decided that you like programming, then you should move on to C. I think C has more challenging concepts, but its syntax is fairly easy to understand, especially if you already know Java. Plus, Java is just more gratifying, because you will be able to do more sooner with Java than with C, thanks to the Java libraries. You can work with GIFs, GUIs, etc. far sooner with Java than with C.

pragmatous
Jun 17, 2012, 12:41 AM
Easier coding == less power
difficult coding == more power

This is why VB is easy to code but lacks power and why C++ is difficult to code but has more power.

Easier coding languages might be OK for learning, but they teach you how to be a lazy programmer instead of taking the time to optimize your code. You also have less power.

Honestly, the best language to learn is Java. Once you learn Java, C++ is easy. Master these two languages and you can code in any language.


knightlie
Jun 18, 2012, 05:54 AM
Seriously, you'll never want to write
for(int i=0; i<30; i++) {
...
}
/* Does this execute 29, 30, or 31 times? */

Frankly, if you know how the for construct works and you can't work out what that loop means, then you're probably not a very good programmer. It runs 30 times, zero to 29.

I hate that kind of quasi-English code that Ruby seems to be using; might as well be writing COBOL.

----------

Sure, but you had to think about it.

No. It's zero to 29, 30 times.

And if you had accidentally typed i<=30 instead of i<30, you're now executing 31 times. On the other hand, a Ruby statement like "30.times" will execute, well, 30 times.

This is a trivial example, though. Let's say you have a function that collects sensor data and stuffs it into an array of indeterminate length. After collecting the data, you then want to print out the results, one per line.

In C, your code would look something like this:

float data_array[ARRAY_SIZE];   /* The array that will hold our sensor data */
int sensor_measurements;        /* Holds the return value of collect_sensor_data(), which indicates how many measurements were collected */
int i;
sensor_measurements = collect_sensor_data(data_array, ARRAY_SIZE); /* collect_sensor_data() takes the maximum number of measurements so we don't overrun our buffer */
for(i = 0; i < sensor_measurements; i++) {
    printf("Measurement %i: %f\n", i+1, data_array[i]);
}

The equivalent code in Ruby would be:

sensor_data().each_with_index{ |value, index| puts "Measurement #{index + 1}: #{value}" }

The Ruby code is not only simpler and more compact...

That Ruby code is completely unreadable....

Cromulent
Jun 18, 2012, 07:47 AM
This is why VB is easy to code but lacks power and why C++ is difficult to code but has more power.


As long as both languages are Turing complete (which they are), then they are equally powerful. This argument about one language being more powerful than the other is rubbish. Maybe more elegant or precise, but more powerful? No.

talmy
Jun 18, 2012, 08:06 AM
As long as both languages are Turing complete (which they are), then they are equally powerful. This argument about one language being more powerful than the other is rubbish. Maybe more elegant or precise, but more powerful? No.

Well, one car can be more powerful than another, but as long as they can go down the road they are both touring complete (pun intended).

An old friend of mine back in college (talking about 1970 here) wrote the Tower of Hanoi puzzle in SNOBOL and made the claim it was a superior language because only SNOBOL could do it. Turns out he meant to say "in a single line", but for over a decade, every time I came up against a new language, I wrote the puzzle and sent him the program listing. A couple dozen in all, I'm sure.

Cromulent
Jun 18, 2012, 08:16 AM
Well, one car can be more powerful than another, but as long as they can go down the road they are both touring complete (pun intended).

This is a poor argument. A car does not have to exactly simulate another car for it to be considered complete.

A Turing machine, on the other hand, is only considered Turing complete if it is capable of simulating any other Turing-complete machine. Thus two Turing-complete languages can simulate one another exactly. Hence the argument that one Turing-complete language is more powerful than another is rubbish.

balamw
Jun 18, 2012, 08:32 AM
Thus two Turing-complete languages can simulate one another exactly.

Just like the vehicle comparison, the usual definition of "power" doesn't just mean function, but includes some measure of efficiency.

A Ferrari, semi-trailer truck and Smart car will all get you from point A to point B. One has an advantage in raw speed, one has a benefit in transporting cargo and one may just be easier to park when you get there. Functionally they are equivalent.

Which is more "powerful" in the end will have a lot to do with what it will actually be asked to do and will be interfacing with.

B

subsonix
Jun 18, 2012, 09:28 AM
^^

Depends, if A is "in the city" and B is "in the woods" then it's likely that only one will get you to point B.

Turing completeness only considers computability; real-world constraints can still limit what a Turing-complete language can do: access to hardware, enforced permission restrictions, and so on.

To add to the above, SQL is not Turing complete, but it's much more powerful than either C or Java if the objective is database queries.

http://shebang.ws/turing-completeness-********.html
http://en.wikipedia.org/wiki/Turing_tarpit

Edit: The first link should end with büll$h1t (fill it in for yourselves)

Sydde
Jun 18, 2012, 11:40 AM
Just like the vehicle comparison, the usual definition of "power" doesn't just mean function, but includes some measure of efficiency.

A Ferrari, semi-trailer truck and Smart car will all get you from point A to point B. One has an advantage in raw speed, one has a benefit in transporting cargo and one may just be easier to park when you get there. Functionally they are equivalent.

Which is more "powerful" in the end will have a lot to do with what it will actually be asked to do and will be interfacing with.

B

Different levels of skill or different skill sets may be involved in driving each of these vehicles quickly and efficiently to their destination. The ability to drive a Ferrari well may not translate readily to the semi. So it is with programming tools.

balamw
Jun 18, 2012, 02:01 PM
Depends, if A is "in the city" and B is "in the woods" then it's likely that only one will get you to point B.

Different levels of skill or different skill sets may be involved in driving each of these vehicles quickly and efficiently to their destination. The ability to drive a Ferrari well may not translate readily to the semi. So it is with programming tools.

Both valid points, which reinforce the argument that neither Java nor C is intrinsically superior.

It all depends what you want to do with them and in what environment.

B

splitpea
Jun 18, 2012, 02:23 PM
Hehe.

OP: "Should I learn to drive on an ATV or a Formula 1 racecar?"

MacRumors: "ATVs are useless, drive the F1 car, it'll teach you more about driving." "Yeah, but you can't take the F1 off-road." "You can if you really want to, plus it's faster on the highway." "But ATVs are better for beginners because you can get one with automatic transmission."

Sensible answer: "Take your dad's auto-transmission Camry out to the mall parking lot on a Sunday. Learn to use your steering wheel, brakes and accelerator, and mirrors. Drive it on the back roads and then the highway. Learn to change a flat tire. THEN worry about ATVs and racecars or how to build a custom suspension or repair your own engine."

MacRumors probable response: "Camrys are totally overrated, learn on a Civic instead." "Civics break down all the time, you should drive an Accord instead."

deadshift
Jun 18, 2012, 03:01 PM
Which language is better at solving your problems depends on what problems you have. Java is better for learning because it is more recently created, and the syntax is cleaner and more defined.

C has so much heritage that the syntax is almost infinite. Also, pointers. Yuck. So I agree with those who said that learning the concept of programming is more important than the language you first pick; but that being said, Java provides you an easier platform on which to learn the concepts. If you start with C, you'll spend more time first learning the language before you get to understanding the concepts.

GorillaPaws
Jun 18, 2012, 03:07 PM
I've been reluctant to post here because these discussions are always a bit awkward, especially because they often radically shift away from language and concepts the OP will understand. I will try to keep my reply here understandable to someone without any programming experience, for the benefit of the OP and anyone else who may find this in the future in a similar situation.

In this particular case, I think C makes more sense because the OP's primary goal is to learn Cocoa programming. Objective-C is the goal, and Objective-C is an extension of C. Given this, the OP will need to learn C eventually (and may never have a need to learn Java). Based on these facts I think C is the obvious choice.

The exception to this is that some people struggle with C, and do better learning higher-level languages to start. If the OP finds himself in this situation, I think Python is a great higher-level first language choice. The syntax is clean and simple, and it can be used for a wide variety of tasks, including interfacing with Cocoa via PyObjC way down the road, once they have a solid understanding of Cocoa, Objective-C and Python.

Sydde
Jun 18, 2012, 03:54 PM
Also, pointers. Yuck.
Pointers are an important concept. Once you understand them, they make sense. Perhaps the OP will just get them. In a way, you have made a stronger case for C over Java.
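A minimal C sketch of the concept (the names are invented): a pointer is just an address that can be stored, passed around, and followed.

#include <stdio.h>

/* The callee modifies the caller's variable through its address. */
static void add_tax(double *price) {
    *price *= 1.08;           /* dereference: follow the address, change the value */
}

int main(void) {
    double price = 100.0;
    double *p = &price;       /* p holds the address of price */
    printf("address %p holds %f\n", (void *)p, *p);
    add_tax(&price);          /* pass the address so add_tax() can write to it */
    printf("after tax: %f\n", price);
    return 0;
}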

"Camrys are totally overrated, learn on a Civic instead." "Civics break down all the time, you should drive an Accord instead."
No way, not a Honda, those encourage bad driving habits. All the most obnoxious traffic moves I have ever seen were by Honda drivers.

chown33
Jun 18, 2012, 04:42 PM
MacRumors probable response: "Camrys are totally overrated, learn on a Civic instead." "Civics break down all the time, you should drive an Accord instead."

No way! Learn to drive a motorcycle. It will also make you more aware of the skills necessary to ride a horse, a camel, or a tauntaun (http://en.wikipedia.org/wiki/List_of_Star_Wars_creatures#Tauntaun).

knightlie
Jun 19, 2012, 02:55 AM
Which language is better at solving your problems depends on what problems you have. Java is better for learning because it is more recently created, and the syntax is cleaner and more defined.

C has so much heritage that the syntax is almost infinite. Also, pointers. Yuck. So I agree with those who said that learning the concept of programming is more important than the language you first pick; but that being said, Java provides you an easier platform on which to learn the concepts. If you start with C, you'll spend more time first learning the language before you get to understanding the concepts.

Best post so far.

Both languages are valid depending on which direction you are coming from. To learn OOP concepts, which is a requirement these days IMO, it's Java. To learn the basics of programming - structures, functions, etc. - it's C. It boils down to which you want to start with.

firewood
Jun 19, 2012, 04:17 PM
As long as both languages are Turing complete (which they are)

Assuming you don't care about performance, wall clock time, memory requirements, and battery consumption on a portable device, given currently available development tools.

pragmatous
Jun 19, 2012, 05:01 PM
It depends on how you define powerful :)

As long as both languages are Turing complete (which they are), then they are equally powerful. This argument about one language being more powerful than the other is rubbish. Maybe more elegant or precise, but more powerful? No.

softwareguy256
Jun 30, 2012, 07:02 AM
As long as both languages are Turing complete (which they are), then they are equally powerful. This argument about one language being more powerful than the other is rubbish. Maybe more elegant or precise, but more powerful? No.

Yeah, and theoretically you can use a Turing tape machine and wait N > 10000 years to get the answer. In the real world, time matters.

firewood
Jun 30, 2012, 06:55 PM
As long as both languages are Turing complete (which they are)

If you are talking about Turing completeness, neither has an infinite tape, but C reveals a bit more of the underlying physical Turing machine (addressable virtual memory as the finite tape) than does Java. One less level of indirection.
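A small C sketch of that point: each printed value is a raw location in the process's address space, exactly the kind of detail Java keeps hidden. The segment names in the comments are the conventional ones; the actual layout varies by platform.

#include <stdio.h>
#include <stdlib.h>

int global_var;                                   /* data segment */

int main(void) {
    int stack_var;                                /* stack */
    int *heap_var = malloc(sizeof *heap_var);     /* heap */

    printf("string literal: %p\n", (void *)"ro"); /* read-only data */
    printf("global:         %p\n", (void *)&global_var);
    printf("heap:           %p\n", (void *)heap_var);
    printf("stack:          %p\n", (void *)&stack_var);

    free(heap_var);
    return 0;
}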

Sydde
Jun 30, 2012, 09:19 PM
If you are talking about Turing completeness, neither has an infinite tape, but C reveals a bit more of the underlying physical Turing machine (addressable virtual memory as the finite tape) than does Java. One less level of indirection.
asm
{
// some assembly code
}

I would say that C, in most or all of its flavours, can reveal effectively all of the underlying machine, to a much greater extent than Java ever could.

firewood
Jul 1, 2012, 09:31 AM
Although many compilers and IDEs will allow a programmer to mix C and assembly language, the assembly languages themselves (68k, PPC, x86, IA-32, ARMv7, etc.) are not part of the official ANSI C language.

Sydde
Jul 1, 2012, 01:15 PM
Kind of odd thing, really. I seem to recall a committee meeting recently where they adjusted the official specification of C (ANSI or ISO or somesuch) to encompass strings and threads and such. Now, I am sort of a rube: I see those things as extensions.h, things that are kind of optional, whereas asm is literally part of the language. And yet, what is between the braces of an asm block is usually unportable, possibly not even compiler-supported, so asm is kind of both a part of C and, at the same time, pragmatically, not a part of C.
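For reference, here is inline assembly as GCC and Clang actually accept it. The extended-asm syntax is a compiler extension rather than ANSI C, and the instruction itself assumes an x86-64 target:

#include <stdio.h>

int main(void) {
    long src = 42;
    long dst;

    /* Copy src into dst with a single register-to-register move. */
    __asm__("movq %1, %0" : "=r"(dst) : "r"(src));

    printf("%ld\n", dst);   /* prints 42 */
    return 0;
}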

Skulltrail
Jul 5, 2012, 10:27 PM
Start with Java. There's a reason why most Computer Science/Information Technology majors start with it. It's a great programming language for new programmers. Once you become familiar with Java, general OOP, and general programming techniques and processes, C will be a breeze and definitely useful.

firewood
Jul 5, 2012, 11:39 PM
Start with C. Mobile device programming is getting more and more popular. And programmers with experience at a lower level of abstraction from the hardware (which C provides much better than Java) have a better feel for developing apps that can use a smaller memory footprint, a more predictable UI latency, and use up less battery life.

Mr Baldman
Jul 6, 2012, 03:47 AM
Start with C. Mobile device programming is getting more and more popular. And programmers with experience at a lower level of abstraction from the hardware (which C provides much better than Java) have a better feel for developing apps that can use a smaller memory footprint, a more predictable UI latency, and use up less battery life.

this.

C is a good foundation for computer programming in general and is far closer to the metal than Java - C was designed as a language for writing operating systems.

It may be harder, and more difficult to create complex applications with, but in the long term you will appreciate how easy modern languages and their frameworks (such as Java, C#/.NET, Node.js, Objective-C/Xcode) make things nowadays.

Most important of all, using modern frameworks/languages is an absolute breeze once you understand the disciplines of coding in C.

Futhark
Jul 6, 2012, 02:46 PM
I was confused whether to learn C or Objective-C, as so many people gave me conflicting answers about what to learn. What I did was buy C For Dummies, which I'm thoroughly enjoying, and I also purchased Programming in Objective-C, 4th Edition. The author of the Objective-C book says his book will teach you all you need to know, basically, even if this is going to be your first programming language. Here are a few paragraphs of his that you might be interested in:

What You Will Learn from This Book


When I contemplated writing a tutorial on Objective-C, I had to make a fundamental decision. As with other texts on Objective-C, I could write mine to assume that the reader already knew how to write C programs. I could also teach the language from the perspective of using the rich library of routines, such as the Foundation and UIKit frameworks. Some texts also take the approach of teaching how to use the development tools, such as the Mac’s Xcode and the tool formerly known as Interface Builder to design the UI.

I had several problems adopting this approach. First, learning the entire C language before learning Objective-C is wrong. C is a procedural language containing many features that are not necessary for programming in Objective-C, especially at the novice level. In fact, resorting to some of these features goes against the grain of adhering to a good object-oriented programming methodology. It’s also not a good idea to learn all the details of a procedural language before learning an object-oriented one. This starts the programmer in the wrong direction, and gives the wrong orientation and mindset for fostering a good object-oriented programming style. Just because Objective-C is an extension to the C language doesn’t mean you have to learn C first.

So I decided neither to teach C first nor to assume prior knowledge of the language. Instead, I decided to take the unconventional approach of teaching Objective-C and the underlying C language as a single integrated language, from an object-oriented programming perspective. The purpose of this book is as its name implies: to teach you how to program in Objective-C. It does not profess to teach you in detail how to use the development tools that are available for entering and debugging programs, or to provide in-depth instructions on how to develop interactive graphical applications. You can learn all that material in greater detail elsewhere, after you’ve learned how to write programs in Objective-C. In fact, mastering that material will be much easier when you have a solid foundation of how to program in Objective-C. This book does not assume much, if any, previous programming experience. In fact, if you’re a novice programmer, with some dedication and hard work you should be able to learn Objective-C as your first programming language. Other readers have been successful at this, based on the feedback I’ve received from the previous editions of this book.

This book teaches Objective-C by example. As I present each new feature of the language, I usually provide a small complete program example to illustrate the feature. Just as a picture is worth a thousand words, so is a properly chosen program example. You are strongly encouraged to run each program (all of which are available online) and compare the results obtained on your system to those shown in the text. By doing so, you will learn the language and its syntax, but you will also become familiar with the process of compiling and running Objective-C programs.

firewood
Jul 6, 2012, 05:09 PM
I was confused whether to learn C or Objective-C...

You really need to learn both. Objective-C may be a pure superset of C, but the typical usage of the two languages (object messaging versus procedural routines) is moderately different.

If you learn both high level and low level programming ideas, you'll know what level of abstraction you're coding with, and what level of attack might be more or less appropriate for the programming problem at hand.

poobah
Jul 13, 2012, 07:27 AM
Perhaps a different slant, actually getting a job...

If I needed a good general purpose "programmer guy" (with apologies to the ladies), and you told me I could hire one of 2 programmers, one who was a good C programmer, and one who was a good (pick any of the high level abstract away every detail languages... java, ruby, whatever), I will take the C programmer every time.

I can teach a C programmer the other stuff, because the C programmer has a solid foundation. Like it or not, C is the underlying structure on which many of the higher level languages are built. Once you know C, it is a simple matter to "get spun up" on the higher level stuff.

It's easier to teach a mechanic to drive, than to teach a driver how to repair an engine. The mechanic understands the systems, how they work, what the implications and interactions are, etc. The driver knows the big pedal makes the car go faster.

If all you know is how to make calls on a massive library, you are missing fundamentals. If you have no understanding of how these libraries accomplish their tasks, you are ill prepared to choose the appropriate one.
Oh, and when I need to have a micro-controller talk to your application, I have to hire someone else to program it.

(caveat.... I'd really like a programmer who knows C *and* has a good grasp of OOP concepts. Those 2 skills cover a vast swath of programming requirements in the real world.)

talmy
Jul 13, 2012, 08:39 AM
I can teach a C programmer the other stuff, because the C programmer has a solid foundation. Like it or not, C is the underlying structure on which many of the higher level languages are built. Once you know C, it is a simple matter to "get spun up" on the higher level stuff.

If all you know is how to make calls on a massive library, you are missing fundamentals. If you have no understanding of how these libraries accomplish their tasks, you are ill prepared to choose the appropriate one.
Oh, and when I need to have a micro-controller talk to your application, I have to hire someone else to program it.

You've "got it" but C really has nothing to do with it. What's needed is a background in the fundamentals -- data structures, algorithms, OOP, design patterns... And it doesn't really matter which language you use. Now it may be true that a C programmer knows more about these fundamentals than someone who has only used, say, Perl, but it's not necessarily so. The language is just a tool and you use whatever is available.

My personal history -- wrote my first program in 1968 in a Basic-like language and later in Fortran, first C program in 1980, Java in the late 1990s. Writing programs is part of my occupation as an Electrical Engineer and always has been. I use C for embedded code, C, C++ (with Qt4), and Java elsewhere. A well used set of Knuth's Art of Computer Programming is on my shelf as well as Design Patterns by the GoF, and a first edition of The C Programming Language (among other books).

mrichmon
Jul 13, 2012, 07:46 PM
C++ is by far the most powerful language with proven concepts decades old. Once a sufficient level has been reached it is virtually IMPOSSIBLE to write [s]bad[/s] good C++ code.

There, I fixed that for you. ;)

C++ is a useful and powerful language. It is widely used and also widely abused. Writing good, reliable, and maintainable C++ code requires strong programmer discipline. Without careful use, C++ ends up being developer-hostile.

mrichmon
Jul 13, 2012, 08:00 PM
Sure, but you had to think about it. And if you had accidentally typed i<=30 instead of i<30, you're now executing 31 times. On the other hand, a Ruby statement like "30.times" will execute, well, 30 times.

This is a trivial example, though. Let's say you have a function that collects sensor data and stuffs it into an array of indeterminate length. After collecting the data, you then want to print out the results, one per line.

In C, your code would look something like this:

float data_array[ARRAY_SIZE];   /* The array that will hold our sensor data */
int sensor_measurements;        /* Holds the return value of collect_sensor_data(), which indicates how many measurements were collected */
int i;
sensor_measurements = collect_sensor_data(data_array, ARRAY_SIZE); /* collect_sensor_data() takes the maximum number of measurements so we don't overrun our buffer */
for(i = 0; i < sensor_measurements; i++) {
    printf("Measurement %i: %f\n", i+1, data_array[i]);
}

The equivalent code in Ruby would be:

sensor_data().each_with_index{ |value, index| puts "Measurement #{index + 1}: #{value}" }

The Ruby code is not only simpler and more compact, ...

The Ruby code shown is not a direct equivalent of the C code shown.

The C code is calling a separate function, which may be in a separate library, to obtain the data values. Once the values are obtained, the C code prints out each of the data values.

In comparison, the Ruby code assumes that the data values are already available and just prints out the data values one by one. As such, the Ruby code only implements half of the work performed by the C code. Assuming the work is defined as:

1. Obtain the data values
2. Print the data values


There are also several ways that the C code can be shortened to be more compact. There are also similar ways that the Ruby code can be expanded to have a more natural equivalency with the C code.

For example (using C99):

float data_array[ARRAY_SIZE];
int sensor_measurements = collect_sensor_data(data_array, ARRAY_SIZE);
for(int i = 0; i < sensor_measurements; i++) {
    printf("Measurement %i: %f\n", i+1, data_array[i]);
}

poobah
Jul 13, 2012, 08:40 PM
You've "got it" but C really has nothing to do with it. What's needed is a background in the fundamentals -- data structures, algorithms, OOP, design patterns... And it doesn't really matter which language you use. Now it may be true that a C programmer knows more about these fundamentals than someone who has only used, say, Perl, but it's not necessarily so. The language is just a tool and you use whatever is available.

My personal history -- wrote my first program in 1968 in a Basic-like language and later in Fortran, first C program in 1980, Java in the late 1990s. Writing programs is part of my occupation as an Electrical Engineer and always has been. I use C for embedded code, C, C++ (with Qt4), and Java elsewhere. A well used set of Knuth's Art of Computer Programming is on my shelf as well as Design Patterns by the GoF, and a first edition of The C Programming Language (among other books).

Completely agree. My definition of "good C programmer" includes the fundamentals. Without them, you are a bad C programmer :D

My bookshelf looks like yours, though I have Code Complete next to my 'gang of four' book. I also require any new team members to read "The Design of Everyday Things". The not so good programmers always say "well these things are obvious", then give me code that violates several of the ideas.

macsmurf
Jul 14, 2012, 06:55 AM
Perhaps a different slant, actually getting a job...

If I needed a good general purpose "programmer guy" (with apologies to the ladies), and you told me I could hire one of 2 programmers, one who was a good C programmer, and one who was a good (pick any of the high level abstract away every detail languages... java, ruby, whatever), I will take the C programmer every time.

I can teach a C programmer the other stuff, because the C programmer has a solid foundation. Like it or not, C is the underlying structure on which many of the higher level languages are built. Once you know C, it is a simple matter to "get spun up" on the higher level stuff.


I disagree. It's not a sliding scale. It's different paradigms. Different languages promote different ways of thinking about solving problems.


It's easier to teach a mechanic to drive, than to teach a driver how to repair an engine. The mechanic understands the systems, how they work, what the implications and interactions are, etc. The driver knows the big pedal makes the car go faster.


It's easier to teach anyone how to drive than how to repair an engine.


If all you know is how to make calls on a massive library, you are missing fundamentals. If you have no understanding of how these libraries accomplish their tasks, you are ill prepared to choose the appropriate one.
Oh, and when I need to have a micro-controller talk to your application, I have to hire someone else to program it.


Being a good C programmer does not give you any special insight into how a library accomplishes its tasks, as opposed to being a good * programmer.

Lucene, for example, is a search engine library for Java. In order to get a deep understanding of that library you need some knowledge of the approaches for searching through text. Whether or not those approaches have been implemented in C is utterly irrelevant.


(caveat.... I'd really like a programmer who knows C *and* has a good grasp of OOP concepts. Those 2 skills cover a vast swath of programming requirements in the real world.)

Why? You just said it was easy to teach him. Just give him a week to read through GoF ;)

macsmurf
Jul 14, 2012, 08:01 AM
The Ruby code shown is not a direct equivalent of the C code shown.

The C code is calling a separate function, which may be in a separate library, to obtain the data values. Once the values are obtained, the C code prints out each of the data values.

In comparison, the Ruby code assumes that the data values are already available and just prints out the data values one by one. As such, the Ruby code only implements half of the work performed by the C code. Assuming the work is defined as:

1. Obtain the data values
2. Print the data values



Huh? In the Ruby program, sensor_data() (a function) obtains the data values. each_with_index then performs some action on each of the values, where the action in this case is printing.

Although I disagree with the poster's conclusion, the point I believe he was trying to make was that a rather large part of the C program is about managing the array of values and not actually about the task at hand (printing the values collected from somewhere else). In the Ruby program all that is abstracted away. We don't really care what the underlying data structure returned from sensor_data() is, nor do we need to manage it.

ytk
Jul 15, 2012, 04:00 AM
The Ruby code shown is not a direct equivalent of the C code shown.

The C code is calling a separate function, which may be in a separate library, to obtain the data values. Once the values are obtained, the C code prints out each of the data values.

In comparison, the Ruby code assumes that the data values are already available and just prints out the data values one by one. As such, the Ruby code only implements half of the work performed by the C code.

That's incorrect. I should know—I wrote the code! :D

As macsmurf explained, sensor_data() is a function that collects and returns the value of the sensor data, which can then be operated on directly with no need to store it in an interim variable.

Of course, I only put in the parentheses for this example to make it clear that sensor_data was a function. Most Ruby programmers would leave them off, writing sensor_data.each_with_index... I didn't do that because I thought someone might raise the objection you raised if I did, that the data was assumed to be there and never collected. But that's one of the things that's really cool about Ruby: it doesn't matter either way! sensor_data might be a variable or a function that returns a value, and the programmer accessing it would neither know nor care which it was, as long as it returns the correct data when you reference it.

throAU
Aug 2, 2012, 05:53 PM
Ruby. Lol.


It might be easier, but I know of no real projects written in Ruby. It is simply not as commonly used as either Java or C.

If you learn either C (Obj-C is a superset of it) or Java, you'll have plenty of work available.

As I said previously, Java is probably more versatile in terms of actual employment (cross platform apps), but C gets you closer to the hardware and can be used for lower level stuff.

Lower level stuff though is mostly a solved problem these days (just use the OS libraries from a higher level language) unless you're involved in a few specific industries, like game development, for example.

tr!pf!3
Aug 3, 2012, 01:58 AM
C is just so much faster.

kthomp
Aug 3, 2012, 03:40 AM
So out of Java and C which has the most uses, what are they, and which is easier as a sort of gateway language for learning others?

Which is better and/or which has the most uses? Neither. Both. Each is better than the other depending on the problem you're trying to solve. To put it a different way, there is no Golden Hammer (http://en.wikipedia.org/wiki/Law_of_the_instrument).

Which is easier as a gateway language? I would say Java would be easier for a novice to pick up, but C is probably better as a precondition for learning Objective C.

What are their uses? Impossible to list. Both are exemplars of particular kinds of languages, and frankly, approaches to programming. I think anyone interested in programming should be familiar with both.

VinegarTasters
Aug 3, 2012, 05:45 AM
You should learn C, then migrate to C++. If you can, learn assembly as well. .NET, Java, Python, Ruby, and all interpreted languages are going out of style. As things get more mobile, battery life and limited memory are more of a factor than the benefits you get from dumbing down via abstraction from the actual hardware. There IS one golden hammer... assembly. Speed will prevent you from using Java or C#, etc., to make AAA games. You can use C, as it is fairly low-level for drivers (things close to the kernel), but people go towards assembly for inner loops in AAA games even when they code in C.

There was a brief period of time when web servers in Java were OK, but their benefits came from loading tons of things into main memory, regurgitating rather than processing. If you do research and speed or memory size is not important, then slow, chugging Java or C# is OK. Simple web apps like HTML widgets are OK in these slow languages, but even then they will slow down your OS so much that people would rather not use them and download apps instead (look at the number of widgets in OS X from developers).

Cromulent
Aug 3, 2012, 09:02 AM
If you do research and speed or memory size is not important, then slow, chugging Java or C# is OK.

I hope you realise that Java JIT compilation allows for optimisations of the code that are simply not possible in a statically compiled language such as C or C++. Thus in some circumstances Java can actually outperform C or C++.

Speed of the language though is often not the problem. The thing that most often makes a program slow is the algorithm used to implement it. If you have a ****** algorithm it doesn't matter if you implement it in assembly, it will still be dog slow.

softwareguy256
Aug 3, 2012, 10:14 PM
Java is dog slow for many reasons. You sound like a college kid who has no clue about the business. I'd send you home if I were interviewing you.

I hope you realise that Java JIT compilation allows for optimisations of the code that are simply not possible in a statically compiled language such as C or C++. Thus in some circumstances Java can actually outperform C or C++.

Speed of the language though is often not the problem. The thing that most often makes a program slow is the algorithm used to implement it. If you have a ****** algorithm it doesn't matter if you implement it in assembly, it will still be dog slow.

talmy
Aug 3, 2012, 11:01 PM
Java is dog slow for many reasons. You sound like a college kid who has no clue about the business. I'd send you home if I were interviewing you.

Then I might hire him. I had a fairly complex simulation, originally written in C++ back in 1998, that I recoded into Java, which I was just learning at the time, and it was, indeed, faster. And there were good reasons for this, mainly that the language was better suited to what I was trying to accomplish and I found better ways to implement the algorithms because of it. Sometimes the "efficiency" of a compiled language (or assembler) doesn't get you faster operation, or faster implementation. I was selling Forth systems I developed in the 1980s which outperformed C (a financial system where they had to figure out how to embed the Forth calculation engine in a larger system because their attempt to implement it in C was half the speed) and even assembly (a stepper motor controller where the assembler code was using a poor algorithm for advancing motor position).

kthomp
Aug 4, 2012, 12:06 AM
You should learn C, then migrate to C++. If you can, learn assembly as well. .NET, Java, Python, Ruby, and all interpreted languages are going out of style. As things get more mobile, battery life and limited memory are more of a factor than the benefits you get from dumbing down via abstraction from the actual hardware.

Java is the primary target language and environment in Android development, dropping down into native when necessary. So the most popular mobile O/S uses one of your examples of a language which is "going out of style", which makes me question your argument :confused:

VinegarTasters
Aug 4, 2012, 02:28 AM
Java is the primary target language and environment in Android development, dropping down into native when necessary. So the most popular mobile O/S uses one of your examples of a language which is "going out of style", which makes me question your argument :confused:

Yes, Java WAS Android's primary language, forcing everyone to use it. Then, when they realized how slow it was and how few hardcore action games were on it, they had to release the NDK, where you can program in C/C++ but are still stuck in their slow Java wrappers. If Java could do everything, the NDK wouldn't have been needed. Even Chrome mobile and the Jelly Bean OS show they are moving away from Java to keep up in speed. Do you know why Android phones usually have double or quadruple the amount of RAM of iOS devices? Because Java is a memory hog. Same with C# and all interpreted languages. The common yardstick is this: Java/C# is 10 times slower than C, and uses 5-10 times more memory than C. Java was so slow that in benchmarks they had to use so many tricks to make it match or come out barely faster than C/C++. But if you do the same tricks in C/C++ (things like unrolling loops, table lookups, etc.) you end up being 10 times faster again.

Same thing with C# (that copied Java) and the .NET interpreted stuff. I don't know what Oracle was thinking when they bought Sun. If they start putting everything in Java, they will surely lose to SAP. The performance is so slow, while speed is so critical in databases. Even Google bypassed slow hard drives by putting everything into RAM. (YES, they would rather pay millions buying RAM than save money buying hard drives.)

Now, does this mean Java or C# is bad? No. But on mobile devices, yes, they sure are. Even on the desktop, the slowness from LLVM technology in Lion and Mountain Lion and the huge increase in minimum RAM make them laughingstocks. While doing some work, I pressed File > Open in TextEdit. It took 10 seconds for the stupid dialog to pop up. I can almost guarantee this was a virtual machine JIT. The second time, it popped up quicker. There are so many of these spinning-circle sessions. This was in Mountain Lion. I don't know what they are thinking, but I am sure that if they end up like Corel, not putting performance in their design and implementation specs, they will have no one to blame but themselves. Just because something is the latest thing (like the rewritable optical disks in NeXT) doesn't mean it is the best. Performance is critical in an OS. It limits what apps can do on top of it if the OS is the bottleneck. EVERYONE knows Windows games run faster than OS X games on the same hardware. Why is this common knowledge? SLOW OS. Even Objective-C is slower than C/C++. I don't even know why it is necessary to use it just because, at the time it was released, they thought it was the greatest technology... No, you choose appropriate tools for appropriate situations. The OS, being the lowest layer, MUST RUN FAST, or it will slow everything ABOVE it. Performance is critical in that situation.

Now someone will come in and say: well, I did this thing in a slow language, but it ended up faster than C or assembly! Yes, if you use a slow algorithm in a faster language, it may be slower than a faster algorithm in a slow language. If you did the same algorithm in both languages, then of course C or assembly will be faster, almost 1000% faster. For example, bubble sort versus quicksort.
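To make the algorithm-versus-language point concrete, here is a minimal C sketch (the array size and timing method are arbitrary choices): the same data is sorted with an O(n^2) bubble sort and with the C library's qsort(), and the algorithmic gap dwarfs typical language overheads.

#include <stdio.h>
#include <stdlib.h>
#include <time.h>

#define N 20000

/* Comparison callback for qsort(). */
static int cmp_int(const void *a, const void *b) {
    int x = *(const int *)a, y = *(const int *)b;
    return (x > y) - (x < y);
}

/* O(n^2) bubble sort. */
static void bubble_sort(int *a, int n) {
    for (int i = 0; i < n - 1; i++)
        for (int j = 0; j < n - 1 - i; j++)
            if (a[j] > a[j + 1]) {
                int tmp = a[j];
                a[j] = a[j + 1];
                a[j + 1] = tmp;
            }
}

int main(void) {
    int *a = malloc(N * sizeof *a);
    int *b = malloc(N * sizeof *b);
    srand(42);
    for (int i = 0; i < N; i++)
        a[i] = b[i] = rand();

    clock_t t0 = clock();
    bubble_sort(a, N);
    clock_t t1 = clock();
    qsort(b, N, sizeof *b, cmp_int);   /* O(n log n) on average */
    clock_t t2 = clock();

    printf("bubble sort: %.3f s\n", (double)(t1 - t0) / CLOCKS_PER_SEC);
    printf("qsort:       %.3f s\n", (double)(t2 - t1) / CLOCKS_PER_SEC);

    free(a);
    free(b);
    return 0;
}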

Cromulent
Aug 4, 2012, 04:22 AM
Java is dog slow for many reasons. You sound like a college kid who has no clue about the business. I'd send you home if I were interviewing you.

You obviously have no idea what you are talking about. Frankly I'd be worried if you were interviewing me. Get a clue before making stupid posts.

Oh and if you are going to make a claim that "Java is dog slow for many reasons" then you need to back up your claim with facts. Otherwise you just look like a fool.

VinegarTasters
Aug 4, 2012, 05:38 AM
You obviously have no idea what you are talking about. Frankly I'd be worried if you were interviewing me. Get a clue before making stupid posts.

Oh and if you are going to make a claim that "Java is dog slow for many reasons" then you need to back up your claim with facts. Otherwise you just look like a fool.


I think you are out of your league.

http://stackoverflow.com/questions/10260021/c-intel-vs-java-hotspot-vs-c-sharp-benchmark-questions-code-and-results-i


You are talking microseconds versus milliseconds.

http://wiki.answers.com/Q/What_is_smaller_micro_or_milli

Yes, C is micro. Java is milli. C# is even slower.

Cromulent
Aug 4, 2012, 08:26 AM
I think you are out of your league.

http://stackoverflow.com/questions/10260021/c-intel-vs-java-hotspot-vs-c-sharp-benchmark-questions-code-and-results-i


You are talking microseconds versus milliseconds.

http://wiki.answers.com/Q/What_is_smaller_micro_or_milli

Yes, C is micro. Java is milli. C# is even slower.

I think you are the one who is out of their league. Rather than quoting Q&A links from unverified sources or a page from WikiAnswers, how about some real academic papers?

Since you seem incapable of actually providing decent evidence to support your claims I will do it for you.

http://www.philippsen.com/JGI2001/finalpapers/18500097.pdf

Here is a quote:

We tested four Java environments under Linux. Of these, the IBM 1.3 JDK gave the best performance on all the benchmarks except MolDyn. In almost all cases execution times are less than for the NT version of the same JDK. The Blackdown 1.3 and the two versions of the Sun 1.3 were roughly comparable overall, though there are significant differences on individual benchmarks.

Comparisons with C compilers on the system are very favourable. The IBM 1.3 JDK is on average slightly faster than KAI C++ and only 15% slower than GCC. The mean ratio of fastest Java to fastest C execution times is only 1.07. At this level of difference there is no case for preferring C to Java on grounds of performance.

That means that Java is faster than one compiler (KAI C++) and only slightly slower than another (GCC). Also, these benchmarks were conducted using version 1.3 of the JDK. Since that time, significant improvements have been made which will have increased performance even more.

So until you can show some proper evidence to back up your claims (and that means academic peer-reviewed papers, not something you have found on Stack Overflow or WikiAnswers), please stop posting, because you clearly have no idea what you are talking about.

VinegarTasters
Aug 4, 2012, 10:04 AM
I think you are the one who is out of their league. Rather than quoting Q&A links from unverified sources or a page from WikiAnswers, how about some real academic papers?

Since you seem incapable of actually providing decent evidence to support your claims I will do it for you.

http://www.philippsen.com/JGI2001/finalpapers/18500097.pdf

Here is a quote:



That means that Java is faster than one compiler (KAI C++) and only slightly slower than another (GCC). Also, these benchmarks were conducted using version 1.3 of the JDK. Since that time, significant improvements have been made which will have increased performance even more.

So until you can show some proper evidence to back up your claims (and that means academic peer-reviewed papers, not something you have found on Stack Overflow or WikiAnswers), please stop posting, because you clearly have no idea what you are talking about.

Look, if you want to win the argument, you should try a newer paper (2001? holy). At the time that paper was released, Java was even slower than today's Java (before the maturation of the JIT, etc.). All those people are doing is taking someone super good in Java, doing extreme optimization, and testing it against C. Read my previous comments. If the same optimizations are done in C, we are back to 10 times faster.

Ask Android why there are no AAA games in Java. It is just too slow. And the effort to do it (using all the tricks of Java) is just not worth it when you can simply do a plain no-tricks version in C and beat it. You could have cited the Quake-in-Java-beats-C/C++ benchmark, where they went through all sorts of tricks (if you do the same things in C, it goes back to square one). I'm not here to argue with you. Just go ahead and make something in Java and C yourself using the same algorithm and come back.

Look at the first link. It is the SAME algorithm for all languages. No tricks. Compile it in Java. Compile it in C. Run them both. 1000 times faster! For large programs, you get on average 10x faster (where you can get variance because of classes, compared to functional code). But for some simple small loop, Java is 1000 times slower! That's 100,000% slower! 10 times slower is 1,000% slower. I know it may feel sad at these figures, and maybe you have some experience in Java. But just ask around people who have created software like games on these platforms. Ask them what the main problem is. IT IS DAMN SLOW! It takes so much memory, too, for the virtual machine. And the garbage collector kicks in, screwing up the framerate at odd times, forcing you to keep everything in global memory if possible.

In other words, if someone is super good in Java tricks and someone is super good in C, bet your life on the C guy. If someone is super good in C and someone is super good in assembly, bet your life on the assembly guy. If you eventually work with a supercomputer, where speed and memory are no problem, go ahead, do your work in Java. Otherwise, find benchmarks with the SAME algorithm, no tricks. If you allow different algorithms, then it is the experience of one guy against the experience of another, which ends up skewing the result. Google added C/C++ to placate the Android developers. Apple actually forced Java and interpreted languages off iOS (then they went back to allowing emulators). Both moves were about speed.

If after you read the above and you STILL are not convinced...

Read this: 2011 (NOT 2001!)
http://bruscy.republika.pl/pages/przemek/java_not_really_faster_than_cpp.html

C++ is slower than C, so if you code it in C it is even faster. A fast C loop is VERY FAST, faster than an object-oriented method with all the C++ overhead. Assembly is even faster.

If you STILL need convincing...

http://shootout.alioth.debian.org/

Play with that. If you want to make a Java benchmark faster than C, simply find the weakest C code and write the Java using its strong points. If you want experience on the topic, peruse all the benchmarks at that link. (Note they remove the first loop, where the JIT is compiling, which actually helps Java; if you include it, it is STILL DAMN SLOWER.) On average, with the first loop included, it is 10x+ slower in practical use. Experience is greater than theory. Again, I'm not here to argue with you. So either you take it as it is, or you can be like the emperor in the emperor's-new-clothes story.

talmy
Aug 4, 2012, 11:01 AM
The higher-level languages give more abstractions that make programming more productive and less error prone. You can of course write anything in assembler, since ultimately everything is running machine code anyway. And because of this you can always write more optimal code in assembler or a low-level language like C. Heck, the first C++ compilers were translators that converted C++ to C for compilation.

I remembered another example (besides my Java vs C++ one). I was working at Tektronix, which had a license to produce Smalltalk systems (speaking of "dog slow"?). We had one product where the interpreter was written in 68000 assembler, but some people were interested in a version that would run on the IBM PC (with an 80386 processor). I took on the challenge, and in a few months did an implementation entirely in C except for graphics primitives ("bitblt" and line drawing) and bytecode dispatching. The 68000 proponents were smug that this quickly cobbled, high-level-language implementation would be awful; however, it ran twice as fast as the 68000 code.

You keep talking about "AAA games" and I really don't know what they are, but I can imagine that for games speed is the top criterion. So you are willing to invest more time (and money) crafting the code, particularly in assembler, for maximum speed. I'm much more concerned with implementation time and program correctness. Embedded programming moved from assembler to C some years ago. There is no way we could afford to use assembler. It was used for compactness, but microcontrollers have far more capacity than they used to, so space is not a problem. Programs running on Linux and Windows are all C++ or C# now. I don't use Java here but, as I mentioned, used it for many years. It has hooks for assembly language that I've used to interface with hardware but never used for performance enhancement. Algorithms are always the most important factor in performance.

Sven11
Aug 4, 2012, 11:31 AM
Java Sucks (http://harmful.cat-v.org/software/java)

VinegarTasters
Aug 4, 2012, 11:41 AM
The higher-level languages give more abstractions that make programming more productive and less error prone. You can of course write anything in assembler, since ultimately everything is running machine code anyway. And because of this you can always write more optimal code in assembler or a low-level language like C. Heck, the first C++ compilers were translators that converted C++ to C for compilation.

I remembered another example (besides my Java vs C++ one). I was working at Tektronix, which had a license to produce Smalltalk systems (speaking of "dog slow"?). We had one product where the interpreter was written in 68000 assembler, but some people were interested in a version that would run on the IBM PC (with an 80386 processor). I took on the challenge, and in a few months did an implementation entirely in C except for graphics primitives ("bitblt" and line drawing) and bytecode dispatching. The 68000 proponents were smug that this quickly cobbled, high-level-language implementation would be awful; however, it ran twice as fast as the 68000 code.

You keep talking about "AAA games" and I really don't know what they are, but I can imagine that for games speed is the top criterion. So you are willing to invest more time (and money) crafting the code, particularly in assembler, for maximum speed. I'm much more concerned with implementation time and program correctness. Embedded programming moved from assembler to C some years ago. There is no way we could afford to use assembler. It was used for compactness, but microcontrollers have far more capacity than they used to, so space is not a problem. Programs running on Linux and Windows are all C++ or C# now. I don't use Java here but, as I mentioned, used it for many years. It has hooks for assembly language that I've used to interface with hardware but never used for performance enhancement. Algorithms are always the most important factor in performance.

Come on... you are not being fair. Comparing Motorola and Intel chip speed in ADDITION to your experience coding C versus assembly? (You were wise to keep the graphics blits and interpretation in assembly, though.) Yes, I agree the algorithm is important. No argument there. That is why you should compare languages using the same algorithm. Quicksort in a slow language can beat bubble sort in a fast language.

As for AAA games: things like Battlefield 3 and Call of Duty, or any high-selling games on the Sony PlayStation 3, like GTA5, MGS4, and especially GT5. Only for the menus or loading screens could you probably tolerate higher-level languages; you won't find the engine running on Java. The closest Java came is probably RuneScape and Minecraft. They can be played because their graphics are not that demanding compared to today's AAA titles. I count Minecraft as a casual game.

I have nothing against productivity and Java/C#, etc. It's just that on mobile devices there is limited battery life, which means limited power consumption, limited space, limited memory. Interpreted languages are the opposite: they suck up time, power, and memory. Even if you do really well and get Java to be only 2 times slower than C++, a 30FPS AAA game will run at 15FPS, killing the game on the platform. If Java were forced as the language of choice for OS X, guess what? All the 30FPS games mentioned above WON'T run on OS X, because no matter how fast you code, the kernel in command of the drivers will slow it down many times over. Casual games you can get away with... That is why Java is still OK on Android for these.

softwareguy256
Aug 4, 2012, 03:01 PM
To state the obvious, any interpreted language has a non-zero overhead, and garbage collection introduces non-deterministic running time that is simply unacceptable for many mission-critical tasks. For cube jobs and IT work, there's so much slack, and the penalties for mistakes are so minor, that using ANY language, including Java and Visual Basic, is acceptable.

It is blatantly ABSURD to say that Java is faster than C++, and it's only a matter of time, whether it is 1, 5, 10, or 20 years, before you realize that I am right. But the base pay says everything. If you are making under six figures you need to be quiet immediately, because it shows that the market does not really value your opinion or your expertise.

You obviously have no idea what you are talking about. Frankly I'd be worried if you were interviewing me. Get a clue before making stupid posts.

Oh and if you are going to make a claim that "Java is dog slow for many reasons" then you need to back up your claim with facts. Otherwise you just look like a fool.

charlieegan3
Aug 4, 2012, 03:03 PM
C#

softwareguy256
Aug 4, 2012, 03:06 PM
It is quite possible that people who don't know C++ can easily write bad code that is 2x slower because they fail to understand all the hidden code that is inserted by the compiler. That is why in my business we hire only the best and brightest.

Then I might hire him. I had a fairly complex simulation, originally written in C++ back in 1998, that I recoded into Java, which I was just learning at the time, and it was, indeed, faster. And there were good reasons for this, mainly that the language was better suited to what I was trying to accomplish and I found better ways to implement the algorithms because of it. Sometimes the "efficiency" of a compiled language (or assembler) doesn't get you faster operation, or faster implementation. I was selling Forth systems I developed in the 1980s which outperformed C (a financial system where they had to figure out how to embed the Forth calculation engine in a larger system because their attempt to implement it in C was half the speed) and even assembly (a stepper motor controller where the assembler code was using a poor algorithm for advancing motor position).

Boltonic
Aug 4, 2012, 03:23 PM
To state the obvious, any interpreted language has a non-zero overhead, and garbage collection introduces non-deterministic running time that is simply unacceptable for many mission-critical tasks. For cube jobs and IT work, there's so much slack, and the penalties for mistakes are so minor, that using ANY language, including Java and Visual Basic, is acceptable.

It is blatantly ABSURD to say that Java is faster than C++, and it's only a matter of time, whether it is 1, 5, 10, or 20 years, before you realize that I am right. But the base pay says everything. If you are making under six figures you need to be quiet immediately, because it shows that the market does not really value your opinion or your expertise.


Saying that the opinion of a software engineer who makes under six figures doesn't matter shows that you are ignorant and have no clue what developers make across the country. I work on a multi-billion-dollar project as one of the main developers. I do make very good money, as you state, but I have worked with developers who make less than me yet are absolutely more OO savvy. I have also gone to another state as a representative of my company to start up our business there. Companies down there employ people who have absolutely no clue what a good programmer is, and those people make way more than me. The problem in our field is that people become complacent and will not adapt to newer and better techniques. Developers have domain knowledge, which makes them complacent and hard to fire. They won't adapt to things like OO or TDD. The developers that do are mostly younger, which means they may not make what we do, which is BS to me. This is just my 2 cents.

talmy
Aug 4, 2012, 07:00 PM
Come on... you are not being fair. Comparing Motorola and Intel chip speed in ADDITION to your experience coding C versus assembly?

Yeah, true. :) But the 68000 team at the time felt that their better architecture (no argument there!) and 100% assembly language code was going to be far faster than mine. After all assembly is superior in performance.

History keeps repeating itself.

VinegarTasters
Aug 4, 2012, 09:16 PM
Yeah, true. :) But the 68000 team at the time felt that their better architecture (no argument there!) and 100% assembly language code was going to be far faster than mine. After all assembly is superior in performance.

History keeps repeating itself.

Yes, you bring up a good point. Experience is important. With experience, you can improve a slow language's algorithm to match a faster language. Take string reversal, for example. If this were implemented in a language like C, someone would normally create a new string array, copy character by character from the opposite end of the original string, then delete or transfer the final result. A better algorithm is to simply keep the original string, index from one end to the middle, and swap characters with the other end (based on the string length). Instead of 3(N) you are doing (N/2): the other two Ns come from copying and transferring back, the last N from iteration. That is already 6x faster, based on this simple bit of experience. And if someone never learns or improves, they will always keep their old slow habits. Like you say, history keeps repeating itself. The problem with higher-level languages is that you can't get at the low level to improve performance. For example, if you keep a count inside a 64-bit integer in main memory and manipulate things there, it can be up to 10 times slower compared to keeping the index inside a register in the CPU itself. With Java and C# it's difficult, because the virtual machine mostly takes up the whole CPU and then provides a fake CPU that you can't really take advantage of.
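A minimal C sketch of the in-place reversal described above (the function name is invented): one pass, swapping from both ends toward the middle, with no second buffer.

#include <stdio.h>
#include <string.h>

/* Reverse a string in place: index from both ends toward the middle,
   swapping as we go, instead of copying into a second buffer. */
static void reverse_in_place(char *s) {
    size_t left = 0;
    size_t right = strlen(s);
    while (right > left + 1) {
        char tmp = s[left];
        s[left] = s[right - 1];
        s[right - 1] = tmp;
        left++;
        right--;
    }
}

int main(void) {
    char word[] = "sensor";
    reverse_in_place(word);
    printf("%s\n", word);   /* prints "rosnes" */
    return 0;
}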

Which comes back to a good point. Why is Objective-C still being used in OS X? It is so slow. Message passing is so slow compared to procedural calls. It is such a problem that in COM they learned to pass by reference via .tlb files rather than pass by value.

balamw
Aug 4, 2012, 10:53 PM
MOD NOTE: Thread has long since left the original topic. Please report this post if you have something to add to that topic.

B