
AdonisSMU

macrumors 604
Oct 23, 2010
7,298
3,047
I learned Python with the MIT course in iTunes U

Oh yeah, and those courses are free, which is nice. They also have an iOS course. However, I am loving this book I have from the Big Nerd Ranch on Obj-C programming for iPhone and iPad.
 

ChrisA

macrumors G5
Jan 5, 2006
12,581
1,697
Redondo Beach, California
...
[1] Here's how you teach Logo:

1. This is a word.

2. This is how you run words.

3. This is how you add your own words.

Compare to the 200-page whargarble you have to wade through just to tour the core of a C/Python/JS/whatever language, and you wonder what mainstream programmers are smoking. Or maybe they just enjoy creating endless complexity and OCD micromanaging make-work for themselves... :p

It depends on WHAT you are writing. Two examples...

(1) Are you a scientist or engineer wanting to reduce and plot some data? You write maybe 100 lines of code at most, get your results, then trash the program.

(2) You are controlling and automating the operation of a nuclear power plant on a submarine. An error can kill lots of people and cost over a billion dollars, and this code will have to be maintained by the next generation of programmers after you retire.

The method you use to develop each of these is different. In the first case you just hack on the code and make changes until it works, then you stop. In the second case you DESIGN it first in rough detail, then present your design and defend it in front of a review committee. When that passes, you do a more detailed design of each of the parts and have that design reviewed. Then you start thinking of ways to TEST the parts, write up a plan, and get that through a review process too. Now you are ready to start writing the sub-parts.

I doubt you'd ever get case #2 to work in Logo. Ada would be my #1 choice for high-stakes embedded work, but the engineer who just wants to see his data would go nuts trying to work in Ada.

I've worked on both types of projects. Most of us work in the area between. Programming languages are mostly concerned with problems that come up during development. On larger projects the most common problems are miscommunication between the various subgroups of programmers and changes made by someone who is unable to see the big-picture effect of the change. On small projects the problem is just writing the code, because the design and function are obvious.

You pick a language and a development method or "process" that fits the kind of software you are writing. There is no "best"; there is only a "best fit".
 

Barney63

macrumors 6502a
Original poster
Jan 9, 2014
799
1
Bolton, UK.
In the UK they are changing the way ICT (Information and Communication Technology) is taught in schools. They are changing it to computer science, I believe.
This will now be based on programming and will be taught from something like Year 5 (9-10 year olds) up to Year 12 (16-17 year olds).
I think the languages that they will be teaching are Logo and Scratch (IIRC).
What are these languages like to learn?

I am currently doing a Maths Degree and I'm considering maybe going into teaching when I graduate, so the languages that they teach at school might be more relevant to me.

Any comments?


Barney
 

Mewtwo

macrumors newbie
Jun 3, 2014
3
0
You're asking the wrong question. You should be thinking about a project you want to begin.

Regardless... For a curious beginner I strongly recommend Java. It's a very versatile language that's easily learned.
 

hiddenmarkov

macrumors 6502a
Mar 12, 2014
685
492
Japan
Oh yeah, and those courses are free, which is nice. They also have an iOS course. However, I am loving this book I have from the Big Nerd Ranch on Obj-C programming for iPhone and iPad.

MIT has a Python-centric class on edX starting tomorrow (today, depending on time zone). It's a MOOC, which means it will have a free and a paid option. We all learn differently; I find I need some forced guidance at first, namely assignment deadlines and such. If only to have a reason to tell the wife I need a study night lol. It also gets me CPEs I need for various things tbh. Self-study is nice, but self-study that counts toward a cert I have is more official for meeting CPE needs for the year, which is usually a nice bene as well.


https://www.edx.org/course/mitx/mitx-6-00-1x-introduction-computer-1841


Class link if interested. I have not done a MOOC from edX, so I don't know how it goes with them. But if you opt for the free audit, no skin off your nose, right?
 

hhas

macrumors regular
Oct 15, 2007
126
0
You're quite confused. Here's how you do most stuff in a good language:

1 - Use the standard tool.
2 - Test.
3 - Debug if necessary.

Welcome to C. You must be new. Enjoy your malloc() and free().

Seriously, go crank open an old school book on actual Computer Science sometime, as opposed to the standard Java diploma mill schmutz that mostly gets pumped out nowadays. The more I (slowly) learn, the more I realize there's bugger all math or science involved in 99% of today's mainstream programming and languages. It's mostly bureaucracy, ideology, and good ole John Wayne cowboyism; just high-functioning Dunning Kruger-ism.

To quote Guy Steele (before he went to the dark side): "The most important concept in all of computer science is abstraction."

Everything else is just the tedious mechanical crap you've gotta wade through on your way to being able to say what you mean. And modern mainstream languages are *fantastically good* at drowning that one simple truth under such infinite barrels of crap. Sorry, but if your only pleasure in life is spelunking code all day, every day, there is something wrong with you as a person. Go write a metacircular evaluator. I'll wait.
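
If "metacircular evaluator" sounds mystical, it isn't. Here's a toy sketch in Python (my own illustration; a true metacircular evaluator is written in the language it evaluates, so treat this as the shape of the exercise, not the real thing):

```python
# A toy Lisp-style evaluator in ~25 lines of Python (illustration only).
# Expressions are nested Python lists, e.g. ['+', 1, ['*', 2, 3]].
import operator

GLOBAL_ENV = {'+': operator.add, '-': operator.sub,
              '*': operator.mul, '<': operator.lt}

def evaluate(expr, env=GLOBAL_ENV):
    if isinstance(expr, str):              # symbol: look it up
        return env[expr]
    if not isinstance(expr, list):         # literal: return as-is
        return expr
    head = expr[0]
    if head == 'if':                       # (if test then else)
        _, test, conseq, alt = expr
        return evaluate(conseq if evaluate(test, env) else alt, env)
    if head == 'lambda':                   # (lambda (params) body)
        _, params, body = expr
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    fn = evaluate(head, env)               # otherwise: apply fn to args
    return fn(*[evaluate(arg, env) for arg in expr[1:]])

# ((lambda (n) (* n n)) 7)  =>  49
print(evaluate([['lambda', ['n'], ['*', 'n', 'n']], 7]))
```

Once you've seen that eval/apply loop, the rest of a big language starts looking like optional extras.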
 

hhas

macrumors regular
Oct 15, 2007
126
0
(2) You are controlling and automating the operation of a nuclear power plant on a submarine. An error can kill lots of people and cost over a billion dollars, and this code will have to be maintained by the next generation of programmers after you retire.

[...]

I doubt you'd ever get case #2 to work in Logo. Ada would be my #1 choice for high-stakes embedded work, but the engineer who just wants to see his data would go nuts trying to work in Ada.

You are confusing language with philosophy. Logo is good in that it teaches you to think and build higher-level abstractions upon higher-level abstractions; the fundamental principle by which we can advance without cognitive limit. Mainstream languages today condition you to twiddle bits; a very comfortable, incestuous mediocrity with rigid upper bounds.

McCarthy's original Lisp wasn't a language, it was a language for writing languages. It's a bona fide epiphany to get your head around that. It's a skill that industry hasn't fostered, unfortunately, because lowest-common-denominator, interchangeable, dirt-cheap units are all it cares about. So you end up with such a ubiquity of dedicatedly mediocre, crippled thinkers [1] that eventually everyone believes that's what programming is actually supposed to be.

If I was writing nuclear sub software, the first thing I'd do would be to build my own language that was tailored to solving that problem domain, and nothing else. The smaller and more precise the language, the fewer opportunities there are to put a foot wrong when using it.
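
As a minimal sketch of that idea (in Python, with an entirely made-up rod/scram vocabulary; no real reactor semantics implied), the point is that the domain "language" exposes only a few checked words, so there are very few ways to hold it wrong:

```python
# Toy sketch of a tiny domain-tailored vocabulary embedded in Python.
# The words (withdraw_rods, scram) and the 5% rule are hypothetical.
class Reactor:
    def __init__(self):
        self.rod_depth = 100            # percent inserted
        self.log = []

    def withdraw_rods(self, percent):
        # The narrow language refuses anything outside its vocabulary.
        if not 0 <= percent <= 5:
            raise ValueError("rods move at most 5% per step")
        self.rod_depth -= percent
        self.log.append(f"rods -> {self.rod_depth}%")

    def scram(self):
        self.rod_depth = 100            # full insertion, immediately
        self.log.append("SCRAM")

r = Reactor()
r.withdraw_rods(3)
r.scram()
print(r.log)                            # ['rods -> 97%', 'SCRAM']
```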

But then, of course, you'll get some derpy bureaucrat or religious developer come along later and insist on redoing everything in C++, because that's "the standard". Well, of course that's the standard: when nobody thinks or tries to do any better, nobody knows any different. Ignorance is bliss.

--

[1] e.g. Take the Web Industry. Please.


“Perfection is achieved, not when there is nothing more to add, but when there is nothing left to take away.”
― Antoine de Saint-Exupéry, Airman's Odyssey
 

neutrino23

macrumors 68000
Feb 14, 2003
1,881
391
SF Bay area
...
This is true, except in one field where AppleScript kicks the absolute tar out of every other supported option: controlling scriptable (i.e. Apple event-aware) applications. For a company that invented and owns both AppleScript and Apple event technologies, they're really rather rubbish at implementing and supporting it themselves. ...

AppleScript got amazingly faster in Mavericks, at least for the scripts I run.

AppleScript has been enormously helpful for me over the years for just the end use you mention. It lets me talk to different applications to automate things that would be tedious to do by hand.
 

hhas

macrumors regular
Oct 15, 2007
126
0
AppleScript got amazingly faster in Mavericks, at least for the scripts I run.

Interesting. Mind you, even just fixing AppleScript's notorious list performance (it must be the only language where array access is O(N), not O(1)) would make a huge observable difference to many users. (You can hack around it by mucking around with script object properties, but that's unpleasant and completely unfair on users.)
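
To make the cost concrete, here's a toy Python model of the effect (my own illustration; AppleScript's actual internals differ): if reading item i means walking i cells from the front, a loop that reads every item does about N²/2 steps instead of N.

```python
# Why O(N) item access turns a simple loop into O(N^2) work:
# simulate a list where reaching item i requires walking i links.
WALKED = 0

def item_at(linked, i):
    global WALKED
    node = linked
    for _ in range(i):                # walk i links to reach item i
        node = node[1]
        WALKED += 1
    return node[0]

n = 2000
linked = None
for v in range(n - 1, -1, -1):        # build a (value, rest) chain
    linked = (v, linked)

for i in range(n):                    # read every item, front to back
    item_at(linked, i)                # each read re-walks from the front
print(WALKED)                         # n*(n-1)/2 = 1999000: quadratic
```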

AppleScript has been enormously helpful for me over the years for just the end use you mention. It lets me talk to different applications to automate things that would be tedious to do by hand.

Yep. There's actually a really simple lesson in this: for ordinary users, it's improving the *small simple stuff* that makes the difference to their lives. Programmers far prefer constructing vast complex dramas over mere 'trivial toys' (been there myself), but such AWESOME UNLIMITED POWER is utterly useless and disempowering when completely inaccessible to everyone else.

Plucking small, undramatic, low-hanging fruit might seem dreadfully boring to the geek elite, but can provide huge enlightenment and empowerment to the 98% who do all the other work in the world. (Though, of course, once the masses get sufficiently inspired, the geeks might have to watch their backs...;)
 

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,561
6,059
Welcome to C. You must be new. Enjoy your malloc() and free().

I'm confused by this statement. I have ten years of experience with C, so I certainly don't think most people would consider me new. Are you trying to suggest that C is a good language or not?

Seriously, go crank open an old school book on actual Computer Science sometime, as opposed to the standard Java diploma mill schmutz that mostly gets pumped out nowadays.

I don't bother with books, and I find the very existence of books on programming to be ironic. The early internet was dominated by people teaching each other programming (and other things)... it was made so that people wouldn't need to learn from books anymore, but could learn from the internet instead.

The more I (slowly) learn, the more I realize there's bugger all math or science involved in 99% of today's mainstream programming and languages. It's mostly bureaucracy, ideology, and good ole John Wayne cowboyism; just high-functioning Dunning Kruger-ism.

I don't understand any of what you just said.

To quote Guy Steele (before he went to the dark side): "The most important concept in all of computer science is abstraction."

Absolutely - I very much agree with that.

Everything else is just the tedious mechanical crap you've gotta wade through on your way to being able to say what you mean. And modern mainstream languages are *fantastically good* at drowning that one simple truth under such infinite barrels of crap.

How so?

Sorry, but if your only pleasure in life is spelunking code all day, every day, there is something wrong with you as a person. Go write a metacircular evaluator. I'll wait.

I have written partial C evaluators in C before... but such things already exist. Why would I make one? If your only pleasure in life is ignoring the contributions already made by other people, just so you can reimplement what they already did, then... well, knock yourself out, I guess. The rest of us can keep on building better frameworks, tools, and languages (each with more abstraction than the prior one it's built on) until eventually everyone will be able to program. And of course, once everyone can program, all of the problems we have right now can hopefully be solved (i.e., doctors can write programs to help them be better doctors, without bothering to learn about memory management in C).
 

hhas

macrumors regular
Oct 15, 2007
126
0
In the UK they are changing the way ICT (Information and Communication Technology) is taught in schools.

With Gove in charge. Papert help us all.


This will now be based on programming and will be taught from something like Year 5 (9-10 year olds) up to Year 12 (16-17 year olds).
I think the languages that they will be teaching are Logo and Scratch (IIRC).
What are these languages like to learn?

Regardless of anything else, go read Papert's Mindstorms tomorrow. (I wouldn't bother with The Children's Machine; it's far less useful. It could've been a valuable postmortem of how to botch getting your message across to the Goves of this world who ultimately choose what gets inflicted on the generations of tomorrow, but it's soured with way too much postmodernish handwavium and incipient b**thurt.)

Also from my list of useful bookmarks: Hack Education; at least some folks in the field seem to have their heads somewhat screwed on. (In contrast to, say, this [1].)

I'd be surprised if Logo were being used. Logo environments tend to be very dated-looking and limited in libraries, which won't engage kids raised on Angry Birds. And the language is deeply unfashionable amongst an industry that values cheap rote key-mashing far above any fundamental thinking or awareness. More likely the younger ones will get Scratch and the older ones something like JavaScript or Java.

FWIW, there was a time when I thought the Scratch approach was the way to go, since it provided a similar level of transparency and safety over text-based languages that the GUI provided over the CLI. Now, though, I'm convinced Scratch has completely missed the target and does even more damage instead. Scratch proudly highlights and gamifies all the bits of programming that are evil - i.e. banging the rocks together - while completely obfuscating and downplaying both the concepts and capabilities required for abstraction building. Defining new 'words' is the only key concept that matters; not types, not variables, not flow control, not syntax, or anything else that only serves to hide that one simple truth under a mountain of trivial detail.
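
For a taste of what those 'words' look like when they're front and centre, here's the Logo idea transplanted into Python's built-in turtle module, a direct descendant of Logo's turtle graphics (the word names are my own):

```python
# Logo's "define your own words" idea via Python's standard turtle module.
import turtle

def square(size):                 # a new word built from two primitives
    for _ in range(4):
        turtle.forward(size)
        turtle.left(90)

def flower(petals, size):         # a bigger word built from a smaller one
    for _ in range(petals):
        square(size)
        turtle.left(360 / petals)

flower(12, 80)                    # abstraction stacked on abstraction
turtle.done()
```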


I am currently doing a Maths Degree and I'm considering maybe going into teaching when I graduate, so the languages that they teach at school might be more relevant to me.

Learning languages != learning to program. Forget the platform; the platform's not important, it's only there as something upon which to learn and practice some particular subset of programming concepts and techniques. Languages flit past like leaves in the wind; fundamentals are forever.

If you want to learn about programming and pedagogy, learn lots of different languages with lots of different methodologies. Small, tight languages that follow a single clearly defined idiom are infinitely better than kitchen sink monstrosities like C++ that only bureaucrats and Trekkies can love. e.g. Don't believe anyone who says you can learn "functional" programming in Python or JavaScript, because you can't; having first-class functions doesn't make a language functional: modeling relationships between inputs and outputs (and thus being unconcerned by the order in which calculations might be performed) does.
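
A quick Python contrast (my own toy example) of why first-class functions alone don't get you there:

```python
# Two ways to total an order. Both use functions; only the second
# models a pure input-to-output relationship with no mutation.
PRICES = {'tea': 3.0, 'scone': 2.5}

def total_imperative(basket):
    total = 0.0
    for item in basket:           # step-by-step mutation; order matters
        total += PRICES[item]
    return total

def total_functional(basket):
    # A relationship between input and output; the per-item results
    # could be computed in any order without changing the answer.
    return sum(PRICES[item] for item in basket)

basket = ['tea', 'scone', 'tea']
assert total_imperative(basket) == total_functional(basket) == 8.5
```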

Heck, I'd actually suggest looking for some old-school CS material which teaches you by having you write your own Scheme-like interpreter. I don't have any links to hand, unfortunately, but writing my own Scheme-Bash-AppleScript-etc-ish language for the first time did a lot to recalibrate my own perspective.
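
The flavour of the exercise is easy to sketch, though. The 'reader' half of such an interpreter - turning source text into nested lists that an eval/apply loop can walk - fits in a dozen lines of Python (again, my own illustration, not from any particular course):

```python
# The "reader" half of a toy Scheme-like interpreter (illustration only).
def tokenize(src):
    return src.replace('(', ' ( ').replace(')', ' ) ').split()

def parse(tokens):
    token = tokens.pop(0)
    if token == '(':
        expr = []
        while tokens[0] != ')':
            expr.append(parse(tokens))   # recurse on sub-expressions
        tokens.pop(0)                    # discard the closing ')'
        return expr
    try:
        return int(token)                # number literal
    except ValueError:
        return token                     # bare symbol

print(parse(tokenize("(* (+ 1 2) 7)")))  # ['*', ['+', 1, 2], 7]
```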

And if you're seriously considering teaching, for heaven's sake go speak to some teachers, both school and university level. Crusty old ones who've seen fashions come and go and who can wax all day on the history of computing and education may be particularly useful to pursue, bringing a measured perspective and cynicism that young, starry-eyed teacher-training kids won't yet have had time to grow.

--

[1] Apparently, not only does masterminding UK computing education require zero pedagogical skills, but no computer knowledge either. (More info)
 

Barney63

macrumors 6502a
Original poster
Jan 9, 2014
799
1
Bolton, UK.
@hiddenmarkov

I signed up with the edX Python course as suggested, it looks quite interesting.

@hhas

I made a typo with Logo; it is Lego (Mindstorms).



Barney
 

hiddenmarkov

macrumors 6502a
Mar 12, 2014
685
492
Japan
@hiddenmarkov

I signed up with the edX Python course as suggested, it looks quite interesting.

@hhas

I made a typo with Logo; it is Lego (Mindstorms).



Barney

Hope it works out for you. I am liking its approach so far. My angle is to learn Python (did C, Java, and Perl many moons ago; R in another MOOC is on the plate now as well) in a more controlled environment, so the other stuff is a nice refresher, and some items on the syllabus will be new material.


Well, that and CPEs for a cert I have; reading a book more independently is harder to push for those, tbh. This way I get my learning... and CPEs that will pass audit (if audited) a bit more easily.
 

hiddenmarkov

macrumors 6502a
Mar 12, 2014
685
492
Japan
What are CPE's?


Barney

Continuing professional education credits. Some IT certificates expire if you don't show a record of continued learning. I have one of these certs. As the cert exam was not on my top 10 list of ways to spend a few hours on a day off (plus the prep time at night to get to that test day)... I keep collecting the CPEs to meet renewal requirements so I don't have to retake the test.
 

Barney63

macrumors 6502a
Original poster
Jan 9, 2014
799
1
Bolton, UK.
In the UK we have CPD (continuing professional development). It can be verified (with a certificate) or non-verified (without). My wife is a Dental Nurse and has to do something like 15 hours verified and 30 hours non-verified every year. The CPD has to be relevant to Dentistry.


Barney
 