
ChristianVirtual

macrumors 601
May 10, 2010
4,122
282
日本
Me too in the C corner with pure C, C++ and Objective-C. Flexible and fast.

Sometimes some Python and JavaScript, if I need something quick.
Quite some ABAP in the past, too. Because I got paid for it. And liked it.

And I learned to dislike Java because of all changes they had in early years. Just never found my way back to it.
 

ArtOfWarfare

macrumors G3
Nov 26, 2007
9,571
6,079
Hi,

if you are looking for a minimalistic language, you should try Brain**** (http://en.wikipedia.org/wiki/Brain****).

Have fun
Peter

Moo and Whitespace are both worse than BF.

They both take BF, but the 8 operations are represented by 3 characters instead of one. In Moo, the 8 operations are: moo, moO, mOo, mOO, Moo, MoO, MOo, and MOO. Same idea in Whitespace, but instead of capital letters use tabs, and instead of lowercase letters use spaces.
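For the curious, all eight of BF's operations fit in a screenful of interpreter. A rough Python sketch (the 30,000-cell tape is the conventional size, not a requirement of the language):

```python
def bf(program, inp=""):
    """Minimal Brainfuck interpreter: 8 ops, a tape of byte cells, one pointer."""
    tape, ptr, pc, out = [0] * 30000, 0, 0, []
    inp = list(inp)
    # Pre-match brackets so '[' and ']' can jump in one step.
    stack, jump = [], {}
    for i, c in enumerate(program):
        if c == "[":
            stack.append(i)
        elif c == "]":
            j = stack.pop()
            jump[i], jump[j] = j, i
    while pc < len(program):
        c = program[pc]
        if c == ">": ptr += 1
        elif c == "<": ptr -= 1
        elif c == "+": tape[ptr] = (tape[ptr] + 1) % 256
        elif c == "-": tape[ptr] = (tape[ptr] - 1) % 256
        elif c == ".": out.append(chr(tape[ptr]))
        elif c == ",": tape[ptr] = ord(inp.pop(0)) if inp else 0
        elif c == "[" and tape[ptr] == 0: pc = jump[pc]
        elif c == "]" and tape[ptr] != 0: pc = jump[pc]
        pc += 1
    return "".join(out)

# "++++++++[>++++++++<-]>+." computes 8*8+1 = 65 and prints 'A'.
print(bf("++++++++[>++++++++<-]>+."))  # → A
```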
 

Crocodoc

macrumors member
Sep 15, 2014
58
0
Croc Island
You missed something. If you're making games, for any platform, Unity is the way to go.

As someone who has worked professionally with Unity: run far away, in the opposite direction. Preferably into the arms of a better engine without a gimped runtime, with actual source-control support, and with a debugger that works without the blessings of your cat.

And I learned to dislike Java because of all changes they had in early years. Just never found my way back to it.

The Java replacements are gaining popularity; if you ever do JVM work, you can at least avoid Java itself.
 
Last edited:

MkVsTheWorld

macrumors regular
Jan 20, 2010
106
0
Baltimore
C# and VB.NET are my preferred languages because that's what I've had to develop on. Lately, I've been inching towards C#, just because it's ideal in my job and we're 99% Microsoft.
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
Moo and Whitespace are both worse than BF.

They both take BF, but the 8 operations are represented by 3 characters instead of one. In Moo, the 8 operations are: moo, moO, mOo, mOO, Moo, MoO, MOo, and MOO. Same idea in Whitespace, but instead of capital letters use tabs, and instead of lowercase letters use spaces.

Yes there are tiny and tinier obscure languages designed for academic and bragging purposes. At the limit, you punch holes in a Universal Turing machine tape.

But a few years ago, a large number of people actually coded real applications using Tiny Basic and tiny Forth programming language systems, which often only had around a couple dozen statements+operators available, because those were among the largest non-assembly languages that would fit in available memory (back when 4k cost a lot of money).
 

Crocodoc

macrumors member
Sep 15, 2014
58
0
Croc Island
Yes there are tiny and tinier obscure languages designed for academic and bragging purposes. At the limit, you punch holes in a Universal Turing machine tape.

But a few years ago, a large number of people actually coded real applications using Tiny Basic and tiny Forth programming language systems, which often only had around a couple dozen statements+operators available, because those were among the largest non-assembly languages that would fit in available memory (back when 4k cost a lot of money).

Fortran Punch cards, an era of computing I will never have to endure.
 

talmy

macrumors 601
Oct 26, 2009
4,727
337
Oregon
But a few years ago, a large number of people actually coded real applications using Tiny Basic and tiny Forth programming language systems, which often only had around a couple dozen statements+operators available, because those were among the largest non-assembly languages that would fit in available memory (back when 4k cost a lot of money).

I was one of those people. I started using Forth in 1981 and wrote an integrated circuit layout program that ran on a TRS-80 when the "competition" was $100k dedicated workstations. I also wrote (and sold) a Forth implementation for machine control and a Forth compiler (traditionally Forth was interpreted). The Forth I wrote was multitasking and I even had a three user timesharing Forth which I demonstrated at the time. A major draw of the language (besides the small size and surprisingly good performance) was its extensibility - you could add new control structures and data types (like floating point) interactively.
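A toy illustration of that extensibility: a Forth-ish evaluator needs little more than a stack and a dictionary of words, and a colon definition simply adds a dictionary entry. A rough Python sketch (the word set here is a tiny, illustrative subset, nothing like a real Forth):

```python
stack = []
words = {
    "+":   lambda: stack.append(stack.pop() + stack.pop()),
    "*":   lambda: stack.append(stack.pop() * stack.pop()),
    "dup": lambda: stack.append(stack[-1]),
    ".":   lambda: print(stack.pop()),
}

def run(source):
    """Evaluate whitespace-separated words; ': name ... ;' extends the dictionary."""
    toks = iter(source.split())
    for tok in toks:
        if tok == ":":                          # colon definition, Forth-style
            name = next(toks)
            body = []
            for t in toks:
                if t == ";":
                    break
                body.append(t)
            words[name] = lambda b=" ".join(body): run(b)
        elif tok in words:
            words[tok]()
        else:
            stack.append(int(tok))              # everything else is a number literal

run(": square dup * ;")   # extend the language with a new word...
run("7 square .")         # ...then use it like a built-in: prints 49
```

The language grows itself: `square` is indistinguishable from the primitives once defined, which is the whole trick.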
 

hhas

macrumors regular
Oct 15, 2007
126
0
A major draw of [Forth] (besides the small size and surprisingly good performance) was its extensibility - you could add new control structures and data types (like floating point) interactively.

Ditto Lisp. IMO, it really helps to think of McCarthy's original Lisp and Moore's Forth as meta-languages; that is, languages for writing languages. It wasn't until I started designing my own languages that I realized just how horrendously bloated, crippled, and just plain moronic today's popular mainstream languages really are. Computing has such an incredibly rich, fascinating history of thought and creativity behind it, right up until somewhere in the 1980s, at which point it seems to go completely to sh*te for some reason. As an embarrassingly late starter myself, I absolutely despair for all the clever young programmers of today, all bright-eyed and bushy-tailed with Absolutely No Bloody Clue About Any Of It.

All the stuff we're taught to believe are "essential fundamental features" of programming like statements and expressions and conditionals and loops and swarms of special forms and all the endless, endless, bloody punctuation is just a giant bag of sh*te elevated to absolute untouchable religion by a bunch of witless cargo cultists and micromanaging OCD martinets. Man, how pissed I was upon this realization, after years and years of struggling to memorize and master all that crap. Turns out, all you truly need for a language is a couple basic data types, an eval loop, an environment to capture runtime state (if necessary), and some sort of mechanism for composing existing behaviors to form new ones. Most of all that last one, which Computer Science likes to call "abstraction" but which should really be called "The Absolutely Bloody Obvious which anyone can see just as soon as all that aforementioned sh*te is stripped away, killed and burned".
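That minimal recipe -- a couple of data types, an eval loop, an environment, and a way to compose behaviors -- is roughly McCarthy's. A deliberately stripped-down sketch in Python (not a real Lisp, just the shape of one):

```python
def parse(src):
    """Turn '(+ 1 (* 2 3))' into nested lists: ['+', 1, ['*', 2, 3]]."""
    tokens = src.replace("(", " ( ").replace(")", " ) ").split()
    def read(toks):
        tok = toks.pop(0)
        if tok == "(":
            lst = []
            while toks[0] != ")":
                lst.append(read(toks))
            toks.pop(0)
            return lst
        try:
            return int(tok)
        except ValueError:
            return tok
    return read(tokens)

def evaluate(x, env):
    """The whole eval loop: symbols look up, lists apply, lambdas compose."""
    if isinstance(x, str):
        return env[x]
    if isinstance(x, int):
        return x
    if x[0] == "if":
        _, test, yes, no = x
        return evaluate(yes if evaluate(test, env) else no, env)
    if x[0] == "lambda":                      # composition: build new behavior
        _, params, body = x
        return lambda *args: evaluate(body, {**env, **dict(zip(params, args))})
    fn, *args = [evaluate(e, env) for e in x]
    return fn(*args)

env = {"+": lambda a, b: a + b, "*": lambda a, b: a * b}
print(evaluate(parse("((lambda (x) (* x x)) (+ 2 5))"), env))  # → 49
```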


IMIHO, any "Computer Science" course that currently teaches C/C++, Java, Python, or any other modern Algol descendent should be stripped of that name forthwith, retitled "Software Engineering" (or, ideally, "Code Monkey Training School"), and cease pretending that they're anything other than cheap-n-cheerful suppliers of industry chair-warmers. (Which, y'know, is absolutely fine for those who aspire to be exactly that.)

The remaining "Genuine™ Computer Science" departments can then use, say, a Lisp-Forth hybrid like Logo to teach their students how to think and problem-solve and grow their own vocabularies in which to express themselves, their interests, problems, and solutions, ever more efficiently and effectively. It's the difference between being a tool user and a tool creator. Teach a child to use other people's words, and they'll say "Hello World" for a day. Teach a child to construct their own words, and they'll be building giant killer robots for life. And isn't that what we all truly yearn the most?

--

Q. What's the difference between the modern Programmer and Stone Age Man?

A. One recursively builds a rich, powerful collection of sophisticated self-empowering tools, while the programmer just keeps banging the rocks together.
 
Last edited:

chown33

Moderator
Staff member
Aug 9, 2009
10,780
8,502
A sea of green
Ditto Lisp. IMO, it really helps to think of McCarthy's original Lisp and Moore's Forth as meta-languages; ...

One reason Forth is the way it is is simple practical portability. Moore wanted to take his tools with him to new jobs. That often meant a completely different computer architecture. So the ability to define a small number of primitives (add, drop, print a character, etc.) and then have the rest of the toolkit be identical was a huge benefit.

That's also why traditional Forth uses blocks/screens: they're easy to map onto plain numbered disk sectors, onto seek positions in a file, or even onto blocks on a mag tape.

In other words, Forth is Forth thanks to ruthless simplification. It eschews practically everything except the ability to define new things in terms of existing things, and it starts from a stunningly small collection of primitive things.

Someone once asked me for Forth's syntax rules. I replied it was nothing more than "word, word, word" or "do this thing, do the next thing, do the next thing". In other words, a Forth program was a huge "to do" list.
 

jjhoekstra

macrumors regular
Apr 23, 2009
206
29
iForth! I love the upside-down approach of Forth, where you first create the building blocks and then use them to build your solution. I like using it so much that I now and then just take a problem and program a solution for it, just for fun.
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
The remaining "Genuine™ Computer Science" departments can then use, say, a Lisp-Forth hybrid like Logo to teach their students how to think and problem-solve and grow their own vocabularies in which to express themselves, their interests, problems, and solutions, ever more efficiently and effectively. It's the difference between being a tool user and a tool creator.

I would go further and require learning to use something like Knuth's MIX assembly language (which isn't too far from arm64), and learn that computation is really just about state machines, with some arithmetic and memory attached for ease of vocabulary growth. (Personally, I find the equivalent Turing machines a little too sparse to actually do much coding on.) One reason I like to code in C is that I can almost imagine the machine code it spits out (before optimization) and how much physical hardware logic and bits of RAM it might consume.
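The state-machine view can be made concrete in a few lines: a deterministic finite automaton is nothing but a transition table and a loop. A small Python sketch (the even-parity machine below is an illustrative example of my own, not from any particular course):

```python
def run_dfa(transitions, start, accepting, inp):
    """Drive a DFA: the entire 'computation' is repeated table lookup."""
    state = start
    for symbol in inp:
        state = transitions[(state, symbol)]
    return state in accepting

# Hypothetical machine: two states tracking the parity of 1-bits seen so far.
even_ones = {
    ("even", "0"): "even", ("even", "1"): "odd",
    ("odd",  "0"): "odd",  ("odd",  "1"): "even",
}
print(run_dfa(even_ones, "even", {"even"}, "1011"))  # → False (three 1s)
print(run_dfa(even_ones, "even", {"even"}, "1001"))  # → True  (two 1s)
```

Everything else -- arithmetic, memory, vocabulary -- is layered on top of exactly this kind of table-driven stepping.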

Per this forum, 6502 and 1802 (the basis for Sweet-16) assembly language might also be appropriate. They were good enough for Woz to invent Apple, and get the whole ball rolling.
 

Crocodoc

macrumors member
Sep 15, 2014
58
0
Croc Island
iForth! I love the upside down approach of Forth, where you first create the building blocks and than use them to build your solution. I like using it so much that I now and than just take a problem and program a solution for it, just for fun.

You'd probably like Monadic Programming with Combinators. It's the modern equivalent of such a style.
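A taste of that style: tiny parsers composed bottom-up into bigger ones, much like Forth words. A rough Python sketch (the function names are my own, not from any particular library):

```python
# Each parser is a function: input -> (value, rest) on success, None on failure.
# Combinators build bigger parsers out of smaller ones.

def char(c):
    """Parser that matches exactly one given character."""
    return lambda s: (c, s[1:]) if s[:1] == c else None

def seq(p, q):
    """Run p then q, pairing their results."""
    def parser(s):
        r1 = p(s)
        if r1 is None:
            return None
        v1, rest = r1
        r2 = q(rest)
        if r2 is None:
            return None
        v2, rest2 = r2
        return ((v1, v2), rest2)
    return parser

def alt(p, q):
    """Try p; on failure, fall back to q."""
    return lambda s: p(s) or q(s)

ab = seq(char("a"), char("b"))
a_or_b = alt(char("a"), char("b"))
print(ab("abc"))      # → (('a', 'b'), 'c')
print(a_or_b("bxy"))  # → ('b', 'xy')
```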

I would go further and require learning to use something like Knuth's MIX assembly language (which isn't too far from arm64)

At the university I went to, we were required to program a multitasking kernel for a custom MIPS processor (with the branch delay removed) for a second-year paper. It was actually super useful as a learning tool.

IMIHO, any "Computer Science" course that currently teaches C/C++, Java, Python, or any other modern Algol descendent should be stripped of that name forthwith, retitled "Software Engineering" (or, ideally, "Code Monkey Training School"), and cease pretending that they're anything other than cheap-n-cheerful suppliers of industry chair-warmers. (Which, y'know, is absolutely fine for those who aspire to be exactly that.)

Don't ruin your points with a nice ol' No True Scotsman. Argue for universities to teach better CS history rather than trying to justify some BS about how some unis "don't teach real CS" because they use a particular teaching tool.
 
Last edited:

janil

macrumors member
Nov 10, 2006
61
16
IMIHO, any "Computer Science" course that currently teaches C/C++, Java, Python, or any other modern Algol descendent should be stripped of that name forthwith, retitled "Software Engineering" (or, ideally, "Code Monkey Training School"), and cease pretending that they're anything other than cheap-n-cheerful suppliers of industry chair-warmers. (Which, y'know, is absolutely fine for those who aspire to be exactly that.)

So what language would you recommend for a class in operating systems? Isn't there some value in seeing how C developed and why it developed? I suppose you could develop an OS using a Lisp Machine.

I think there's value to studying all sorts of languages... assembly, C, C++, Pascal, Smalltalk, Lisp/Scheme, Erlang, Haskell, Scala, Prolog, and more.

A good computer science program will introduce you to a wide variety of these languages and work more on how to think deeply about problems. It won't focus primarily on learning the preferred tool of the day... those go in and out of style on a regular basis.
 

Stella

macrumors G3
Apr 21, 2003
8,854
6,361
Canada
I enjoy Objective-C - it's a bit of a PITA and long-winded, but the style makes it easier to write readable code than in other languages.

Python is a great language - concise and backed with good 3rd party libraries and community support. A lot of third party libraries tend to be well written and good to learn from.

Overall, I like to use the language that is most applicable for the task. Even if that includes Java (which is OK, but down my list).
 

hhas

macrumors regular
Oct 15, 2007
126
0
Let me say that this was my most enjoyable read in a long time! I've also been a big Lisp user. You might find my XLISP-PLUS page interesting.

I'll certainly have a squizz, though TBH I've never used Lisp or Forth for any real work; I've just played a bit while reading up on them in order to wrap my head around the key concepts, which is what I'm really interested in.

My own language, kiwi, draws ideas and inspiration from all over the place - including Lisp, Bash, AppleScript, Perl, Tcl, Io, Eiffel, imperative, functional, and dataflow idioms, and even nasty old crap like Sys V init - while aligning philosophically with good old undervalued Logo.

It's somewhat domain-specific, as it and its predecessor evolved to fit a very particular problem space (Adobe artwork automation), but if you're into language design then you might have fun figuring out what I've pinched from where. :) Semi-completed documentation is here.

If you have a copy of Adobe Illustrator handy then there's also a demo installer that'll let you play with it on your own machine (caveat emptor, E&OE, etc). You won't be able to do much with it beyond play with the tutorial as it's still early days. Right now my focus is on getting it into production use with local clients, though eventually I'd like to sell a shrink-wrap version as well. Even so, I think there are already some neat ideas that might be applicable in general-purpose scripting/end-user programming languages, though maybe that's a job for another day... <g>
 

iSee

macrumors 68040
Oct 25, 2004
3,539
272
I guess I'll weigh in on the topic:

I don't prefer any language, because it doesn't matter. That's a bit of a downer way to address it, but let me explain:

Languages don't really matter because, while they can help make easy problems easier -- or harder if you choose unwisely -- they don't make the hard problems easier.

I should be clear that I'm following a particular aesthetic here: that achieving the result originally desired is the best thing. (This aesthetic does not appeal to everyone... for some an enriching journey is most important... some love unexpected results the most... I like these things, but for me the best thing is to conceive of a result and then make it happen in reality.)

The hard problem most directly associated with software development is design: choosing the right levels of abstraction, of components and how they should talk to one another. That is, optimizing the choice of which parts to build and the terms in which they communicate with each other... all in order to effectively achieve the desired result in the face of the various external constraints.

And that leads to a lot of other kinds of problems that are not within the purview of computer languages at all -- office politics, interpersonal communication, inter-group politics & communication, funding, licensing terms, resource availability, market viability, etc.)

I do greatly enjoy playing with different languages, and exploring the concepts behind and promoted by the different ones. But it's like candy -- mind candy -- so sweet, but of no real depth or consequence. Lisp gives you, perhaps, the most power to define your own domain language, but doesn't help you decide what the best domain language is. To me it's like this: every day I see people -- ostensibly very smart people -- *badly* screw up understanding even the terms in which they could solve their most pressing problems. (Some fail even to understand what their most pressing problems are, but I guess those aren't the very smart ones.) While they are focusing on C++ vs. C# vs. JavaScript, or package management, or their source-control vendor, their project is collapsing under the pressure of an out-of-control proliferation of dependencies, or being sabotaged by political infighting, or failing for various other reasons.

So... within the small confines in which language matters... I guess I'll say I prefer Swift because it's the language I'm learning at the moment.

(BTW, it's a great age we live in, where you can spend any amount of money on absolutely worthless crap and at the same time get a Stanford course for *free*, taught by Paul Hegarty (well, free after an internet connection and PC). Just incredible.)

...My own language, kiwi...
Checking it out... I hope I don't come across as too skeptical. I *love* this stuff.
 

hhas

macrumors regular
Oct 15, 2007
126
0
I would go further and require learning to use something like Knuth's MIX assembly language (which isn't too far from arm64), and learn that computation is really just about state machines, with some arithmetic and memory attached for ease of vocabulary growth.

Not as a first language. The whole point of abstraction is to lift us as far up and away from all that primitivist rock-banging as possible. Learning to express themselves effectively (which a high level language enables) should be students' first priority. Once they can do that, they can go back and fill in the building blocks beneath. But if they start at the bottom they'll most likely remain at the bottom, because they cannot envision what might exist above it.
 

hhas

macrumors regular
Oct 15, 2007
126
0
So what language would you recommend for a class in operating systems? Isn't there some value in seeing how C developed and why it developed?

Don't get me wrong: I think C and Unix should be studied, but as much for what their authors screwed up as for what they got right. Unix represents both the best and worst of hacker culture: a working system whipped together in little time and with even fewer resources while the Real Development Project next door (Multics) was burying itself beneath its own impossible weight -- a system which then promptly locked in all of its hacks, shortcuts, and plain bad decisions to plague the world for the next fifty years. C and Unix may be many things, but correctable and evolvable ain't amongst them. So yes, study them, but with a view to figuring out how to fix these larger failures (which are not only technical in nature, but logistical as well). That's how you really learn: steal the good stuff, dissect the bad.

Meantime, I'd suggest looking to other OSes like Oberon, L4, Minix, etc (and there's probably plenty even more exotic ones out there well worth chasing down and studying). And even Plan9, just to see how that learned from Unix's mistakes and approached the task of redoing it right.

Oh, and the other thing they'd do well to study is the logistics of turning an OS (or any project, for that matter) from academic curiosity to real-world product, because ultimately it doesn't matter if you write the greatest code on Earth if you can't get anyone to use it.

I suppose you could develop an OS using a Lisp Machine.

Already done. :)

I would definitely pick the highest-level language I could find. The only reason C gets picked is because C compilers have already been ported to every architecture under the sun, and because programmers like to pretend they're macho *******s by juggling live grenades. But really, if you're writing a kernel then you should be using the safest language you can (since any mistakes at that level really screw everyone's day), so even if it was just something like Cyclone then that'd be an improvement.

And I can't help wondering to what extent a robust declarative language such as ML might be applicable to OS development. At kernel level I suspect the Big Bad (i.e. state) is something that can't be entirely eliminated, only contained, but anything that makes behavior provably safer and more predictable has to be worth investigation. I'm aware of at least one experiment in this area, House, though being a bear of very little brain it's not something I'm up on myself.

Heck, even Forth might be worth a go: fast, simple, and with an established history already. Perhaps students could even build their own platform to run it (tip: Burroughs mainframes were wonderful stack machines); with modern technology like FPGAs and CAD/CAM at their disposal, they wouldn't need to spend weeks hand-wrapping cheese boards so could get on with the stuff that's actually of value. (Yay, automation!)

I think there's value to studying all sorts of languages... assembly, C, C++, Pascal, Smalltalk, Lisp/Scheme, Erlang, Haskell, Scala, Prolog, and more. A good computer science program will introduce you to a wide variety of these languages and work more on how to think deeply about problems. It won't focus primarily on learning the preferred tool of the day... those go in and out of style on a regular basis.

Yes, it's the idioms, not the tools, that's the important thing. Languages are the tool by which we express ourselves. Mastering many different idioms - imperative, functional, relational, logic, dataflow, concatenative, concurrent, and so on - vastly expands and enriches our abilities, allowing us to adopt whichever one can best express a solution to the problem at hand. Alas, programming seems to be the one craft whose members frequently place far more pride in their ability to whang a hammer at absolutely everything than in carrying a full toolbox.
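One trivial problem written several ways makes the point. Here, summing the squares of the even numbers, sketched three times over in Python:

```python
nums = [1, 2, 3, 4, 5, 6]

# Imperative idiom: explicit state, mutated step by step.
total = 0
for n in nums:
    if n % 2 == 0:
        total += n * n

# Functional idiom: composition via reduce over a filtered stream, no mutation.
from functools import reduce
total_fn = reduce(lambda acc, n: acc + n * n,
                  (n for n in nums if n % 2 == 0), 0)

# Declarative-leaning idiom: say what, not how, with a comprehension.
total_comp = sum(n * n for n in nums if n % 2 == 0)

print(total, total_fn, total_comp)  # → 56 56 56
```

Same answer every time; what changes is how naturally each form scales to the problem at hand.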

There's such an incredible wealth of ingenuity and insight in the declarative programming world, I can't believe so many of today's programmers would be so thoroughly, determinedly, uninterested in anything outside of their tiny self-imposed cardboard box. They're supposed to be these great free-thinking visionaries and fearless techno-pioneers, yet many are such reactionary conservatives, so utterly averse to challenge or change, they make the John Birchers look like a bunch of screaming queens.

It's a paradox... but hey, more opportunities for the rest of us... ;)
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
Major bikeshedding thread..

Not as a first language. The whole point of abstraction is to lift us as far up and away from all that primitivist rock-banging as possible.

Isn't the point to choose the right level of abstraction, use the right tool for the job, etc.?

Learning to express themselves effectively (which a high level language enables) should be students' first priority. Once they can do that, they can go back and fill in the building blocks beneath. But if they start at the bottom they'll most likely remain at the bottom, because they cannot envision what might exist above it.

This can easily be turned on its head: if they start at the top, they'll most likely remain at the top, because they cannot envision what might exist below it.
 

hhas

macrumors regular
Oct 15, 2007
126
0
Language doesn't really matter because, while they can help make easy problems easier -- or harder if you choose unwisely -- they don't make the hard problems easier.

One language does: English*. As in "Talk to your bloody users!"

It's remarkable how much less painful everyone's lives can be when programmers actually learn the problem domain for which they are ostensibly constructing a solution. :p

--

* (Or whatever your regional equivalent is, of course.)
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
But if they start at the bottom they'll most likely remain at the bottom, because they cannot envision what might exist above it.

It's actually harder to go in the opposite direction for most coders, and envision what goes underneath.

A lot of famous computer scientists, OS programmers and high level language designers started out working with lots of assembly language. Whereas many high level programmers often skip the theory of computation courses (Turing machines, Boolean logic, binary arithmetic, etc.) as being too hard.

Thus the existence of far too many bloated, slow, hot, battery-draining applications.
 