
hhas

macrumors regular
Oct 15, 2007
126
0
Isn't the point to choose the right level of abstraction, use the right tool for the job etc.

Yes. And learning is a job in itself. Don't make it impossibly hard on students just to indulge some completist OCD urge in yourself. Prioritize.

As Guy Steele said (before he went to the Dark Side), "the most important concept in all of computer science is abstraction".

Therefore, begin with something like Logo that focuses on teaching and developing that ability, along with structured thinking and analytical problem-solving in general. There'll be plenty of time for endless pedantic nuts-n-bolts later, once students have mastered the core skills that'll enable them to manage all that crap effectively.
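For what it's worth, the Logo-style exercise translates almost directly into Python's standard turtle module (a descendant of Logo's turtle graphics). A toy sketch, with function names of my own choosing:

```python
# A Logo-style first lesson: build bigger ideas out of smaller ones,
# using only Python's standard-library turtle module.
import turtle

def square(t, size):
    """One abstraction: a named shape built from primitive moves."""
    for _ in range(4):
        t.forward(size)
        t.right(90)

def flower(t, petals, size):
    """A higher abstraction: a shape built out of other shapes."""
    for _ in range(petals):
        square(t, size)
        t.right(360 / petals)

t = turtle.Turtle()
flower(t, 12, 80)   # the student thinks in flowers and squares, not JMPs
turtle.done()
```

The student never touches a register or a pointer, yet is already practising decomposition and naming, which is the point.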


This can easily be turned on its head. If they start at the top, they most likely remain at the top because they cannot envision what might exist below it.

Not unless they're extraordinarily stupid, lazy, and outright blind. Sorry, but I'll take wild, messy abstract thinking over precision-engineered rote repetition any day of the year. Teach a person to teach themselves, and they'll go looking for whatever they don't know as and when they need it.
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
Yes. And learning is a job in itself. Don't make it impossibly hard on students just to indulge some completist OCD urge in yourself. Prioritize.

You make an awful lot of assumptions here, in a passive-aggressive tone.

As Guy Steele said (before he went to the Dark Side), "the most important concept in all of computer science is abstraction".

There is such a thing as overly abstract, there are some examples in Java frameworks. My point isn't that abstraction isn't important. You said that the point was to take the abstraction as far as possible.

Not unless they're extraordinarily stupid, lazy, and outright blind. Sorry, but I'll take wild, messy abstract thinking over precision-engineered rote repetition any day of the year. Teach a person to teach themselves, and they'll go looking for whatever they don't know as and when they need it.

So "stupid, lazy, blind" only applies when going in one direction? That seems completely arbitrary to me. I get that it's your preference, but who cares, really. I think both methods may work; that's my point, but I'm no educator.
 

hhas

macrumors regular
Oct 15, 2007
126
0
It's actually harder to go in the opposite direction for most coders, and envision what goes underneath.

Citation?

A lot of famous computer scientists, OS programmers and high level language designers started out working with lots of assembly language.

A lot of famous computer scientists, OS programmers and high level language designers started out in the days when everyone worked at low-level, because high-level tools didn't yet exist. (i.e. They're the ones who built them. And there's probably a reason for that.)

Whereas many high level programmers often skip the theory of computation courses (Turing machines, Boolean logic, binary arithmetic, etc.) as being too hard. Thus the existence of far too many bloated, slow, hot, battery draining applications.

Many high-level programmers skip a lot of the high-level stuff as well, which is why they end up as Java Enterprise Architects. Experienced practitioners become adept in disguising their total ignorance of everything behind sufficiently vast and impenetrable UML graphs. Perhaps you are simply mistaking "high-level" programmer for "totally crap"? There's a site for that.

Also, if you think that developing compact, fast, efficient applications doesn't carry a significant price tag of its own - concrete costs which must be carefully weighed against whatever abstract benefits they might provide in future to determine whether or not they're even worth doing at all - then perhaps CS courses should throw in some classes on business and economics as well. "Premature optimization is the root of all evil", to quote some feller.
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
I would definitely pick the highest-level language I could find. The only reason C gets picked is because C compilers have already been ported to every architecture under the sun, and because programmers like to pretend they're macho *******s by juggling live grenades. But really, if you're writing a kernel then you should be using the safest language you can (since any mistakes at that level really screw everyone's day), so even if it was just something like Cyclone then that'd be an improvement.

Counter example is the seL4 kernel. It's a research kernel, and the first formally verified kernel. Part of the implementation is done in Haskell which is later translated to C for the final implementation.

Unique about seL4 is its unprecedented degree of assurance, achieved through formal verification. Specifically, the ARM version of seL4 is the first (and still only) general-purpose OS kernel with a full code-level functional correctness proof, meaning a mathematical proof that the implementation (written in C) adheres to its specification. In short, the implementation is proved to be bug-free (see below). This also implies a number of other properties, such as freedom from buffer overflows, null pointer exceptions, use-after-free, etc.

https://sel4.systems/FAQ/#l4v
 

hhas

macrumors regular
Oct 15, 2007
126
0
You said that the point was to take the abstraction as far as possible.

No I didn't. I said abstraction should be taught first. You can teach JMPs later, once they've already developed the knowledge, awareness, and skills that'll enable them not to make a lethal bloody rat's nest out of it.

As for pointing at the Java world's adoration of vast baroque self-fellating BS as if that's a meaningful argument against abstraction, that's a ludicrous strawman and I'm sure you know it. It actually reinforces the case for teaching students robust abstraction, structured thinking, and analytical problem-solving skills, plus a big heaping dose of self-awareness and humility, before soaking them beneath the endless firehose of evasively moronic menial make-work that is regularly passed off as "professional software development" these days.
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
Also, if you think that developing compact, fast, efficient applications doesn't carry a significant price tag of its own - concrete costs which must be carefully weighed against whatever abstract benefits they might provide in future to determine whether or not they're even worth doing at all - then perhaps CS courses should throw in some classes on business and economics as well. "Premature optimization is the root of all evil", to quote some feller.

Once upon a time, programmer productivity used to be an issue. Nowadays, the cost of energy for all the data centers and new high performance chips for all the latest devices rivals the salary of those "productive" programmers, who now find themselves part of the problems of global warming and generated e-waste. Blind ignorance of "Premature optimization" is the path to endemic designed-in inefficiency.
 

hhas

macrumors regular
Oct 15, 2007
126
0
Don't ruin your points with a nice ol' No True Scotsman. Argue for universities to teach better CS history rather than try to justify some BS about how some unis "don't teach real CS" because they use a particular teaching tool.

As a True Scotsman I greatly resemble that remark! But seriously, tools can be an enabler or an impediment to learning as well as production.

The problem with Python, Java, etc. is that they require students to wade through vast amounts of primordial scut work (conditionals, loops, complex grammar, special forms, etc.) and advanced concepts for which they aren't yet prepared and which aren't necessary to get started (e.g. type systems, which are all to do with set theory and formal proofs, not about having the compiler do your homework for you). By the time they get to functions and stuff, their heads are so packed with mechanical trivia (which they've come to believe is what programming is all about) that there's no room left for the actual philosophy of programming: problem solving, managing complexity, expressing yourself in a form that is understandable to other humans and not just machines.


Another commenter points at Java as an example of "excessive abstraction", which is total nonsense: Java is a perfect example of bad abstraction, the major purpose of Java's vast, baroque class architectures being to show how incredibly clever and adept their authors are at creating complexity and managing it in their own heads, not eliminating it (as abstraction's meant to do). (The other benefit being how it enables poor programmers to disguise their ignorance of the problem domain beneath huge impenetrable class graphs, none of which actually does anything of worth but looks very impressive to their equally useless management.) There may be many culprits encouraging and reinforcing that mentality, but it's naive to think Java couldn't be one of them.

While linguists may argue the extent to which natural languages influence thought, I don't think anyone here would deny that our artificial (programming) languages significantly influence how we think about problems and how we can solve them. It is, after all, the major reason we create so many of them to cover so many different idioms: imperative, functional, logic, dataflow, concatenative, concurrent, etc, etc.
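To make the idiom point concrete, here's one trivial problem written in two of those styles, using Python only because it happens to support both; a toy sketch, with names of my own invention:

```python
# One problem - keep the long names, uppercased, in sorted order - two idioms.
names = ["ada", "grace", "alan", "edsger", "guy"]

# Imperative idiom: describe the machine's steps and mutate state as you go.
result = []
for name in names:
    if len(name) > 3:
        result.append(name.upper())
result.sort()

# Functional idiom: describe the value you want and let the language build it.
result_fn = sorted(name.upper() for name in names if len(name) > 3)

assert result == result_fn
```

Same answer either way, but the shape of the thinking is different, which is exactly why the choice of first language matters.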


There are two types of programmers in this world: those who approach programming as a means towards solving real-world problems, and those who treat writing code as the end in itself. Therefore, if you want programmers who are good at abstraction, and thus good at managing and solving complex problems, don't make them wade through unnecessarily complex, inflexible languages that drown them in pedantic micromanagement of tedious trash. All that does is drive off the sort of folk who've got more pressing things to do than shovel buckets of crap all day, while making those who do enjoy playing with crap totally at home. And then we wonder why so much software fails so miserably to meet users' needs. (Protip: study the mindset and attitude of the people who produced it, and figure it out from there.)

Grok abstraction, and you open the door to learning everything else and applying it effectively. Fail - or refuse - to do so, and best you stick to flipping ones and zeros on the front panel, which is the only thing machines care about, after all. At least it'll slow down the amount of spaghetti you dump on the rest of us in your lifetime. :p
 

hhas

macrumors regular
Oct 15, 2007
126
0
What you said, verbatim.

hhas said:
The whole point of abstraction is to lift us as far up and away from all that primitivist rock-banging as possible.

Cute. I particularly like the bit where you take the second statement completely out of context by omitting the sentence immediately before it:

hhas said:
Not as a first language.

which was in direct response to firewood's recommendation:

firewood said:
I would go further and require learning to use something like Knuth's MIX assembly language

I'm all for students learning stuff like assembly when they're ready to do so, but it's totally inappropriate as an introduction to programming. The giants before us dedicated their lives to building higher-level tools for a reason. It takes a huge amount of work to achieve anything of use when working at assembly level.

There will always be some coders who prefer generating 10 lines of primitive make-work where a single high-level expression would do the same job at much less cost[1]. But I see no reason why CS should deliberately select for the bleeders to the exclusion of everyone else.
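To put a rough shape on that, a toy Python illustration (the task and names are mine, purely for scale):

```python
# Ten-ish lines of primitive make-work...
words = ["ab", "cd", "ab", "ef", "cd", "ab"]
counts = {}
i = 0
while i < len(words):
    w = words[i]
    if w in counts:
        counts[w] = counts[w] + 1
    else:
        counts[w] = 1
    i = i + 1

# ...versus a single high-level expression doing the same job.
from collections import Counter
counts_hl = Counter(words)

assert counts == dict(counts_hl)
```

Every one of those extra lines is another place to plant an off-by-one, and another thing the next reader has to decode.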

Personally, I'm very skeptical that someone who starts out their learning in assembly will do a better job of controlling all the complexity which that creates than someone who's already mastered abstraction principles in a high-level language before tackling assembly coding[2]. Though if you can point to any solid research on the subject that points one way or the other then I'd certainly like to read it. I'm interested in expanding knowledge, not winning arguments, so my opinions are always open to revision, but it needs more than just some opposing random internet opinion to do so.

--

[1] Which is not just the cost of initially writing the code, but also grokking and maintaining it over its entire lifetime, and not just by its original author but anyone else who has to deal with that code as well.

[2] Trying to go from 80s spaghetti BASIC to even more spaghetti assembly is what scared me off programming for the next 20 years. Had I gone to Pascal instead things might've turned out very differently. Then again, it was being forced into that different path that ultimately led to the position I'm now at, where it's much more interesting to rock the boat than blindly follow the status quo.
 

hhas

macrumors regular
Oct 15, 2007
126
0
Counter example is the seL4 kernel. It's a research kernel, and the first formally verified kernel. Part of the implementation is done in Haskell which is later translated to C for the final implementation.

Are you sure? The FAQ's a bit unclear, but as I read it, it's an L4 kernel written in C according to extremely strict coding standards. As they write the kernel's C code, they use a proof checker, written in Haskell, to confirm that code follows those standards to the letter. That is, they write the proof checker first, then they write the coding standards that ensure the C (kernel) code will be checkable against that proof, and finally they write the kernel code itself.

While their attention to formal correctness is admirable, I can't help wondering why they didn't just write their own kernel programming language and program their kernel in that? It's what K&R did to bootstrap Unix, after all; the only difference being that theirs (C) couldn't prove itself for ******. Which wouldn't have been a huge problem had C not immediately escaped into the wild where it was used to code *everything else* as well, thus ensuring every defect and weakness of C spread throughout the entire computing ecosystem for decades to come.

Having a formally provable language that can be used to build not just a kernel but all of the stuff on top of it as well would be infinitely more useful. Although, as you say, it's a research project, so perhaps they're still at the initial learning stage where they figure out how to solve one specific problem before attempting to figure out how to generalize that solution to cover all of the other problems as well.
 

hhas

macrumors regular
Oct 15, 2007
126
0
Plan 9 and Go. :D

There seems to be a schism between Unix people and Lisp people.

Except Plan9 and Go aren't evolutions of existing Unix and C, but ground-up replacements. Which is not a bad thing in itself: better to start afresh having learned from previous mistakes than waste effort trying to "fix" the inherently unfixable. The problem is: it doesn't matter how many Unix mistakes Plan9 manages to fix, because Unix users simply aren't willing to migrate to it. There are various reasons for that, including cost, lock-in, laziness, obtuseness, hatred of change. Even when Torvalds started his own ground-up OS project, he chose to copy Unix, not Plan9.

Heck, never mind anything as big as an OS switch, just look at something like the Python2 to Python3 migration, which has been tied up in knots by self-inflicted bureaucracy and inertia for years. That's a trivial shift by comparison, and the Python team has totally botched it. Or how about Microsoft, who managed to screw up their recent attempt to make Windows radically simpler and easier to use so badly that they're now putting all of the Win95-Win7 complexity and crap back in?

Successful transitions are hard to pull off, requiring not just technical but also logistical and marketing nous, and lots of it. What more commonly happens is everyone gets stuck on a bogged-down platform until a fresh new platform floats along, at which point they finally cut their losses and jump to that. Of course, the new platform then proceeds to bloat until it sinks beneath its own weight as well, and so the cycle repeats all over again.

This growth-stagnation-defection cycle is very costly to users and none too pleasant for the platforms either. If we could find ways to construct platforms from the ground-up so that they're inherently correctable and evolvable without breaking everything that's already on top of them Every Single Time, it could greatly reduce this disruptive pain and cost.

Hey, we already have languages for managing such evils as state and concurrency; why not one that takes care of its own bureaucracy as well? :)
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
Are you sure? The FAQ's a bit unclear, but as I read it it's an L4 kernel written in C according to extremely strict coding standards. As they write the kernel's C code, they use a proof checker, written in Haskell, to confirm that code follows those standards to the letter. That is, they write the proof checker first, then they write the coding standards that ensure the C (kernel) code will be checkable against that proof, and finally they write the kernel code itself.

The proof checker is Isabelle/HOL; Haskell is used as a high-level prototype of the kernel to enable an abstraction from the hardware and a top-down design approach from the specification. The high-performing actual code is then manually written in C. It's described in more detail in some of their papers if you are interested. It's a kernel belonging to the same family as L4, and from what I have seen it also has the highest performance, fastest IPC and lowest LOC count so far, just north of 9,000 lines all in all.
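As a toy analogy only (the real project uses machine-checked proofs in Isabelle/HOL, not testing, and the names below are mine), the refinement idea is roughly "slow, obviously correct specification first; fast implementation second; then show they agree":

```python
# Toy analogy of refinement: an executable specification, a fast
# implementation, and a check that they agree on many inputs.
# seL4 replaces the random check with a mathematical proof.
import random

def spec_gcd(a, b):
    """Specification: the largest d dividing both (slow but obvious)."""
    return max(d for d in range(1, min(a, b) + 1) if a % d == 0 and b % d == 0)

def fast_gcd(a, b):
    """Implementation: Euclid's algorithm (fast, less obvious)."""
    while b:
        a, b = b, a % b
    return a

for _ in range(1000):
    x, y = random.randint(1, 500), random.randint(1, 500)
    assert fast_gcd(x, y) == spec_gcd(x, y)
```

The gap between "we tested it a thousand times" and "we proved it for every input" is exactly what the seL4 people closed.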

While their attention to formal correctness is admirable, I can't help wondering why they didn't just write their own kernel programming language and program their kernel in that? It's what K&R did to bootstrap Unix, after all; the only difference being that theirs (C) couldn't prove itself for ******. Which wouldn't have been a huge problem had C not immediately escaped into the wild where it was used to code *everything else* as well, thus ensuring every defect and weakness of C spread throughout the entire computing ecosystem for decades to come.

I don't know, but a high performance language which aims to do what C does would be a language pretty similar to C. I think a common mistake is to think of C as a high level language, instead of as a portable assembly; then there is no expectation of magic going on. The fact that C got widespread adoption outside of the kernel, hardware and high performance code is probably in part due to the time: computers were not that fast, and so on.
 

subsonix

macrumors 68040
Feb 2, 2008
3,551
79
Except Plan9 and Go aren't evolutions of existing Unix and C, but ground-up replacements. Which is not a bad thing in itself: better to start afresh having learned from previous mistakes than waste effort trying to "fix" the inherently unfixable.

Details. Plan 9 seeks to correct some of the issues the authors saw in Unix. Go is related just because some of its authors were on the Unix team, and it is somewhat similar in syntax and philosophy. Besides that, there are things that get removed, added and improved on in Unix all the time.

The problem is: it doesn't matter how many Unix mistakes Plan9 manages to fix, because Unix users simply aren't willing to migrate to it. There are various reasons for that, including cost, lock-in, laziness, obtuseness, hatred of change. Even when Torvalds started his own ground-up OS project, he chose to copy Unix, not Plan9.

Well, that's the same for your ideal functional programming language; it's all beautiful in theory. When Torvalds made the Linux kernel, it was an effort to get access to Unix; he was unaware of FreeBSD at the time.

----------

Another commenter points at Java as an example of "excessive abstraction", which is total nonsense: Java is a perfect example of bad abstraction, the major purpose of Java's vast, baroque class architectures being to show how incredibly clever and adept their authors are at creating complexity and managing it in their own heads, not eliminating it (as abstraction's meant to do).

If you are referring to my comment, I specifically mentioned frameworks. I had a particular class in mind which made the rounds on the Internet a few years back but couldn't find it, I think this may be it: http://docs.spring.io/spring/docs/2...mework/AbstractSingletonProxyFactoryBean.html

In any case excessive is bad, no matter what we are discussing. Good abstraction is empowering, too much and it obfuscates the very things you are trying to achieve. It goes back to my original point of choosing the right level of abstraction for the problem at hand.
 

iSee

macrumors 68040
Oct 25, 2004
3,539
272
...laziness, obtuseness, hatred of change...
You know, there are a lot of things you wrote that I could take issue with, but this is the most egregious. In my experience people (in general, but particularly technical people) are willing to accept the short-term pain of change as long as they see a long term benefit that makes it worthwhile.

That isn't any of those negative things you ascribe to unix users. That's just rational decision-making. The reality is people try to find the best OS available that suits their needs. As a result, people switch OSs from time to time.

I think you have to ask those unix users why they didn't switch to Plan 9 rather than just calling them stupid and lazy. Plan 9 seems to have a lot of nice characteristics but also a lot of gaps that would prevent a lot of potential users from using it.
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
Good abstraction is empowering, too much and it obfuscates the very things you are trying to achieve. It goes back to my original point of choosing the right level of abstraction for the problem at hand.

Good artists and craftsmen know how to work at multiple levels of abstraction, all the way from the cultural trends and effects of their productions, down to the physics and chemistry of their medium. Many renaissance artists experimented with the (al)chemy of their paint pigments. Engineers of massive skyscrapers first learn to fracture small things on tensile test machines in the lab. Etc.

So there is no single "right" level.

Higher levels will often stand up or fall down depending on the creator's strong understanding of the properties of the lower levels on which the higher levels are standing.
 

dusk007

macrumors 68040
Dec 5, 2009
3,412
104
I really like the new languages like Ruby, CoffeeScript, Scala, mostly because of their syntax and the clean-looking code.

I have found, though, one big disadvantage compared to Java, and that is debugging. I find it much easier to figure out the location of an error in Java, while debugging Ruby is kind of tricky. There is so much that is implicit in the language, which you just have to know, that it is quite hard, especially for a beginner. It is easy to write quick code, but once you get hung up and aren't sure how to code something, it is tricky. Java is always clear; it just looks ugly and is verbose.

I like the cleaner, less verbose syntax, but it can be confusing to learn with all the small differences in those languages that let you write shortcuts with few symbols: when is it == or ===, what does x!. mean, when do you need brackets and when don't you? I imagine also that such languages are more problematic in big projects, because you can write code in a more individualistic style rather than the only single right way. A lot of possibilities have their downsides.
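Python has its own flavour of the same beginner trap, for what it's worth; a small illustration of two lookalike operators that mean different things (an analogy, not the Ruby/CoffeeScript cases above):

```python
# Two lookalike comparisons with different meanings - the sort of implicit
# detail a beginner just has to know.
a = [1, 2, 3]
b = [1, 2, 3]

print(a == b)   # True  - equal values
print(a is b)   # False - two distinct objects

c = a
print(a is c)   # True  - the very same object
```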

Though when it comes to CoffeeScript vs. JavaScript I am clearly in the former camp. Code is so, so much quicker to write and so much more readable. I also find it much less tricky than Ruby on Rails, which I still haven't got down too well.

I think programming languages like C++ are the best at what they do, and you may just need them for a certain problem (like efficient personal computer code), but C++ is not a modern language that really deserves consideration. Just like Fortran, it has its place, but when discussing the best language it is more a bother you have to put up with than a joy to use. It isn't really high level enough IMO.
 

firewood

macrumors G3
Jul 29, 2003
8,113
1,353
Silicon Valley
It takes a huge amount of work to achieve anything of use when working at assembly level.

Which is the reason to learn this low level stuff. So that the smarter ones know and appreciate exactly why they are using some higher level tools, with all their associated advantages and disadvantages. As opposed to the misconception that their favored HLL tool is the only way to solve any problem, thus wasting their employer's power budget long after they are gone (or are promoted into management where they can cause other damage to the business).
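Even from inside a high-level language you can peek one layer down. Python's standard dis module, for instance, will show you the bytecode your one-liner actually becomes, which is a cheap way to start building that appreciation (the function here is just an example of mine; output varies by interpreter version):

```python
# Peek one level below a high-level expression: the bytecode the CPython
# virtual machine actually runs for it.
import dis

def total(xs):
    return sum(x * x for x in xs)

dis.dis(total)
```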
 

joshlalonde

macrumors 6502
Jul 12, 2014
422
0
Canada
That's sad, and one of the main things I want to do away with: people writing libraries that have already been written. It's a huge waste of time. Imagine what better things the human race could have done with all that collected time that was wasted writing yet another * library. Maybe we'd have cancer solved by now. But no, we wasted time reinventing the basics rather than moving forward.

I don't blame you, C is a really crappy language for sharing code in (as is every other language, because no language was actually designed with it in mind. There's a butt ton of SCM software (more people wasting time reinventing the same thing) that all tries to remedy this, but none of them do very well.)

It's not about re-inventing the basics. It's about making more efficient algorithms. Why use a huge library when you can make something that does what you want faster and is less complicated?

I don't like using other people's code because even if it's made to be modular, I don't like the rules they have made.

It's not a waste of time to write your own library so long as you can re-use that code in a meaningful way. I mean, that's the basis of modern, modular programming. Making re-usable code.

For a job, I'll use others' code, but if possible, I'd like to go in and make it better. Sadly I'm not at that point yet.
 

Cromulent

macrumors 604
Oct 2, 2006
6,802
1,096
The Land of Hope and Glory
It's not about re-inventing the basics. It's about making more efficient algorithms. Why use a huge library when you can make something that does what you want faster and is less complicated?

I don't like using other people's code because even if it's made to be modular, I don't like the rules they have made.

It's not a waste of time to write your own library so long as you can re-use that code in a meaningful way. I mean, that's the basis of modern, modular programming. Making re-usable code.

For a job, I'll use others' code, but if possible, I'd like to go in and make it better. Sadly I'm not at that point yet.

I hope you don't take that attitude with security based software such as cryptography.

The worst thing people do is think that they can invent their own more secure form of encryption, which normally ends up being very easy to break. Security through obscurity never works out well.

Also, the other point to make is that your attitude is extremely arrogant. You seem to be assuming that you are a better programmer than everyone else since your solutions are "better" (whatever that means). Have you ever actually profiled any of this so-called slow code compared to your own code? You'd probably be surprised at the results.
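On the profiling point: Python ships with timeit, so the comparison costs a few lines; a minimal sketch in which both candidate functions are stand-ins of my own, not anyone's real library code:

```python
# Measure before declaring a rewrite "faster": time an existing routine
# against a hand-rolled replacement. Both candidates here are stand-ins.
import timeit

data = list(range(10_000))

def library_way():
    return sum(data)            # the existing, "bloated library" routine

def hand_rolled():
    total = 0                   # the rewrite you believe is faster
    for n in data:
        total += n
    return total

print("library :", timeit.timeit(library_way, number=1_000))
print("rewrite :", timeit.timeit(hand_rolled, number=1_000))
```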
 

bjet767

Suspended
Oct 2, 2010
967
319
I started with COBOL and punch cards, migrated to Assembly for speed, hated BASIC, and eventually settled on C because of its speed and universal coding ability between platforms.

In between I enjoyed C++ for the strict type checking it introduced to C, and the MS foundation classes.


Today I use Objective-C for the Apple stuff and have done some work with the new Swift, but I really don't want to learn a new syntax to do what I already can do with Objective-C. Basically I don't really get the point of Swift except to attract those who never liked C and its variants.

I have coded in HTML, and a bit of Java for some web stuff.

One of the things I like about programming Apple products is Apple requires the code to meet minimum standards they set.

In the end it comes down to speed and efficiency to get the job done.
 

joshlalonde

macrumors 6502
Jul 12, 2014
422
0
Canada
I hope you don't take that attitude with security based software such as cryptography.

The worst thing people do is think that they can invent their own more secure form of encryption, which normally ends up being very easy to break. Security through obscurity never works out well.

Also, the other point to make is that your attitude is extremely arrogant. You seem to be assuming that you are a better programmer than everyone else since your solutions are "better" (whatever that means). Have you ever actually profiled any of this so-called slow code compared to your own code? You'd probably be surprised at the results.

Oy, calm down. On the job, I use what others have created. Plus, I acknowledge the things that I do not know. So, if I can't improve it, why would I waste my time trying?

But if I think I have a better solution, I will implement it. If I'm right or wrong, that's what it is, but I have at least tried unlike you. And perhaps I'll get lucky. Perhaps not.

I'm not arrogant just because I actually tried. Unlike you. You're very arrogant about your views. So keep them to yourself if you're going to be so rude.
 

iSee

macrumors 68040
Oct 25, 2004
3,539
272
I hope you don't take that attitude with security based software such as cryptography.

The worst thing people do is think that they can invent their own more secure form of encryption, which normally ends up being very easy to break. Security through obscurity never works out well.

Also, the other point to make is that your attitude is extremely arrogant. You seem to be assuming that you are a better programmer than everyone else since your solutions are "better" (whatever that means). Have you ever actually profiled any of this so-called slow code compared to your own code? You'd probably be surprised at the results.

I don't think it's fair to assume OP meant security software.

Imagine a spectrum where at one end you will only use existing code, while at the other you will write every line yourself. Neither of these extremes is effective for anything but the smallest projects. Somewhere in between is a reasonable range, where you aren't spending all your time reinventing the wheel nor spending all your time sanding down, spackling over and stitching together modules that don't always fit together that well.

I didn't see anything in OP's post to make me think it meant anything outside of the reasonable range. Note the nod to practicality in the last sentence. ...probably OP is closer to the do-it-yourself side of things. That's OK. Somebody has to actually do original work unless you think all problems have already been solved in an optimal way. (I don't think that's anywhere close to being true. Computer science and engineering are in their infancies as far as I can tell.)
 

Cromulent

macrumors 604
Oct 2, 2006
6,802
1,096
The Land of Hope and Glory
Oy, calm down. On the job, I use what others have created. Plus, I acknowledge the things that I do not know. So, if I can't improve it, why would I waste my time trying?

But if I think I have a better solution, I will implement it. If I'm right or wrong, that's what it is, but I have at least tried unlike you. And perhaps I'll get lucky. Perhaps not.

I'm not arrogant just because I actually tried. Unlike you. You're very arrogant about your views. So keep them to yourself if you're going to be so rude.

Ah. You're funny getting all butt hurt over a rather straightforward comment. Grow a thicker skin.

I find it especially funny that you think you have to attack me in response. Nothing I said was rude, insulting or in any way derogatory.

My point was that you should only rewrite code if you have an objective reason to do so. Saying "I can do better than this!" with no reason is stupid. An objective reason could be one of the following: fewer bugs in your code than the original, faster code execution, supporting more platforms than the original code, easier to extend to your own uses, or removing redundant or obsolete code paths.

Sometimes you need to rewrite things. Most of the time though if you think you need to rewrite something it is because you have failed to understand the original software. A major problem in much of the open source world is lack of documentation making it hard to use some open source code. Rather than thinking you should rewrite it to make it easier for you to use you should spend the time to learn the software and contribute improved documentation to the project to help other users out.

So if you have an actual objective reason to rewrite the code go ahead if not then rewriting the code is going to be a waste of time for you.

There is of course another thing to consider. Whether your current project is a hobby or something serious. If it is a hobby then do whatever the hell you like. It doesn't matter if it takes a week to write or 2 years. You can spend as much time rewriting things as you like. If on the other hand you are doing this as a serious project that is going to put food on the table then you need to carefully consider all your options. Maintenance of existing code costs time and money and you want to spend as little time as possible doing so. In that case it is much better to use fully supported open source software from upstream sources which should reduce the amount of work you need to do on your own code base.

I don't think it's fair to assume OP meant security software.

Imagine a spectrum where at one end you will only use existing code, while at the other you will write every line yourself. Neither of these extremes is effective for anything but the smallest projects. Somewhere in between is a reasonable range, where you aren't spending all your time reinventing the wheel nor spending all your time sanding down, spackling over and stitching together modules that don't always fit together that well.

I didn't see anything in OP's post to make me think it meant anything outside of the reasonable range. Note the nod to practicality in the last sentence. ...probably OP is closer to the do-it-yourself side of things. That's OK. Somebody has to actually do original work unless you think all problems have already been solved in an optimal way. (I don't think that's anywhere close to being true. Computer science and engineering are in their infancies as far as I can tell.)

I think I addressed the points I was trying to make in my response above.
 