Functional Programming (was Pitfalls of lacking a MMU)
-
- Member
- Posts: 81
- Joined: Wed Nov 09, 2011 2:21 am
- Location: Behind a keyboard located in The Netherlands
Re: Functional Programming (was Pitfalls of lacking a MMU)
David, simply put: you don't remove the initial problem. You just displace it; that is not a solution.
-
- Member
- Posts: 223
- Joined: Thu Jul 05, 2007 8:58 am
Re: Functional Programming (was Pitfalls of lacking a MMU)
@CrypticalCode0: I don't get what you're saying. The code I provided solves the stated problem. I think you are tripping over the fact that it solves it recursively. Basically, we have the original problem and a small set of rules; using those rules, the function reduces the original problem to one of slightly smaller size. Because the size can be expressed as a natural number (the combined length of the two strings), and is always reduced by at least 1, we eventually end up with one of the two strings empty, for which we know the answer. This kind of pattern is quite common in algorithms, especially when you want to count something.
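In rough C (a minimal sketch of the idea only - not the exact code under discussion, and deliberately without memoisation), the recursion looks something like this:
Code:

#include <stdio.h>

static int max2(int x, int y) { return x > y ? x : y; }

/* Longest common subsequence length of a and b (naive recursion). */
static int lcs(const char *a, const char *b) {
    if (*a == '\0' || *b == '\0')
        return 0;                          /* an empty string shares nothing    */
    if (*a == *b)
        return 1 + lcs(a + 1, b + 1);      /* match: keep it, shrink both sides */
    return max2(lcs(a + 1, b),             /* no match: drop a character from   */
                lcs(a, b + 1));            /* one side or the other             */
}

int main(void) {
    printf("%d\n", lcs("axbcd", "aybcd")); /* prints 4 ("abcd") */
    return 0;
}

The naive recursion does exponential work, so a real implementation would memoise, but that doesn't change the termination argument above.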
- Combuster
- Member
- Posts: 9301
- Joined: Wed Oct 18, 2006 3:45 am
- Libera.chat IRC: [com]buster
- Location: On the balcony, where I can actually keep 1½m distance
- Contact:
Re: Functional Programming (was Pitfalls of lacking a MMU)
"The code I provided solves the stated problem."
There's just one problem with that. lcs("axbcd", "aybcd") returns 4; it should be 3. "abcd" is not a substring of either, it's a subset.
-
- Member
- Posts: 28
- Joined: Fri Feb 24, 2012 5:10 pm
Re: Functional Programming (was Pitfalls of lacking a MMU)
Various people on this thread are full of themselves. My advice? Eat some humble pie: there are (and always will be) a lot of people on this planet who know a lot more about computer science than you.
Indeed, it is the attitude displayed in this thread that currently has me questioning whether I really do want to swap from studying linguistics and philosophy over to a computer science degree. Do I wish to condemn myself to several years spent surrounded by "comic book guy" larvae?
"Beware C/C++, in which everything is possible but nothing of interest is beautiful."
-~ Beware the turing tarpit, in which everything is possible but nothing of interest is easy ~-
Re: Functional Programming (was Pitfalls of lacking a MMU)
Various people on this thread have proven themselves knowledgeable and gained respect. While there might be someone extraordinary who knows a lot more about computer science than they do, it does not change the fact that they still know more than many other people.
And stop the language flame.
-
- Member
- Posts: 28
- Joined: Fri Feb 24, 2012 5:10 pm
Re: Functional Programming (was Pitfalls of lacking a MMU)
"Various people on this thread have proven themselves knowledgeable and gained respect. While there might be someone extraordinary who knows a lot more about computer science than they do, it does not change the fact that they still know more than many other people."
So? My area of expertise is pixel-pushing code. Just because I've forgotten more about graphics pipelines than most people will ever know doesn't mean I don't have a lot to learn ...
"And stop the language flame."
Language flame? I'm writing C++ code as we speak. I happen to be a fan of C++ (for the things it is good at). Just because I use and like a language doesn't mean I don't acknowledge its deficiencies.
"You're almost definitely in the wrong forum."
Never! Everybody don their asbestos underwear!
-~ Beware the turing tarpit, in which everything is possible but nothing of interest is easy ~-
Re: Functional Programming (was Pitfalls of lacking a MMU)
Hi,
davidv1992 wrote: Brendan, your last post just gives the idea that you're completely oblivious to the gains of fundamental research (most of what you write can be transposed one to one to every other field of fundamental research out there), combined with a complete lack of respect.
That would depend on whether the reader understands important key concepts or not.
Let's be honest here - programming (the task) is harder than it should be, and programming (the career) is worse.
Why is programming (the task) harder than it should be?
The task is hard because natural solutions are often not the best solutions due to restrictions. Consider your LCS example. If you had a group of 100 children (all without any programming experience at all) and asked each child how they would solve the problem, you'd find that almost all of them would start by searching the string for the first character of the substring, and each time they found the first character of the substring they'd count to see how many characters at that starting position match. This is the natural solution. In C, it might look like this:
Code:

int lcs(char *a, char *b) {
    int pos;
    int length;
    int bestPos = 0;
    int bestLength = 0;
    char first;

    first = b[0];
    for(pos = 0; a[pos] != 0; pos++) {
        if(a[pos] == first) {
            /* Found the first character of b; count how far the match extends. */
            length = 1;
            while( (a[pos + length] != 0) && (a[pos + length] == b[length]) ) {
                length++;
            }
            if(length > bestLength) {
                /* Best match so far - remember where it starts and how long it is. */
                bestPos = pos;
                bestLength = length;
            }
        }
    }
    return bestPos;  /* starting position of the best match (its length is in bestLength) */
}

Of course this is the natural way of doing it with only one thread. The natural way of doing it with multiple threads is relatively simple (it's "embarrassingly parallel") - determine how many CPUs/children you've got and get each of them to scan part of the main string, then determine which result is the best result at the end.
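As a rough sketch (illustrative only - the thread count, the job struct and the scan_range worker are arbitrary choices, not part of the original example), that split might look like this with POSIX threads:
Code:

#include <pthread.h>
#include <stdio.h>
#include <string.h>

#define NUM_THREADS 4

struct job {
    const char *a, *b;
    size_t start, end;          /* range of starting positions in a to scan */
    size_t bestPos, bestLength; /* best match found by this worker          */
};

static void *scan_range(void *arg) {
    struct job *j = arg;
    for (size_t pos = j->start; pos < j->end; pos++) {
        if (j->a[pos] != j->b[0])
            continue;
        size_t length = 1;
        while (j->a[pos + length] != '\0' && j->a[pos + length] == j->b[length])
            length++;
        if (length > j->bestLength) {
            j->bestLength = length;
            j->bestPos = pos;
        }
    }
    return NULL;
}

int main(void) {
    const char *a = "the quick brown fox";
    const char *b = "brown";
    size_t n = strlen(a);
    pthread_t workers[NUM_THREADS];
    struct job jobs[NUM_THREADS];

    /* Split the starting positions evenly across the workers. */
    for (int i = 0; i < NUM_THREADS; i++) {
        jobs[i] = (struct job){ a, b, n * i / NUM_THREADS,
                                n * (i + 1) / NUM_THREADS, 0, 0 };
        pthread_create(&workers[i], NULL, scan_range, &jobs[i]);
    }

    /* Combine the per-worker results into the overall best match. */
    size_t bestPos = 0, bestLength = 0;
    for (int i = 0; i < NUM_THREADS; i++) {
        pthread_join(workers[i], NULL);
        if (jobs[i].bestLength > bestLength) {
            bestLength = jobs[i].bestLength;
            bestPos = jobs[i].bestPos;
        }
    }
    printf("best match: length %zu at position %zu\n", bestLength, bestPos);
    return 0;
}

(Compile with -pthread; apart from the work split and the result merge at the end, the matching logic is the same as the single-threaded version above.)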
Restrictions that prevent the natural solution from being the best solution include performance and language features. For performance, in this case the natural solution is likely to perform well (nice predictable pattern of accesses for the prefetcher to deal with, good cache locality even for the multi-threaded version, etc). However, if the language you're using doesn't have arrays, or doesn't let you use variables to remember what the "best length so far" is, then you're screwed.
When you can't use the natural solution, you have to use an unnatural solution. This makes programming awkward and means that how a piece of code works isn't how people expect the code to work.
For language design, restrictions that prevent the natural solution from being used must be avoided, and any language that forces programmers to use unnatural solutions is a failure. For compiler design, the goal is to find ways to optimise the natural solution into the fastest solution, in an attempt to remove restrictions that prevent the natural solution from being the best solution.
For language design, functional languages are a complete failure because they don't allow the programmer to express the natural solution in many cases. For compiler design, research into functional languages is important, because it can lead to better ways of converting the natural solution into the fastest solution. It's important not to confuse these things - normal programmers should never need to care about functional languages unless they're designing code optimisers.
Why is programming (the career) harder than it should be?
Let's start with some quotes!
brain wrote: Lol, someone's got an axe to grind against some academics.
davidv1992 wrote: That said, Brendan, your posts in this thread do not give the impression that you have spent any decent amount of time with languages other than C/C++. Whilst this is your choice, it might be refreshing to try and use some other languages; I would suggest something like Haskell or Lisp, and something like Scheme or Erlang. These are languages structured in radically different ways to C/C++-type languages, and working with them might teach new methods of approaching problems. If you've tried them and you felt they didn't work for you, that's fine, but at the moment you give the impression that you haven't used them at all.
Guess who's right?
Up until a few years ago I'd learnt several dialects of BASIC (Commodore 64 BASIC, QuickBASIC, VisualBASIC), several assembly languages (6502, 80x86) and C. I learnt some more obscure languages (e.g. "ladder diagrams" used by electricians to program industrial controllers, some BASH scripting, etc). I also picked up small pieces of other languages along the way (a little Pascal and a little C++, etc).
However, more recently I've been doing a "Bachelor of IT" course. In the last 18 months or so I've been taught (which isn't quite the same as "I've learnt") Alice, PHP, Python, Perl, Java, JavaScript and SQL.
I just want to write software, but over the last 18 months I haven't really been able to. Instead I've been faced with churn - learning languages for the sake of learning languages (although to be fair it's not just languages, it's the libraries and frameworks too). I've been getting more and more frustrated because I can't find enough time.
What have I learnt? Mostly what I've learnt is that the single biggest problem for programmers today is the proliferation of different languages (and the libraries, tools, code documentation systems, etc. that go with them). It's futile and unproductive and has a massive cost on the I.T. industry as a whole - retraining, retooling, rewriting and the "Tower of Babel" effect. As a test, go to stackoverflow and see how many of the questions you can answer (for any experienced programmer, I can almost guarantee that it will be less than 25% of them).
What we need is to limit the number of languages to one per field (e.g. one for low level programming, one for high level programming and scripting, one for databases, etc). What we need is some sort of conference for academics and language researchers (to get them all in the same place at the same time) and a special educational tool (a baseball bat with nails/spikes) that can be applied to these people until they get the message.
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Re: Functional Programming (was Pitfalls of lacking a MMU)
Just tell them they can't leave the conference until they offer a working solution to the halting problem; their pride won't let them quit </sarcasm>
-
- Member
- Posts: 223
- Joined: Thu Jul 05, 2007 8:58 am
Re: Functional Programming (was Pitfalls of lacking a MMU)
Combuster wrote: There's just one problem with that. lcs("axbcd", "aybcd") returns 4; it should be 3. "abcd" is not a substring of either, it's a subset.
You're right, at least according to Wikipedia: I actually meant the longest common subsequence problem. I never had any formal education in this and learned it under the name "longest common substring", which I now see is faulty. I'll edit the original post. "abcd" is not just a subset, it is a subsequence.
It is, however, true that my code solves the longest common subsequence problem, or at least was written with the intention of solving it.
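To make the distinction concrete (a quick illustrative sketch, not the code under discussion): for "axbcd" and "aybcd" the longest common subsequence is "abcd" (length 4, characters in order but not contiguous), while the longest common substring is "bcd" (length 3, contiguous). A brute-force substring version in C:
Code:

#include <stdio.h>

/* Length of the longest common substring (contiguous run) of a and b. */
static int longest_common_substring(const char *a, const char *b) {
    int best = 0;
    for (int i = 0; a[i] != '\0'; i++) {
        for (int j = 0; b[j] != '\0'; j++) {
            int length = 0;
            while (a[i + length] != '\0' && a[i + length] == b[j + length])
                length++;
            if (length > best)
                best = length;
        }
    }
    return best;
}

int main(void) {
    printf("%d\n", longest_common_substring("axbcd", "aybcd")); /* prints 3 ("bcd") */
    return 0;
}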
-
- Member
- Posts: 28
- Joined: Fri Feb 24, 2012 5:10 pm
Re: Functional Programming (was Pitfalls of lacking a MMU)
brain wrote: Just tell them they can't leave the conference until they offer a working solution to the halting problem; their pride won't let them quit </sarcasm>
Amen! While they're at it, they can solve P ?= NP.
Brendan wrote: Guess who's right?
brendan_is_right == !(p || !p)
Brendan wrote: However, more recently I've been doing a "Bachelor of IT" course.
This explains a lot.
Brendan wrote: Up until a few years ago I'd learnt several dialects of BASIC (Commodore 64 BASIC, QuickBASIC, VisualBASIC), several assembly languages (6502, 80x86) and C. I learnt some more obscure languages (e.g. "ladder diagrams" used by electricians to program industrial controllers, some BASH scripting, etc). I also picked up small pieces of other languages along the way (a little Pascal and a little C++, etc).
I think this statement actually points to a major flaw in your argument about "natural" solutions. To a BASIC programmer a pile of GOTO spaghetti seems perfectly natural; indeed, the very concept of a language without GOTO seems like an unnecessary restriction. Those with more experience understand why imposing such a restriction on a programming language is the Right Thing.
Thinking that there's One True Language is naive and immature. Remember "C" was a new language once. Then "C++" was a new language. People will always create new languages, for good reason: there's no such thing as a "perfect" language. Remember: "If all you have is a hammer, every problem starts to look like a nail."
Brendan wrote: What we need is to limit the number of languages to one per field (e.g. one for low level programming, one for high level programming and scripting, one for databases, etc).
And then we can get rid of all these unnecessary natural languages and everybody can just speak English. Heil Brendan!
-~ Beware the turing tarpit, in which everything is possible but nothing of interest is easy ~-
Re: Functional Programming (was Pitfalls of lacking a MMU)
Hi,
Brendan wrote: Up until a few years ago I'd learnt several dialects of BASIC (Commodore 64 BASIC, QuickBASIC, VisualBASIC), several assembly languages (6502, 80x86) and C. I learnt some more obscure languages (e.g. "ladder diagrams" used by electricians to program industrial controllers, some BASH scripting, etc). I also picked up small pieces of other languages along the way (a little Pascal and a little C++, etc).
childOfTechno wrote: I think this statement actually points to a major flaw in your argument about "natural" solutions. To a BASIC programmer a pile of GOTO spaghetti seems perfectly natural; indeed, the very concept of a language without GOTO seems like an unnecessary restriction. Those with more experience understand why imposing such a restriction on a programming language is the Right Thing.
No.
For GOTO there's 3 types of programmers: noobs that don't understand why goto can be bad in certain situations (yet), noobs that don't understand that goto is beneficial in some situations (yet), and real programmers. For a nice example of this, google for "Python goto".
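As a concrete illustration of "beneficial in some situations" (a generic sketch, not code from any project mentioned here - the file names and buffer size are arbitrary), the classic case in C is unwinding partially acquired resources through a single error path:
Code:

#include <stdio.h>
#include <stdlib.h>

/* Copy src to dst; returns 0 on success, -1 on failure. */
static int copy_file(const char *src, const char *dst) {
    int result = -1;
    FILE *in = NULL;
    FILE *out = NULL;
    char *buffer = NULL;
    size_t n;

    in = fopen(src, "rb");
    if (in == NULL) goto cleanup;
    out = fopen(dst, "wb");
    if (out == NULL) goto cleanup;
    buffer = malloc(4096);
    if (buffer == NULL) goto cleanup;

    while ((n = fread(buffer, 1, 4096, in)) > 0) {
        if (fwrite(buffer, 1, n, out) != n) goto cleanup;
    }
    result = 0;                 /* success */

cleanup:                        /* single exit point releases whatever was acquired */
    free(buffer);               /* free(NULL) is a no-op */
    if (out != NULL) fclose(out);
    if (in != NULL) fclose(in);
    return result;
}

int main(void) {
    /* "a.txt" and "b.txt" are placeholder file names for the example. */
    return copy_file("a.txt", "b.txt") == 0 ? 0 : 1;
}

The alternative - duplicating the cleanup calls at every failure point, or nesting the ifs several levels deep - is usually harder to get right.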
childOfTechno wrote: Thinking that there's One True Language is naive and immature. Remember "C" was a new language once. Then "C++" was a new language. People will always create new languages, for good reason: there's no such thing as a "perfect" language. Remember: "If all you have is a hammer, every problem starts to look like a nail."
I think the word you're looking for is "idealistic".
For an example of "naive and immature" consider how well the "If all you have is a hammer" saying applies to programming languages. Good programming languages can be used in many different ways - they're not like a hammer that can only be used in one way, but they're much more like a swiss army knife that's designed to be used in many different ways. Now, if all you have is 100 different types of swiss army knives, then every problem looks like it can be solved with a swiss army knife (which is a good thing - far better than not knowing how to solve a problem); except for the idiotic/unnecessary problem of deciding which type of swiss army knife to use.
Brendan wrote: What we need is to limit the number of languages to one per field (e.g. one for low level programming, one for high level programming and scripting, one for databases, etc).
childOfTechno wrote: And then we can get rid of all these unnecessary natural languages and everybody can just speak English. Heil Brendan!
The number of programming languages increases at a rate of about 10 new languages per year. The number of natural languages increases at a rate of about 0 new languages per year. If there were assholes running around creating and advocating new natural languages and making the problem worse then you might have a valid point. Fortunately there aren't any assholes for natural languages; and unfortunately you don't have a valid point.
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.
Re: Functional Programming (was Pitfalls of lacking a MMU)
I can hear the cannons firing from my house..
-
- Member
- Posts: 28
- Joined: Fri Feb 24, 2012 5:10 pm
Re: Functional Programming (was Pitfalls of lacking a MMU)
Brendan wrote: For GOTO there's 3 types of programmers: noobs that don't understand why goto can be bad in certain situations (yet), noobs that don't understand that goto is beneficial in some situations (yet), and real programmers. For a nice example of this, google for "Python goto".
I laughed, I cried. From http://entrian.com/goto/ : "The "goto" module was an April Fool's joke, published on 1st April 2004. Yes, it works, but it's a joke nevertheless. Please don't use it in real code!"
Did you hear that? It's an April Fools' Day joke. (The module even includes a "comefrom" statement, which is straight out of INTERCAL.)
GOTO may be useful very occasionally in highly exceptional circumstances. I imagine that you think your code is always highly exceptional.
You may be intelligent, but profoundly lacking in experience. You are not superior to every other programmer who ever walked the earth, got it? Your approach to image scaling as described in your developer's blog is a good example, and gave me a hearty giggle.
-~ Beware the turing tarpit, in which everything is possible but nothing of interest is easy ~-
Re: Functional Programming (was Pitfalls of lacking a MMU)
Brendan, what about forks and extensions of languages, for example the many dialects of BASIC? Each solves a specific set of problems, right, but isn't strictly a new language; whereas there are very few 'dialects' of C, even if you include Objective-C, C#, etc. as dialects...
Re: Functional Programming (was Pitfalls of lacking a MMU)
Hi,
Brendan wrote: For GOTO there's 3 types of programmers: noobs that don't understand why goto can be bad in certain situations (yet), noobs that don't understand that goto is beneficial in some situations (yet), and real programmers. For a nice example of this, google for "Python goto".
childOfTechno wrote: I laughed, I cried. From http://entrian.com/goto/ : "The "goto" module was an April Fool's joke, published on 1st April 2004. Yes, it works, but it's a joke nevertheless. Please don't use it in real code!" Did you hear that? It's an April Fools' Day joke. (The module even includes a "comefrom" statement, which is straight out of INTERCAL.)
Sigh. I was referring to all the people asking "How can I write this because there is no goto?", all the fools explaining that you should use exception handling as an alternative to "spaghetti" (!), and all the smart people using goto in the (typically rare) cases where it's beneficial, even though it was an April Fools' joke.
childOfTechno wrote: GOTO may be useful very occasionally in highly exceptional circumstances. I imagine that you think your code is always highly exceptional.
If you knew how many times I've restarted my OS project from scratch because I didn't like my own code, you wouldn't be making wild assumptions like this.
childOfTechno wrote: You may be intelligent, but profoundly lacking in experience. You are not superior to every other programmer who ever walked the earth, got it? Your approach to image scaling as described in your developer's blog is a good example, and gave me a hearty giggle.
Did you hear the one about people in glass houses, Mr Child of Techno? From your attitude alone I can guess that I've been programming longer than you've been breathing.
I'm not sure what you found funny about my image scaling research - pixel perfect scaling and rotation isn't a bad result. It's just unfortunate that I ran out of time/patience attempting to get good compression when converting bitmap data while also improving the original image's quality during the conversion. In hindsight I should've done one or the other rather than trying to do both at the same time.
Cheers,
Brendan
For all things; perfection is, and will always remain, impossible to achieve in practice. However; by striving for perfection we create things that are as perfect as practically possible. Let the pursuit of perfection be our guide.