Teaching computer science
- kenneth_phough
- Member
- Posts: 106
- Joined: Sun Sep 18, 2005 11:00 pm
- Location: Williamstown, MA; Worcester, MA; Yokohama, Japan
- Contact:
I taught C programming last year, and unfortunately it didn't go well. Everyone who was interested had a different background in computer literacy, so I had to explain a lot of vocabulary and reach for every possible analogy to explain the things they had trouble picturing. Especially when I talked about memory allocation and pointers (and arrays), a lot of them stumbled. Another teacher, Mr. J, and I have decided to teach a computer science class this year, because a couple of students showed interest in it. We have been brainstorming possible topics and may stick to the general concepts of programming plus a very quick crash course on how computers work. I was wondering if anyone has advice or tips on things I should touch on, talk about, or discuss when teaching programming, possibly in Java, while sticking to the general idea of what programming is and how the development flow works.
Yours,
Kenneth
As a student of computer science at an upper high school level, there are a few things I'd like to see in a computer science class:
The first is equal application of theory and practice. There is no use in a large amount of theory if you cannot put it into practice. As a side note, the first thing we learned in class was about six weeks of binary, calculations, logic, etc. Our class size halved. Sure, those that were left got pretty much straight A's and became good programmers through dedication, but it wasn't the right way to get people interested. I'd suggest more practical work to begin with, adding the theory as it becomes necessary to progress further (for instance, I only learned boolean logic formally after about two years of programming... it didn't hold me back... I just learned it as necessary).
The second thing, and a vital consideration, is the choice of language. I see two major options (and yes, somebody has tried to teach me both): Java and Python. Java, IMHO, is the worse language of the two, but it will set the student up better to move into 'hard core' languages like C or C++. It's also better for getting work.
My suggestion, though, would be Python. It's interpreted, which gives students an immediate look at what they have done. It can be written straight into a command prompt, or typed into a file and run. It can even be compiled. So it does it all (like Lisp really, but with much more beginner-friendly syntax). The syntax is lovely. It forces students to learn proper indentation and formatting rules, because this is how the language's control structures work.
During the course, the students IMHO should be working towards a larger project (large being 750-2000 lines). For a beginner, this poses enough of a challenge for them to learn (either the easy way or the hard way) how much design they personally need to put into a project before they start, what they like in documentation, realising that 'first make it run, then make it run fast' is very good advice, etc.
There are my thoughts. Sorry if it's a bit rambly, I've been doing homework and my brain is mush.
I would suggest you teach the build process. That is something I had a lot of trouble with when I was beginning: what the compiler and linker did completely escaped me.
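For example, a two-file toy program makes the compile and link stages visible (a minimal sketch; the file names and the greet() function are made up for illustration):
```
// util.cpp -- compiled on its own:  g++ -c util.cpp   (compiler: .cpp -> .o)
#include <iostream>
void greet() { std::cout << "hello from util.o\n"; }

// main.cpp -- compiled on its own:  g++ -c main.cpp
void greet();                 // declaration only; the definition lives elsewhere
int main() { greet(); return 0; }

// linking:  g++ main.o util.o -o prog
// (the linker resolves main.o's call to greet() against util.o)
```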
On the note of languages, I started by learning C++, and am genuinely happy with that decision. At the beginning it was hard, but once I realized what the compiler actually did, I was set. Pointers, arrays, and structures were all simply blocks of memory. It all became so simple once I realized the underlying principles. This is why I feel Java and other high-level languages are not suited for teaching. If the student's language is so abstracted, how can they reason about the principles taught in that language without memorization? (Which is, IMO, a poor method of doing almost anything.)
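A tiny demo of that "blocks of memory" view (my own illustration, not course code):
```
#include <iostream>

struct Point { int x; int y; };    // a struct: two ints laid out in one block

int main()
{
    int arr[3] = {10, 20, 30};     // an array: three ints in a row
    int* p = arr;                  // a pointer: just the address of that block

    std::cout << arr[1] << " == " << *(p + 1) << "\n";        // the same cell
    std::cout << "sizeof(Point) = " << sizeof(Point) << "\n"; // the block's size
    return 0;
}
```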
C8H10N4O2 | #446691 | Trust the nodes.
That's a good point, but I would much prefer students got the method behind programming right before they got to know the details of how the compiler works. Of course, in any well-rounded education, I would very much suggest that learning two languages (perhaps one high- and one low-level?) or more would be a very good move, but in most courses there simply isn't enough time for this.
In my course this year, we learned Java for the most part, but we spent a week on SAM (some really simple virtual machine / assembly language). The thought was nice, but it didn't teach us anything, because we didn't spend nearly enough time on it. To most of the people in the class, it was still just Java expressions with a different syntax, and not, as it was designed to be, an introduction to how the CPU and memory worked together at a low level.
Yayyak wrote: That's a good point, but I would much prefer students got the method behind programming right before they got to know the details of how the compiler works. [...]
Agreed. IMO, half the year should be spent learning a language to the point of being able to read the syntax and write in it to some extent. The other half would then be spent teaching why it does what it does. As for hexadecimal, binary, and logic, IMO, they shouldn't be taught at all in an entry-level course.
If the course is trying to teach the students to be good programmers, then I believe it is failing them. They'll learn that soon enough through experience, and there is no way a class meeting several times a week can ever provide the amount of programming work needed to substitute for what good theory will eventually teach.
Please excuse me if I sound strange, but this is how I learned. In fact, I've never taken a single course on programming. Gotta love books, no?
C8H10N4O2 | #446691 | Trust the nodes.
Alboin wrote: If the course is trying to teach the students to be good programmers, then I believe it is failing them.
It's teaching them to pass the theory-only exam at the end of the year. I also learned to program about seven years before I took this course; I'm taking it mainly so I can get good marks in it.
I think formal logic and binary do play a role in a computer science course, which is what is being suggested. I don't think that they should be the first thing to be learnt though. Those things should be taught (again) as needed to get through the theory of why things happen.
As for your half practical / half theory structure, I like that.
I actually gave programming courses at a kind of summer school. There were 12-year-old kids and 65-year-old retirees, and everything in between.
On the first day, I actually showed them brainf*ck. It allows you to talk about the real basics: memory as an array of characters, one pointer, and how every problem ends up being solved by very elementary steps. It is so absolutely simple that no-one feels overwhelmed. It gives people a basic foundation you can refer back to later on when talking about things like pointers, offsets and the like. It also shows them the essentials of "code creativity" ("add 48? Well, don't type 48 '+' signs; why not have a loop that executes 6 times and adds 8 each time...") which I consider essential for writing good code later on.
The tricky point is to stay with brainf*ck just long enough to show them that you can actually solve problems with it, but not so long that you waste time or leave them cross-eyed from all that punctuation.
The next step is a C program that implements the Euclidean algorithm. It solves a real-life mathematical problem, it allows you to demonstrate the use of argc / argv[], it consists of a while loop and an if/else, it shows what a return value can be used for, and you don't need any includes, functions or whatever. After an hour of brainf*ck, they actually cheer at the power and clean syntax of C (imagine that).
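Something like this, presumably (a sketch of the kind of program described, not the actual course code; I use atoi() for the argument conversion, which strictly speaking wants an include):
```
// gcd.cpp -- Euclid's algorithm: argc/argv, a while loop, an if/else,
// and the result delivered via the return value ("echo $?" shows it,
// truncated to 8 bits -- fine for a classroom demo).
#include <cstdlib>   // for std::atoi; a hand-written conversion would avoid even this

int main(int argc, char* argv[])
{
    if (argc != 3) {
        return 0;              // wrong usage: report 0
    }
    int a = std::atoi(argv[1]);
    int b = std::atoi(argv[2]);
    while (b != 0) {           // Euclid: replace (a, b) with (b, a mod b)
        int t = b;
        b = a % b;
        a = t;
    }
    return a;                  // the gcd, visible as the exit status
}
```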
Then you can extend that program with output. I prefer switching to C++ at this point (or rather, already using "g++" for the "C" Euclid program), showing them std::cout instead of printf(), if for nothing else than the former's simpler syntax. Either way, you can now explain to them what a header file is, and what a linker does - and with cout, you don't actually have to explain functions just yet; just let it stand as-is and come back to operator<<() when they're ready for it.
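The extended version might then look like this (again a sketch):
```
// gcd2.cpp -- the same program with output; now <iostream> (a header)
// enters the picture, and the linker has real work to do.
#include <cstdlib>
#include <iostream>

int main(int argc, char* argv[])
{
    if (argc != 3) {
        std::cout << "usage: gcd <a> <b>\n";
        return 1;
    }
    int a = std::atoi(argv[1]);
    int b = std::atoi(argv[2]);
    while (b != 0) {
        int t = b; b = a % b; a = t;
    }
    std::cout << "gcd = " << a << "\n";   // take << on faith for now;
    return 0;                             // operator<<() comes later
}
```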
You see where this leads: I take small, well-defined steps, with pre-selected examples, introducing one or two new concepts at a time. Never mind that I am switching languages in the early stages - it actually helps, because they do realize that there are different languages, and that they can be significantly different.
One thing I advise against is using weakly/dynamically typed languages like Python, or "everything is an object" languages like Java, in a beginner's course. Both paradigms may seem easy to us experienced code wizards, but I found they are very confusing to people who haven't had the fundamentals ironed out first.
As a rule, people have an easier time understanding how and why an integer is different from a float than they have understanding how and why a language can turn a string into an integer on a whim. And even I don't really get why I have to declare a class ("what's a class?") just to write my first static ("what's a static?") int main(). With the higher-level languages, people usually do get it right after some time, but only because they learned to do it "this way", not because they really understand what's going on.
Call me biased, but I consider C++ to be the best beginner's language around. It allows you to show what an array is, and that a string is basically an array of chars, but at the same time it offers <string> and <vector> so you don't have to bother with the limitations of C. It allows you to start with a simple int main(), touch on linkers and make along the way, and end up in the big league of IDEs, object orientation, generic programming, event-handler-driven GUIs, the works. It is also available and well-supported, even on the measly Pentium II / Win98 boxes I had available in the lab.
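For instance (my sketch of that progression):
```
#include <iostream>
#include <string>
#include <vector>

int main()
{
    char raw[6] = {'h', 'e', 'l', 'l', 'o', '\0'};  // a string is an array of chars...
    std::string s = raw;        // ...but <string> drops the manual bookkeeping
    s += ", world";             // grows as needed; no fixed size

    std::vector<int> v;         // likewise <vector> for arrays of anything
    v.push_back(4);
    v.push_back(8);

    std::cout << s << " / " << v.size() << " numbers stored\n";
    return 0;
}
```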
Plus, in my experience a C++ coder has a very easy time picking up Java, Python, C#, or whatever (except perhaps functional languages) as a second / third language. The other way round is usually stony and full of thorny thickets, because - as I said above - they might have learned how to use the language, but have no clear picture in their head of what is actually happening inside.
As for Boolean logic and the rest of the theory, I agree that it should take a sizeable portion of the time - but after they have learned the basics, and as a way to solve problems. Unless I am mistaken, we're not talking about aspiring CS masters or electrical engineers here, but people attending a "programming" type of course.
Every good solution is obvious once you've found it.
I would tend (unfortunately) to disagree with Solar.
As a computer science student myself, I know how important it is for all the class to be on the same level, same footing. If you have half the class already knowing, say, BASIC, a quarter knowing bits of C already and the rest knowing nothing, trying to keep them all interested while not getting too far ahead would seem a difficult task.
My teachers (we're solely talking about coding here; hardware and architecture courses ran in parallel) focused mainly on teaching algorithms and problem solving using programming - essentially the methods which make people good programmers. As the teachers pointed out, a good programmer can migrate from any language to any other language. It is only the syntax and some control structures that change, not the methods used. This applies to any pair of Turing-complete languages.
My lecturers started us out using MIT Scheme. Scheme is a cut-down dialect of Lisp and is powerful, yet completely different from most mainstream imperative languages. This is what gets the class on the same level. They went on to describe how you can use recursion, why tail recursion is better than naive recursion, and why iteration is better still (with examples showing stack overflow).
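In C++ terms (the thread's common ground; the course itself used Scheme), that lesson looks roughly like this:
```
// Naive recursion grows the stack with the input; the iterative
// (tail-call-shaped) version runs in constant space. Illustrative sketch.
#include <iostream>

long sum_rec(long n)            // one stack frame per step
{
    if (n == 0) return 0;
    return n + sum_rec(n - 1);  // NOT a tail call: work remains after the call
}

long sum_iter(long n)           // the tail-recursive idea, written as a loop
{
    long acc = 0;
    while (n > 0) { acc += n; --n; }
    return acc;
}

int main()
{
    std::cout << sum_rec(1000) << " == " << sum_iter(1000) << "\n";
    // sum_rec(100000000) would likely blow the stack; sum_iter handles it fine
    return 0;
}
```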
On to more complex topics - scope and environment, closures (not so useful in C, but Python, Ruby et al. have the same concept), and making ADTs (Abstract Data Types) out of procedures (you message-pass into the procedure, with local variables stored in the closure).
I feel personally that explaining pointers right from the start can have a negative effect on people. It is a difficult concept to understand, and while Solar may have 'got it right', using good teaching methods etc, it can all too easily backfire.
I would say: teach them to walk before they run. You don't need to program in C/C++ to be a good programmer, or indeed to be a professional programmer. My company employs about 30 people (out of an 80-person company) who program solely in Perl, to maintain an automated testing infrastructure. Perl, again, can be seen as a stepping stone. You have closures etc. just like in Scheme, but you also see references (which act as C++ pointers do, except you can't do arithmetic with them), and that can be a stepping stone to C.
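To make that distinction concrete, a quick C++ sketch (illustrative only):
```
#include <iostream>

int main()
{
    int arr[3] = {10, 20, 30};

    int& ref = arr[0];   // a reference: another name for arr[0]; no arithmetic
    int* ptr = arr;      // a pointer: can be moved through memory
    ++ptr;               // now points at arr[1]

    std::cout << ref << " " << *ptr << "\n";  // prints "10 20"
    return 0;
}
```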
My two-penniworth
JamesM wrote: I would tend (unfortunately) to disagree with Solar.
No problem. (Casual) disagreement and discussion brings deeper understanding.
JamesM wrote: As a computer science student myself...
(Note: I never got the chance to go beyond the first semester in CS. I learned my first half-dozen languages by self-study, and picked up the underlying principles only much later. I think that might be part of why our POV differs so radically.)
JamesM wrote: My teachers [...] focused mainly on teaching algorithms, problem solving using programming, essentially the methods which make people good programmers.
Difficult to do if you lack the vocabulary to express algorithms in. I know that in formal CS study, that vocabulary is taken from the field of mathematics and (theoretical) CS itself, but that kind of vocabulary is basically unknown outside of the math / CS faculty.
JamesM wrote: A good programmer can migrate from any language to any other language. It is only the syntax and some control structures that change, not the methods used.
Agreed. However, the first language is the most difficult to learn (so it really doesn't hurt to have them learn it in the lab instead of in their first job position), and it does shape your way of thinking about problem solving.
JamesM wrote: My lecturers started us out using MIT Scheme.
And that is where the CS world and the "hands-on" world differ dramatically. I keep hearing that Lisp (and its dialects) can teach you wonders about how problems can be solved and data represented, but at the end of the day, "no-one" uses Lisp outside of the lab (except for the Emacs people, but they are {beeep} anyway).
JamesM wrote: Scheme is [...] completely different to most other imperative languages.
As I said, the first language is hardest, and shapes your way of thinking. Yet you are suggesting that people should go through this with a language they will probably never use again. They learn lots of stuff, sure, but they still most likely have to switch to a different language in their first job - most likely an imperative / OO language.
That does not strike me as a win-win situation but a tradeoff.
JamesM wrote: On to more complex topics - scope and environment, closures [...], and making ADTs (Abstract Data Types) out of procedures (you message-pass into the procedure, local variables being stored in the closure).
Being a professional for over seven years now, who has got excellent ratings for his work in a variety of fields and languages, I still don't really know what a "closure" is (or a "lambda calculus", for that matter), and every time I get around to looking it up, I dismiss it after a few paragraphs because I feel it has little impact on my everyday work. (*)
JamesM wrote: I feel personally that explaining pointers right from the start can have a negative effect on people. It is a difficult concept to understand, and while Solar may have 'got it right', using good teaching methods etc., it can all too easily backfire.
Yet, pointers are something I encounter every day.
JamesM wrote: I would say: Teach them to walk before they run.
I think both approaches - yours and mine - do that. However, once we get our respective classes walking / running, they are moving on two different roads: your class surely knows the theory behind it all, and can probably design a parser or compiler-compiler no problemo, while my class knows at least one mainstream programming language and can solve everyday problems in an everyday project right away.
There is a large intersection of the two fields of knowledge, and it is rather easy to switch between them if you got the skill it takes, but we're not aiming at the same target.
JamesM wrote: You don't need to program in C/C++ to be a good programmer, or indeed to be a professional programmer.
I never said that. It is merely a good starting point. It is a language that can do three things at once: demonstrate 98% of all the ugliness, bad habits and horrors programming has to offer (preparing people to do maintenance work on legacy code); demonstrate programming at the really basic level, which gives you an understanding of how a computer "works" (preparing people to do development work at a lower level than Java or Python); and be used to design elegant, efficient and useful software in the "real world".
I don't know of any other language that allows all three things at once.
PS: (*) I just looked up "closures" on Wikipedia, and left with the impression that they are somewhat akin to functors in C++, basically language-dependent, and thus I don't really have to know them until I work on a project using a language that actually supports them - so they aren't anything I "need" to be a programmer. Correct?
Every good solution is obvious once you've found it.
Solar wrote: (Note: I never got the chance to go beyond first semester in CS. I learned my first half-dozen languages by self-study, and picked up the underlying principles only much later. I think that might be part of why our POV differs so radically.)
Re-reading my post, I think my 'approach' is really geared more towards a longer course as opposed to a crash course on coding.
Solar wrote: Difficult to do if you lack the vocabulary to express algorithms in. [...]
Advocates of Lisp dialects (I'm not one of them, by the way; I haven't used Scheme since first year of university) say that it is a language in which it is extremely easy to cook up a test program or an algorithm implementation. In relation to C, IMHO Lisp dialects are an order of magnitude easier to pick up and use, so they can get people with less natural flair coding in less time. (You cannot pretend that it doesn't take natural ability to cognitively cope with pointers immediately - apologies for the double negative.)
Solar wrote: Agreed. However, the first language is the most difficult to learn (so it really doesn't hurt to have them learn it in the lab instead of the first job position), and it does shape your way of thinking about problem solving.
Agreed. It seems on this point you are taking my point of view more than yours; I'm not sure if that is the case or if I'm misinterpreting. It is important that the first language be a simple one (as it is the most difficult to learn), and C does not qualify as that. (Sorry, but it really doesn't. Take the posts from mohammed in the OSDev subforum as an example.)
Solar wrote: And that is where the CS world and the "hands-on" world differ dramatically. [...]
Here I would argue that the intention is not to imply that you should use Lisp in your everyday coding life. It is to teach a language that can be picked up easily and used quickly to illustrate concepts.
An example: my lecturer had a laptop and projector, and would write algorithms from start to finish on the screen in front of us, in about a minute. This would take a lot longer in C, and would probably see the lecturer use ready-made programs (here's-one-I-made-earlier style), which IMHO is not as good.
That was quite a bit of an aside - the point being that this initial language is meant to teach principles which can then be ported to C, C++, Assembler, PHP, Perl etc etc (remember that not everyone uses C, many people make their money via web development, and these principles are just as valid there, possibly more so given that most webdevs have never heard of a binary search...)
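Binary search is a handy example of such a portable principle; here is a minimal C++ version (my sketch - the same logic carries over to Perl, PHP, or anything else):
```
#include <iostream>
#include <vector>

// returns the index of key in the sorted vector v, or -1 if absent
int binary_search_idx(const std::vector<int>& v, int key)
{
    int lo = 0, hi = static_cast<int>(v.size()) - 1;
    while (lo <= hi) {
        int mid = lo + (hi - lo) / 2;     // midpoint without (lo + hi) overflow
        if (v[mid] == key)     return mid;
        else if (v[mid] < key) lo = mid + 1;
        else                   hi = mid - 1;
    }
    return -1;
}

int main()
{
    std::vector<int> v = {2, 3, 5, 8, 13, 21};
    std::cout << binary_search_idx(v, 8) << "\n";   // prints 3
    return 0;
}
```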
And I use Emacs
Solar wrote: As I said, the first language is hardest, and shapes your way of thinking. Yet, you are suggesting that people should go through this with a language they will probably never use again. [...]
Yes, and where's the harm in that? Most people don't learn only one language and stick with it. Hell, most people start with BASIC or something of that ilk - they don't expect to stick with it for the rest of their lives.
Who knows what the language of the future will be? Granted, it's 90% certain it will be an imperative language, but it probably won't be Java, and although C has stood the test of time so far, surely something else will come along. To reiterate: it is much less the language that matters and much more how you use it.
Solar wrote: Being a professional for over seven years now, [...] I still don't really know what a "closure" is (or a "lambda calculus", for that matter) [...]
A closure is code and data combined - that is, code with an associated environment and scope that can be passed around as a parameter. The code in a closure is sometimes mutable, but not always. (Think Ruby, modifying classes dynamically.)
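In C++ terms it is indeed close to a functor; with C++11 lambdas (which postdate this discussion) a sketch might look like this:
```
#include <functional>
#include <iostream>

std::function<int()> make_counter()
{
    int count = 0;                  // the local variable lives on in the closure
    return [count]() mutable {      // code + captured environment, as a value
        return ++count;
    };
}

int main()
{
    auto next = make_counter();
    std::cout << next() << " " << next() << " " << next() << "\n";  // 1 2 3
    return 0;
}
```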
JamesM wrote: I feel personally that explaining pointers right from the start can have a negative effect on people. [...]
Solar wrote: Yet, pointers are something I encounter every day.
Me too, but that does not mean that they are the best teaching tool in the box. (A personal opinion; please don't take this as an affront - you have much more experience than me and I'm just trying to put my point across.)
Solar wrote: I think both approaches - yours and mine - do that. However, once we got our respective classes walking / running, they are moving on two different roads: [...]
I would agree, and possibly my approach is why so many university graduates are initially air-headed about the world of work. For this reason a middle road might be appropriate.
Solar wrote: It is a language that can do three things at once: demonstrate 98% of all ugliness, bad habits and horrors programming has to offer
Is that a good introduction to programming?
JamesM
JamesM wrote: In relation to C, imho lisp dialects are a magnitude easier to pick up and use, so can get people with less natural flair coding in less time. (You cannot pretend that it does'nt take natural ability to cognitively cope with pointers immediately - apologies for double negative).
I found (and still find) it extremely hard to wrap my thinking around a functional approach, and I know this view is shared by many others who didn't enjoy CS at university. Perhaps it's a different flavor of talent - those who easily "see" the machine code behind imperative programming, and those who easily juggle the more mathematical approach of functional programming. The latter end up at university doing CS, and wonder why those who took up programming as a hobby or made it into the business without a degree sometimes cannot follow their talk.
The point remains that the "nicety" of LISP-alikes is mostly a lab thing, while pointers (or references, to avoid the Java camp stoning me) are bread and butter.
JamesM wrote: It seems on this point you are taking my point of view more than yours. I'm not sure if that is the case or if I'm misinterpreting.
Part misinterpretation, part bad wording by me.
Instead of continuing this point-by-point, let me summarize:
You are looking at a lengthy type of education that does much on the theoretical level, tightly interwoven with mathematics and algorithm theory, with even the code itself mostly being used to express algorithmic ideas. People might come out of this being great thinkers and designers, getting a high-paid job where they might never have to touch a compiler again.
I am looking at a very goal-oriented type of training with a focus on today's problems - which, all too often, means finding out why the code left behind by some external contractor doesn't work correctly. While you have to implement some kind of algorithm every now and then, you are seldom expected to think it up yourself rather than look it up somewhere, and getting those pointers right is just as important, because there's no-one there to do it for you.
LISP is good for expressing algorithms and talking about programming issues on a high level (lambda calculus, anyone?). C++ is good for showing you how low-level programming works, getting you up-to-speed with a mainstream high-level language, and showing you what errors or shortcomings you might encounter in legacy code. It also looks good on your resume.
JamesM wrote: Is that a good introduction to programming?
Yep, I sure think so. Algorithmic design is only one small part of software engineering (a part in which your pupils may well excel). Implementation, extending functionality and bug hunting are another part, and I sure don't want a maintenance coder who doesn't know everything there is to know about pointers.
Bottom line: Different goals, different ways.
Every good solution is obvious once you've found it.
Solar wrote: Being a professional for over seven years now, [...] I still don't really know what a "closure" is (or a "lambda calculus", for that matter), and every time I get around to looking it up, I dismiss it after a few paragraphs because I feel it has little impact to my everyday work. (*)
To expand on this: as a self-taught paid professional who consistently deals with college-taught coworkers myself, I would say that my college-educated peers, even after years of employment, lack *real* practical knowledge... the kind of knowledge that makes the difference between calling yourself a programmer (a genuine problem solver) and calling yourself a coder (a genuine symptom solver).
It's fine and dandy that you know what fancy names are applied to certain functions... but do you *really* know how, when and why to use them???
So, from a more senior perspective, and as a potential future employer, I definitely would like to see more students come out of college with more practical application experience instead of a heavy dose of theory. Also, it would be preferable if they had knowledge of a traditional programming language (e.g. ASM/C) instead of merely what is popular at the moment.
This approach should easily solve the other problem, how to separate those who *really* want to be programmers from those who think it is just today's cash cow.
In the end, the same old saying generally applies to your teaching method and priorities: Time, Price, Quality... pick any 2
However, good universities (courses in computer science, note the science part) are *not* training students for programming jobs.
Instead, they are training students to be able to *manage* programmers, or to produce design requirements which programmers then code up.
They would argue (this was told to me on my first day at uni) that if you want to learn to program professionally, you should get a vocational qualification. They teach you how to make lexers, compiler-compilers, hardware (our course had a very nice section on ASM for the Z80 chip), etc., but expect the students to acclimatise themselves with the languages they intend to use.
Note that I know many universities shun this, instead giving lots of courses in Java and Web2.0, which I personally think is totally useless (Web2.0, not Java) - any halfwit can learn to code PHP ffs, why not teach something useful?!