Muazzam wrote:(Not sure if computer science really counts as science.)

There are aspects of informatics (my preferred term) which are a science, or at least a research field of a sort.
The problem I have with the term 'computer science' comes from the fact that it almost invariably refers to other aspects of the topic which aren't even remotely scientific, or even engineered. In the US at least, most informatics coursework is more about training for coding as a trade (which applied informatics often is), with a total focus on learning specific languages, techniques, frameworks, etc.
Working programmers (and informatics theorists) need to know at least one practical programming language and have exposure to several others, and those who are more conscientious (or at least, not working in a fixed niche such as maintaining COBOL programs indefinitely) keep picking up new languages throughout their careers. However, the languages and frameworks themselves are not really the core topics of 'computer science' or 'software engineering', or even of programming as a trade. On their own they aren't even enough to practice the field as a trade, which also requires a more abstract understanding, never mind the theory and research aspects.
Most universities here give only the most rudimentary coverage of even basic topics such as data structures and algorithmics, either of which is large enough to fill a curriculum on its own, while giving almost no attention to the interpersonal communication and project management skills which are the real bread and butter of working software developers.
By trying to split the difference, while putting most of the focus on what are really peripheral topics that have to be learned and re-learned repeatedly anyway, they dilute the courses to the point of losing most of their value. That is why so many developers and theorists alike dismiss university degrees as irrelevant. They shouldn't be, but there is simply no way to cover enough of the topic in a four-year course while also providing a liberal (in the sense of broad) education, which is after all the original purpose of universities.
Personally, I think we need to restructure the whole thing. We need three or four different tracks, in addition to the separate MIS track many schools already offer, each with different goals and endpoints. All of them would be graduate curricula, not undergraduate - even for trade programmers, four years isn't enough. Perhaps a 'pre-informatics' track could be offered at the BS level, but maybe not.
Also, all of them would have a separate two-year apprenticeship, above and beyond the degree work, with a certification at the end.
I actually think - despite it being against my own self-interest - that it would be best to have commercial (paid) software development regulated the way medicine and law are - or, more appropriately, the way every other engineering discipline is. However, whenever I have mentioned this over the past 20 years, people respond with 'burn the witch!', or at least dismiss it as impractical.
I have often gotten the answer that, unlike those fields, software isn't something that can destroy a person's life or livelihood when done incorrectly, but that's so obviously wrong that I can't help wondering what planet they live on where programs don't affect things. Bad software does take lives - anyone who knows the stories of the Therac-25 radiation overdoses or the clock-drift bug in the Patriot missile system knows this. Bad software has caused stock market crashes and rocket launch failures. It misleads, it facilitates theft, it wastes people's time. And we've got almost no software today that isn't bad. We can't go on like this.
It is a trite cliche that if we built houses like we build programs, the first woodpecker to come along would destroy civilization. But architects are licensed for a reason. I think programmers should be too, for the very same reason - too much is at stake if it goes wrong.