Computer Science (by alaric)
Is a Computer Science degree useful for people who want a career in software development? Many who work in the field come from physics, maths, or electrical engineering degrees, and do perfectly well. There's a widespread feeling that the concepts taught on computer science degrees, such as formal logic, proving the correctness of algorithms, functional programming, compiler theory, and so on, are at best only vaguely useful in "real-world" software engineering. There's a sort of warm fuzzy feeling that knowing these things makes you a Better Programmer, even if you never use the knowledge directly, because you're more aware of the underpinnings of the tools you use. But I don't think anyone has ever shown a real benefit, with the obvious exception of people who go into niches such as compiler development, or writing tools for mathematicians...
Software development, in practice, is mainly engineering; often just following simple plans in obvious ways, like bricklaying. It takes skill to do it neatly and well, but not imagination or theoretical background. Familiarity with tools such as off-the-shelf libraries and standard system interfaces like POSIX is probably more useful than Prolog programming to most programmers. Debugging is more valuable as a skill than using natural deduction to prove the correctness of algorithms.
But that's not to say that computer science is useless. Many modules in my computer science degree were engineering-based, looking at practical topics such as building reliable distributed systems, dealing with concurrent access to resources, databases, networks, and operating systems. Those courses covered how things like TCP stacks are built, but that's necessary information if you're to use them properly; information required by anyone who has to do a good job of writing network software. And the theoretical modules, on semantics, functional programming, logic, Prolog, and formal methods, were useful to me as a special case: somebody interested in building new programming languages. A small minority of us nerds-among-nerds bury our heads in topics like continuation-based models of concurrency, then emerge at the end with practical tools such as programming languages, threading libraries, and distributed agreement protocols that the rest of the nerds can use to build applications.
However, an electrical engineer will be taught programming aimed at writing embedded software. It will be approached as an engineering activity: goal-oriented and pragmatic, emphasising requirements capture, verification of the result, and debugging. Issues such as working within the constraints of the hardware will be covered. It's no surprise that electrical engineers are widespread and successful in the software industry. But the electrical engineers who make it in software have had to do a lot of learning in their own time, and as such, they're harder to select; they need to be individually interviewed in depth, rather than rolled off the university assembly line pre-tested to a known standard.
So perhaps computer science degrees need to diversify further. Mathematics is often split into Applied and Theoretical sects; the distinction is sometimes arbitrary, with most topics straddling the divide in some way, but they are taught with different emphases. Theoretical mathematicians are better trained to go into mathematical research in academia or the more abstract R&D teams, while applied mathematicians are primed to dive into practical problems in statistics, simulation, and optimisation. Perhaps we need something similar in computer science. I know that most degrees are modular, and mine let one end up with a degree title reflecting the specialisations one took, but I'm not talking about modules - I'm talking about a fundamental shift in emphasis in the degree, from day one. Everyone should start off with a year of practical software engineering, because even the most abstract theoretician needs to know how their work will be applied (and have the skills to build implementations of their theories, so they can be tested and then applied by others). Teach enough about compilers and computer architecture to give students a head-start in optimising their code, without going into the detail required to build compilers or design CPUs. Give a nod to formal methods by showing how to design correct algorithms by informal argument.
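To illustrate the sort of informal argument I mean, here's a minimal sketch (in Python, chosen purely for illustration; the function and test values are mine, not from any particular course): a binary search whose correctness is argued with a loop invariant stated in comments, rather than proved in a formal system.

    def binary_search(items, target):
        """Return the index of target in the sorted list items, or -1 if absent."""
        lo, hi = 0, len(items)
        # Informal argument: at the start of each iteration, if target is in
        # the list at all, its index lies in the half-open range [lo, hi).
        # That's true initially (the range covers the whole list), and each
        # branch below narrows the range without ever discarding target, so
        # if the range becomes empty, target cannot be present. hi - lo
        # shrinks on every iteration, so the loop terminates.
        while lo < hi:
            mid = (lo + hi) // 2
            if items[mid] == target:
                return mid
            elif items[mid] < target:
                lo = mid + 1  # target, if present, is to the right of mid
            else:
                hi = mid      # target, if present, is to the left of mid
        return -1

    assert binary_search([1, 3, 5, 7, 9], 7) == 3
    assert binary_search([1, 3, 5, 7, 9], 4) == -1

That's roughly the level of rigour I'd want from the first year: enough to catch the off-by-one errors, without dragging anyone through natural deduction.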
Then in the second year and beyond, let it be down to modules; the software engineers can go into things like networking, databases, graphics, operating systems, high-performance computing, distributed systems, and so on, depending on their desired specialisation. The theoreticians can go into abstract topics. And by all means, at the end, give them a Software Engineering degree if they did mainly software engineering modules, Computer Science if they did mainly theoretical modules, and something like "Applied Computer Science" if they did a mixture. Don't restrict students' choices, unless modules have an actual dependency on the knowledge from previous modules; but at the same time, give them guidance by explaining which modules will help them on different career paths. And don't force software engineers to spend their time learning abstract stuff they'll resent, in the vague hope that it will make them better programmers; it's no more useful to them than the courses on filter design were to the electrical engineers now working in software!