Any CS course that does not teach students “the hard way” is doing them a disservice, and represents everything wrong with the industry.
Learning CS is not about learning how to get a big tech job at a fancy company; it’s about igniting the passion for computing that so many job applicants today seem to lack. Twenty years ago, it seemed like anyone applying for a CS job was a nerd who wouldn’t shut up about computers.
For some, learning CS is also learning that this field might not be for you, and that’s okay. Just bow out and pursue something more tolerable instead of proliferating shitty low-effort, low-passion software in our world.
I feel it is essential that a CS curriculum be timeless in the way physics or math is. So yea, I would expect that if I went back to my university and saw what my old professors were teaching, it would still be the same theoretical, algorithmic, hand-coded work in low-level languages or assembly. I would be very disappointed if they were just teaching students how to prompt AI.
Mind you, as a student at the time I did not understand why we were doing all that old stuff instead of learning the cool modern things. I understand why now, but I wish the professors had explained it a bit more clearly, so students wouldn’t feel misguided.
My ideal curriculum would go through the entire evolution of computing, so that in the final years you end up at modern computing. You start at basic electricity and the Turing machine (there’s a toy sketch of what I mean at the end of this comment), and somewhere in the middle you learn about neural networks (I learned those around 2000, and they were old technology even then). In the end we did cover all those topics, just not in that order; taught chronologically, it would have been a very straightforward curriculum.
When you graduate, you have a full understanding from bottom to top.
That's how I would have loved it, but maybe for others that would have been too boring, so they mixed it up.
In the end I got great value from my master’s in CS. All the practical things you learn on the job anyway, and I definitely learned a lot in those first few years. But my education lets me, on certain occasions, go further where other developers reach their limit.
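To make the “start at the Turing machine” point concrete, here is the toy sketch I mentioned above: a minimal machine that increments a binary number. This is my own illustration, not from any actual syllabus; the state names and transition table are made up for the example, but it shows how little machinery the formal model needs.

    # Toy Turing machine that increments a binary number.
    # Transition table: (state, symbol) -> (new state, symbol to write, head move)
    RULES = {
        ("scan",  "0"): ("scan",  "0", +1),   # walk right over the input
        ("scan",  "1"): ("scan",  "1", +1),
        ("scan",  " "): ("carry", " ", -1),   # blank past the end: turn around
        ("carry", "1"): ("carry", "0", -1),   # 1 + carry = 0, keep carrying
        ("carry", "0"): ("done",  "1",  0),   # 0 + carry = 1, halt
        ("carry", " "): ("done",  "1",  0),   # ran off the left edge: new leading 1
    }

    def run(tape_str):
        tape = dict(enumerate(tape_str))      # sparse tape; blanks are implicit
        state, head = "scan", 0
        while state != "done":
            state, write, move = RULES[(state, tape.get(head, " "))]
            tape[head] = write
            head += move
        return "".join(tape.get(i, " ") for i in range(min(tape), max(tape) + 1)).strip()

    print(run("1011"))  # prints 1100, i.e. 11 + 1 = 12 in binary

A first-year student can trace this by hand, and everything later in the curriculum (registers, instruction sets, compilers) is just layers on top of this kind of table.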
Yea, I think a general history of computing that teaches from first principles would be great. It could help students realize that neural networks and transformers aren’t really new concepts; they just needed the data and hardware to catch up. That could dispel a lot of myths and magical thinking about AI.
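As a concrete example of how old the ideas are: Rosenblatt’s perceptron learning rule dates to 1958 and fits in a dozen lines. The sketch below is my own toy illustration (the AND task, learning rate, and epoch count are arbitrary choices of mine), but the update rule itself is the classic one.

    # Rosenblatt's 1958 perceptron learning rule, training on AND.
    data = [([0, 0], 0), ([0, 1], 0), ([1, 0], 0), ([1, 1], 1)]
    w, b, lr = [0.0, 0.0], 0.0, 0.1    # weights, bias, learning rate

    for _ in range(20):                # a handful of epochs converges here
        for x, target in data:
            pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
            err = target - pred        # the whole "learning" step:
            w[0] += lr * err * x[0]    # nudge weights toward the target
            w[1] += lr * err * x[1]
            b += lr * err

    for x, target in data:
        pred = 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0
        print(x, "->", pred)           # [1, 1] -> 1, the rest -> 0

The jump from this to modern networks is mostly differentiable layers, scale, and data, which is exactly the “hardware and data caught up” point.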