Someone without a formal CS education who's selling programming courses telling you that formal CS courses don't necessarily teach you programming (duh).
He has a point that if you do a Master's you might go without the extremely useful basic algorithms and data structures courses, but that is easily avoided by choosing a program that lets you take them. For other topics that aren't strictly programming, e.g. AI, ML, game theory, cryptography, quantum computing, probabilistic methods, optimization, ... it would take a significant amount of dedication to match through self-study the understanding that a rigorous course with problem sets can give you.
> For other topics that aren't strictly programming, e.g. AI, ML, game theory, cryptography, quantum computing, probabilistic methods, optimization, ... it would take a significant amount of dedication to match through self-study the understanding that a rigorous course with problem sets can give you.
You mean, the same amount it would take to do it through a university program? I fundamentally don't believe the material is unavailable. If you've done a CS undergrad at a respectable university, you should have enough instinct to build a curriculum to become well versed in those subjects.
I am going through the autodidact process for all of those subjects except quantum computing (yes, it's brutal, especially alongside a full-time job). But with the amount of MIT OpenCourseWare, books, and, frankly, just wiki pages available, it's been easier than doing it at the pace of a university course.
Yes, following along online can give you as much information as sitting silently in lectures and recitations and completing problem sets in isolation in your dorm room; but there’s way more to be learned by thoughtfully engaging in discussions with your professors, TAs, and classmates. In fact I’d argue the bulk of my education came from the latter.
In short: Getting A’s at MIT is simple. Challenging yourself to challenge your colleagues is difficult. Not to mention all the labs.
As a corollary, I’d go on to argue everyone who paid full tuition for online classes during the pandemic got ripped off on a massive scale.
I am someone who has a master's degree, and I have taken Distributed Systems, Computer Architecture, and Advanced Algorithms at Bradfield. In my experience, I would rate Bradfield better than a master's degree, for these reasons:
1. It was more relevant and practical. It was taught by people who have built real things in the real world.
2. Better ROI on time spent learning. In about 8 weeks (roughly 2 hours every week) plus some self-study, I had the confidence to apply what I learned and to continue learning more. Time comes at a premium for me, as I am working full time and have a family to take care of.
3. Cheaper than a university course. It costs roughly 2k USD per course.
During the same time, I also tried OMSCS from Georgia Tech. I felt I got way more from Bradfield's courses while spending less time and money than on OMSCS. That said, it's just me, and YMMV.
Well the material that the authors have collected on teachyourselfcs.com is:
1) free
2) taken from outstanding courses at Berkeley, Stony Brook (the illustrious Steven Skiena), MIT, and Stanford.
As I see it, this is evidence that those schools have some good courses. Graduate systems courses are pretty great as well, and of course you can hit other areas such as computer graphics, AI/ML, etc..
For formal education, I'm a bit biased toward multi-year programs (because it takes a while to learn this stuff) and schools (like those above) with good CS courses. But there's nothing to say that the authors' unaccredited graduate program is bad; based on the above I'd expect it's probably decent, although I worry about the 15 hours x 52 weeks, because I don't think that's enough time. In my experience a good systems or engineering course takes about 200 hours.
For other topics that aren't strictly programming, e.g. AI, ML, game theory, cryptography, quantum computing, probabilistic methods, optimization, ... it would take a significant amount of dedication to match through self-study the understanding that a rigorous course with problem sets can give you.
Also, you're going to have an extremely difficult time getting to work on that stuff without a degree, given that even PhDs often end up on regular business bullshit.
In my experience, it's not the degree that gets you the job, it's the research you got published in the niche you're applying for. The number of people with excellent publications in such a niche, but no degree, is awfully small.
On the contrary: a master's degree is often easier than an undergrad degree. You get a lot more freedom to do whatever the fuck you want. In most universities, master's students can teach or do research and get tuition forgiveness.
If you want the experience of being on a college campus for a while without the bullshit involved in {undergrad, PhD, jobs}, a master's degree is the perfect middle ground.
I did a master's degree. I didn't do very well in some courses, really loved others, and learned how to read academic papers and what research would look like if I wanted to do a PhD. But I also had enough free time to make friends, hang out, and go on road trips and to parties. It was a lot of fun!
Academia is running out of ways to lure people into that trap. Evolve already. I believe academia as we know it is in steep decline, and the problems in CS are exacerbated by the faster pace of software.
Any degree, by the time you have it, might be deprecated or eroded to the point of being pointless. If academia doesn't shorten the lag between education and application, only wealthy people who can afford a lifestyle without income will seek degrees, for entertainment, and that's not sustainable for keeping a society educated. The value proposition is just not viable.
Education has to go the tutorial way or it will continue to decline.
> Academia is running out of ways to lure people into that trap. Evolve already. I believe academia as we know it is in steep decline, and the problems in CS are exacerbated by the faster pace of software.
That’s just CS you are talking about; getting into an important emerging field like Material Informatics would be veeeery hard without an academic setting. Even with CS… honestly, I still disagree that academia doesn’t have a purpose or value. I was just reading some good research papers recently; one of my favorite recent topics is new ways to discover spy cameras using smartphones. Also, some of our greatest minds have received a lot of benefit from academic environments. I understand bachelor's degrees from a lot of places are overpriced, and I agree that those seem like a trap and the value might be questionable.
> Any degree, by the time you have it, might be deprecated or eroded to the point of being pointless.
My degree cost me about ~$5k out of pocket for tuition. I stayed in state, and my state had a program that paid tuition if you earned above a 2.3 GPA in high school. That gave me new friends, sports clubs to join, and an early understanding of subjects I still use today, like psychology and finance. Not to mention my CS degree has paid me back in dividends, and I still use knowledge from it today. I have maaany friends in the same boat as me.
> Education has to go the tutorial way or it will continue to decline.
Sounds like a version of tutorial hell potentially, though on-the-job training has its place. AI, ML, game theory, cryptography, quantum computing, probabilistic methods: the list goes on of subjects where it would take a significant amount of dedication to match through self-study the understanding that a rigorous course with problem sets can give you.
> Even with CS… honestly, I still disagree that academia doesn’t have a purpose or value
I said their model is not viable and they need to evolve. Academia and education have a lot of value.
> I was just reading some good research papers recently; one of my favorite recent topics is new ways to discover spy cameras using smartphones
That research paper probably started with the author searching on Google and finding a curious mind showing how they were able to use their phone to spot IR lights.
> Also, some of our greatest minds have received a lot of benefit from academic environments
I agree; we need a central resource of our greatest minds, and that's why I hope academia finds a way to evolve, because widespread education is doomed if it doesn't. It's important to note that some of our greatest minds were autodidacts.
> My degree cost me out of pocket about ~$5k for my tuition
Most people don't even know how they are going to pay for their next rent payment or meal. They should still be able to access knowledge and education.
> Sounds like a version of tutorial hell potentially
Any subject, as complicated as it might seem, can be learned by a devoted person digging up information online. Academia is no different in that respect; it usually just provides a direct link to the information, and students have to learn it on their own.
I think you're missing the distinction between education and training.
In my InfoSec master's program, several students complained that the tech we were studying was really old. The professor simply responded that this is education, not training. You are expected to learn how to analyse the security of systems in general; the specific tech used is just an example.
Quantum computers don't obsolete CS. They're a kind of coprocessor like GPUs and can't do most tasks faster than classical computers (or, actually, at all).
I did specialize in "AI" in undergrad a year before deep learning was discovered and so most of what I learned about genetic algorithms/K-means/MCMC turns out to be kind of useless, but oh well.
Can't say my professor thought much of it at the time. I did have to actually learn statistics though, which should put me at an advantage over ML engineers if I get back into it…
In CS, a lot of the knowledge that is very valuable today was discovered years ago and is taught well in master's programs. As a concrete example, I know some great database developers making >1 million a year because of their expertise. They tell me the vast majority of the knowledge they apply at work was discovered in the 80s. They're up to date on newer stuff, but it's just not what gets used.
Could that all be invalidated by some new hype technology? Possibly, but even then it would be decades before these older systems are ripped out.
I think you're imagining a much more rapid change in technology than usually happens. Heck, I knew someone whose master's was focused on vacuum tubes and who graduated the year the semiconductor was invented. This was one of the more rapid seismic technical shifts in history, and it still allowed him to have a long, lucrative career with legacy systems. I wouldn't worry about CS knowledge becoming outdated anytime soon.
Academia will always try to teach the best way around a problem. So you won't see academia, in an updated program, trying to teach these old ways, even though there's still going to be a market for legacy technology. Legacy roles will usually be highly profitable for that same reason.
This is pretty much the real reason I’d consider pursuing a Masters degree. Being in an academic environment for a while sounds fun, even if the resulting degree is fairly useless or redundant. I’d love a second chance to be exposed to people who could become new friends, while also learning a thing or two maybe.
It is possible to get a position as a graduate assistant or teaching assistant where the university pays for your tuition and pays your living expenses in exchange for you helping the CS department with various tasks (which can include teaching undergrad classes if you're a TA).
I did that for a year and then ended up a class short of graduating because of a scheduling mistake. Since staying a grad assistant wasn't an option, I ultimately dropped out and started working instead. Getting the GA job required me to take the GRE (I scored 170 on the math section and 165 on the verbal, 90-something percentile for both, with only minimal studying) and to be interviewed about some undergrad CS concepts.
If you're able to do grad school in CS and you've already got permanent legal status in the US, it doesn't really make financial sense even if you're a GA or TA. You can make much more just getting a job. That said, I don't regret doing it because I probably enjoyed that year more than any of my undergrad years (this was pre-pandemic) and I didn't go into any additional debt to do it!
you can get a stipend along with tuition forgiveness, so all you're really losing is $150k+/year in opportunity cost by goofing off in academia rather than working
He's saying a reasonable software developer salary is $150K which is likely true if someone is a software developer who is considering a masters program just for fun.
Is that a reasonable salary for a software developer just out of school? Only a faang would approach that amount for a new grad and that includes the extra bonus/stock grant.
In 2019 my employer posted the necessary H1B notices, and the new grad jobs were 115-130k. This was cash salary, and I believe the stock grants were minimal/nonexistent at that level. This was not a faang-level public company, nor a high-prestige/unicorn startup, in Seattle. 150k seems to me to be a pretty reasonable ballpark number for a software developer working in one of the cities with a high cost of living with a well-established ecosystem of companies competing for developers.
Lots of valid criticisms here against academia, but I'll take this time to get on a mini-soapbox about communication skills. The biggest value-add I honed over the course of my academic career is my ability to communicate technical topics. That isn't to say every master's student or CS degree holder is definitely better at tech comm than a bootcamp grad; in fact, for entry-level positions, the winning combo there would be a communications/English degree plus a bootcamp.
And this isn't just making presentations and writing documentation. It's making effective use of your time in standups. It's conversations about your future relationship with your employer. It's knowing how to ask the right questions, respectfully, to weed out bullshit when digging through other technical documentation. It's taking a bold new idea that entered your head and figuring out the best way to evangelize it at your organization.
I think in the ideal case, a master's degree in a computing discipline hits the intersection between intense training in a subfield of computing that can be difficult to break into (e.g. statistical machine learning, robotics, formal methods, cryptography) and a demonstration of your "mastery": taking a cutting-edge concept in that subfield and communicating it effectively in the form of a project/thesis.
A master's degree isn't guaranteed to make you a "better" programmer, but I would really hope that it would make you much more familiar with the field, and also teach you how to take that familiarity and leverage it to become a more effective communicator.
You should aim for a research environment where you don't have to justify your existence to some "product manager" every 24 hours. Take it from me as an old person: if you work in stupid environments in stupid ways on stupid stuff... what happens is exactly what you'd think would happen.
They are merely a way to understand what the rest of the team is working on and to make sure you have the resources you need to do your job, without somebody having to interrupt you later or you having to interrupt somebody else later. It takes 15 minutes, and it often saves hours of wheel-spinning.
It's a process without a purpose. It is a process that kills flow and causes less work to get done. Everything important must fit within a short story 24 hours later, which means work gets packaged into easy-to-explain daily chunks.
It serves no purpose, because if you are blocked today, waiting for tomorrow morning's meeting is a bad approach; the meeting then expands into discussing an issue that belongs in another meeting. Plus the time wasted adds up.
Asking for help the same day you need it is something seniors do.
Standup meetings treat developers like children and encourage bad habits.
I agree with the sentiment that justifying your existence every 24 hours isn't an effective use of anyone's time. Perhaps "standup" wasn't the right word. But in general, verbal communication/meetings/etc. was the gist.
As a manager, one has to manage many different personality types. Not all people are proactive in communicating that they need help or are lagging. The manager could go to each of their directs and ask these questions individually, but it is more efficient to do it as a group. The group setting also gives coworkers the opportunity to say, “Oh, I know what’s going on. I can help with that.”
Stand ups do not have to happen every day. That’s where a lot of managers fail.
It could mean that. It could also mean that there are some parts of working in a group (instead of solo) that are non-negotiable, such as periodically meeting with that group even if it is on a call.
This post seems light on actual data about the difference between individuals who have a CS master's and those who don't, and instead argues against getting one on a few auxiliary points:
* Opportunity cost
* two tweets from hiring managers saying they don't find it useful
* CS master's programs aren't geared toward non-CS college grads
* professors don't know how to handle online teaching
* programs are cash cows (mentioning two non-top-cs programs)
To address the core point: do those with a master's from a top CS program (say, top 20) know more about programming, on average, than the self-taught? In my experience they do! Topics like how programs work (interpreted vs. compiled languages, memory management), how a program talks to the OS and underlying hardware, and even topics in AI/ML and how to do deep learning. Whether that makes them better at their job depends on their role: making a web application, say, versus working on underlying infrastructure in C.
So I guess to provide an alternative view to the author's: there are good online programs like UT Austin and Georgia Tech where students don't have to take time off work, since they can study at night or on weekends; both programs are low-cost (~$10k) and have professors who do understand online teaching. I think it's important to pick colleges and classes geared toward learning rather than going in with the intention of specific "job training", and you'll likely come out ahead having done it.
What I found quite interesting was that these programs are both lower-cost than the program the author was selling. Georgia Tech and UT Austin are both recognizable names as well.
I think the article carries a sentiment that academia is wasted time. On the contrary. Of course it has its points; I agree with some of them.
I am a self-taught programmer who later got a CS degree, and a self-taught entrepreneur who later (almost) got an MBA.
My experience is that these are much more valuable for people who already know how to do the stuff but are missing the jargon and the thinking of the crowd of exceptionally brilliant people who lived before us. I am talking about the scientists, the inventors of algorithms, math, bookkeeping, all the concepts. How much faster you can convey complex thoughts when you have the vocabulary. How do you even present a spectacular idea if you have to spend half an hour explaining something you should have known has a name?
I never understand how some people get the hubris to think they can do better without all this. Of course, if they feel they could not have used any of it in their professional lives, then getting a degree would probably have been a waste of time for them. Some of these folks are quite capable and can and will do big things. And sometimes you have to choose between your startup and the degree; we know some successful dropouts. Still, I think there is big value here.
I've been self-taught, primarily in the computer networking space, for a little over 10 years, coming straight out of high school. I'd really, really like to get more formal knowledge in the CS field, both out of general interest and to match the increasing scope of what I work with, but I've never been sure how to get started mid-career. Further education was never something I planned for when I was younger; I barely made it out with a high school diploma, because I didn't value chasing education as much as I just enjoyed tinkering with computers. I was lucky to have an in to an internship via a high school robotics mentor to get my career started, and now I'm in a principal role, still on the upward portion of my career, while also traveling a significant amount and working an irregular schedule, so I'm not sure what education path is actually worthwhile yet flexible enough. Do I just start looking up random universities with online courses? Is that even going to give me what I want, or am I just going to end up qualifying for an official piece of paper instead of actually getting a useful amount of new knowledge? Do I even have enough time to get a useful amount of new knowledge on the side?
So I haven't yet pursued a CS degree but whenever I hear of a story like yours it becomes more nagging in the back of my mind.
It appears a common counter-argument is about getting high salaries and the corresponding positions without going into debt, or at least without investing the time and money a degree takes. As we all know, student loan debt is a very real issue. Some of those people might feel they aren't missing anything, whereas in cases like yours there is a desire to find out what they might be missing, or a sense of wanting to be publicly viewed as a true equal in the field.
I think it should be noted that the Georgia Tech Online Master's in Computer Science (known as OMSCS) mentioned in the article costs about 7,000 USD, all-in for a 5-semester program. [1] This seems to me an amazing bargain for a Master's degree from a respected institution in the USA.
The article's argument against programs like OMSCS is that the only feedback is from teaching assistants. This claim is not true. There is (a) automated grading of programming assignments, so your program runs against unseen test cases, (b) informal feedback from TAs and other students on Slack and other unofficial channels, (c) formal feedback from other students as part of grades in certain classes and group projects, and (d) feedback from professors to student questions during video office hours (in some courses).
Yes, lots of assignment and exam feedback does come from TAs, but it is comprehensive and valuable. The TAs are well versed in the course material and relay information and questions from and to the professors. I completed the program and at no point did I feel like I was receiving too little feedback on my work.
I also completed the program. Some classes did better at feedback than others. But honestly, what's so bad about getting feedback from teaching assistants? The TA for Reinforcement Learning wrote the book Grokking Deep Reinforcement Learning and has experience using RL in industry. If that is the worst they can say about OMSCS, it's actually a pretty big endorsement.
They also have an MS in cybersecurity. Depending on the track you take, it can end up being 9/10ths the same classes needed for the CS degree.
I’m currently enrolled and will say the networking is the best part. I’ve found a group of about 50 people in a private slack scattered all over the world. We all bounce things off each other.
It’s amusing to me how this article ended. It really discredits much of the author’s opinion.
I’m a self taught software developer that went back to school to learn the engineering aspects of what I’m doing. Boot camps and online tutorials like what the author appears to be selling teach you how to do a task. Higher education teaches you how to think, learn, and communicate at a higher level.
Do you learn how to build a React app? No. But anyone can go learn that on any of the thousands of websites that promise to turn you into a computer scientist in 6 weeks.
I fully enjoyed my experience in grad school, and wouldn’t trade it for anything. It might not have been the deciding factor in getting a job, but it sure as hell made me more confident.
I think CS master's degrees serve one valuable purpose. They are a gateway for foreign graduates to come to the US, work on their CS master's while doing TA work to reduce the load on the professors, and then get a 27-month work permit.
In fact, it is cheaper to hire foreign graduates under this program than a US national, because you save on FICA taxes, which are approximately 15k per year.
In fact, many universities are already aware of this and have built coursework-only master's degrees with no research required, as that provides more tuition to the university at the foreign-grad rate and lets the students graduate earlier so that a fresh batch can take their place.
My experience: I was interested in machine learning and was working in industry with no CS background. I was accepted to an MS in CS based on my undergraduate work and also programming experience. The program was in-person, not online. Through the courses and contact with professors both in CS and in other departments (Mathematics, Statistics) I developed a far better understanding of the field than I would have without being at a university.
All of the courses were the same regardless of whether or not you were an MS student or a PhD student. Professors actively encouraged MS students to continue on to the PhD.
The article says "If your undergrad degree was in some other field, you can get through an MS in CS without ever taking an algorithms or data structures class." In my department you had to take a minimum number of courses from a few categories. For the category that included the Algorithms course, I would say at least 90% of all students took that course rather than the others on offer. It followed CLRS and moved really quickly for someone with no undergrad background like me. The course had no programming in it, so I have no idea what the author is talking about when they mention programming experience.
I think if the program were online it would have been harder to have the multidisciplinary experience I got in-person. But for people who didn't want that aspect of it, I think an online program might work just fine.
My experience: The first two years of my CS PhD program, which overlapped significantly with the MSc program, were extremely engaging and pushed my intellectual boundaries much harder than OCW or any other online content.
I'm doing the OMSCS program at Georgia Tech, and interviewers always comment on how impressive that is. It's a rigorous program, and I definitely think it's helped me grow as a developer.
It is impressive. Don’t let articles like this make you think it isn’t. Many people start it and can’t finish. Graduating is an accomplishment that should be celebrated.
Higher education is not designed to churn out 10x engineers.
My MS in comp-sci was super rewarding. I loved my classes, my advisors and professors, and it was great to be around students with similar interests and academic experience. I found it easier than undergrad because all of the classes were things I was interested in and better at - compared to many of the liberal arts requirements for my undergrad.
The credentials DID enable me to teach as an adjunct professor which was also great and helped more for my current role than actually taking classes. I recommend teaching to anyone that has even halfway considered it. If not at a university then still go to local meetups and clubs and participate as a speaker.
No one ever made the guarantee that it would make me a better employee or founder but it did get me an immediate pay bump from the company I was at when I completed it and does open doors when roles have it as a requirement.
I'm currently getting my BS in Computer Science through the University of Florida's online program, and I think there is a duality between academic CS and applied CS, where the overlap is massive but college isn't particularly required. I find that many courses and tutorials online cover the topics well. Although these non-degree courses may not be of equal rigour, I leave them understanding the fundamentals needed to succeed in, say, an internship. By self-teaching, I proved that I knew enough in my freshman year of college to land an internship.
That being said, I think someone with CS knowledge is always going to be more valuable than someone without. I really like teachyourselfcs; while in high school I skimmed a lot of the material, preparing for what I had in store. I think it's entirely possible to get good at CS with this method, but the issue is feedback and the stress associated with self-teaching. Self-teaching isn't always as glamorous as it is made out to be; a lot of people need the backbone of an actual program with structure rather than being thrown into a textbook with whatever courses may go along with it. Entirely dismissing degree programs is as unwise as saying they're required; there is a lot of gray between the lines, and people should approach the issue with their own strengths and weaknesses in mind.
The author is correct: you do NOT need a (master's) computer science degree to advance your career. I've worked with plenty of stellar principal software engineers — some with PhDs in CS, some with no formal education, most in between.
And while I am a lifelong learner and continue to teach myself a range of topics that pique my interest, Georgia Tech's OMSCS was such a pleasant experience, and a rigorous one at that.
I'm guessing the OP is bashing OMSCS based on Reddit comments from people who tried a course, found it difficult (because, hello, it's actual graduate school at a real university), gave up, and are now spreading negativity on the internets about something they only gave a half-assed try.
Sure, there are negatives of learning at-scale. Grades are going to be exam-driven (noisier) because papers/independent projects don't scale as easily, which means there's a chance that you do everything right and get a B. Sure, some of the videos are a couple years out of date. Overall, though, I've taken two GT OMSCS courses and so far the quality has been very high... and the professors, in my experience, are also constantly trying to make the experience better and more flexible.
Great to hear. Which two courses have you taken so far, and have you decided on your specialization? I specialized in computing systems, and the courses — especially Compilers — were top notch.
Really excited about Compilers. I've heard it's very good. Haven't decided on a specialization yet; I'm still in that phase where everything looks interesting. TBH, there are ~20 courses that appeal to me, although if I still feel like I want to press on after 10, I'll probably look into pursuing a PhD.
I have an undergrad from a large, anonymous state school, and a masters (earned in 1993) from a smaller, regionally prestigious private engineering school.
I honestly don't believe that I learned anything useful when getting my MS. The upper level undergrad courses that I took at the state school were mixed grad/undergrad, and covered far more material than the ones at my grad school. I sailed through all the coursework because it was an easier version of the courses I'd previously taken.
I applied and was accepted to the PhD program at the school because I thought I wanted to study AI (in the early 90s). I was lucky enough to get a fellowship, so I took an overload of coursework. After 2 semesters, I decided that I was much more interested in systems, and was at the wrong school. So I decided to leave with an MS and did a project over the summer and left.
All that year really did was to pad my resume a bit and put a better known school at the top of it. I think I may have gotten a marginally higher salary at one of my first employers because of that. I'm still thankful for the fellowship, and the knowledge that getting my MS cost me almost nothing beyond delaying my career by a year.
I stuck around another year and got my masters. It was mainly an excuse to go on another solar car race. Even though I delayed getting a job a year, I suspect the masters degree got me more stock options than I would have gotten otherwise, and that stock turned out to be rather lucrative. My wealth of experiences from building embedded systems for solar race cars also played positively into entering the workforce.
The graduate level algorithms course I took was definitely interesting and helpful.
> “Instruction” here includes faculty salaries and benefits, and a typical faculty member spends around 40% of their time on teaching-related tasks, so it’s fair to say that universities spend 10-20% of their budgets overall on teaching.
Reducing teaching costs to just faculty salaries ignores a wide array of elements that I consider teaching, including:
-- Student tutoring/learning assistance centers
-- Graduate student led discussion-sections
-- Instructional lab sections, often taught by staff, not faculty
-- Salaries for librarians, who often help students with class-related research projects
-- Open access computer labs and other facilities necessary to instruction
Most classes at a university are more than just a few faculty lectures; a big chunk of "Academic Support" and "Student Services" should also be included in the cost of instruction.
There's an interesting sort of credentialism talked about here where an MS is seen as a negative signal.
The theory here is that this is because of people doing MSs without a good undergraduate background, although even ten years ago I'd stopped being impressed by an MS on a lot of resumes for a different reason: continuing on to an MS was often easier than getting a top-tier job offer out of school, especially at lower-tier or international schools without big tech recruiting presences.
I was originally in a BS/MS combo, but I didn't complete the MS (guess that was a good move? huh), but I learned a lot of interesting things in the extra 8 classes I took. In a world where engineers are judged on "demonstrated abilities" (even if that's just whiteboarding in an interview) having that much more exposure to math, logic, stats, algos, and operating systems definitely gave me more breadth to do better on both interview + real world projects. So for ramping up your knowledge of solving "CS problems" it's great.
Note that the article here is a sales pitch for kinda the same thing: study the fundamentals, just pay (? i don't see anything about pricing on the Bradfield site) them instead. Left unanswered is what will keep this a reasonable option or a meaningful credential over time. If it stays super small, it's probably too low-profile for people to see it as valuable. If it scales, it probably breaks in the same way as traditional programs.
It seems astonishing that people are paying large sums of money to obtain master's degrees in anything. If you have no opportunity to get paid as a research assistant / teaching assistant and have tuition covered by some other route, like a grant or something, then it's probably not a real graduate program, just a means of extracting cash from aspiring students.
Real research and teaching programs need graduate students and will offer you a package of some sort if they think you're any good.
> it's probably not a real graduate program, just a means of extracting cash from aspiring students.
If it offers a Master’s degree it’s a real graduate programme. Not being difficult, not covering material that is useful, not being impressive, none of these are disqualifying for being a graduate programme. Education schools exist. Law schools exist. Business schools exist. They’re all graduate programmes.
Personally, my undergrad was in electrical engineering so I always felt a little behind as a software engineer. Getting an accredited Master's in software engineering from Harvard extension gave me more confidence in a lot of the classes I would have taken in undergrad. Not to mention, development has moved on in the time since I took my undergrad, so it was kind of a mid-career refresher. There are definitely jobs that like seeing a Master's as well. So in my case I think it was worth it.
I loved 50% of my Master's degree program. Those were the best CS courses I ever took (writing assemblers, compilers, CPUs in FPGA, OS kernels from scratch). I could have presumably avoided the degree and done it all by myself, without the knowledge of a professor, without the support of the TA and without feedback from others, but it would have taken at least twice as long and I am not entirely sure I would have been able to get to the finish line.
Agreed completely. Well-taught graduate systems courses are fantastic. Additional courses might include computer architecture (though maybe that was your FPGA course), networking, graphics, databases, numerical computation, parallel programming, AI/ML, etc..
The primary advantages, as you note, of taking a formal course are that it is well-structured and you get feedback and support.
Secondary advantages include a potentially positive and motivating learning environment, meeting and interacting with instructors and other students, increasing your portfolio of completed projects, and a potentially useful or beneficial degree or certification.
Overall though I'd say the main advantages come from completing the course projects.
If you have the time and motivation, you can teach yourself from the same material, but that usually requires more time and motivation.
It's a shame that formal and self-directed education are often seen as being in opposition to each other. As a field I think we should support and encourage both.
The article seems to argue from a vocational point of view. I don't know about the US, but around here, universities are not meant for vocational training. We've got another educational branch for that; universities are intended for academic training. The value of that for a job is that you ought to have learned how to keep abreast of new developments in the fields and how to make new skills in your field your own. Eg. (simplified) you may not know Java, but you know how to get to know Java - quickly and with great understanding.
That's what a bachelor is supposed to give you. A master on top of that is supposed to dive deep into one area in the field, and show you the basics of research in that area.
If you want to become a FAANG developer, a bachelor should suffice. A master education definitely has its advantages, but I doubt if there is any master programme anywhere geared towards becoming a rockstar developer. Masters aren't about becoming top-notch in something, they're about taking a broader, higher-level view. That sure can help someone on their path to rockstar developer, but it's up to the master how to continue given the acquired skills.
I graduated with a humanities degree from Penn, spent a decade working and eventually became a self-taught developer, then obtained a masters degree in computer engineering through Boston University’s LEAP program. While I agree with most points in the article, getting my master’s degree was definitely the right choice for me. The program I attended also included undergraduate courses to ramp up to the master’s, so it is a little different than what is being critiqued here.
The main omission of the article is that while the author feigns (or projects) surprise that universities aren’t entirely dedicated to teaching, they omit the major benefit of _opportunity_ that students get when attending universities.
I entered the program knowing that I wanted to focus on things like cloud computing and k8s; it turned out my university had a partnership with one of the two main companies working in k8s and this allowed me an opportunity to get an internship.
That’s just my personal example, but the connections and opportunities from attending a university are a pretty fundamental part of the experience and outcomes. It is the other half of the coin for the individual students in the visa cash cow story.
CS degrees give you an academic understanding of CS: theory and how the things you use actually work. This can be useful knowledge but isn’t necessary. So CS masters degrees are good but also shouldn’t be necessary for getting a job.
However you should still require at least a basic “academic” understanding of CS, since that is discrete structures and recursion and other stuff you pretty much need to know to write good software
A decade ago Harvard Extension School's ALM in Software Engineering was a good match for what I was looking for as a self taught programmer with a Bachelor's and Master's degree in another field. There was a good blend of computer science fundamentals along with classes in modern (but not necessarily trendy) technologies. At the time it was also fairly inexpensive relative to other comparable options.
Programs vary widely, but if you can find one with a degree plan that aligns well with your goals it can be valuable. While there are definitely ways to learn everything on your own, I've found the credential helpful as a foot note to a track record demonstrating skills.
My biggest recommendation would be to work on a Master's degree part time in order to be able to do "real" work while studying. A Master's degree with 4 years of real world software engineering experience is going to be so much more valuable than a Master's degree from two years of just focusing on school.
My personal experience is that I was paid by my university to get my master's degree in the late 90s, since I was a teaching assistant for an entry-level programming class (in Java at the time). I got a tuition waiver and a stipend which was enough to cover my living expenses.
One surprising benefit was that re-learning the intro stuff well enough so I could teach it crystalized my understanding of some fundamental concepts like the difference between the stack and the heap which I kinda glossed over my first time through.
At the other end of the continuum, I was able to gain a level of skill in parallel and distributed programming and network programming much deeper than undergrad.
Was it worth it? For the price of $0 and 2 years, yes. If I was paying full price, not sure.
And to those who say the degree's value erodes: fundamental concepts endure. A race condition is still a race condition and a socket is still a socket, even all these years later.
I think there is some confusion here between programming, which is a skill; and engineering which is the application of knowledge.
Of course programmers don't need degrees, the same way that good builders, carpenters, electricians, etc. don't need degrees. In fact, I've always believed there should be trade schools for programming, that programmers should unionize, and so on.
Engineering is typically a different set of skills altogether. It's about understanding how to analyse reality, combine those observations with ideas in your mind, and guide those ideas back into the real world.
Civil and Electrical Engineers working in construction never lay a brick or run a wire (in fact, often there is legislation preventing them from doing so). But a construction project could never proceed without the careful analysis that engineers bring to the table.
The bar to enter the IT industry is lower than it used to be, so you really do not need a CS degree to become a developer. I think that over time, as the supply of these workers increases, the value of university degrees will be restored, along with the meaning of the term "engineer".
Things must be well engineered and they can be without calling a person an engineer.
IMO we should focus on defining what well engineered means in contexts where a thing must be engineered, not handing people a sigil of power to exploit.
A quick master's is almost always worth it in a technical field.
Most Universities have a way where you can spend a single extra year or two max and tack a Master's onto your Bachelor's with testing or a quick project. If you can do this, do it.
On the flip side, if you're not getting a PhD, avoid the thesis Master's like the plague. It will suck up vast amounts of time and isn't worth the risk.
The money bump combined with the doors opened because you have a "credential" is very useful over the course of your career. It also has an impact when layoffs and promotions come around because you can be associated with the "research" groups while pure Bachelor's are blocked by silly HR rules.
IMHO the best reason to get a master's is because you want one. For you. Not to fill a spot in a resume or check off a box on some application. You can go learn all of it yourself with self-study if that's what you're after (much of the curriculum and lectures are freely available), and I really think the value to a potential employer is fairly limited aside from some narrow corner cases. And if you fall into one of those cases you probably know it already.
So get it if you think the feeling of satisfaction and achievement is worth it. Be aware that the major cost by far is not money, but time.
> Unfortunately, MS programs are nominally designed to build upon corresponding BS programs, so most CS master’s programs will expect you to already know the very things you’d like to learn.
For the most part I agree with the author and have heard similar sentiments from people with an undergrad cs education.
I wish people would talk more or elaborate on what it “means” to be “self taught”. To me, taking a course on a subject is basically a subscription to being exposed to a defined set of material. Whether an individual learns that material is up to them (is this considered “self teaching”?).
Formal education (with or without a degree tied to it) can lower the activation energy of learning. But it is by no means the best way to learn for everyone.
The author of the article should replace software engineer with software developer. Not being a gatekeeper but this is the proper term for the role described in the article.
> In my experience, an MS degree has been one of the strongest indicators of poor technical interview performance.
Success in ICPC or grinding leetcode is probably a better predictor of success in technical interviews that are based on algorithm puzzles or competition problems.
But it won't teach you how to write your own compiler or OS kernel.
I completed the program in 2018. I'm glad I did. It was a tremendous amount of work and I was already working full time with a lengthy commute and family commitments. Pretty much every minute of my weekends and 10pm to midnight most week nights. I had to do maybe 3 all-nighters when schedules didn't line up. Some of the classes are much harder than others and should be taken by themselves. Other classes are less demanding and you can 2x or 3x. The video lectures tend to be very well made. Some of the classes make extensive use of Piazza - which I grew to loathe. The assignments can be challenging but also sort of fun. Each class is different though. You're welcome to email me at gmail or yahoo if you have specific questions.
I graduated last year. If you enjoy academics, do it. Don’t expect you’re gonna be building React apps or something. If you want that, do a bootcamp I guess. It’s a traditional program where you’ll use C, Java, and Python to learn how computers work.
I have a CS masters. I got it because my undergrad degree is in math and I wanted to round off my skillset with some CS. I don’t know if I needed it, but I enjoyed getting it.
The only problem with CS masters is that there’s a CS Ph.D. So it really doesn’t have a lot of value when competing for jobs.
My experience: I got a master's degree from a pretty weak school about 10 years ago. There was rampant cheating on tests and I didn't learn that much. While I'm not sure it helped me land any jobs, I do think it did help me get a higher salary at a few places.
Maybe 10 years ago I would have been impressed with any candidate with a masters from a legit CS department.
However, in the last 5 years I've worked with numerous engineers with masters from both "prestigious" universities and also the typical "cash-cow" degrees that are so common today.
One of my previous devs had a masters from a "public-ivy" in computer science, and he honestly was not qualified for anything more than basic CRUD "enterprise". Hate to say it, but he was obviously pushed through the system due to his race and the need for these schools to fill diversity quotas.
Also, I knew a public school middle school teacher - she was bright but had no formal math or engineering or CS education beyond basic algebra with her teaching degree, no programming experience, etc. But she had been able to obtain a masters in CS from a top 10 program via some "Masters in CS for Public School Teachers" program. I also took a few masters courses from this university, and they were extremely difficult, even for me who has a lot of experience and a solid CS bachelors.
For example, some of the qualifying courses included programming MIPS in assembly language, binary numbering systems, algorithms with assumed knowledge of Big O, data structures, dynamic programming, etc. There is no way anyone without both a good undergrad CS degree and some dev experience could legitimately pass these courses, especially when the program was meant to be done by professionals who were already experienced developers at night / weekends - so I assume she was pushed through with a "wink wink" from the administration.
Finally, the hordes of foreign students getting degrees from cash-cow programs in order to remain in the US or obtain OPT visas ensure that nobody respects a Masters in CS anymore. Now that most of these are online, they're even less legit as you know these programs are plagued with cheating, fraud, etc.
> ...pushed through the system due to his race and the need for these schools to fill diversity quotas...
> ...hordes of foreign students getting degrees from cash-cow programs in order to remain in the US...
Not saying that somewhere in there, a legitimate argument about the reduction in overall quality of candidates doesn't exist, but it does smell a bit of having a strong bias against minorities and foreigners. There is such a thing as confirmation bias, where a person interprets a situation through a negative ideology about others not of their ethnic group and where they are in a position to execute an agenda based on those beliefs.
> For example, some of the qualifying courses included programming MIPS in assembly language, binary numbering systems, algorithms with assumed knowledge of Big O, data structures, dynamic programming, etc. There is no way anyone without both a good undergrad CS degree and some dev experience could legitimately pass these courses
What? Of course you can. Like any other subject it takes study and practice.
What happens very often in school, though, is you focus full throttle on the subject matter to get a good grade. That knowledge is extremely fragile, likely to be lost to the wind within weeks.
Day-in, day-out experience solidifies that knowledge, but you can pass nearly any CS class with bare minimum Python and surface level linear algebra / calculus knowledge.
I have an MSCS in machine learning and am not practicing daily, so if you ask me to do linear regression without a library I'm gonna be in trouble.
I'll probably struggle with Big O on some algorithms without putting a lot of thought into it.
Well, there isn't much "science" in computer science these days. Spaghetti code and reinventing the wheel (see the JS world) don't always cause business issues, so why not?
Those who want an education and would benefit from getting one are not allowed to even try - as designed by the system.
Those who are in a position to easily walk into a situation where they can get one, are well off enough to not need one.
So, if you have the possibility of getting one, it won’t matter if you get it or not (Gates, Zuckerberg, etc etc). If you can’t get one, then it probably would have changed your life… of course “you’re not qualified to try” so don’t worry about it.
Just don’t be born poor. That is the best solution.
I was about to write a blog post against CS masters degrees. I'm going through a pretty ambitious gauntlet autodidact process. Some background: I already have a CS undergrad degree.
- I've found all the online courses I would ever need, alongside high quality video courses (MIT, Stanford, 3b1b) + books
- A masters costs money & too much time (my time is better spent than getting an application together) (Opportunity cost of not working is too great)
- I already know 80% of the stuff in a CS masters from building systems and learning how they work (looking at the syllabus of various online masters degrees)
- My employment history is much better for future employment opportunities than a masters ever would be. Getting a FAANG job pays you, rather than costing you the way a masters does; it's also a much better signal than "cash cow online masters program"
- I can publish papers at my company if I wanted to (Not being gatekept from becoming a researcher)
- My workplace has a mathematics "guild" who I can outsource questions to whenever I want. Otherwise, I can go to /sci/'s math general or stack exchange.
In university, I learned the computational algorithm to pass the lin alg exams, and that took eight months. Being an autodidact, I relearned the true essence of lin alg in two weeks. Now, I can derive any computational lin alg operation from first principles. I'd even argue that undergraduate degrees are a waste of time. Mandatory electives? Give me a break. After going through this autodidact process, I'm actually offended by how much time I've wasted in HS & uni.
I'm basically redoing an undergraduate math degree + a CS masters + building real projects throughout it, all in a fraction of the time. At my current speed, I'll be finished learning everything I want to learn in ~2 years, while maintaining a pretty demanding FT SWE job. The knowledge provided in both high school and university can be obtained in a fraction of the time, on your own. At the end of the day, the only thing that matters is what you can engineer & build with your knowledge.
Universities are a meme in 2022. They're largely mechanisms to filter out the "peasant non-educated" class, through a proving function that can only be passed by coming from a family with enough resources.
Sounds like you’re very smart and self motivated, and you’re good at figuring things out without any help. As a teacher, this describes about 1% of my students.
Your post here really isn’t an indictment of higher learning institutions, so much as it reveals you aren’t the target audience for master’s programs. That it’s not for you personally doesn’t mean it doesn’t work for others.
I'm still struggling to graduate from undergrad, due to Corona this is my 10th semester.
I spent more than two years on mathematical analysis as well as linear algebra. Because the professor in charge of the course was different every year, I had to re-learn the notation each professor used; otherwise I would have had a hard time understanding the questions in the assignments and exams.
I also know that there are many quality online resources such as 3b1b or MIT linalg, but I don't have enough time to watch them during the semester, because I'm already tired of dealing with the deadlines of my weekly assignments. In particular, I often encounter situations where I have to do the work of two people by myself, because my homework partner is unreachable or has withdrawn from the course, and I cannot find a new partner.
Most new ideas build off of existing ideas, so if you want to make new ideas, one way is to learn all of the existing ideas in a particular field (and a few ideas from other fields) so you can reuse those ideas in novel ways.
If you have some idea of what you'd like to have ideas about, research in that field is definitely something you can do as part of a Masters. I can't speak to the OMSCS specifically, but many in-person Masters actually require research.
I'd say tech has already reached this point... I barely check or care about a candidate's degree.
I do take note if it's a highly reputed CS school, otherwise I treat them as interchangeable. I'm never going to hold lack of a degree against a candidate though, if they impress in an interview
TLDR: "I don't have a CS masters degree and I believe that getting one is not worth it and you should not pursue one either. But I have my own CS course to sell you."
Not a surprise to see the article ending with the author shilling their own courses.