
In theory, it would be good to have a part of our educational system which teaches people how to think. But what does that look like? I'd say that boils down to two things: logic and evidence.

The vast majority of philosophy classes are absolutely garbage at teaching either of those things. Sure, in theory, logic is part of philosophy, but in none of the philosophy classes I've taken did we talk about logic. The things we did talk about were often examples of how not to think, yet they were presented as equally valid alongside much more rational ideas.

For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials. I'm sure lots of people walked out of that class thinking that the categorical imperative was a perfectly reasonable way to make ethical decisions. If this is the sort of "learning how to think" philosophy classes are doing, then I'd prefer they didn't--I'd rather let people figure out how to think on their own than teach them unequivocally incorrect ways of thinking. Philosophy could be useful if these classes were taught as, "Here's a bunch of historical ideas, and here's how we apply logic to prove them wrong." But until that happens, I'd strongly oppose introducing any more philosophy to curricula.

Other fields are better equipped to teach people logic and evidence. Science is all about collecting evidence and logically applying it to the evaluation of hypotheses. Math, especially around proofs and derivations, is all about logic, and probability and statistics give you tools that are very broadly applicable. History, if taught well, teaches you how to analyze artifactual evidence and logically contextualize the present in terms of the past.

But there are two problems: first, many college students don't focus much on these areas. And second, the parts of these fields which I mentioned aren't particularly well taught even within these fields. Many students get A's in science classes thinking that science is memorizing a bunch of facts about chemicals or living things, without ever having learned how to obtain new facts themselves. Many students get A's in math classes having memorized a bunch of formulas without being able to derive even basic proofs. Many students get A's in history classes having memorized a bunch of historical events, without knowing the difference between primary and secondary sources, and without ever considering that an author might have bias. Even the classes which do teach people how to think, to some extent, generally do a piss-poor job of it.

That's not to say that these fields (and other fields not mentioned) have no value. Even if you think well, your thinking is only as useful as the evidence you feed into it, and colleges do a very good job at moving vast amounts of evidence on a variety of subjects into people's brains. Further, colleges often do a lot of work teaching people skills: lab techniques, using computers, effective communication, etc. You can argue that the purpose of college is learning how to think, but the implementation of college is much better at teaching people information and skills. Learning how to think would certainly be valuable, but de facto it's not what colleges are doing, and the things colleges are doing do have some value.

That said, modern colleges often put profits ahead of teaching of any kind, and I don't see any value in that for society or for students.



I agree completely with your critique of the profit motive at universities, and think it applies particularly to state-owned institutions. There is a false notion that profit is an automatic good, when that is clearly not the case.

There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example, I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with an expert professor's guidance when they stumble on certain parts, makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

I'd guess that a big part of the reason there is such a glut of humanities graduates who can't find professorships is that people simply enjoy the classes enough to keep going all the way through graduate degrees. You get discussion and debate in those classes that you can't find anywhere else.

I don't think the above is true of many other disciplines of study, where so many of the degrees offered are pitched purely as job training out of the profit motive you mentioned above.

I can't do a better job of describing this than this professor, who puts his public lectures on YouTube for free:

https://www.youtube.com/playlist?list=PLpO_X3dhaqULiItXg84O9...


> There is more to critical thinking than formal logic, I'd argue. The classroom format of a typical humanities college course has a lot to do with this. For example, I would argue that a person without any background in the classics and some basic theology is going to struggle to get much from Paradise Lost. It's dense, difficult, and you have to know some things going into it to pick up on the nuances. But 30 people discussing it collaboratively 2 or 3 times per week, with an expert professor's guidance when they stumble on certain parts, makes for a lot of interesting discussion. Thirty different people will pick up on thirty different bits and pieces in every classroom session.

Well, if you look at literary criticism, there are a bunch of different ways to do it. The oldest ways, such as authorial intent or historical criticism, aren't that divorced from history as described in my previous post, or from plain old formal logic. But a lot of the approaches popular now, such as Marxist criticism or feminist criticism, are forms of reader-response criticism. In the worst case, this sort of criticism can be used as a pulpit for professors to pass on their ideologies, which is deeply problematic--rather than teaching students how to think for themselves, it teaches them to think like the instructor. In the best case, it can teach students how to evaluate literature in relation to their own goals--but I would argue that this is just an application of formal logic. The reality, in my limited experience, is neither of these extremes--the classes I've taken and my friends have taken have mostly been "these are some of the ways people have thought about literature". That's more about passing on information than about teaching how to think.

As I've said before, there's a lot of value in giving people information, I just don't think it supports the "college is about teaching people how to think" narrative.

That said, I'll give two caveats here:

1. My own formal training isn't in literary criticism, and beyond some general-ed requirements and second-hand experience from friends/partners in literature programs, I have very little experience here. My impressions here may very well be off-base, which is why I didn't mention literary programs in my previous post. A notable bias in my previous post is that I talked most about the fields I'm most familiar with.

2. Up to this point, I've basically been speaking about teaching facts versus teaching how to think as if they were two different, mutually exclusive things, but it's worth noting that that's not quite true. Simply giving a student a fact doesn't teach them how to evaluate whether something is a fact, but if you give a student enough facts, they eventually come across discrepancies and begin to experience cognitive dissonance. Given vast swaths of facts that produce a few discrepancies, a student will eventually come up with their own framework for evaluating them, and hopefully that framework will look a lot like formal logic and evidence collection. I'd argue that this is a very, very inefficient way to teach students how to think, but eventually I think it does work.


> For example, one of the things we covered in my ethics class was Kant's categorical imperative. The categorical imperative falls over instantly if you apply even the most basic logic to it, but no mention of this was made in any of the course discussion or materials.

I've read this a couple of times and I'm curious what you're saying here: do you mean that your class just reviewed some writing on the categorical imperative on its own, or that you read Groundwork of the Metaphysics of Morals?



