
> and their annoyance can't be simply added together

Why not?

EDIT: Serious question. If I save two lives that is twice as good as saving one, right? Why is this situation different?



It depends on how you measure "good". :/

Most people measure goodness and badness by how they feel about it, and for anyone who feels a significant amount of badness (grief, anger, whatever) at the death of a single person, it is physiologically impossible for them to feel a million times worse about the death of a million people. It's intuitively obvious to most people, therefore, that suffering, annoyance, life-saving, and so on, are not additive: they just have to check how they feel about the situation to know that.

In order to suggest that they are additive, and that N "people annoyed" can outweigh M "people suffering", you first have to convince someone that their own internal measurement of goodness (how they feel about it) is less accurate than some external measurement.
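The disagreement here is really a modeling choice, and it can be made concrete. Here is a toy sketch (the numbers and both aggregation rules are invented purely for illustration, not taken from the thread) contrasting a linear, additive aggregate with a saturating "felt" aggregate that levels off no matter how many people are affected:

```python
import math

def additive_badness(per_person, n):
    # Classic utilitarian sum: badness scales linearly with headcount.
    return per_person * n

def felt_badness(per_person, n):
    # A saturating observer: felt badness grows with n but flattens out,
    # modeling the claim that no one can *feel* a million times worse.
    return per_person * math.log1p(n)

# One million people each mildly annoyed (per-person badness 0.001):
print(additive_badness(0.001, 10**6))  # 1000.0
print(felt_badness(0.001, 10**6))      # ~0.0138
```

Under the additive rule the million annoyances dominate; under the saturating rule they barely register. Which rule is "correct" is exactly what the thread is arguing about.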


It's analogous to saying: "Most people believe the world is flat, so the burden is really on you to show that it is round."

Which is true. But it requires a willingness to counter one's own intuition when encountering contradictory evidence. Unfortunately, that type of person is uncommon.


I don't disagree with what you actually said, but your choice of analogy suggests that you believe that questions of morality are settled and have obvious, objective answers.


In some cases, yes. If you accept utilitarianism as the reductive explanation of morality, and assume some non-controversial terminal values, then all of morality reduces to straightforward calculations.

"The only way to rectify our reasonings is to make them as tangible as those of the Mathematicians, so that we can find our error at a glance, and when there are disputes among persons, we can simply say: Let us calculate, without further ado, to see who is right." -Leibniz

Unfortunately we retain some ignorance on the correct nature of utility functions (finite? time-preference adjusted? etc.), and terminal values for humans are demonstrably arbitrary.
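One of the open questions named above, time-preference adjustment, is easy to make concrete. A hypothetical sketch (the discount rate and the utility stream are invented for illustration) of a discounted utility sum:

```python
def discounted_utility(utilities, rate=0.03):
    # Present value of a stream of per-period utilities: each period t
    # is weighted down by (1 + rate) ** t.
    return sum(u / (1 + rate) ** t for t, u in enumerate(utilities))

stream = [1.0] * 10                      # ten periods of utility 1 each
print(discounted_utility(stream))        # < 10 whenever rate > 0
print(discounted_utility(stream, 0.0))   # exactly 10 with no discounting
```

Whether moral calculation should discount the future at all, and at what rate, is precisely the kind of unresolved parameter the comment is pointing at.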


>If you accept utilitarianism as the reductive explanation of morality

... then LW ends up with Roko's Basilisk.

Really, you're using that as your answer to "I don't disagree with what you actually said, but your choice of analogy suggests that you believe that questions of morality are settled and have obvious, objective answers." You can prove anything if you first make it an axiom.

You can't seriously claim that utilitarianism accurately captures human moral intuitions. Variations on the Repugnant Conclusion occur immediately to anyone told about utilitarianism, and are discussed in first-year philosophy right there when utilitarianism is introduced.

LessWrong routinely has discussion articles showing some ridiculous or horrible consequence of utilitarianism. The usual failure mode is to go "look, this circumstance leads to a weird conclusion and that's very important!" and not "gosh, perhaps naive utilitarianism taken to an extreme misses something important."


And why, even while being an atheist, would I accept utilitarianism over Jesus Christ in this case?


For more or less exactly the same reason you accept general relativity over Aristotelian motion: it is derived from first principles using maths, can be shown to match experience even if somewhat intuitive to people, and works pretty well in practice.


> can be shown to match experience even if somewhat [un]intuitive to people, and works pretty well in practice.

I think these are the two points that those skeptical of utilitarianism have trouble with: it's exactly that it doesn't seem to match experience that started this thread. Additionally, it doesn't actually seem to work well in practice: http://econlog.econlib.org/archives/2014/07/the_argument_fr_...


It's easier for two people to cope with one bad experience each than for one guy to cope with two bad experiences.


That assumes independence. You're describing a strictly different setup: assume these individuals don't know each other.


I don't think he is making the assumption that they know each other.

Each individual has a tolerance of what they can comfortably cope with. If 3^^^3 people were all experiencing a pain that is below that tolerance, nobody would be prevented from happiness. However in the other situation, the tortured individual clearly would be.


That's an example of an infinite or unbounded utility function: no matter how many specks of dust in the eye, they will never add up to a single person being tortured for 50 years. Not even 3^^^^^^^^^^^3 specks of dust. Unfortunately, the mathematics of infinite and/or unbounded utility functions doesn't work out well; it leads to some seriously messed-up edge cases. (So does finite utilitarianism, to be fair -- see Pascal's mugging, http://www.nickbostrom.com/papers/pascal.pdf -- but those cases are fully dealt with by decision theory, whereas the infinite or unbounded ones are not.) It's not very strong evidence, but it is evidence that we should accept the calculations of finite utilitarianism, since its formalization works out better in cases within the realm of our experience.


Talk to an urban planner or someone working in disaster relief.

Empathic "strangers" help others, in many contexts, often at personal risk.

That's what separates humans from singularitarian quantum computing devices.


I'd say if each of those two people had the "bad experience" of having their only child killed in a car accident, that's worse than someone else having his uncle and grandmother killed in a car accident. "Bad experience" is hugely oversimplified. And let's not even start on a trillion specks of dust in a trillion people's eyes.


To quote Heinlein: "Men are not potatoes."


To put it differently: since we are rationalists, try the scientific way. Do an experiment.

First day, let a speck of dust enter your eye, at noon. Before sleeping, write down how you feel about that event.

Next day, rip your balls off at noon. Before sleeping, write down how you feel about that.


Let me know how it goes.


Because at the end of the day, no one gives a fuck about a speck of sand in their eyes. Having your balls ripped off might be different. YMMV. Human feeling is a bit more complicated than simple addition.



