I follow a variation of consequentialism, filtered through the opposite of paternalism, which doesn't have a more specific name.

My value system is how I decide which outcomes are better than others. It is important to note that the philosophical concepts below do not depend on that value system; everything in the next few paragraphs holds regardless of which value system we are considering. Wherever you see "good", "bad", "positive", "negative", "better", "worse", etc. below, those can mean whatever you want them to mean, especially if your value system is internally consistent and universalizable. I sometimes even prefer to operate in your value system, if we are discussing a situation where the positive and negative outcomes affect mostly or only you.

I apply a maximax criterion regarding the choices of other actors with agency. That's someone like you, in most cases. When I take an action that allows you to choose between two actions of your own, I am responsible for the best outcome you could choose, and you are responsible for any shortfall in the outcome that you do choose. If I opt not to give you that choice because I expect you would choose the worse outcome, I am denying you agency in the situation, and that would be paternalistic. When I tell you that your dog is trapped in a burning building, you might decide to run inside; if the outcome of your choice is worse than if I had not told you, then you are responsible for that outcome, not me. When the villain drops two people off a bridge and you can only save one, someone is responsible for the death of the person you do not save, and it is mostly or entirely not you.
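Purely as an illustration of that split, here is a minimal sketch with made-up outcome scores; the names and numbers are mine for this example, not anything formal I actually compute:

# Sketch of the maximax responsibility split described above.
# Outcome scores are invented for illustration; higher is better.

def split_responsibility(outcomes_i_enable):
    """I enable a set of outcomes; you pick one of them.
    I answer for the best outcome you could have picked;
    you answer for the gap between that and your actual pick."""
    best = max(outcomes_i_enable.values())

    def blame(your_choice):
        chosen = outcomes_i_enable[your_choice]
        return {"mine": best, "yours": chosen - best}  # "yours" is zero or negative

    return blame

# Telling you about the burning building gives you two options:
blame = split_responsibility({"stay outside": 0, "run inside": -10})
print(blame("stay outside"))  # {'mine': 0, 'yours': 0}
print(blame("run inside"))    # {'mine': 0, 'yours': -10}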

I apply an expected value criterion regarding actions with random outcomes. When I play a game of Russian Roulette, the death of the loser is as much my responsibility as that of the person who made the unlucky trigger pull.
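To make that concrete, some back-of-the-envelope arithmetic; the details (six chambers, one round, cylinder re-spun before each pull, two players) are assumptions I'm adding just for this example:

# Rough arithmetic for the Russian Roulette example.
# Assumptions, for illustration only: six chambers, one round,
# cylinder re-spun before each pull, two players.
p_death_per_pull = 1 / 6
pulls_per_game = 2

# Judged by expected value at the moment we agree to play, each player
# adds the same expected harm, regardless of whose pull turns out unlucky.
expected_deaths_per_game = pulls_per_game * p_death_per_pull  # ~0.33
my_expected_contribution = p_death_per_pull                   # same for every player
print(expected_deaths_per_game, my_expected_contribution)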

Finally, I do not recognize a fundamental distinction between action and inaction. If I tell you that pressing the button will do something and you press it, you're responsible for the outcome. If I tell you that not pressing the button will do that same something and you don't press it, you're equally responsible. Not pressing the button is just as much a choice as pressing it. This concern is most often illustrated with variations of the trolley problem where the two tracks are switched, which I don't consider to actually change the problem at all.
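One small way to see the equivalence I mean (the labels here are mine, just for illustration): the two button framings, like the switched-track trolley variants, offer exactly the same set of reachable outcomes; only the label on the do-nothing option moves.

# The two button framings collapse to the same decision problem:
# which option counts as "doing nothing" changes, but the set of
# reachable outcomes does not.
framing_a = {"press": "the something happens", "don't press": "nothing happens"}
framing_b = {"press": "nothing happens", "don't press": "the something happens"}

assert set(framing_a.values()) == set(framing_b.values())  # same outcomes either way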

That's all I've got for now. This is my first real attempt to put this all together in a reference document. It will certainly be revised in the future, as I get a better grasp on the concepts that drive my decisions and become better at describing them.
