Whenever I act, one of the following is true:
- My action is uncaused.
- My action is caused, but not by a reason. (I will take this to mean: by a reason, in the right way.)
- My action is caused and by a reason.
But no one is responsible for an uncaused event. And an action that is caused, but not by a reason, is not a rational action. Hence in all cases where I act both rationally and with responsibility, I act on a reason.
Are you in for a regress here? What causes reasons?
Heath, one might add:
1. Some reasons are caused in part by other reasons.
2. Some reasons are caused by being alive and human.
Heath:
Reasons aren't actions. And I think you can be responsible for acting on a reason even if you're not responsible for having the reason (as long as you have some competing reasons and can choose freely between them--you won't like that, I know).
Dr Pruss
You say, "I think you can be responsible for acting on a reason even if you're not responsible for having the reason (as long as you have some competing reasons and can choose freely between them)."
Now it seems to me that my free choice is an action, which is either caused by a reason (or a set of reasons), or caused but not by a reason, or uncaused.
In the second case, my choice wasn't rational; in the third case, I am not responsible.
In the first case, it seems obvious to me that I am not responsible for having that particular reason to act upon that other reason; hence I am not responsible for my action either.
You see, suppose you and I have the same "reasons". Then the fact that you act on some of them while I act on others cannot be explained by the reasons themselves (since they are completely the same). That means neither of us is acting rationally.
Now, if we choose differently between our competing reasons because we have different reasons to choose between our competing reasons, it is clear that either those reasons are uncaused, or we are stuck in an infinite regress.
Hence, it seems that ultimate responsibility is an incoherent concept.
I think you're assuming that if p explains q, then p cannot explain something incompatible with q. But that's false for stochastic explanation. We understand why an indeterministic coin lands heads, and we understand why another lands tails, and the explanation can be the same physics.
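A minimal sketch of the coin example above (purely illustrative; the model and numbers are my own assumptions, not anything from the thread): one and the same chance setup can be cited to explain either of two incompatible outcomes.

```python
# Hypothetical illustration: the same stochastic model "explains" heads and tails alike.
import random

def flip_indeterministic_coin(p_heads: float = 0.5) -> str:
    """Same physics (same probability p_heads) governs every flip."""
    return "heads" if random.random() < p_heads else "tails"

coin_a = flip_indeterministic_coin()  # may land heads
coin_b = flip_indeterministic_coin()  # may land tails
# Whatever coin_a and coin_b turn out to be, the explanatory model
# (p_heads = 0.5) is identical, even when the outcomes are incompatible.
print(coin_a, coin_b)
```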
I am not assuming that if p explains q, then p cannot explain something incompatible with q. I know that that is false for stochastic explanation, but stochastic explanation doesn't account for responsibility. If I "choose" A by chance, then you may say that the explanation why I chose A is chance, but that doesn't give me any sort of responsibility.
Well, you said: "then the fact that you act on some of them while I act on others cannot be explained by the reasons themselves (since they are completely the same)". This seemed to be a commitment to something like that principle. Maybe you have an alternate argument for the quoted claim, though.
By "reasons" I mean rational considerations. There may be a stochastic explanation of why, acting upon the exact same prior conditions, we may end up acting differently, but I would not consider those prior conditions reasons in the sense needed for responsibility.
That's why I put quotation marks around "reasons". I didn't repeat them in the next sentence, but I meant the same kinds of "reasons" there as well.
I think it is quite obvious that if we have exactly the same rational considerations to act a certain way, then we do act the same way.
Not at all obvious to me. Curley has a rational consideration (in light of the good of wealth) in favor of taking a bribe and other rational considerations (in light of the goods of morality and avoiding jail) in favor of rejecting it. Given these competing rational considerations, the decision could go either way. If he takes the bribe, his taking the bribe is explained by his conviction that wealth is worth having. If he rejects it, his rejection is explained by his conviction that one should act rightly and/or that jail is bad. But of course he has all of these convictions.
But in that case, one conviction is stronger than the other.
If, e.g., your conviction that jail is bad is stronger than your conviction that wealth is worth having, while my conviction that wealth is worth having is stronger, then we do not really have the very same reasons. Having the same reasons (or convictions) does not merely mean having convictions A, B, and C. It also means that my conviction A is exactly as strong as your conviction A, etc.
Now, if that is the case, then my choice of action may be explained by some stochastic process, but since that process is not something that's under my control, I am not responsible for my choice.
Hi Walter - by acting upon one reason, the mind evaluates the consequences and sets up the "future expectation": the future effects, bonuses or retributions. So knowing or evaluating these effects makes me responsible for acting upon this reason against another competing reason with known different effects, like good deeds or retribution.
A third option might be not acting at all (which itself has a reason, reached by evaluating the other competing ones).
So acting upon a reason is goal-oriented, with future projected effects (known results) I want to achieve from the very start, all rooted in the agent's mind upon evaluation, which obviously makes me responsible.
Hi Carmel
I agree that by acting upon one reason the mind evaluates the consequences and sets up the future expectations/effects, bonuses etc., but either the action of the mind is caused by a reason or it is not. If it is not, then, as Dr Pruss says, it is not a rational action. If it is caused by a reason, then it doesn't really help that there are stochastic explanations for why the same reason leads me to take a different course of action than you, because I don't see how I can be responsible for a stochastic outcome of a reason. That seems to be precluded by the very definition of "stochastic".
So, I don't think what you say points at any sort of ultimate responsibility.
There might be a sense of compatibilist responsibility in it, though.
"There might be a sense of compatibilist responsibility in it, though." - YES, nice out :)
But reasons don't cause anything by their own existence, since, I argue, they are mental states and static information processed by a dynamic mind:
Let's say the mind has multiple reasons, coming either stochastically (not determined by previous states) or not.
Reasons that come stochastically just break the deterministic chain; however, some reasons can be traced back at least some steps, with additional reasons given for the "why".
Only when the mind finds the "desirable expectation" it wants to achieve (by evaluating the other reasons with competing expectations) does it trigger a willed action: to eat a healthy salad instead of a hamburger, for example. So I find that ultimate responsibility lies with the agent, not in the information (which can very well be processed by an automatic machine with a decision algorithm... just to make the point that information does not trigger an output BUT only provides inputs/variables/parameters to a complex decision software, which can very well be fallible and biased too :) - management by exceptions :))
It can be the case that in the same scenario with the same reasons (eating a healthy salad instead of a hamburger), we both have additional reasons (extra input variables, like a stronger taste for a hamburger in my case - another mental state/reason) that our minds evaluate, and my mind gave them precedence over the initial reason (to eat healthily); hence my mind acted against that reason and ate the hamburger, even while very well knowing/evaluating whether or not it is healthy, and takes responsibility by taking the associated risk of becoming sick (here I stress the responsibility). What else can "responsibility" be, if not evaluating the final results of competing reasons and acting against one of them to achieve that projected expectation?
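To make the decision-algorithm analogy concrete, here is a hypothetical sketch (the Reason class, the weights, and the menu are illustrative assumptions, not a claim about how minds actually work): the reasons only supply weighted inputs, and the evaluation procedure is what selects the action.

```python
# Hypothetical sketch of the "decision algorithm" analogy: reasons are static
# inputs (weighted expectations); the procedure that evaluates them produces
# the output. All names and weights are made up for illustration.
from dataclasses import dataclass

@dataclass
class Reason:
    action: str        # the action this reason counts in favour of
    expectation: str   # the projected effect the agent cares about
    weight: float      # how strongly that expectation is valued

def decide(reasons: list[Reason]) -> str:
    """Pick the action whose projected expectation is valued most highly.

    The reasons themselves trigger nothing; they only supply parameters to
    this (possibly fallible, possibly biased) evaluation step.
    """
    return max(reasons, key=lambda r: r.weight).action

# Same scenario, plus one extra input variable (a strong taste for hamburgers):
reasons = [
    Reason("eat salad", "stay healthy", 0.7),
    Reason("eat hamburger", "enjoy the taste", 0.9),  # the additional reason
]
print(decide(reasons))  # -> "eat hamburger"
```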
Thanks for your reply and let's keep in touch (my email is carmel76@gmail.com). Are you based in NL? :)
Carmel
You said, "What else can 'responsibility' be, if not evaluating the final results of competing reasons and acting against one of them to achieve that projected expectation?"
That's exactly what I would call compatibilist responsibility. As I said, we either have a reason 'to act against one competing reason' or we don't. If we don't, we are not acting rationally. If we have a reason, we are back to my original statement.
BTW, I am Belgian and my email is wr.vandenacker@gmail.com.
Thanks for the email. Can you recommend some good compatibilist philosophy regarding agent responsibility?
But even when we don't act against one reason (by finding the expected result not desirable, let's say), that is still an act against one reason: for example, my mind decides to stay neutral when considering two or more reasons, say by fasting after evaluating both eating a hamburger and eating a salad (one that has already reached its expiry date, or whatever) :).
I can't conceive of not acting against one reason or another by being neutral or irrational. Why?
Because that final action itself is still backed up by a reason (however, I can conceive of a mind being irrational by being faulty, biased, unhealthy, coerced by another, etc. - but isn't that ultimately backed up by a reason when considering the decision from an outside observer?). Cheers.
I live in NL but am an English speaker :)