Tags
Christianity, Clifford, James, Pascal, Pascal's wager, philosophy, rationality, reasonableness, religion
In my last blog post I explained that we need to deal with uncertainty. I have found that many people do a very bad job of dealing with uncertainty: they demand certainty, and if they don't get it, they mentally shut down. I think rational people take actions (which I think can include holding beliefs, or at least trying to hold certain beliefs) in light of uncertainty. That is really what being rational is all about.
In doing this we need to consider not only the likelihood that a belief is true but also the consequences that would follow if our beliefs end up being right or wrong. Of course Pascal's famous wager brings this up. I am not necessarily going to argue the wager the same way Pascal did, but I do contend that rational people must weigh the consequences of their actions, including the act of trying to believe one thing rather than another.
BTW I think Pascal's wager has likely been the victim of more ill-founded criticisms than almost any other philosophical argument. Here is a good paper on it by Lycan and Schlesinger if you are interested:
Click to access pascalswager.pdf
In my post discussing what it means to be rational, I argued that Clifford's claim that "it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence" was illogical because it is self-defeating.
But I think there is at least one other problem. Clifford's claim addresses only one of the essential aspects of making a rational decision to believe or not: the chance of the belief being true. It does not address the consequences of holding a belief and being right or wrong about it.
William James gives a good counterargument in this regard. He says that if it is the case that by believing I will survive cancer I will actually increase my chances of surviving, then it is rational for me to believe I will survive cancer. This is so even if the evidence doesn't suggest I will. If believing doubles my chance of survival from 1% to 2%, it's hard to say I have sufficient evidence to believe I will survive. Yet it seems rational to go ahead and try to double my chances by doing my best to believe.
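The James example can be sketched as a simple expected-utility comparison. This is only an illustration of the reasoning, not anything James wrote: the survival probabilities (1% vs. 2%) come from the example above, while the utility numbers are arbitrary placeholders I chose for the sketch.

```python
def expected_utility(p_survive, u_survive=100.0, u_die=0.0):
    """Expected utility given a survival probability.

    u_survive and u_die are illustrative utilities, not values
    from the post: 100 for surviving, 0 for dying.
    """
    return p_survive * u_survive + (1 - p_survive) * u_die

# Option 1: don't believe -> 1% chance of survival.
eu_doubt = expected_utility(0.01)

# Option 2: believe despite the evidence -> belief itself
# doubles the chance of survival to 2%.
eu_believe = expected_utility(0.02)

# Believing has the higher expected utility even though the belief
# "I will survive" is almost certainly false (98% chance of death).
print(eu_believe > eu_doubt)  # True
```

The point of the sketch is that the evidential probability of the belief being true (2%) and the rationality of adopting it come apart once the belief itself changes the outcomes.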
Is that the only type of situation where this might come up? I don't think so. I think it comes up often in morals. Every time we are tempted to do wrong, it is easy to waver unless we strongly commit to certain moral standards. Often we will commit to believing moral standards well beyond the "evidence" that those moral standards really exist. Whether there can even be "evidence" that a moral standard exists, and what such evidence would look like, will be the topic of another blog post.
If anyone would like to comment on what they think evidence of objective moral standards would look like I would enjoy reading it. Also feel free to just put a link to it in the comments.
I'm not sure what you mean by moral standard exactly – is it an action that is objectively moral or an intention? If it's an action, I don't think there is any evidence. But for moral intentions (or ideals may be a better word, e.g. equality, love, etc.) the only way I think it could be proven is if you manage to identify some quality of humanity or life in general that gives it innate value. The fundamental argument for objective morality given by Christianity is that God made man in his own image, such that humanity then has innate worth/dignity.
Just to say I really don't like Pascal's wager. I respect that the consequences should be understood when making the decision, but I don't think they should influence the decision itself. Faith based on utility/self-gain doesn't hold much value.
Thanks for the comments.
“I’m not sure what you mean by moral standard exactly – is it an action that is objectively moral or an intention?”
I think it depends on the moral system. In the law the prosecutor usually needs to prove the defendant had a guilty mental state (mens rea) and took some action (actus reus). Even attempted murder requires some action with the intent to murder.
Christians don’t get away so easy. Guilty intentions/thoughts are considered wrong. At least they are in the Catholic Church.
Matthew 5:21-48
http://www.biblegateway.com/passage/?search=Matthew%205&version=NIV
I'm not aware of any ethical system that would punish the action alone without some bad intent or at least knowledge of what one is doing. Insanity laws vary, but generally you have to have some sense of right and wrong to be guilty.
“Just to say I really don’t like Pascal’s wager … I respect that the consequences should be understood when making the decision but I don’t think they should influence the decision itself … faith based on utility/self-gain doesn’t hold much value”
I believe my initial view of Pascal's wager may have been the same as yours. It seems anti-Christian, selfish, and perhaps even intellectually dishonest to believe something based on motives! Retch! I wanted my beliefs to be based on a pure and honest estimate of the evidence and nothing else.
But at some point, after examining the reality we are dealt, and how meager the evidence is for things like morality and God or atheism and several other things, I had to give up on that notion and fall back on just being rational about what is going on with life. I will blog on this more, but I thought in terms of: what if there is a God? What if there is no God? What if there is some crazy God? Maybe I don't know for sure one way or another. But I am living my life and making decisions, including moral ones.
Agnosticism sounds great and reasonable, but how then do I live? Do I live as a Christian? Do I live as a Christian sometimes?
Time doesn't stop for me to wait on "new evidence." Sadly that's not the way reality works. In the end I just said, well, I will deal with this uncertainty as rationally as I can. I committed to making the most rational choice I could, and if it ended up being wrong, well, no regrets. On the other hand, if I made a choice and it was wrong *and* irrational, well, I decided I would try to avoid that.
And making a rational choice included weighing the various options and the foreseeable consequences. In another blog post I will go into more depth about how I went about that. But I did indeed consider the consequences of my being wrong, and heaven and hell really didn't need to enter the picture.
Pragmatism, then? I think that we have to acknowledge that beliefs are provisionally true, unless they are about some defined entity (like logic or math). The beliefs themselves are uncertain, or at least incomplete.
Thank you for the comments
Yes, I think my view is pragmatic, at least in the general sense of the term. (I don't have a philosophical definition of the term in mind, so I can't say whether that fits.) As for whether we should acknowledge that our beliefs are only provisionally true – I think it's more complicated.
Yes, I think when we initially choose a moral path we should consider the possibility that our path is wrong. But after we choose a moral path, I think it's important that we hold a firm belief in it, or else we may falter. I would refer to Quine's description of our beliefs being like a battery with weaker or stronger charges.
https://trueandreasonable.co/2014/01/09/do-you-belieeeeve/
Consider Isaiah 63-64, where he says, among other things, "Oh, that you would rend the heavens and come down, that the mountains would tremble before you!" He asks God not to stay hidden. If you read the full passage, Isaiah says that if God would reveal himself, people would stop sinning. I think he is right. To the extent God stays hidden, we feel more free to sin. If God could always be seen standing over us, well, I think I would behave better. Whether that behavior would reveal who I really am is not so clear. But Isaiah 63-64, God not revealing himself, and our free will is another blog post I have in mind. For now, suffice it to say that I think the more strongly we believe our moral code, the more likely we are to follow it.
Really enjoyed this post. Yes, that argument sounds self-defeating to me…
That's an interesting point, that we have to think about the consequences of holding one belief over another. In some sense, I think it is really profound and something I probably live by, whether I acknowledge it or not.
But maybe some people would just rather be absolutists without regard to consequences and prefer to accept bad consequences so as to increase the chance of being right.
Interesting William James anecdote. Do you happen to know if they proved a causal link between this optimism and survival rates (controlling for all other factors)? I feel like I've heard that before…
I guess it depends on what you understand rationality to mean. If I am understanding correctly, it seems your argument would entail a balancing act between the chance of being right and the consequences. So whether or not it would be rational for the cancer patient to accept his death, or turn his head from the statistics and believe otherwise, depends on whether he actually prioritizes being an entirely realistic person, or if he is willing to sacrifice the accuracy of his beliefs for a beneficial chance at survival.
Do you think it possible to claim what is rational before you have made assumptions about what matters for that agent? If a person's number one priority is living in line with the truth, then maybe they would be willing to have a lower chance of survival, and that could arguably be a rational choice.
I want to get into pragmatism so, so badly. Any recommendations for starting off? I've just read a tiny bit of Dewey so far.
Thank you for your comments. Let me offer my general thoughts on this.
First, what happened to your blog? It says it's gone.
I think some of my comments to others might fill you in on where I am coming from as well.
Does believing really help fight cancer?
I don't know whether that study panned out. I understand there is a placebo effect, but that seems a bit different. I just use that example as a thought experiment.
“But maybe some people would just rather be absolutists without regard to consequences and prefer to accept bad consequences so as to increase the chance of being right….”
That's possible, and it ties in with a good question raised on your blog: do we have moral obligations with respect to the beliefs we hold? Clifford pretty clearly thought we did. When he said it was wrong to hold beliefs on insufficient evidence, he really thought that was immoral.
I think our beliefs have moral implications as well. But IMO it is not really in the sense that allowing yourself to hold a false belief is culpable. I no doubt hold lots of false beliefs. For example my belief that there really is a placebo effect might be false. But I don’t really think I have a moral obligation to look it up and correct that possibly false belief.
I believe that we should act rationally. That is itself a belief I hold. It might be false. Do I have sufficient evidence to support that belief? What am I to do then? Act irrationally? Act rationally – sometimes? I do mean these as serious questions. I have considered these things. I ultimately decided that ok if it is the case that I was wrong in my belief that I was to act rationally (including my trying to believe certain things instead of others) then ok the Gods “got me.” But somehow I don’t think I would have as much regret if I see that not only was I wrong, but also irrational for believing the wrong things.
Now, in the interests of full disclosure, I am taking some philosophical turns that others don't. The description of belief that I follow, Quine's, seems to be at odds with Hume's understanding of belief. Hume argues that there is an unbridgeable gap between belief and action. Quine's view, which I accept, seems to have a much closer connection between belief and action.
Here's the blog post where I give Quine's description:
https://trueandreasonable.co/2014/01/09/do-you-belieeeeve/
"I guess it depends on what you understand rationality to mean. If I am understanding correctly, it seems your argument would entail a balancing act between the chance of being right and the consequences. So whether or not it would be rational for the cancer patient to accept his death, or turn his head from the statistics and believe otherwise, depends on whether he actually prioritizes being an entirely realistic person, or if he is willing to sacrifice the accuracy of his beliefs for a beneficial chance at survival."
Yes, that is correct. I admit I haven't thought it all the way through, but I tend to think we act rationally toward achieving a goal. For me that goal is to live my life in a good way. I got there by asking: is there a real morality? In a sense that means: am I really supposed to act a certain way or not? I didn't concern myself with the possibility that there really was no particular way I am supposed to act because, well, if that is the case then no worries regardless, right?
So if that's right, then it seems rational to assume there "really" is a moral way to act. OK, then the next step is: how would I know what it requires of me?
“Do you think it possible to claim what is rational before you have made assumptions about what matters for that agent?”
Probably not. I am willing to accept that acting rationally usually means acting to achieve something. But I really haven't thought it through.
But that said, I am a moral realist. So just because someone thinks something matters for them, that doesn't mean it really matters. They can be wrong, and that belief can be false. IMO a belief is true if and only if it accords with reality.
I’d be curious what you think about this.
I have given this all some thought. So, we have hundreds of thousands of beliefs. I think it would be ridiculous to assert we are morally culpable for making sure ALL of our beliefs were correct. One reason is that we would not have enough time to do so even if we put 100 percent of our time into researching our beliefs, so it would be impossible to be fully moral. But that doesn't mean I automatically assume we are morally culpable for none. Could it be possible that we are morally culpable for holding certain ones? If so, the golden questions would be: which ones, and why? I do not think I would be morally culpable for a belief that it will rain outside. But for some beliefs, I am not sure. It would come down to creating a standard for which ones count …
Rationality is so tricky, because many things indicate to me humans are actually very irrational and/or immoral creatures.
“I didn’t concern myself with the possibility that there really was no particular way I am supposed to act because, well, if that is the case then no worries regardless, right?”
I actually think this might be problematic… To me this could be translated to: "I act like there is right and wrong, just in case there actually IS right and wrong." But how solidly grounded is right and wrong, if you accept the possibility it might all be an illusion anyway? It makes it seem like some huge game. I almost think the best way to 'prove' right and wrong exists is to get rid of the feeling of the need to prove it, and just start acting like it's a fact of life. Right and wrong is right and wrong. It's circular. I don't think you can do any better than that. (Though that is probably still not good enough for me, because it is not clear to me what precisely constitutes right and wrong.) I don't know if this sounds crazy – this was just my initial reaction.
Whether rationality has a goal? Yeah, probably. I think it is safe to accept that people can fulfill the same goals in different ways, so there is not necessarily a 1:1 mapping between actions and goals, making rational life choices look pretty fluid to me. The other thing about rationality is that I think a lot of times what people call 'irrationality' would more precisely be called 'conflicting rational goals.' For instance, I have a goal to lead a peaceful life of low stress. But then I have many other goals that necessitate high levels of stress to fulfill. So when I achieve peacefulness, I feel guilty and lazy for not pursuing my other goals. When I work super hard, I feel guilty and irrational for overworking myself and not relaxing. This is a goal system that is paradoxical, so I always feel conflicted and irrational and mad at myself. In this sense, I think rationality would mean having an overall system of the least conflicting sets of goals. Everyone hates themselves for all the mistakes they make, but they do not necessarily realize that they are fulfilling some goals while sacrificing others. It's all about balance. An interconnection of all the goals in your life, so you would not be able to contradict yourself, would make for the most rational person to me… And then you wouldn't constantly be losing, no matter what you do.
Now I am rambling, sorry!
Olivia I think you might like this blog. We think along similar lines.
I think I agree with you that, although some beliefs might not be morally relevant, others may be. William Clifford may indeed have offered an example of when someone is morally culpable for believing what they do (the ship owner who purposely bolsters his belief in the seaworthiness of his ship and thereby puts the crew at risk). I think there are other examples.
I’m moving slowly with the blog for a few reasons but let me give you a sort of preview.
I take the position that we should rationally try to go about achieving the goal of living a moral life. By that I mean real morals. I'm a moral realist. The goal is to do our best to do what we should.
Of course, to do that we need to figure out what we should do as best we can. We may never have all the evidence we would like for that, but we need to carefully evaluate and weigh our options at the outset. Then, once we decide on the most promising path, we should believe in that path strongly. Otherwise, as you said, our beliefs will be weak and we may not be able to follow through when it gets hard.
“‘I didn’t concern myself with the possibility that there really was no particular way I am supposed to act because, well, if that is the case then no worries regardless, right?’
I actually think this might be problematic… To me this could be translated to: "I act like there is right and wrong, just in case there actually IS right and wrong." But how solidly grounded is right and wrong, if you accept the possibility it might all be an illusion anyway? It makes it seem like some huge game. I almost think the best way to 'prove' right and wrong exists is to get rid of the feeling of the need to prove it, and just start acting like it's a fact of life. Right and wrong is right and wrong. It's circular. I don't think you can do any better than that."
I wish I could think of better analogies. But here is one:
Imagine you are in a navy battle in the northern Pacific, and all the ships are destroyed except a few that are just badly damaged. Let's say you decide that swimming to one of those ships is your best recourse. Realistically, you think all those ships will likely go down before help can arrive to save you. But you notice major problems with all but one of the ships; the problems you see make it fairly certain they are not going to stay afloat long at all. So you start swimming for the one that seems the least badly off (though again, realistically, you think even that one will more likely than not go down before help can save you). Now the water is cold and you are tired. You start to wonder if you should keep struggling to get to the ship.
I think in that sort of situation it is rational to believe firmly that the ship will stay afloat long enough for you to be rescued. To the extent the firmness of that belief will motivate you to try harder to make it, that is what you should believe as best you can.
I don't think it's circular reasoning. You initially evaluated all the options and chose that path in a non-circular way. You did try to convince yourself that the ship is sturdier than it really appeared, but that was for the motivational purpose of doing what you rationally should do anyway. It was to help you overcome the emotions of despair and tiredness, which might lead you to act irrationally and give up.
IMO the goal of rational thinking/action is to achieve the goal of doing what we should. That might sound circular or vacuous, but really there is a lot to it. Most of my blog posts will be aimed at this goal.
“(Though, that is probably still not good enough for me, because it is not clear to me what precisely constitutes right and wrong. )”
I think trying to come to terms with that is part of the process. I doubt that anyone has the evidence you or I would like on this. But I do think we have enough evidence and information to react in a rational way. The process that I use, and the thinking behind it, is really what I want my blog to focus on.
Hey, sorry for the slow reply! Wish I could spend all day blogging but sometimes I get behind.
It seems like we are dealing with the same types of issues in our heads, and we think similarly. Although I would say that you are much farther along than I am in figuring out some of your own intuitions and beliefs!! I am going to read a lot more of your articles when I have time in the near future, and I look forward to reading them and commenting with my thoughts!
Your blog honestly looks great to me!
I'm still thinking about whether the boat example would be circular reasoning.