In my last blog I explained that we need to deal with uncertainty.  I have found that many people do a very bad job of this.  They demand certainty, and if they don't get it, they sort of mentally turn off.  I think rational people take actions (which can include holding beliefs, or at least trying to hold certain beliefs) in light of uncertainty.  That is really what being rational is all about.

In doing this we not only need to consider the likelihood that a belief is true but also the consequences that would follow if our beliefs end up being right or wrong.  Pascal's famous wager is, of course, the classic example.  I am not necessarily going to argue the wager the way Pascal did, but I do contend that rational people must weigh the consequences of their actions, including the act of trying to believe something.
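To make that weighing explicit, here is the standard expected-utility gloss on it (my framing, not anything Pascal himself wrote): a rational agent weighs each possible state of the world by its probability and by the payoff of acting, or believing, in that state.

\[
EU(\text{act}) \;=\; \sum_{s} P(s)\, U(\text{act}, s)
\]

Evidence bears only on the P(s) term; the consequences live entirely in the U term, and a rule that looks only at evidence never sees them.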

BTW I think Pascal's wager has been the victim of more ill-founded criticisms than almost any other philosophical argument.  Here is a good paper on it by Lycan and Schlesinger if you are interested:

http://joelvelasco.net/teaching/hum9/pascalswager.pdf

In my post discussing what it means to be rational I argued that Clifford's claim that "it is wrong always, everywhere, and for anyone, to believe anything upon insufficient evidence" was illogical because it was self-defeating: there is no sufficient evidence for the principle itself, so by its own standard we should not believe it.

But I think there is at least one other problem.  Clifford's rule considers only one of the essential aspects of a rational decision to believe or not: the chance that the belief is true.  It says nothing about the consequences of holding the belief and being right or wrong about it.

William James gives a good counterargument in this regard.  He says that if believing I will survive cancer would actually increase my chances of surviving, then it is rational for me to believe I will survive, even if the evidence doesn't suggest I will.  So if my chance of survival doubles from 1% to 2% when I believe I will survive, it's hard to say I have sufficient evidence for that belief.  Yet it seems rational to do my best to believe and thereby double my chances.
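Running the numbers in that example (the 1% and 2% are the figures above; the utility framing is my own gloss): let U > 0 be the value of surviving, and give not surviving a value of 0.

\[
EU(\text{believe}) = 0.02\,U \qquad\qquad EU(\text{don't believe}) = 0.01\,U
\]

Believing has twice the expected payoff whatever U is, even though a 2% chance is nowhere near sufficient evidence that I will survive.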

Is that the only type of situation where this comes up?  I don't think so.  I think it comes up often in morals.  Every time we are tempted to do wrong, it is easy to waver unless we strongly commit to certain moral standards.  Often we commit to believing moral standards well beyond the "evidence" that those standards really exist.  Whether there can even be "evidence" that a moral standard exists, and what such evidence would look like, will be the topic of another blog.

If anyone would like to comment on what they think evidence of objective moral standards would look like, I would enjoy reading it.  Also feel free to just put a link in the comments.