True and Reasonable

~ Religion Philosophy Christianity Theology Logic Reason

Tag Archives: Pragmatic encroachment

Thoughts on Pragmatic Encroachment

30 Sunday Mar 2014

Posted by Joe in Uncategorized

≈ 2 Comments

Tags

Atheism, Christianity, epistemology, Faith, Pragmatic encroachment

Can beliefs be justified by anything other than evidence that they are true? I think a lot of people would want to say "no" to that question at some point in their lives, myself included. Any other justification for beliefs seems somehow wrong and intellectually dishonest. But because

1) Beliefs have a causal connection with how we act,

2) Often we have to act under uncertainty about the actual state of affairs, and

3) When rational people decide how to act under uncertainty, they must weigh both the likelihood and the consequences of being right or wrong about the state of affairs,

it may be irrational to consider only the likelihood of being right or wrong and not the consequences.

In this blog post I would like to offer some of my thoughts on pragmatic encroachment. But first let's start with some observations about the traditional definition of knowledge.

There are various ways that philosophers have tried to define what knowledge is. The most traditional is to say that a subject S knows a proposition P if and only if:

 

1) S believes P,

2) P is true,

and

3) S has sufficient reason for believing P.

 

Now the third condition might be phrased differently. For example, it might be stated as "3) S is justified in believing P," or "3) S's belief in P is properly warranted." [1]

As it turns out, I think this third condition is ambiguous in a few respects. One way is that we often think someone might be "justified" in believing something even when we don't think their justification is sufficient to call that belief "knowledge." I might have believed the Seahawks would beat the Broncos in the Super Bowl. That belief might have been "justified" based on various things I had learned about the two teams. Hence in that sense we can call it a "justified true belief." We might say my belief was a rational belief. But I don't think most people would say I "knew" the Seahawks would win the Super Bowl – at least not before the game started. So we can see there is "justified true belief" and there is "justified true belief." The "justification" required for knowledge is greater than the "justification" needed to hold a mere "justified belief."

Notice that this ambiguity remains regardless of whether we use the formulation of "justified" belief, "sufficient reason," or "proper warrant." The "sufficient reason" needed to rationally believe something is less than the "sufficient reason" required to know it.

 

The justification that yields knowledge is stronger than the justification that allows us to simply say we are justified in believing something.   This raises a few questions:

1) How much justification do you need to "know" something?

2) How much, if any, justification do you need to be "justified in believing" something?

3) Is there any difference between the forms of justification that can yield "knowledge" and the forms that can yield mere "rational belief"?

 

I think those questions are a bit vague and, even if clarified, somewhat difficult to answer. But here are some thoughts. The justification for "knowledge" might require something close to 100% certainty. We might be inclined to say mere "rational belief" requires something like a preponderance of the evidence – that is, that the belief is more likely true than not. But I think the cases presented by those who consider pragmatic encroachment show that "justification" (or "sufficient reason" or "proper warrant") can get more complicated than just looking at the certainty/probability that your belief is true.

 

Let's consider DeRose's "bank cases" as set forth and explained by Jeremy Fantl and Matthew McGrath in their paper "Pragmatic Encroachment":

Some of our intuitions about specific cases seem to support the claim that knowledge can depend on practical factors. Consider DeRose's famous (1992) "Bank Cases":

‘Bank Case A (Low Stakes).  My wife and I are driving home on a Friday afternoon.  We plan to stop at the bank on the way home to deposit our paychecks.  But as we drive past the bank, we notice that the lines inside are very long, as they often are on Friday afternoons.  Although we generally like to deposit our paychecks as soon as possible, it is not especially important in this case that they be deposited right away, so I suggest that we drive straight home and deposit our paychecks on Saturday morning.  My wife says, “Maybe the bank won’t be open tomorrow.  Lots of banks are closed on Saturdays.”  I reply, “No, I know it’ll be open.  I was just there two weeks ago on Saturday.  It’s open until noon.”

 

Bank Case B (High Stakes).  My wife and I drive past the bank on a Friday afternoon, as in Case A, and notice the long lines.  I again suggest that we deposit our paychecks on Saturday morning, explaining that I was at the bank on Saturday morning only two weeks ago and discovered that it was open until noon.  But in this case, we have just written a very large and important check.  If our paychecks are not deposited into our checking account before Monday morning, the important check we wrote will bounce, leaving us in a very bad situation.  And, of course, the bank is not open on Sunday.  My wife reminds me of these facts.  She then says, “Banks do change their hours.  Do you know the bank will be open tomorrow?”  Remaining as confident as I was before that the bank will be open then, still, I reply, “Well, no.  I’d better go in and make sure.” (913)’

 

It looks like Keith speaks truly in Case A in attributing knowledge to himself that the bank will be open tomorrow, while he also speaks truly in Case B in denying himself knowledge.  The only thing that changes in the two cases is how important it is for Keith to be right about whether the bank will be open tomorrow.  Therefore, it looks like how important it is for Keith to be right about whether the bank will be open tomorrow is relevant to whether Keith knows that the bank will be open tomorrow.  And relevant in a clear way: holding fixed Keith’s evidence concerning whether the bank will be open tomorrow, whether he knows it will be open varies with variations in how important it is for him to be right about this.

But here we find some odd consequences.  If this is the proper lesson to draw from the Bank Cases, it would appear to follow that two subjects can have the same evidence concerning whether the bank will be open tomorrow, even though one of them knows it’ll open tomorrow and the other doesn’t.  ……What makes the difference in knowledge has nothing to do with these traditional factors.  In fact, one subject might have more evidence than another that the bank will be open tomorrow – be better informed, have done more checking, etc. – but because much more is at stake for the more well-informed subject, the more well-informed subject can fail to know that the bank will be open tomorrow while the less-informed subject knows that the bank will be open tomorrow.  All this is hard to swallow.

https://web.missouri.edu/~mcgrathma/pubs-papers/PragmaticEncroachment.doc

 

I think these cases illustrate a few different ambiguities about what it means to "know" something or be "justified" in believing something. The first ambiguity is the one I already mentioned. It seems to me that having gone to a bank a few weeks back and found it open on a Saturday is pretty good justification for the belief that it will be open next Saturday. Is it certain enough that we would say we "know" it will be open this Saturday? I think so, but it's getting pretty close, and some might disagree. If he had gone there two years ago, we would probably say that is not enough certainty to count as "knowing" it will be open this Saturday. So I think these examples are playing on that gray area of how much certainty we need before we call something knowledge. Accordingly, this example opens the door to looking at other ways Keith might or might not be "justified in believing" the bank is open on Saturday.

 

The bank cases clearly isolate a role for justification that deals not with the probability of our beliefs being true, but with the consequences of their being true or false. Let's consider how that works here.

First, saying that as "the stakes" increase, better evidence is required for knowledge is not quite what this shows. It's not just that "the" stakes are increased, but only certain stakes. Specifically, the stakes are increased in such a way that if he acts on his belief and he is wrong, he will suffer greater consequences.
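One way to make this concrete is a toy expected-utility sketch of the two bank cases. The probabilities and dollar costs below are hypothetical numbers I am supplying purely for illustration – nothing in DeRose's cases fixes them – but they show how the same evidence can recommend different actions:

```python
# Toy expected-utility sketch of DeRose's bank cases.
# All numbers are hypothetical; only the *cost of being wrong*
# differs between Case A and Case B, not the probability.
P_OPEN = 0.95  # assumed probability the bank is open Saturday

def expected_utility(p_open, cost_of_stopping, cost_if_closed):
    """Expected utility of driving past tonight (acting on the
    belief) versus stopping now to deposit the check."""
    drive_past = p_open * 0 + (1 - p_open) * (-cost_if_closed)
    stop_now = -cost_of_stopping  # waiting in the long Friday line
    return drive_past, stop_now

# Case A (low stakes): being wrong costs a minor inconvenience.
past, stop = expected_utility(P_OPEN, cost_of_stopping=5, cost_if_closed=10)
assert past > stop  # rational to drive past

# Case B (high stakes): being wrong means a bounced check.
past, stop = expected_utility(P_OPEN, cost_of_stopping=5, cost_if_closed=1000)
assert past < stop  # rational to stop, at the very same probability
```

The point of the sketch is that the flip from "drive past" to "stop" happens without any change in the evidence or the probability – only the downside of being wrong changed.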

 

Consider Case C (another high-stakes case). This case is just like Case A as far as it goes. No important checks will bounce, as in Case B. There is nothing else that would create any urgency for Keith to deposit the check before Monday. But let's add a few other facts that increase the stakes. Keith is on his way to a very important interview. He is sure he will get the job if he is on time, because a decision maker told him that everyone was so impressed with his credentials and past interview that so long as he shows up on time for this interview, they will probably make him an offer. This would be the offer of a lifetime. And given parking and the odd traffic around the bank, he is not sure whether he would be on time for the interview if he stopped to deposit the check.

 

It seems to me the stakes are just as high in Case C as they are in Case B. And I think we would still agree that Keith's knowledge claim is just as valid as in Case A. So it's not just that "the" stakes went up in Case B. The stakes went up in a way that made his being wrong about his belief yield harsh consequences. Case C increases the stakes concerning his belief as well, but it increases them in a way that reinforces acting on the belief. Could we still say he knows the bank will be open on Saturday because he went a few weeks ago? What about two years ago?

 

Rather than get bogged down in how much certainty we need for "knowledge," I would rather explore how this second kind of "justification" works on our beliefs. The distinction is whether we are justified due to the probability of our belief being true or due to the consequences of its being true or false.

 

In an earlier blog post I explained what a belief is, so that we can perhaps better understand how beliefs might be "justified." I accepted, from W.V. Quine and J.S. Ullian's book The Web of Belief, that "[a belief] is a disposition to respond in certain ways when the appropriate issue arises." This description helps us make sense of the bank cases. Case B demands more "justification" to "respond" as if the bank will be open on Saturday. The way we would "respond" if we believe the bank is open on Saturday is simply to drive past the bank on Friday night. But that response is less justified if there is some doubt about the bank being open and we risk having an important check bounce.

 

However, the "response" of driving past the bank is not less justified if the stakes are raised in a way that supports driving past the bank. Should our "disposition to respond in certain ways" (i.e., our belief) be affected by the stakes we have for responding a certain way? I think it should. That is, I think our beliefs should be affected by the stakes we have for responding a certain way.

 

Some people will recoil from this. They will think our beliefs should be affected only by the probabilities that they are true. I think that view will usually work out fine for them. However, in certain circumstances this approach may lead to irrational behavior. But we are skipping ahead too fast. Let's back up and think about a few things.

 

First, in Case C the inherent importance of holding a "true belief" seems to be overshadowed. Since there is no urgency to have the check deposited on Saturday, whether the belief "that the bank will be open on Saturday" is true seems relatively unimportant. Adding the fact that you might be late for a very important interview further decreases the concern over whether that belief is actually true or false. The consequences of your "responding a certain way" are determining your "disposition to respond in certain ways" as much as, if not more than, any inherent importance of holding true beliefs about bank hours. The probability that the bank will actually be open on Saturday becomes relatively less important in Case C, because the decision really hinges on the consequences of missing the interview.

When we look at the "justification for believing" that the bank is open on Saturday, in Cases A and C we tend to think he has more justification than in Case B. And clearly he does have more justification to be "disposed to respond" by driving past the bank.

 

In sum, I think these cases indicate that the probability of our beliefs being true is not the only consideration in whether we should hold them. In fact, I think we can see that in certain circumstances the probability of our beliefs being true can be relatively unimportant to whether we should hold them.

 

Now I think a lot of what I said depends on how we understand "belief." Some might not agree with my analysis. They might say that the belief is not better justified depending on the consequences. Remember the definition: "[a belief] is a disposition to respond in certain ways when the appropriate issue arises" (emphasis mine). They might say that the belief has the same justification regardless of the consequences, but that the "appropriate issues" change, leading to Keith driving past the bank in Cases A and C but not in Case B.

 

They might argue that the belief should not be held more or less strongly depending on the consequences, but that your actions should change as the consequences change. This seems a sensible way to view things. If we were a computer program or a robot, that might be the best approach. But sometimes I think we know we should act a certain way, yet our doubts about the probabilities prevent us from following through. I wonder what people think of what I have said so far, so I will end here.

 

[1] A philosopher named Gettier provided some important counterexamples to this definition, which became the subject of other important philosophical developments on this topic. However, I don't mean to address that now. The idea of knowledge as "justified true belief" remains a sort of default view, and it's good enough for our purposes.

What Goal are We Rationally Pursuing?

12 Wednesday Mar 2014

Posted by Joe in Uncategorized

≈ 1 Comment

Tags

apologetics, Atheism, Christianity, epistemology, philosophy, Pragmatic encroachment

It seems to me that we act rationally toward a goal. If the goal changes, then it's likely that the rational way to act will change. I decided that my goal would be to try my best to act morally, to the extent there really is a moral way to act. That is, do real good and avoid real evil. God or no God, what if there is something I should be doing to make the world really better?

Now I don't mean good as made up by some person or group, as a constructivist might think of it. That sort of made-up morality in some ways sounds good, but I decided not to live my life based on make-believe. I am pursuing the real morality, if such a thing exists. It is with that goal that I decide what beliefs I should hold, to the extent I have control over my beliefs.

I decided that if I live my life trying to live as I really should, and because of that do not live by some rules a person or group of people made up, well, I am fine with that. Sure, it's possible there is no real morality; in that case, there was nothing I really should have done anyway. But if it does really exist, then I think trying to discover what it is, and trying to live by it, should be my focus. I think everyone should give their best efforts in this regard.

Fairly early on I realized that if naturalism and evolution are true, our moral beliefs are completely unreliable. If you don't think I am right on that point (or perhaps just don't understand what in the world I am talking about), please share your thoughts in the comment section of my last blog post. But for this post I want to rest on that conclusion. I argued for it in the last post, and now I want to draw some other conclusions. So for this post I'll assume my conclusion there is correct. This also happens to be the conclusion reached by a few other philosophers, including Richard Joyce, Sharon Street, and Mark Linville.

What that means is: if evolution and naturalism (N and E) are true, our moral beliefs are completely unreliable. From that I concluded that pursuing one set of moral beliefs is no better or worse than any other set if N and E are true. Accordingly, the morality of Christianity would be no less likely to be true than any other, even if N and E are true. So even if evolution and naturalism are true, following Christ would be no worse a moral option than any other in the rational pursuit of my goal.

It's at this point that I think it is established that the nonbeliever has lost his case that the believer is acting less rationally – at least toward the goal of living a life that is really morally correct. From this point forward I will try to push things a bit further and argue that the nonbeliever is less rational than the believer in pursuit of the goal of leading a really moral life.

Ok, so we see that if N and E are true, our moral beliefs are completely unreliable, and so it doesn't matter what moral beliefs we choose. But what if N and E are not true? Since all moral beliefs are a wash if N and E are true, I think it's rational to focus our attention on the possibility that N and E are not true.

Specifically, what if naturalism is not true? Then it seems we might actually have reliable moral beliefs. But how could we know what they are? From what I (and the three philosophers mentioned above) have argued, I am convinced that natural processes alone could not produce beings with this knowledge. So we would need to look to some supernatural or non-natural source that could teach us these morals. From this it seems we should weigh the evidence for which sources of morality have a supernatural/non-natural confirming source. There are many religions that fit this bill, and I would suggest the reader consider these religions and which has the best evidence. I won't go into that weighing here. But I would point out that when it comes to weighing religious moral schemes, we are looking for evidence that the moral teachings were affirmed by a supernatural/non-natural source.

Now I anticipate a few objections to what I said.

The first is to say: what if there is a God who gave us our moral beliefs but wants us to believe there is no God?

I think we weigh the evidence for this God the same way we would for any other God: what is the evidence that this God exists? But I think there is a second problem with continuing not to believe in this God. It seems a contradiction to believe in this God while following this God's rules: if we believe in and follow this God, then we are not believing what this God wants us to believe.

Finally, I think there is a third problem with not believing in God. If we do not believe in God and we understand that what I and the other philosophers said is true, then the belief that there is no God would also imply our moral beliefs are unreliable. This would undermine our determination to act morally when acting morally is hard. When it's hard, it would be easy to rationalize and say, "Well, the reliability of my moral beliefs is suspect anyway." Now I admit that reaction wouldn't be rational given my goal. But I think it would happen. When you know you are subject to irrationally immoral behavior by taking a certain course of action (and here I include actions such as adopting a belief, or taking actions that would lead to adopting a belief), then a rational person will not take that course of action.

Here is a second objection:

So let's say we agree to follow the God for which we think there is the best evidence. But the "best evidence" is really pretty weak. Let's say, for example, we think the Christian God is more likely than Zeus, but maybe just barely. Let's say we don't think the evidence for the Christian God makes his existence more probably true than not. Nevertheless, that God has better evidence than any other God. What then?

I think we need to consider this carefully. It seems to me that if we knew full well this God existed – because we could see him continually and literally standing over us, watching our every move – few of us would sin. But that is not the case. And so we all sin, or act in ways we might agree are not how we should. It seems to me that the firmness of our belief in God is important to how well we follow his moral laws. And again, that is our goal: we want to find and follow the real moral way of life.

How we should look at this depends on how committed we are to our original goal of trying our best to act morally.

Let me offer an analogy involving a game. For this scenario, let's say you are not in need of any set sum. You want to maximize your potential return. In fact, maximizing your potential return in this game trumps all your other concerns. Maximizing your return in this game is, in effect, all that matters to you.

Let's say there is a roulette wheel with 1,000,002 numbers. You receive $3,333.34 every month over the course of 25 years, for a total of $1,000,002.00. You must immediately place the money on a number once you receive it. At the end of the 25 years there will be one throw that decides the winning number. You can only keep the money that is on the number the ball lands on. You can put money on more than one number. So you could put one dollar on each number; you would be sure to get one dollar back, but you also know you would get only one dollar.

Now, everyone knows the number 7 is slightly rigged, such that the ball is three times as likely to land there as on any other particular number. I am not saying it is three times as likely to land on 7 as on all the other numbers combined. I am just saying it is three times more likely to land on 7 than on, say, 474,923 or any other particular number you pick.
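To see how the rigged wheel pays off, here is a quick expected-value check. It is a sketch under the stated assumptions only: 1,000,002 numbers, $1,000,002 placed in total, and 7 three times as likely as any other single number:

```python
from fractions import Fraction

NUM_SLOTS = 1_000_002   # numbers on the wheel
TOTAL = 1_000_002       # total dollars placed over the 25 years

# 7 is three times as likely as any other particular number:
# (NUM_SLOTS - 1) ordinary numbers at probability p, plus 7 at 3p,
# so (NUM_SLOTS - 1) * p + 3 * p = 1.
p_other = Fraction(1, NUM_SLOTS + 2)   # = 1/1,000,004
p_seven = 3 * p_other                  # = 3/1,000,004

# Strategy 1: every dollar on 7.
ev_all_on_7 = p_seven * TOTAL          # just under $3

# Strategy 2: one dollar on every number (a guaranteed $1 back).
ev_spread = 1

print(float(ev_all_on_7))  # roughly 3.0
print(ev_spread)           # 1
```

So going all in on 7 has roughly three times the expected return of the guaranteed-dollar strategy, even though the ball almost certainly will not land on 7.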

How do you bet over the 25 years?

Now let's say you went all in on 7, but the number came up 775,957. How do you feel? Do you feel bad that perhaps you were irrational?

On the other hand, let's say you figured you did not have "enough evidence" to believe in the number 7. After all, you lacked evidence sufficient to show that 7 was "more likely than not" going to be the winner, so you just picked a random number like 42 and went all in on that. And the number 7 came up. And then you saw the other people who picked 7. Would you disagree with them if they told you it was irrational for you not to go all in on 7?

Here is a more interesting question. Let's say some people actually claimed to firmly believe it would be 7 and went all in on 7. Let's say they looked at the situation and just wanted to make sure they acted rationally in this game. So they reinforced the idea that it would be 7, to be sure they would not place any money outside of 7. For example, they convinced themselves that the odds of its being 7 were much higher than they really were. Was that irrational with respect to pursuing their goal?

I don't think it was. I think so long as your actions concerning an uncertain belief would not change by adding certainty to that belief, it is not irrational to reinforce it. That is, whether a person believes the chance of 7 winning is .0003%, .3%, 33%, or 100%, when all the other numbers are each around .0001%, it makes no difference: you should still bet it all on 7. So none of the actions this belief is relevant to are negatively affected by puffing up the belief. And in fact puffing up this belief might be beneficial.
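The invariance claim here can be checked directly. The sketch below compares only the two strategies discussed in this post (all on 7 versus a dollar on every number) and shows that, for each of the credences listed above, the expected-return comparison recommends the same action:

```python
def best_bet(credence_in_7, total=1_000_002):
    """Which of the two strategies maximizes expected return,
    given the bettor's (possibly puffed-up) credence that 7 wins?"""
    ev_all_on_7 = credence_in_7 * total
    ev_spread = 1.0  # a dollar on every number returns exactly $1
    return "all on 7" if ev_all_on_7 > ev_spread else "spread"

# Whether the credence in 7 is .0003%, .3%, 33%, or 100%,
# the recommended action is the same:
for credence in (0.000003, 0.003, 0.33, 1.0):
    assert best_bet(credence) == "all on 7"
```

Since the recommended action never changes across that whole range of credences, inflating the credence costs the bettor nothing in this game, which is the sense in which "puffing up" the belief is not irrational here.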

Let's say the evidence suggested that people who did not puff up the belief that it would be 7 often put some money on other numbers. Assuming your goal was to maximize your possible gains, would it be irrational not to puff up the belief that the number 7 would win? I think it might be.

How should those who reinforced their belief feel if the ball happened to come up 42? Would you be able to say their foolishness mattered?

