delirium happy

Just keep on trying till you run out of cake

On Faith
rho
I'd like to talk about faith. In this world of ours with its increasing polarisation between science and religion there seem to be two oft-espoused opinions on faith, which I tend to think of as the Jack Chick view and the Richard Dawkins view. The Jack Chick view states (as I understand it) that faith is the single greatest human virtue that there is, above all other virtues, and that faith is both necessary and sufficient for salvation. The Richard Dawkins view, on the other hand, states (same caveat) that faith is nothing more than foolishness and self-delusion, and is actively dangerous and should be opposed at every turn. Needless to say, there are plenty of other opinions in between these two extremes, but they tend not to get stated so often. Which is why I'm going to expound my own personal opinions.

In my view of the world, there are two modes of knowledge and thought, which I think of as the logical and the intuitive. You could also call them Apollonian and Dionysian. Or left brain and right brain. Or thinking and feeling. Or any one of a multitude of different possible names. There are some things that we know because we have reasoned them out logically from an initial set of observations and suppositions. On the other hand there are things where we don't have all the necessary evidence to be able to make such an inference, but where we have a strong intuitive feeling anyway. It is my strong belief that these two modes must be balanced and any system which denies either one of them is flawed.

We all use both methods of thinking all the time. Science is generally thought of as being intensely rational, but even there, intuitive thought is all over the place. Take something like Euclidean geometry. From five postulates and five common notions, we can derive the entirety of geometry using only logical means. But we have to ask, where did those postulates come from? How did Euclid determine that it's possible to draw a straight line between any two points, or that all right angles are equal? He certainly didn't prove them. He just wrote them down as a starting point because they intuitively felt as if they were correct.

We also have notions such as Occam's Razor and the scientific method. These aren't proved or deduced or anything. They're just plucked out of the ether because they're what feels right. And that's before we even start to think about things like leaps of intuitive reasoning in coming up with new scientific theories or ideas.

On the other hand, even when we're acting almost entirely instinctively, there's still a whole lot of logical reasoning going on. Let's say that I meet someone and feel instantly at ease with them and intuitively feel that I can trust them. Let's say they then ask to borrow something of mine. Clearly, I trust them, so I let them borrow it. Only it's not that simple. I would actually be reasoning that since they are trustworthy, and since trustworthy people are likely to return borrowed things, then whatever they wanted to borrow would likely be returned, therefore I would not be losing out by lending it. All of which amounts to a decidedly logical chain of thought.

Essentially, everything we know must be, at its most basic level, intuitive, but these bits of intuitive belief must be held together by a glue of logic in order to form a coherent whole.

So what does this have to do with faith? I define faith not as belief in spite of lack of evidence, but as belief based on intuition in spite of the lack of logic-based evidence. It's then important to realise two things: intuition is right more often than chance alone would predict, but intuition is still often wrong.

First, cases where intuition is right. From my own experience, I know that there have been plenty of times where I've just had a nagging feeling that something wasn't right, figured I was just being silly and put it to the back of my mind, and then turned out to be right. I also know that in some maths or physics problems, I can look at an answer and see instantly that it's wrong, without knowing why. I've known very quickly after meeting some people that we'd get along well. And so on and so forth.

Now, I'm no neurologist or psychologist, but I'd speculate that we have far more sensory input than we could possibly hope to consciously process. We see and hear and sense all kinds of things, some of which we aren't even aware of. It seems reasonable to me that some part of our brain should be able to process these things and come to conclusions from them, even when we can't rationally explain why. After all, things like subliminal advertising exist.

Then onto the cases where intuition fails. Human beings are remarkably good at spotting patterns. In fact, we're far too good at it, and can easily spot patterns even when none are there. I'll use my favourite dice example. Imagine you're playing a dice game of some sort with someone else. The first three times they throw their die, it comes up each time as a 6. The probability of that happening on a fair die is 1 in 6³, which is 1 in 216. Not terribly likely. The die must be weighted, right? Well, no. There's also a 1 in 216 chance of it showing a 2 then a 5 then a 4. Or any other combination you care to choose. Whatever the first three rolls were, the probability of it happening that way on a fair die is 1 in 216. We only notice the three 6s though, because it looks far more suspicious to us. In reality though, we encounter lots of dice in our lifetime. Why shouldn't one of them come up with several 6s the first time we throw it? And why can't now be that one time?
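The arithmetic here is easy to check empirically. A minimal Python sketch of my own (not from the original post) computing the exact probability and simulating a large number of fair dice:

```python
import random
from fractions import Fraction

# Exact probability of any particular three-roll sequence on a fair die.
p_any_sequence = Fraction(1, 6) ** 3
assert p_any_sequence == Fraction(1, 216)

# Simulate many fair dice, each rolled three times for the first time.
random.seed(0)
trials = 200_000
triple_six = sum(
    1 for _ in range(trials)
    if [random.randint(1, 6) for _ in range(3)] == [6, 6, 6]
)
two_five_four = sum(
    1 for _ in range(trials)
    if [random.randint(1, 6) for _ in range(3)] == [2, 5, 4]
)

# Both specific sequences turn up at roughly the same rate, about 1/216,
# even though only the three 6s would catch our attention.
print(p_any_sequence, triple_six / trials, two_five_four / trials)
```

Both sequences are equally "suspicious" by the numbers; only our pattern-spotting treats them differently.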

In fact, pretty much the whole of the scientific method is set up to guard against fooling oneself by seeing patterns when they aren't there. If we think we see a pattern, the scientist has to ask "is it repeatable?" "is it independent of environment?" "can it be used to make predictions?" and so on. If these sort of questions can be answered in the affirmative, then it's a safe bet that the observed effect is real.

And again, this tendency towards spotting patterns that aren't there makes sense to me. Let us go back and imagine our ancestors of several million years ago. A group of them are sitting around. One of them drops a bit of fruit he was eating. Some nearby grass rustles. Birds fly up into the air. Suddenly, a lion attacks. Some time later, one of them drops a piece of fruit again. He remembers that last time he did that, a lion attacked, so he runs up a tree to hide. Obviously no lion attacks, and he's just wasting his time. Then some more time passes. He sees the same ripple of grass, and birds flying into the air as he saw with the lion attack, so he again climbs a tree to hide. This time, a lion does attack.

I'm far from an expert on the subject, but it would seem to me that the drawbacks of seeing patterns when they aren't there (wasting a bit of time and energy climbing a tree) are rather outweighed by the benefits of being able to spot a pattern that does exist (avoiding being eaten). Obviously, what you ideally want is a 100% success rate, with neither false positives nor false negatives, but that's pretty much impossible to achieve.
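That trade-off can be made concrete with a back-of-the-envelope expected-cost comparison. Every number below is invented purely for illustration:

```python
# Hypothetical costs, in arbitrary units (illustrative, not from the post).
cost_false_alarm = 1      # energy wasted climbing a tree for nothing
cost_missed_lion = 1000   # cost of ignoring a real lion

p_lion_given_cue = 0.05   # assumed chance the rustling grass means a lion

# Expected cost, per cue, of each policy:
always_climb = cost_false_alarm                  # pay the climb every time
never_climb = p_lion_given_cue * cost_missed_lion  # occasionally eaten

print(always_climb, never_climb)
```

With these made-up numbers, always climbing costs 1 unit per cue while never climbing costs 50 on average, so the jumpy, pattern-happy ancestor wins even though most of his climbs are wasted.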

Regardless of the reasons why, intuitions are sometimes right but can't be relied upon. Same thing with faith. People with faith tend to just feel that their beliefs are correct. Maybe they're right. Maybe they're wrong. And both sides need to remember that. In the absence of any definitive evidence, there's not much more anyone can say.

At least, that's my take on things.

On a related note, I recently found myself wasting the best part of an evening reading http://en.wikipedia.org/wiki/List_of_cognitive_biases and related links. I was familiar with many of them already, but reading about so many complementary biases greatly exacerbated my intuitive sense that humans shouldn't generally be trusted to make [rational] decisions.

I agree with your point that seeing patterns (and presumably other cognitive biases) probably provides some form of net advantage. Primitive humans (such as cavemen and those who believe everything they read on the BBC news) don't have the advantage of statistical data or the knowledge of how to apply it, so have to make judgements using whatever information they do have.

Neat link!

On a side note, I have devoted a decent part of my mental energy to fighting the outcome bias. I refuse to be hard on myself or consider myself to have made the wrong decision when it turns out wrong, if I still feel I made the best decision I could have given the information and abilities I had at the time. I wasn't wrong... it just didn't work out. Just like buying a lottery ticket isn't a good investment, even if it ends up winning. The decision was still wrong, the person just got lucky.

As to the original post... we do process data we're not aware of. That's how the cocktail party effect works... where you're in a large group with lots of people talking and you're just following the conversation in front of you, but then you hear your name. How did you hear your name? Well, part of your brain was doing low-level filtering for really important info, and when it hit some, it sent it through to higher level processing.

On the other hand, you also have a lot less info than you think you do. You remember more than you remember. What I mean by that is when remembering an event, you tend to have a fairly decent amount of detail and info, generally more than you actually stored. Because your brain fills in bits and pieces here and there based on what it did store. It stored a gist, and assumed more. This is why it's so incredibly easy to alter people's memories, and why eye-witnesses suck. What they actually tend to remember is pretty heavily influenced by the exact questions you ask them the first time round.

Plus, you don't necessarily see everything you see. Again, you take in info, and your brain fills in the rest. This is how some optical illusions work. My favorite example of this is hard for people to run or even for me to repeat. Your peripheral vision (assuming you're not a weird mutation) is made of rods, not cones. This means it has no color perception. However, you normally move your eyes around when you see, and thus fill in the color. You see color in your peripheral vision, even though you don't. Odder... even if you don't see color, you might fill it in anyway based on your best guess. When my retina detached in my right eye, I had no central vision at all. While disturbing, it was also a rare opportunity, as it's hard to not make yourself cheat and move your eye. So, I tested my peripheral vision (this was actually one of the very first things I did when my retina detached, before seeking medical care... don't be me). Sure enough, the world was harder to see with no central vision, but it was in color. How odd... I knew I shouldn't have color vision. So, I looked at an item whose color I wasn't sure of... a soda can. It was too blurry to make out details, but it was red. Then I opened my left eye... it was green, mountain dew. Then I looked with just my right eye again... now having convinced my brain that it could not, in fact, see color... the world was black and white. But it had been in brain-supplied guesswork color.

And this is why science tries so hard to be careful, because at a very fundamental level, our data is biased by our beliefs. However, I do agree with Rho that often we know things for very good reasons that we cannot necessarily state or haven't fully thought through consciously. I'm great at this, being stronger with intuition than reason. I'll come to the right conclusion, and 3 months later have the logical explanation of why it's right. Now, I trust my intuition more and am willing to wait for the explanation to come. Sure, I'm not perfect, but neither is my reasoning. I can't be perfect no matter what process I use, but I should use what works well for me, and that's my intuition.

But I won't miss a chance to babble about psych. :)

It seems to me that the problem isn't what one might call provisional conclusions: "there's some evidence pointing to this, and nothing to prove it wrong, so I'll act as though it's true." The problem is when, having decided to act as though something is true, a person then denies any evidence (either newly discovered facts, or logical reasoning) against it.

In your example, okay, you decided to trust this person and lend them, say, a book. A month later, you've seen them a couple of times, and they've said that they enjoyed reading the book, and discussed it with you (so you know they finished it). And you asked for it back, and they made some excuse. If they then ask to borrow something else, you might say yes, but you'd probably be less confident of getting it back, because they haven't lived up to "X is trustworthy, therefore they'll return my book," and in general, it's reasonable to think "X didn't return my book, so they probably won't return my coat." If you then told your friend Y that X was trustworthy and honest, Y lent X a hacksaw, and when they didn't get it back and told you about it, they might be annoyed if you said "Oh, yeah, X kept my book and my coat, too" instead of "How strange! I thought X was honest."

For that matter, while Euclid's axioms aren't provable as such, they aren't entirely arbitrary either--they were based on observations of the world around him, and if they were false someone would have disproven them by now (indeed, there are useful geometries with different axioms about parallels).

Oh, side note on the dice example: while 2, 5, 4 is no more likely than 6, 6, 6, only one of them is plausible if the die is weighted, so only one suggests checking for that.
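This is really a point about likelihood ratios: the two sequences are equally probable on a fair die, but only one of them is evidence for a loaded one. A small sketch, using a weighting I've made up for illustration:

```python
from fractions import Fraction

# Two candidate models (the loaded weights are invented for illustration):
# a fair die, and a die weighted heavily toward six.
p_fair = {face: Fraction(1, 6) for face in range(1, 7)}
p_loaded = {face: Fraction(1, 10) for face in range(1, 6)}
p_loaded[6] = Fraction(1, 2)

def likelihood(model, rolls):
    """Probability of observing exactly this sequence under the model."""
    out = Fraction(1)
    for r in rolls:
        out *= model[r]
    return out

# Likelihood ratio (loaded vs fair) for each sequence:
ratio_sixes = likelihood(p_loaded, [6, 6, 6]) / likelihood(p_fair, [6, 6, 6])
ratio_mixed = likelihood(p_loaded, [2, 5, 4]) / likelihood(p_fair, [2, 5, 4])

print(ratio_sixes, ratio_mixed)
```

Under these assumed weights, three 6s is 27 times more probable on the loaded die than the fair one, while 2, 5, 4 is actually less probable on the loaded die, so only the 6s should make you reach for the scales.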

Good points. But I'd argue that the basic axioms of geometry and much math/science/etc. are not true, but non-falsifiable. That's why they are axioms. You can't prove them false, because they're not falsifiable. Which means, you just accept them, like faith.

I don't think that makes science and all forms of faith equivalent. Because not all axioms lead to systems of equal value. If my axiom is nothing I perceive is truly real and thus nothing I do matters, you cannot disprove it, nor can I prove it. And it's a valid axiom. But it doesn't get you much of anywhere worth going.

Of course, the problem is that sciencey people say, science has given us so much, that shows the value of its axioms - people live longer, people have better homes, we can communicate with people around the world, etc.

And religion says, but none of that really matters, it's saving souls and serving god, and science detracts from that, so it doesn't have true accomplishments.

And you can't argue against that. At a very fundamental level, you have to choose what you value. And then you can judge systems based on how they work toward those values. But you can't really argue for your values. My value is to increase happiness and decrease suffering for all sentients. And that makes me a big fan of science. But it's not going to be everyone's value.

I'm not entirely sure of my point. But I think the very fact that there are other systems of geometry with different axioms shows that a system isn't accepted because its axioms are true. It is accepted because it is useful. It is a model that leads to people being able to do things that work, whether it is true or not. Whether it being true has any meaning or not.

Touching on a long-term fascination of mine there! - the relationship between intuition and intellect.

If you currently possess some level of "being into books", you might enjoy the following ones:

Guy Claxton "Hare brain, tortoise mind" - lots of yer actual scientific research into intuitive knowing

Malcolm Gladwell "Blink" - similar stuff specifically about first impressions

Cialdini (I think Robert Cialdini, can't quite remember off the top of me head) "Influence" - cognitive biases arising through social mechanisms

All very well worth a read i.m.o.! and all quite easy to devour, if I recall correctly (i.e. not vast intellectual tomes).

Arthur Koestler expressed that view in pretty much all his work (says me, based on reading 8 or 9 of the 50-odd books that he wrote). The most relevant starting points would be Arrival and Departure for a straightforward expression of the idea. Me, I love his history of science trilogy, that started with The Sleepwalkers.
