Monday, July 28, 2014

Common Sense - Part 2

In a previous post I wrote about common sense.  In that post I concluded that common sense is largely a useless concept because it is so egocentrically defined.  I have never met anyone who admitted to a lack of common sense.  It seems to mean that which is obvious to us.  Most everyone seems to think they have it while occasionally pointing out the lack of it in others.  Mathematicians call obvious facts "trivial" and psychologists may refer to them as "intuitive" (though "trivial" and "intuitive" may be in the eye of the beholder).  One problem with common sense, which I identified in the last post, is that everyone defines it differently.  Another problem, which I will discuss in this post, is that common sense (or intuition) can sometimes turn out to be completely wrong.  In other words, an idea that seems obvious on the surface can have unnoticed subtleties that defy the common sense interpretation.

From an evolutionary standpoint we sometimes need to perceive a situation and take action quickly.  Our survival may depend on it.  Most people are hardwired to fear spiders and snakes.  This fear generally serves us well because some spiders and snakes are venomous and can even be deadly.  Our natural fears do not seem to distinguish between venomous and harmless varieties, but this is unimportant from an evolutionary standpoint.  The cost of fleeing from a harmless snake is much less than the cost of being bitten by a venomous one.  Natural selection has favored the easier evolutionary step of fearing all snakes: because the cost of a false positive is so low, the more complex task of distinguishing venomous from harmless varieties carries much less selection pressure.

The snake and spider example shows how our intuitions can turn out to be wrong through over-generalization, especially when greater discrimination carries so little survival advantage.  Over-generalization can also occur in behaviors and thoughts that are learned rather than instinctive.  My ex-father-in-law, whom I mentioned in my last post as someone often lauded for his common sense, was convinced that pictures stored on a CD would degrade over time the way printed pictures do.  No amount of explaining how digital data differs from analog data could convince him otherwise.  Not only is digital data discrete (each bit of information is either on or off, represented by a 1 or a 0), but digital data often contains error-correcting mechanisms in case any of the bits do flip because of environmental causes.  The physical marks on magnetic or optical media have enough margin between their two states (even though their values can vary slightly) that flipped bits are unlikely.  Furthermore, uncorrected flipped bits tend to make an entire file corrupt and unreadable rather than resulting in diminished quality.  And when a file is copied, any small drift in the physical signal that was not enough to flip a bit disappears entirely, because the data is recreated anew from the bit values.  All this was outside the realm of my ex-father-in-law's experience, so his intuition and common sense completely failed him on this specific point.
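To make the error-correction idea concrete, here is a minimal sketch (my own illustration, not a description of how CDs actually work) of a Hamming(7,4) code, one of the simplest schemes that can detect and repair a single flipped bit.  Real CDs use a much more elaborate Reed-Solomon based scheme, but the principle is the same: redundant bits let the reader reconstruct the exact original data.

# Hamming(7,4): encode 4 data bits into 7 bits, then repair one flipped bit.
def encode(d):
    d1, d2, d3, d4 = d
    p1 = d1 ^ d2 ^ d4                    # parity over codeword positions 1,3,5,7
    p2 = d1 ^ d3 ^ d4                    # parity over codeword positions 2,3,6,7
    p3 = d2 ^ d3 ^ d4                    # parity over codeword positions 4,5,6,7
    return [p1, p2, d1, p3, d2, d3, d4]  # 7-bit codeword

def correct(c):
    s1 = c[0] ^ c[2] ^ c[4] ^ c[6]
    s2 = c[1] ^ c[2] ^ c[5] ^ c[6]
    s3 = c[3] ^ c[4] ^ c[5] ^ c[6]
    pos = s1 + 2 * s2 + 4 * s3           # 0 means no error detected
    if pos:
        c[pos - 1] ^= 1                  # flip the bad bit back
    return [c[2], c[4], c[5], c[6]]      # recover the 4 data bits

word = encode([1, 0, 1, 1])
word[5] ^= 1                             # simulate one bit flipped in storage
print(correct(word))                     # [1, 0, 1, 1] -- original data restored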

Our intuition can also fail us when something is completely outside our normal experience of the world, and yet based on solid science.  Such is the case for Einstein's Theory of Relativity.  Relativity deals with extremes of speed and gravitation, and relativistic effects do not become important until speeds approach the speed of light or gravity approaches that of a black hole.  Our experience does not equip us with intuition for such extremes.  And yet, without taking into account the relativistic effects of gravity and orbital speed on time, our GPS system would grow increasingly inaccurate: it requires extremely precise timekeeping in satellites orbiting about 12,550 miles above the earth, where the gravitational field is much weaker than at the earth's surface.
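As a rough back-of-the-envelope check (the numbers and the weak-field formulas below are my own illustration, not anything claimed in the original post), here is a sketch of how far a GPS satellite clock would drift each day if relativity were ignored.

# Estimate the daily clock drift of a GPS satellite relative to the ground
# if relativistic corrections were ignored (rough constants, SI units).
GM = 3.986e14          # Earth's gravitational parameter, m^3/s^2
c = 2.998e8            # speed of light, m/s
R_earth = 6.371e6      # Earth's radius, m
r_orbit = 2.656e7      # GPS orbital radius (~12,550 mile altitude), m
day = 86400            # seconds per day

# Gravitational time dilation: the weaker field in orbit makes the clock run fast.
grav = GM / c**2 * (1 / R_earth - 1 / r_orbit)

# Velocity time dilation: the orbital speed makes the clock run slow.
vel = -(GM / r_orbit) / (2 * c**2)       # v^2 = GM/r for a circular orbit

drift = (grav + vel) * day
print(f"clock drift:   {drift * 1e6:.1f} microseconds per day")  # ~ +38
print(f"ranging error: {drift * c / 1000:.1f} km per day")       # ~ 11-12

The net result is a clock that runs fast by roughly 38 microseconds per day, which at the speed of light translates to a positioning error on the order of ten kilometers per day if left uncorrected.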

The Monty Hall problem is a very non-intuitive probability problem.  It is based on the show "Let's Make a Deal" from the 70s, with Monty Hall as host.  There were three doors, each concealing a prize.  Only one door contained a desirable prize; the other two contained booby prizes.  The contestant would pick one of the doors.  Then the host would open one of the two doors not chosen to reveal a booby prize and narrow the choice down to two doors.  The question is, should the contestant stick with his original choice or switch to the other remaining closed door?  Most people intuitively think that the chances are 50/50 since there are two doors left, but this is wrong.  Someone who just showed up and saw the two closed doors, without knowing which one the contestant had chosen, would indeed face a 50/50 choice, but the host's choice is constrained by the contestant's initial choice and therefore provides crucial information.  If the contestant picked the good prize right off the bat, the host can open either remaining door, but this happens only one time in three.  The other two times out of three, when the contestant's first pick was a booby prize, there is only one door the host can open.  That means that two out of three times the good prize is behind the remaining door that the contestant did not choose first.  So the contestant should switch every time: the probability of winning is 2/3 for switching and only 1/3 for staying.  This has been verified by mathematical proofs and by many computer simulations, including one set up by one of my coworkers, even though the true probabilities are very non-intuitive.
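Here is a small Monte Carlo sketch of the game (my own illustration, not my coworker's actual simulation); the assumed rule, matching the description above, is that the host always opens a losing door the contestant did not pick.

import random

def play(switch, trials=100_000):
    wins = 0
    for _ in range(trials):
        prize = random.randrange(3)   # door hiding the good prize
        pick = random.randrange(3)    # contestant's first choice
        # The host opens a door that is neither the pick nor the prize.
        opened = next(d for d in range(3) if d != pick and d != prize)
        if switch:
            pick = next(d for d in range(3) if d != pick and d != opened)
        wins += (pick == prize)
    return wins / trials

print("stay:  ", play(switch=False))  # ~ 0.33
print("switch:", play(switch=True))   # ~ 0.67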

I recently heard about another very non-intuitive math problem.  Imagine a rope around the earth at the equator.  Now raise that rope by one foot around its entire circumference, making a circle whose radius is one foot larger.  How much extra rope does that take?  Now imagine the same scenario, but this time the rope encircles the sun at its equator.  Then, similarly, raise that rope off the surface of the sun by one foot.  How much more extra rope would it take compared to the earth scenario, given that the sun is much larger, with a radius roughly 109 times that of earth?  The non-intuitive answer is that it takes exactly the same amount of extra rope in each case: 2π, or approximately 6.28, extra feet of rope.  To see that this is true, take a circle of any size and add 1 to its radius.  Since the circumference of any circle is 2πr, the difference in circumference between a circle of radius r+1 and a circle of radius r can be calculated as follows, regardless of the size of r.
2π(r+1) - 2πr
= 2πr + 2π - 2πr
= 2π
In the first step 2π is distributed over r and 1.  Then the terms 2πr and -2πr cancel, leaving 2π.  The canceling of the terms containing r shows that the size of r is irrelevant: the final answer, 2π, is just a constant, approximately 6.28, with no variables in it.  Intuitively it seems that you would need much more extra rope to encompass the sun with a one-foot increase in radius than the earth with the same increase, but that is not the case.  The amount is exactly the same in both cases.
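As a quick numerical sanity check (the radii below are rough figures assumed for illustration), computing the difference directly gives about 6.28 extra feet in both cases.

import math

def extra_rope(radius_ft):
    # Rope needed for a circle one foot larger in radius, minus the original rope.
    return 2 * math.pi * (radius_ft + 1) - 2 * math.pi * radius_ft

earth_radius_ft = 3_959 * 5280      # ~3,959 miles, in feet
sun_radius_ft = 432_000 * 5280      # ~432,000 miles, in feet

print(extra_rope(earth_radius_ft))  # ~6.2832
print(extra_rope(sun_radius_ft))    # ~6.2832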

These examples illustrate how common sense or intuition can fail us.  If common sense is the only tool in our bag, our comprehension of the world will be extremely limited.  As useful as common sense is, and as much as it is lauded (despite its mushy, egocentric definition), relying on it alone puts a rather low ceiling on what we are able to understand about the world.  This post, along with my previous post on common sense, is rather personal to me.  This is my defense of the importance of continuous education and humility in the face of our ignorance, most especially the ignorance that we fail to perceive.  I am speaking to myself and to everyone who has assumed they understood the obvious.  Sometimes the truth is not as obvious as we think it is.