Monday, March 14, 2016

Skeptics and circle-squarers

Mix Bakeshop
Decaf Americano
Chocolate Croissant

It's Pi Day!  So I thought it would be a perfect day to talk about circle-squarers: people who are unreasonable about irrational numbers.

Circle-squarer is a pretty obscure reference, and it relates directly to pi.  Pi is a specific number: the circumference of a circle divided by its diameter.  Exactly the same ratio for any size of circle.  Specific, but not exactly friendly to the way we write numbers down.

For millennia, humans have tried to pin down pi as an easy number that can be written on a single line of a single piece of paper, either in decimal form or, even better, as a fraction.  They've gotten closer in some times and places than in others.  The Bible implied pi equaled 3, which was maybe the least accurate guess.  In reality, it's 3 and a smidge.  And for the last few hundred years we've been getting really good at determining the value of that smidge.  But we can only get so far, because pi's digits go on and on with no end in sight - and not just in base 10; in any whole-number base you pick, the expansion never terminates and never repeats.
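If you want to watch the smidge shrink, here's a quick Python sketch.  It's just my own illustration - the famous historical approximations lined up against the math module's double-precision value:

```python
import math

# Classic approximations of pi, roughly in historical order.
approximations = {
    "3 (the Biblical value)": 3.0,
    "22/7 (ancient favorite)": 22 / 7,
    "3.14159 (classroom pi)": 3.14159,
    "355/113 (Zu Chongzhi)": 355 / 113,
}

for name, value in approximations.items():
    # Compare each guess to Python's built-in pi.
    print(f"{name:>24}: {value:.10f}  (off by {abs(value - math.pi):.10f})")
```

Even the humble fraction 355/113 lands within three ten-millionths of the real thing - not bad for something you can scribble on a napkin.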

I used to have pi memorized to a modest 12 digits (my copy of "The Joy of Pi" has pi printed to the millionth digit), but most of my math classes used 3.14159 for practical purposes.  And the average J.Q. American gets by with 3.14.  But mathematicians have known for a long time - it was proved way back in the 1700s - that pi's digits never end, because pi can't be written as a fraction of two whole numbers at all.  That's what makes it an "irrational" number.  Yet, despite this communal certainty, every so often some young mathematician will proclaim that they have found the actual value of pi - and its terminus!  And if pi really were a tidy fraction, then you could, in mathematical terminology, "square the circle": construct, with compass and straightedge, a square with exactly the same area as a given circle.  (In fact, that was ruled out for good in 1882, when pi was proved "transcendental" - an even stronger condition than irrational.)  Thus, these bold but misguided would-be truth-speakers are called "circle-squarers."

In other words, it's a nerd burn, and it roughly translates to "birther."

The question is, what makes someone believe something so wrong in the face of so much knowledge to the contrary?  Well, partly because they are not always wrong.  Sometimes it is the well-established "knowledge" that is wrong.  That was the case with Einstein, whose birthday it is today.  Funnily enough, he was born 100 years, 1 month, 1 week, and 1 day before I was.  Not that that means anything.

So, back before Einstein was Einstein, the scientific community initially pushed back against his new theory.  But, because scientists know how to science, some became open to his theory and later endorsed it, after they made the observations that could confirm it.  Bam!  Nobel in Physics.  (Technically awarded for the photoelectric effect rather than relativity, but the point stands.)

Okay, to be fair, that whole process took many, many years and much analysis.  Science generally doesn't 'Bam!' unless it's on Mythbusters.  Science is about a method of determining reality.  Or trying to, anyway.  I have glibly said that scientific laws are laws as long as they work.  When you run into a bit of reality that doesn't fit into your model, you first go through the process of determining whether or not it really happened, then see if your model can be tweaked to accommodate the new reality.  If it can't, you start looking again for a better model.

That's probably the greatest virtue of science - letting go of cherished misbeliefs to seek the truth in earnest.

But scientists are also people and have all the same flaws that other people do.  They have biases, prejudices.  Fortunately, scientists are aware of this and try to minimize the risk of their own biases ruining their work by faithfully following the rigors of the scientific method, carefully noting how they followed these rules, what data they found, and then offering all this up to other scientists to check their work.  The more people scrutinizing the work, the more confident everyone can be in the conclusion.

So how does bad science still get through the checks and balances and pass into Common Knowledge?

First: when it's not science at all, but propaganda.  And the best propaganda is put out - by scientists!  As I said, scientists are still human, and even they can be bought.  I recently shared an article on my Facebook page about how a corporation influenced a university to declare that its study showed the corporation's product had astounding health benefits.

They said special milk prevented concussions.  Yeah.

But that was such an obvious red flag to the rest of the scientific community that the statements were quickly retracted "pending further investigation."  What about when the red flags aren't so obvious?  That's the problem.

These days you can pretty much find an "expert" for any point of view.  Some are legitimate, trained experts espousing a false view anyway - because the payoff is too good to refuse, or because they have too much ego or bias on the line to admit the data is inconclusive and doesn't mean what they want it to mean, or because they're talking about stuff outside their own field and simply trust whoever told them.  Then there are the experts who are outright frauds: fake doctors, paper universities with names that sound like reputable institutions, and so on.  It's hard to know who is a trustworthy source of information.

What's harder is the fact that so many people don't care much about being well-informed or scrutinizing the source of their information.  They accept the information of chosen authority figures (experts) and make little use of the other ways of knowing the world.

Way back a million years ago when I was in college for the first time, I took ye old philosophy 101 and we discussed four ways in which we gain knowledge: authority, tautology, empiricism, and logic.  I think.  It's been a while.  Anyway.

Authority is obvious - someone tells you something and you believe it.  Tautology is something that is true by definition.  As in, 2 plus 2 equals 4, because that's the way we have defined those values and operations.  Tautologies are useful in math proofs because you're basically stating the same thing but in a slightly different way, which you may need to reach a particular conclusion.  That's why "tautological" also gets used interchangeably (though not always correctly) with the word "redundant."
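For the logic-class flavor of tautology, here's a toy Python check - my own sketch, not anything from that course - that brute-forces a truth table to confirm "p or not p" comes out true no matter what:

```python
from itertools import product

# A tautology is a statement that is true under EVERY possible
# assignment of truth values, so just try all of them.
def is_tautology(statement, num_vars):
    return all(statement(*values)
               for values in product([True, False], repeat=num_vars))

print(is_tautology(lambda p: p or not p, 1))   # True: a tautology
print(is_tautology(lambda p, q: p or q, 2))    # False: fails when both are False
```

The helper name is made up, of course, but the idea - exhaustively checking every row of the truth table - is exactly what makes a tautology a tautology.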

On to empiricism, which basically means observation.  I can determine, empirically, that it is dark outside because I'm looking out a great big window and it's night.  But seeing that it's dark doesn't tell me whether or not it's also cold.  I'd have to go out there and feel it or, even better, get a thermometer.  And hope the thermometer is accurate.  (Technology is kind of an authority figure, too, isn't it?)  But I might also determine that it's cold outside by combining the empirical observation that many passersby are wearing coats and other cold-weather clothing with the general premise that people don't usually bundle up unless it's cold.

Logic.  I love logic.  I kid you not, symbolic logic was my favorite class of my "some college" career.  It's so clear and... well, logical...  It has distinct rules, known fallacies, and understood limitations.  Sometimes there is no logical conclusion to a given question.  God is like that for me.  As I understand logic and as I understand physical reality, I don't believe you can make a logic-based argument for God's existence or non-existence.  To me, God's nature is outside the rules of nature and, thus, beyond the knowable.  And I'm okay with that.  But maybe, someday, someone will convince me one way or another.  I'm okay with that, too.

Home
cold decaf americano
cold pasta

Okay, what was my point?

Well, this reflection on knowledge has really been prompted by some appallingly untrue things floating around lately.  But rather than parse them all, the point is what this abundance of untruth says about the state of knowledge today.  In short - we don't think.  Some would say, cynically, that it's always been like this, with a statistical handful of people doing the intellectual heavy lifting, and another handful utilizing or distorting those ponderings to mobilize the opinions and behaviors of the masses.  I still say it's different today.

Historically, most of the intellectual exploration was done by the educated, the scholars.  Yes, I know there were many exceptions.  Broadly speaking, however, the masses knew that they were at an informational disadvantage and had to, therefore, rely on trusted authority figures to explain their opinions to them.  Again - broad generalizations, I know.

But this is the Hyper-Information Age.  Somehow, with the opportunity to be well-educated and continuously and accurately informed, we have instead become excessively passive in our pursuit of knowledge.  At the same time, our exposure to differing opinions has narrowed.  I blame all the algorithms designed to sell us things - even to sell us the "optimal" social media experience.  And when we only see the same opinions bouncing back at us from the same kind of people - our Opinion Tribe - not only do we have less opportunity to hear other opinions, our own opinions feel more valid because they are more readily reinforced.

This is a breeding ground for bad data to pass with barely any scrutiny.  This is the very reason you need friends at your table who don't agree with you on everything - people who will push back when you say something dumb.

If you told me that some study or other proved that raising the minimum wage reduced poverty and crime and did many other positive things, I'd probably say, "Yeah, that sounds about right," and maybe not even glance to see whether or not a recognizable institution conducted the research.  But if you told me that some study showed that raising the minimum wage caused massive unemployment and runaway inflation, I'd say, "Let me see that..." and I'd scrutinize the heck out of that study, starting with who conducted it, when, under what circumstances, what measures they used to reach their conclusions, and what confounding factors they did or didn't account for.  And we'd both be better off in the end, because we would have more confidence in stating whether that particular study was accurate or bunk.

But in the absence of that scrutiny, we leave ourselves more vulnerable to misinformation.  If we don't even look at how we know what we think we know, we commit ourselves to ignorance.  Instead of being the strong, well-informed, dynamic people we could be, we instead make ourselves somebody else's tool.

There will be times we make mistakes - reasonable ones, too.  People we thought were reasonably trustworthy, neutral arbiters of information may lead us astray, perhaps unknowingly.  The important part is to be both vigilant and humble, and not to say you've considered an argument from both sides unless you've really put the effort in.  I've seen plenty of people give lip service to objectivity and then go on to say clearly ignorant things.  Instead, go to the people who say you're wrong and ask them what information has convinced them of their opinions.  You show me your data and I'll show you mine.

And if you're both citing experts with the same titles, compare those, too.  Is it one or two fringey scientists with known biases versus a pantheon of scientists and well-respected institutions?  Is it a political think tank versus the CDC or the like?  If your side seems like the less credible one, it might still be right, but that is usually not because of some big conspiracy undertaken by the government or mainstream institutions.  What you have to answer is, "What would it take for this to be true?  What is the real probability that what I think I know is right?"

Am I Einstein or just a circle-squarer?
