Every time I go to a doctor, he starts telling me things I need to change. I’m too fat, my blood pressure’s too high, and so on. Who is he to say what’s wrong with me?! Besides, isn’t it hypocritical of him to be pointing out my problems when I’m sure he’s not in perfect health?
If I can ever find a doctor who will just tell me things that make me feel good, I may go. Until then, I’m just never going to a doctor again!
Just in case it isn’t obvious, this post isn’t really about doctors. Having this attitude toward doctors would clearly be ridiculous. A doctor’s job isn’t to tell me things to make me feel better, but to actually help me get and stay healthy.
Reread this post, but replace “doctor” with “church.” Now it’s something you’ve probably heard many times before, but is it any less ridiculous?