It is my opinion that the role of medical or dental insurance companies is to pay for treatment. It was never intended that insurance companies would actually decide which treatment is necessary. Unfortunately, insurance companies, not the doctor and patient, now decide what treatment you should have.
Any discussion that begins with "My insurance company..." is no longer a discussion about your dental health or the best treatment for you, but rather about who will decide what treatment you will have.
A relatively small percentage of patients have NO choice but to accept the treatment that insurance companies will pay for. A fairly large number of people are able to decide for themselves what treatment is in their best interest and need not let the insurance company make such an important decision about their dental health. Unfortunately, many patients, financially able or not, begin our conversations with "My insurance company..."
No one would let an insurance company tell them which house, car, or coat to buy, yet many rely on the insurance company to make decisions about their health care.
Does this make any sense?