I understand why. There are side effects, toxicity and a lot of corporate bullshit wrapped up in pharmaceutical companies. The majority of doctors do not know anything about nutrition and preventative modalities, and are quick to push medications on their patients.
Many people end up on meds that do not benefit them and may actually cause harm.
But what about those of us who do need them?
We are then put into an even more highly stigmatized category. Now, not only are we struggling to go on with our lives in a healthy way, but we are also shamed for taking our medicine.