I had a conversation with my mother about vitamins, diet, and medication just a couple of minutes ago. It's really frustrating, both to try to explain to her and to try to understand it myself: I don't want to take vitamins or change my diet. Why not, if they could help me feel better? Shouldn't I be willing to try anything that might help?
There's a whole bunch of guilt wound up in the vitamins and diet that doesn't come up when discussing medication. If I can cure my depression and ADHD by eating better and taking vitamins, that means I'm not really sick, right? Obviously the cure was at my fingertips the whole time, and I was just too bad a person to see it. I don't want it to work, because that would mean I've been faking it this whole time.
Depression, practically by definition, comes with a shitload of guilt and self-blame over things that may or may not be my fault (I have a biased perspective here, obviously, writing from the inside), and a lot of it is about not being really sick, just lazy/an attention seeker/a bad person. Admitting that diet and vitamins and exercise could help feels like admitting that I'm just a terrible person.
Medication is a lot safer. If you take medication, it means there's something wrong with you. Healthy people take vitamins and then drink wheatgrass juice and jog out the door with a big smile or something. Sick people take meds.
So by choosing to side with medication rather than vitamins, I'm siding with safety, with validating my own emotions and feelings. It's annoying, though, because there's always the shadow of the vitamins, haunting my brain, whispering that I'm not sick, that if I took them I could be healthy.
I don't know what to do about that.