Most people, old and young, take vitamins or supplements because they believe they need them. Vitamin producers have convinced people that taking their products improves health and prevents sickness. Yet neither the FDA nor clinical studies have shown that routine vitamin use benefits the people who take them. I have never heard of anyone who cured any kind of malady simply by taking vitamins. To be fair, taking vitamins will not do you any harm, but it works much like a placebo: it makes you believe your health is improving when no such thing is happening. If people ate an adequate diet of vegetables, fruits, and other essential nutrients, they would not need to take vitamins at all. Please, if you believe strongly in vitamins and think I'm in error, leave a comment and enlighten me; I will be more than thankful. But you have to prove it to me first 🙂