Do Vitamins Really Make You a Healthier Person?

How many times have you heard that supplements are merely expensive urine? Or that supplements are unnecessary because we already get everything we need from our diet? Worse still, some recent mainstream media articles have suggested that vitamin supplements are actually harmful. It’s time to shed some light on these myths.

View Full Article on Huffington Post >
