We’ve all gone to the store formally called a natural food store or co-op because we thought organic was better.
Organic only means the food wasn’t grown with pesticides, but does that mean it’s healthier or better for you?
Experts say there isn’t enough evidence to show a distinct advantage to eating organic. You are less likely to encounter pesticides, which may come back to haunt all of us with side effects, while the natural fertilizer, manure, may carry E. coli. Whether you go conventional or organic, it is recommended that you rinse all fruits and vegetables before eating them.
Here is where I play the green card, and I don’t mean money. By not purchasing conventional foods, we aren’t hurting the earth with pesticides.
My gut tells me that as long as your wallet can sustain the cost, go for it. What do you think? Do you buy organic, and if so, why?