Are Organic Foods Really Healthier?

It has been said that eating organic foods offers greater nutritional benefits and can even fight cancer. That is why consumers sometimes spend twice as much on organic products as on non-organic ones.

The recent purchase of Whole Foods by Amazon highlights the explosive growth in organic sales.

The U.S. Department of Agriculture says that foods labeled organic must be:

  • Grown without synthetic fertilizers
  • Free of genetically modified organisms (GMOs)
  • From animals raised without antibiotics or hormones
  • From animals given access to the outdoors

But are organics healthier? Are they worth the money?

Here is the situation.

Concerns grew over the increasing use of pesticides on crops and growth hormones in animal feed.

In 1990, Congress passed the Organic Foods Production Act to develop national standards for organic foods.

And organic products became more common. Mainstream grocery chains started producing their own organic lines.

Here is the argument.

Proponents say organic foods have more nutrients, including antioxidants and vitamins. They also suggest that eating organic limits exposure to toxic chemicals that may lead to certain cancers.

But while eating organic does reduce your exposure to pesticides, there is no evidence that the trace amounts in food are a danger.

Scientific surveys haven’t found that organic foods are much more nutritious. And just because a food is organic does not mean it is good for you. It may be nearly as high in sugar, sodium, and other unhealthy ingredients as non-organic products.

Yet more farms are going organic to feed the appetite for pure foods. With big supermarket chains selling more organic products and Amazon’s determination to drive down prices at Whole Foods, organic foods should only increase in popularity, healthier or not.

Personally, I prefer to eat healthy, which includes eating organic, and I look forward to prices dropping through competition.

