Can Eating Organic Help With the Appearance of Your Skin?

If you eat enough vegetables, could your skin start to look better?

If you are what you eat, does that mean that eating natural foods makes you naturally beautiful? The short answer is...maybe.

Organic food, by definition, is food that's been grown or raised according to approved national guidelines meant to "foster cycling of resources, promote ecological balance, and conserve biodiversity." [Source: USDA] In other words, it's responsibly produced food that's better for the environment than its conventional counterparts. Whether organic food is better for the people eating it, however, remains an open question.

Most food produced in the United States is not organic. Produce is often treated with synthetic fertilizers or pesticides to help it grow and protect it from insects. It may be exposed to radiation, which can extend its shelf life and prevent foodborne illnesses. Or it may have been genetically modified in a lab so that it will grow bigger, look more appealing, or develop tolerance to stresses like heat and drought. Likewise, most livestock and poultry are fed nonorganic grains, treated with antibiotics to keep them from contracting diseases, or injected with growth hormones.

Certified organic food, which makes up about 2 percent of the U.S. food market, cannot be produced using any of the above methods. [Source: OFRF] Because of this, eating organic food can cut down on the amount of chemical (and even drug) residue you're ingesting. Many doctors and nutrition experts say this translates into better overall health -- and many dermatologists say it can even improve a person's outward appearance. Scientific studies to support these claims, however, are lacking.

A 2002 study from the nonprofit Organic Materials Review Institute found that 13 percent of organic produce samples contained pesticide residue, versus 71 percent of conventionally grown produce. [Source: Baker et al.] However, a Stanford University study made headlines in 2012 when it concluded that there is no significant nutritional benefit -- no significant health benefit at all, in fact -- to eating organic. [Source: Spangler, et al.] People who don't eat organic may be exposed to more chemicals, the researchers found, but the levels in their bodies still fall below those that could compromise health or safety.