Iodized salt is so commonplace in the U.S. today that you may never have given the additive a second thought. But new research finds that humble iodine has played a substantial role in cognitive improvements seen across the American population in the 20th century.
Iodine is a critical micronutrient in the human diet—that is, something our bodies can’t synthesize that we have to rely on food to obtain—and it’s been added to salt (in the form of potassium iodide) since 1924. Originally, iodization was adopted to reduce the incidence of goiter, an enlargement of the thyroid gland. But research since then has found that iodine also plays a crucial role in brain development, especially during gestation.
Iodine deficiency today is the leading cause of preventable mental retardation in the world. It’s estimated that nearly one-third of the world’s population gets too little iodine in their diet, and the problem isn’t limited to developing countries—perhaps one-fifth of those cases are in Europe, where iodized salt is still not the norm.
Iodine’s Natural Experiment
With this background, then, a group of economists saw a natural experiment: comparing the intelligence of children born just before 1924—the year iodization began—with those born just after. James Feyrer, Dimitra Politi and David Weil used military data from the early 1940s, when World War II drove millions of Americans to enlist.
Recruits all took a standardized intelligence test as part of their enlistment. The researchers didn’t have access to the test scores themselves, but they had a clever proxy: smarter recruits were assigned to the Air Forces while the less bright ones went to the Ground Forces. The branch a recruit was selected for therefore let the researchers infer his test score.
Intelligence data were paired with birthdate and hometown, since iodine levels in the soil and water vary significantly from place to place. To estimate which regions were naturally high-iodine and which were low, the researchers referred to nationwide statistics collected after World War I on the prevalence of goiter.
In all, researchers had sufficient data on about 2 million male recruits born between 1921 and 1927.
The economists found that in the lowest-iodine areas—the bottom quarter of the study population—the introduction of iodized salt had stark effects. Men from these regions born in 1924 or later were significantly more likely to get into the Air Force and had an average IQ that was 15 points higher than their predecessors.
Nationwide, that averages out to a 3.5-point rise in IQ because of iodization, the researchers report in a paper for the National Bureau of Economic Research.
The initiative wasn’t without its drawbacks—sudden iodine supplementation among people who are deficient can cause thyroid-related deaths. The researchers estimate that 10,000 deaths in the decades after 1924 were caused by salt iodization.
But on the positive side, iodine deficiency and its symptoms were vanquished almost overnight. And iodine’s mental benefits may even help explain the Flynn Effect, which observes that IQ rose about 3 points per decade in developed countries throughout the 20th century. It’s been thought that improved health and nutrition were the driving forces of the Flynn Effect. Now, it appears that iodine alone was responsible for roughly one decade of that remarkable climb. All the more reason, then, for the rest of the world to follow suit and relegate iodine deficiency to the history books.