Following the lead of McGovern's Senate Select Committee, the USDA drafted its first official Dietary Guidelines for Americans in 1980. The recommendations were strikingly similar to McGovern's, despite the fact that many of the organizations involved in food and nutrition, like the Food and Drug Administration (FDA), the National Academy of Sciences (NAS), and the National Institutes of Health (NIH), had considered the original Dietary Goals to be largely a political document rather than a scientific one. Now, solid scientific evidence was needed to support the low-fat diets they were promoting (that's a little backwards, no?). In the early 70's, NIH administrators had decided against conducting a $1 billion clinical trial that would likely have offered a definitive answer to whether or not low-fat diets prolong life. Instead, they opted to conduct a half-dozen smaller observational studies, at a third of the cost, that they hoped would provide the evidence they were looking for. The results of these studies were published between 1980 and 1984.
Four of these studies tried to establish relationships between dietary fat and heart disease. They observed populations in Honolulu, Puerto Rico, Chicago, and Framingham, Massachusetts, and although some of the data suggested an association between fat and heart disease, the data on all-cause death told a different story. In the Honolulu study, the researchers followed 7,300 men of Japanese descent and found that those who developed heart disease ate slightly more fat and saturated fat than those who didn't. However, the men who actually died tended to eat less fat and less saturated fat, meaning low-fat diets were associated with increased mortality. Similar results were seen in Framingham and Puerto Rico. The researchers' interpretation of these results is absolutely astounding. They reported that because the men in Puerto Rico and Honolulu who remained free of heart disease ate more starch, the studies suggested that Americans should follow the Dietary Guidelines and eat more starch, and that, in order to avoid eating too many calories, Americans should also reduce fat intake. WHAT?? Talk about twisting the results to support your hypothesis...
And it gets worse. Also found in this data, with the exception of the Chicago study, was an association between cholesterol levels and cancer rates. That is, low cholesterol predicted higher cancer rates. This link was not an isolated finding. In fact, by 1980, this association was showing up regularly in studies like these. But the connection seemed to present a problem for the diet-heart hypothesis and was never publicized the way it should have been. Just ask someone on the street today if they've ever heard that high cholesterol may be protective against cancer... they'll probably laugh and walk away. Regardless, there is a strong link there. In the Framingham study, men whose total cholesterol was below 190 mg/dl were three times as likely to get colon cancer as men with cholesterol over 220 mg/dl, and they were almost twice as likely to get any form of cancer as those with cholesterol over 280 mg/dl. This is not an anomaly. It is in fact very typical of these types of studies. Those with higher cholesterol tend to have a slightly greater risk of heart disease, but those with lower cholesterol clearly have a greater risk of cancer, and often a greater total mortality rate as well.
The National Heart, Lung, and Blood Institute (NHLBI) also published two studies in the early 1980's that were supposed to provide support for the diet-heart hypothesis. The first was the Multiple Risk Factor Intervention Trial (MRFIT), which collected a group of 12,000 men who were considered at imminent risk of having a heart attack; they all had a total cholesterol level over 290 mg/dl. The men were randomly divided into two groups, a control group and an intervention group. The control group was told to live, eat, and address their health problems however they wanted, while the intervention group was counseled to quit smoking, take medication to control their high blood pressure, and eat a low-fat, low-cholesterol diet. The men were then followed for 7 years. The results, announced in October 1982, showed that there had been slightly more deaths in the intervention group than in the control group. Also of note, despite the fact that 21% of those in the intervention group quit smoking compared to only 6% in the control group, the intervention group had more lung cancer. The researchers attributed this to the fact that those on the low-fat diet had lower cholesterol levels, hence were more likely to succumb to cancer. I would concur.
The other NHLBI study was the Lipid Research Clinics (LRC) Coronary Primary Prevention Trial. This trial collected 3,800 men with cholesterol levels over 265 mg/dl, considered imminently likely to suffer a heart attack. All of the participants were counseled to eat a cholesterol-lowering diet, but half of them took a cholesterol-lowering drug called cholestyramine, while the control group took a placebo. So, to clarify, the only difference between the two groups, and thus the only variable being tested, was the cholesterol-lowering drug. Here are the results: In the control group, cholesterol levels dropped 4%, 158 men suffered non-fatal heart attacks, 38 died from heart attacks, and 71 men died overall. In the group receiving cholestyramine, cholesterol levels dropped by 13%, 130 men suffered non-fatal heart attacks, 30 died from heart attacks, and 68 men died overall. Putting the heart disease numbers aside: 71 deaths versus 68 deaths. In other words, the cholesterol-lowering drug improved the chance that any one of the men who took it would live through the next decade by less than 0.2 percentage points. Very insignificant. Any right-minded person would wonder, then, how such an unimpressive cholesterol-lowering drug trial was featured in Time magazine as proof that cholesterol was a plague on us all, and that we need to lower our cholesterol by eating low-fat, low-cholesterol diets. The headline of the article read "Sorry, It's True. Cholesterol Really Is a Killer." Basil Rifkind, who headed the study, was quoted in the article as saying, "It is now indisputable that lowering cholesterol with diet and drugs can actually cut the risk of developing heart disease and having a heart attack." That may be true, but it doesn't appear to save any lives, and that's the ultimate goal.
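To make that arithmetic concrete, here's a quick back-of-the-envelope sketch. It assumes the roughly 3,800 participants were split evenly into two groups of about 1,900, which is an approximation on my part rather than a figure from the trial report.

```python
# Back-of-the-envelope check of the LRC mortality numbers quoted above.
# Assumption: the ~3,800 participants were split evenly (~1,900 per group).
group_size = 1900

deaths_placebo = 71         # total deaths in the placebo (control) group
deaths_cholestyramine = 68  # total deaths in the cholestyramine group

risk_placebo = deaths_placebo / group_size
risk_drug = deaths_cholestyramine / group_size

print(f"Placebo mortality:        {risk_placebo:.2%}")              # ~3.74%
print(f"Cholestyramine mortality: {risk_drug:.2%}")                 # ~3.58%
print(f"Absolute difference:      {risk_placebo - risk_drug:.2%}")  # ~0.16%
```

However you slice it, the absolute difference in the chance of surviving the trial is a fraction of a percentage point.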
I hope it is clear at this point that the research backing the diet-heart hypothesis is surprisingly thin. You would think that for an entire nation (and most of the developed world, for that matter) to adopt low-fat diets as the gold standard in healthy eating, there would need to be a wealth of research supporting them. As I've shown here, the science is ambiguous at best, but by this point the media had grabbed hold of the low-fat diet so hard that it would never let go. The USDA Dietary Guidelines for Americans are republished every 5 years, and many in the field of nutrition consider them the most comprehensive, unbiased assessment of the science, even though they've barely changed since 1980. Dietitians blindly take them as truth and implement them in their work, often without questioning them. To them I say: get some historical context.
Next time, in what will definitely be the last part of this Historical Context series, I'll wrap up all of this information in a nice, neat little package and attempt to bring it all together.
Saturday, March 26, 2011
Thursday, March 24, 2011
Historical Context, Part 4 - Dietary Goals for the United States
Probably the most influential event in the acceptance of the diet-heart hypothesis, the one that finally cemented the idea that we should all eat less fat and cholesterol, was the 1977 publication of Senator George McGovern's Dietary Goals for the United States. This was, in McGovern's words, "the first comprehensive statement by any branch of the Federal Government on risk factors in the American diet." In other words, until then, the government had never told Americans what they should be eating. Unfortunately, what led to the publication of these guidelines had little to do with nutritional science.
The influence of Ancel Keys on this process cannot be overstated. The man should take a large part of the credit (or blame) for convincing the country to fear fat, not just through his research but also through his influence on the American Heart Association (AHA), on whose board he sat. As early as 1960, a full 17 years prior to the government recommendations, the AHA had begun recommending that Americans eat less saturated fat and cholesterol by avoiding red meat. By 1970, the AHA had broadened its recommendations to all Americans (formerly they applied only to those with high cholesterol and to smokers) and had begun an alliance with the vegetable oil and margarine manufacturers. Two of the major manufacturers began distributing a "risk handbook" to doctors all over the country, touting the benefits of avoiding saturated fats and eating more polyunsaturated fats from vegetable oils like corn oil. Doctors, of course, began passing this information along to their patients; all it took now was to add a label to a product saying "low in saturated fat and cholesterol" and poof! it's a health food in the public eye. This alliance between the AHA and the vegetable oil manufacturers dissolved in the early 1970's due to research showing that polyunsaturated fats from vegetable oils and margarine could cause cancer in rats. Nevertheless, the AHA was becoming better and better known by the public and would soon be considered a trusted source. Today, you can find the AHA logo on such heart-healthy foods as Cocoa Puffs and Lucky Charms.
Another political problem gaining momentum in the early 1970's was the problem of feeding the world's growing population. Famine in the third world was a constant presence in the news, where images of starving, impoverished children from all over the world were regularly shown. A growing number of concerned individuals began blaming this world hunger on the wasteful American livestock industry. The idea was brought to the mainstream through the popularity of a number of books, such as Diet for a Small Planet by Frances Moore Lappe and Appetite for Change by Warren Belasco. According to Lappe, a 26-year-old vegetarian, the American beef industry required 20 million tons of soy and vegetable protein to produce two million tons of beef. So, she argued, we would be doing the world's growing population a favor by bypassing this process and simply subsisting on the soy and vegetable protein ourselves. This argument transformed meat-eating into a social issue, as well as a moral one. Wrote Warren Belasco in Appetite for Change, "A shopper's decision at the meat counter in Gary, Indiana would affect food availability in Bombay, India." In the eyes of these people, there wasn't enough food for everyone because the food industry was feeding it to cattle to support our meat-loving nation. Coincidentally, this sentiment ran parallel to the AHA's stance that Americans should eat less saturated fat, especially red meat. Just to clarify, I'm not some cold bastard who doesn't care about starving children in the third world; I'm simply trying to make the point that there were other factors at play in this whole diet-heart hypothesis deal that had nothing to do with recommending an optimal diet for health. Besides, in 1968, before this public starvation scare even took hold, Norman Borlaug had created high-yield varieties of dwarf wheat that ended famines in India and Pakistan and averted the predicted mass starvations. The hunger problem was already on its way to a solution.
And now we come to Senator George McGovern. The aforementioned document, the 1977 Dietary Goals for the United States, was a product of McGovern's Senate Select Committee on Nutrition and Human Needs, a bipartisan nonlegislative committee that had been formed in 1968 with a mandate to wipe out malnutrition in America. In its first five years, McGovern and his colleagues were very successful in implementing federal food-assistance programs to feed the hungry in America. But by 1977, McGovern's Senate Select Committee was in danger of being reorganized and downgraded to a subcommittee, which would operate under the Senate Committee on Agriculture. As investigative reporter William Broad explained it in a 1979 article, the Dietary Goals constituted a last-ditch effort to save McGovern's committee from reorganization. The committee members knew that this was primarily a political move. Committee staff director Marshall Matz was quoted as admitting, "We really were totally naive, a bunch of kids, who just thought, Hell, we should say something on this subject before we go bankrupt." So McGovern and his committee decided to "just pick one" and support the diet-heart hypothesis, recommending that Americans consume less fat and cholesterol. The committee used the "changing American diet" story as the basis of its position, stating that at the turn of the century Americans consumed less fat and more carbohydrates, and heart disease was rare (refer to part 2 of this series for more on that). Incidentally, they also loved Ancel Keys' Seven Countries Study (boooo). They emphasized the need to return to the diet of the past, reducing meat and fat intake in favor of grains and other carbohydrates.
Now, put yourself in George McGovern's shoes for a moment... by endorsing the low-fat diet, you're winning on so many levels. First, you're promoting the Senate Select Committee's reputation, not to mention your own reputation as a politician. Let's not forget politicians' obligation to make themselves look good. Second, you're promoting a diet that can feed more people, the kind of diet that a government can get behind. And third, you're giving the American people concrete advice to follow that you believe will improve their health. The Dietary Goals were a culmination of all of these factors. The major problem, however, was that the public now thought the debate was over, that fat and cholesterol were killers... and the science didn't support that. The Dietary Goals made it seem as though the data were clear, while in reality they were anything but. Skeptics would continue to say that more research was needed in order to offer accurate advice, but unfortunately "more research needed" isn't particularly quotable or catchy. The key concept to understand here is that Dietary Goals for the United States was not a scientific document; it was a political one. In the last part of this Historical Context series, I'll address the actual dietary research that refutes this low-fat, low-cholesterol dogma, and also discuss the ginormous impact that the Dietary Guidelines had on public opinion.
Friday, March 18, 2011
Historical Context, Part 3 - Ancel Keys
Ancel Keys, a University of Minnesota physiologist, deserves much of the credit for convincing the public that dietary fat and cholesterol are killers. He initially became famous through his development of the "K-ration" for feeding combat troops in World War II; the "K" stood for Keys. He then performed a series of human starvation studies and wrote the book "The Biology of Human Starvation", which made him a well-known, reputable nutrition researcher. Originally, Keys did not believe dietary fat and cholesterol had anything to do with the rising heart disease rates, but his opinion changed when he attended a conference in Rome in 1951, where he spoke with a physiologist from Naples, Italy, who boasted about the lack of heart disease in his city. The diet in southern Italy was low in animal products, and the people there, especially the poor, tended to have lower cholesterol than people in the United States. The rich in Naples, however, ate more meat and had higher cholesterol levels and heart disease rates. This convinced Keys for the first time that dietary fat from meat was driving the heart disease epidemic in the United States.
There were two key observational studies performed by Ancel Keys that ended up having an impact on the public's view of dietary fat. The first, which many researchers did not take seriously, was the 1953 study he performed involving six countries, comparing their fat intake to their heart disease rates. The six countries he reported on (United States, Canada, Australia, UK, Italy, and Japan) showed a very strong association between fat intake and heart disease. Now, of course, this is only an observational study and no cause and effect can be determined. But the biggest problem with the study is that he left out the data from the 16 other countries for which data were available. When all 22 countries are considered, his perfect correlation turns into a much weaker one.
Initially, in 1957, the American Heart Association (AHA) opposed Ancel Keys on the diet-heart hypothesis. It published a 15-page report that year denouncing Keys and similar researchers for jumping to conclusions about the diet-heart hypothesis when there was no good evidence that it was true. Less than four years later, in December of 1960, the AHA flipped its stance and adopted the diet-heart hypothesis as its new philosophy on heart health, proclaiming that "the best scientific evidence of the time" strongly suggested that a low-fat diet, or at least replacing saturated fats with polyunsaturated fats, would help prevent heart disease. What had changed in that four-year period? Not the evidence. There was no new evidence to either confirm or reject the diet-heart hypothesis. What had changed was that Ancel Keys and Jeremiah Stamler, another supporter of Keys, now made up two of the six members of the AHA committee. Soon after, Ancel Keys was enshrined as the face of dietary wisdom in America in an article in Time magazine. The article discussed Keys' idea of a heart-healthy diet as one in which nearly 70% of calories came from carbohydrates and just 15% from fat. Despite the fact that there was ZERO evidence from clinical trials to back up this claim, the article contained only one short paragraph explaining that Keys' hypothesis was "still questioned by some researchers with conflicting ideas of what causes coronary heart disease."
The second important study done by Ancel Keys was considered to be his masterpiece, the Seven Countries Study. It is still considered a landmark study today because of the pivotal role it played in the acceptance of the diet-heart hypothesis. Launched in 1956, the study followed 16,000 middle-aged men for over a decade, tracking their diets and their heart-disease risk. The populations Keys chose came from seven countries: Italy, Yugoslavia, Greece, Finland, the Netherlands, Japan, and the United States. The results showed, again, a remarkably clear association, but this time the association was between saturated fat and heart disease. Keys drew three conclusions from this study: 1. Cholesterol levels predicted heart disease. 2. The amount of saturated fat predicted cholesterol levels and heart disease. 3. Monounsaturated fats protected against heart disease.
Seems pretty clear, huh? Not quite... there are a number of problems with the study. First and foremost, this is an observational study, and like I've said a million times, you cannot determine any causality from it. Secondly, Keys chose countries that he knew would fit his hypothesis. Had he chosen at random, he might have included countries like France or Switzerland that consume high amounts of saturated fat and have very little heart disease. Third, we know now that middle-aged men are the only population for which total cholesterol numbers can predict heart disease, and the Seven Countries Study only looked at middle-aged men. Lastly, and perhaps most importantly, Keys didn't look at total mortality, even though what we really want to know is whether or not we'll live longer. Coronary heart disease accounted for less than a third of deaths. He said himself in a 1984 follow-up paper that "little attention was given to longevity or total mortality." Interestingly, if all-cause death had been taken into account, Keys would have found that the American population he studied lived longer than any other population except the Crete islanders, despite their high cholesterol.
Even with all of the problems with Ancel Keys' research, his findings on saturated fat and cholesterol would have a profound impact on the public due to a sort-of perfect storm of events that would eventually lead up to the first government dietary recommendations, Senator George McGovern's 1977 Dietary Goals for the United States. More on that in part 4!
Wednesday, March 16, 2011
Historical Context, Part 2 - The Diet-Heart Hypothesis
In 1950's America, the diet-heart hypothesis was born, theorizing that the fat in our diets caused heart disease. Proponents of this hypothesis had two very compelling reasons to believe in it. First was the increase in heart disease rates, which had more than doubled since the 1920's. The other was the "changing American diet" story: the idea that at the turn of the century, Americans were consuming significantly more grains and less meat and were healthier for it. These two ideas together formed the basis for the diet-heart hypothesis. The fat-laden diet of the 1950's must have been the reason for the skyrocketing heart disease rates, right? I would say no. Both of these ideas, the foundation of the diet-heart hypothesis, are easily explained by other phenomena. Much of the information in this and the rest of this historical series comes from Gary Taubes' fantastic book Good Calories, Bad Calories. I'd recommend it to everyone, but it's so information-dense that it reads sort of like a textbook. Luckily, you have me to summarize for you.
First on the list is the belief that heart disease was rare before the 1920's and grew into America's #1 killer by the 1950's. Census data showed that in 1910, only 250 Americans out of every thousand would die from heart disease, but by 1950 that number had risen to 560, more than double. So the real question is: did heart disease rates really increase, or was there simply an increase in awareness of the disease, or perhaps better technology to diagnose it? As it turns out, the introduction of the newly invented electrocardiogram in 1918 made heart disease much easier to diagnose. People were also living longer by the 1950's, thanks to antibiotics that could control infectious disease; life expectancy had increased from 48 years in 1900 to 67 years in 1950. As we know today, very few heart attacks are seen in 48-year-olds, and obviously the longer one lives, the more likely one is to develop a chronic disease like heart disease or cancer, which, incidentally, also increased in this time span. The increase in heart disease diagnoses was also driven by newly created diagnostic categories, like the cause-of-death category added in 1949 for arteriosclerotic heart disease. From 1948 to 1949 alone, total heart disease rates increased by 20% for white males and 35% for white females. A similar pattern was seen when a category for ischemic heart disease was added in 1965. Based on all of this information, it appears that this "great epidemic" may not have had roots in diet at all.
The other half of this argument, the "changing American diet" story, can be challenged as well. Ancel Keys, a University of Minnesota researcher who will be the subject of the next piece in this series, wrote in 1953, "The present high level of fat in the American diet did not always prevail, and this fact may not be unrelated to the indication that coronary disease is increasing in this country." Keys and other supporters of the diet-heart hypothesis envisioned the turn of the century as an era free of chronic disease thanks to a high-carbohydrate, low-fat diet. The food disappearance data on which this assertion was based, though, were not reliable. The statistics date back to 1909, but the USDA only began compiling the data in the early 1920's. The resulting numbers for per-capita consumption are acknowledged to be, at best, rough estimates. Here's an example of the kind of data I'm talking about: this one shows estimated flour and grain consumption.
The data before 1942 were particularly sketchy, especially when it came to foods that were grown in a garden or eaten straight off the farm, such as animals slaughtered for local consumption. In fact, David Call, a former dean of the Cornell University College of Agriculture and Life Sciences, when asked about the early food disappearance data, stated that "Until World War II, the data are lousy, and you can prove anything you want to prove." Historians of American dietary habits can provide some insight into the diet before the turn of the century, citing several sources indicating that in the 1800's, Americans were a nation of meat-eaters, typically eating meat 3 or 4 times per day. Also of historical note is the fact that at the turn of the century, pasta was considered by the general public to be "a typical and peculiarly Italian food", according to The Grocer's Encyclopedia of 1911, and rice was still an exotic item imported from the Far East, so Americans may not have been eating much of these foods. But if it is true that grain consumption was high and meat consumption was low by 1909, it was probably a brief departure from our meat-based diets of the past. At the time, the cattle industry was reportedly having trouble producing enough meat to feed the growing United States population, so there would have been less meat available; Americans would have had to cut back.
What is most interesting to me is this: if diet-heart hypothesis supporters wanted to use the food disappearance data as evidence for their claim, why did they choose to ignore the data on fruit and vegetable consumption? In the years between 1909 and the 1950's, when heart disease rates doubled, vegetable consumption increased dramatically. Americans nearly doubled their consumption of leafy green and yellow vegetables, tomatoes, and citrus fruit. Why was this not taken into account? This is one of many examples of proponents of the diet-heart hypothesis choosing to ignore evidence that doesn't support their hypothesis. This is exactly the kind of thing a good scientist tries to avoid, but this type of bias occurred frequently in the development of the low-fat theory.
So in the end, the diet-heart hypothesis was created to provide an explanation for the "heart disease epidemic". But if you look deep enough, there was already an explanation for it, one that made a whole lot more sense. And if you look at American dietary history before 1909, it looks like we ate a lot of meat, making the low-meat, high-grain diet of the early 20th century a deviation from the norm. Taken one step further, using the lens of evolutionary biology (which researchers did not have the luxury of in the 1950's), prehistoric humans ate more meat than Americans ever did, they lived into their 70's, and heart disease was virtually nonexistent. Nonetheless, many researchers, Ancel Keys in particular, became enamored with the idea that a low-fat diet was the key to keeping heart disease at bay, and he was determined to prove he was right. Stay tuned for part 3!
Tuesday, March 15, 2011
Historical Context, Part 1 - Hunter-gatherers
It's no secret that I tend to disagree with the mainstream dietary advice we're getting from the government recommendations. I'm clearly in the minority here, but I would attribute that to the fact that the majority of people don't dig deep enough into nutrition to find the truth. The information that makes it to the public eye has become so convoluted, through politics, lobbyists, capitalism, etc., that it's just an incoherent mess. I may go into that conflict of interest another day, so let's close that can of worms for now.
What I'd like to do, in an attempt to make my view seem a little less crazy, is provide some historical context for the human diet and how we came to where we are today. In my opinion, knowledge of history is vital to understanding any topic of interest. In order to know where you stand, you must know where you came from. I'll begin with a review of hunter-gatherer diets and work my way up to the current era, when we began to believe in this high-carb nonsense. So without further delay, here is part one of this historical context series.
For more than 99% of human history, our diets were drastically different than they are today. Before the adoption of agriculture began in approximately 8000 B.C.E., humans hunted and gathered their food, consuming whatever was available to them in their environment. And "whatever was available" definitely did not include grains, legumes, or dairy, although there is some evidence that grains were eaten in emergency situations when there were no other options. The most commonly cited example of hunter-gatherer diets is that of northeast Africa. According to Loren Cordain and S. Boyd Eaton, two of the leading researchers in paleolithic nutrition, the diet of northeast Africans is likely close to the ideal human diet because the majority of human history took place there; we are in fact nearly 100% genetically identical to these African ancestors. A typical African hunter-gatherer diet would consist of wild meat, vegetables, fruit, nuts, and seeds, a stark contrast to the diets of modern civilization. Note the large meat and fish consumption, which was much higher than ours today: a typical hunter-gatherer would have consumed 55-65% of his calories from meat and fish. Research shows that vegetable and fruit consumption was much higher than in modern diets as well, even the most vegetable-rich modern diets. In terms of macronutrient ratios, African hunter-gatherers would consume 25-30% of calories from protein, 30-35% from carbohydrates, and 40-45% from fat. In contrast, modern Americans consume about 15% of calories from protein, 55% from carbohydrates, and 30% from fat.
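If percentages don't mean much to you, here's a rough sketch that converts those ranges into grams per day. The 2,500-calorie daily intake is purely an assumption for illustration; the 4/4/9 calories-per-gram values are the standard conversion factors.

```python
# Rough conversion of the macronutrient ranges above into grams per day.
# Assumption: a 2,500 kcal/day intake, chosen only for illustration.
DAILY_KCAL = 2500
KCAL_PER_GRAM = {"protein": 4, "carbohydrate": 4, "fat": 9}

diets = {
    # fractions of total calories (midpoints of the hunter-gatherer ranges)
    "African hunter-gatherer": {"protein": 0.275, "carbohydrate": 0.325, "fat": 0.425},
    "Modern American":         {"protein": 0.15,  "carbohydrate": 0.55,  "fat": 0.30},
}

for name, shares in diets.items():
    grams = {macro: round(DAILY_KCAL * share / KCAL_PER_GRAM[macro])
             for macro, share in shares.items()}
    print(f"{name}: {grams}")
# African hunter-gatherer: ~172 g protein, ~203 g carbohydrate, ~118 g fat
# Modern American:          ~94 g protein, ~344 g carbohydrate,  ~83 g fat
```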
Once humans migrated out of Africa, approximately 100,000 years ago, they survived and thrived on a variety of diets. Some migrated to similarly warm climates and probably ate a similar diet, but many moved to colder climates where there were far fewer edible plants. In these cases, people relied much more heavily on animals, especially in the winter months. The latest ice age, which spanned from roughly 60,000 to 20,000 years ago, also had a significant impact on human diets, forcing people again to rely mostly, and sometimes exclusively, on animal foods. In these cases, fat must have made up at least 70% of their calories, and probably even more than that. As a general rule, those who settled in colder climates got more of their calories from animal products, while those in more tropical climates ate more fruit and vegetables year round.
Clearly, humans were able to adapt and survive eating a wide variety of foods. But none of these groups ate grains, legumes, or dairy as any significant part of their diets until the agricultural revolution, which began around 10,000 years ago. At this time, domestication of animals made dairy foods a possibility, and the ability to grow food made grains (whole grains, mind you) an important part of the human diet. Aside from obvious benefits of agriculture, though, like the ability to feed a larger population, human health took a hit. In fact, there are a surprising number of unfortunate consequences that resulted from the adoption of agriculture, but this post is already too long and I'd rather not explain them. If you're interested, check out Jared Diamond's outstanding article entitled "The Worst Mistake in the History of the Human Race." But the idea that human health declined after the agricultural revolution is not debatable, despite the fact that it is contrary to popular belief today. Humans essentially sacrificed health in order to support a greater population.
In part 2 of this series of blog posts, I'll fast forward to 20th century America and we'll begin to get into modern nutritional science. Stay tuned.
Saturday, March 12, 2011
A Tale of Two LDL's
When looking at blood lipid profiles, doctors tend to stress LDL cholesterol to their patients, but a high LDL level alone doesn't necessarily indicate a high risk of heart disease. It turns out that there are in fact two types of LDL. You have pattern A LDL, which is large and fluffy, and you also have pattern B, which is small and dense. The large, fluffy type is not associated with an increased risk of heart attacks, while the small, dense type very much is. People with the larger LDL particles tend to have normal levels of other risk factors: they typically have high HDL and low triglycerides. People with small LDL experience the opposite: they typically have low HDL and elevated triglycerides. These two types of LDL exhibit exactly opposite effects on cardiovascular health, so why are we still so concerned with total LDL? Well, probably because doctors don't normally test for LDL particle size. Maybe they should hop on that.
The discovery that there are two very different types of LDL has far-reaching implications, most notably in the saturated fat debate. For years, the USDA Dietary Guidelines have stressed that we should reduce saturated fat and cholesterol intake because they raise LDL levels. So what if they raise LDL levels? Do they raise pattern A or pattern B LDL? Research shows that saturated fat in the diet tends to raise the benign, large, fluffy pattern A LDL. This study from Sweden shows that people who consume more milk fat (whole milk, cheese, butter, etc.) have predominantly large, fluffy LDL. This study from UConn and this one out of Mexico both show that consumption of eggs, which are high in both saturated fat and cholesterol, results in the non-atherogenic large, fluffy LDL. This makes evolutionary sense as well: it is estimated that hunter-gatherers consumed at least 10-15% of their calories from saturated fat, while the Dietary Guidelines say we should keep it under 10%. Maybe the USDA should actually read the research instead of making recommendations that will sell more processed foods made from corn and soy.
My advice: pay no attention to your overall LDL level and don't fear saturated fat or cholesterol. Your total LDL number is meaningless unless you know which type of LDL you predominantly have. Even if your doctor doesn't check for LDL particle size, though, there is still a good way to predict which type you've got. If your HDL is high and your triglycerides are low, you're probably safe regardless of your LDL count, because your LDL is going to be the large, fluffy pattern A. If your HDL is low and your triglycerides are high, then you're in trouble, even if your LDL level isn't high. In the end, total LDL just isn't all that important a predictor of heart disease. If you've read this and my previous blog post, you have to wonder... why is everyone so concerned about cholesterol??
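For what it's worth, one common way to turn that rule of thumb into something concrete is the triglyceride-to-HDL ratio. The sketch below is exactly that, a sketch: the ratio framing and the cutoff values are my own illustrative assumptions, not clinical guidance, and certainly not a substitute for an actual particle-size test.

```python
# A rough sketch of the rule of thumb above: use the triglyceride-to-HDL
# ratio as a stand-in for LDL particle size. The cutoffs are illustrative
# assumptions, not clinical advice.

def guess_ldl_pattern(hdl_mg_dl: float, triglycerides_mg_dl: float) -> str:
    """Guess whether LDL is likely pattern A (large/fluffy) or pattern B (small/dense)."""
    ratio = triglycerides_mg_dl / hdl_mg_dl
    if ratio < 2:    # high HDL, low triglycerides -> likely large, fluffy LDL
        return "likely pattern A (large, fluffy)"
    if ratio > 4:    # low HDL, high triglycerides -> likely small, dense LDL
        return "likely pattern B (small, dense)"
    return "indeterminate -- a direct particle-size test would be needed"

print(guess_ldl_pattern(hdl_mg_dl=65, triglycerides_mg_dl=70))   # pattern A
print(guess_ldl_pattern(hdl_mg_dl=35, triglycerides_mg_dl=180))  # pattern B
```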
Wednesday, March 9, 2011
High Cholesterol = Longer Life?
The issue of cholesterol is complex, to say the least. You might not realize it from watching cholesterol-lowering drug commercials, or even from talking to your doctor. But there is way more to the story than simply keeping your cholesterol low. This will probably be a frequent topic on my blog, as there are so many aspects of it to be explored, but for today I'd like to focus on total cholesterol.
Check out this very eye-opening paper. It's an outstanding review of some of the more intriguing cholesterol research, and one of the few papers that I've actually found difficult to put down. I seriously couldn't stop reading it. And it blew my mind.
According to several studies, older adults with higher cholesterol live the longest. In fact, the groups with the lowest cholesterol levels typically have the highest mortality rates. Here's the breakdown from a couple of these studies... Dr. Harlan Krumholz found in 1994 that elderly people with low cholesterol were twice as likely to die from coronary heart disease as those with high cholesterol. Another study of 92 women aged 60 or over found that those with a total cholesterol level of about 270 mg/dl lived the longest. Those with the highest cholesterol, over 300 mg/dl, were only 1.8 times more likely to die than that longest-lived group, while the lowest cholesterol group, at 154 mg/dl, was 5.4 times more likely to die.
Interesting stuff huh? That review paper discusses 20 studies just like these, where blood cholesterol levels were either not associated with cardiovascular disease or all-cause death, or there was an inverse relationship. This quote from the paper sums up the situation quite nicely...
"It is true that high t-C is a risk factor for coronary heart disease, but mainly in young and middle-aged men. If high t-C or LDL-C were the most important cause of cardiovascular disease, it should be a risk factor in both sexes, in all populations, and in all age groups. But in many populations, including women, Canadian and Russian men, Maoris, patients with diabetes, and patients with the nephrotic syndrome; the association between t-C and mortality is absent or inverse; or increasing t-C is associated with low coronary and total mortality. Most strikingly, in most cohort studies of old people, high LDL-C or t-C does not predict coronary heart disease or all-cause mortality; in several of these studies the association between t-C and mortality was inverse, or high t-C was associated with longevity. These associations have mostly been considered as a minor aberration from the LDL-receptor hypothesis, although by far the highest mortality and the greatest part of all cardiovascular disease are seen in old people."
In case you're having trouble with the terminology, t-C just means total cholesterol, and LDL-C means LDL cholesterol. A couple of key points here... if cholesterol is the cause of heart disease, then shouldn't it be a risk factor for everybody, regardless of age, sex, or ethnicity? It should be. But it's not. While cholesterol levels can be somewhat predictive of one's risk of heart disease, cholesterol doesn't cause the problem. The last point in that quote is key as well. Studies done in older adults, like the ones showing that those with high cholesterol have less risk of cardiovascular disease, should not be dismissed as an aberration. By far the highest-risk group is adults over 60, so if the conventional wisdom doesn't hold true for them, it doesn't hold true at all. These studies should tell us that our current thinking about cholesterol is highly flawed. Cholesterol simply cannot be the cause of heart disease when it is so notoriously unreliable as a predictor of heart disease in the most at-risk populations. It's that simple.
Sunday, March 6, 2011
Dietary Fat and Breast Cancer, Part 2
Finally, on to part 2. Let's talk about a study I found through marksdailyapple.com showing that a high-fat, high-cholesterol diet causes faster growth and proliferation of tumors than a "normal" diet. The study was done in rats; the control group received rat chow 5010, while the other group received a higher-fat "Western Diet 57BD." Looking at the macronutrient breakdown, there appears to be nothing wrong with either diet. The rat chow consisted of 29% protein, 13% fat, and 58% carbohydrates, while the Western diet contained 15% protein, 41% fat, and 44% carbohydrates. You can read the whole study and find nothing about the actual ingredients in the diets, but a quick Google search will turn up the spec sheets. Here are the contents of the control group's rat chow:
- Ground corn
- Dehulled soybean meal
- Wheat middlings
- Fish meal
- Ground wheat
- Wheat germ
- Brewers dried yeast
- Ground oats
- Dehydrated alfalfa meal
- Porcine animal fat
- Ground soybean hulls
- Soybean oil
- Dried beet pulp
- Added vitamins and minerals
And here's what went into the higher-fat "Western Diet 57BD":
- Sucrose (31% by weight)
- Milk fat (21%)
- Casein (19%)
- Maltodextrin (10%)
- Powdered cellulose (5%)
- Dextrin (5%)
- Added vitamins and minerals
Problem #2 - Their diet consisted of 31% sugar?? That's their main carbohydrate source. Sugar. There's a lot of research showing a link between insulin and cancer growth, and sugar requires a hefty insulin release. Here, here, and here. And this study of mammary tumor growth showed that mice fed sucrose had 100% tumor incidence, meaning every single mouse developed breast cancer.
Problem #3 - Casein. Casein, a protein derived from milk, was used in the rat studies conducted by T. Colin Campbell, who wrote The China Study. He found that diets of around 20% casein promoted tumor growth in rats, while 5% casein diets did not. The diet in this experiment is 19% casein.
So how can the researchers claim that a high-fat, high-cholesterol diet promotes cancer cell growth? If they really wanted to test their hypothesis, they would have done their best to keep as many variables as possible unchanged between the two groups. They wouldn't have fed the experimental group a laundry list of chemically isolated compounds. So in my mind there are two possible reasons why the researchers would have used a diet full of known cancer-promoting non-food items. One, they're stupid. And I refuse to believe they're stupid, because they wouldn't be where they are today without knowing the basics of science. The second option, which I find more likely, is that the researchers' goal from the beginning was to confirm their hypothesis. So they fed the rats a diet which, while higher in fat, was also higher in substances that would be sure to cause tumor growth. I don't know why they would have done this; perhaps there was some influence from the drug or food industries, or maybe the researchers had giant egos and were so convinced that high-fat diets promote cancer that they didn't want to risk being wrong. The only thing I know for sure is that this is really bad science.
Saturday, March 5, 2011
Side Note on Sunlight and Vitamin D
Perhaps I should have elaborated more on the vitamin D and cancer connection. I doubt anyone will actually watch that one-hour vitamin D lecture I posted a link to, and I know everyone's been told for years to stay out of the sun because it'll give you skin cancer. So simply telling you that sun exposure reduces cancer rates probably isn't convincing. Allow me to explain myself.
First off, how serious is the skin cancer problem? One look at the cancer statistics from 2010 shows that non-melanoma skin cancer is surprisingly benign. In 2010, fewer than 1,000 people in the United States died from non-melanoma skin cancer. That's about 0.0003% of the population, and less than 0.2% of total cancer deaths, according to the National Cancer Institute's statistics. There are at least 12 other types of cancer that are more deadly. Perhaps the problem has been a bit overstated.
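Just to show where those percentages come from, here's the back-of-the-envelope math. The inputs are my own round-number approximations of the 2010 figures (roughly 1,000 non-melanoma skin cancer deaths, a US population of about 309 million, and about 570,000 total cancer deaths), so the outputs are ballpark numbers, not official statistics.

```python
# Back-of-the-envelope check on the skin cancer numbers.
# These inputs are rough approximations of the 2010 figures, not exact counts.
skin_cancer_deaths = 1_000      # non-melanoma skin cancer deaths (approx.)
us_population = 309_000_000     # 2010 US population (approx.)
total_cancer_deaths = 570_000   # all US cancer deaths in 2010 (approx.)

pct_of_population = skin_cancer_deaths / us_population * 100
pct_of_cancer_deaths = skin_cancer_deaths / total_cancer_deaths * 100

print(f"{pct_of_population:.4f}% of the population")    # ~0.0003%
print(f"{pct_of_cancer_deaths:.2f}% of cancer deaths")   # ~0.18%, i.e. under 0.2%
```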
But on to the task at hand. Let me begin with these two maps... the one on the left shows the amount of UVB radiation received across the nation, and on the right is a map of breast cancer prevalence. There are maps like this for all types of cancer, and they all show similar patterns.
They're strikingly similar. Areas with more UVB exposure, the type of sunlight your skin uses to make vitamin D, tend to have lower cancer rates. Here's another chart; it's pretty old, but still very relevant...
Cities that receive more solar radiation have lower rates of breast cancer. But like I explained in my last post, correlations like these don't prove causality. That's where the newer research on vitamin D comes into play. High blood levels of vitamin D, measured as 25(OH)D, are clearly associated with a lower risk of cancer. According to this study, supplemental vitamin D combined with sun exposure is enough to raise blood levels to 52 ng/ml, a level associated with a 50% reduction in the incidence of breast cancer. Another study, this time a controlled clinical trial, tested the impact of vitamin D and calcium supplementation on cancer rates in postmenopausal women. The group receiving both vitamin D and calcium, as opposed to calcium alone or a placebo, showed a "60% or greater reduction in all forms of cancer." Very significant.
Let's put this all together. Climates receiving more UVB exposure from the sun, the type that forms vitamin D in our bodies, are associated with lower cancer rates. High vitamin D blood levels are associated with lower cancer rates. Vitamin D supplementation significantly lowers the risk of developing cancer. Add to this the fact that it makes evolutionary sense that the sun would be beneficial for us: When humans migrated further and further from the equator, their skin became lighter and lighter, becoming more efficient at making vitamin D through limited sun exposure. And while we're at it, throw in some common sense too: Why would the very thing that gives us life on earth, the sun, kill us at the same time? Connect the dots, and it looks like sun exposure probably prevents cancer more than promotes it. At least that's my interpretation of the evidence.
Now I'm not saying you should go out in the sun and tan all day long to the point where you burn. That's not good for anybody. However, it is completely unnatural to avoid the sun altogether. There's nothing wrong with moderate sun exposure. And take it easy on the sunscreen. Sunscreen prevents UVB absorption, meaning you won't get burned but you also won't make vitamin D. If you'll be in the sun all day, at least hold off on the sunscreen until you've had a chance to get the benefits of sun exposure.
Could it be that the conventional medical advice to stay out of the sun has actually caused more cancer than it has prevented?? Let that marinate for a while.
Thursday, March 3, 2011
Dietary Fat and Breast Cancer, Part 1
Ah, dietary fat. So delicious, yet so frowned upon. If you look hard enough, you can find dietary fat associated with just about anything out there, from heart disease, to diabetes, to colon cancer, even ingrown toenails (not really). But the breast cancer link is something I've heard about for years in the media, and from talking to people who watch too much local news. A few weeks ago, in one of my classes, my professor showed us a graph of the relationship between dietary fat intake and prevalence of breast cancer by country. Here it is... you'll probably have to click on it to view the full-size version.
I found this one through a Google search. Interestingly, the two points that don't seem to fit in line with the others, South Africa and Israel, were left out of the graph we saw in class; I guess my professor gave them the Ancel Keys treatment... but that's not the point here. At first glance, this looks like a pretty strong relationship: the more fat a nation eats, the higher its rate of death from breast cancer. And even though this is just a correlation, meaning no cause and effect can be determined, it certainly piqued my interest. I'm always very skeptical any time a study blames fat for anything; I find it impossible not to be after reading Good Calories, Bad Calories. And I began to wonder, what else could possibly account for this correlation? And that's when it clicked. Sunlight.
If you look closely at the graph, the countries that consume less fat and have lower mortality from breast cancer tend to be in warmer climates, while the ones that consume more fat and have higher mortality from breast cancer are in colder climates. Knowing that vitamin D (from sunshine) seems to have strong anti-cancer properties, of course the countries that get more sun would have less mortality from breast cancer. If you're interested, watch Dr. Mercola's lecture on vitamin D; I don't want to get into the details here. So I decided to plot this information in an Excel graph. I estimated the degrees of latitude from the equator for each country and plotted that against breast cancer mortality rates. Here's what I came up with...
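If you'd rather do this in Python than Excel, here's roughly the kind of script that produces a latitude-versus-mortality scatter like mine. The country numbers below are made-up placeholder values for illustration, not the actual data points I plotted, so swap in your own figures.

```python
import matplotlib.pyplot as plt

# Placeholder data: (country, approx. degrees of latitude from the equator,
# breast cancer deaths per 100,000 women). Illustrative values only --
# not the actual numbers behind the graph discussed above.
data = [
    ("Thailand", 15, 4), ("Mexico", 23, 9), ("Israel", 31, 22),
    ("Japan", 36, 6), ("United States", 38, 22), ("Spain", 40, 17),
    ("United Kingdom", 54, 28), ("Denmark", 56, 27), ("Finland", 61, 17),
]

latitudes = [lat for _, lat, _ in data]
mortality = [rate for _, _, rate in data]

plt.scatter(latitudes, mortality)
for country, lat, rate in data:
    plt.annotate(country, (lat, rate), fontsize=8)

plt.xlabel("Degrees of latitude from the equator")
plt.ylabel("Breast cancer deaths per 100,000 women")
plt.title("Latitude vs. breast cancer mortality (placeholder data)")
plt.show()
```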
Latitude, in other words, is what's called a confounding factor. Not quite as neat a correlation as the dietary fat graph, but there's clearly an association there. So what does all this mean? Does it mean vitamin D is the problem, not dietary fat? I'd put my money on vitamin D deficiency playing a larger role in the deaths from breast cancer than dietary fat intake. Actually, I'd argue against the very notion of lumping all types of fat into one group like this; it doesn't make sense... but that's a topic for another day. From an objective standpoint, you can't infer much of anything from this data. In observational studies like these, there are so many uncontrolled factors involved that you can't determine any cause and effect at all. Here are a few other confounding factors that could be at play here...
- People who eat less fat tend to eat more vegetables.
- People in warmer climates are able to grow vegetables for a longer part of the year, so are more likely to eat them.
- People in cooler climates tend to eat more refined grains.
- People in warmer climates are more likely to get more physical activity.
What these observational studies are actually good for, however, is to form hypotheses that can be tested in more controlled trials. It's been done in rats, and quite poorly I might add... we'll discuss that in part 2 of this series.
Wednesday, March 2, 2011
Robb Wolf and Art Devany on ABC Nightline!
Paleo has gone mainstream! Sort of. I think it's great that these guys were able to do a story on a huge mainstream media outlet like Nightline; that's fantastic for getting the word out. They're going a bit overboard with the "caveman" stuff, making it seem like sort of a novelty or a fad, which it is NOT! But any publicity is good publicity, I suppose. At least it's better than the kind Charlie Sheen is getting these days. Check out the Nightline story here!
Tuesday, March 1, 2011
The $#!T We Used To Know
People think I'm crazy. When I say things like "grains aren't healthy" or "saturated fat might even be good for you," people tend to ignore me; after all, everyone knows grains should be the foundation of a healthy diet and saturated fat causes heart disease. Today, I'd like to bring up a few historic examples of the ridiculous "facts" we used to know. Brace yourself.
1. Doctors don't need to wash their hands. Well into the late 1800's, it was not common practice for doctors to wash their hands with soap before working with a patient, or even between procedures. A Hungarian physician, Ignaz Semmelweis, showed back in the 1840's that handwashing was an effective means of preventing deadly infections, but his colleagues largely ignored him for decades. Could you imagine a doctor performing a surgical procedure on you right after he had his hands all up inside another patient? It used to happen.
2. Frontal lobotomy for mental illness. In the mid-1900's, a procedure called the lobotomy was used in an attempt to cure mental illness. In fact, António Egas Moniz received the Nobel Prize in Physiology or Medicine in 1949 for his discovery of the procedure's therapeutic value in treating certain psychoses. So what exactly is a frontal lobotomy? In its later "transorbital" form, doctors would drive an ice-pick-like instrument through the top of the eye socket and into the prefrontal cortex, severing connections in the frontal lobes. Yes, they literally stabbed patients in the head. The procedure may have been somewhat effective in reducing symptoms of mental illness, but it predictably had side effects. As many as 3% of patients were killed by the procedure, according to the 1970 Psychiatric Dictionary. Beyond that, it essentially made patients easier to control by reducing their mental capacity. Good for nurses in the psychiatric ward, bad for humanity in general.
3. Exercise ruins your bones. This myth was commonplace until the 1950's. The conventional wisdom of the day was that resistance training wears down your muscles and bones and should be avoided, especially by older adults at risk of degeneration and osteoporosis. Today, this sounds ridiculous. Personally, I don't know how anyone ever believed it. If you look at it from a historical perspective (always useful), people knew that lifting heavy things was good for you as far back as Ancient Greece. Not sure how the message got lost along the way, but at least that myth is dead now.
4. The sun revolves around the earth. Yup. Before the early 1600's, everyone knew the earth was the center of the universe. That is, until Galileo Galilei came along with his telescope and showed that the earth was in fact orbiting the sun, not vice versa. The Roman Inquisition, of course, didn't like that, because it challenged the Catholic Church's view that God created the earth and made humans to run the place because they are above the laws of nature because they can control it by growing their own food and invent new ones like bread, and by coercing formerly wild animals to live beside them only to be killed for their consumption. That may have been a run-on sentence. But the point is we now know that the sun is the center of our solar system and Galileo was right all along. Take that, Pope.
5. Cigarettes are endorsed by physicians. This one might not be too hard to believe, considering all the drugs physicians push these days. But it's true: doctors used to endorse cigarettes for "throat protection against irritation against cough." Grammatical considerations aside, it actually took a long time for the tobacco companies to finally give in to the mountain of data linking smoking to lung cancer. Today, it's widely accepted that cigarette smoking does cause lung cancer, and we can all laugh about these old ads.
I'm sure there are plenty more examples of this type of thing, but my point with this post is to say that people today tend to think we know everything. These examples seem silly, knowing what we know today, but if you were able to take yourself back in time, you would probably accept the conventional wisdom just as everyone else did. So, next time you hear something that challenges your most precious beliefs about health and nutrition, or anything else for that matter, don't simply dismiss it because "everyone knows the truth." The conventional wisdom of the time is often far from the truth.