Saturday, March 12, 2011
When looking at blood lipid profiles, doctors tend to stress LDL cholesterol to their patients, but a high LDL level alone doesn't necessarily indicate a high risk of heart disease. Further research has uncovered that there are in fact two types of LDL: pattern A, which is large and fluffy, and pattern B, which is small and dense. The large, fluffy type is not associated with an increased risk of heart attacks, while the small, dense type very much is. People with predominantly large LDL particles tend to have favorable levels of the other risk markers: typically high HDL and low triglycerides. People with small LDL show the opposite: typically low HDL and elevated triglycerides. These two types of LDL have opposite implications for cardiovascular health, so why are we still so concerned with total LDL? Probably because doctors don't normally test for LDL particle size. Maybe they should hop on that.
The discovery that there are two very different types of LDL has far-reaching implications, most notably in the saturated fat debate. For years, the USDA Dietary Guidelines have stressed that we should reduce saturated fat and cholesterol intake because they raise LDL levels. So what if they do? Do they raise pattern A or pattern B LDL? Research shows that saturated fat in the diet tends to raise the benign, large, fluffy pattern A LDL. This study from Sweden shows that people who consume more milk fat (whole milk, cheese, butter, etc.) have predominantly large, fluffy LDL. This study from UConn and this one out of Mexico both show that consumption of eggs, which are high in both saturated fat and cholesterol, results in the non-atherogenic large, fluffy LDL. This makes evolutionary sense as well: it is estimated that hunter-gatherers consumed at least 10-15% of their calories from saturated fat, while the Dietary Guidelines say we should keep it under 10%. Maybe the USDA should actually read the research instead of making recommendations that sell more processed foods made from corn and soy.
My advice: pay no attention to your overall LDL level, and don't fear saturated fat or cholesterol. Your total LDL number is meaningless unless you know which type of LDL you predominantly have. Even if your doctor doesn't check LDL particle size, there is still a good way to predict which type you've got. If your HDL is high and your triglycerides are low, you're probably safe regardless of your LDL count, because it's going to be the large, fluffy pattern A. If your HDL is low and your triglycerides are high, you may be in trouble even if your LDL level isn't high. In the end, total LDL just isn't a very important predictor of heart disease. If you've read this and my previous blog post, you have to wonder... why is everyone so concerned about cholesterol?
Wednesday, March 9, 2011
High Cholesterol = Longer Life?
The issue of cholesterol is complex, to say the least. You might not realize it from watching cholesterol-lowering drug commercials, or even from talking to your doctor. But there is way more to the story than simply keeping your cholesterol low. This will probably be a frequent topic on my blog, as there are so many aspects of it to be explored, but for today I'd like to focus on total cholesterol.
Check out this very eye-opening paper. It's an outstanding review of some of the more intriguing cholesterol research, and one of the few papers that I've actually found difficult to put down. I seriously couldn't stop reading it. And it blew my mind.
According to several studies, older adults with higher cholesterol live the longest. In fact, groups with the lowest cholesterol levels typically have the highest mortality rates. Here's the breakdown from a couple of these studies... Dr. Harlan Krumholz found in 1994 that old people with low cholesterol were twice as likely to die from coronary heart disease as those with high cholesterol. Another study of 92 women aged 60 or over found that those with a total cholesterol level of about 270 mg/dl lived the longest. Those with the highest cholesterol, over 300 mg/dl, were only 1.8 times more likely to die, while the lowest cholesterol group, at 154 mg/dl, was 5.4 times more likely to die.
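To make figures like "5.4 times more likely to die" concrete: those are relative risks, i.e. the death rate in one group divided by the rate in a reference group. Here's a minimal sketch of the calculation; the group sizes and death counts below are made up for illustration and are NOT taken from the studies above.

```python
def relative_risk(deaths_a, n_a, deaths_ref, n_ref):
    """Death rate in group A divided by the death rate in a reference group."""
    return (deaths_a / n_a) / (deaths_ref / n_ref)

# Hypothetical example: 27 of 100 low-cholesterol subjects died, versus
# 5 of 100 in the reference group. That works out to a relative risk of 5.4,
# i.e. "5.4 times more likely to die."
rr_low = relative_risk(27, 100, 5, 100)
print(rr_low)  # 5.4
```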
Interesting stuff, huh? That review paper discusses 20 studies just like these, in which blood cholesterol levels were either not associated with cardiovascular disease or all-cause death, or were inversely associated. This quote from the paper sums up the situation quite nicely...
"It is true that high t-C is a risk factor for coronary heart disease, but mainly in young and middle-aged men. If high t-C or LDL-C were the most important cause of cardiovascular disease, it should be a risk factor in both sexes, in all populations, and in all age groups. But in many populations, including women, Canadian and Russian men, Maoris, patients with diabetes, and patients with the nephrotic syndrome; the association between t-C and mortality is absent or inverse; or increasing t-C is associated with low coronary and total mortality. Most strikingly, in most cohort studies of old people, high LDL-C or t-C does not predict coronary heart disease or all-cause mortality; in several of these studies the association between t-C and mortality was inverse, or high t-C was associated with longevity. These associations have mostly been considered as a minor aberration from the LDL-receptor hypothesis, although by far the highest mortality and the greatest part of all cardiovascular disease are seen in old people."
In case you're having trouble with the terminology, t-C just means total cholesterol, and LDL-C means LDL cholesterol. A couple of key points here... if cholesterol were the cause of heart disease, shouldn't it be a risk factor for everybody, regardless of age, sex, or ethnicity? It should. But it's not. While cholesterol levels can be somewhat predictive of heart disease risk in some groups, cholesterol doesn't cause the problem. The second half of the quote is key as well. Studies in older adults, like the ones above showing that those with high cholesterol have less risk of cardiovascular disease, should not be dismissed as a minor aberration. By far the highest-risk group is adults over 60, so if the conventional wisdom doesn't hold true for them, it doesn't hold true at all. These studies indicate that our current thinking about cholesterol is deeply flawed. Cholesterol simply cannot be the cause of heart disease when it is so notoriously unreliable as a predictor in the most at-risk populations. It's that simple.
Sunday, March 6, 2011
Dietary Fat and Breast Cancer, Part 2
Finally, on to part 2. Let's talk about a study I found through marksdailyapple.com claiming that a high-fat, high-cholesterol diet causes faster growth and proliferation of tumors than a "normal" diet. The study was done in rats: the control group received rat chow 5010, while the other group received a higher-fat "Western Diet 57BD." Looking at the macronutrient breakdown, there appears to be nothing wrong with either diet. The rat chow consisted of 29% protein, 13% fat, and 58% carbohydrates, while the Western diet contained 15% protein, 41% fat, and 44% carbohydrates. You can read the whole study and find nothing about the actual ingredients in the diets, but a quick Google search will yield the spec sheets. Here are the contents of the control group's rat chow:
- Ground corn
- Dehulled soybean meal
- Wheat middlings
- Fish meal
- Ground wheat
- Wheat germ
- Brewers dried yeast
- Ground oats
- Dehydrated alfalfa meal
- Porcine animal fat
- Ground soybean hulls
- Soybean oil
- Dried beet pulp
- Added vitamins and minerals
And here are the contents of the experimental group's Western Diet 57BD:
- Sucrose (31% by weight)
- Milk fat (21%)
- Casein (19%)
- Maltodextrin (10%)
- Powdered Cellulose (5%)
- Dextrin (5%)
- Added vitamins and minerals
Problem #2 - Their diet consisted of 31% sugar?! That's their main carbohydrate source: sugar. There's a lot of research showing a link between insulin and cancer growth, and sugar requires a hefty insulin release. Here, here, and here. And this study of mammary tumor growth showed that mice fed sucrose had 100% tumor incidence, meaning every single mouse developed breast cancer.
Problem #3 - Casein. Casein, a protein derived from milk, was used in the rat studies conducted by T. Colin Campbell, who wrote The China Study. He found that a diet of just 5% casein promotes tumor growth in rats. The diet in this experiment had 19% casein.
So how can the researchers claim that a high-fat, high-cholesterol diet drives cancer cell growth? If they really wanted to test their hypothesis, they would have done their best to keep as many variables as possible unchanged between the two groups. They wouldn't have fed the experimental group a laundry list of chemically isolated compounds. So in my mind there are two possible reasons why the researchers would have used a diet full of known cancer-promoting non-food items. One, they're stupid. And I refuse to believe they're stupid, because they wouldn't be where they are today without knowing the basics of science. The second option, which I favor, is that the researchers' goal from the beginning was to confirm their hypothesis. So they fed the rats a diet which, while higher in fat, was also higher in substances that would be sure to cause tumor growth. I don't know why they would have done this; perhaps there was some influence from the drug or food industries, or maybe the researchers had giant egos and were so convinced that high-fat diets promote cancer that they didn't want to risk being wrong. The only thing I know for sure is that this is really bad science.
Saturday, March 5, 2011
Side Note on Sunlight and Vitamin D
Perhaps I should have elaborated more on the vitamin D and cancer connection. I doubt anyone will actually watch that one-hour vitamin D lecture I posted a link to, and I know everyone's been told for years to stay out of the sun because it'll give you skin cancer. So simply telling you that sun exposure reduces cancer rates probably isn't convincing. Allow me to explain myself.
First off, how serious is the skin cancer problem? One look at the cancer statistics from 2010 shows that non-melanoma skin cancer is surprisingly benign. In 2010, fewer than 1,000 people in the United States died from it. That's about 0.0003% of the population, and less than 0.2% of total cancer deaths, according to the National Cancer Institute's statistics. There are at least 12 other types of cancer that are more deadly. Perhaps the problem has been a bit overstated.
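As a sanity check on those percentages, here's the back-of-the-envelope arithmetic. The death count and denominators are rough round-number estimates I'm assuming for illustration (the post cites "less than 1,000" deaths; the 2010 U.S. population and annual cancer deaths are approximated), not exact official figures.

```python
# Rough inputs (assumptions, not exact values)
nonmelanoma_deaths = 1_000        # "fewer than 1,000" U.S. deaths in 2010
us_population = 308_000_000       # 2010 U.S. population, approx.
total_cancer_deaths = 569_000     # total U.S. cancer deaths in 2010, approx.

pct_of_population = nonmelanoma_deaths / us_population * 100
pct_of_cancer_deaths = nonmelanoma_deaths / total_cancer_deaths * 100

print(f"{pct_of_population:.4f}% of the population")    # ~0.0003%
print(f"{pct_of_cancer_deaths:.2f}% of cancer deaths")  # ~0.18%, under 0.2%
```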
But on to the task at hand. Let me begin with these two maps... the one on the left shows the amount of UVB radiation received across the nation, and on the right is a map of breast cancer prevalence. There are maps like this for all types of cancer, and they all show similar patterns.
They're strikingly similar. Areas with more UVB exposure, the type of sunlight that your skin uses to make vitamin D, tend to have lower cancer rates. Here's another chart, this one's pretty old, but still very relevant...
Cities that see more solar radiation have lower rates of breast cancer. But as I explained in my last post, correlations like these don't prove causality. That's where the new research on vitamin D comes into play. High blood levels of vitamin D, measured as 25(OH)D, are clearly associated with a lower risk of cancer. According to this study, supplemental vitamin D combined with sun exposure is enough to raise blood levels to 52 ng/ml, a level associated with a 50% reduction in the incidence of breast cancer. Another study, this time a controlled clinical trial, tested the impact of vitamin D and calcium supplementation on cancer rates in postmenopausal women. The group receiving both vitamin D and calcium, as opposed to calcium alone or a placebo, showed a "60% or greater reduction in all forms of cancer." Very significant.
Let's put this all together. Climates receiving more UVB exposure from the sun, the type that forms vitamin D in our bodies, are associated with lower cancer rates. High vitamin D blood levels are associated with lower cancer rates. Vitamin D supplementation significantly lowers the risk of developing cancer. Add to this the fact that it makes evolutionary sense that the sun would be beneficial for us: When humans migrated further and further from the equator, their skin became lighter and lighter, becoming more efficient at making vitamin D through limited sun exposure. And while we're at it, throw in some common sense too: Why would the very thing that gives us life on earth, the sun, kill us at the same time? Connect the dots, and it looks like sun exposure probably prevents cancer more than promotes it. At least that's my interpretation of the evidence.
Now I'm not saying you should go out in the sun and tan all day long to the point where you burn. That's not good for anybody. However, it is completely unnatural to avoid the sun altogether. There's nothing wrong with moderate sun exposure. And take it easy on the sunscreen. Sunscreen prevents UVB absorption, meaning you won't get burned but you also won't make vitamin D. If you'll be in the sun all day, at least hold off on the sunscreen until you've had a chance to get the benefits of sun exposure.
Could it be that the conventional medical advice to stay out of the sun has actually caused more cancer than it has prevented?? Let that marinate for a while.
Thursday, March 3, 2011
Dietary Fat and Breast Cancer, Part 1
Ah, dietary fat. So delicious, yet so frowned upon. If you look hard enough, you can find dietary fat associated with just about anything out there, from heart disease to diabetes to colon cancer, even ingrown toenails (not really). But the breast cancer link is something I've heard about for years in the media, and from talking to people who watch too much local news. A few weeks ago, in one of my classes, my professor showed us a graph of the relationship between dietary fat intake and prevalence of breast cancer by country. Here it is... you'll probably have to click on it to view the full-size version.
I found this one through a Google search. Interestingly, the two points that don't seem to fit in line with the others, South Africa and Israel, were left out of the graph we saw in class; I guess my professor gave them the Ancel Keys treatment... but that's not the point here. At first glance, this looks like a pretty strong relationship: the more fat a nation eats, the higher its rate of death from breast cancer. And even though this is just a correlation, meaning no cause and effect can be determined, it certainly piqued my interest. I'm always very skeptical any time a study blames fat for anything; I find it impossible not to be after reading Good Calories, Bad Calories. So I began to wonder: what else could possibly account for this correlation? And that's when it clicked. Sunlight.
If you look closely at the graph, the countries that consume less fat and have lower mortality from breast cancer tend to be in warmer climates, while the ones that consume more fat and have higher mortality are in colder climates. Given that vitamin D (made in the skin from sunshine) seems to have strong anti-cancer properties, it makes sense that the countries getting more sun would have lower breast cancer mortality. If you're interested, watch Dr. Mercola's lecture on vitamin D; I don't want to get into the details here. So I decided to plot this information in an Excel graph. I estimated each country's degrees of latitude from the equator and plotted that against breast cancer mortality rates. Here's what I came up with...
That's what is called a confounding factor. Not quite as neat a correlation as the dietary fat graph, but there's clearly an association. So what does all this mean? Does it mean vitamin D is the problem, not dietary fat? I'd put my money on vitamin D deficiency playing a larger role in breast cancer deaths than dietary fat intake. Actually, I'd argue against the very notion of lumping all types of fat into one group like this; it doesn't make sense... but that's a topic for another day. From an objective standpoint, you can't infer much of anything from this data. In observational studies like these, there are so many uncontrolled factors that you can't determine cause and effect at all. Here are a few other confounding factors that could be at play...
- People who eat less fat tend to eat more vegetables.
- People in warmer climates are able to grow vegetables for a longer part of the year, so are more likely to eat them.
- People in cooler climates tend to eat more refined grains.
- People in warmer climates are more likely to get more physical activity.
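If you'd like to redo the latitude-vs-mortality exercise described above in Python instead of Excel, here's a minimal sketch. The latitude and mortality numbers below are hypothetical placeholders, not the values from my graph; the point is just how you'd check the strength of such an association.

```python
# Hypothetical data: degrees of latitude from the equator, and breast
# cancer deaths per 100,000 (placeholder values for illustration only).
latitudes = [5, 10, 20, 30, 35, 40, 45, 50, 55, 60]
mortality = [4, 6, 8, 12, 14, 17, 18, 22, 24, 27]

def pearson_r(xs, ys):
    """Pearson correlation coefficient, computed from first principles."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

# A value near +1 means mortality rises almost linearly with latitude.
print(round(pearson_r(latitudes, mortality), 3))
```

A scatter plot of the same two lists (e.g. with matplotlib) would reproduce the kind of graph I made in Excel. And remember: even a correlation near 1 here would still be just a correlation.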
What these observational studies are actually good for, however, is to form hypotheses that can be tested in more controlled trials. It's been done in rats, and quite poorly I might add... we'll discuss that in part 2 of this series.
Wednesday, March 2, 2011
Robb Wolf and Art Devany on ABC Nightline!
Paleo has gone mainstream! Sort of. I think it's great that these guys were able to do a story on a huge mainstream media outlet like Nightline; that is fantastic for getting the word out. They're going a bit overboard with the "caveman" stuff, making it seem like sort of a novelty or a fad, which it is NOT! But any publicity is good publicity, I suppose. At least it's better than the kind Charlie Sheen is getting these days. Check out the Nightline story here!
Tuesday, March 1, 2011
The $#!T We Used To Know
People think I'm crazy. When I say things like "grains aren't healthy" or "saturated fat might even be good for you," people tend to ignore me; after all, everyone knows grains should be the foundation of a healthy diet and saturated fat causes heart disease. Today, I'd like to bring up a few historic examples of the ridiculous "facts" we used to know. Brace yourself.
1. Doctors don't need to wash their hands. Before the 1920's, it was not common practice for doctors to wash their hands with soap before working with a patient, or even between surgeries. This was despite the fact that a Hungarian physician, Ignaz Semmelweis, had demonstrated back in the 1840s that handwashing was an effective means of preventing the spread of disease. Could you imagine a doctor performing a surgical procedure on you right after he had his hands all up inside another patient? It used to happen.
2. Frontal lobotomy for mental illness. In the mid-1900's, a procedure called the lobotomy was used in an attempt to cure mental illness. In fact, Antonio Egas Moniz received the Nobel Prize in Physiology or Medicine in 1949 for his discovery of the procedure's supposed therapeutic value in treating certain conditions. So what exactly is a frontal lobotomy? In the infamous transorbital version, doctors drove an ice-pick-like instrument through the top of the eye socket into the prefrontal cortex of the brain. Yes, they literally stabbed patients in the head. The procedure may have been somewhat effective in reducing symptoms of mental illness, but it predictably had side effects. As many as 3% of patients were killed by the procedure, according to the 1970 Psychiatric Dictionary. Beyond that, it essentially made patients easier to control by reducing their mental capacity. Good for nurses in the psychiatric ward, bad for humanity in general.
3. Exercise ruins your bones. This myth was commonplace until the 1950's. The conventional wisdom of the day was that resistance training wears down your muscles and bones and should be avoided, especially by older adults at risk of degeneration and osteoporosis. Today, this sounds ridiculous; personally, I don't know how anyone ever believed it. Looking at it from a historical perspective (always useful), people knew lifting heavy things was good for you as far back as Ancient Greece. I'm not sure how the message got lost along the way, but at least that myth is dead now.
4. The sun revolves around the earth. Yup. Before the early 1600's, everyone knew Earth was the center of the universe. That is, until Galileo Galilei came along with his telescope and showed that the earth was in fact orbiting the sun, not vice versa. The Roman Inquisition, of course, didn't like that, because it challenged the Catholic Church's view that God created the earth and made humans to run the place, since humans are above the laws of nature, controlling it by growing their own food, inventing new foods like bread, and coercing formerly wild animals to live beside them only to be killed and eaten. That may have been a run-on sentence. But the point is, we now know that the sun is the center of our solar system, and Galileo was right all along. Take that, Pope.
5. Cigarettes are endorsed by physicians. This one might not be too hard to believe, considering all the drugs physicians push these days. But it's true: doctors used to endorse cigarettes for "throat protection against irritation against cough." Grammatical considerations aside, it took a long time for the tobacco companies to finally fall to the mountain of data linking smoking to lung cancer. Today, it's widely accepted that cigarette smoking does cause lung cancer, and we can all laugh about these old ads.
I'm sure there are plenty more examples of this type of thing, but my point with this post is to say that people today tend to think we know everything. These examples seem silly, knowing what we know today, but if you were able to take yourself back in time, you would probably accept the conventional wisdom just as everyone else did. So, next time you hear something that challenges your most precious beliefs about health and nutrition, or anything else for that matter, don't simply dismiss it because "everyone knows the truth." The conventional wisdom of the time is often far from the truth.