Well, it sounds right
More about the stark contrast between what we are sure is true and what the facts indicate.
Medical misinterpretation
A big area for misinformation is medical studies. Some educators are calling for changes to medical school curricula to ensure that future doctors can interpret statistics and studies. Doctors who depend on the drug rep for their education after medical school, as many do, often don't understand study results well enough to explain them to their patients or to make informed decisions that will help those patients.
Surviving prostate cancer
Former New York Mayor Rudy Giuliani announced a while back that he was a five-year survivor of prostate cancer. He explained how fortunate he was to live in New York, where 82% of diagnosed prostate cancer patients survive five years, and he contrasted that with Great Britain, where only 44% of such cases survive five years.
What he didn't understand, or at least didn't mention, is that comparable cases lived about as long in both places. In New York, vigorous screening identified many low-grade cancers that would never become dangerous. These inflated New York's five-year survival numbers; in Britain, the same cancers simply went undetected and uncounted.
High-grade lesions were also identified earlier in their course, so more time passed between diagnosing the cancer and the patient dying. The course of the disease was the same, but measuring from an earlier point in its progress also inflated the five-year survival number. Epidemiologists call this lead-time bias.
Patients with each type of tumor were given similar treatments and survived five years at about the same rate in both countries. In New York, prostate cancers were simply identified earlier, which didn't lead to cures, only to longer intervals between diagnosis and death.
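Lead-time bias is easy to see in a toy simulation. The sketch below uses entirely made-up numbers, not real prostate cancer data: every simulated patient dies at the same point in the disease no matter what, and only the diagnosis date moves.

```python
import random

random.seed(1)

# Hypothetical cohort: each patient dies a random 4 to 10 years after the
# tumor starts, regardless of when the tumor is found. Made-up numbers.
N = 10_000
years_onset_to_death = [random.uniform(4, 10) for _ in range(N)]

def five_year_survival(years_onset_to_diagnosis):
    """Fraction of patients still alive five years after diagnosis,
    given that diagnosis happens that many years into the disease."""
    alive = sum(1 for death in years_onset_to_death
                if death - years_onset_to_diagnosis >= 5)
    return alive / N

# Same patients, same deaths; only the diagnosis date differs.
print(f"diagnosed late (year 3 of disease): {five_year_survival(3.0):.0%}")
print(f"screened early (year 1 of disease): {five_year_survival(1.0):.0%}")
```

Five-year survival roughly doubles (from about 33% to about 67%) even though every simulated patient dies on exactly the same date either way. That is the effect the New York numbers were measuring.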
Is it safer to
be diagnosed with prostate cancer in New York than in Great Britain? Nope.
Breast cancer: figures don't lie, but liars distort figures
Sometimes
misleading information is deliberately published, such as when organizations
seek funds.
The Susan G. Komen Foundation advertised (and may still, in some areas) that “The 5-year survival for breast cancer when caught early is 98%. When it’s not? 23%.”
According to
the British Medical Journal:
Screening
mammography does not guarantee that a woman will survive breast cancer. The
best evidence indicates that it decreases the chance that a 50 year old woman
will die from breast cancer in the next 10 years roughly from 0.53% to 0.46%--a
difference of 0.07 percentage points. Because breast cancer treatments are much
more effective now than when trials of screening were done, some experts
question whether screening mammography has any benefit.
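Turning those percentages into counts of women makes the size of the benefit easier to judge. The arithmetic below uses only the two figures quoted above; the per-10,000 framing is ours.

```python
# Ten-year risk that a 50-year-old woman dies of breast cancer,
# using the two figures quoted from the BMJ above.
risk_without_screening = 0.0053   # 0.53%
risk_with_screening = 0.0046      # 0.46%

absolute_reduction = risk_without_screening - risk_with_screening

print(f"deaths averted per 10,000 women screened: "
      f"{absolute_reduction * 10_000:.0f}")          # about 7
print(f"women screened a decade per death averted: "
      f"{1 / absolute_reduction:.0f}")               # about 1,429
```

Put that way, roughly 1,400 women must be screened for ten years to avert one breast cancer death, which sounds very different from "98% versus 23%."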
And further:
The five year
survival for early and late stage cancers tells you nothing about the benefit
of screening. Because of biases caused by lead time (the time from diagnosis by
screening to when a tumour can be felt) and overdiagnosis, the five year
survival can improve regardless of whether cancer mortality is increased,
decreased, or unchanged by the screening.
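The overdiagnosis bias in that passage can be shown with made-up counts. Every number below is invented; the point is only that adding harmless tumors to the diagnosed pool raises five-year survival without preventing a single death.

```python
# Invented counts, for illustration only.
lethal = 1_000      # diagnosed cancers that kill within five years
nonlethal = 1_000   # diagnosed cancers whose patients live five-plus years

# Without screening, only cancers that cause symptoms are counted.
survival_before = nonlethal / (lethal + nonlethal)

# Screening also finds 2,000 indolent tumors that would never have caused
# symptoms; all of those patients survive five years by definition.
overdiagnosed = 2_000
survival_after = (nonlethal + overdiagnosed) / (lethal + nonlethal + overdiagnosed)

print(f"five-year survival without screening: {survival_before:.0%}")  # 50%
print(f"five-year survival with screening:    {survival_after:.0%}")   # 75%
print(f"deaths, either way:                   {lethal}")
```

Survival jumps from 50% to 75% while mortality is untouched, which is exactly why the BMJ says five-year survival "tells you nothing about the benefit of screening."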
The article concluded: “. . . The Komen advertisement campaign failed to provide the facts. Worse, it undermined decision making by misusing statistics to generate false hope about the benefit of mammography screening. That kind of behavior is not very charitable. . .”
Doctors should know how to read statistics
A respected
medical columnist says, “Editors should enforce transparent reporting of
evidence, for the benefit of their readers and of healthcare in general.”
If it's bad, don't measure it
Another example of misuse was a study of a new statin drug. (Statins lower LDL cholesterol readings by blocking an enzyme the liver uses to make cholesterol.) The researcher noted that cardiac events were down, as many statin studies have shown. However, he also noted that the overall death rate was higher among the study participants, due to a higher rate of liver cancer.
That drug was withdrawn, and we can guess that the researcher had trouble finding funding from pharmaceutical companies after that. However, subsequent statin studies simply stopped measuring or reporting liver cancer or overall death rates.
The risk to the consumer didn't change; only the reporting of the risks changed.
Feinstein's gun death stats
This subject intersects with a study of gun deaths quoted by a California senator. Essentially, the claim is that gun deaths are higher where there are more guns, so reducing the number of guns will decrease violence. (Also see the previous post.)
Studies show the contrary, however. The key here is the leap from gun deaths to violence: fewer gun deaths are not related to less violence. Instead, more guns licensed for concealed carry markedly reduce violent crime.
In the study that Senator Feinstein quotes, removing the suicide deaths leaves the gun death rates about the same. (Suicide by car or by medication isn't examined.) Counting only gang violence and people killed by police gunfire reverses the rates: more gun deaths in low-ownership areas. That's plausible, since the most heavily regulated areas have dense populations and high crime. Prime examples are Chicago, Washington, D.C., Detroit, and most of Los Angeles.
Leaping from fact to fantasy
But the leap
from fact to fantasy serves Feinstein’s purpose, and it sells well in a press
conference.
Leaps from
fact to fantasy ran rampant during the last election. They’re still common in
some columns and blogs in Costa Mesa.
Think about whether it makes sense
Take the time to think about what is being said. It just may be that someone is using statistics as propaganda: using statistics to convince you by emotion instead of logic.
Just because
we wish there were a simple answer doesn't mean that there is one. Or that the
writer or speaker knows what it is.
What’s in it
for him? What does she want you to believe?