Bad Science – by Ben Goldacre

“You cannot reason people out of positions they didn’t reason themselves into.”


A Control Experiment:

In this experiment you compare the results of two different situations, where one is the experimental condition and the other is the control condition, and the only difference between them is the thing you’re interested in testing.

An experiment is one way of determining whether an observable effect is related to a given process.
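The logic of a controlled comparison can be sketched in a few lines. The numbers below are entirely hypothetical (a made-up illness and a made-up two-day treatment effect); the point is only that when the two groups differ in exactly one thing, the gap between their outcomes isolates that thing.

```python
import random

random.seed(0)

# Hypothetical model: recovery takes ~10 days naturally, and the
# (assumed) treatment shaves off 2 days. Everything else is shared.
def recovery_days(treated: bool) -> float:
    base = random.gauss(10.0, 2.0)           # natural course of the illness
    return base - (2.0 if treated else 0.0)  # assumed treatment effect

control = [recovery_days(False) for _ in range(1_000)]
experimental = [recovery_days(True) for _ in range(1_000)]

mean_control = sum(control) / len(control)
mean_experimental = sum(experimental) / len(experimental)

# The difference between the groups isolates the treatment, because
# the treatment is the only thing that differed between conditions.
print(f"control:      {mean_control:.1f} days")
print(f"experimental: {mean_experimental:.1f} days")
```

Without the control group, the ~10 days of natural recovery would be indistinguishable from the treatment’s effect.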

Seductive Details:

Something about seeing science information may encourage people to believe they have received a scientific explanation when they have not. People tend to rate longer explanations as being more similar to “experts’ explanations.” There is also the “seductive details” effect: if you present related (but logically irrelevant) details to people as part of an argument, this seems to make it more difficult for them to encode, and later recall, the main argument of a text, because their attention is diverted.

Plato’s Noble Lie:

A noble lie is a myth or untruth that ultimately serves the greater good. Bad Science cites the example of “Brain Gym” making up pseudo-scientific information that ultimately gets children to take breaks and drink water. The scientific reasoning is false; however, the result is that children take breaks and drink water, which is a good thing.

The “Proprietorialization” of Common Sense:

Scientific knowledge is free and in the public domain. Anyone can use it, understand it, sell it, or simply give it away. If you want to make money out of it, you have to make a space for yourself in the market, and to do this you must overcomplicate it and attach your dubious stamp. Tim Ferriss calls this “complicating to profit.” The process of professionalizing the obvious fosters an unnecessary and destructive sense of mystery around science and health advice. We are fostering our dependence on expensive outside systems and people.

Regression to the Mean:

All things have a natural cycle. Let’s say you have back pain. It comes and goes. You have good days and bad days, good weeks and bad weeks. When it’s at its very worst, it’s going to get better, because that’s the way things are with your back pain.

Similarly, many illnesses have what is called a natural history: they are bad, and then they get better. As Voltaire said, “The art of medicine consists of amusing the patient while nature cures the disease.” Let’s say you have a cold. It’s going to get better after a few days, but at the moment you feel miserable. It’s quite natural that when your symptoms are at their worst, you will do things to try to get better. You might take a homeopathic remedy. You might sacrifice a goat and dangle its entrails around your neck. You might bully your physician into giving you antibiotics. Then, when you get better – as you surely will from a cold – you will naturally assume that whatever you did when your symptoms were at their worst must be the reason for your recovery.
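Regression to the mean can be demonstrated with pure noise. The simulation below (illustrative numbers of my own, not from the book) generates back-pain scores that fluctuate randomly around a stable average, then looks only at what happens the day after the worst days, the days on which you would reach for a remedy.

```python
import random

random.seed(42)

# Daily back-pain scores as pure noise around a stable mean of 5:
# no treatment, no trend -- just natural fluctuation.
days = [random.gauss(5.0, 2.0) for _ in range(10_000)]

# "Treat" only on the worst days (pain above 8), then check the next day.
worst = [i for i in range(len(days) - 1) if days[i] > 8.0]
pain_on_worst = sum(days[i] for i in worst) / len(worst)
pain_next_day = sum(days[i + 1] for i in worst) / len(worst)

print(f"pain on the days you'd seek a remedy: {pain_on_worst:.2f}")
print(f"pain the following day:               {pain_next_day:.2f}")
# The next-day average falls back toward the overall mean of 5,
# so any remedy taken on a bad day looks as if it worked.
```

No treatment exists anywhere in this model, yet “pain the day after treatment” is reliably lower than “pain on the day of treatment,” purely because the worst days were selected for.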

A Need for More Research:

Clinicians, pundits, and researchers all like to say things like “there is a need for more research,” because it sounds forward-thinking and open-minded. In fact, that’s not always the case, and it’s a little-known fact that this very phrase has been effectively banned from the British Medical Journal for many years, on the grounds that it adds nothing; you may say what research is missing, on whom, measuring what, and why you want to do it, but the hand-waving, superficially open-minded call for “more research” is meaningless and unhelpful.

On Bullshit:

The philosopher Harry Frankfurt of Princeton University discusses this issue at length in his classic 1986 essay “On Bullshit.” Under his model, “bullshit” is a form of falsehood distinct from lying: the liar knows and cares about the truth but deliberately sets out to mislead; the truth-speaker knows the truth and is trying to give it to us; the bullshitter, meanwhile, does not care about the truth and is simply trying to impress us: “It is impossible for someone to lie unless he thinks he knows the truth. Producing bullshit requires no such conviction… When an honest man speaks, he says only what he believes to be true; and for the liar, it is correspondingly indispensable that he considers his statements to be false. For the bullshitter, however, all these bets are off: he is neither on the side of the true nor on the side of the false. His eye is not on the facts at all, as the eyes of the honest man and of the liar are, except insofar as they may be pertinent to his interest in getting away with what he says. He does not care whether the things he says describe reality correctly. He just picks them out, or makes them up, to suit his purpose.”

Cherry-picking:

There are few opinions so absurd that you couldn’t find at least one person with a Ph.D. somewhere in the world to endorse them for you; and similarly, there are few propositions in medicine so ridiculous that you couldn’t conjure up some kind of published experimental evidence somewhere to support them, if you didn’t mind it being a tenuous relationship and cherry-picked the literature, quoting only the studies that were in your favor.

Dr. Benjamin Spock:

It’s a chilling thought that when we think we are doing good, we may actually be doing harm, but it is one we must always be alive to, even in the most innocuous situations. The pediatrician Dr. Benjamin Spock wrote a record-breaking bestseller titled Baby and Child Care, first published in 1946, that was hugely influential and largely sensible. In it, he confidently recommended that babies should sleep on their tummies. Dr. Spock had little to go on; but we now know that this advice is wrong, and the apparently trivial suggestion contained in his book, which was so widely read and followed, has led to thousands, and perhaps even tens of thousands, of avoidable crib deaths. The more people are listening to you, the greater the effects of a small error can be.

Why Clever People Believe Stupid Things: 

We need science because our intuition is faulty. There are basic mistakes that humans make when seeking the truth:

  1. We see patterns where there is only random noise.
  2. We see causal relationships where there are none.
  3. We overvalue confirmatory information for any given hypothesis.
  4. We seek out confirmatory information for any given hypothesis.
  5. Our assessment of the quality of new evidence is biased by our previous beliefs.

Bias:

There are many other well-researched areas of bias. We have a disproportionately high opinion of ourselves, which is nice. A large majority of the public think they are more fair-minded, less prejudiced, more intelligent, and more skilled at driving than the average person, when, of course, only half of us can be better than the median. Most people exhibit something called attributional bias: we believe our successes are due to our own internal faculties and our failures are due to external factors, whereas for others, we believe their successes are due to luck and their failures are their own flaws. We can’t all be right.

Statistics:

Let’s say the risk of having a heart attack in your fifties is 50 percent higher if you have high cholesterol. That sounds pretty bad. Let’s say the extra risk of having a heart attack if you have high cholesterol is 2 percent. That sounds OK to me. But they are the same figures. Out of one hundred men in their fifties with normal cholesterol, four will be expected to have a heart attack, whereas out of one hundred men with high cholesterol, six will be expected to have a heart attack. That’s two extra heart attacks. Expressed as counts per hundred, these are called natural frequencies.

This can be described in three ways: a 50 percent increase in risk (the “relative risk increase”), a 2 percent increased risk (the “absolute risk increase”), or an extra two heart attacks for every one hundred men (the natural frequency).
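The three framings above are simple arithmetic on the same two numbers. A quick sketch using the cholesterol figures from the text:

```python
# Same underlying figures from the text: 4 vs 6 heart attacks per 100 men.
baseline = 4 / 100    # risk with normal cholesterol
with_high = 6 / 100   # risk with high cholesterol

relative_risk_increase = (with_high - baseline) / baseline  # -> "50% higher"
absolute_risk_increase = with_high - baseline               # -> "2% extra risk"
extra_per_hundred = (with_high - baseline) * 100            # -> natural frequency

print(f"relative risk increase:          {relative_risk_increase:.0%}")
print(f"absolute risk increase:          {absolute_risk_increase:.0%}")
print(f"extra heart attacks per 100 men: {extra_per_hundred:.0f}")
```

The relative figure divides by the small baseline, which is why it sounds so much scarier than the absolute figure, even though both come from the same two counts.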

MMR:

Mike Fitzpatrick, a physician with a son who has autism, says that there are two questions on the subject (of MMR) that will make him want to slap you. One is: “Do you think it was caused by MMR?” The other is: “Does he have any special skills?”

If there is one thing that has adversely affected communication among scientists, journalists, and the public, it is the fact that science journalists simply do not cover major science news stories.

Things I Learned (one-liners):

Homeopathy is bullshit. It performs no better than a placebo.

The process of obtaining and interpreting evidence isn’t taught in schools, nor are the basics of evidence-based medicine and epidemiology, yet these are obviously the scientific issues that are most on people’s minds.

Does the rooster’s crow cause the sun to rise? No. Does this light switch make the room get brighter? Yes. Things can happen at roughly the same time, but that is weak, circumstantial evidence for causation.

Quotes:

“The true cost of something is what you give up to get it.”

There is nothing new under the sun.

“If I had a T-shirt slogan for this whole book, it would be: ‘I think you’ll find it’s a bit more complicated than that.'”

“The idea is to try and give all the information to help others to judge the value of your contribution, not just the information that leads to judgment in one particular direction or the other.”   – Richard P. Feynman

“The plural of anecdote is not data.”

“We all fall for reductionist explanations about the world.”

“Fancy cosmetics and other forms of quackery can be viewed as a special, self-administered, voluntary tax on people who do not understand science.”

New Words:

Subterfuge (noun) – deceit used in order to achieve one’s goal.

Hygroscopic (adjective) – tending to absorb water from the air.

Hydrolysis (noun) – the chemical breakdown of a compound due to reaction with water.

Teetotal (adjective) – choosing or characterized by abstinence from alcohol.

Heuristic (noun) – an approach to problem-solving or self-discovery that employs a practical method, not guaranteed to be optimal, perfect, logical, or rational, but sufficient for reaching an immediate goal. Where finding an optimal solution is impossible or impractical, heuristic methods can speed up the process of finding a satisfactory solution. Heuristics are mental shortcuts that ease the cognitive load of making a decision. Examples include a rule of thumb, an educated guess, an intuitive judgment, a guesstimate, profiling, or common sense.