Good News/Bad News

You may have heard about a woman who recently decided to try living on solar power, tea, and water, instead of food. Good news: She’s stopped. Bad news: She lost about 20% of her weight in the experiment, and she’s rationalizing. One sentence in particular got me annoyed:

“I was just asking a question, but there was just so much negative response that that means the question can’t even be asked,” she said.

This is the sort of tepid excuse I’ve come to expect from a lot of woos regarding scientific questions. They can’t handle adversity like adults, and science is pretty adversarial. You don’t just dismiss criticism for being “negative.” You deal with criticism as rationally as you can, using logic and evidence to answer it. That’s part of what it means to ask controversial questions: criticism is to be expected. Brainstorming without criticism is good for generating new ideas, but sooner or later, you need to sort the good ideas from the bad, and that typically means having a two-way conversation with critics. When you speak, you are not entitled to uniform cheering.

A lot of the time, people with the consensus view react negatively because the idea in question has already been tested and failed, or is implausible for well-established reasons. And with ideas like hers, the most likely outcome was that she’d harm herself. I care about people, even when they do stupid things. I criticize precisely because I care and want to dissuade them from harmful action.

Here’s the kicker: If you don’t have answers to criticism, question the value of your idea. You might be the one who’s wrong.

I’m glad she stopped, though I’d prefer if she did so for rational reasons.

Anti-Doggerel #1: “I Don’t Know”

The beginning of wisdom is, ‘I do not know.’ [gestures toward the “hole in space” on the viewscreen] I do not know what that is.

– Lt. Cmdr. Data, “Where Silence Has Lease”

“I don’t know” is a phrase that probably should get more mileage. When used appropriately, it’s humble. It’s honest. It’s open. The universe is a big place with lots of tiny details, so it shouldn’t be a surprise that there is still much that we don’t know as a species, let alone as individuals. Being aware of that ignorance inspires both caution and curiosity, virtues of science. We can devise hypotheses to explain unknown phenomena, but a scientifically minded person doesn’t jump to the conclusion that his hypothesis is true without carefully testing it. Sincerely admitting ignorance typically means being open to entertaining new ideas as well. Those ideas still have to be tested before they’re accepted as knowledge, of course.

There’s an annoying idea I’ve encountered with various pseudoscience trolls, quacks, and especially with Creationists. They treat any admission of ignorance from their opponents as a victory for their ideas. It doesn’t work that way. For an example, let’s say a particular type of cancer has no known effective treatment. Just because the scientific community doesn’t have an answer doesn’t mean that we should accept a quack’s answer, especially if that answer wasn’t informed by scientific research into its plausibility.

Science is cautious by nature. The world is a complicated place, and there’s always the possibility of discovering new nuances and exceptions to the rules we’re familiar with. We can’t have absolute certainty in what we do know because of our human limitations. The language of scientists typically reflects this, since they will mention nuances, limitations, exceptions, and uncertainty from simple probability.

Pseudoscience doesn’t like humility or measured confidence, often characterizing it as “weak” language. Statements of absolute certainty and absolute rules are much more marketing-friendly and easier to fit into a slogan. Religion is quite aware of this and sets up gods and holy books as absolute authorities through circular reasoning. Quacks and pseudoscientists often follow suit, enshrining their gurus, particularly the original creator and his texts. In either case, they often implicitly or explicitly claim to have all the answers in a convenient package. This tends to lead to stagnation. The scientific community knows that it doesn’t know everything. If it did, science would stop.

I think treating “I don’t know” as a concession taps into an unhealthy obsession with completeness and perfection that overrides the healthy desire to know the truth. One problem with many religious, supernatural, and pseudoscientific ideas is that they can explain anything. If you’re sympathetic to those sorts of hypotheses, that’s not a strength. If an idea can explain anything, that’s actually a big problem: It can explain things that don’t exist just as readily as things that do. It can explain failure and success equally well, which means we can’t use it to make predictions, verify its accuracy, or inform decisions. It’s “heads I win, tails you lose.” Such ideas are essentially a way to deceive yourself with the comforting illusion of understanding, without the practical benefits of real understanding.

There’s another idea that any answer is better than none. This is simply not true. Actions based on an incorrect idea can be more harmful than inaction. They can waste resources better spent elsewhere. I can understand desperation in the face of death and the desire to go down fighting, but that doesn’t mean I should rhetorically support those who exploit desperate people just because I don’t know the true answer.

Idolatry

I’ve been slowly making my way through Married to the Sea’s archives. It’s a silly comic most of the time, but sometimes it gets political. This one reminded me of a point I like to make: it depicts the 10 Commandments as a graven image that gets pushed into schools and courthouses. On top of violating freedom from government imposition of religion, showing undue favoritism, and all that, it makes the point that the 10 Commandments monument is essentially an idol.


Why I Don’t Trust Burzynski

If anyone’s suppressing Burzynski’s research, it’s Burzynski. The only motivations I can think of are greed and delusion because he’s not doing what an honest, altruistic scientist would do.

An honest scientist wouldn’t have any reason to delay publication of positive results for decades. If he got negative results, he would have moved on to more fruitful research long ago. If he suspected there was some flaw in the study, he would turn it over to peer review so others could engage in constructive criticism and he could start over, avoiding those mistakes. Only a delusional scientist would keep testing after negative results, use cherry-picked anecdotes to falsely bolster his confidence and recruit test subjects, and avoid scientific scrutiny.

An honest scientist wouldn’t charge patients ridiculous amounts to participate, because that would make the selection non-random, biasing the results and negating the study’s value. Patients who invest a lot of money into a treatment are also emotionally invested in interpreting their situation in a positive light. Statistical analysis is how we remove our rose-colored glasses.

An altruistic scientist would instead pay for the study by asking for research grants and donations. He wouldn’t give out false promises of results, only that his treatment be given a chance to live up to his hopes. A con artist, however, would seek to make a profit by overcharging desperate patients for drugs that can be bought more cheaply. He would encourage people to spread cherry-picked testimonials to convince laypeople who don’t understand science. He wouldn’t publish his statistics and research methodology because that would allow scientists to discover the scam.

An honest scientist wouldn’t take a long time to publish a study unless what he’s studying really and truly takes decades to gather the data and make conclusions from them. In the case of a long term cancer treatment study, he’d at least publish preliminary results of what happens in the first few years and then continue following the patients for longer increments. That way, if the initial results are promising, other scientists can try to replicate them, and not have to wait for decades.

An altruistic scientist wouldn’t keep his research to himself. Science today depends on a culture of altruism. Scientists are expected to share information relatively freely. Science thrives with transparency and cooperation because new research depends on the reliability of existing knowledge. The era of the lone genius toiling in isolation is long dead because we’ve got good reason to think we’ve figured out all the “obvious” stuff. New research is about the fine details and nuances or the rare and exotic. A scientist who wants to find something new needs to know what others have already found out. Keeping your research secret from the world is downright Randian, because it depends on authoritarianism and the blind trust of consumers, instead of informed consent.

If Burzynski is allowed to continue his scam, that sets a precedent for big pharmaceutical companies to do the same.

Bruce Lipton, Nut

In an earlier post asking for targets for me to scrutinize, Yakaru suggested I look over Bruce Lipton. I was a bit slow in jumping into the topic, so Yakaru made a post of his own.

I’ll still make my contribution. Skimming over Lipton’s website, I’m drawn to a book excerpt he posted: “The Nature of Dis-ease.” The title alone is an ideological red flag. I’ve seen a lot of altie gurus use it in the past, trying for some kind of clever wordplay. I tend to see another aspect to this word splitting: I think it promotes the idea that health is the default state of being. It’s a meme that’s going to need more and more critical scrutiny because, as medicine has advanced, we’ve been able to lead healthier lives. It’s because of modern science-based medicine and our society’s infrastructure that we generally take health for granted and see disease as an aberration instead of an integral part of life.

On to the article itself:


Newage Culture: An Overview

I’ve discovered Yakaru has a blog! It’s got a good post on the front page right now: Blaming the Victim: Comments on Louise Hay. It deals with one of the familiar tropes in both religion and newage that serves to protect the higher-ups when they can’t make good on a promise. It’s given me some motivation to share my extended thoughts on newage culture.
