Good News/Bad News

You may have heard about a woman who recently decided to try living on solar power, tea, and water, instead of food. Good news: She’s stopped. Bad news: She lost about 20% of her weight in the experiment, and she’s rationalizing. One sentence in particular got me annoyed:

“I was just asking a question, but there was just so much negative response that that means the question can’t even be asked,” she said.

This is the sort of tepid excuse I’ve come to expect from a lot of woos regarding scientific questions. They can’t handle adversity like adults, and science is pretty adversarial. You don’t get to dismiss criticism just for being “negative.” You deal with criticism as rationally as you can, using logic and evidence to answer it. That’s part of what it means to ask controversial questions: criticism is to be expected. Brainstorming without criticism is good for generating new ideas, but sooner or later, you need to sort the good ideas from the bad, and that typically means having a two-way conversation with critics. When you speak, you are not entitled to uniform cheering.

A lot of the time, people with the consensus view react negatively because the idea in question has already been tested and failed, or is implausible for well-established reasons. And with ideas like hers, the most likely outcome was that she’d harm herself. I care about people, even when they do stupid things. I criticize precisely because I care and want to dissuade them from harmful action.

Here’s the kicker: If you don’t have answers to criticism, question the value of your idea. You might be the one who’s wrong.

I’m glad she stopped, though I’d prefer that she had done so for rational reasons.

Anti-Doggerel #1: “I Don’t Know”

The beginning of wisdom is, ‘I do not know.’ [gestures toward the “hole in space” on the viewscreen] I do not know what that is.

– Lt. Cmdr. Data, “Where Silence Has Lease”

“I don’t know” is a phrase that probably should get more mileage. When used appropriately, it’s humble. It’s honest. It’s open. The universe is a big place with lots of tiny details, so it shouldn’t be a surprise that there is still much that we don’t know as a species, let alone as individuals. Being aware of that ignorance inspires both caution and curiosity, virtues of science. We can devise hypotheses to explain unknown phenomena, but a scientifically minded person doesn’t jump to the conclusion that his hypothesis is true without carefully testing it. Sincerely admitting ignorance typically means being open to entertaining new ideas as well. Those ideas still have to be tested before they’re accepted as knowledge, of course.

There’s an annoying idea I’ve encountered with various pseudoscience trolls, quacks, and especially Creationists: they treat any admission of ignorance from their opponents as a victory for their own ideas. It doesn’t work that way. For example, suppose a particular type of cancer has no known effective treatment. Just because the scientific community doesn’t have an answer doesn’t mean we should accept a quack’s answer, especially if that answer wasn’t informed by scientific research into its plausibility.

Science is cautious by nature. The world is a complicated place, and there’s always the possibility of discovering new nuances and exceptions to the rules we’re familiar with. We can’t have absolute certainty in what we do know because of our human limitations. The language of scientists typically reflects this, since they will mention nuances, limitations, exceptions, and uncertainty from simple probability.

Pseudoscience doesn’t like humility or measured confidence, often characterizing it as “weak” language. Statements of absolute certainty and absolute rules are much more marketing-friendly and easier to fit into a slogan. Religion is quite aware of this and sets up gods and holy books as absolute authorities through circular reasoning. Quacks and pseudoscientists often follow suit, enshrining their gurus, particularly the original creator and his texts. In either case, they often implicitly or explicitly claim to have all the answers in a convenient package. This tends to lead to stagnation. The scientific community knows that it doesn’t know everything; if it did, science would stop.

I think treating “I don’t know” as a concession taps into an unhealthy obsession with completeness and perfection that overrides the healthy desire to know the truth. One problem with many religious, supernatural, and pseudoscientific ideas is that they can explain anything. If you’re sympathetic to those sorts of hypotheses, that’s not a strength. If an idea can explain anything, that’s actually a big problem: it can explain things that don’t exist just as readily as things that do, and it can explain failures and successes equally. That means we can’t use it to make predictions, verify its accuracy, or guide decisions. It’s “heads I win, tails you lose.” Such ideas are essentially a way to deceive yourself with the comforting illusion of understanding, without the practical benefits of real understanding.

There’s another idea that any answer is better than none. This is simply not true. Actions based on an incorrect idea can be more harmful than inaction, and they can waste resources better spent elsewhere. I can understand desperation in the face of death and the desire to go down fighting, but that doesn’t mean I should rhetorically support those who would exploit desperate people just because I don’t know the true answer.

Idolatry

I’ve been slowly making my way through Married to the Sea’s archives. It’s a silly comic most of the time, but sometimes it gets political. This one reminded me of a point I like to make: it portrays the 10 Commandments as a graven image that gets pushed into schools and courthouses. On top of violating freedom from government imposition of religion, showing undue favoritism, and all that, it makes the point that the 10 Commandments display is essentially an idol.


Why I Don’t Trust Burzynski

If anyone’s suppressing Burzynski’s research, it’s Burzynski. The only motivations I can think of are greed and delusion because he’s not doing what an honest, altruistic scientist would do.

An honest scientist wouldn’t have any reason to delay publication of positive results for decades. If he got negative results, he would have moved on to more fruitful research long ago. If he suspected there was some flaw in the study, he would turn it over to peer review so reviewers could offer constructive criticism and he could start over while avoiding those mistakes. Only a delusional scientist would keep testing after negative results, use cherry-picked anecdotes to falsely bolster his confidence and recruit test subjects, and avoid scientific scrutiny.

An honest scientist wouldn’t charge patients ridiculous amounts to participate, because that would make the selection non-random, biasing the results and negating the study’s value. Patients who invest a lot of money into a treatment are also emotionally invested in interpreting their situation in a positive light. Statistical analysis is how we remove our rose-colored glasses.

An altruistic scientist would instead pay for the study by seeking research grants and donations. He wouldn’t make false promises of results, only ask that his treatment be given a chance to live up to his hopes. A con artist, however, would seek to make a profit by overcharging desperate patients for drugs that can be bought more cheaply. He would encourage people to spread cherry-picked testimonials to convince laypeople who don’t understand science. He wouldn’t publish his statistics and research methodology, because that would allow scientists to discover the scam.

An honest scientist wouldn’t take a long time to publish a study unless what he’s studying really and truly takes decades to gather data on and draw conclusions from. In the case of a long-term cancer treatment study, he’d at least publish preliminary results covering the first few years and then continue following the patients in longer increments. That way, if the initial results are promising, other scientists can try to replicate them without having to wait decades.

An altruistic scientist wouldn’t keep his research to himself. Science today depends on a culture of altruism: scientists are expected to share information relatively freely. Science thrives on transparency and cooperation because new research depends on the reliability of existing knowledge. The era of the lone genius toiling in isolation is long dead, because we’ve got good reason to think we’ve figured out all the “obvious” stuff; new research is about the fine details and nuances, or the rare and exotic. A scientist who wants to find something new needs to know what others have already found. Keeping research secret from the world is downright Randian, because it depends on authoritarianism and the blind trust of consumers instead of informed consent.

If Burzynski is allowed to continue his scam, that sets a precedent for big pharmaceutical companies to do the same.

Bruce Lipton, Nut

In an earlier post asking for targets for me to scrutinize, Yakaru suggested I look over Bruce Lipton. I was a bit slow in jumping into the topic, so Yakaru made a post of his own.

I’ll still make my contribution. Skimming over Lipton’s website, I’m drawn to a book excerpt he posted: “The Nature of Dis-ease.” The title alone is an ideological red flag. I’ve seen a lot of altie gurus use it in the past, trying for some kind of clever wordplay. I tend to see another aspect to this word-splitting: I think it promotes the idea that health is the default state of being. That’s a meme that’s going to need more and more critical scrutiny, because as medicine has advanced, we’ve been able to lead healthier lives. It’s because of modern science-based medicine and our society’s infrastructure that we generally take health for granted and see disease as an aberration instead of an integral part of life.

On to the article itself:


Newage Culture: An Overview

I’ve discovered Yakaru has a blog! It’s got a good post on the front page right now: Blaming the Victim: Comments on Louise Hay. It deals with one of the familiar tropes in both religion and newage that serves to protect the higher-ups when they can’t make good on a promise. It’s given me some motivation to share my extended thoughts on newage culture.


Doggerel #4: “Shill!”

Welcome back to “Doggerel,” where I discuss words and phrases that are misused, abused, or just plain meaningless.

It’s a common accusation, and a convenient one. Accusing a skeptic of being a shill for some industry, pharmaceutical companies, or any allegedly evil organization is a popular appeal to motive and ad hominem fallacy. Even if the skeptic does have ties to an organization, that doesn’t mean the evidence or arguments he presents should be ignored. If anything, it means they should be examined with a little extra scrutiny, not simply dismissed.

A big problem with this doggerel, however, is that it’s commonly used against anyone who disagrees with the accuser, not just those with demonstrable organizational ties. Simply assuming that everyone who disagrees does so out of a profit motive demonstrates great cynicism, as well as an unwillingness to consider alternative views or motives. Skeptics generally take up the viewpoints that appear to have the best evidence in their favor. If we’re mistaken, present evidence and ask critical questions. Jumping to the conclusion of selfishness and malice, instead of considering the possibility that we might be mistaken or even that we might be right, does nothing productive.

Even in the event that someone does have ties to such an organization, it’s still cynical to assume there’s a profit motive, especially since causation doesn’t have to flow one way. People might join an organization because they honestly believe in what the organization is doing. Researchers might join a pharmaceutical company and endorse their products because they honestly consider them good for society. Those same researchers might reject a quack’s claimed alternative because they have reasons to suspect fraud. There are honest people out there, just like there are selfish ones. Reckless use of this fallacy is tantamount to denying human diversity.

Advice to my opponents: Don’t use this argument without exceptionally clear evidence of a connection. Even then, you should maintain focus on the quality of the evidence, especially independent lines of evidence. Simply asserting bias, without looking at the evidence and the countermeasures taken against those biases, comes across as manipulative and reflexive. Using it on someone simply because they disagree with you on a topic makes you look like an egotistical black-and-white thinker who can’t deal with the idea that other people can think independently and come to a different conclusion as a result of their own exploration of the issue. Think carefully before you use this line of argument.

Doggerel #3: “You’re Just Jealous!”

Welcome back to “Doggerel,” where I discuss words and phrases that are misused, abused, or just plain meaningless.

Fry: This is so unfair! I liked you back when you were a cyclops! That guy’s only interested now that you have two eyes.
Leela: You’re just jealous!
Fry: No, I’m not! Oh, wait, I am. But my point remains valid!

It’s a classic mix of fallacies: appeal to emotion and appeal to motive. An arguer’s emotional state doesn’t make his point invalid. Neither do motives. They may give you a good reason to double-check the quality of evidence, but they do not change the way you should approach the argument.

The way this argument is commonly used is a bit worse than that. Quite often, someone will accuse a skeptic of being jealous of a psychic’s alleged powers, an alleged healer’s skills, or whatever. The problem is that, contrary to what many believers have been led to believe, skeptics are skeptics: we simply don’t believe in extraordinary things without good evidence. Lots of people have been harmed by frauds, and a common motive among skeptics is to protect people from those frauds by spreading information and teaching people to think critically. We can’t exactly be jealous of someone’s ability if we don’t believe they have it.

Worse, some believers will accuse us of being jealous of their wealth or their fame. This cynicism is quite a toxic attitude. We know quite a lot of the tricks from various frauds we’ve seen, but refuse to sink to that level because of our ethics. It’s also a very shallow way to view humanity, as if no one has desires beyond money and fame.

In the information age, censorship is generally much more difficult. With the Streisand Effect, legal challenges can end up emboldening critics because they signal a lack of rational arguments. The favored alternative method for handling critics is propaganda and indoctrination. Intentionally or unintentionally, frauds and self-deceived believers create a culture that encourages the use of logical fallacies and other ways of evading critical scrutiny. Encouraging cynicism by depicting their critics as having petty, ulterior motives is an effective means to prevent followers from listening to their core criticisms.

Advice to my opponents: Focus on the facts, not the emotions. To a skeptic, employing this argument is a sign of weakness if you don’t include rational arguments alongside it. Assuming we’re jealous of great wealth or fame typically makes us wonder if you’re projecting your own psychology onto us, and jumping to such conclusions so quickly shows great cynicism toward humanity in general.

Bill Maher is Crap

Crap. I don’t like Bill Maher. I know he’s said some clever pro-atheism/anti-religion things, but my charity towards him pretty much vanished when I found out about his anti-vax stance along with some miscellaneous altie canards he spread around. The misogyny I later found out about didn’t help his case, either. There’s probably more crap that’s shown up since the time I started mostly ignoring him.

I want to make it clear that I don’t consider him a role model, and that I wouldn’t want him to speak for me, politically, scientifically, or philosophically. I’d rather avoid being lumped with him whenever possible. I know that I can’t expect anyone (even myself) to have perfect reason, but Maher falls too far outside my range of tolerance for a supposed leading atheist.