This intriguing article, “The Truth Wears Off”, appeared in the New Yorker before Christmas, but I’ve only just read it.
It’s all about the “decline effect”, whereby effects that initially appear strong in experimental studies seem to start declining in effectiveness as more and more scientists try to replicate the original findings.
Some (or all) of this is understandable in terms of better experimental set-ups and publication bias, which means it’s much easier to get an apparently new effect published than a study with a negative result.
But the article is of most interest when discussing cases where the scientists who first made a positive finding discover that they can’t replicate it themselves. A good example is given from parapsychology, where pioneer JB Rhine initially seemed to have a star subject, but the subject subsequently appeared to lose all of his ESP powers. Sure, the simple explanation may be that, even though Rhine may have thought the replication set-up was identical to the previous one, it really wasn’t, and a trick the subject had previously been using subsequently failed him. But it does seem a little odd to me that Rhine wouldn’t find the trick in a case like this: one feels sure that uncovering it would be more satisfying in its own way than admitting that he couldn’t explain why a subject lost his power.
There are other examples of this given in the article, and from less contentious fields than parapsychology. It’s well worth a read.
Talk of all this couldn’t help but remind me of Rupert Sheldrake and his odd morphic resonance idea. In short, he believes you can scientifically show that a skill or piece of knowledge becomes easier to acquire as more and more subjects learn it, be they birds, dogs or humans. (And, of course, in his theory, it’s not via simple imitation of the first creature that learnt it.) A genuine decline effect would seem to be the opposite of that.
A possible example of the decline effect that came to my mind was the original cold fusion experiments, and perhaps some of the subsequent ones too. These are not mentioned in the article, however.
If the decline effect were true, I guess it could be explicable (warning: wild speculation about to be embarked upon) by either:
a. the universe really being a computer simulation game run by a mega intelligence that changes the rules for some obscure purpose while the program runs; or
b. God, his opposite number, or aliens (take your pick) finding it important that certain things not be discovered by humans until the time is right. I personally like the idea of undercover teams of angels, demons, aliens or Men in Black interfering with important experiments in very subtle ways to confound humans at particular points in time. Of course, this may make “sense” for something fundamentally groundbreaking like the discovery of ESP, or perhaps cold fusion, but why it should apply to the effects of antipsychotic drugs would be rather harder to explain.
I guess there is probably some science fiction (or supernatural fiction) that has been written along the lines of b, but I can’t bring any to mind. The nearest may be the idea in The Day the Earth Stood Still that aliens would give us a warning to mend our ways via one spectacular demonstration of their power. But that was far from a discreet way of interfering. And I do recall David Brin wrote “The Practice Effect”, in which inanimate objects get better with “practice”, but again that is more akin to Sheldrake’s idea than a decline effect.
A decline effect has better fiction possibilities than morphic resonance, and maybe that is its most endearing feature.
1 comment:
The problems with vanishing reproducibility may occur in some studies, but cold fusion does not fit this pattern. Cold fusion remains difficult to replicate, but the success rate is much higher than it was in 1989, and the signal-to-noise ratio is much higher.
In some cases, for some aspects of the effect, the signal-to-noise ratio is orders of magnitude higher than it was. For example, tritium was measured at 10 to 40 times background in 1989 in many experiments, at Los Alamos and elsewhere. That was a definitive result, but in subsequent work tritium was measured at 10^6 to 10^8 times background.
Excess heat was typically measured at a fraction of a watt in 1989. More recently it has been measured at 20 W, with no input power, using a cathode roughly 100 times smaller than the ones used in 1989. That is a much stronger effect.
You will find roughly 1000 papers on cold fusion here:
http://lenr-canr.org/