An older article by Brian D. Earp looks at the problem of bullshit in scientific publications:
… if you love science, you had better question it, and question it well, so it can live up to its potential.
And it is with that in mind that I bring up the subject of bullshit.
There is a veritable truckload of bullshit in science. When I say bullshit, I mean arguments, data, publications, or even the official policies of scientific organizations that give every impression of being perfectly reasonable — of being well-supported by the highest quality of evidence, and so forth — but which don’t hold up when you scrutinize the details. Bullshit has the veneer of truth-like plausibility. It looks good. It sounds right. But when you get right down to it, it stinks.
There are many ways to produce scientific bullshit. One way is to assert that something has been “proven,” “shown,” or “found,” and then cite, in support of this assertion, a study that has actually been heavily critiqued (fairly and in good faith, let us say, although that is not always the case, as we shall soon see) without acknowledging any of the published criticisms of the study or otherwise grappling with its inherent limitations.
Another way is to refer to evidence as being of “high quality” simply because it comes from an in-principle relatively strong study design, like a randomized controlled trial, without checking the specific materials that were used in the study to confirm that they were fit for purpose. There is also the problem of taking data that were generated in one environment and applying them to a completely different environment (without showing, or in some cases even attempting to show, that the two environments are analogous in the right way). There are other examples I have explored in other contexts, and many of them are fairly well known.
But there is one example I have only recently come across, and of which I have not yet seen any serious discussion. I am referring to a certain sustained, long-term publication strategy, apparently deliberately carried out (although motivations can be hard to pin down), that results in a stupefying, and in my view dangerous, paper-pile of scientific bullshit. It can be hard to detect, at first, with an untrained eye — you have to know your specific area of research extremely well to begin to see it — but once you do catch on, it becomes impossible to un-see.