So many "studies" are nothing but dressed-up junk, but they still do harm to genuine investigations
Almost without exception, when I see a news report starting off with the expression "according to a recent survey," "according to a study," or "according to a report," I skip the story.
Why? Because most of these so-called studies are nearly worthless pap. Many are half-baked, biased stalking horses for a political agenda, or thinly veiled prods for more funding for a project or cause, and they often rest on self-reported drivel.
For example, I just saw "More Than 15% Obese in Nearly All U.S. Metro Areas" and made the mistake of actually looking at it. It had a detailed table listing the purported obesity percentages in various cities: Poor McAllen-Edinburg-Mission, Texas, where residents were the most likely to be obese, at 38.8%.
Except for one thing: this is ridiculous accuracy and precision. As you read further into this Gallup survey—a supposedly respected organization, BTW—you find out that this apparent clarity to three significant figures is based on—get this—"self-reported height and weight." As we say, "oh, well, never mind."
What's the accuracy of such data? Pretty lousy, I'd say. After all, many people—shocking to say—probably under-report their weight and over-report their height. And who responds, anyway? Probably not a representative cross-section or sample, I'd bet.
I suppose it is possible to argue that the many data errors will average out, as they sometimes do, but not in this case, I'm pretty sure. I think giving the results with a range of, say, "between 35 and 40%" for those Texas-area folks would be far more realistic.
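A quick back-of-the-envelope calculation shows why a range is more honest than three significant figures. This sketch computes the standard 95% margin of error for a survey proportion from sampling error alone (ignoring self-report bias, which only makes things worse); the sample sizes are hypothetical, since the article doesn't say how many people Gallup polled per metro area.

```python
import math

def margin_of_error(p, n, z=1.96):
    """95% margin of error for a proportion p estimated from a
    simple random sample of size n (sampling error only)."""
    return z * math.sqrt(p * (1 - p) / n)

p_hat = 0.388  # the reported 38.8% obesity figure
for n in (300, 1000):  # hypothetical metro-area sample sizes
    m = margin_of_error(p_hat, n)
    print(f"n={n}: {100 * (p_hat - m):.1f}% to {100 * (p_hat + m):.1f}%")
```

Even with 1,000 respondents per metro area, the interval spans roughly 36% to 42%, a swing of several points in either direction before you even account for people shading their height and weight. Quoting 38.8% implies a certainty the data simply can't support.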
This is just one example of a news story based on a nearly meaningless survey; there are so many more. Why do these get so much coverage?
I think there are several reasons. First, reporters and the media like to cite supposedly scientific studies that support their personal political and social biases. Plus—and here's the dirty little secret—they are easy topics to cover and write up. You take the reported survey results from the press release, make a call to get a few quotes, write a summary with an attention-grabbing lead-in, and poof! your work is done: no need to do real research, reporting, fact-checking, or analysis of method, and no need to seek other views from other experts. In short: many reporters are lazy, and this is an easy way to do an assignment.
Normally, I wouldn't care about these pseudo-scientific reports—but the problem is that what's "pseudo" one day soon drags everything that sounds similar down to its level. As a result, credible scientific reports with solid backing get lumped in with these semi-bogus ones, and soon all elicit the same "you must be kidding" response.
One of the first lessons engineers and scientists learn, either formally or via experience, is not to impute too much precision to test results, and to recognize the sources of error in experimental data analysis. A second lesson takes longer to learn: those who do things "quick and dirty" often diminish those nearby who are trying to do it right and clean.
Have you seen any so-called studies, surveys, or reports that made you angry due to their idiocy? And worse, have you ever had to write one?