Last month another principal in our firm came to me for a discussion.  He had found a survey report on the internet that was relevant to a project he was undertaking, but he was suspicious.  “Do you see any way that this data can be accurate?” he asked.  “The findings seem far-fetched.”  I reviewed the study, which had been conducted by a well-regarded academic institution, and I shook my head.  “There’s no way this was done correctly,” I concluded.  We did not use the study.

It reminded me of something that we do behind the scenes.  We think a lot about methodologies in our company.  We’re very careful about how we design, sample, recruit, and analyze our work, and there’s a reason for that:  we want to develop accurate and reliable information.

But in addition to all of the planning beforehand, there's one critical step that occurs late in the process.  It's seldom spoken about:  the use of common sense.

Using surveys as an example, we often say that the most useful survey is one where 80 percent of the results affirm what you think you know, and the other 20 percent surprises you.  We all develop a sense of how the world works, and in our business at Corona Insights we develop a very strong sense due to our exposure to lots and lots of data and information.  (We would be good at Family Feud.)  So when we review survey results, whether from our own survey or from another organization's survey that we're considering as reference material, we scan them to see whether they seem reasonable before declaring the survey valid.  It's not a particular system or a scoring rubric, but rather a feel based on our experience working with data.

Now, this type of review just seems like common sense, but you would be surprised at how often it does not occur in the survey world.  We're open-minded about being surprised by a survey's results, but … here's where common sense comes into play.  Just because something is presented with a lot of numbers does not mean that it's accurate or correct.  Over the years we have seen survey reports whose results were not supported or corroborated in any way by the combined 100-plus years of experience in our company.  Upon a deeper review, it was clear in those cases that there was a major error somewhere in the research process, and that no common-sense check at the end had ever flagged the results as faulty and alerted the authors to the error.

The lesson here is that no source of data should be viewed in a vacuum, and data is not guaranteed to be accurate just because it appears in a glossy report.  We review our own work and the work of others with a healthy skepticism until we're comfortable that it either fits into our universe of knowledge or we can understand why it doesn't.  We encourage other data users to do the same.