Ignoring Customer Comments: A Disturbing Trend
One-Third of Researchers Think Survey Ratings Are All They Need
You’d be hard-pressed to find anyone who doesn’t think customer feedback matters, but it seems an alarming number of researchers don’t believe they really need to hear what people have to say!
In fact, almost a third of market researchers we recently polled either don’t give consumers the opportunity to comment or flat out ignore their responses.
- 30% of researchers report they do not include an option for customer comments in longitudinal customer experience trackers because they “don’t want to deal with the coding/analysis.” Almost as many (34%) admit the same for ad hoc surveys.
- 42% of researchers also admit to launching surveys that include an option for customer comments with no intention of doing anything with the comments they receive.
Customer Comments Aren’t Necessary?
Part of the problem—as the first bullet indicates—is that coding/analysis of responses to open-ended questions has historically been a time-consuming and labor-intensive process. (Happily, this is no longer the case.)
But a more troubling issue, it seems, is a widespread lack of recognition for the value of unstructured customer feedback, especially compared to quantitative survey data.
- Two in five researchers (41%) said actual voice-of-customer comments are of secondary importance to structured rating questions.
- Of those who do read/analyze customer comments, 20% said it’s sufficient to read/code just a small subset of the comments rather than each and every one.
In short, we can conclude that many researchers omit or ignore customer comments because they believe they can get the same or better insights from quantitative ratings data.
This assumption is absolutely WRONG.
Misconception: Ratings Are Enough
I’ve posted here before on the serious problems with relying exclusively on quantitative data for insights.
But before I discovered text analytics, I used to be in the same camp as the researchers highlighted in our survey.
My first mistake was that I assumed I would always be able to frame the right questions and conceive of all possible relevant answers.
I also believed, naively, that respondents actually consider every question with equal care, and that decimal-point differences in mean ratings from (frequently onerous) attribute batteries are meaningful, especially if we can apply a t-test and the 0.23% difference is deemed “significant” (even if only at a directional 80% confidence level).
Since then, I have found time and time again that nothing predicts actual customer behavior better than the comment data from a well-crafted open-end.
For a real-world example, I invite you to have a look at the work we did with Jiffy Lube.
There are real dollars attached to what our customers can tell us if we let them use their own words. If you’re not letting them speak, your opportunity cost is probably much higher than you realize.
Thank you for your readership,
I look forward to your COMMENTS!
[PS. Over 200 marketing research professionals completed the survey in just the first week in field (statistics above), and the survey is still fielding here. Ironically, what has impressed me most so far is the quality and thoughtfulness of the responses to the two open-ended comment questions. I will be doing initial analysis and reporting here on the blog over the next few days, so come back soon for part II, and maybe even a part III, of the analysis of this short but interesting survey of research professionals.]