The fact that I happened to load up the Logitech and Dunkin Donuts surveys on the same day is pure chance. They are in wildly different categories, so it makes sense that their approaches to gathering customer feedback would differ just as much.
That won't stop me, however, from making a few observations about each one.
The Logitech survey is really a classic implementation of a Net Promoter approach to gathering customer feedback. (You can download a ppt of survey screenshots here.) I couldn't really say that if I didn't know how extensively and carefully Logitech's teams use the feedback to inform product development and customer service improvements. They have talked about their approach in several public forums (plus, they were a member of the NPS Loyalty Forum and hosted one of our meetings).
Logitech is interesting, in part, because they are a product company with challenging product innovation cycles. They use NPS as a key tool to inform their innovation and product development processes, cutting cycle times from product introduction to product enhancement. They have also worked hard to figure out how to get a really good sample of feedback on every product, and they continue to improve this over time. They ask for feedback at just the right time -- a couple of weeks after you have installed the product, which gives you the opportunity to get past the initial excitement of the purchase and into the early usage experience. In my case, I was giving feedback on a webcam. Because I had been using it long enough to observe some of its unexpected good features, as well as some of the unexpected annoyances, my feedback was far richer and more detailed than it could have been immediately after installation.
A few things to notice, some of which you couldn't glean just from the document here:
- They trigger the survey off of product registration and time the request to coincide with the first couple of weeks of usage, rather than just purchase or installation
- Very short, simple survey - it all fits on a single web page
- Simple, context sensitive drop downs to help you let them know which product you're telling them about
- Plenty of open-ended, verbatim space. In their product categories, there are an infinite number of observations a customer could make. Rather than box us in, they let the end customer decide what to mention and what matters
- If a customer happens to let them know they are really unhappy with a product, someone from Logitech will typically get in touch with them to follow up - they try to learn more about the situation and even engage in some customer recovery, if possible
- This Net Promoter survey process is only one part of an overall customer feedback and research approach that also incorporates other forms of customer research, observation and feedback
Dunkin Donuts, on the other hand, is all about the in-store experience. I don't really know enough about how Dunkin management and the front line use the feedback they get, but I do have a hypothesis: I don't think their front-line staff get very much direct customer feedback.
This is another example of a retail survey that uses a register receipt to deliver the invitation. I've said before that I had never filled one of these out until a few months ago, when I started to wonder what these were like and who spends their time doing this. I do have to scratch my head and wonder what would motivate the AVERAGE customer to spend time filling out this sort of survey. Sure, there's a promise of a gift of some sort. But you had to look closely at the receipt, notice the part asking for a survey, and then make the decision to spend the time going online -- typically long after you had consumed the coffee, donut, bagel or sandwich, and maybe even after you'd made yet another visit to one of their stores.
I wonder how many surveys they get each day or week per store. Is it more than single digits? I'd be surprised if they get 14 per week per store, on average. Maybe someone can enlighten me here. Is it enough to provide quick-cycle feedback to the store manager on how things are going? Rich enough to provide granular feedback to individual employees (or even to a whole shift) on what they are doing to create lots of satisfied, profitable promoters?
The survey takes 14 pages to get through. (You can download screenshots here.) I guess if you've gone to all the trouble of finding the survey invitation and getting to their web site, you are geared up for the survey. There's nothing spontaneous about this process at all. And for a relatively frequent category (I would guess their best customers come in several times a day, and lots of others come in every work day), it must take something unusual to get most people to sit down to do this. So a longer survey might be just fine?
After entering the store number, the survey asks customers to identify some details about their visit. I find myself wondering if the point of sale system generates a unique transaction number that could have been used to populate this whole screen for the customer. It would also take care of the next couple of follow up questions about what you ordered, providing real accuracy about the transaction and taking the burden off the customer. But maybe they don't have the technology infrastructure to enable that.
The sixth and seventh pages of the survey ask a number of detailed questions about your satisfaction with individual elements of the experience. I assume these are the ones they have demonstrated empirically over time to be the biggest drivers of overall satisfaction? They certainly seem reasonable enough.
Once again, however, I find myself wondering whether all these detailed multiple choice questions are going to generate the granular and detailed feedback that a shift supervisor or store manager would need to coach members of the store staff, improve training, change hiring practices, or set staffing schedules differently.
Finally, if I were completely irate and took the time to complete this survey, what would happen? Would I get any follow up? I certainly don't see a mechanism for that here. Looks like I'm sending my feedback into the "black hole" of market research.
What do you think?