Hopefully you have read part one of our post regarding feedback. If not, you may want to start...
Lies, Damned Lies, and Surveys
Do you ever feel buried in feedback requests? If you’ve attended a conference, placed an online order, or chatted with a customer service rep, chances are you’ve filled out your fair share of digital surveys, and you’re probably pretty tired of them. The use of surveys has exploded in recent years and shows no signs of slowing down. It’s easy to understand why: they’re user-friendly, they provide quick data, and they scale seamlessly. But they’re not always as effective as they first appear. In fact, they can even be deceiving.
Post-Conversion Surveys
Digital surveys serve a wide range of purposes in an equally wide range of settings. They can gauge interest before a product launch or an event, and companies use them internally to give human resources and senior leadership insight into their teams. But let’s focus on their most common use: gathering user data post-conversion.
Post-conversion surveys help marketing and leadership identify strengths and weaknesses in their product, and, more importantly, they can surface opportunities for future conversions. But a survey is only useful if it gathers accurate and complete information, and designing a successful survey is more difficult than it appears.
The Allure of Quantitative Data
Most digital surveys let users answer questions through a handful of response formats, the most common being multiple choice, star ratings, and slider ratings. The data is collected, and many survey platforms will crunch the numbers and spit out a handy export document. While this looks effective and easy to use on the surface, such answer formats have an annoying way of flattening reality and eliminating crucial context.
As a user, it’s all too easy to fly through a post-conversion survey and mindlessly check boxes without really providing full answers. Let’s say you’ve been asked to rate a speaker panel at a conference, and you’re given a slider rating of 1 to 10, with 10 being the most positive. You loved the speakers and the content, but the seating was uncomfortable and the panel started 15 minutes late. So do you answer with a 7? Maybe an 8? It’s too easy to shrug and pick a number almost arbitrarily. Plus, the person next to you may have had the exact same experience but checked a 6 instead.
Given just the quantitative data, how do you account for context and natural subjectivity? Surveys may ask users to elaborate in short-answer sections, but that amounts to asking the same question twice. No one really looks forward to filling out surveys, and being asked the same question twice only adds to the irritation.
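To make that flattening concrete, here’s a quick illustrative sketch (the ratings are invented for the example): two sets of slider responses describing very different experiences produce the exact same headline average.

```python
from statistics import mean, stdev

# Hypothetical 1-10 slider responses for two speaker panels.
panel_a = [7, 7, 7, 7, 7]       # everyone was mildly satisfied
panel_b = [10, 10, 10, 4, 1]    # most loved it, a couple had real problems

for name, ratings in [("Panel A", panel_a), ("Panel B", panel_b)]:
    print(f"{name}: mean = {mean(ratings):.1f}, stdev = {stdev(ratings):.1f}")

# Panel A: mean = 7.0, stdev = 0.0
# Panel B: mean = 7.0, stdev = 4.2
# The dashboard reports 7.0 for both, but the stories behind the
# numbers (and what you should fix) are completely different.
```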
The allure of quantitative data is certainly understandable. It allows for quick analysis and aesthetically appealing graphics, and it scales easily. But unfortunately, you’re just not getting the full picture.
Encouraging Thoughtful, Intentional Survey Responses
Instead of relying primarily on quantitative data to gather post-conversion user data, marketers should feel empowered to encourage survey respondents to provide more thoughtful, intentional feedback by using digital tools like LoopingBack. Here’s an example survey we did for Taxa Outdoors so you can understand the experience as well as see the feedback we had for them!
If users could leave feedback through video or audio messages, marketers would get a much more complete picture of the user experience. Most experts agree that between 70% and 90% of communication is nonverbal, so reducing responses to just words or numbers eliminates a critical share of what’s actually being communicated. Seeing body language and hearing changes in tone help paint a fuller picture of experience and feedback. Plus, many people simply communicate better when they can give stream-of-consciousness responses instead of having to string together written answers.
Of course, analyzing video and audio responses is not nearly as easy as crunching quantitative data, nor is it as immediately scalable. But a survey is only as useful as the data it produces, and if the choice is between users thoughtlessly clicking through checkboxes and users taking the time to provide meaningful feedback, the meaningful feedback should win every time. Plus, LoopingBack’s AI functionality greatly speeds up the interpretation and analysis of video and audio responses.
High Quality Testimonials
If users are willing to provide video and audio responses to surveys, they are most likely willing to have their feedback used as testimonials in future marketing. Testimonials are always high-quality marketing, and testimonials captured during surveys tend to be more thoughtful and impactful, since the user will already have been thinking critically about the experience.
Surveys can be incredibly useful, and they definitely aren’t going away anytime soon. In fact, they’re expected to keep growing in popularity, especially in North America. But like all things, they can be improved. Instead of relying mostly on quantitative data to sum up a user’s experience with a product or event, marketers can use tools like LoopingBack to enhance the survey experience for the user and come away with much more meaningful responses and data.