Customer feedback: how to hear the voice of the customer

Imagine you spend weeks researching, drafting, and double-checking facts for a presentation, and when you finally think it’s ready and ask a colleague to “just take a look,” she makes suggestions and asks questions that never even occurred to you.

Sound familiar? We’ve all been through this, whether at work, at school, or with a personal project. The same happens with the products and services our companies offer. Because we help build them, we know how we use them, how to combine their most complex features, and how we want to improve them. But there are many ideas that simply never occur to us.

It’s not unusual for those on Customer Success teams or UX designers to receive customer requests and think, “How did it never occur to us to use our product for this?” Even if you work on a big team and each team member has a different use for your product, it’s impossible to cover all your customers’ use cases.

Because we immerse ourselves in our work, it’s hard to take a step back and see it with fresh eyes. That’s why customer feedback is so crucial. Listening to the difficulties customers have with your products helps you better understand how to improve them and make customers happier.

About this guide

At Zendesk and Typeform, collecting and using customer feedback is an important part of how we do business and how we build great relationships with our customers. In this guest article from Typeform, we explain the processes we use to collect, analyze, and act on customer feedback to build better products. We'll cover:

  • 3 types of feedback: given, requested, and observed
  • How to make sense of it all
  • Sharing your customer feedback with your entire organization

3 types of customer feedback: given, requested, observed

Customer feedback comes in multiple forms and from multiple sources, all of which we can organize into these three categories: given, requested, and observed.

Given: This is any type of feedback your customers proactively send in without being asked or encouraged to do so. Some examples of given feedback include the support tickets they open, phone calls, live chat conversations, and social media messages.

Requested: Requested feedback is when you take the initiative to contact customers and ask for suggestions to improve the product or the way you communicate with them. This includes requests for customer satisfaction (CSAT) ratings, NPS (Net Promoter Score®) surveys, replies to email newsletters or in-app messages, or when the Product and UX teams conduct interviews with customers.

Observed: This is the feedback you get when monitoring how your customers interact with your products, the paths they follow, and the documentation they read. At Typeform, our Customer Success team closely monitors our visitors’ behavior in our Help Center to see what our users are searching for, the articles they read, and the articles from which support tickets are generated. We also work closely with the Data team to understand how customers use our product.

In the following sections, we’ll go into more detail.

Given customer feedback

Support tickets

Support tickets are probably the main point of contact between customers and your company. They are created via the many channels you provide whenever customers have a question, want to make a suggestion, or need to report an issue.

One of the advantages of having help desk software to manage your tickets is that it allows you to add tags to each ticket, so you can flag common issues and the areas of your product that generate the most tickets (and are therefore probably the least intuitive to your customers).

Additionally, using the analytics tools provided by your help desk software, you can easily create reports and dashboards to better understand your conversations with customers. You just need to focus on tagging your tickets correctly and consistently.

In large teams (e.g., Typeform has a Customer Success team of 25 people) it can be a struggle to ensure that everyone consistently tags the same types of tickets with the same tags. Sometimes a suggestion is seen as a feature request by one person but as a UX issue by another, who might then tag it as a bug instead. Train your team to follow tagging guidelines so you can trust that the aggregated insights you get from the tags are accurate.

The customer feedback metrics we track are Customer Satisfaction (CSAT) ratings and the number of tickets for each feedback-type tag, such as feature requests, the type of issue reported, and so on. We also look at how these tags evolve over a given period of time.
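If your help desk lets you export tickets with their tags and creation dates, a minimal sketch like the one below can reproduce those per-tag counts and their month-over-month evolution outside the built-in dashboards. The CSV file name and column names here are hypothetical, not any specific help desk's export format.

```python
import csv
from collections import Counter, defaultdict

# Hypothetical export: one row per ticket, with a "created_at" date (YYYY-MM-DD)
# and a semicolon-separated "tags" column.
monthly_tag_counts = defaultdict(Counter)

with open("tickets_export.csv", newline="") as f:
    for row in csv.DictReader(f):
        month = row["created_at"][:7]  # e.g. "2024-03"
        for tag in row["tags"].split(";"):
            monthly_tag_counts[month][tag.strip()] += 1

# Evolution of tags over time: the most common tags for each month.
for month in sorted(monthly_tag_counts):
    print(month, monthly_tag_counts[month].most_common(5))
```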

Customer calls

Tag your phone support interactions using the same tagging scheme you use for tickets. If your phone channel is not an integrated part of your help desk software, you’ll need to manage that feedback source, and analyze the data from it, separately from your other support tickets.

How you weight the feedback received on your phone channel might be completely different depending on whether you offer phone support to all your customers or only to your Enterprise customers, for example.

Because customer feedback metrics track the number of feature requests, bug reports, and so on, if you only offer phone support to a small group of customers, the volume of these requests will naturally be smaller than the volume of support tickets received through your other channels. Nonetheless, these may be your most valuable customers, so you need to make their feedback more visible and weight its importance by business impact rather than volume.

Outbound messages

Whenever you send a newsletter or an onboarding email, or show an in-app message to your customers, you can give them the option to reply with feedback or be redirected to your website or Help Center. Either way, it’s important to send these messages using a tool that tracks these customer interactions. Interesting metrics include the open and click-through rates and the number of replies.

Ideally, this tool also allows you to set goals for each message sent and then track completion rates. For example, using a new feature or prompting customers to subscribe to a specific plan are examples of goals you might define.

With this approach you should target a specific segment of your customers. For example, if you just launched a new feature and want to know how your customers are using it, instead of sending an email to your entire customer base, you can target those who have tried it.

Because outbound messages can be an interruption, something sent to your customers without their request or prior approval, be selective with how often you send them. To help you understand how your customers react to these messages, look at the number of unsubscribe requests each message triggers (in the case of email messages and newsletters). If you use in-app messages, you can track a similar reaction if you provide a rating option such as thumbs up or thumbs down feedback.

Requested customer feedback


Requesting feedback through surveys can, like outbound messages, feel intrusive and sometimes annoy your customers, so be thoughtful about how and when you send them. Also, if you do use surveys, be prepared and willing to engage in a conversation with your customers after they’ve provided their feedback. It’s an awful experience for customers to spend their time sending in feedback and questions, only to never hear back from you.

When done the right way, surveys are essential sources of feedback that can help you better understand your customers.

The Net Promoter Score®

The Net Promoter Score (NPS) is a one-question survey asking how likely your customers are to recommend your company to a friend or colleague. From it you’ll discover whether they are Detractors, Passives, or Promoters.

The customer can give you a score from 0 to 10 and, additionally, can provide a comment explaining why they gave you that score.

The score is calculated by subtracting the percentage of customers who are Detractors (scores between 0 and 6) from the percentage of customers who are Promoters (scores of 9 or 10). The final score ranges from −100 (everybody is a Detractor) to +100 (everybody is a Promoter). A score on the positive side of the scale is good; +50, for example, is excellent. You can read more about how NPS surveys work in 5 ways to better connect with customers using NPS data.
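As a quick illustration of that calculation, here is a minimal Python sketch; the function name and sample scores are hypothetical.

```python
def net_promoter_score(scores):
    """Compute an NPS from a list of 0-10 survey responses."""
    if not scores:
        raise ValueError("No survey responses to score")
    promoters = sum(1 for s in scores if s >= 9)    # scores of 9 or 10
    detractors = sum(1 for s in scores if s <= 6)   # scores of 0 to 6
    return 100 * (promoters - detractors) / len(scores)

# 6 Promoters, 2 Passives (7-8), and 2 Detractors out of 10 responses -> +40.0
print(net_promoter_score([10, 9, 9, 10, 9, 10, 7, 8, 3, 6]))
```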

The written response is as important as the score itself. It explains the reason behind the score, which the number alone does not provide. This is the main advantage of the NPS: it lets you feel the pulse of your customers’ engagement with your company and, as a bonus, you can understand why they promote your product or why they’re Detractors.

Keep a close eye on the overall NPS score, but also segment it by plan or milestone and monitor its evolution over time. As for the open-ended feedback, use the same tagging scheme you use for support tickets to categorize and then analyze the reasons behind the score. You can then act on this feedback, either redirecting it to your Product or Engineering teams, or using it to improve your documentation or customer onboarding.

The churn survey

Understanding why your customers are leaving is important to all companies, but it takes on even greater importance if you have a subscription business model.

Customer Success teams want customers to be successful, to stay and renew their subscriptions, so they really want to understand the reasons that lead a customer to click the dreaded "cancel subscription" button. By surveying customers while they are in the process of cancelling their account (or after they’ve cancelled it), you can collect that vital feedback.

Churn surveys help you turn an unfortunate experience, such as a customer canceling or downgrading service, into an opportunity to learn. Your churn survey should consist mainly of fixed-answer questions so that you can easily analyze the results. But, as with the NPS survey, you should also add an open-ended question asking customers to tell you more about why they’re leaving. This lets them comment on anything not covered by the pre-defined options in your fixed-answer questions.

What metrics can you track here? You want to know whether customers are leaving because they’re not satisfied (they experienced bugs or wanted more personalized support) or because they’re no longer using, and no longer need, the product.

The reason for leaving will define how you approach a change in your products or services. In the first scenario (they’re not satisfied), consider improving the product or providing new channels or levels of support. In the second (they’re no longer using the product), focus on inspiring your customers with new use cases to keep them engaged and using the product.

As with the NPS survey, segment churn survey results by plan, reason for churning, or milestone. For example, do customers who churn after the first month give higher scores than those who churn after a year?

Our newest feedback source: the welcome survey

Typeform recently introduced a Welcome survey to understand why users subscribed to a paid plan in the first place (we also offer a freemium version of our product), what their main use case is, and the goals and metrics by which they will measure their own success.

The survey analysis helps us better understand our customer base, which means we can create content better suited to customers’ needs that reaches them at the right time: when they’re most engaged with the product.

Observed customer feedback

Your Help Center, self-service content, and the user community available to your customers are some of the most important sources of observed feedback. From these sources alone, you can learn a lot about your product:

  • Popular features (most visited articles)
  • Search terms without results (which often translate to feature requests)
  • The areas that cause problems and confusion (support tickets created after visiting a certain article)
  • What makes your users convert (what articles they visited before they subscribed)

Ticket deflection is also important. At Typeform, we measure this as the ratio of education tickets generated via the Help Center to unique visits to the Help Center. What do we classify as education tickets? All tickets submitted from the Help Center minus those that do not reflect on the quality and effectiveness of the Help Center, such as project-related tickets, bug-related tickets, billing issues, and so on. It’s an important metric to track because the ultimate goal of a good Help Center is to reduce the number of support tickets created in the first place.
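Here is a minimal sketch of that ratio, assuming hypothetical counts; the function and variable names are ours, not Typeform’s.

```python
def ticket_deflection_ratio(help_center_tickets, non_education_tickets, unique_visits):
    """Education tickets are Help Center tickets minus those that don't reflect on
    Help Center quality (project-related, bug-related, billing issues, and so on)."""
    education_tickets = help_center_tickets - non_education_tickets
    return education_tickets / unique_visits

# Hypothetical month: 1,200 tickets submitted from the Help Center,
# 400 of which are bugs, billing, or project issues, and 90,000 unique visits.
print(f"{ticket_deflection_ratio(1200, 400, 90000):.2%}")  # 0.89%
```

Tracked over time, a falling ratio suggests the Help Center is answering more questions before they turn into tickets.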

How to make sense of all this customer feedback

Now that you’ve collected all these types of feedback (suggestions, requests, and complaints from your customers), the next step is to organize it in a way that allows you to identify trends and observe patterns. Some of the questions you may want to consider when organizing include:

  • Are the feature requests you receive from your support tickets the same as those mentioned by Passives in your NPS survey?
  • Do your Detractors share the same complaints that you see in your churn survey?
  • Are there pain points across all your products or services, or is there a specific problem affecting just a segment of your customers (your Enterprise customers for example)?

To organize the feedback collected from so many sources, we decided at Typeform to create a three-level tagging scheme that we use across all those feedback sources. The levels are type of feedback, product area, and feature name.

Level 1: Type of customer feedback

First, categorize what the user is reporting. Here’s how we break down the tags at Typeform:

  • Feature request. This includes requests for completely new features or new functionalities for existing features. It’s important to tag these to help the Product team prioritize their roadmap.
  • Product pain. This is mainly used for UX or UI issues. If too many customers have trouble using a feature or applying a setting, we use this tag to make our UX team aware of it.
  • Education pain. This is used whenever a feature or workflow is not documented or the existing documentation needs to be improved.
  • Unaware. It’s not uncommon for customers to request features that already exist, so we track these instances to improve onboarding and report them to the UX team, who may then be able to improve the product design to make customers more aware of these features.
  • Billing. As with any business, it’s imperative that the payment process is as painless as possible, so it’s important to track any billing issues that customers have.
  • Bugs. It’s impossible for a product to be bug-free, so we need to document bugs not only so they can be fixed, but also to understand the impact they have on our Customer Success metrics.

Level 2: Product area

Once we categorize the feedback by type, we need to know in which area of the product the issue occurs. For example, because we provide form and survey builder tools, we need to know whether the request concerns the Admin Panel (a new question type, new functionality), the respondent’s side (support a new device, allow speech-to-text), or third-party apps (new embedding options, native integrations).

Tagging by product area helps us forward the feedback to the correct Product team, making sure they only receive feedback that is relevant to them.

Level 3: Feature name

Our last tagging level allows us to be specific about the feature that is being requested or that customers have trouble with. We have many tags for existing features, but we don’t apply a feature name tag to every support issue; we start with the product area tag and add a feature name tag only when a specific feature becomes a support trend over time.
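To make the three levels concrete, here is a small sketch of how a tagged piece of feedback could be represented; the tag values are hypothetical examples, not Typeform’s actual taxonomy.

```python
from dataclasses import dataclass
from typing import Optional

@dataclass
class FeedbackTags:
    feedback_type: str                  # Level 1: feature request, product pain, education pain, ...
    product_area: str                   # Level 2: Admin Panel, respondent side, third-party apps, ...
    feature_name: Optional[str] = None  # Level 3: added only when a feature becomes a support trend

# A hypothetical request for speech-to-text on the respondent's side:
ticket = FeedbackTags(feedback_type="feature request", product_area="respondent side")

# Later, if speech-to-text requests keep coming in, add the feature name tag:
ticket.feature_name = "speech-to-text"
```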

Listen, segment, and prioritize

After all your feedback is categorized, you may be tempted to give all the data the same treatment, but your customers are not all the same. For example, you shouldn’t analyze the feedback from an Enterprise customer that’s investing, say, $2,000 per year in your product the same way you do the feedback from someone on a $35 per month plan. Nor can you give a feature request from a power user the same importance as one from a new customer. You need to arm yourself with these three rules: listen, segment, and prioritize.

Listen: don’t assume you know your customers. One source of feedback doesn’t tell the whole story. If you send out a survey, don’t make all the questions multiple choice. We know it takes time to read and categorize all the open-ended feedback, but taking that time will reveal new requests, use cases, and purposes for your products that you never thought of.

Segment: group feedback by customer type. When analyzing feature requests, it’s important to know whether they come from someone who just signed up and barely knows your product or from a power user who has been an advocate for your brand for years. Segment the feedback by user plan, industry (this may indicate how often and to what extent the customer will use your product), when in the customer lifecycle the feedback was submitted, and whether they’re likely to stay with you for a month or for years.
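As a minimal sketch of this kind of segmentation, assuming a hypothetical export of tagged feedback (the column names are assumptions, not a specific tool’s schema):

```python
import pandas as pd

# Hypothetical tagged-feedback export; in practice this would come from your help desk or survey tool.
feedback = pd.DataFrame([
    {"plan": "Enterprise", "lifecycle_stage": "power user", "feedback_type": "feature request"},
    {"plan": "Basic",      "lifecycle_stage": "new signup", "feedback_type": "education pain"},
    {"plan": "Basic",      "lifecycle_stage": "new signup", "feedback_type": "feature request"},
])

# Count feedback types within each segment so trends can be compared across customer groups.
by_segment = (
    feedback.groupby(["plan", "lifecycle_stage", "feedback_type"])
            .size()
            .rename("count")
            .reset_index()
)
print(by_segment)
```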

Prioritize: focus on projects that have the biggest impact. We all have limited resources, time, and budget. After you gather all your feedback, prioritize the projects you’re going to deliver. How? Focus on the projects that will have the biggest impact on your most valuable customers and on your customer success metrics.

As you’ve seen, all this customer feedback is important to the entire organization – not just a Customer Success team. It helps the Product team make informed roadmap decisions, the Education team to focus on the articles and tutorials they need to write, and the Customer Experience team to improve onboarding and retention projects – just to name a few examples.

Sharing your customer feedback with your entire organization

Now that you’ve collected all this great customer feedback, it’s time to share it with the relevant departments in your company and collaborate to act on it. Here are the ways that we do that at Typeform.

Customer Success partnering with the Product team

In the Customer Success team at Typeform, we assign an ambassador to each Product team: a specialist for that product area who acts as the point of contact between the two teams and as the product area “go-to person” responsible for the following:

  • tracking bug statuses, testing and learning new features, and updating Help Center articles as well as the internal knowledge base
  • collecting insights and feedback from customers when a feature needs to be improved or a new feature is being developed

The product area customer success specialist meets with the product owner every two weeks to exchange information and get prototype and new feature demos. Depending on the Product team, sometimes the customer success specialist also attends the product sprints.

The Customer Voice report

At Typeform, the Customer Voice report is presented by the Customer Success team to the whole company every quarter. It provides customer feedback insights gathered in the previous quarter. We highlight how our customers are using the product, as well as their requests for new features, and their complaints and pain points.

We started doing this presentation in 2015 when we realized that each department in the company only had a partial view of what our customers were saying. The Growth team mainly looked at signups and upgrades, and the Engineering team barely had contact with customer feedback. Even within the Customer Success team, it was difficult to take a holistic view, since some team members analyzed only the feedback in tickets and others only the feedback in NPS surveys.

We also wanted to make the Product team more aware of our customers’ pain points and feature requests. Before the Customer Voice report, we requested UX changes from the Product team based on complaints and churn. However, when they asked for numbers to justify making the changes, we could only show them the number of support tickets or the number of customers who had churned because of the issue. Sometimes the numbers were too low to be convincing, and the changes didn’t get prioritized by Product.

However, with the Customer Voice report, we’re able to show the number of support complaints about an issue, as well as the number of low NPS ratings and the number of customers who churned because of that issue. By aggregating all this data and identifying trends across all our different channels, we give the Product team a much better view of the impact on the business, so they can make more informed decisions.

For the many people in the company who do not have direct contact with our customers, the Customer Voice report provides an opportunity to learn from customers and “hear” their feedback.

What the Customer Voice report looks like

Using feedback from the Product team and CEO over the past year, we iterated on the Customer Voice report. As the team grows and we add new sources of feedback, we will continue to develop and improve the report. Here’s how it’s currently structured:

  1. Overview. We start with a table that includes the top feature requests and pain points gathered from each feedback source. To avoid a fragmented view of customers across and within teams, we use this data to offer a global picture of the current situation. It gives stakeholders who have less time to review the entire report a quick overview of our customers’ feedback.
  2. Sections by feedback source. Following the overview, we focus on the specific pain points, feature requests, and use cases from each feedback source. These include: Sales calls (prospective customers), Account Management accounts (Enterprise customers), support tickets (free and paid accounts), NPS surveys, and churn surveys. Structuring it this way provides a segmented analysis, which leads to a better understanding of the feedback particular to each type of customer. It also helps with prioritizing the product roadmap.

Conclusion

We hope this guide helps you create or improve your own customer feedback program. The practical step-by-step processes and examples can be applied to your own company. This guide offers an alternative to what you may already have in place, so you can compare and adopt the tactics that are relevant to your own organization.

Whatever you choose, don’t ever stop listening to your customers and incorporating their suggestions whenever it makes sense. They’ll always speak their minds, in private and public forums, and that’s a good thing, because customers who don’t voice their concerns, complaints, or suggestions may soon no longer be your customers.

Want to learn what areas of your customer service need improvement? Take our assessment survey to see the current state of your support and how to take it to the next level.

Angela Guedes was the 2nd member of the now 25-strong Customer Success team at Typeform. With 10 years of experience in Customer Communications, she is now leading the Customer Experience pillar in the team, focusing on understanding customers and creating 1-to-many strategies to improve the user experience.

Anton de Young is a published writer and photographer. As a long-time Zendesk employee, he built the Zendesk customer education and training teams, and then as a Marketing Director launched the Zendesk customer service leadership program and event series, which he then helped to expand into the Relate website and event series. Now a freelancer, Anton is busy exploring the world from his new home in Lisbon, Portugal. Find him on Twitter: @antondeyoung.

Typeform is trying to make the world a little more human by changing the way we collect data online. It combines the human-centric approach of email and chat with the data-structuring capabilities of web forms.

Zendesk builds software for better customer relationships. It empowers organizations to improve customer engagement and better understand their customers. More than 94,000 paid customer accounts in over 150 countries and territories use Zendesk products. Based in San Francisco, Zendesk has operations in the United States, Europe, Asia, Australia, and South America. Learn more at www.zendesk.com.