
Slack’s Judy Watkins on improving customer support as you grow


With growth comes growing pains, and there are few places that feel those pains more than customer support.

As your product and company scale, it can become hard to maintain the high standards for customer support you initially aspired to, and to make sure you’re helping your support team grow and improve.

I recently sat down with Slack’s Judy Watkins to discuss how our teams have evolved and the challenges that rapid scaling has presented for support.

Interested in joining our next webinar? We’ll be hosting another next week, on March 1st, where we’ll be sharing insider tips on how best to use Intercom for customer support. Sign up here.

 

What follows is a lightly edited transcript of the interview, but if you’re short on time, here are four key takeaways:

  1. Growing fast means getting comfortable with reviewing, experimenting and adjusting your processes every few months, even throwing them out if you have to. What works for a team of 10 won’t work for a team of 100.
  2. Use your own company and team values to define what high quality support looks like for you. Many customers don’t get a great customer support experience from other companies, so their bar for quality might be lower than yours.
  3. Regular, structured reviews are crucial for improving the quality of your support. Teams only get good at giving and receiving feedback when forced to practice it all the time.
  4. As your team grows, it becomes more important to define a standard, and make sure there is clarity across the team about what you define as ‘good’ support.

Jeff Gardner: Thanks for joining us. I’m the Director of Customer Support here at Intercom. Our support team is currently a little over 50 people. We are spread across three main offices with quite a few in other locations around the planet. It’s a pretty complex beast – we can’t hold meetings with everybody on the team because at some point during the day, somebody is going to be asleep! In 2016, we saw a lot of growth, going from around 20 people to over 50.

Judy is the Senior Manager for Customer Experience at Slack. It’s a great product and it’s great to have you here with us talking about this today. What is the size and scope of the support team at Slack?

Judy Watkins: There are about 135 people globally in our support team. Slack launched three years ago and back then we had one person on support. When I joined two years ago there were 14 of us, which gives you some idea of how much our company has grown in a short space of time. Our support team is also based around the world, across four of our Slack offices – Vancouver, San Francisco, Dublin, and Melbourne – as well as a few people working remotely. There are now around 800 people working at Slack, which is a lot of growth in three years.

Jeff: That speed of growth can break things over time – what works at smaller scales can really quickly break down. You had mentioned that you felt like about every four months you had to revamp everything and redo all your processes.

With growth comes problems

Judy: Yes, for processes definitely. What works for a small team of two or ten is not going to work for a team of 100 people. When you’re going through high-growth periods or if something significant changes then things can break every couple of months. This year we’ll be looking at internationalization, which brings a whole new set of challenges.

Measuring support

Jeff: As you grow, you have to become more metric-driven, and find a way to keep leadership in the know about what’s going on on the ground. The main way most teams do that is through metrics. There are usually two big parts to measuring: how fast you’re getting back to customers, and the quality of the responses and interactions you’re having with those customers. How do you think about measurement at Slack?

If you really believe in the product and the value it brings, you’re going to be excited to help people use it

Judy: What’s most important to us is speed and accuracy. Part of that is a quality review and part of it is simply how quickly, as a team, we’re getting back to customers. We don’t look at individual metrics; we’re unique in that way. We’re not looking at how many tickets an individual agent is responding to, for example. It’s much more about how the team is performing.

Jeff: Faster responses are always better, without a doubt. But quality is an interesting one, because there can be a dichotomy. There’s quality as measured by your customers, the people that are on the other side of the interaction, and then there’s quality as measured internally against what you believe, what your company stands for, what your mission is and what you’re trying to achieve. At Slack, do you have a set of values that you use?

Judy: We do look at CSAT scores and they’re consistently high (above 97% month-on-month), which is great, but we hold ourselves to much higher standards than our customers would expect. Customers, on the whole, don’t get a great customer support experience from many companies, so their bar for quality is not necessarily as high as our own. When we were looking at the quality of our support, we decided to measure against our company values; it was important for us to mirror them.

We have a set of six values at Slack, but the four that we chose that were relevant were empathy, courtesy, craftsmanship, and solidarity. Part of our ticket quality process was defining what that means, in terms of support. What does it mean to be empathetic? What does it mean to show solidarity? So the first thing we had to do was define what they are.

Jeff: There’s one thing that is really interesting about metrics like CSAT and even NPS, and it’s the reason you need to start digging into values and what they mean for customer support.

There are two scenarios where these sorts of metrics fail that are pretty interesting. One, as you were saying, is that customers are not used to being treated really well, so they have a different bar than you may have for yourself. But it’s also not always clear what they’re rating. It’s possible that they could rate an interaction poorly, but it’s unclear whether it’s the interaction they are rating, or whether they simply didn’t like the fact that your product didn’t do something.

Judy: Unless they leave a comment. Comments are great. What we often find with negative ratings is that they reflect people’s frustration with the product, not with the support. They could say “I really want your product to do this,” and we say, “Really sorry, our product doesn’t do that,” and they get frustrated. They show that through the CSAT, but it doesn’t mean the support is not good.

Jeff: The other scenario is the reverse of that in a sense, where it’s possible for people to get great scores consistently, but be completely off-brand or off your company’s values. It’s important that you don’t stop at just measuring CSAT, isn’t it? It’s important to think about your values.

Judy: And the hard thing about that is, it’s not something you can measure automatically. It’s largely a manual process.

Jeff: Everybody wants to find the silver bullet: some metric that can be measured automatically by a background process and tells you what’s happening from the customer’s perspective. Where you’re going to get the most value is when you put the time and the work into really digging in and deciding: One, what are your values? Two, how can we measure this, even if it’s really manual and time-consuming? And then three, later on perhaps, how do we make this a little faster? How do we automate this?

Judy: And at Slack, we appreciate that great support is expensive. That’s just how it is. But it’s absolutely worth paying for.

Some people are not used to a feedback culture, and it can be hard.

Jeff: It’s an investment. Broadly in the industry, we’re seeing a big shift. Companies nowadays are pretty bought into the general philosophy that customer support is not something you can outsource and still get really great outcomes from. It is something you’ve got to invest in, and a big part of that investment has to be figuring out how to do it well and how to improve over time.

How Slack do it

Judy: The most important thing to remember when you’re doing these reviews is that they’re very much a coaching opportunity. It is an opportunity to improve your support and develop your team.

We chose four of our values – empathy, courtesy, craftsmanship, and solidarity. Empathy is about always seeing things from the customer’s point of view. Are we being friendly and humble? Those are things we really value at Slack. Solidarity is more about processes. Are we following our internal processes? Are we putting Slack at risk? Craftsmanship is about asking, is it accurate? And that’s, in some ways, the most important one. Are we giving good information to the customer? Is it right?

That’s the one I think most people can learn from. Do we need to provide more training in a particular area? And the last one is courtesy. Are we getting back to our customers in a timely manner? We mustn’t assume that they’re not as busy as us. Their time is valuable, too.

That’s roughly what we focus our quality reviews on, which our managers do – that’s important for us. We see managers as learning coaches. They have to be close to the product, they have to know what our customers are going through.

From a mechanics point of view, we use Zendesk and all of that ticket quality information is kept within the ticket itself.

Jeff: Is there a regular review process, where a manager sits down with their report and they go through things together?

Judy: Yes, we have weekly one-to-ones with our reports where we go through the tickets that we review. We’re doing about five per agent per week. We’ll talk about every ticket: great feedback on tickets that are great, and opportunities for learning – things that weren’t right, or that were right and could be leveraged into something really great in that ticket.

Jeff: Do you find that when people join Slack from other companies that it takes them a little while to get used to the idea that they are going to get feedback regularly on their work?

Judy: It depends where they’ve come from. Some people are not used to a feedback culture, and it can be hard. It’s something we work on constantly. It’s part of the career description at Slack – you’ve got to be really comfortable giving and receiving feedback to everyone, both peers and managers. Practicing it on a weekly basis is very valuable for us to help develop everyone.

Jeff: We’re huge on feedback here at Intercom. One of our team values is “be radically candid”. I love when I receive feedback on what I could have done better because it’s a good sign that things are working and that people are comfortable giving feedback.

How Intercom do it

Jeff: We got into this a little more than a year ago, and we started the way we do most things at Intercom – we started as small as we possibly could and tried to just figure out if we could make it work.

An early tool designed to help us measure the quality of customer support

One of my leads wrote a script that picked ten random conversations from people on the support team and sent everybody on the team an email with them each week. Every single person, from manager to rep, would get this email. You’d go in, click into every single conversation, leave feedback, and the person who handled it would get a notification that you left a comment so they could read back over it.
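For the curious, here is a minimal sketch in Python of what a script like that might look like. The conversation records, field names and URLs are made up for illustration – in practice they would come from your support tool’s API – and the email step is reduced to building the digest body.

```python
import random

# Hypothetical data: in a real script these records would come from the
# support tool's API; here they are invented for illustration.
conversations = [
    {"id": i, "teammate": f"rep-{i % 5}", "url": f"https://example.com/conversations/{i}"}
    for i in range(200)
]

def pick_weekly_reviews(conversations, sample_size=10, seed=None):
    """Pick a random sample of conversations for this week's peer reviews."""
    rng = random.Random(seed)
    return rng.sample(conversations, k=min(sample_size, len(conversations)))

def build_digest_email(sample):
    """Build the plain-text body of the weekly review email."""
    lines = ["This week's conversations to review:"]
    for convo in sample:
        lines.append(f"- {convo['url']} (handled by {convo['teammate']})")
    return "\n".join(lines)

if __name__ == "__main__":
    weekly_sample = pick_weekly_reviews(conversations, sample_size=10, seed=42)
    print(build_digest_email(weekly_sample))
```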

It worked really well as a way of testing the theory out, but it was deficient in a few ways. One, it was quite slow. Two, because of the way the random conversations were pulled, sometimes you’d get conversations that simply didn’t need to be reviewed. And three, it was purely qualitative: there wasn’t a way for us to track improvement over time, and it made it harder than necessary to keep up with how somebody was doing or how many of these reviews somebody was doing each week.

An internal app we have developed to help our team give customer support feedback

We moved from that to building our own app that uses our API and pulls data and tries to be a bit more structured about how we give feedback. We decided to measure on quality, tone and difficulty. Quality is that craftsmanship part, tone is about empathy and courtesy. And difficulty is a bit of a relative, slightly nebulous score, but it’s trying to ask, “How long should this conversation have taken? Did this conversation go on longer than it needed to, or was this just a really difficult investigation that somebody did a really amazing job on and we need to call them out for it?”
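To make that structure concrete, here is a small illustrative sketch (not Intercom’s actual schema – the 1–5 scale and field names are assumptions) of how a single review could be recorded, with the three scores and a required written comment:

```python
from dataclasses import dataclass, field
from datetime import datetime

@dataclass
class ConversationReview:
    """One structured review of a support conversation.

    The three scores mirror the dimensions described above (quality,
    tone, difficulty); the 1-5 scale and field names are illustrative.
    """
    conversation_id: str
    reviewer: str
    reviewee: str
    quality: int      # craftsmanship: is the answer accurate and complete?
    tone: int         # empathy and courtesy towards the customer
    difficulty: int   # how hard should this conversation have been?
    comment: str      # qualitative feedback is required, not optional
    reviewed_at: datetime = field(default_factory=datetime.utcnow)

    def __post_init__(self):
        for name in ("quality", "tone", "difficulty"):
            value = getattr(self, name)
            if not 1 <= value <= 5:
                raise ValueError(f"{name} must be between 1 and 5, got {value}")
        if not self.comment.strip():
            raise ValueError("a written comment is required with every review")
```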

Judy: That must be a hard one to measure.

Jeff: It is. It’s very relative, because you don’t know what you don’t know, so if you have a customer service rep who’s looking at a support engineer’s conversation, it’s sometimes difficult for them to accurately judge that. There’s a bit of a give-and-take, but it tries to put a bit of structure on it while also leaving space for qualitative feedback, like comments, which are required.

This app principally helps us track that data, but it’s also very fast, which means it’s easy for everybody on the team to do lots of these reviews each week and not feel like it’s a big burden. All individual contributors, that’s reps and engineers, do at least ten per week, and all managers need to do at least 30 per week, across their entire teams. So it’s definitely a significant time investment.

Judy: I’m curious, is it anonymous?

Jeff: Nothing is anonymous, everybody knows exactly who rated what. That comes back to our value of being radically candid. You only get good at giving and receiving feedback when you’re forced to practice it all the time. We’ve legitimately had times where people have said, “You went too easy on me on that one,” or somebody’s been a little bit ticked off about how harsh somebody was in one of their reviews. But generally, it’s just a good chance to have a conversation about it, and that’s the important part.

Defining quality

Jeff: Regardless of what mechanism you’re using to review quality, you need to take time to define what is good, what is passable, and what is totally unacceptable. Being opinionated is important, and that’s not always obvious.

Judy: Yes, you have a bit of a scale with yours, and we’re more either “it is or it isn’t.” It’s not always that clear-cut, but we didn’t want to make it complicated, because once you introduce that complexity, if people find it difficult to follow, then it’s not going to be used. So for us, either it does or it doesn’t meet the rubric for what we’ve defined as empathy. But again, it’s not about pass or fail, it’s a learning opportunity. It’s very much in a positive light.

Jeff: The coaching aspect has to be emphasized above all else, because there are always going to be situations where somebody’s not performing at the right level. There are, unfortunately, always situations in companies as large as ours where people don’t perform and eventually leave, but that’s not what this is about. This is about helping people get the most out of the job they’re doing and helping people understand and learn over time what is great, what is good, and what’s okay.

Judy: What it comes back to is the customer. We’re doing this to help develop our team, but ultimately it’s so that we can improve the support we’re giving to the customer. Everything we do comes back to that. It’s not about pass or fail for the support person, it’s, “Are we doing the right thing by the customers?”

Jeff: It’s a really great opportunity to see where people are doing really great work and be able to call that out regularly.

Maintaining consistency

Jeff: There’s one really important thing that everybody who’s thinking about implementing something like this needs to get right, and that’s consistency. How do you maintain consistency across ratings?

Judy: We use something called inter-rater reliability (IRR). It’s a commonly used research method. It allows us to really measure whether the quality assessment tool we’re using is valid when used by a large group of people.

How it works in practice is, once a month, we take three tickets at random and we get everyone on the team to review those tickets. Everybody scores them for each of the four values and then we calculate the IRR score. The accepted agreement level is 75%. Doing this consistently on a monthly basis means that we have an insight into where we’re agreeing and where we’re not agreeing. For example, if we’re getting a really low score on solidarity, then maybe something is not working in our processes.
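As a rough illustration of the mechanics, here is a minimal percent-agreement sketch in Python. The ticket IDs and ratings are invented, and Slack may well use a more formal IRR statistic; the point is simply how a team could check whether reviewers clear a 75% agreement bar on each value.

```python
from collections import Counter

# Invented ratings: for each (ticket, value) pair, every reviewer marks
# whether the ticket meets the rubric for that value (True) or not (False).
ratings = {
    ("ticket-1", "empathy"):       [True, True, True, False],
    ("ticket-1", "solidarity"):    [True, False, True, False],
    ("ticket-2", "craftsmanship"): [True, True, True, True],
    ("ticket-2", "courtesy"):      [False, False, True, False],
}

AGREEMENT_THRESHOLD = 0.75  # the accepted agreement level mentioned above

def percent_agreement(scores):
    """Share of reviewers who agree with the most common rating."""
    most_common_count = Counter(scores).most_common(1)[0][1]
    return most_common_count / len(scores)

def irr_by_value(ratings):
    """Average agreement per value across all tickets in the sample."""
    per_value = {}
    for (_ticket, value), scores in ratings.items():
        per_value.setdefault(value, []).append(percent_agreement(scores))
    return {value: sum(v) / len(v) for value, v in per_value.items()}

if __name__ == "__main__":
    for value, agreement in irr_by_value(ratings).items():
        flag = "OK" if agreement >= AGREEMENT_THRESHOLD else "needs calibration"
        print(f"{value}: {agreement:.0%} ({flag})")
```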

Jeff: So IRR helps ensure people are aligned on how the rating system should work, but also helps to highlight where the process might be broken.

Judy: Things do break from time to time, and just because we’ve put this system in place doesn’t mean it’s always going to work. There’s a lot that could change, and measuring the IRR monthly gives us some insight into where we are on that journey. Our reviews wouldn’t work without IRR, because if two people are doing reviews and they’re not calibrated in any way, then it’s almost meaningless.

Jeff: Something we’ve found is that we don’t have as much confidence as we would hope in our numbers. We’re in the middle of doing our first IRR review system calibration, and I’m excited to see what comes out of it.

The one crucial point for anybody who wants to put this into practice is you really need to think about how you’re going to do a regular calibration.

Judy: It’s often the last thing people think about.

Answering your support questions

A selection of questions sent into us by webinar attendees. Interested in asking some of your own? Sign up here for our next live webinar on March 1st.

Why do you choose to review five tickets per week? How did you arrive at the number five?

Judy: I don’t know, is the honest answer. It works for us right now, though it’s not always five. For new hires going through onboarding, it’s ten. We don’t have to only do five; sometimes we do more, perhaps if somebody needs a bit of extra support, or if time allows.

Jeff: With growth and scale, especially at the pace that both of our companies are going through, things constantly need to be reviewed, adjusted and experimented with, and there is no “right answer.” It’s about experimenting and trying to figure out what works for your team.

What is the process for handling different types of requests? For example, tech support requests versus other kinds of support.

Jeff: We use Intercom for everything, and for different types of requests we use routing rules in Intercom to send conversations to the right group first. But sometimes you get a conversation that starts in one direction, then takes a turn and ends up being a really complex investigation or debugging issue, and in those cases you have to hand it to the right person.
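By way of illustration only – this isn’t Intercom’s actual routing configuration, and the team names and keywords are made up – the idea of sending a new conversation to the right group first might look something like this:

```python
# Illustrative only: a toy keyword-based router, not Intercom's actual
# assignment rules. Team names and keywords are assumptions.
ROUTING_RULES = [
    ({"api", "webhook", "error", "bug"}, "technical-support"),
    ({"invoice", "billing", "refund"}, "billing"),
]
DEFAULT_TEAM = "frontline-support"

def route_conversation(first_message: str) -> str:
    """Send a new conversation to the first team whose keywords match."""
    words = set(first_message.lower().split())
    for keywords, team in ROUTING_RULES:
        if words & keywords:
            return team
    return DEFAULT_TEAM

print(route_conversation("Our webhook keeps returning an error"))  # technical-support
print(route_conversation("How do I change my plan?"))              # frontline-support
```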

Judy: We don’t do tiered support. We are very big on ownership.

Jeff: And are all of your agents fully technically able to answer pretty much anything?

Judy: So, it used to be that way, but as we’ve grown, we’ve had to change that up a little bit. So now we have our agents specialize in a particular area, but everyone should feel comfortable picking up any ticket. We always say that there’s no ticket that comes into Slack that any individual can’t get an answer for.

But rather than passing the ticket from person to person or department to department, it’s about finding out those answers for yourself. Forming relationships with other people in the company. If you’re just passing a ticket along, you may never meet those people, but if you take some time to work with people in other departments – our engineers, our product teams – it strengthens the support you can give.

Jeff: That is one of our core values, to be an owner. Even when you get to a point where you have to split things out to a model where some people are more technical and some people are less technical, you can still be smart about how you get conversations or tickets to the right person at the right time.

What are your best approaches for differentiating between free, paid, and anonymous users? Should Service Level Agreements (SLAs) be different for them?

Judy: When a ticket comes in, we can see if it’s from a paid customer, and what plan they’re on. There are SLAs at Slack for our paid plans. What’s important to us are our internal SLAs, which are the same across the board. We aim to get back to all customers within an hour, which is much lower than any of our public-facing SLAs.

All customers are equal. Some may be paying, but if we provide the same level of support to everyone, then maybe some of those free customers will become paid customers because of the level of support we’re providing. It’s a false dichotomy to say, “these people are paying, so let’s give them better service.”

Jeff: And free can be very different things. A free user that’s not quite committed and is still testing things out is much more valuable to you than somebody that’s been using the service for maybe nine months for free. You have to decide as a company what is right for you. Do you want to support that long-term free user or do you want to support the brand-new free user?

But I think you’re right, if you want to do right by the customer, and you want to maximize the chances that this free customer may eventually become a paid customer, then doing right by them is key.

Judy: Maybe they’ll never become a paid customer. But, if they’re using our product and having a great experience with it, and getting fast, top-quality support then maybe they’ll tell people about it.

Jeff: It’s the network effect. It’s something that’s pretty important, and maybe you don’t appreciate it in the early days: the difference between really strong, organic growth and growth that’s driven by marketing spend. There’s no way you can pay enough money for strong organic growth. It’s something you want to hang on to, and something you want to have a bias towards.

Sometimes support can be a tedious task, so how do you make your employees engaged and happy?

Jeff: It comes back to making sure you’re hiring the right people. If you’ve got your values, and your mission as a company and as a team correct, it goes a long way to making support not a tedious task. If you really believe in the product and the value it brings, you’re going to be excited to help people use it and get more out of it.

When you’re really under a deluge and it’s really busy, it can become a little bit monotonous, but there’s a lot of ways to break that up. It’s also the sort of job where as your knowledge in the product grows, you are able to answer a lot more and different things, and so you’re learning constantly and expanding your skill-set constantly. That was what kept me interested for a number of years giving frontline support.

Judy: It comes down to developing people, and giving them opportunities to develop their knowledge, skills and experiences. Some people will naturally see it as a stepping-stone into something else, others will join your team and really like what the sales or marketing teams are doing, while others love working on a support team that’s run really well, that is highly valued. That’s something we do really well at Slack. Support is a core function in our company, it’s something that is heavily invested in, and it’s at the forefront of everything we do, so it’s valued.

Learn the insider secrets of how we use Intercom to provide world class customer support at our live webinar on March 1st