Category Archives: Customer Experience

How To Not Change Your Product


The blog has been silent for way too long, I know. I will do my best to continue at a better pace until life intervenes again.

One of my guiding principles, at work and outside of it, has always been “no surprises”. I try to apply this principle to every interaction and relationship: customers and partners, executives and fellow employees.

Recently the validity of this principle was reaffirmed when an online service I use extensively eliminated a very valuable feature without notice. The change removed important data users had saved and, initially, did not offer any way to access it. The problem was fixed, in a very kludgy manner, only after several weeks, during which the data was not available for use. The official statement, released after the fact, claimed this was done in the name of user experience improvement.

The vendor’s customer support team provided a response along the lines of: “We have your data, you just can’t see it. We plan to make it available again, but don’t have a time estimate.”

Having encountered similar situations in the past, there are several guidelines I’ve always relied on when making product changes with potential impact to customers:

  • Plan Well – understand the change you are introducing, the variety of ways it will affect customers, and the number of customers impacted. SaaS vendors, especially, have more detailed knowledge of their customers than on-premises vendors do, and should use that information (see the sketch following this list)
  • Weigh the need – ask yourself whether the benefit offered justifies the impact to customers’ ability to derive value from your product or service
  • Seek customers’ input – invite customers to offer their perspective on complex or high impact changes. This can be done in a variety of ways, from 1:1 conversations through surveys all the way to open discussions on your community sites or during user group meetings
  • Communicate deliberately:
    • Provide sufficient advance notice – let your customers know when a change will take place, and do it well in advance of the change so that they are able to prepare for it. If your customers’ businesses follow a business cycle, aim for the low-activity periods
    • Explain the impact the change will have, list ways customers can minimize or eliminate this impact, and explain the benefits in case they are not immediately obvious
    • If customer data or their ability to access it will change in any way, explain the data’s disposition – what will happen to it, how it can be accessed, when it will be available for use again
    • Provide a mechanism to circumvent the problem – in the case above, allow customers to download a copy of the data for offline use
    • Offer a mechanism for feedback at each and every stage
  • Prepare for impact – arm your support team with answers and talking points. Some customers will invariably be unhappy about the change; ensure your support team is not facing them empty-handed
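
On the “Plan Well” point, here is a minimal sketch of how a vendor might estimate the number of customers a feature removal would affect. Everything in it is a hypothetical illustration – the usage_events.csv event log, the saved_reports feature name, and the 90-day window – not a prescribed schema:

```python
# Hypothetical sketch: estimate the blast radius of removing a feature
# by counting the customers who used it recently. Schema is illustrative.
import pandas as pd

# One row per feature interaction: customer_id, feature, timestamp
events = pd.read_csv("usage_events.csv", parse_dates=["timestamp"])

cutoff = pd.Timestamp.now() - pd.Timedelta(days=90)
recent = events[(events["feature"] == "saved_reports")
                & (events["timestamp"] >= cutoff)]

impacted = recent["customer_id"].nunique()
total = events["customer_id"].nunique()
print(f"{impacted} of {total} customers ({impacted / total:.1%}) "
      "used the feature in the last 90 days")
```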

Obviously, different demographics call for different methods. High-end enterprise vendors could schedule individual meetings with customers to prepare for high-impact changes, while those in consumer markets could send a single generic email with instructions. In any case, however, do not leave your customers in the dark!

NPS Again, This Time With Feeling?

Recently I came across two interesting posts discussing NPS®. Each of them seems to miss one or more important points about creating a sustainable customer survey program that actually produces tangible results.

First, and very interestingly, is Fred Reichheld, creator of NPS, on LinkedIn telling us to Stop Thinking Like a CEO (and Think Like a Customer Instead). If we think about the title for a minute we’ll realize that, unlike the customer, a CEO has responsibility for identifying weaknesses in the customer experience and taking action to fix them. In his post, Mr. Reichheld tells the stories of two CEOs who transformed their companies’ customer experiences. But did those CEOs do what Mr. Reichheld is asking us to do? No, they did not. They never stopped thinking like CEOs, but they did bring the customer perspective into the organization and, most importantly, acted on it to deliver a better customer experience.

In his post Mr. Reichheld claims that:

“The best approach is to ask customers on a scale of zero to 10 how likely they would be to recommend your products or services. This is what […] companies do every day to be loyalty leaders. By closing the loop with their customers and taking action on the reasons why customers love doing business with them or not […]”

But while the scale is detailed, the need to “take action” remains nebulous – how do we know what actions to take, and how do we verify their impact?

Another post that caught my eye last week was on the Bluenose blog, titled Driving Net Promoter Adoption: 3 Must Do’s. In his post Don MacLennan, Bluenose CEO, states that leadership commitment, organizational alignment, and customer follow-through are essential to improving a company’s NPS results, and he recommends engaging with customers as a survey follow-up in order to establish the cause of their dissatisfaction and eliminate it.

Some readers may ask whether there’s anything wrong with engaging customers to find out what’s causing dissatisfaction. The answer is that there’s nothing wrong with that, but when we think about a process-driven business we can’t rely on anecdotal evidence as the sole input to our improvement efforts, for several reasons:

  1. The plural of anecdote is not data – no matter how many customers the company interviews, they will be only a small portion of the customer base, and their opinions remain anecdotal rather than a complete picture
  2. Discrepancy between stated and actual behavior drivers is a well-known phenomenon, documented in numerous academic papers
  3. Converting anecdotal evidence into a sustained organizational effort to improve products and services is a challenge in its own right

With that said, what would be the best way to follow up on an NPS, or any other, survey and generate sustained improvement?

The answer should be no surprise to the blog’s followers. Survey responses, correlated with your operational data, provide the organization with very clear insights into actual (as opposed to stated) customer behaviors and their drivers. With these insights the organization can develop sustained improvement efforts based on the most critical elements of the customer experience, and gauge their impact on customer satisfaction as well as customer behavior. It’s doable, and usually easier than we tend to think, but it does require commitment and a change in the way we think.
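
To make “correlated with your operational data” concrete, here is an illustrative sketch that joins survey scores to support-case metrics and checks which operational drivers move the score. The file names and columns (score, resolution_hours, reopen_count, transfers) are assumptions for the example, not a real schema:

```python
# Illustrative only: correlate survey scores with operational case metrics
# to find the experience drivers behind satisfaction. Schema is assumed.
import pandas as pd

surveys = pd.read_csv("survey_responses.csv")  # case_id, score (0-10)
cases = pd.read_csv("support_cases.csv")       # case_id, resolution_hours,
                                               # reopen_count, transfers

joined = surveys.merge(cases, on="case_id")

# A simple first pass: correlation of each driver with the survey score
drivers = ["resolution_hours", "reopen_count", "transfers"]
print(joined[["score", *drivers]].corr()["score"].drop("score"))
```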

Net Promoter, NPS, and the NPS-related emoticons are registered service marks, and Net Promoter Score and Net Promoter System are service marks, of Bain & Company, Inc., Satmetrix Systems, Inc. and Fred Reichheld.

Sometimes The Extra Mile Is Free


Recently I visited a coffee shop while waiting to meet a friend. Walking in, I was impressed – the place was large, well lit, and tastefully decorated. The food and pastries in the display cases seemed attractive, with quality ingredients and professional preparation. Clearly, those who designed and built this business aimed high, and their prices reflected that. As a long-time observer of service operations I started wondering – could they deliver on the promise of the decor and the food?

Considering I only had tea, I couldn’t tell anything about the food except that other diners seemed to be enjoying it. However, from the service perspective I left with a few nagging points that directly apply to other service organizations:

  • The food is served on nice porcelain dishes, which the crew clears once the customer has left. But they do not wipe the tables; consequently, each table had some crumbs – very few, but noticeable. When clearing the tables they do not use a tray, so we got to see one of the crew walking slowly with a pile of dishes on her arms, trying not to drop them. Solution – get a tray, and put a little wet towel on it. Clear the dishes onto the tray, wipe the table, and be done.
  • When you order a drink, they take your order at the counter and bring the drink to the table. But the crew has no clue how to walk straight while holding a cup. It was very comical watching one of them holding a saucer with both hands and walking slowly, trying not to spill the coffee. Solution – rehearse, work on your muscle memory.

In both these cases, not only did the operation look unprofessional, but the employees were visibly embarrassed.

  • Last – my tea was delivered to the table in a cup. There was nowhere to dispose of the teabag, nor was there sugar or a stirrer on the table. I had to get up and get them myself, negating the point of table service. Solution? You guessed it. Bring a saucer, and place a few bags of sweetener on each table.

Now, there is a common thread between all these points. Fixing them would cost the business absolutely nothing, but requires an observant manager with a burning desire to keep improving the service. This raises the question: how much improvement could each of us make to our support operations at zero cost, while helping our employees increase their skill and professionalism? How much better can we make them? What if we took the time to observe our organization from the outside, and inspect every move and every action as they are perceived by the customer? Sadly, in many situations this seems to be everybody’s last priority.

Enterprise Support Maturity Model


Several weeks ago Harvard Business Review published a blog post by Vikram Bhaskaran, titled “Customer Support Hierarchy of Needs”, which I read carefully but found unfocused and lacking in depth. Consequently, I felt challenged to develop a better, more coherent model for enterprise technology support.

Eventually I came up with the following as a base model (see the bottom of this post for the naming choice). It is still a work in progress and as it is being refined I’ll post additional versions to the blog.

[Figure: the maturity model pyramid, progressing from Chaos at the base through Managed, Professional, and Proactive/Predictive to Invisible at the top]

The model attempts to capture the two most critical investments any organization makes, technology and people, and show the path they progress along in order to deliver a more comprehensive experience. It is based very loosely on the concepts introduced by the various CMM models and progresses through a series of maturity stages.

We can all understand that the various maturity phases will not have clear transitions. In fact, most support organizations I have seen tend to have different segments of their operations at different maturity levels. The common theme for all, however, is the desire to make progress along the path to a more mature level of operation. Obviously, as companies expand, efforts may be directed to other pressing concerns, such as global expansion, supporting additional product lines, or adding third parties to the support chain. However, the need for continuous progress up the maturity levels is shared by most support executives I have met.

As a first step, I’d like to define a few of the essential characteristics of each phase identified earlier:

  • Chaos – Usually exists only very early in a company’s life: ad-hoc processes, and a lack of infrastructure, metrics, or dedicated staff. Support is provided by engineering teams, and personal heroics are frequently key to resolving problems of any significant urgency or complexity. Third-party partners may provide some support to specific customers, but the interaction is mostly at the technical level.
  • Managed – Basic processes exist, along with entry-level staff to manage non-technical customer communication (e.g., case opening, status requests). Technical interactions are still handled by engineering. Basic case tracking and multi-channel capabilities exist. Metrics are used but usually focus on cost and volume. Basic offering parameters are defined and communicated to customers.
  • Professional – Processes are more elaborate than those at the Managed level, and may include various escalation guidelines. Technical staff is added to the support organization, and the infrastructure becomes more sophisticated in capabilities as well as utilization. Customer satisfaction is added to the metrics. Collaboration with engineering is sporadic. Additional services may be offered, such as Support Account Managers or extended coverage (e.g., around-the-clock or weekend) to supplement the standard offering.
  • Proactive / Predictive – Knowledge management is used, and customers are alerted to potential problems through proactive notification. Close collaboration with engineering teams ensures high-impact or widely encountered problems are addressed rapidly. Vendors can predict which customers or hardware components will encounter certain failures and act accordingly.
  • Invisible – Support is integrated into other organizations in the company, creating a continuous customer experience through all points of interaction. Multi-channel interaction through forums and other social and traditional channels is managed consistently. Customer intimacy is used to eliminate failures, reduce their impact on customers, and increase the value derived from products.

How does this model fit with your experiences? Surely everybody who has been in this business for some time has a similar concept of the progress support organizations make over time. It does open the door to understanding the different perspectives on this progress, which we’ll discuss in future posts.

The main reason I chose to call the model the Enterprise Support Maturity Model, as opposed to the frequently used Hierarchy of Needs, is that while an individual’s needs have a hierarchy, an enterprise customer has choices. We can imagine a hungry person giving up dreams of self-actualization while searching for food. A customer, on the other hand, will go looking for other, more competent vendors.

Looking at Surveys? Here Are Some Practical Tips


Many companies run customer surveys, and sadly many enter this effort without giving sufficient thought to the benefits they intend to derive. When asked, a typical response would be “I’d like to know how we are doing against the competition” or “I’d like to know what customers think of us / our engineers / service / products” and so on. While those are indeed good reasons, there are a few additional points to think about before starting a survey. This post will not get into the philosophical discussion of whether or not to use NPS, but will focus on the practical side of creating a survey that functions successfully.

When thinking of surveys, wanting to know how you are doing, especially against the competition, is interesting, but as we have argued repeatedly on this blog and elsewhere, it is a futile exercise. The best use of a survey is to identify improvement opportunities and reinforce strengths based on customers’ feedback, and then to confirm the effectiveness of the actions once they have been put in place.

The two basic operational goals to consider when running a survey are, first, getting a sufficient number of responses and, second, obtaining meaningful and actionable information from those responses. When those are accomplished, organizations can incorporate the insights derived into their operation to drive sustainable improvement. This post will focus on the first two objectives – driving a high response rate and asking meaningful questions. In a subsequent post we will look at what we can do with the insights provided and the organizational implications.

Driving the response rate

There are several factors that influence survey response rates:

  • Effort – the number of questions to answer, their complexity and whether they are in the customer’s own language
  • Benefit – does responding to the survey affect the service?

Let’s look at effort first. We have all heard of survey fatigue, which occurs when customers receive too many survey requests in a short period of time. This is more common with enterprise technology companies, where specific individuals are responsible for the vendor relationship and handle the bulk of the interactions. Many companies, therefore, use a pacing mechanism that limits survey requests to one every certain period of time (anywhere between one and four weeks seems to be common practice). Second is the number of questions: the fewer questions there are, the more likely customers will be to respond. Many articles and books discuss survey design; I recommend you become familiar with at least a few of them. Last is language – companies that operate globally should invest the effort and translate their surveys for their main customer constituencies.
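
A minimal sketch of such a pacing rule, assuming a 21-day cooldown and a simple in-memory record of when each contact was last surveyed (both choices are illustrative, not a recommendation):

```python
# Sketch of a survey pacing mechanism: never survey the same contact
# more than once per cooldown window. The window length is an assumption.
from datetime import datetime, timedelta

COOLDOWN = timedelta(days=21)

def should_survey(contact_id, last_surveyed, now=None):
    """True if the contact was never surveyed or is past the cooldown."""
    now = now or datetime.now()
    last = last_surveyed.get(contact_id)
    return last is None or now - last >= COOLDOWN

history = {"alice@example.com": datetime.now() - timedelta(days=5)}
print(should_survey("alice@example.com", history))  # False - surveyed recently
print(should_survey("bob@example.com", history))    # True - never surveyed
```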

Now, let’s discuss benefit. Customers will be more inclined to respond if they see tangible improvement in the service based on their feedback. Many companies choose to sit back and hope customers will notice those improvements; however, it is a good idea to help. First, engaging in some self-promotion is never a bad thing – if you publish a newsletter or have a blog you can include a section on recent improvements, or you can have a small section on your support home page showing the same. Second, use every opportunity to talk to customers and tell them about improvements made in response to surveys. This is especially true when you call in response to poor surveys and listen to their comments (you do make those calls, right?).

Getting meaningful data

Frequently we hear comments such as “I get the most useful input from the free-text questions”. Well, there is a reason for that. Too often, survey questions focus on the agent providing the service rather than on the customer’s experience. These surveys ask about courtesy, professionalism, and so on, rather than about the solution to the customer’s problem (was it timely? accurate? easily usable?) and the amount of time and effort it took to reach it. Esteban Kolsky has written a number of good posts on customer surveys; start here and explore his blog for excellent insights. Now, by all means have a free-text field for the customer to provide input, but never confuse the squeaky wheel with the voice of your entire customer base.

Asking the right questions is only the first step in making data meaningful. Once the survey is operational, there are three points we need to focus on:

  • Analyzing the data in order to understand what we need to fix or reinforce
  • Taking the right actions to drive improvements
  • Confirming or correcting the actions through a feedback loop

Edit – A long-time friend reminds me of an excellent way to increase survey responses. When closing a case, support engineers should inform the customer that a survey may be coming their way, and that they’d be very grateful if the customer responded.

How To Not Ask Questions


A coworker of mine from many years ago had as one of her guiding principles “Never ask a question if you are not ready for any possible answer”. I was reminded of her, and her favorite principle, while discussing customer surveys and customer feedback recently.

It is common practice in service situations to inquire about the quality of the service, and I am sure every person who is even remotely associated with supporting enterprise technology has participated in numerous discussions about surveys, customer forums, and similar initiatives. But how much thought was given to making the results actionable and actually acting on them? Here’s an example I am sure many of us have encountered while having a meal in a restaurant. Inevitably one of the waiters will come by the table and ask if everything is OK. Our answer, very frequently, is “yeah, everything is great”, even if, in fact, nothing is. I am sure we have all wondered what would happen had we provided honest feedback in response – “the soup was cold and the bread was stale”.

Now, as anybody who has visited more than a single Yelp page knows, negative feedback is an opportunity to make things right. But that opportunity is too frequently neglected, leaving the customer to vent over the internet and creating significant damage to the restaurant’s reputation.

So, how should we address customer feedback? Here are some pointers:

  • Only ask questions that will provide you with actionable information
  • When feedback is received, act as soon as possible and keep the customer informed
  • Clarify negative feedback, e.g., when a customer responds with a poor survey, call and discuss their concerns
  • Report back to the customer community through a newsletter, website updates, or similar means on all the changes you made based on their feedback, encouraging them to provide you with more information more frequently

Complicated? Not really, but it does require constant focus and commitment.

What methods are you using to collect customer feedback and encourage dialog?

Customer Support or Customer Success?


I recently read an interesting discussion on the ASPI LinkedIn group about the role of the customer success manager and the differences between this role and others in the customer support world.

A number of people have written about customer success management in the past, for example, Mikael Blaisdell, here. Yet the more I read about the customer success manager role, the clearer it has become that there is neither a broadly accepted definition of the role nor any agreement on how people in this role accomplish their task. But thinking about it in the context of the direction the technology world is taking may hold the key to understanding the progression from Customer Support to Customer Success.

Historically, and until early in the previous decade, vendors’ engagements with their customers were very structured. Customers made large, long-term investments in technology upfront. The deployment plan was then developed jointly between the customer’s IT and business teams and the vendor’s sales and professional services groups (or a third-party implementation partner). The role customer support played in this context was mostly reactive and focused on rapid break-fix problem resolution.

The growing adoption of SaaS delivery brought a very different approach to adopting new enterprise applications. There was no need for a project plan developed together with the IT team, no server provisioning or storage space. All you needed was a credit card and you were in business. With BYOD you did not even have to install anything on your company’s laptop. All that’s required is a credit card, a need to accomplish something, and a sufficient amount of curiosity to go look for an app or tool that will address that need.

This change introduced a new and very different challenge that SaaS vendors had to address and which traditional enterprise IT vendors never faced. In the SaaS world, every small purchase (or download of a free application) could eventually develop into a large deployment. The conditions those vendors faced were challenging:

  • The initial investment made by customers is small, and usually carries no minimum term commitment
  • The purchasing individual may not have a complete view of the needs, or of the product’s ability to fulfill them
  • Products are installed to experiment with rather than to achieve a clear goal

These lead to a reduced level of commitment on the customers’ behalf, and therefore make it easy to walk away from the product when even the smallest difficulty or challenge is encountered. Obviously these conditions are much more common with small, tool-like services than with major enterprise products. An individual at an organization may, for example, experiment with Dropbox, but a salesforce.com implementation would still be done using the traditional enterprise technology model.

Taking all these into account, we can begin to see the difference between the engagement model in traditional enterprise deals and that of SaaS delivery. It is safe to assume that while large enterprise implementations are driven by sheer momentum, visibility, and active management, small ad-hoc purchases are often made and then neglected, as discussed earlier. The role of the customer success manager would then be identifying installations that are not progressing according to certain pre-determined success criteria (see this post on the Totango blog for a very good discussion) and helping those customers overcome challenges, understand the capabilities of what they have just bought, and make the most of it, hopefully expanding the product’s use and stickiness.
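
As a purely hypothetical sketch of what “not progressing according to pre-determined success criteria” could look like in code – the thresholds and field names below are invented for illustration, not drawn from any real product:

```python
# Hypothetical health check: flag accounts 30+ days in with shallow usage.
# Thresholds and field names are illustrative, not a real product's schema.
from dataclasses import dataclass

@dataclass
class Account:
    name: str
    weekly_active_users: int
    features_adopted: int
    days_since_signup: int

def at_risk(a: Account) -> bool:
    """Old enough to have ramped up, but usage or adoption is still low."""
    return (a.days_since_signup >= 30
            and (a.weekly_active_users < 3 or a.features_adopted < 2))

accounts = [Account("Acme", 1, 1, 45), Account("Globex", 12, 5, 45)]
print([a.name for a in accounts if at_risk(a)])  # ['Acme']
```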

Obviously, as the terminology around customer success proliferates, we see increased creep into areas traditionally covered by other roles – support account managers or technical account managers, engagement managers, and others. But I still believe the clear distinction between customer support and customer success is that the latter is more proactive and focused on implementation and usage expansion, while the former is more reactive and break-fix focused.

What is your opinion around this topic? Have you implemented customer success management in your organization? How have you defined the role and goals?

The Only Question You’ll Ever Need, or NPS, But Carefully.


David Kay has an excellent post discussing Net Promoter Score surveys. While I agree with David on many of the points he makes, I also have a few reservations about NPS, especially its usability in enterprise technology environments, and I’d like to share both in this post.

For those unfamiliar with NPS, it is a methodology developed by Frederick Reichheld to measure customer loyalty using the response to a single question: “How likely are you to recommend [product or company] to a friend or a colleague?” It was introduced in the December 2003 issue of Harvard Business Review, in an article titled “The One Number You Need to Grow” (subscription required). According to the article, it is the single best predictor of future revenue growth.
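
The arithmetic behind the score is worth spelling out: respondents scoring 9–10 count as promoters, 0–6 as detractors, and 7–8 as passives; NPS is the percentage of promoters minus the percentage of detractors. A small worked example:

```python
# Worked example of the standard NPS calculation.
def nps(scores):
    """NPS = %promoters (9-10) minus %detractors (0-6); passives ignored."""
    promoters = sum(1 for s in scores if s >= 9)
    detractors = sum(1 for s in scores if s <= 6)
    return 100 * (promoters - detractors) / len(scores)

# 5 promoters, 3 detractors, 2 passives out of 10 responses -> NPS of 20
print(nps([10, 9, 9, 8, 7, 6, 3, 10, 5, 9]))  # 20.0
```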

Over the years I have had multiple exposures to NPS surveys in enterprise technology environments, and found a number of things I like about it. Unsurprisingly, those are similar to what David lists in his post:

  • NPS offers companies a single point of focus to present to the different interested parties with a simple story
  • Separation of the respondent population into promoters, detractors, and neutrals, and the single index created to represent progress across the spectrum
  • A clear and concise methodology that is easily understandable and relatively easy to follow

However, there are a number of things I do not like about many NPS deployments. First is the confusion between “the only number you need to grow” and “the only question you need to ask”. Second is the inability to derive actionable data from the would-recommend question even in the simplest of customer experiences, let alone in a complex, multi-faceted environment such as enterprise technology. Namely, NPS may be a leading indicator of corporate performance, but it certainly is a trailing indicator of many decisions and actions which need to be monitored and tweaked to produce the desired results.

To demonstrate the actionable data point, let’s imagine a simple consumer product where the expectations are clear and the likelihood of failure very low. I am sure every one of us, with a little effort, can imagine half a dozen points which would cause them not to recommend the product (price, availability, and functionality come to mind immediately). Each of those would require a different action from the vendor, yet there is no way to determine the reason, and the subsequent action, from the response to the NPS question without additional data and analysis. Now, how many more interaction types and touchpoints does an enterprise technology provider have with its customers?

Thinking about the suitability of NPS for the enterprise technology environment, we add further complexities on top of assessing which segment of the product’s performance spectrum we are encountering. Interestingly, one of these points was made in the original HBR article: “The ‘would recommend’ question wasn’t the best predictor of growth in every case. In a few situations, it was simply irrelevant. In database software or computer systems, for instance, senior executives select vendors, and top managers typically didn’t appear on the public e-mail lists we used to sample customers. Asking users of the system whether they would recommend the system to a friend or colleague seemed a little abstract, as they had no choice in the matter.” The extension of this argument is that growing the NPS number for all respondents regardless of role may be a futile, unfocused effort that discounts the opinions of the decision makers due to sheer scale. After all, most companies have only one CIO and very few decision makers overall at the strategic level. Also, from a support executive’s perspective, our ability to control all those touchpoints is limited. How many of you have ever tried to change the structure of an invoice your company sends to its customers? How successful were you?

So, should you implement NPS?

My recommendation is that you should, but very carefully. There are several things to keep in mind when doing so, and the bigger and more complex your engagements are, the more critical these points become. First, ensure every stakeholder understands the way individual actions and decisions drive NPS, and that it is not a number you can grow per se. Second, understand the critical touchpoints, their influence on the individuals within your customer organizations, and their influence on the overall desired business outcome. Third, understand your respondent population and the impact they have on your company. Last, but not least, use NPS as a banner number to rally the troops, but be aware of the complexities behind it and be ready to tell a more nuanced story than The Only Number You Need to Grow.

Thank you for reading through this long post. I’d love to hear some opinions on this very loaded subject, whether you have implemented NPS successfully or your implementation did not achieve the desired results.

What Can Enterprise Support Learn From Auto Rental? (Part 3 of 3)


So, what did we have in our car rental story (part 1 and part 2)?

  • Lack of familiarity with regular customers
  • Redundant requests for information
  • Confusion of onstage and backstage activities
  • Asking questions that the company has no intention of acting on

How many similar failures are built into each of our processes? Let’s think about possible situations we are all familiar with:

  • Transitioning cases between individuals in different tiers (here’s something I wrote in the past)
  • Incomplete records of customer entitlements, support levels, and preferences. These may have multiple causes, ranging from poor maintenance of records all the way to the legacy of acquisitions with different offerings and practices
  • Being unprepared for a meeting or conversation with the customer. How many of us have employees getting on a phone call with a customer without reviewing the case record or possible resolutions?

Many of these habits were discussed in the Harvard Business Review article, Stop Trying to Delight Your Customers. What can we do to correct them?

What Can Enterprise Support Learn From Auto Rental? (Part 2 of 3)


In the previous post we looked at the service experience of a car rental company. We can identify four interactions between the customer and the rental company:

  • Registration
  • Reservation
  • Collecting the car
  • Returning the car

Keeping in mind that the customer’s main objective is to be on their way, let’s now examine the experience through its various stages, and whether each of those adds any value to the customer:

| Stage | Interaction Details | Platform | Value? |
| --- | --- | --- | --- |
| Registration | Personal details, driving license, credit card, insurance needs, car type, other details | Web | Yes |
| Reservation | Pick-up time and place, duration, deviations and changes from saved information | Web | Yes |
| Counter | Review personal details and credit card information; upselling (GPS, child seats, refueling) | Personal | No |
| Parking Garage | Review insurance options; inspect car for damage and gas level; sign paperwork | Personal | No |

We can see that the two face-to-face encounters add little value to the customer.

Many car rental companies have eliminated those steps, at least for regular customers who can walk directly to their cars and drive away immediately. Looking at the content of those two interactions we can see several patterns we all recognize:

  • Repetitive requests for information
  • Confusion of onstage and backstage activities, as well as supporting processes, causing delays and extra expenditures

In the next and final post: the lessons from this experience that we can implement in the enterprise technology business.