
10 Quiz Questions on Quality Assurance

by OmniTouch International

In this short post I challenge the Reader to answer 10 Quiz questions on Quality Assurance.

Though the Quality Assurance function is most commonly situated within the Contact Center, its use and understanding can be broadened across any Customer Service environment, including hospitals, universities, government offices and more.

Those of you that I’ve worked with in classes or talks around the world know how much I like to give out these kinds of Quizzes.

And this Quiz is free, doesn’t involve any registration and your name won’t be added to any list.  We do this just to help & inspire!


The 10 Quiz Questions on Quality Assurance

Here are the 10 Quiz Questions on Quality Assurance.

Read each question carefully and then select the right answer, which is either a, b, c or d.

Yes – there is only one correct answer for each question.

 

1.  Which of the following is the BEST example of a Compliance Standard?

 

a. Greeting

b. Tone of Voice

c. Rapport Building

d. Empathy

 

2.  The 3 most common inputs used in Performance Standard design are:

 

a.  Customer Expectations, Profit Forecasts, Manpower Requirements

b.  Regulatory Requirements, Customer Expectations, Market Share

c.  Customer Expectations, Regulatory Requirements, Headcount Requirements

d.  Organizational Vision, Customer Expectations, Regulatory Requirements

 

3.  The best description of a Service Delivery Vision is:

 

a.  A statement that lists out all the Compliance Standards to follow

b.  It is usually the same as the Organizational Vision

c.  It describes the kind of service we will deliver around here

d.  It is most useful for Contact Center Agents

 

4.  If you rely too much on Compliance Standards your Frontline Agents will sound:

 

a.  Friendly

b.  Robotic

c.  Warm

d.  Compliant

    

5.  Which of the following statements is/are TRUE?

 

I.  All Performance Standards on an “Interaction Audit” form should have equal weight

II. First Contact Resolution can be difficult to calculate

III. Customer Expectations are the main source for selecting Performance Standards

IV.  A high First Contact Resolution rate is always good

 

a.  II only

b.  II and IV only

c.  II, III and IV only

d.  I, II, III and IV

 

6.  Which of the following are included in the formal documentation of a Performance Standard?

 

I.   The purpose or business reason for the standard

II.  The scoring logic for the standard

III. Examples of how the standard is to be used

IV.  A formal definition of the standard

 

a.  I & II

b.  I, II and III

c.  I, II, III & IV

d.  None of the above

 

7.  Which of the following statements is/are FALSE?

 

I.  Normally Quality Assurance does all the interaction monitoring & scoring

II. It’s best to let Quality Assurance do the Agent coaching

III. Team Leaders should focus mostly on productivity

IV.  It’s ok to schedule one full hour of coaching per week per Agent

 

a.  II only

b.  II and IV only

c.  II, III and IV only

d.  I, II, III and IV

 

8.  Which of the following statements is/are TRUE?

 

I.  All Calibration sessions should incorporate a Scorecard

II. Calibration sessions should be held once a month

III. In Calibration make sure everyone agrees on every Performance Standard on an interaction before moving on

IV.  It’s a good idea to include Agents in the Calibration sessions

 

a.  II only

b.  II and IV only

c.  II, III and IV only

d.  None of the above

 

9.  If you had only one way to achieve behavioural change through coaching which one would be the BEST?

 

a.  Give detailed graphs showing the performance of all Performance Standards over a 3 month period

b.  Ensure that Agents are coached without a scorecard at least one time per week

c.  Ensure that Agents are coached with a scorecard at least one time per week

d.  Allow Agents to coach themselves

 

10.  When it comes to monitoring which one of the following statements is TRUE?

 

a.  Side by side monitoring doesn’t work well because Agents can ‘fake it’

b.  Mystery Shopper is one of the formal methods of monitoring

c.  Mystery Shopper research is best done ‘in-house’ rather than outsourced to a research company

d. It’s always best to let the Agent self evaluate first


Would you like to know how you did?

If you’d like to know if your answers are correct we’re happy to help.

We’ve intentionally gone ‘low-tech’ here.  Once you’ve answered all 10 questions, just drop an email to me at daniel[email protected]

Let me know the question # and the answer that you chose (either a, b, c or d).

You can use the following format in your email to me:

  1. a
  2. d
  3. c
  4. c (and so on for all 10 Quiz Questions)

It helps also to tell me which Quiz you took. This Quiz is for Quality Assurance.

I always do my best to answer quickly and let you know which ones you got right.  And for the ones you may have gotten wrong I will let you know what the right answer is.

Thank you for reading and giving the Quiz a go!

Daniel

[email protected]

What kind of Customer experience does your Contact Center deliver?

by OmniTouch International

In this short article I discuss the question – what kind of Customer experience does your Contact Center deliver?

It ties together two of my favourite topics – Customer Experience & Contact Centers.  And it’s the title of one of my best Keynote talks for various conferences around the world.

The Contact Center in the context of Customer Experience

The Contact Center is a touchpoint that only some Customers will use across some subset of all possible Customer journeys.

And for some organizations it can be less than 1% of Customers who utilize the Contact Center touchpoint at all.

Daniel Ord delivering a keynote on what kind of Customer Experience does your Contact Center deliver?

For example, imagine that on the spur of the moment you decide to stay in a hotel this upcoming weekend.

You ask a friend to suggest a place, you do some research online and finish by booking a reservation on your mobile phone.  No Contact Center involved.

But with that said, when a Customer needs the Contact Center, it can be a real moment of truth.

An experience that has significant ‘weight’ in their overall perception of the organization.

So not every Customer interacts with the Contact Center.  But every interaction with the Contact Center is really important.

The Contact Center is the formal living room in a house

Formal living rooms may sound old fashioned – but they’re still around.

When I was growing up we had a formal living room to receive and entertain special guests or to use for special occasions.

It’s a room that’s always perfect. It’s got the best furniture, the best art and it’s always spotless.  Because even though it’s not used everyday, it must always be ready.

And I think of the Contact Center within an organization in the same way. It’s the formal living room in the house of your organization.

Not every Customer will need to use it.  Nor will every Customer journey involve it.  But for those Customers who do come into our Center, it’s our job to always be ready for them with our very best resources.

So what kind of Customer experience does your Contact Center deliver?

Much of the subject matter for our keynote talk – and for this post –  is based on nearly 20 years of conducting Mystery Shopper research – especially for Contact Centers.

And most Centers have a list of ‘Quality standards’ they use to train Agents and measure their quality performance – and which they hope or believe will deliver a great Customer interaction.

Simple examples of Quality standards include:

  • Clarity in presenting the product or service
  • The level of Human Touch on display
  • The use of branded language
  • The conciseness of the email
  • The sales or upselling skill

The possible list of Quality standards is endless because there is no industry-standard set that works for every Center.  If that were the case, all Customers of all organizations would be happy all the time.  And obviously that’s not the case.

And what we’ve found in our research work with Clients is that there is a positive correlation between the sophistication behind selecting and defining Quality directives and the resulting Customer experience.

Or put more simply – when there’s more thought, effort and rigour put into selecting Quality standards – the resulting Customer interactions are better.  And Agents benefit from being treated like adults – and not compliance machines who have to do things like say the Customer’s name three times.

Let’s look at some example Quality standards now.


So what’s an example of a Quality standard that was impressive?

One of our most interesting engagements was as the Official Mystery Shopper Evaluator for the Singapore Government.  Which basically meant mystery shopping the quality of different government agencies for phone, face to face and email interactions.

And one of the standards set by the Singapore Government was amazing.  They practiced what they called ‘No Wrong Door’.  Let’s say the Customer had a personal taxation question but accidentally contacted the housing authority.

In most countries, the Contact Center Agent would tell the Customer that they reached the wrong place and perhaps give the number for the correct place to call – if that much.

But with No Wrong Door in Singapore, the Contact Center Agent will either arrange a connection to the right Agency or arrange for the right Agency to get back to the Customer directly.

And in a public sector setting that’s amazing.

Having lived in multiple countries, I sometimes joke that trying to get public service assistance through a Contact Center could be branded as ‘Every Door is the Wrong Door’.

That is unless you’re fortunate enough to live in Singapore.

 

What’s an example that wasn’t so great?

Isn’t it funny that we can sometimes come up with the not so great examples more easily than the great examples?

Here are three.

The ‘Ready to Serve’ Quality standard

The Client, a major mobile phone manufacturer, wanted our Mystery Shoppers to evaluate if the Contact Center Agent we reached was ‘ready to serve’.

Did you just read that twice?  So did we.

The question we had was this.  How is it possible for us to tell if someone was ready to serve?  In our opinion, that sounded like something a Team Leader should be doing internally.

We went back and forth with the Client to get some clarification.  But eventually our Client contact wrote us and said – “Look Dan, just ask the Mystery Shopper to do it”.  Which was shorthand for ‘we’re done talking about this.’

So we sat down and came up with our own logic for this Quality standard and moved on.

But here’s the thing.  If senior management selects a Quality standard that even they can’t explain clearly – how can we expect an Agent to bring that to life in their Customer interactions?

The ‘Tai Chi’ standard

For a University Contact Center, the Agents were instructed to immediately redirect the Caller to the university website if it turned out that the information was available there.  

Don’t answer the Caller’s question.  If it was on the website then send the Customer straight to the website.

I decided to call it the ‘Tai Chi’ standard because they really just tai chi’d Customers to the website – and avoided answering the question.

And their rationale for this standard?

They had attended a seminar where the speaker told the audience they should focus on efficiency.  And to get people to use the website you have to force them to go to the website.

And you can just imagine the Customer Experience here.

After dialling, listening to the recorded announcements, punching through the IVR options, finally reaching a live Agent and asking their question – the Customer gets tai chi’d to the website.

Yikes.

When every Quality standard is measured as a Yes or No

For a few Centers we’ve worked with, management had decided that all or most of the Quality standards should be measured on a binary scale.  Yes / No.  1 / 0.  It happened or it did not happen.

Because they felt it was less complicated and easier to implement for them internally. That’s classic inside-out thinking.  Do what is easy for the Center – not necessarily for the Customer.

I bet you can imagine what those Agents sounded like when we listened to the calls.  Yup that’s right.

They sounded like robots.  There was no style, no articulation, no effort.

When every Quality standard is measured on a binary scale, that doesn’t just set a low bar for Quality.

There’s almost no bar for Quality.

 

There’s an art & science to selecting Quality Standards

There’s an art & science to selecting the right Quality Standards for your Contact Center.

If you’re lucky enough to have a well-defined Customer Experience Strategy in place that can help a great deal.  Because a Customer Experience Strategy describes the kind of experience you aim to deliver.

It provides a high level guide to coming up with the right Agent standards.

If you don’t have a Customer Experience Strategy, then a Service Delivery Vision can help.

A Service Delivery Vision is very much like a Customer Experience Strategy, but it tends to be focused only on the Customer Service function.  Whereas the Customer Experience Strategy is meant for the entire organization.

Now – if you don’t have a robust Service Delivery Vision then the next question is this.

How did your Contact Center choose its Quality standards?  What guided the decisions?

Here are some of the answers I’ve heard:

  • I think our Managers came up with these.
  • I think our Quality Assurance people came up with these.
  • The last Mystery Shopper provider we used came up with these.
  • Our Agents know how to talk to Customers – we don’t really use any standards.
  • I’m not sure but we don’t want to change them because everyone knows them already.
  • I’m new here and I don’t know – I was just asked to find a Mystery Shopper company.
  • We’ve used these for years and they’re ‘industry standard’ for our X industry 

Answers like these aren’t indicative of any level of sophistication in Quality standard selection & design.

And as I shared earlier, we’ve found a positive correlation between the sophistication of the Quality program and the Customer’s interaction experience.  And that makes complete sense.

Because when there’s more thought, effort and rigour put into selecting Quality standards – the resulting Customer interactions are better.


In closing

I may write a book sharing nothing but Mystery Shopper stories and the ins and outs of how to get Quality right.  There are just so many stories and learnings.

Because your Contact Center does deliver some type of Customer Experience.  The question is whether it’s the experience you wanted or planned for.

Thank you for reading,

Daniel

[email protected]


Whatever happened to First Contact Resolution?

by OmniTouch International

In this short article I consider this question – whatever happened to First Contact Resolution?

Last week I was judging Contact Centres

Last week I chaired a panel of Judges for a number of Contact Centre Awards entries.

One of the Judges on our panel asked several of the entrants –

“So how do you measure your First Contact Resolution rate?” or

“Based on the initiative you’ve shared, what were the changes to your First Contact Resolution rate?”

So that got me to thinking – is First Contact Resolution – or ‘FCR’ – still relevant in today’s Contact Centre?

 

First Contact Resolution is a multivitamin KPI

When I teach Operations I suggest Participants look at First Contact Resolution as a multivitamin KPI.

That’s because it does a few things for you.

FCR helps you to:

  • Improve Customer Satisfaction (through reduction of Customer effort)
  • Reduce cost (through reduction in unnecessary repeat contact volume)
  • Improve future Service Level (through reduction in unnecessary repeat contact volume)

No wonder FCR is referred to with such reverence in the Contact Centre industry.

 

But it’s always been hard to measure

I’ve seen First Contact Resolution formulas out there that would put Einstein’s formulas to shame.

They’re complex and require a lot of internal communication to understand and apply.

So, it’s worth considering why that’s so.

Everyone gets the general idea around FCR.  Assist the Customer to the degree that they won’t need to contact you again.  It sounds easy.

But the practical application is more complex, in part because there’s no industry standard for how to measure FCR.

Push-button KPIs

Many Contact Centre KPIs are push-button KPIs.  Push the button and you get your result.

Push the button and get your Service Level.

Push the button and you get your AHT.

Push the button and get the Occupancy rate.

You get the general idea.

But there’s no button to press for FCR.  It falls into the category best called ‘assembly-required KPIs’.

Think of some other assembly-required KPIs for a moment.

Employee Engagement, Customer Satisfaction, Turnover Analysis are all good examples.  To get at the data for these KPIs you can’t just push a button.

Getting at assembly-required KPIs requires you to design & implement a solid methodology for data collection & analysis.

Common data sources for First Contact Resolution

When it comes to FCR data collection, the most common sources are to:

  • Allow Agents to rate their own performance (not really recommended for obvious reasons)
  • Ask Quality Assurance folks to weigh in on FCR when they do their evaluations (this can be powerful and more on this soon)
  • Survey Customers and ask them if their need was met (but aren’t Customers getting tired of getting surveyed and is this the right question to ask?)
  • Run scans across the CRM system to see if a single Customer record shows multiple contacts for the same ‘reason’ within X time frame (based on business assumptions)
  • Use operational data (when the nature of the interaction is very transactional such as tracking shipments)
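
The CRM-scan source above can be sketched in a few lines of Python.  This is just a rough illustration – the record layout, the reason codes and the 7-day window are all assumptions for the example, not a recommended methodology:

```python
# Rough sketch of the CRM-scan approach to FCR: treat a contact as
# "not resolved on first contact" if the same Customer contacts again
# for the same reason within a chosen window.
from datetime import datetime, timedelta

WINDOW = timedelta(days=7)  # illustrative business assumption

contacts = [  # (customer_id, reason, timestamp) - hypothetical data
    ("C1", "billing", datetime(2024, 3, 1)),
    ("C1", "billing", datetime(2024, 3, 4)),  # repeat within 7 days
    ("C2", "roaming", datetime(2024, 3, 2)),
]

def fcr_rate(contacts):
    """Share of contacts with no same-reason repeat inside the window."""
    ordered = sorted(contacts, key=lambda c: c[2])
    resolved = 0
    for i, (cust, reason, ts) in enumerate(ordered):
        repeat = any(c == cust and r == reason and ts < t <= ts + WINDOW
                     for c, r, t in ordered[i + 1:])
        if not repeat:
            resolved += 1
    return resolved / len(ordered)

print(f"FCR rate from CRM scan: {fcr_rate(contacts):.0%}")  # 67%
```

Notice how many business assumptions are baked in – the window length and what counts as the ‘same reason’ – which is exactly why this source can’t stand on its own.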

And because there are pros & cons to each data source, you choose multiple data sources, assign a weightage to each one and assemble the results together to get an outcome.  The purpose of blending different sources together is to alleviate the inherent advantages & disadvantages of each individual source.

I think of it like making a stew.

You have to select a variety of ingredients, throw them into a pot in the appropriate ratios, stir well and season to taste.

It’s a robust but complex process.
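
If it helps to see the stew in code form, here’s a minimal sketch of the blending step.  The source names, rates and weights below are made-up examples – each Centre has to choose its own:

```python
# Blending several FCR data sources into one outcome.
# Each source contributes an FCR estimate (0.0 - 1.0) and a weight
# reflecting how much trust we place in it.  All figures hypothetical.
sources = {
    "qa_evaluations":   {"rate": 0.78, "weight": 0.40},
    "customer_surveys": {"rate": 0.82, "weight": 0.35},
    "crm_repeat_scan":  {"rate": 0.70, "weight": 0.25},
}

def blended_fcr(sources):
    """Weighted average of the individual FCR estimates."""
    total_weight = sum(s["weight"] for s in sources.values())
    return sum(s["rate"] * s["weight"] for s in sources.values()) / total_weight

print(f"Blended FCR: {blended_fcr(sources):.1%}")  # 77.4%
```

The weighting is what alleviates the weaknesses of any single ingredient – a source you trust less simply moves the final number less.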

 

So how can we address some of this complexity?

It helps to remember that FCR is ultimately a measure of quality.

Sure – FCR helps reduce unnecessary repeat contacts – and that’s cool.

But at its heart Centres pursue FCR to help Agents create great conversations with Customers.

Conversations that address spoken and unspoken needs – not just deliver transactional answer-based service.

So with that direction in mind, how can we improve our FCR delivery while mitigating the complexity inherent in assembly-required KPIs?

 

Define what First Contact Resolution looks like for each of your Top 10 enquiry types

Every inbound Centre has a Top 10.   The Top 10 ‘reasons’ a Customer contacts you.

While your Top 10 changes over time, these enquiries easily represent 60% – 80% of your monthly contact volume (excluding one-off events of course).

So rather than looking for a magical or ‘industry standard’ FCR rate, take your FCR magnifying glass down to the enquiry type level.

For example, if your Enquiry Type #1 is ‘Questions on room rates’, you’d sit down with a small group of folks and consider what FCR can and would look like.

What has to be conveyed, whether explicitly asked for or not, in that conversation.

But be careful.

Except for highly transactional enquiries you can’t rely exclusively on your internal determination of what FCR would look like.  You’re going to have to consider FCR from the Customer perspective as well.

And here I always suggest you do some qualitative research.

Bring in some real Customers.  Buy them lunch.

Ask them about their needs, expectations & wants (both expressed and unexpressed) when they ask about room rates.

I don’t see how we can talk about Customer-centricity without actually talking to real Customers face to face.

There seems to be a tremendous amount of fear or skepticism or just plain lack of know-how around qualitative research.  That’s an article for another day.

Remember that if you pursue this Top 10 approach – your monthly FCR will fluctuate over time – in part due to changes in the enquiry mix.

For example, if in Month 2 – as compared to Month 1 – you got more volume for an enquiry type where FCR is ‘easy’ to achieve, that will weight up your overall FCR rate in Month 2.  You can’t simply assume this is an improvement in Agent performance – which is what folks tend to believe when they see FCR rates inch upwards.

So the key here is to be able to articulate why overall FCR rates change from month to month – was it a change in enquiry mix, a one-off event that weighted results up or down, or did Agent Quality improve or decline?  These are all potential factors.
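
To make the mix effect concrete, here’s a tiny worked example.  The per-enquiry-type FCR rates are identical in both months – only the mix shifts – yet the overall rate still climbs (all figures invented):

```python
# Overall FCR as a volume-weighted average across enquiry types.
# Agent skill is unchanged - only the enquiry mix shifts.

def overall_fcr(mix):
    """mix: list of (share_of_volume, fcr_rate) per enquiry type."""
    return sum(share * rate for share, rate in mix)

month1 = [(0.50, 0.90),  # 'easy' enquiry type: 50% of volume, 90% FCR
          (0.50, 0.60)]  # 'hard' enquiry type: 50% of volume, 60% FCR
month2 = [(0.70, 0.90),  # easy type now 70% of volume - same 90% FCR
          (0.30, 0.60)]  # hard type shrinks to 30% - same 60% FCR

print(f"Month 1 overall FCR: {overall_fcr(month1):.0%}")  # 75%
print(f"Month 2 overall FCR: {overall_fcr(month2):.0%}")  # 81%
```

Nothing about the Agents improved – the six-point jump comes entirely from the enquiry mix.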

By the way – it’s good to know that if your FCR rate is consistently high (let’s say high 80s and 90s) that could be a sign of a poor self service strategy.  Why are Agents getting such simple enquiries which naturally lend themselves to a higher FCR rate?

That’s why I always smile (and grimace) inside when I hear a Centre say that their FCR rate is in the 90-percent range.  That’s almost always bad from a self service strategy perspective.

As Centres shift the simpler enquiries to self service you see FCR rates naturally decline overall.

 

Accept that not every enquiry type might ‘qualify’ for First Contact Resolution

By the way – it may turn out that some of your Top 10 can’t achieve FCR for some reason.  That happens.

But in these cases I ask myself what has to be conveyed or gathered in that conversation to make the ensuing process as effective as possible – even when the overarching goal of FCR can’t be achieved from the Customer’s perspective.

Earlier this year a Contact Centre Manager from a travel company told me that FCR is a mindset and that mindset training would be enough to raise their FCR rate.

But I disagreed.

Yes – having a vision for FCR and putting it front and centre in your Agent’s performance basket matters.  But it’s not enough.

You’re going to have to get a bit more granular – and the Top 10 approach is a practical way to do that.

 

Ask yourself – does my current metrics system align to First Contact Resolution?

Contact Centres are important touchpoints within an organization.  But sometimes that very (self) importance leads to decisions which are good for the Centre but not necessarily good for the Customer.

Let me explain what I mean from a metrics perspective first.

If your Centre focuses heavily on Average Handling Time (AHT) as an Agent efficiency metric or on # of calls produced by the Agents you’re not really considering the Customer journey – you’re looking at what’s good for you.  Short call = lower cost (goes the reasoning).

That’s a touchpoint perspective.

FCR by its nature implies that we take the time needed to get the job done.  To provide the Customer with what they should know – whether explicitly asked for or not.

I’ve written extensively on Average Handling Time but for purposes of this article – if due to your Centre’s metrics perspective your Agent is more focused on quantity or time taken, it’s quality that takes the hit – and that includes  a hit to FCR.

Don’t get me wrong – cost efficiency is great.  But every financial model I’ve worked on shows that reduction in future unnecessary contacts saves more $$ overall than trying to shave 30 seconds off current calls.
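
As a back-of-envelope illustration of that trade-off – every figure below is hypothetical, so treat it as a sketch of the reasoning rather than a real financial model:

```python
# Hypothetical comparison: shaving 30 seconds off every call vs.
# eliminating unnecessary repeat contacts through better FCR.
calls_per_month = 10_000
cost_per_minute = 1.00   # fully loaded cost of one handling minute, in $
aht_minutes = 5.0        # average handling time

# Option A: shave 30 seconds (0.5 min) off every call.
savings_shorter_calls = calls_per_month * 0.5 * cost_per_minute

# Option B: lift FCR so that 15% of calls (unnecessary repeats) never happen.
repeat_share_avoided = 0.15
savings_fewer_calls = (calls_per_month * repeat_share_avoided
                       * aht_minutes * cost_per_minute)

print(f"Shorter calls save: ${savings_shorter_calls:,.0f}/month")  # $5,000
print(f"Fewer calls save:   ${savings_fewer_calls:,.0f}/month")    # $7,500
```

With these made-up numbers the repeat-contact reduction wins – and it avoids the quality hit that chasing handling time tends to cause.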


 

Customers think in Journeys – not in Touchpoints

McKinsey writes that Customers think in journeys – not in touchpoints.

There’s a beginning, a middle and an end to a journey.  Some journeys go from start to finish and never touch the Contact Centre.

For other journeys the Contact Centre is a key participant – and important to the Customer’s overall perception.

In Service Design you learn that the various touchpoints need to work in harmony together – to avoid dissonance or distress.  So it makes sense to evaluate the harmony across the journey – not just look at what happens ‘inside’ the Centre.

 

Have your Agents been trained on Customer journeys?

I don’t mean journey mapping – that’s not needed at the Agent or Team Leader level.

I’m talking about sharing the motivations and experiences that led to the Customer contacting the Centre.  What was their mood, what was their ‘job to be done’ – what was the role of the Centre in helping the Customer achieve their goals?

On the other side of the interaction – where will the Customer go next in their journey?  Is there some way we can help them accomplish that better?  What can the Centre bring to the table to deliver a standout role in the Customer journey?

When I do Frontline training I often ask – “Do you know what your music on hold is?” or “Have you experienced your own IVR?  Your own Delay Announcements?”.

Because the Contact Centre Customer Experience doesn’t begin when you start talking (or typing).  It begins earlier upstream.  When the Customer begins to think and feel that they have to contact you.

Nine times out of 10 the Agents hadn’t spent time studying the Contact Centre journey – much less the Customer journey.

I think this represents a real opportunity for training and discussion at the Agent & Team Leader level.

 

Should you pursue First Contact Resolution?

My personal belief system around First Contact Resolution is this.

It doesn’t make sense to implement an elective process where the costs and effort of the process aren’t outweighed by the benefits delivered by the process.

If you can prove out that your complex but solid methodology to get at metric-oriented FCR is yielding dividends – then by all means go for it.  Just keep Quality as your North Star for putting together your FCR program – it should always be aligned to what the Customer would say.

So I’m never surprised or judgemental when I meet Centres that don’t specifically measure FCR.  That puts me into the minority I think.

Lately I’ve seen some Centres take a less metric driven approach to FCR that I admire.  It’s also been quite effective for them.

They build the concept of FCR into their Service Vision & Principles.

If you haven’t heard of a Service Vision or Service Principles, they’re essentially a set of statements that answer the question – “What kind of service do we deliver around here?”

For example, if one of their Service Principles is ‘to be helpful’ – they consider all the ways they can be helpful to Customers (and each other) across their various interactions.  The successful behaviours  that enable ‘being helpful’ become codified across the Centre.  Culturally ingrained.

And the use of the Top 10 enquiry type approach works wonderfully here.

Measurement-wise – the use and impact of  these helpful behaviours are picked up in the normal Contact Centre monitoring processes through Quality Assurance, Team Leaders, Mystery Shopper providers and the like.


In closing

I think FCR still has relevance in today’s Contact Centre.  That’s simply because it has to do with making Customers lives better through letting them know all that they need to know to achieve their goals.

And I think there are alternative ways to achieve the multivitamin benefits inherent in FCR.

If you can prove that your robust FCR measurement system yields results then well done – and keep it up.

But if robust measurement systems are a bit out of reach for your Centre, driving FCR-style behaviour through your Culture & Quality program is a viable alternative as well.  Service Visions & Service Principles are relevant for every Centre.

Thanks for reading!

Daniel


When good people follow bad Contact Centre process – a story

by OmniTouch International

In this short article I look at an example of how otherwise ‘good’ people follow bad Contact Centre process.

Sitting around our workshop table, one of the Participants – a former Contact Centre Agent from a Philippines-based BPO – shared.

“Dan – it starts like this.

QA walks over to our station and while we’re talking to a Customer they give us the time out sign.  That’s their signal telling us to wrap the call up quickly so they can conduct our side by side coaching session.

That time out sign approach is a little off-putting but you have no choice but to get used to it.  

After they settle in and connect their headset to our phone, they pull out a scorecard.  

And as I log back into the system and receive my next call, they quietly mark their paper while I’m talking.  

When the call is done, I log back out and they talk me through each tick-box they made.

Mostly I just hope that my score is a ‘pass’ because if it isn’t, they can just go on and on about my mistakes. 

So of course while they’re sitting next to me I do everything in my power to achieve a pass.

I never knew that there were ‘right’ and ‘wrong’ ways to do side by side monitoring.  Your course is the first time I heard this.

I only have my own experience to go by.  And it wasn’t a good one.”

There’s a lot that’s wrong in that story

At this point in our workshop, when the story gets shared, we’re talking about the power of the side by side method for monitoring & coaching.

The relationship building, the power of personal connection – the time to build trust.  The opportunity to make the time to spend with the people who work for you.

But in the many years I’ve taught this method – admittedly one of my favorites – I find that very few practice it at all (we have no time!), or they practice it in a way that damages the relationship rather than strengthening it.

There are a few things wrong in this story – and practices like these are more common than you’d think.

  • Using hand signals to summon people is rude – these should be reserved for animals – not human beings
  • Using a scorecard at a side by side session makes no sense – talk about frightening
  • Everyone’s faking it here – especially the Agent who is put in a no-win situation
  • The entire point of helping someone do ‘better’ has been lost
  • The focus on what went ‘wrong’

But the QA person in this story isn’t the villain

It’s easy to say – oh – the QA person you’re describing is the singular villain in this story.

But you’d be wrong in most cases.  Because what happens is this.

Good people readily conform to and carry out bad processes.

To ‘fit in’, to ‘get the job done’, to ‘show they’ve got the stuff’ for advancement & promotion.

And to be fair –  it may be the only way they know because that’s all they’ve ever experienced or been taught.  I see this a lot in the Contact Centre industry.

Even former Agents – who disliked everything we’ve just talked about – will readily jump in the saddle and carry on a legacy process that’s broken.

The villain in this story is the bad Contact Centre process.  In this case around side by side monitoring & coaching.

It’s not so great from a values perspective either

As a side observation to this story there’s an impact on ‘culture’ here too.

It’s likely that this Philippines-based BPO has the word ‘respect’ in their core values and if not ‘respect’ then something similar and equally lofty sounding.

We ‘respect’ each other, we ‘respect’ our Customers, etc.  The posters are everywhere.  And here are pictures of all of us at our annual Team building showing our respect for each other.

But culture is nurtured through the actual behaviours of people at work.

Especially those in leadership and professional roles.

Summoning people with hand gestures and scoring them when they’re trying to serve a Customer aren’t really brilliant examples of respect.

So if you’re after building a ‘culture’ (and today who isn’t) it’s a worthwhile effort to filter your processes – and the way they’re executed – through the lens of your values.

Is the Contact Centre industry really that mature?

I once received a comment from a reader who said – “Dan, why do you constantly write about the Contact Centre industry?  It’s a very mature industry already.”

And that comment made me think.  Sure – it’s a mature industry.

But do we always run it in a mature way?

Thanks for reading!

Daniel


When you coach you’re either helping or keeping score

by OmniTouch International 1 Comment

When you coach you’re either helping or keeping score.  In this short article I explain the difference between the two.

We measure everything!

In the Contact Centre industry we tend to be obsessed with measuring things.

From Occupancy rates through to Net Promoter Score we have dashboards and dials for everything.  (Even though not everything matters.)

And we have a whole special set of measurements reserved just for Contact Centre Agents.

When we’re able to influence and guide our Agents to better Productivity, Quality & Attitude, life is good.

And measuring progress quantitatively along the way is fine.  It’s really important to let people know how they are doing.

Measuring Quality

One of the most important processes in the Centre is Monitoring & Coaching.

We monitor Customer interactions, document our findings and talk to the Agents about their performance.

Great Monitoring & Coaching improves Quality, drives better Customer Satisfaction and delivers higher Employee Engagement.

It’s a multivitamin process with lots of great benefits.

But only when it is well designed.

There are many questions to answer to create a great Monitoring & Coaching process

The Monitoring & Coaching process is more complex than it first appears on paper.

  • Who should monitor interactions?
  • How often should we monitor?
  • What do we monitor for?
  • Who makes the rules for defining and calibrating Performance Standards?
  • How often should we listen, how should we listen, what do we listen for?

And when it comes to Agents –

  • Who should talk to Agents?
  • With what frequency should we talk to Agents?
  • What is the role of Quality Assurance?
  • What is the role of the Team Leader?
  • When or how should a score be involved?

Wow – there’s a lot involved.  But there are some answers too.

Let’s focus in on the use of scoring.

What is the role of the Scorecard?

Let’s zoom in on the questions around scoring.

  • What is the role of the Monitoring ‘Scorecard’?
  • Do I have to use it every time I speak with my Agent about their interaction?
  • Do I as a Team Leader use it or does Quality Assurance use it?

You’re either helping or you’re keeping score

In our Client work, we find that both Team Leaders and Quality Assurance have an unhealthy attachment to the scorecard.

Every quality discussion with an Agent involves a score.

Even side by side sessions – the rare times they seem to be conducted – involve a scorecard.

Isn’t this all rather disheartening and unnecessary? And typically all the Agent wants to know is the score.  Or ‘did I pass or not pass’?

That’s not a formula for improvement.  And a sure sign there is confusion between helping or keeping score.

What do we mean by that?

Scorecards are wonderful tools for gathering quantitative data.

Providing a developmental summary of scores across randomly selected interactions can be a great tool for Agent performance trending.

Here’s your trend here.  Here’s your trend there.  The big picture of performance and what contributes to it.
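As a purely hypothetical illustration of that kind of trending summary (the months, the scores and the simple monthly averaging here are all assumptions, not a prescribed scorecard format), a developmental trend might be assembled like this:

```python
from statistics import mean

# Hypothetical scorecard results for one Agent: (month, score out of 100),
# gathered from randomly selected, scored interactions.
scores = [
    ("Jan", 72), ("Jan", 68),
    ("Feb", 75), ("Feb", 79),
    ("Mar", 81), ("Mar", 85),
]

# Group the scores by month...
by_month = {}
for month, score in scores:
    by_month.setdefault(month, []).append(score)

# ...and average each month to see the big-picture trend,
# without attaching a score to every coaching conversation.
monthly_trend = {month: mean(vals) for month, vals in by_month.items()}
print(monthly_trend)  # {'Jan': 70, 'Feb': 77, 'Mar': 83}
```

The point of a summary like this is the direction of the line over time – not the score of any single conversation.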

But scoring on a day to day basis in the Centre can inhibit growth.

Imagine your Agent comes to you and says –

“Boss, I’d like you to help me with my communication skills. Can you sit with me and listen to a few of my calls and give me your thoughts?” 

You reply –

“Sure, give me a minute to get my scorecards – I’ve got to score everything I hear and that we talk about – be right there…”

I don’t think you would say this.

Even writing these lines makes me cringe.

The role of a Coach within the context of transactional coaching is to help their Agent get better and better at what they do.

Since when did helping someone get better involve a score?

Scorecards don’t change behaviour

A Scorecard is a judging tool.

It tells you how you did.

Just like watching the scores presented by Olympic Judges after the skater has skated, or the diver has made their dive.

They tell you how you did.  But they aren’t designed to help you get better.

It makes me sad when Quality Assurance people tell me that all they do is issue scorecards and hope that Agent quality performance improves.

Dream on.

But helping people changes behaviour

What the best coaches do is sit with their folks – on a regular basis – and help them get better.

They understand that helping is something they do for their people.

“Here’s where you did well.  Here’s where you can improve.”

With no score attached. And why would you need one?

And the more you help someone – the better they will score when the time comes.

In closing

When people ask me how many interactions they should monitor I ask them to rephrase the question.

“How many interactions will you monitor for scoring purposes and to provide trending?” 

“And how many interactions will you conduct to help your Agent get better?”

Then add the answers to these two questions together to get your answer.
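To make that closing arithmetic concrete, here’s a minimal sketch with purely hypothetical volumes (the numbers are yours to set for your own Centre):

```python
# Hypothetical monthly monitoring plan for one Agent.
scored_for_trending = 4   # interactions monitored with a scorecard, for trending
helping_sessions = 6      # unscored side by side / coaching sessions, to help

# The total monitoring volume is simply the sum of the two answers.
total_monitoring = scored_for_trending + helping_sessions
print(total_monitoring)  # 10
```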

Thank you for reading!

Daniel

Funny things Contact Centre Managers ask their Agents to do

by OmniTouch International No Comments

This short article provides a humorous and perhaps disturbing look at how Contact Centre Managers ask Agents to do funny things.

Especially in the context of interacting with Customers.

There’s a well understood process

There is a well understood process Organizations use to select which behaviours they want Agents to display during Customer interactions.

Sometimes called KPIs, Performance Standards or CX Standards, these management-selected behaviours let Agents know what matters most during Customer interactions.

Example behaviours include:

  • Tone of Voice
  • Branded language
  • Empathy
  • Product know-how
  • Objection handling

The potential list is infinite.

And the final selection of these core behaviours is based solidly on the organization’s CX strategy, Corporate strategy and/or Customer Service strategy.

A lot of work goes into selecting the right behaviours, keeping them up to date and making sure everyone understands the ‘why’ behind each one.

But that work pays off in multiples as relevant quality goes up and good things like experience and advocacy happen.

But some Centre Managers choose to circumvent the process

In what I think represents a misguided attempt to deliver ‘a Customer experience’, management sometimes asks Agents to do funny things.

Let’s start with one of my favorite examples.

At an Asian bank, Contact Centre Agents who logged in for the morning shift were asked to say a version of the following at the end of their first call that morning.

“Mr/Mrs. XX, thank you so much for helping me start my day off so wonderfully.”

Really?

Sometimes it is hard to know where to begin on something as silly as this.

But let’s try.

First point of view – that of the Agent.

How many Agents would you guess supported the use of this behaviour?

Yup – none of them.  It felt odd and inauthentic.

That should have been the first clue that something wasn’t quite right.

Listening to how your Agents feel is called Voice of Employee, or VOE, and it is an important source of Customer understanding.

Secondly, let’s get practical.

What if the first Caller was angry?  Crabby?  Too little coffee intake as of yet?   Does the Agent still have to deliver the behaviour?

Another personal favorite

Another Asian bank – different country.

The Service Quality Team had engaged a ‘Customer Service Expert’ who convinced them that there was an industry standard for a smile.

A proper smile must show 12 teeth.

And they bought it.

And then they Mystery Shopped it.

Can you imagine the training session for the Mystery Shoppers?

“Ok guys – when the Banking Officer smiles at you be sure to count if 12 teeth are showing.”  

And can you imagine the final Mystery Shopper presentation to the Board?

“And ladies & gentlemen, we’ve got a problem – on average less than 7 teeth are showing and let’s not even talk about the  intensive dental work cases that we will report to you separately.”

This story is a little different

This story is a little different as it comes from Mystery Shopper research.

An international mobile handset manufacturer wanted to Mystery Shop their Frontline Agents.

The Mystery Shoppers were to dial in, ask a specific set of questions and record the conversations.

So far so good.

Because to Mystery Shop well you need to select and define the key behaviours to be measured as part of the program.

The Organization had a prepared list of behaviours which they turned over to us.

Behaviour #1 – “Was the Agent prepared and ready to take the call?”

So we asked – “Oh. How would a Mystery Shopper know if the Agent was prepared and ready to take the call?”

To which they replied – “The Mystery Shopper should be able to tell if the Agent was prepared and ready to take the call.  Score it.” 

Ah, ok.

And what’s up with this Small Talk standard?

Depending on your CX Strategy, your Corporate Strategy and your Customer Service Strategy, it may make perfect sense to implement a ‘Small Talk’ behaviour into your Agent set of quality standards.

Typically I see Small Talk expressed as “Have you had your lunch yet sir?” or “How’s the weather in Singapore today?”

An unrelated question added into the conversation with the intention to build rapport.

I’m not disparaging the standard.

If your organization went through the full and proper process of selecting and defining relevant standards and Small Talk presented itself – then by all means implement it.

But the Agents I meet tell me that the Small Talk standard was literally grafted on to their existing set of standards.  They felt they were being asked to do a funny thing.

When done right – appropriate small talk can elevate a conversation.

But when used at the wrong time, or in the wrong way – it sounds at best inauthentic and at worst – irritating.

Making it a compliance behaviour is almost guaranteed to be problematic.

At the end of the day you can’t capture the entirety of the Customer Experience in a single interaction

It’s well understood that the Customer Experience consists of the Customer’s perceptions across their entire experience with an organization.

And that sometimes that experience doesn’t even touch ‘Customer Service’ or the Contact Centre.

Of course, when it does touch Customer Service or the Contact Centre that interaction may have more emotional resonance than other types of interactions.

And that matters.

The management decisions described in this short article were not made by one individual.  A group or committee of smart people sat around a table, decided that these were good ideas and implemented them.

But grafting on Agent behaviours in the hope they deliver a positive Customer Experience shouldn’t involve Agents saying funny things.

Thanks for reading!

Daniel

“Image © Matt Madd/Dentist” https://costculator.com/dentist/


Customer Experience lessons we learned and apply in our Art Gallery

by OmniTouch International No Comments

We opened our Art Gallery in 2011

We’ve learned a lot of Customer Experience lessons in the 7 years since we opened our art gallery, The Art Club Singapore.

Fred Gowland

After months of set-up, crafting our mission and developing our Artist roster, we held our first gallery reception in Singapore in October 2011, featuring California Artist Fred Gowland (shown in photo).

Owning both a CX/Service consultancy, OmniTouch, and an art gallery, The Art Club Singapore, is not as dissimilar as it might sound on paper.

It’s clear that both great service and the consideration of an artwork to purchase are emotionally rich activities.

We learned to apply Customer experience lessons in our work at The Art Club Singapore and in this article, we share some of those lessons.

The (6) Customer Experience Competencies

The Customer Experience Professionals Association (CXPA) has defined 6 competencies for mastery in Customer experience.

The Missing World, Giada Laiso

In this article, we share our learnings via the (6) competency framework.

Our reasons for using the 6-competency framework approach are simple:

  1. We wanted to work through the mental exercise of applying the (6) competencies to a real business – our gallery
  2. We wanted to help the Reader ‘digest’ the (6) CX competencies for their own use and benefit

(Photograph shown, The Missing World, Italian Artist, Giada Laiso)

The (6) competency areas defined by the CXPA are:

  1. Customer Experience Strategy
  2. Voice of the Customer, Customer Insight & Understanding
  3. Experience Design, Improvement & Innovation
  4. Metrics, Measurement, and ROI
  5. Organizational Adoption and Accountability
  6. Customer-Centric Culture

https://www.omnitouchinternational.com/our-services/ccxp-practice-quizzes

In this article, we cover our learnings across the first 2 CX competency areas.

#1 – Customer Experience Strategy

From the beginning, we knew what we did not want our Customer experience to be.

We did not want to be a stereotypical gallery with white walls, antiseptic displays and fashionable assistants.  We found that approach to be intimidating and ‘unhelpful’.

Particularly in our local market where art appreciation and widespread collecting was still in a developmental stage.

That allowed us to focus on the kind of experience we did want for our Customers.

We began with the company name, The Art Club Singapore.

The Art Club Singapore

The ‘Club’ was important to us because it represented a space where people could come together to share –

  • Eagerness to explore art
  • Enjoyment of socializing in a home-like space
  • The joy of just being themselves, without the pressure to purchase

The logo

Once we had the gallery name, we worked through the design of the logo.

The Art Club Singapore

The logo was designed to represent the three stakeholders involved –

  • Red for the creative Artists that are often misunderstood in their work.
  • Blue for the people that want to appreciate art but may not know where, when, why and how to go about it.
  • Green for the Art Club Singapore that brings the circle of Artists (red) and Art Appreciators (blue) together.

The process of creating the gallery name and the logo helped us clarify the role we wanted to play in the lives of our Customers.

We further refined our intended experience through the following guidelines which have served us well:

Dietmar Gross

For our Artists

  • We would only show the work of professional, full-time Artists, known in their own markets
  • We would show work from Artists based in the Americas, Europe and Australia that we had collected ourselves and who we knew personally

(Briefwechsel, Oil on Belgian canvas, German Artist, Dietmar Gross)

For our Guests

  • We would use our space for public education, benefits and art talks as well as Artist shows
  • We would provide a place where experienced Collectors would mingle with folks who had never attended a gallery event before

The Art Club Singapore

For our Collectors

  • We would provide an eclectic collection of pieces across countries, mediums and Artists in an atypical gallery space
  • We would provide access to the Artists, allowing Collectors to immerse themselves in the Artist’s story

After we crafted our name, logo and guidelines we shared them across our small group, our Artists, our Partners and publicly with our Guests & Collectors through social media and marketing communications.

The Mission Statement

Next came our Mission Statement: The Art Club Singapore, where Art & People meet.

The Art Club Singapore

The Mission made it clear to us what we were supposed to ‘do’ or provide every day and we consider it to be an integral part of our Customer Experience strategy.

If I had to sum it up – our Mission Statement = our Customer Experience Strategy while our name, logo and guidelines represent our ‘Corporate Strategy’ and brand.

The Customer Experience Strategy really matters

When it comes to the Customer Experience strategy, it was helpful for us to put first things first.

Who were we, what was our intended Customer experience, and what ‘purpose’ would we refer to as we evolved over time?

Ingela Johansson

Of course, as a small business we had a major advantage.

We didn’t have hundreds or even thousands of Employees to immerse in our intended Customer experience.

But the process we went through and the learnings gained from doing it right are relevant to anyone pursuing Customer experience as a business strategy.

(With Swedish Artist, Ingela Johansson)

Now let’s turn to Competency #2 – The Voice of the Customer, Customer Insight & Understanding.

#2 – Voice of the Customer, Customer Insight & Understanding

While our Customer Experience strategy was clear to us – tying that to who our Customers were and what gallery Customers really want from their visit was an ongoing learning experience.

The Art Club Singapore

We stepped back and used our CX/Service consultancy credentials to look at the entire gallery experience.

How Customers would learn about us, what would entice them to come to a talk or event.

And perhaps most importantly, how could we orchestrate a gallery event that exceeded their expectations?

And one that resulted in them telling more people about us?

(Photo of Singapore Deputy Prime Minister Teo viewing our donation to charity)

Who were our Customers?  The role of Personas

Within months of launch we were able to document our Customer Personas.

The Art Club Singapore

Here are some of the Personas we identified:

  • The Cultured Expat

Married couple, 40s – 60s, very comfortable with their own taste, had purchased art before, looking for an experience not just a ‘purchase’

  • Students / Early working years

20s through early 30s, sought ‘date night’ events, sought a chance to brush up their art viewing skills, appreciated being treated with respect

  • The Professionals

    The Art Club Singapore

30s through 50s working professionals, looking for a new experience, like to learn, interested in refining their art viewing skills, very practical

  • The Gallery event mainstay

All ages, attend all gallery events, catch up with other ‘regulars’, food & drinks matter

  • The Socialite

Generally female, already a collector, events were a chance to dress beautifully, good in a crowd setting, loves being in photos, great with social media

The Art Club Singapore

We continued to refine our Personas based on observation, listening, asking questions and studying our ongoing email correspondence and social media posts.

Even Guest Visitor books provided a lot of rich commentary as to what people enjoyed about their visit with us – we learned to have those prominently featured at all events.

This was an important learning for us because we found people tend to be super direct and specific when signing a Guest Book whereas that same person may be more ‘polite’ in a face to face discussion.

Ethnographic Research

Daniel Ord, The Art Club Singapore

Ethnographic research – which refers to observing Customers in their natural setting – was easy for us as the Guests entered ‘our’ environment.  We simply had to watch and compare notes at the end of the evening (sometimes that was 2AM!).

Examples of Customer Insight that we picked up from our Guests included:

The Cultured Expat persona was interested in having us come to their home and provide design advice, as well as an end-to-end hanging and arranging service.

They also typically had a home in their country of origin packed with art but were keen on smaller pieces they could display in their current home in Asia.

The Art Club Singapore

We learned that the Students / early working years persona was eager but had limited self-confidence in how to look at a piece of art and interpret it.

Some useful tips and advice went a long way with this group – as well as the free art lectures.

From our local Guests we learned that certain subject matter, including some animals and depictions of human faces, were considered unlucky.

Fred Gowland, Green Fox

With one series of Foxes done by Fred Gowland we were told that the term ‘fox’ was a colloquialism for a husband-stealer.

It seems that a married woman might not want a fox in her home!

What would our Guest go through?  The role of the Journey

Again, our CX/Service consultancy background served us well.

Arman Fernandez, The Art Club Singapore

We understood that the Customer journey for an art gallery event did not begin when our Guests walked through the door.

It began with receiving our invite, marking the calendar, figuring out how to reach our venue and even what to wear (maybe especially what to wear!).

We realized that each event needed to be unique – so we gave each event its own theme.

  • Travels of Fred Gowland – paintings created through extensive travels of the Artist.
  • Raise the Pink Lantern – An event focussing on the LGBT community in Singapore.
  • The Monk wears Prada – Paintings of Buddhist monks exploring urban Singapore
  • Masterful European Bronzes – A Society Collection

(Bronze shown, La Mandoline, French Artist, Arman Fernandez)

Where the event was a lecture we came up with a new offshoot of our logo, so our Guests would know that the next event was specifically a lecture.

The Art Club Singapore

Even though we had a clear curatorial direction –  the Artists and types of work we wanted to show – the Voice of the Customer encouraged us to try new things and expand our offerings in new ways.

Thank you for reading!

Daniel Ord, Marcus von Kloeden, The Art Club Singapore

Daniel and Marcus – Co-Founders, The Art Club Singapore / Owners, OmniTouch International

What I learned judging this year’s UK Complaint Handling Awards (2018)

by OmniTouch International No Comments

This year I had the great opportunity to judge at the UK Complaint Handling Awards, covering the latest in complaint handling practices.

After my return from the event my colleagues and friends asked me – “Marcus, what did you learn?”

First, I had the chance to meet hundreds of people across various industries, including Telcos, Banks, Insurers, Utility providers and many Government agencies.

That’s always one of the great things about participating at an Awards event.

The complaint handling practices shared in this article come from my role as a Judge.

As a Judge, I had the chance to meet all the Entrants, read their written submissions and listen to their respective face to face presentations to the panel.

As a result I learned some of the latest complaint handling practices out there in award winning Organizations.

Across all the Complaint Handling Team Entrants, 3 important things stood out 

There were 3 key complaint handling practices that I picked up from interacting with all the Complaint Handling Teams.

To improve the Customer complaint handling ‘practice’ within their Organization, they focused on:

  1. Analysing data from the Customer point of view
  2. Improving internal & external processes to reduce complaint volumes & time
  3. Targeting the Ownership of the complaint

Let me give some more detail on each learning

1. Analysing data from the Customer point of view

The Complaint Handling Teams indicated that they analyzed thousands of recorded calls, reviewed piles of surveys and read through thousands of emails and contact forms submitted by Customers.

Their strategic purpose was clear – to understand their Voice of the Customer.

The Complaint Handling Teams told us that, though other departments and functions did their own sets of analyses, they felt that pure focus on the Voice of Customer was missing.

So they created their own analysis function.

Guess: how important is it really to the Customer that you use their name three or five times during a call?

Isn’t it more important to listen and understand what their concern actually is?

Entrants started to read between the lines – from the Customer point of view – and acted on what they learned.

2. Improving internal & external processes to reduce complaint volumes & time

Equipped with the results of their analyses, the Complaint Handling Team went to their Management to propose changes to processes or rules that caused Customer discomfort.

Some of the process changes the Complaint Handling Teams shared were –

They took Average Handling Time out of the measurement of Agent performance.

Agents suddenly had the freedom to listen, to react and to find a solution together with the Customer.

Escalations decreased dramatically.  Agents started to become more personal in their conversations.

Frontliners & Agents were officially empowered.

They were given the power to decide on the spot what to do for the Customer instead of getting permission from their superiors.

That helped to ease processes for the Customer, and complaints could be resolved in a single contact.

Adding empowerment to the job makes it more interesting, enjoyable and challenging as well.

Some Complaint Handling Teams introduced new technology into the Contact Centres to support staff members.

Technology was introduced to support Team members to read the Customer’s history, react proactively, share information with other departments and schedule follow ups.

Training around the new technology and processes was scheduled and conducted so Employees were prepared before using the new solutions.

That eased the transition for the Customer and reduced stress for the Employees.

Interestingly, many complaints stemmed from questions about bills and statements.

The Complaint Handling Teams shared they were in the process of breaking this big topic down into workable parts.

3. Owning the Complaint

The Complaint Handling Teams also shared how important the concept of ownership was to complaint resolution.

That took one of two forms.

Either the Customer gets one point of contact who deals with them all the way through.

Or the Customer’s history is made available to everyone in the Organization, and they are tasked to work together to resolve the issue.

While technology supported or ‘helped’ it was the process and the people that put things into action.

This really impressed me.

In closing

This year’s UK Complaint Handling Awards (2018) have shown that listening to & understanding the Customer’s Voice, making data more accessible to Agents, simplifying processes and taking ownership lead to big improvements in Customer Experience.

Aside from the many KPIs, like NPS, that were presented, the most impressive part of the presentations was the feedback gathered from real Customers.

These Customers shared how impressed they were with the good care (‘Ownership’), the easy processes (‘History availability / System improvements’) and the fact that someone listened to and heard their issues (‘Data analytics’).

The Customers felt they were heard and more importantly helped.  They seemed to like using email to share their compliments.

In all cases, these simple changes dramatically reduced painful Customer Journeys throughout the organisations.

The Customer experience score for the organization went up dramatically.

I am glad to share that investments in data analytics, new processes and Frontline training have really paid off.

All the Complaint Handling Teams were able to demonstrate a financial ROI to back up their work.

They all retained Customers, gained new business and were recommended by their now happy Customers to others.

Isn’t this reason enough to start thinking about this?

Thank you for reading!

Marcus von Kloeden

Email Writing Tips for better Customer Experience – the Ritz Carlton, Santa Barbara

by OmniTouch International No Comments

In this article we share specific email writing tips for better Customer Experience and Service Recovery using a real case study at the Ritz Carlton, Santa Barbara.

The Ritz Carlton Hotels.

From their webpage:

100 years of history. Countless rewards. With an unshakeable credo and corporate philosophy of unwavering commitment to service, both in our hotels and in our communities, The Ritz-Carlton has been recognized with numerous awards for being the gold standard of hospitality.

Santa Barbara, California.

The city lies between the steeply rising Santa Ynez Mountains and the Pacific Ocean. Santa Barbara’s climate is often described as Mediterranean, and the city is referred to as the “American Riviera”.

So, the expectations for service at the Ritz Carlton Bacara in Santa Barbara, California are understandably high.

The situation

On a recent holiday in the U.S. I spent time with my sister Diana who lives and has her business in Santa Barbara.

The Ritz Carlton Bacara, Santa Barbara

One evening she turned to us and said – let’s have a leisurely dinner at The Ritz Carlton Bacara tonight – to which we all emphatically nodded yes.

The following day, my sister sent a detailed email to the Ritz Carlton to share our experience.

The purpose of this article is not to complain about service.

I’m not a fan of articles where Customer Service experts write to vent frustration or unhappiness under the guise of promoting Customer experience.

My intention in this article is to share email writing tips for better Customer experience and Service recovery efforts.

The email exchange with the Ritz Carlton provided a perfect and personal case study.

Here is the email my sister (the Customer) sent

Good Morning,

I am writing because I felt compelled after a bumpy visit to the resort yesterday in Santa Barbara and I thought it would be helpful for your managerial staff to be made aware of so many missed opportunities for our visit to have been special.

I have family in town from Singapore and Germany and felt a visit to the Bacara would cap off their trip spectacularly.

I made reservations for the Bistro at 5:30 to enjoy a leisurely time outside during a typically slow time for restaurants.

An hour after the reservation was made, Stephanie called from the Bistro and left a message to inquire whether we would want inside or outside, which I appreciated.

I called back a few minutes after her message and couldn’t reach anyone in the Bistro for a few tries (the PBX call bounced back to the operator).

When I reached her, I verified that we would be outside and see her in an hour and a half.

The Ritz Carlton Bacara, Santa Barbara

We parked in valet and entered the lobby where an absolutely spectacular floral arrangement greeted us. This was going to be great.

We reached the Bistro and the hostess stand was empty.

We waited a few minutes and Stephanie came up and greeted us and led us to our reserved table for four which was only set for three.

We sat and several minutes later the fourth setting arrived.

Approximately 10 minutes later bread arrived but no bread plates, so we waited another 10 minutes to give our order and at that point asked for bread plates.

Our Server was sweet but only came to the table a couple of times in the two hours we were there.

When we ordered our food, she didn’t ask about drinks, and on our side, we forgot to order them.

The food came 45 minutes later, and the chicken/risotto dish was amazing (my visitors had this and they loved it).

I was beginning to get frustrated because of the wait times between visits to our table so we asked for the bill and a person we hadn’t seen yet brought it.

We decided that rather than leave straight away, we would have a drink/coffee in the bar and get a change of scenery.

At the bar the bartender told us that there is no coffee available at their bar but that they would get one from the restaurant.

We settled in front of the fireplace in the lobby and 30 minutes or more passed without any word or visit from the staff, so we left.

I was so disappointed because I felt like there were so many missed opportunities to be treated like welcome guests.

I truly hope this beautiful setting can be matched by top notch service soon.

Thank you for the opportunity to share our experience,

Best,
Diana

Here is the reply from the Food & Beverage Director

From: “Lawrence Teatree”  (names are changed)
Date: April 16, 2018 at 1:50:17 PM CDT
To:” <[email protected]>
Subject: Your stay at The Ritz-Carlton Bacara, Santa Barbara

Dear Mrs. XX,

Thank you for choosing to stay at the Ritz-Carlton Bacara, Santa Barbara and providing your honest feedback.

Providing the highest level of hospitality is our number one priority and we sincerely apologize for falling short of meeting your expectations.

We have shared your feedback with the Bistro and Bar team to ensure the necessary guidelines are in place to improve the restaurant experience. I have also passed your comments to our Chef regarding the risotto! Thanks!

I do appreciate you giving us the opportunity to restore your confidence in Food and Beverage by speaking to me directly. Please let me know the best contact number and time to reach you, or you can call me at any time at 805 XXX XXXX.

Once again, thank you for your valued feedback and we hope to serve you again whenever your travels bring you back to Santa Barbara.

Lawrence Teatree
Food and Beverage
The Ritz Carlton, Bacara Santa Barbara

Here are email writing tips for better Customer Experience – documented within the body of the reply

The Subject Line

From: Lawrence Teatree
Date: April 16, 2018 at 1:50:17 PM CDT
To:” <[email protected]>
Subject: Your stay at The Ritz-Carlton Bacara, Santa Barbara

With regard to the Subject Line, we were not hotel guests at the Bacara. We were clearly dinner guests.

The Subject Line made it clear that Lawrence had either not read our email or was simply following standard (and robotic) protocols.

The Subject Line matters.  It should be well crafted.

The Opening

Dear Mrs. XX,

Thank you for choosing to stay at the Ritz-Carlton Bacara, Santa Barbara and providing your honest feedback.

We did not stay at the Bacara; we were dinner guests. So the Opening line is irrelevant at best, tone deaf at worst.

The Apology

Providing the highest level of hospitality is our number one priority and we sincerely apologize for falling short of meeting your expectations.

Lawrence is a Director of Food & Beverage.

Based on his title, the restaurant where we had dinner and the bar where we later tried to get coffee would both fall under his purview.

The email would have sounded a lot more personal if he referred to himself – “I” and not “we”.

For example:

I apologize that I and our Team fell short of meeting your expectations and that of your dinner Guests…

And by talking about himself and/or his Team, he would have demonstrated that he took ownership of the experience.

This Empathy statement would have sounded more human and sincere than “we sincerely apologize”.

If you need to use the word ‘sincere’ in a Customer communication, that’s already a red flag.

If you have to sincerely apologize, does that mean you have insincere apologies too?

The Corporate Speak

Now let’s get to the Corporate speak.

How does the following phrase help matters?

Providing the highest level of hospitality is our number one priority…

Is that so? Providing the highest level of hospitality is our number one priority?

The entire reason the Customer took the time and effort to write a long and detailed email is because that didn’t happen for her.

He might as well have written –

Providing the highest level of hospitality is our number one priority, except obviously what happened in your case…

When you make a mistake – you apologize first.

You don’t couch the apology in ‘corporate-speak’.

This statement, coming at the opening of the Empathy Statement, reduced the impact and sincerity of the apology.

It sounded robotic and scripted.

The Content

We have shared your feedback with the Bistro and Bar team to ensure the necessary guidelines are in place to improve the restaurant experience.

The Customer was very detailed.

She shared no less than 10 observations about the experience across both the restaurant and bar.

She took effort and time to help the Ritz Carlton improve and even references at the end of her email that “I truly hope this beautiful setting can be matched by top notch service soon.”

Lawrence’s reply did not address a single specific point out of the 10 raised – nor did he share any details of “ensuring the necessary guidelines are in place.”

Lawrence could have done so much to restore the confidence of the Customer.

While it may not be necessary to address each of the 10 points raised by the Customer, Lawrence could have better matched her effort.

He could have specifically shared what he was going to do with that information that had been given.

As an example – and with better service recovery in mind – he could have said –

“With regard to the number of settings at the table when you were seated (3 vs. 4), we have asked the Team that takes our reservations to indicate clearly to our Servers the number of diners expected and the preferred seating location.

I’m really glad you brought this to my attention.”

When you learn how to write an efficient & effective email, you learn that you need to address both the Tone of the Customer and the Content of the Customer.

This Customer deserved a better ‘Content Match’ than she received.  She put a lot of effort and detail into her email.

That was not reciprocated in the reply.

I have also passed your comments to our Chef regarding the risotto! Thanks!

This was a nice statement and showed that Lawrence read the email.

The Recovery

I do appreciate you giving us the opportunity to restore your confidence in Food and Beverage by speaking to me directly. Please let me know the best contact number and time to reach you, or you can call me at any time at 805 XXX XXXX.

This invitation to reach out to him is excellent and shows a personal touch.

The recovery would have been so much more effective if the overall email had been better.

The Closing

Once again, thank you for your valued feedback and we hope to serve you again whenever your travels bring you back to Santa Barbara.

The Customer is a long-term resident of Santa Barbara. Assuming that all your Guests are tourists or visitors doesn’t feel very welcoming to locals.

Lawrence Teatree
Food and Beverage
The Ritz Carlton, Bacara Santa Barbara

In closing

If you attend to Customers by email, it’s important to –

Know what your brand ‘voice’ is – and confirm that it sounds human.  The days of Corporate speak and roboticism in email writing are over.

In this new world where chatbots and AI Assistants sound friendlier than a real human being does, humans should sound more human!

Understand that email is a complex form of one to one communication.  Training and coaching really matter.

Ensure all your Customer channels are operating to the same high standard.

I hope this article has been helpful!

Daniel

Daniel Ord / [email protected]

How to be a better Service Quality Manager in 2018

by OmniTouch International No Comments

It’s a new year – and a great time for Service Managers to reflect on ways to deliver more value for their Teams & Organizations in 2018.

Here are some ideas.

More problem prevention, less problem solving

You’ve earned the right to be proud when your Service Staff competently handle Customer inquiries.

But you serve a higher purpose when you stop and ask yourself – why do we get these inquiries in the first place?

To solve that question requires courage.

The courage to get up, leave your desk, and traverse your organization to piece together the root causes of Customer contact.

Because for most Customer Care & Technical Support environments – the best contact is no contact.

The best Service Quality Managers work ‘up and out’ throughout the organization.  They don’t focus exclusively on their internal Staff performance.

More inspiration, less compliance

It’s so easy to get caught up in compliance.

Did Staff say the Customer’s name three times?  Did they wear black socks?

Sure, compliance matters – but it’s not the stuff of Staff inspiration or culture building.

In the best circumstances, Service Quality Managers help design ‘what kind’ of experience to deliver.

Then using vivid language – written with adults in mind – they create a ‘statement of experience’ that links the day to day activities of their Staff to the kind of experience the organization aims to deliver.

But crafting an inspirational statement of experience – while important – is only one aspect of inspiration.

The best Service Quality Managers are also inspiring people.

What they speak about, the way they speak, the rituals they build into the lives of their Staff.

The best Service Quality Managers understand the impact of inspiration and harness it through how they behave and what they do. 

More ‘let’s try new things’, less ‘we have to do it the same way’

As the new year begins, some organizations will begin planning for Wave 12 of their Mystery Shopper program, or Wave 23 of their Customer Satisfaction survey program.

They will trot out exactly the same programs they’ve always run – with the refrain that ‘we need to keep things the same for trending analysis’.

With rapidly changing Customer expectations, delivery methods and opportunities to learn and grow, it doesn’t make much sense to keep looking at the past when your Customers live in the present (and of course the future).

Again, courage is required.

The courage to challenge senior leadership to be brave.

To set aside endless trending and really dig deep to learn.

To stop looking at only numbers and respond to what Customers want, need and feel.

The best Service Quality Managers keep updated on evolving approaches & practices in Customer experience and are willing to try new things to learn and grow their effectiveness.

More ongoing Staff development

In so many workshops, Participants tell me, “Dan, I haven’t attended a workshop, training or developmental program in (fill in the blank) years.”

Many receive training at hiring and then, as the years roll by, they’re expected to organically ‘get better’ through repetition and tenure.

While I remind folks that they need to largely self-manage their own career these days – that’s not an excuse for organizations to forego Staff development.

The best Service Quality Managers understand that their Staff need to feel they are learning & growing to stay equipped and engaged.

More journey, less destination

Embedding daily, weekly and monthly rituals & storytelling practices keeps Service & Customer experience front and center in the lives of your Service Staff.

Look deeply into the frequency and intention of your rituals.

A once a year ‘Service Week’ or participation in an industry Awards activity – on their own – won’t be enough to effect profound change.

Whether it’s the morning huddle or a weekly Customer sharing session, it’s important to keep Customers front and center in the minds of your Staff.

If you commit to a monthly / quarterly Service ritual – then stick to it.

And use the ritual to focus on Customers – not organizational announcements.

The best Service Quality Managers know that meeting & exceeding Customer expectations is a journey, not a destination. 

They design & execute meaningful rituals that routinely bring Customers to life in the lives of their Staff & Organization.

I hope some of these ideas were useful for you.

Happy holidays!

Daniel

Daniel Ord

[email protected] / www.omnitouchinternational.com