
Appendix C: Metrics Matrix

The comprehensive metrics matrix below (also available in .xls format) provides a snapshot of everything we have learned to date about measures. We are not recommending that organizations use all of these; this is a list of possible measures and some of their attributes.

 

Each table lists the measure, its Adoption Phase (1-4), the Audience (Team and/or Individual, marked with an X where the measure applies), Data Sources, View, and Use/Comments.

**Activity (leading indicators)**

*DO NOT PUT GOALS ON ACTIVITIES!*

**Assisted (support center)**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Article create/modify | 2 | X | X | KM tool | Trends | |
| Reuse of others' Articles | 3 | | X | | Trends | |
| Competency levels | 1 | X | | Manual | Trends | |
| Participation | 2 | X | X | CRM and KM tools | Trends | |
| Workflow alignment monitoring | 3 | | X | Manual (Coach assessment) | | KCS Competency level |
| Currency trends (obsolete, modify) | 3 | X | | KM tool or data mining tool | Patterns | Use a data mining tool that identifies patterns in the KB based on the content, not on predefined buckets (manual classification is marginally useful); see the sketch below this table. |
| Incidents closed | 1 | X | X | CRM tool | Trends | Number of assisted support cases coming into the support center. |
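As an illustration of content-based pattern analysis (rather than predefined buckets), the sketch below clusters article text with TF-IDF and k-means. It is a minimal example, assuming Python with scikit-learn and a hypothetical list of article texts; it is not a prescribed tool or method.

```python
# Minimal sketch: surface content-based patterns in KB articles.
# Illustrative only; assumes scikit-learn and a hypothetical `articles` list.
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.cluster import KMeans

articles = [
    "Cannot connect to VPN after upgrading to version 9.2",
    "VPN client disconnects intermittently on wireless networks",
    "Error 500 when saving a report in the admin console",
    "Report export to PDF fails with a timeout error",
]  # hypothetical article texts pulled from the KM tool

vectorizer = TfidfVectorizer(stop_words="english")
X = vectorizer.fit_transform(articles)

# Cluster on content rather than on predefined categories.
kmeans = KMeans(n_clusters=2, n_init=10, random_state=0)
labels = kmeans.fit_predict(X)

for label, text in zip(labels, articles):
    print(label, text[:60])
```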

 

**Web**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Sessions/sign-ons | 1 | X | | Web reports | Trends | Related to technical support issues |
| Searches/queries | 1 | X | | Web reports | Trends | |
| Page hits/views | 1 | X | | Web reports | Trends | |
| Incidents opened within 24 hours of web session | 3 | X | | Web reports and CRM | Number | Link web sessions to incidents opened by the same individual; see the sketch below this table. |
| Avg. # of page views per exception | 3 | | | Survey or usability studies, web analytics | Number | Some use exceptions per session |
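One way to link web sessions to subsequent incidents is a time-window join on the customer identifier. The sketch below is minimal, assuming pandas and two hypothetical exports (web session logs and CRM incidents); the field names are illustrative, not a defined schema.

```python
# Minimal sketch: count incidents opened within 24 hours of a web session.
# Assumes pandas and hypothetical session/incident exports with these columns.
import pandas as pd

sessions = pd.DataFrame({
    "customer_id": ["c1", "c2", "c3"],
    "session_end": pd.to_datetime(["2024-05-01 10:00", "2024-05-01 11:30",
                                   "2024-05-02 09:00"]),
})
incidents = pd.DataFrame({
    "customer_id": ["c1", "c3"],
    "opened_at": pd.to_datetime(["2024-05-01 15:00", "2024-05-04 08:00"]),
})

# Join on customer, then keep incidents opened within 24 hours after the session ended.
linked = sessions.merge(incidents, on="customer_id", how="inner")
window = pd.Timedelta(hours=24)
mask = (linked["opened_at"] >= linked["session_end"]) & (
    linked["opened_at"] - linked["session_end"] <= window
)

print("Incidents opened within 24h of a web session:", int(mask.sum()))
```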

 

**Community**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Sessions/sign-ons/visits | 3 | X | | Web reports | Trend | Health of community; trend compared to the total potential population |
| Posts | 3 | X | | Web reports | Trend | Health of community |
| Valued players | 3 | | X | Manual | Trend | Number of designated "valued players" in the community |

**Outcomes (lagging indicators)**

**Demand-based view: whole system health (customer experience)**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Total support demand | 3 | | | CRM, Web, community | Trends | Support contribution to customer success. Customer experience: an approximation of the total customer demand for support |
| Demand satisfaction by channel | 3 | X | | CRM, Web, and community | % | Optimize the overall system: % of total demand satisfied through each channel (see the sketch below this table). |
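The channel breakdown is simple arithmetic once demand counts per channel are available. The sketch below is illustrative, using hypothetical counts of demand resolved in each channel.

```python
# Minimal sketch: % of total support demand satisfied through each channel.
# Counts are hypothetical placeholders for CRM, web, and community reports.
resolved_by_channel = {
    "assisted (support center)": 12_000,
    "self-service (web)": 45_000,
    "community": 8_000,
}

total_demand = sum(resolved_by_channel.values())
for channel, count in resolved_by_channel.items():
    share = 100 * count / total_demand
    print(f"{channel}: {share:.1f}% of total demand")
```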

 

**Process: Support Center (assisted support)**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Time to resolve/relief | 2-3 | X | X | CRM tool | | Not time to close; relief is the point at which the customer is offered an answer, fix, or workaround |
| Known vs. new | 3 | X | | CRM and/or KM | % | Helps you understand the maturity level of KCS and web delivery in your organization. Ideal = 85% new, which means most known issues are being solved on the web or in the community (see the sketch below this table). |
| Time to relief - known | 3 | | | CRM | Avg. minutes | An indicator to improve the effectiveness of the KB. The faster staff are able to find content in the KB, the faster they can provide relief to a customer. |
| Time to relief - new | 3 | | | CRM | Avg. minutes | Indicator of effective problem solving. |
| First technical contact resolution | 3 | X | X | CRM tool | % | These measures are impacted by a successful self-service model. As self-service becomes more effective, first contact resolution will decline and cost per incident will go up; this is a good thing, as total support costs should be going down. |
| Cost/incident (and/or exception) | 4 | X | X | CRM and financials | $ | |
| Citations (reuse by others) | 3-4 | | | KM tool | Number | Articles created, Articles modified (citations for each) |
| Time to publish | 2-3 | X | | CRM and KM tools | Avg. minutes | Helps assess the flow of content to self-service by measuring the average minutes to get articles visible through self-service. Typically measured from the time stamp of "relief given" to the time stamp for when the article was "published". |
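Both Known vs. New and Time to Publish can be derived from CRM/KM exports. The sketch below is a minimal example assuming pandas and hypothetical column names (`linked_article_was_known`, `relief_at`, `published_at`); it is not a defined schema.

```python
# Minimal sketch: Known vs. New % and average time to publish.
# Assumes pandas and a hypothetical incident export with these columns.
import pandas as pd

incidents = pd.DataFrame({
    "incident_id": [1, 2, 3, 4],
    "linked_article_was_known": [True, False, False, True],  # known = pre-existing article
    "relief_at": pd.to_datetime(["2024-05-01 10:00", "2024-05-01 11:00",
                                 "2024-05-02 09:00", "2024-05-02 10:30"]),
    "published_at": pd.to_datetime([None, "2024-05-01 12:15",
                                    "2024-05-02 10:05", None]),  # only new articles get published
})

known_pct = 100 * incidents["linked_article_was_known"].mean()
print(f"Known: {known_pct:.0f}%  New: {100 - known_pct:.0f}%")

# Time to publish: minutes from "relief given" to "published" for new articles.
new_articles = incidents.dropna(subset=["published_at"])
minutes = (new_articles["published_at"] - new_articles["relief_at"]).dt.total_seconds() / 60
print(f"Average time to publish: {minutes.mean():.0f} minutes")
```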

 

**Collaboration (assisted support)**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Team health | 4 | X | | Survey | % satisfied | Used to identify areas for improvement: trust, conflict resolution, commitment, accountability, focus on results (see the Consortium's collaboration health survey) |
| Organizational network analysis | 4 | X | | Manual | Network map | Identifying Coach candidates and indicators of overall network health; see the sketch below this table. |
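Organizational network analysis can be approximated from collaboration data, for example who helped whom on incidents. The sketch below is illustrative only, assuming Python with networkx and a hypothetical edge list; degree centrality is just one possible signal for Coach candidates, not the Consortium's method.

```python
# Minimal sketch: build a collaboration network and rank likely Coach candidates.
# Assumes networkx and a hypothetical "who helped whom" edge list.
import networkx as nx

collaborations = [
    ("ana", "ben"), ("ana", "chen"), ("ana", "dev"),
    ("ben", "chen"), ("dev", "eli"), ("chen", "eli"),
]

G = nx.Graph()
G.add_edges_from(collaborations)

# People who are well connected across the network are candidate Coaches.
centrality = nx.degree_centrality(G)
for person, score in sorted(centrality.items(), key=lambda kv: kv[1], reverse=True):
    print(f"{person}: {score:.2f}")
```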

 

**Communications and Alignment**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Employee understanding | 2 | | X | Survey | Score, trend | Assess effectiveness of management/leadership |
| Employee buy-in | 2 | | | | Score, trend | Assess effectiveness of management/leadership |
| Communications effectiveness | 2 | | | | Score, trend | Assess effectiveness of management/leadership |

 

**Article Quality**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Quality index (finished Articles) | 3 | X | X | Manual | Score | A goal is set; those below the quality goal risk losing their KCS license |
| Framing quality index | 3 | X | X | Manual | Score | Input for Coaches |
| Article life cycle | 1 | X | | KM tool | Pattern over time | Monthly snapshot of article states; over time this will show whether articles are moving through the life cycle |
| Customer success with self-help | 3 | X | X | Web and manual | % | Can be measured explicitly by using a survey, but can also be a derived metric based on user click paths: did they log a case after their self-help session within a defined period of time? |
| Diversity of source (internal, external) | 3 | X | | CRM, KM, Web, Community | % | Indicator of health of the whole system: % of total KB content from each source |
| Value of content (Articles) | 2 | | | | Index | Two views: the value of the collection of content and the value of specific pieces of content |
| The value of the KB | 4 | X | | CRM, Web | $$ | Self-service success on issues customers would have opened an incident about had they not found something helpful (sometimes called case avoidance or call deflection, both of which are terrible terms) |
| Value of an Article - internal use | 3 | X | | CRM, Web | Score | Assesses the value of specific content. To calculate, assign points to an Article for activities that imply value, for example when it is linked to (solves) an incident; weighting may be applied based on severity, impact, or importance. It can get complicated quickly (see the sketch below this table). |
| Value of an Article - web use | 3 | | | | Score | Assesses the value of specific content. For example, assign points to an Article when it is the last Article viewed in a successful self-service experience (see click stream analysis - success) |
| Customer satisfaction with KB use vs. without KB use | 4 | X | | Survey and CRM/KB | | Incident-based customer satisfaction: compare satisfaction when an Article was used to solve the incident to satisfaction when an Article was not used |
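One possible scoring scheme for internal Article value is to award points per incident link, weighted by the incident's severity. The sketch below is illustrative only; the weights and field names are assumptions, not a Consortium-defined formula.

```python
# Minimal sketch: score Articles by incident links, weighted by incident severity.
# Weights and field names are hypothetical; tune them to your own context.
from collections import defaultdict

severity_weight = {"low": 1, "medium": 2, "high": 4, "critical": 8}

# Hypothetical CRM export: (article_id, severity) for each incident an Article solved.
incident_links = [
    ("KB-101", "low"), ("KB-101", "high"), ("KB-102", "critical"),
    ("KB-101", "medium"), ("KB-103", "low"),
]

scores = defaultdict(int)
for article_id, severity in incident_links:
    scores[article_id] += severity_weight[severity]

for article_id, score in sorted(scores.items(), key=lambda kv: kv[1], reverse=True):
    print(article_id, score)
```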

 

**Self-Service Success**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Customer use of web first | 3 | | | Survey, web analytics | % | % of customers who went to the web site first, before contacting assisted support. Measured through a survey (usually pop-up, sampling) |
| Customer success on the web | 3 | | | Survey, web analytics | % | % of customers who went to the web site and solved their problem. Measured through a survey (usually pop-up, sampling) |
| Customer visit without incident opened | 3 | | | | % | Customer visit/session with no incident opened in X amount of time (examples of X range from 8 hours to 7 days). A variation is to assign points to all Articles viewed in a session when no incident was opened within X amount of time (see the sketch below this table). |
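The variation above can be implemented by flagging each session as successful when no incident follows within the chosen window, then crediting every Article viewed in that session. The sketch below is a minimal illustration with hypothetical session data; it assumes the session-to-incident linkage has already been done (as in the earlier web-session sketch).

```python
# Minimal sketch: credit Articles viewed in self-service sessions that were not
# followed by an incident within the chosen window (hypothetical data).
from collections import Counter

# Each session: the Articles viewed and whether an incident was opened within X time.
sessions = [
    {"articles": ["KB-101", "KB-205"], "incident_within_window": False},
    {"articles": ["KB-205"],           "incident_within_window": True},
    {"articles": ["KB-310", "KB-101"], "incident_within_window": False},
]

successful = [s for s in sessions if not s["incident_within_window"]]
success_rate = 100 * len(successful) / len(sessions)
print(f"Self-service success (derived): {success_rate:.0f}%")

# Assign a point to every Article viewed in a successful session.
article_points = Counter()
for s in successful:
    article_points.update(s["articles"])
print(article_points.most_common())
```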

 

**Value of the web**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Triangulation method | | | | | | Assesses the value of the web. There is no one measure we can use to assess the value of the web; we have to look at the web from three different perspectives to get a true representation. |
| 1. Click stream analysis | 2 | | | Web analytics | % | First side of the triangle: where traffic is going (to and from); % of users that are successful vs. unsuccessful |
| 2. Customer experience | 2 | | | Survey | % satisfied | Second side of the triangle: what customers are saying about you |
| 3. Case/incident volume | 2 | | | CRM, financial reports | # | Third side of the triangle: incident volume, with the case rate normalized to total revenue, number of licenses, or number of customers (see the sketch below this table). |
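Normalizing the case rate keeps incident volume comparable as the business grows. The sketch below shows one such normalization with hypothetical figures; the choice of denominator (revenue, licenses, or customers) is yours.

```python
# Minimal sketch: normalize monthly incident volume to the customer base.
# Figures are hypothetical; pick the denominator that fits your business.
monthly = [
    {"month": "2024-03", "incidents": 4_200, "active_customers": 52_000},
    {"month": "2024-04", "incidents": 4_350, "active_customers": 55_000},
    {"month": "2024-05", "incidents": 4_300, "active_customers": 58_000},
]

for m in monthly:
    rate = 1_000 * m["incidents"] / m["active_customers"]  # incidents per 1,000 customers
    print(f'{m["month"]}: {rate:.1f} incidents per 1,000 customers')
```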

 

**Community Success**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| % of posts with a community response | 3 | | X | | % | For the individual who nurtures the community; see the sketch below this table. |
| Time to response | 3 | | X | | Avg. minutes | |
| Health of community | 3 | X | X | Survey | Index | Level of trust |
| Reach | 4 | X | | Network analysis | Index: size and diversity | Assess the effectiveness of the community. Two dynamics of reach: 1. how big the audience involved in the network is, 2. the diversity of the players in the network |
| Relevance | 4 | | | Network analysis, survey | Index | Assess the health of the community. How often do people find content or people that are relevant to what they are looking for? |
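Response coverage and responsiveness can be derived from a forum export of posts and their first replies. The sketch below is a minimal example with hypothetical data; the field names are not from any particular platform.

```python
# Minimal sketch: % of posts with a community response and average time to response.
# Hypothetical forum export; replace with your platform's data.
from datetime import datetime

posts = [
    {"posted": datetime(2024, 5, 1, 9, 0),  "first_reply": datetime(2024, 5, 1, 10, 30)},
    {"posted": datetime(2024, 5, 1, 14, 0), "first_reply": None},  # never answered
    {"posted": datetime(2024, 5, 2, 8, 0),  "first_reply": datetime(2024, 5, 2, 8, 45)},
]

answered = [p for p in posts if p["first_reply"] is not None]
print(f"% of posts with a response: {100 * len(answered) / len(posts):.0f}%")

minutes = [(p["first_reply"] - p["posted"]).total_seconds() / 60 for p in answered]
print(f"Average time to response: {sum(minutes) / len(minutes):.0f} minutes")
```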

 

**Loyalty**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Customer loyalty | 3 | X | | Survey | Score | See "Net Promoter"; a calculation sketch follows this table. |
| Renewals | 3 | X | X | CRM tool | % | |
| Employee loyalty | 3 | X | X | Survey | Score | Loyal employees contribute to loyal customers |
| Collaboration/team health | 3 | X | | Survey | Score | |
| Employee turnover rate | 3 | X | | HR reports | % | |
| Community health | 3 | X | | Web reports/surveys | Score | Online forums |
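For reference, the standard Net Promoter Score is the percentage of promoters (scores of 9-10) minus the percentage of detractors (scores of 0-6). The sketch below illustrates the arithmetic with hypothetical survey responses; it is not a Consortium-specific definition.

```python
# Minimal sketch: Net Promoter Score from 0-10 survey responses (hypothetical data).
responses = [10, 9, 9, 8, 7, 6, 10, 3, 9, 8, 10, 5]

promoters = sum(1 for r in responses if r >= 9)
detractors = sum(1 for r in responses if r <= 6)
nps = 100 * (promoters - detractors) / len(responses)
print(f"NPS: {nps:.0f}")
```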

 

**Organizational Learning**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Time to fill knowledge gaps on the web | 3 | X | | Web analytics, click stream analysis | Avg. minutes/days | |
| % of issues promoted by Support implemented by Development | 4 | X | | Manual | Issues promoted vs. implemented | Indicator of the health of the relationship with Product Management and Development/Engineering |
| Time to cure (time from identification to removal of the problem)* | 4 | X | | CRM, KM, and release dates | | Support's ability to work with product management and development/engineering to improve products based on customer experience (includes documentation) |
| Time to proficiency - new Analysts | 2 | X | | Manual | Weeks/months | Current compared to baseline; new people |
| Time to proficiency - experienced Analysts, new products/technologies | 3 | X | | Manual | Weeks/months | Current compared to baseline; new products |
| Time to equilibrium* (new release) | 4 | X | | CRM, Web, and community reports and product install reports | Trend: exception rate per installed product per week | New product compared to mature product; see the sketch below this table. |
| Time to adopt/install | 4 | X | | | Trend: install rate of new release/product | Customer confidence in support is one driver of time to adopt |
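Time to equilibrium compares a new release's weekly exception rate (exceptions per installed product) with the rate for a mature product. The sketch below is a hedged illustration with hypothetical weekly figures and an assumed mature-product baseline.

```python
# Minimal sketch: weekly exception rate per installed product for a new release,
# compared against a mature-product baseline (hypothetical numbers).
weeks = [
    {"week": 1, "exceptions": 900, "installed": 10_000},
    {"week": 2, "exceptions": 700, "installed": 14_000},
    {"week": 3, "exceptions": 520, "installed": 17_000},
    {"week": 4, "exceptions": 380, "installed": 19_000},
]
mature_baseline = 0.02  # exceptions per installed product per week, from a mature release

for w in weeks:
    rate = w["exceptions"] / w["installed"]
    status = "at equilibrium" if rate <= mature_baseline else "above baseline"
    print(f'Week {w["week"]}: {rate:.3f} exceptions per installed product ({status})')
```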

 

**Financial**

| Measure | Adoption Phase | Team | Individual | Data Sources | View | Use/Comments |
|---|---|---|---|---|---|---|
| Total support costs as a % of total company revenue | 3 | X | | | | |
| Support margins (contract revenue) | 3 | X | | Financial systems | % | Support costs as a % of revenue (or install base, or product shipped) |
| Cost/exception | 3 | X | | | $ | Cost to resolve exceptions across all channels |
| Cost/incident (assisted) | 2 | X | | | $ | Support center |
| Cost/incident - known (assisted) | 3 | X | | | $ | Support center |
| Cost/incident - new (assisted) | 1-3 | X | | | $ | Support center |

 
