Creating Service Design measures for a complex service

COMPLEX SERVICE MEASURES

Measurement and service design go hand in hand. But, for many of us, the word 'measures' harks back to graphs and numbers. It speaks of a pile of charts arriving on a manager's desk, to be reviewed and used to assign blame.

We are going to use systems thinking as the foundation for this methodology. Looking at the service end to end, and outside-in, the measures we can create are different from those created by a command-and-control organisation design. This is because the command-and-control paradigm produces measures that focus on control, categorisation data, and internal departmental efficiency. Its measures are almost always used to monitor staff and activity.

So, what are the alternative measures? They are those that measure the performance of the WHOLE service and indicate what is going on in the wider system. As part of the definition of person-centred, they begin with the customer. They are a window through which we can peek into the system.

Good Measures

Good measures do more, much more. Measures are part of the behavioural controls that are used to prioritise and direct decisions. Command-and-control measures reinforce reductionist thinking; good measures encourage systemic operational design. This means they point everyone in the organisation towards the customer, encourage working across departments, and encourage managers to design cross-functional teams. They encourage a positive culture within the organisation. They align governance and auditing to focus on the right things. They encourage the design of the service to continue to become more person-centred; their correct use starts the journey of learning and improvement. For many of us, they are the secret lever that opens up a new way of thinking in the organisation.

We are trying to move away from:

 'It is old-fashioned performance management that keeps us in a world of humans as resources, as command-and-control takers, with rigid top-down planning, and solid prevention of curious and exploratively-minded cooperation.' (Corporate Rebels)

Complexity

However, when we have COMPLEXITY as a characteristic of our service, we need to understand two important points:

1. We have to approach the creation of measures as an iterative task. Measures are developed over time, alongside the prototype design, not before it. This is because the complexity can only be understood once we are designing the new prototype.

2. Complexity is often about ever-changing knowledge that is probably highly qualitative, and possibly subjective. It may not be possible to convert it into data.

Now, we are going to define what we mean by measures.

We define service measures as: that which helps us to understand what is going on in the service,
and,
how well we are doing with respect to our purpose (our customers).

Where do we start? I like to start from the measurement guru Donald Wheeler's wise words: ask ourselves what is the problem we are trying to solve?

Problem 1 - how well is our service performing?

Problem 2 - we want to know how well the new design compares to the old.

Problem 3 - we wish to change how measures affect the behaviour of managers and staff.

In the description that follows, all three appear intermingled.

Managers

One of the tactics I use to engage managers and help them participate in a relevant way is to ask them to take part in defining and gathering the data, and knowledge, that make up the measures. To do this, they have to get connected to the work. I ask them to work with the team, and to observe how the team uses the measures to learn and improve. In the new design, one of the manager's tasks is to identify and remove systemic barriers that get in the way of what the team needs to do, and measures are used to identify those barriers and to track the changes.

Measures Framework

Systemic, person-centred thinking orients the measurement of performance around four main areas of a service:

service and purpose, efficiency, revenue, and morale (culture).

We are going to use a step-by-step framework to define our systemic measures, derived from basic systemic service design principles. This framework should be applied during and after the new design is created, not before.

[Image: systemic design measures]

LEADING MEASURES

PURPOSE & WHAT MATTERS

To begin the measurement framework we start with Purpose, and ask ourselves: how do we measure achievement of purpose? Purpose as defined by the customer or citizen. Purpose can then be split into:

- the what we do, and 

- the how we do it

For 'how we do it', a measure might be defined as, for example, 'end-to-end time', or 'needs met'.

What Matters is obvious when each customer interacts with us. It is often very individual: "I want it now", or "I need help with...". In a complex environment, What Matters often varies over time, and cannot be defined clearly.

Focusing on and measuring Purpose and What Matters defines a person-centred service. This focus leads everyone in the organisation to be mindful of the customer in every activity and interaction. It is the core of the design. This measure rises above all others; it is the primary measure. It is a leading measure: we use it to lead our design, actions and decision-making. Complexity causes all of this to change over time, so beware of recording these aspects of a demand and relying on that data later.

VALUE

Value is a term that is defined when we ask ourselves:

  What are the activities that a customer recognises as providing value to them, and that help to achieve purpose?

Anything that directly provides value to the customer is defined as Value activity. So measure Value, and compare it to the amount of non-value work. This can be simply expressed as a percentage.

When we focus on designing a service using only Value work, then we design the most efficient and effective workflow that we can. 

This measure is second in importance only to Purpose.
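To make the percentage idea concrete, here is a minimal sketch; the activity names, hours, and the helper function are invented for illustration, not taken from any real case:

```python
# Sketch: express value vs non-value work as a percentage for one case.
# The activity names and hours below are illustrative assumptions.

def value_percentage(activities: dict) -> float:
    """activities maps activity name -> (hours spent, is_value_work)."""
    total = sum(hours for hours, _ in activities.values())
    value = sum(hours for hours, is_value in activities.values() if is_value)
    return 100.0 * value / total if total else 0.0

case = {
    "talking and listening": (4.0, True),   # direct value to the person
    "organising activities": (2.0, True),
    "writing records":       (3.0, False),  # non-value: internal work
    "applying for funds":    (1.0, False),
}

print(f"{value_percentage(case):.0f}% value work")  # 6 of 10 hours -> 60%
```

Kept per case rather than averaged, this one number makes the balance of value and non-value work visible on the wall.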

Summarising this fundamental point, which brings systems thinking to the heart of our service as a person-centred (outside-in) perspective:

- purpose is defined by the customers in general

- what matters is individual to each customer, and changes over time

- value is our activities that directly contribute to Purpose

- and don't use averages, as they remove the individual variation between customers that we want to understand.
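A small sketch of why averages mislead here; the end-to-end times below are invented to show that two very different customer experiences can share the same mean:

```python
# Sketch: the same average can hide completely different customer experiences.
# End-to-end times in days for two hypothetical sets of demands.

stable   = [9, 10, 11, 10, 10]   # every customer waits about 10 days
volatile = [2, 3, 25, 2, 18]     # most are fast, but some wait weeks

mean = lambda xs: sum(xs) / len(xs)

print(mean(stable), mean(volatile))   # both average 10.0 days
print(max(stable) - min(stable))      # range of 2 days
print(max(volatile) - min(volatile))  # range of 23 days
```

The averages are identical, yet the individual variation, which is exactly what we want to understand, is entirely different.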

LAGGING MEASURES

And now we look at the remaining measures, which help us to understand the operation as a business. These are almost always lagging measures, in that they are outcomes of working to Purpose and focusing on Value. Lagging measures should not be used to manage operations, as they describe the past. They point us to what has occurred, and help us to learn what we need to change in the service.

EFFICIENCY

This is a simple measure: in its most basic form, it is cost per unit. In most services this is a financial calculation that should be kept unique to individual demands, rather than averaged.

REVENUE, OR PROFIT

This is the amount of resource consumed in the end-to-end flow of the service delivery. It is the one that so many managers focus on as the primary measure, only to find that they are chasing what has already occurred. As systemic thinkers, we look instead to identify and focus on the Causes of Cost.

MORALE

The culture of an organisation is an outcome of various elements of the way leadership directs the organisation, and of how managers behave. Morale and culture are perhaps inappropriate to quantify in categories and numbers, but they can be understood easily by paying attention and asking the right questions.

Linking up the leading and lagging measures brings us to EFFECTIVENESS. This is a matter of reviewing the measures together, and learning how they work together.

Principles of Good Measures

And these are some principles which define behaviour (so that we don't revert back to the old ways):

  • measures should be in the hands of those who do the work (visible on the wall), so that they use them to understand and improve.
  • customer purpose and what matters to customers must be derived from the work (not in a room), and drive our workflow design.
  • they are used to analyse and understand, and by managers to improve the system.
  • they measure what is real and happening, and demonstrate true variation over time (they are not targets, or averages).
  • distinguish between focusing on the variation between individual customers (individual comparison - rather ineffective) and understanding the systemic design (trends - what we are interested in).

And one last thing: measures should focus on creating VALUE. What that does is direct discussion towards Purpose. That will then develop into a learning cycle, where the value of the service is improved over time.

How we measure

I always have a bit of a fight at the beginning of this. The team automatically begin to think about which digital method we will use to record the data. I insist that we do this on flip-charts, on the wall. After lots of groaning and jokes about dinosaurs, they reluctantly comply.

Why do I do this? The detailed answer is below, but primarily it is about making the data and the measures always visible, and owned by the team in real time. Over time, much of this can be digitalised.

When we measure

[Image: service design measure]

Often at the green arrow. With a complex service, I find that clarity of purpose, and what matters, cannot be well defined until after the experiment is complete. Therefore measures, at the earliest, can be defined at the end of the experimentation. In some cases I have struggled to define some measures until after the prototype is complete.

A real example - Healthcare

(This is a non-digital design, but the approach can be applied to a digital design when we are focused on the service.)

Let's use a healthcare service as an example, a nice complex one! The current traditional measures are:

  • number of assessments achieved within the target (2 weeks)
  • resources spent vs budget
  • number of complaints
  • number of staff off sick
  • number of referrals to different departments
  • number of cases per worker per day
  • time taken to close a referral

In our initial analysis of how the current service worked, we realised that the measures had been created by managers with a particular paradigm:

- all cases should be treated as the same, so we can compare each case with the others and create average trends.

- we create standard ways to measure, which treat every demand as equal.

- we standardise demands into the service; we have a set of pre-defined problems and we fit all demands into that list.

- we measure time spent doing anything, and attempt to reduce it, regardless of its value.

- we set arbitrary targets.

- we challenge staff when they want to spend what managers think is too much.

In our systemic Prototype Design, we want to use measures to help senior leaders develop systemic understanding, and to demonstrate how well the new design prototype works compared to the current way of working. And we want the focus to be a person-centred view of the service. The old measures fail to do this, so we have to create a new set. In the prototype, as we take on each case, we record as much information as we think we might need, and make it very visible.

[Image: each case visible on the wall]

I work with the team and the team manager to decide what the new measures are going to be, and this takes time. What to measure was decided as a team, and in the subsequent weeks the measures were tweaked as we learned more. One team member showed a flair for this, so they focused on it more than the others.

First we developed concepts of the old vs new ways of working. Below is one comparison of the flow between the old and the new. We used this simple approach to start creating new measures based on a new set of principles. The team had never created measures before, so this had to start from first principles.

Deciding what to measure

To develop a new approach to measures, we need something that takes us away from what we have done before. So we started with the main areas of service and purpose, efficiency, revenue, and morale (culture). These cover the main areas that measures should address in any service.

With a complex, high-variety service like this, there is no true correlation when comparing the measurements of one case with another, because of the complexity involved. In fact, this is exactly what the current method of measurement was doing, and we found it caused all sorts of bizarre and inappropriate behaviours and decisions by staff and managers. However, when we take a random sample of cases, and compare those in the old way of working with those in the new way of working, the difference can point to interesting characteristics of the prototype way of working.

PURPOSE

- Purpose was defined as

   the what; 'help me to live a good life.' 

   the how; 'by listening to me, and helping me, and use what I have to help me do what I cannot do.' 

- we recorded the number of assessments as an indication of a poor approach to understanding need.

We measured purpose by the person's ability to live without unnecessary support from us, and by any follow-up repeat demands.

- number of repeat demands; we recorded every time the citizen contacted a health professional anywhere in the system.

- feedback from visits; we recorded their trust in us and the depth of the relationship.

- number of people connected to the citizen; many different connections to the citizen split up the relationship.

- matching the referral to the real needs of the citizen; how well did we originally assess their need, compared to what we eventually learned?

WHAT MATTERS

This was different for each person, and we asked them what it was. We recorded it, went back to it now and again, and it changed over time. What was important is that it guided us on how to proceed and what to focus on to help them achieve a good life.

The information captured here is in flux; it changes over time. So the question is: how appropriate is it to standardise and digitalise this information?

[Image: an example of what matters]

VALUE

This became evident as we worked with each person. Value is the set of activities agreed between the person and ourselves; those actions that directly helped them: talking and listening, guiding, making things happen, organising activities, building knowledge. We looked to their family environment, friends, community, and health.

What was NOT value was easier to see: writing records and reports, asking for permission, applying for funds, creating plans.

Control was directed by the person, as opposed to being derived by the organisation.

EFFICIENCY

The health service is not a transactional, standard service, so efficiency in the traditional sense was not appropriate. We had to use what we could measure to indicate efficiency.

- the number of handoffs & referrals, as these indicate the waste of duplication through handoffs.

- any non-direct contact was recorded as non-value.

- repeat demand.
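As a sketch of how these indicators might be tallied per case (the record structure, field names, and figures are assumptions for illustration, not the team's actual wall records):

```python
# Sketch: tally per-case efficiency indicators rather than service-wide averages.
# Field names and values are illustrative, not the team's real records.
from dataclasses import dataclass, field

@dataclass
class CaseRecord:
    case_id: str
    handoffs: int = 0          # each handoff/referral indicates duplication
    repeat_demands: int = 0    # the citizen had to come back: need not met first time
    contacts: list = field(default_factory=list)  # (description, is_direct)

    def non_value_contacts(self) -> int:
        """Count contacts that were not direct contact with the person."""
        return sum(1 for _, direct in self.contacts if not direct)

case = CaseRecord("case-07")
case.handoffs = 3
case.repeat_demands = 1
case.contacts = [("home visit", True), ("internal case conference", False)]

print(case.handoffs, case.repeat_demands, case.non_value_contacts())
```

Keeping each case's tallies separate preserves the variation between individual citizens that averaged figures would erase.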

REVENUE (SPEND)

- real total time spent on a case. This indicated rough total cost, rather than actual cost.

- repeat demand, indicating that the cost calculation needed to continue with the new demand.

A really important element at this point was the team helping senior leaders to reframe their understanding of cost. The team showed how focusing on cost actually increases overall cost, and that what they were doing instead was understanding the causes of cost, and working on those to reduce it.

MORALE

- Morale was described by the team directly in feedback to managers and leaders. It was evident in their dialogue, behaviour, and attitude.

Collecting the information

Each of the prototype cases was rigorously analysed with those working on it.

[Image: the detailed calculations for a case]

We created new measures, based on the purpose of what we were doing, to prove the effectiveness of the new prototype: to demonstrate to leaders the difference between the old and new ways of working. They were going to be the basis for deciding whether we continue or not. We were looking for meaningful information, rather than data that is easy to record.

[Image: the measures being collated by two members of the team]

And finally, the summary that the team discussed with the leaders (night = current way of working, day = new way of working).

[Image: the main summary presented to the leaders]

[Image: an alternative way of describing a person-centred view]

How we used the measures

Why have everything on the wall? The impact of putting it on the wall was profound and helpful. The team had the measures in the room with them at all times; the measures became theirs. The interpretation of the measures developed over time, and became part of team discussions and reviews.

Those leading the methodology in the team used the measures to bring staff together to collaborate towards a common purpose. But how this was achieved was a collaborative effort decided by the whole team. For a group used to top-down decision-making, this new approach, led by Roxanne Tandridge, was a catalyst for a new way of team working.

At this stage in the design, perhaps the most important point to note is that the feedback on the Prototype goes to the decision-makers, and it is not a presentation; it is feedback to a group who have already been engaged with the team and the design.

The purpose of this measures aspect of the design was to link leaders with what was important to the user, the business, and the leaders themselves. At this point, it may appear we have reduced everything to numbers, but we have not. We use these numbers to describe the concepts. And that's why we do this face to face.

When the team went through this, the leaders practically fell off their chairs! Collated and discussed by front-line staff, for the most senior people in the organisation. Job done!

Measuring Digital Site Metrics & GDS

This is a far cry from how we measure Digital website performance. You can see that in this case, we have created systemic measures, rather than measuring the use of technology.

Website measures can be relevant when we are designing a highly transactional online demand, like renewing a driving licence. However, if GDS and service design agencies want to move into the realm of wider service transformation, then we need to expand our design reach to cover all aspects of a business. To do this we can reject command-and-control thinking, and turn to the work that has been going on in progressive person-centred organisations over the last 50 years.

But the real issue is: how do we combine Digital with complexity? By definition, complexity is non-logical, and does not exist within boundaries. This topic is perhaps too deep to detail here at this time.

Principles and Methodology

Warning: this article only hints at a much wider, deeper part of systemic service design!

What this is, and is not

It is about engaging with managers and decision-makers in new ways. It is about them getting closer to the work, and the recognition of their behaviours. It is about recognising the impact of measures on staff and stakeholders, and the manipulation that goes on to reach targets. It is all about learning and how to learn and collaborate. It is about individual needs. It is about real time information, rather than looking in the rear view mirror and averages over time.

And it is not about departmental functional measures, it is about the end to end journey.

Measures are much more than 'measures'. We can try to deal with them as graphs and figures, but this is really about moving away from traditional management approaches.

Human Learning Systems is one way to frame systemic design, especially in the public sector. It is an approach that offers an alternative to the “Markets, Managers and Metrics” approach of New Public Management. It outlines a way of making social action and public service more responsive to the bespoke needs of each person that it serves, and creates an environment in which performance improvement is driven by continuous learning and adaptation. It fosters in leaders a sense of responsibility for looking after the health of the systems, and it is these systems which create positive outcomes in people’s lives.

With respect to this example of measures in health and social care:

HUMAN

It is about understanding the true value that the service provides, and the true needs of customers. The relationship between staff and managers is altered, allowing for greater devolvement of decision-making. 

LEARNING

It is about engaging with managers and decision-makers in new ways. It is about them getting closer to the work, and the recognition of their behaviours. It is about recognising the impact of measures on staff, and the manipulation that goes on to reach targets. It is all about learning and collaborating using measures as the vehicle. The front line participate in interpreting and learning from measures. 

SYSTEMS

It is not about targets, KPIs, or SMART objectives. It is not about reinforcing command-and-control principles. It is not about attempting to convert qualitative knowledge into quantitative data. It is not about standardisation. Beware of managing by lagging measures. It is not about averaged figures over a period of time. And it is not about website performance.

One key element of systems thinking in use here is the alternative paradigm of how a service should be designed. We use the systems thinking iceberg model to link the elements of the purpose, what matters, and measures framework described in this article.

The foundations to this work

There are perhaps two strands of service design: the modern Digital strand, which emerged relatively recently from product design, and Organisational Development (OD), which has been going for decades. The commonality between them is large, but today we seem to keep them separate. This article combines them both.

The overall concepts in this article come from:

- Design Thinking, with its focus on iteration and emergence.

- Systems thinking, with its wholeness, of all parts working together.

- Donald Wheeler, perhaps one of the world's best-known authorities on measurement.

- The nature of variation, Ashby, Stafford Beer and Deming.

- Mindset, paradigms, and worldview; Meadows, Argyris & Schon.

- Organisations as complex adaptive systems.

- Motivation, Dan Pink.

- Seddon - methodology and tactics.

- Dialogic & Teal.

- Idealised design and Design Thinking, Ackoff.

- Continuous improvement and culture. Toyota.

John Mortimer

Reinventing services and organisations, moving beyond traditional ways of working. Leading and empowering new ways of working.

The reality of systems thinking, and then viewing measures, what they are, and how to use them, allows us to redefine them from the beginning. I find this goes so far that measures must not be seen as something separate, something we have to do. They are part of learning, behaviours, teamwork, customer interaction, and management. Basically, they can be seen as a glue that lives in the interaction and communication between each element.

Bob Cotton

Executive Coach | Business and Technology Consultant | Strategy | Governance | Change | NED

Thanks John, this was great. This approach makes Measures a toolset that is owned and evolved by the practitioners, developed from a perspective grounded in what is necessary to achieve their purpose at the time; as opposed to being done to them, or used to judge them, which was the traditional approach. I think this perspective is critical for driving the best outcomes. It sets up a dynamic that evolves over time as the system and purpose evolve: some measures last, some drop away, some new ones are created. All in response to the purpose and complexity of the system, driven by the ownership of the practitioners.

John Mortimer

To help those who wish to research further, I have added a short section at the end of the article, that lists some of the concepts that went into this work.

Joanne Dong

Innovation facilitator & pathfinder

This looks quite comprehensive. Thanks for sharing, John. It reminds me of the measurements I defined for business processes which involve answering three basic questions: 1. What does the process (or service) do? 2. Who benefits from the process (or service) or who is the primary client of the process (service)? 3. Why is the process (or service) needed? The three questions are asked and applied repeatedly throughout the discovery and the design of the business processes.
