How accurate is your testing routine?

Traynor Guitar Amp

Testing is not just for software; it applies equally to the business processes, organisation or service that you’re implementing.

I’ve seen many test routines that are too artificial, too removed from the reality of what the users will go through. Fortunately this has improved over time, especially with more focus on user stories.

Let’s consider one of the best examples of testing I’ve ever seen. Guitar amps are generally fragile: they’re usually robust enough for scrapes and minor bashes as you carry them through doorways, but they don’t survive being dropped down stairs very well.

One amp manufacturer had a test routine of removing the glass valves (they’re replaceable consumables) and then throwing the test amp from the roof of the building to emulate the journey that some amps go through. Back on the ground, they re-inserted the valves and powered the amp up to see if it still worked.

How does that compare to your test routine? Is yours as true to the reality in which your system will be used?

Here’s a clip of the actual test.

KonMari Method applied to Organisational Design

Tidy Clothes Hangers

Can we apply the KonMari method to Organisational Design?

The KonMari method describes how to tidy your house and how to keep it tidy. It is a set of rules that you can absorb in order to keep a less cluttered house. Having read through the concepts and the rules, I noticed some similarities to the domain of Organisational Design. So let’s work through some of the main rules.

1. Tidy all at once

The premise of this is that tidying an untidy house a bit at a time doesn’t work. Now, before I hear you say Continuous Improvement, remember that at the point KonMari begins, the house is already untidy. And so it is with our organisation: we’re performing Organisational Design because the organisation is in an unfit state; it’s untidy.

So let’s allocate enough time, effort and people to redesign the organisation to get it into a fit shape. We need to accept the fact that it will take time and can’t be done piecemeal.

2. Visualize your destination

This is the vision that sets the direction for your organisational design. Think of the vision that fits with MSP (Managing Successful Programmes) or the visions provided by your CEO or MD.

Define what you want your organisation to look like in the future, what it does and how it does it. That’s the direction you want to travel in, and it can then act as a set of constraints and drivers later on.

Create the design principles in order to identify what criteria you’re going to use to make decisions about the future.

3. Identify why you want to live the way you envision

I think this is potentially the wrong way around. In my eyes, the ‘why’ should come before the ‘what’ of the vision in Step 2.

Using KonMari, you’d go through the exercise asking yourself why you need the items that are in your vision. This is a questioning of self, to understand how important an item is for you.

For us, this is the questioning we should apply to the vision. It’s time to reflect on the vision and evaluate it from a different perspective. Rather than simply accepting the vision you’ve developed over time, ask: “if we achieve this vision, will it be good enough?”, “will it do what we set out to do?”, “are we aiming far enough?”

It’s also the questioning we should be asking ourselves when looking at the design principles. Are they fit for purpose? Are they good enough?

4. Determine if each item “sparks joy”

When you consider which items to keep or reject, KonMari suggests an emotional test: does it spark joy? This lets you reduce your items on more than just whether something is functional or whether you might use it in the future.

So we should look at each business capability in your organisation. Does its current implementation, e.g. which team makes it happen, work for you? Does it feel right? This isn’t about the logical response, but the emotional one.

5. Tidy by category, not location

Remember this. It’s probably the most important one.

The KonMari suggestion is to bring all items of a certain category (e.g. sweaters) into a single place and work through them one by one, rather than sorting through a chest of drawers drawer by drawer.

Consider all the teams that deliver each Business Capability. Is that the way that you want to deliver that capability in future? How does it fit in with your vision? Is there a different way for you to provide that Business Capability, e.g. merging teams, outsourcing, joint venture, etc?

6. Tidy in the right order

The KonMari method describes this order:

  1. Clothes
  2. Books
  3. Papers
  4. Komono (miscellaneous)

It also suggests creating subcategories within each category.

It’s a more complex situation for organisational design. We could find equivalent capabilities for clothes, books, etc, but the reality is that each organisation is different. They may have similar capabilities, but it’s likely that any two organisations will have different implementations of each capability or have capabilities at different stages of maturity.

So we need to create a plan that makes sense for each organisation. Start with the fundamentals of what needs changing first in order to create space (or capacity) elsewhere. For instance, if an intake team within a value chain can’t change its filtering of which customers, prospects, etc. are passed further down the chain, then it probably makes sense to focus on the receiving teams first, freeing up people to assist the intake team. It’s similar to clearing out a large enough area to act as swing space, so you can then make bigger changes more efficiently with the space you’ve just cleared.

7. Discard before you place things back

Set honest expectations early. If jobs will be at risk, complete the consultation and follow-up actions before you move people into the new team structure. There is always pain, but it’s better to start a new organisation design with team members who are committed than to retain those who know they are leaving.

Reflection

I hadn’t intended this as a serious article, but the more I wrote, the more I realised that there may be some useful perspectives to gain from the exercise.

Can we perform organisational design using just the KonMari method? From what I’ve found of the method so far, no we can’t.

Can we benefit from considering the KonMari method when performing organisational design? Yes, most likely we can.

Confirmation bias in design thinking

Post-it Notes

Every few years another fad comes around.

Workshop

Look back long enough and you’ll see lean, systems thinking, TQM, CRM, structured systems and many, many more methodologies and/or approaches.

The problem lies in the delivery of projects once the terms become more widespread.

Background

What we’re seeing is the rise of two methodologies: Service Design and Design Thinking. You can probably add Human-Centered Design and Inclusive Design into that mix. Many organisations are adopting these methodologies to solve their existing problems, switching from a lack of methodology, or from more formal, structured methods, to ones centred around design.

I’m increasingly seeing Design Thinking heralded as the way forwards, but there are some issues with that approach.

To be clear, I’m a supporter of Design Thinking and many related methodologies. I’m not necessarily a supporter of how those methodologies are implemented in many organisations. I don’t believe design thinking is a fad; at the very least, a significant number of its concepts will remain even if a newer or revised method takes pole position.

Question

What we’re not seeing though are the reports of the design thinking projects that have failed. I find it hard to believe that design thinking and similar, related methodologies are the magic bullets that we’ve all been waiting for.

So do failed projects exist?

For every other methodology, there are numerous reports about how it has failed the client. The application of Six Sigma to Nortel comes to mind as a famous example. But there are plenty of other reports detailing failed projects.

So where are those reports for Design Thinking? It could be that it’s too early to tell, since we’re still in the early-adopter part of the curve.

Analysis

Methodology Maturity

Gartner produces the Hype Cycle for technology adoption. It describes how people react to new technology by relating their emotions to the stage the technology has reached and their use of it. We can apply a similar description to most methodologies.

Methodologies usually start with practitioners noticing something wrong in their current way of implementing projects, so they add, modify or remove elements. This may become a 2.0 version of the original method or, if sufficient changes have been made, a new method in its own right. Occasionally changes are blended in from another method, or from a practice outside the change domain, and this 2.5 version may take on a life of its own and become yet another rebranded method; this time back at 1.0.

This fresh 1.0 version has only been used on a few projects and has had great success from the perspective of its facilitators. It is therefore a brilliant solution to any problem and becomes heralded as the latest and greatest. Unfortunately, those implementations are only just becoming lived-with, i.e. the changes have occurred but haven’t had time to become embedded in the organisation. That’s when murmurings of disgruntlement appear, sometimes with a “told you so” attitude, depending on whether or not the change team listened to the advice of their subject matter experts. By then, the change team has often moved on to the next project, this time with a couple of tweaks to their methodology: a version 1.1 that addresses some of the issues they were confronted with at the time of making their changes but, importantly, not those that appeared after the change.

Design thinking, taken as a whole, is relatively immature compared to SSADM, for example. I’d measure that maturity through the age of the methodology, the number of project-years it has been exposed to and the rigour involved in the definition of the methodology. Design Thinking could potentially have more project-years (due to the vast number of projects running now), but many of those projects would themselves be immature (due to people trying it out for the first time). Moreover, there’s no single agreed definition of the method to follow, but instead many different flavours as different authors, consultancies and training organisations create the latest version of design thinking.

So whereas other methodologies are further along the hype cycle, with many of them on the “slope of enlightenment”, there are times with design thinking when I think we’re at the “peak of mount stupid”. I could probably say that about a number of projects regardless of the methodology employed; it’s just that more projects now are aligned with design thinking than before.

Confirmation bias and success bias

We only hear what we want to hear and what we’ve planned to hear, so we either miss or discount information that contradicts our beliefs or expectations. This selective perspective allows us to promote one methodology above another without viewing all the evidence. It takes time and experience to keep assessing beyond those initial boundaries and look for evidence that contradicts our expectations.

I suggested that the reason we’re not hearing about failed design thinking projects is that they haven’t been in place long enough for us to realise they were failures. But perhaps there’s something more insidious about the design thinking movement, in that only the positives are celebrated, even to the point that failed projects receive political spin so that their positive elements can still be celebrated.

I’ve seen examples where the design team ran their sessions and sprints, came to implement new organisation designs, process designs, etc. and were heralded as successful, even as leading lights, with other similar organisations coming in to learn how to achieve the same magic. However, that same team hadn’t engaged with many of the other departments required to make the changes work, most notably IT. That meant they had a great design that would take two years to implement, if and only if the IT department shared the same vision and decided to implement it. Remember that IT was just one of the forgotten departments. Would the design have looked different had the other departments been involved? I’d bet money on it.

I’ve also wondered if it’s the age and associated behaviour of the people performing these sprints. Remember that most of these will be millennials, coming from a segment of society that did not get overtly negative feedback at school. There’s no right or wrong there, just a different perspective and a different approach to giving and receiving feedback.

Conclusion

Many change professionals have been combining elements from numerous methodologies, or creating methods almost from scratch, for years. You can only do that successfully, and repeat that success, with the experience of having first worked with a number of different methodologies. Once you have the background of what works and what doesn’t, you can better understand what is likely to work in any one situation.

That leads to a few issues that we can learn from:

Issue 1: Vanilla Method

My view is that no methodology should be applied to an organisation without some tailoring. Applying the out-of-the-box tasks and activities makes a mess of an organisation, where the objectives become secondary to the method. Part of the skill of any methodologist or senior business analyst is in knowing which parts of a methodology can be removed while keeping the risk to the organisation and its objectives acceptable. Otherwise, we would all be producing documents for every part of the methodology rather than getting any work done.

Issue 2: Inexperience in change

Reading the book doesn’t make you an expert. Methodologies take tact, understanding and planning to implement. They also take the extrapolation of what’s worked and what hasn’t worked so well. Without that analysis, whether seen first hand or learned from others, the method is untested.

Issue 3: Inexperience in business

A significant number of people I see involved in design methodologies are new to their careers; any career, in fact. Would I expect them, at the start of their career, to know how HR, OD, IT and legal interact? I’ve been doing this for years and I still have to ask each of my clients what their HR function actually does. I learnt to ask because in some organisations it’s purely advisory, while in others it’s full-stack recruitment, sickness management and training. It’s different in every client (with a few common patterns appearing if you work across enough clients). That’s part of what we need to unpick before we can figure out who needs to be in the room. I wouldn’t expect that from a relatively new starter.

Issue 4: Underestimating

Underestimating takes several forms: underestimating the number of people who need to be involved in the change, including taking them off the front line so that they can contribute; underestimating the skill set required to facilitate changes through implementation and beyond; and, above all, underestimating the skill set required to have the changes graciously accepted.

Issue 5: Learn from mistakes

We all make mistakes. Even the best project has had a few errors. Learn from them and take action to not repeat them. If design teams are moved on to new challenges too quickly, they won’t be in a position to receive the valuable feedback that’s required.

Issue 6: Implement

Design with a view to implementation. Design for design’s sake will only get you so far; most likely your outputs will sit on a shelf. If you’re designing to implement, then you’ll think differently about who to include and how to include them. Knowing that it needs implementation forces you to think through timescales (e.g. what’s feasible?), the politics and governance, and the people you’re going to affect internally.

Afterword

Think of this current article as being a first draft. It’s likely that my thoughts will clarify as I hear of other projects and have further discussions about this concept. If you’d like to discuss it more, get in touch at @alanward

Designing for Everyone

Crowd of lego people

Whatever system, process or technology we’re implementing, shouldn’t we be designing for everyone? Or at least everyone in the target customer segment?

Background

In the last couple of weeks, I’ve read a number of articles that have consolidated my thinking, and made me reflect, on designing for disabilities and on what counts as normal.

Having spent a number of years working in the health and social care sector, I’m well-versed in the practicalities of working with people with disabilities. But I still hate the phrase “people with disabilities” and every similar phrase I’ve ever seen. I don’t like the word inclusion either: not the concept itself, but the fact that the concept has to exist. Hence the title of this article, “Designing for Everyone”.

What’s an average person?

I read The Atlantic’s article on how we’ve ended up with a definition of a normal person. That’s at the crux of a lot of the disparity we can see in the thinking of many designers: they design for the average person, or for people similar to themselves. By the term designer here, I’m not necessarily thinking of an artist or a creative, but rather the person responsible for delivering a changed process, a changed organisation or a changed way of working. They may have a creative background, but more often they come from another professional background, e.g. front-line work or change management. Fortunately, a more creative influence is coming into the change profession; for example, we’re seeing newer methodologies such as Design Thinking, Service Design and Inclusive Design.

The problem with most of these approaches is that they develop solutions for the average person. There may be several average people in the target population, captured as personas based on the likely customers that the service wants to attract or serve. But considering how many conditions and disabilities there are in the world, there’s no way to account for all of them. Instead, we’re back to averaging again, and possibly some Pareto analysis to cover 80% of the target population. That still leaves 20% who are not included in the thinking behind the design.

And that’s part of the theme of the article; that by defining a normal, we start to react towards the average as the ideal and the non-average as divergent.

How can we be completely inclusive?

Microsoft have released their Inclusive Design toolkit. The start of the toolkit is a touch simplistic, especially if you’ve worked in health and social care, but it gets interesting part-way through. I’m also aware that the beginning portion could still be an incredibly valuable education source for those not used to having to think from this perspective. So for that reason alone, I’m grateful to Microsoft for having released it to the world.

But more than that, there are a few nuggets of quality information in that method that I haven’t seen written down anywhere else. I’ve had to rein in proposals by pointing out the difficulties of interacting in the proposed manner, so the two points below resonate with me.

The first is the potential to abstract away from individual conditions and (dis)abilities to perform tasks, and instead focus on the interaction between the person, the technology and the environment. That way, you can focus on resolving issues or improving the interaction between the person and other people in the context of the environment and the technology used.

The second is that disabilities do not need to be permanent. There’s a description of a spectrum from permanent through temporary to situational. And there are far more people with situational or temporary difficulties than with permanent disabilities.

I’ve cropped the slide here and clicking on the image will take you to Microsoft Design Practice.

Disability Spectrum showing difference between permanent, temporary and situational disabilities

How do we include views of everyone?

This is an old source for me, but one that I still point people to when they’re thinking of how to approach their change programme. Beware though: it only becomes inclusive if you include a wide range of people in the interviews and in the service design. It’s the concept of Experience-Based Design that I’ve seen in the health sector, and it’s the best example of a co-production/co-design methodology that I’ve seen.

There are two sources for this: The King’s Fund and the archived NHS Institute for Innovation and Improvement.

Conclusion for Designing for Everyone

Implementing changes for people with disabilities is difficult to achieve when you’re already on the back foot with that perspective. We can see this in the difficulties involved in making websites accessible when accessibility has been added as an afterthought. Instead, by bringing the focus on a more inclusive design up-front in the process, we have the opportunity to design changes that suit many more people.

Above, I’ve listed a few articles and methods that could help influence others around you. The main item to take away concerns perspective; anyone involved in change has to be able to shift perspective to include that of all customers in the target segment.

Forthcoming Book on Improving Your Own Service


Some of you may already know, I’m in the process of writing a book on improving your own service.

Lean Service Improvement Book
“write” by followtheseinstructions under CC BY-SA 2.0

I’m aiming the book at the people who work the process themselves, e.g.:

  • nurses
  • social workers
  • claims adjusters
  • HR/OD staff
  • office managers
  • office administrators
  • hotel staff
  • and their managers
  • and change agents/analysts

As you can see, it’s not restricted to any industry, but it will be most relevant to those working in service industries (whether private, public or third sector), so that should include:

  • public sector
  • health
  • finance
  • retail
  • leisure
  • legal

More accurately, the information in the book could be useful for any industry; however, there are already plenty of books on improving manufacturing production processes, so I haven’t covered them.

What’s the book about?

The focus is on improving a service without recourse to large consultancy fees and should work well on small changes locally within a team and managed changes with partner teams and organisations (e.g. suppliers and B2B clients). It’s heavily based on Lean concepts, using simple tools, but also includes a framework in which to manage the changes. I’ve borrowed from a number of methodologies and concepts to meld together a method that is suitable for the average worker and implementable in any service team.

Your Input

While I’m happy to write this book alone and for everyone to read, I really like the idea of the readers contributing their thoughts as I write it. This fits nicely with the Lean Startup model, so to accomplish this, I’ve listed the current table of contents below. Please have a read through the table of contents and let me know what you think. If you’re interested in this book, let me know what you want to learn from it.

Draft Table of Contents

Section I: Beginning
1    Introduction
2    Background
3    Where to Start?
Section II: Redesign
4    How to Redesign the Service
5    Detailed steps for How to Redesign a Service
Section III: Other Paths
6    Refocus service on customer
7    Only have today to make changes
8    Bottleneck Resolution
9    Reduce errors and improve service
10    Create a new service
11    Improve office layout
Section IV: Case Studies
12    A Real World Example: Capacity and Value Stream Owner
13    A Real World Example: Duty Role in Social Care
14    A Real World Example: Urgent Cases in Social Care
Section V: Extensions
15    Other sorting methods
16    Making it Happen
17    Managing the Change
Section VI: Continuing
18    Sustaining Change
Section VII: Reflections
19    Important Perspectives
20   Other Frameworks
21    A final piece of advice
Section VIII: Appendices
22    Appendix A: The Rules
23    Appendix B: Pocket Guide for Service Redesign
24    Appendix C: Indicators of Blocked Flow and Waste
25    Appendix D: Tools
26    Appendix E: References
27    Quotes

Methodology Design

Angle-poise lamp

This starts with a review of your team, what it’s trying to achieve and how it’s trying to achieve that. Following that, we can advise on and develop suitable methodologies that will work for you and what you’re trying to achieve.

The main focus with previous clients has been on integrating change methods and software development methods. Often the project management method is already present within the client, even if it’s just what their practitioners bring with them. Sometimes it’s a different combination of methods, or developing a new change control process that fits with what the stakeholders expect of it.

Want to know more? Then contact us.