Volume 18 ● Number 3 ● Autumn 2011

Building bridges through facilitation

Journal of

The Association for Management Education and Development


Building bridges through facilitation: a special edition in collaboration with the International Association of Facilitators to mark their European conference, Istanbul, Turkey, 14-16 October 2011

This edition of e-O&P may be downloaded from the AMED web site, priced at £27.50 (£14 for members of IAF; free for full members of AMED and e-O&P subscribers).

© AMED 2011. ISSN 2042-9797. Copyright of this e-Organisations and People journal remains jointly with The Association for Management Education and Development and with the author(s). However, you may freely print or download articles to a local hard disk, provided they are for your personal and non-commercial use only. Please ensure that you acknowledge the original source in full using the following words: 'This article first appeared in e-Organisations and People Vol 18 No 3, Autumn 2011 and is reproduced by kind permission of AMED www.amed.org.uk'. For permission to reproduce article(s) from this journal more widely, please contact the AMED Office www.amed.org.uk, Tel: +44 (0)300 365 1247. The views expressed in this journal by both editorial staff and contributors are not those of AMED or any of the organisations represented by the editors, but reflect the opinions of the individual authors only.

Contents

CONTEXT

Building bridges with words (page 3)
Rosemary Cairns and Bob MacKenzie. Celebrates the power of the bridge metaphor in spanning various perspectives on facilitation and offers a snapshot of the articles.

Reflections on the history of professional process facilitation (page 13)
Richard Chapman. Provides a personal view on how professional process facilitation emerged and has developed since WWII.

Facilitation training for the real world (page 21)
Viv McWaters and Johnnie Moore. Introduces a novel improvisational approach to helping people become confident facilitators.

The power of transformative facilitation (page 31)
Annette Moench and Yoga Nesadurai. Creates a conceptual framework for supporting 'transformative facilitators' for a changing world.

FACILITATOR PRACTICE

Building bridges (page 42)
Ann Alder. Offers an approach to help clients learn how to learn through working with patterns.

Spanning a divide (page 50)
Sarah Lewis. Illustrates how a facilitator deals with the challenge of assuming temporary group leadership.

The art of online facilitation (page 57)
Simon Koolwijk. Identifies twelve distinctive factors and eight competencies for successful online facilitation.

FACILITATING FACILITATORS

Transforming trainers into facilitators of learning (page 67)
Pamela Lupton-Bowers. Shows how a shift from 'death by PowerPoint' to lively experiential learning enables subject matter experts to embrace facilitative interventions.

First person plural (page 79)
Bob MacKenzie. Suggests how learning facilitators can build bridges between their multiple selves and those of others using a personal self-facilitation framework.

TRANSFORMATIVE FACILITATION

Less is more (page 87)
Vicky Cosstick. Argues that the less a facilitator appears to do, the greater the opportunities for transforming conversations.

Building a future together (page 96)
Jonathan Dudding and Ann Lukens. Demonstrates how participatory techniques can help all stakeholders develop a strategic plan while building capacity.

Facilitating local peacebuilders (page 105)
Rosemary Cairns. Highlights how facilitation helps local peacebuilders to know and increase their impact in areas of conflict.

Proving you're worth it (page 115)
Jeremy Wyatt. Demonstrates a facilitative approach to generating meaningful 'hard' evaluation data for local organisations.

Invitations

An invitation to 'Building Bridges', the 2011 IAF European Conference in Istanbul, 14-16 October (page 123)
Advance information about the Winter 2011 issue of e-O&P (page 125)
Write for e-O&P Spring 2012! An invitation to contribute a story about 'making the invisible visible' (page 126)
Would you like to get involved in producing e-O&P? We're looking to expand our networks of guest editors, contributors and critical friends. See how you can contribute (page 128)
The AMED Writers' Group: your invitation to come along to forthcoming events (page 129)
Your invitation to the joint IAF/AMED post-publication, post-conference workshop in London, 'Building bridges through facilitation', Friday 23 March 2012 (page 130)
About our associations: the International Association of Facilitators (IAF) and the Association for Management Education and Development (AMED) (page 131)

____________________________________________________________________________________________________________
e-ORGANISATIONS & PEOPLE, AUTUMN 2011, VOL. 18, NO. 3 www.amed.org.uk

Proving you're worth it: facilitating impact evaluation

Jeremy Wyatt

Evaluation of publicly funded programmes has traditionally been a function of external 'objective' evaluators. But using a facilitative approach and workshops can help organisations use the lessons from evaluation to a greater extent than is usually the case. In this example, a facilitated workshop has been refined to provide the foundation for a collaborative learning process.

Keywords: evaluation, outcomes, prioritisation, self-evaluation

Introduction

Now more than ever, people and organisations have to demonstrate that they are delivering value for money when using public funds. There's nothing new about this, and there is a raft of guidance and instruction on how to do it: the UK Government Treasury sets out its approach in the "Green Book" (HM Government, 2003 and updates). While some may question whether "evidence-based policy making" really operates at the highest levels of public decision-making, there are strong signals in the UK that demonstrating value is being taken much more seriously.

For example, Peter Wanless, Chief Executive of the Big Lottery Fund, wrote in his blog on 19th March 2011: "So outcomes are essential and will become more so. Measurement (quality and consistency) is crucial too". And in a similar vein, Graham Stuart MP, chair of the education select committee inquiry into young people's services, said on 26th January this year: "It does seem an extraordinary failure that you [the youth sector] can't make a better fist at explaining the difference you make."

Traditionally, impact has been demonstrated through technical evaluations conducted by academics and consultants. Substantial research and consultancy budgets have been spent on evaluation studies by national and local government and the organisations they fund. As managing director of a research and evaluation consultancy, it pains me to recognise that much of this effort has been wasted, along with the money that funded it. This is not because of flawed evaluation methods or poor analysis and reporting, although that may sometimes be the case. Rather, it is because research and evaluation studies rarely affect either policy or practice. External 'objective' evaluators conduct research and deliver their conclusions as reports which all too often end up sitting on the shelf. Evaluation has been something done to people and organisations rather than with them. Thus when the report is completed, it is often ignored, and the real learning remains with the researchers.

It's quite difficult to give an example of such waste without causing offence, and you'll understand I'm not keen to offend past and potential clients. But here's an illustration of the sort of thing I mean, based loosely on two real examples.

An example of wasteful, imposed evaluation

Not so long ago, a major national UK agency funded several projects to the tune of between £0.5m and £1.5m each. The organisation's goals were clear and, from the start, it took on evaluators to explore how the goals were being met. This was a good beginning: many evaluation studies are commissioned at a programme's "mid-term" or end point, when it is generally too late to gather robust and meaningful data. The evaluators began to put in place systems that would measure, among other things, the improvements in well-being experienced by thousands of project beneficiaries.

So far, so good. But it quickly became plain that the projects had rather different goals from those of the national funder. Their aims were often less ambitious than the national programme's, and they intended to monitor what they did (activities), not what happened as a result (outcomes). They saw the evaluation efforts as an unwelcome administrative burden and a way of checking up on them, rather than a learning exercise. As a result, organising the evaluation became quite challenging, and the evaluator had to find new methods to gather data directly from beneficiaries rather than through the projects.

Despite this, it proved possible to gather sufficiently robust data to demonstrate programme impact, at least at a national level. There were interesting findings, including the relative value of capital and revenue investment and the true costs of making a significant difference to people who started off with quite low levels of well-being. Despite the lack of detailed project impact data, it was possible to identify key elements in projects that worked well compared to those that didn't. But by the time the report was completed, policy and people at the national agency had moved on. Local projects had never been keen on the evaluation in any case, so while the final evaluation report was technically sound and included interesting conclusions, there was, quite simply, nowhere for it to go.

A contrasting, facilitative evaluation approach

Over the last five years or so, my company (Hall Aitken) has been developing approaches that contrast sharply with the typical case study I've just described. They are based on using a facilitative approach overall, and specifically on using facilitated workshops. The genesis of our approach was a series of commissions from national organisations aimed primarily at helping their funded projects to self-evaluate.

Various funders have commissioned a range of support services and programmes for grant holders, and provide advice and written guidance to encourage self-evaluation. Many of these initial attempts, including some of our own, tried to translate large-scale technical evaluation approaches into a smaller-scale activity that would be more manageable at a local level. Thus, for example, most self-evaluation guides would include a glossary of terms aiming to translate and explain evaluation jargon.


Example 1: working with youth organisations

Over time, we have moved on from this starting place. Now, rather than providing robust evaluations with well-founded recommendations that aren't acted on, we focus on helping projects take a few steps towards focusing on outcomes, and towards gathering and thinking about evidence to track their performance. The resulting evaluation studies are far less technically robust but promise to make a much bigger practical difference.

For example, we've been working with a series of local YMCAs, all of whom are affiliated to the national organisation but act independently. They've been increasingly keen to demonstrate and understand the undoubted value of their work, both for themselves (to improve their service) and for funders (to be able to win funds to do more). Rather than applying our technical evaluation expertise at arm's length, we have engaged closely with the local YMCAs.

We kick-started the process by facilitating a workshop for a representative group of some 30 or so front-line workers and managers at different levels. Using 'sticky note' clustering and simple prioritisation tools such as sticky dot voting, we started to identify the key changes the groups deliver for young people, rather than the activities they provide. At that first workshop, we also began to explore some methods they could use to identify their success in achieving these changes, using focused discussions. By the end of the workshop, we had identified where there was consensus and where there were disagreements or differences in emphasis that we would need to resolve later.

Figure 1: Simple clustering

Principles underpinning our facilitated evaluation workshops

While we have since refined this initial workshop, we are still using the principles we started with. These are:

● Challenging participants to think about the changes they produce (outcomes) from the viewpoint of their beneficiaries and other stakeholders.

● Developing a story of change to help participants articulate how they achieve the outcomes they claim.

● Relentless prioritisation, to end up focusing on measuring a few important things really well rather than trying (and often failing) to gather enormous quantities of information.

● Using both qualitative approaches (softer evidence that doesn't easily turn into numbers) and quantitative techniques (hard figures) to 'triangulate' findings.

After the initial workshop, we held a series of consultations with small and larger groups, facilitated and unfacilitated, to develop a common system focusing on nine key 'transitions' that capture the most important changes the organisation has produced.

Table 1: Key transitions summary (Outcome: key transition)

Well-being: Scores above average on a well-being scale.
Confident and effective: Has high measured self-efficacy.
Independent living: Can live independently with minimal or no support from agencies.
Positive activities: Is regularly involved in music, arts, sport or an organised club or group.
In education, employment or training: Regularly attends school, college or community-based learning.
Volunteering: Gives time regularly as a volunteer.
Skills and qualifications: Achieves a recognised work-relevant skill or qualification at a level (or in a subject) not previously achieved.
Physical activity: Takes the minimum recommended amount of moderate exercise.
Healthy eating: Typically eats 'five a day' (fruit and vegetables) as part of a healthy, balanced diet.

Source: Hall Aitken

Piloting this approach with several of the YMCAs has been the next step and we‟ll shortly be rolling out the approach further, with some refinements.

Example 2: Evaluating mentoring support In another example, we are working with the Scottish Mentoring Network, which helps local projects recruit and support mentors in a range of contexts. Our initial workshop, involving both local project and national support agency staff, built on our experience to that point. We succeeded in making faster progress and included more in a half day than we would have been able to achieve in a full day using our initial approach. At this stage, the key improvements came from using a range of voting systems to identify where easy consensus existed and therefore avoid unnecessary discussions on these issues.


Figure 2: Using positive and negative sticky dot voting

One effective approach has been to use 'sticky dot' voting with both positive and negative votes, following a sticky note clustering exercise. This quickly identified the most important issues (in line with our design principles). At the same time, a collection of both positive and negative votes for the same statement identifies it as a key area of disagreement.

This initial workshop led us to identify a small number of key outcomes at a national and local project level, along with just three ways of tracking these changes: a self-efficacy measurement tool (Schwarzer & Jerusalem 1995: 35-37), a story-based technique (Davies and Hart 2005), and basic collection of data on the numbers of people involved. In consultation with a key member of the national team, we developed an outline for a simpler system that was circulated to the member organisations for agreement. With a positive response, we moved on to five follow-up workshops held around the country that helped participants to replicate and understand the original thinking process and then try out the measurement tools. These workshops were a combination of training and facilitation.

We are now in an implementation phase, and we expect to learn and move forward, refining the approach and the systems as we go. Feedback from these and other experiences strongly suggests that we have made good progress in helping organisations think about what they are achieving and how they might do better. Furthermore, this has been at a fraction of the cost of past large-scale evaluations. The interest being shown in our branded version of this process (Indicate™) suggests that we have struck a chord with many. Some particular lessons and observations that we can draw from the process may be of interest to others.
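The tallying logic behind this positive-and-negative dot voting can be sketched in a few lines of code. This is a hypothetical illustration only: the function name, the statements and the vote thresholds are invented for the example and are not part of Hall Aitken's actual process.

```python
# Hypothetical sketch of positive/negative sticky dot vote tallying.
# Thresholds and example statements are illustrative assumptions.

def classify_statements(votes, priority_threshold=3, contest_threshold=2):
    """votes maps each clustered statement to (positive, negative) dot counts.

    Returns a label for each statement:
      'contested' - substantial positive AND negative votes: needs discussion
      'priority'  - strong net positive support: easy consensus, skip debate
      'low'       - too few votes either way to pursue
    """
    labels = {}
    for statement, (pos, neg) in votes.items():
        if pos >= contest_threshold and neg >= contest_threshold:
            labels[statement] = "contested"   # key area of disagreement
        elif pos - neg >= priority_threshold:
            labels[statement] = "priority"    # consensus without long discussion
        else:
            labels[statement] = "low"
    return labels

votes = {
    "Young people gain confidence": (7, 0),
    "Projects should track attendance only": (4, 5),
    "Volunteering is a key outcome": (2, 1),
}
print(classify_statements(votes))
```

The point the code makes explicit is the same one the workshops rely on: a statement attracting many dots of both colours is not noise but a flag that focused discussion is worth the time, while one-sided scores can be settled without debate.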


Reflections and possibilities

Delivering and refining essentially the same workshop for a variety of groups covering different fields of work has also been instructive for us. It has enabled us to realise a number of important principles.

The first, and perhaps most obvious to facilitators, is the power of doing things with, rather than to, people. We have worked with our clients to help them find and implement solutions, drawing on our expertise but owning the decisions themselves. In doing so, we have begun to show that 'subjective' self-evaluation can make much more difference than traditional objective evaluation approaches.

The precise use of language really does matter, possibly even more than is emphasised in typical facilitation training. For example, a key part of our workshop is having participants identify their key outcomes. Initially we explained what an outcome is, ran a short exercise to help with understanding, then used focused discussion in groups to develop a long list for later prioritisation. Now we simply provide a list of starting words for an outcome (better, faster, longer, shorter and so on) and go straight to having small groups generate a long list for their own context. Thus we have reduced a two- to three-hour exercise to between 30 and 60 minutes while still achieving the same result, but with better understanding, buy-in and contribution. Smaller changes to the language we use when introducing exercises have also made a significant difference to the progress we have been able to make.

Extensive discussion is overrated. An apparent lack of group consensus often relates more to the argumentative tendencies of some group members than to an actual difference of viewpoint. Thus a facilitator can be led into an extensive discussion which produces a conclusion that the participants were well aware of before it ever started. This isn't to say that such discussions can't throw up new perspectives and insights, but the reality is they often don't. So by using a series of different approaches to voting (including shows of hands, priority ordering, sticky dot voting, and so on), we've been able to identify quickly where there is consensus and where there is disagreement that requires focused discussion.

Although we have succeeded in designing a process that achieves consensus very quickly, it does not always work. Specifically, while mixing managers and front-line staff works well in small groups, it does not with larger numbers. In large groups of, say, 30 or so, no amount of division into smaller groups, use of different exercises or additional facilitators produces a useful result, and we have had to do follow-up work before we've been able to generate consensus and move forward. We've therefore started taking the easy way out, running an initial workshop with managers to set an overall framework and then repeating this with front-line staff to get them to test and refine it. So far this seems to be working well and is in line with our pragmatic approach of making a few steps forward rather than trying to change everything all at once. But it is an area that we'll look at refining further in the future.

No matter how tightly structured a programme is and how well it is designed, individual facilitation skills are still critical. We've now reached a point where the workshop is carefully structured to provide a series of interlocking exercises that help participants to explore their practice and reach consensus on how to move forward with self-evaluation. We have a much greater level of detailed guidance and more use of structured tools for this workshop than we would normally use. Because we offer a structured follow-up service and a 'mass customised' set of tools, it is really important that each workshop arrives at more or less the same point. This also helps us keep our service very keenly priced. But as individual facilitators tackle things in very different styles, we are now working to flex the workshop to suit not only the participants but also the facilitator's style.

Conclusion

Overall, we feel we've made significant progress from the days when our expensive evaluation reports only served to make bookshelves look full. We are using facilitation skills in a range of ways and helping our clients achieve change in a step-by-step fashion. Most recently, we've also been helping them to tie their own self-evaluation efforts into demonstrating financial value, using approaches borrowed from Social Return on Investment (Cabinet Office 2009).

So we are quickly moving to a position where using a facilitative approach (which to some people can sound very soft and qualitative) will produce hard financial proof of value. All we have to do now is convince more people to spend less to get a better result.
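The core of the Social Return on Investment arithmetic mentioned above is a simple ratio: the discounted (present) value of the monetised outcomes divided by the investment made. The sketch below illustrates that calculation only; the figures are invented and do not come from any Hall Aitken evaluation. The 3.5% default reflects the standard discount rate in the Treasury's Green Book.

```python
# Illustrative SROI ratio calculation: present value of monetised
# outcomes divided by investment. All figures here are invented examples.

def sroi_ratio(investment, annual_outcome_values, discount_rate=0.035):
    """annual_outcome_values: monetised outcome value per year, from year 1.

    discount_rate: 3.5% by default, the UK Green Book's standard rate.
    Returns pounds of outcome value per pound invested.
    """
    present_value = sum(
        value / (1 + discount_rate) ** year
        for year, value in enumerate(annual_outcome_values, start=1)
    )
    return present_value / investment

# e.g. £100,000 invested, producing £40,000 of monetised outcomes
# per year for three years:
ratio = sroi_ratio(100_000, [40_000, 40_000, 40_000])
print(f"£{ratio:.2f} of value per £1 invested")
```

Note that the hard part of SROI in practice is not this arithmetic but agreeing which outcomes to monetise and at what value, which is exactly where the facilitated prioritisation described earlier earns its keep.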

References and websites

www.hallaitken.co.uk
www.indicatesupport.co.uk

HM Government (2003, and updates). The Green Book. www.hm-treasury.gov.uk/data_greenbook_index.htm

Schwarzer, R., & Jerusalem, M. (1995). Generalized Self-Efficacy scale. In J. Weinman, S. Wright, & M. Johnston, Measures in health psychology: A user's portfolio. Causal and control beliefs (pp. 35-37). Windsor. http://userpage.fu-berlin.de/~health/engscal.htm

Davies, R., & Hart, J. (2005). The Most Significant Change Technique. Care International. www.mande.co.uk/docs/MSCGuide.pdf

Cabinet Office (Office for the Third Sector) (2009). A guide to Social Return on Investment. See also www.thesroinetwork.org

About the Author Jeremy Wyatt is Managing Director of Hall Aitken, a research, advice and support consultancy working with organisations that support excluded individuals and disadvantaged communities throughout the UK (www.hallaitken.co.uk). He holds an MBA, is a Chartered Fellow of the Chartered Institute of Personnel and Development, a Certified Professional Facilitator and a member of the UK Evaluation Society. He has worked in the public, private and not-for-profit sectors in a range of training, management and leadership roles.
