Socitm Improve summary report

May 2018

“The user satisfaction survey has allowed us to get a better understanding of the user base and what they actually think about the ICT service.”

Mark Whelan, Service Innovation and Efficiency Manager, Derbyshire County Council


Contents

Introduction
A note on quartiles and the presentation of results
Socitm Improve – additional knowledge
Confidential
Executive summary
Finance
Schools
Cloud computing
VDI – Virtual Desktop Infrastructure
Cybersecurity
Outsourcing
User satisfaction
Reasons for satisfaction
Service improvement model
Key driver analysis
Weighted gap analysis
Cost module
ICT budget compared to organisational budget
Size of ICT organisation
Network expenditure
Expenditure per ICT employee
Expenditure changes over last 10 years
Estate module
Rates of ICT staff turmoil or churn over the last 10 years
Users per ICT FTE over last 10 years
Net change in ICT service size over last 10 years
Numbers of devices per person over the last five years
Numbers of people sharing a printer
Percentage of devices replaced (reviewed over last few years)
Performance module
Service desk calls per user
Service desk incidents compared to requests
Size of service desk
Support for corporate network
Network reliability
Resolving issues at first point of contact
Conclusions and thanks

Introduction

In April 2017, Socitm launched a completely upgraded version of its previously very successful benchmarking service. A key feature was the delivery of results in an interactive format using the Tableau online data visualisation tool, which allows real-time graphs and “drill down” into the data each participant provides. In response to requests from participants of the previous survey, which covered a large spread of interest areas, the 2017 survey was broken down into five modules:

• Cost
• Estate
• Performance
• Digital
• User satisfaction

As a result, the 2017/18 Improve programme (looking at 2016/17 data) was a success, attracting more participants to the three core elements of the survey (cost, estate and performance), with an additional number of participants electing to take the user satisfaction module only. Whilst the results have been accessible to all participants since they completed the online submission process, this report has been written to share the major findings with a wider audience. The data is anonymised so that only participants can identify themselves; there is, therefore, no “name and shame” aspect.

Fourteen organisations chose to take the three core modules. This year, an additional nine organisations selected to take part in the user satisfaction module only (with two organisations doing both the core modules and the user satisfaction module). This meant that overall there were 23 members taking part in the year. The breakdown of different types of member organisation taking part is also significant, in that it covers good representative types of IT service and authority type, each of which is coded:

• English unitary (E)
• Scottish unitary (S)
• Welsh unitary (W)
• English county (C)
• English district (D)
• London borough (L)
• Metropolitan borough (M)
• Third sector/charity (T)
• Health sector/NHS agency (O)
• Government body (O)

Additionally, the geographical spread extended from one end of Great Britain to the other, making this year a very representative sample.


A note on quartiles and the presentation of results

Readers of statistical results often infer that a first (upper) quartile result is good and a fourth (lower) quartile result is bad. For several indicators a lower quartile result could indeed be subjectively good: those related to costs, for instance. However, it is important to view each indicator in combination with the others. Having a low overall expenditure could be a sign of significant under-investment, especially if this is coupled with other negative indicators. Analysis of the results often suggests that many are “Goldilocks” indicators, where being somewhere close to the average (not too hot and not too cold, but just right) is the desired place to be.

Socitm Improve – additional knowledge

This report is intended as a companion to be read alongside the online Tableau view of their own data that each member taking part in the Improve service receives.

Confidential

The participants in this study have agreed that the information contained in this report is confidential. For that reason, the information is labelled using letter + number codes (‘X125’, for example) so that participants cannot be identified directly. When the report, or part of it, is required for use by appropriate external organisations – such as auditors and inspectors – this is permitted on the basis that such persons agree that (i) reference to the report should focus only on the participant concerned, and (ii) no reference to, or commentary about, another participant will be made.

The new Improve service incorporates a site visit from an experienced Socitm consultant to assist members in completing the process in line with published guidance – but also to ensure “fair play” and act as an informal audit of the data provided. This process creates a report for each council detailing the discussions and identifying any issues to be resolved prior to submission. The resultant reports, together with the observations of the consultants engaged, have enabled Socitm to analyse the results in the context of ICT in the public sector in 2017/18. This report reviews the vast array of statistics and indicators and attempts to interpret selected indicators that tell us what is happening to those Socitm members taking part. Where possible, it suggests what good values for some indicators look like, in the hope that those taking part will gain valuable insight into the areas that will allow them to improve.


Published by Socitm Ltd, 8a Basset Court, Grange Park, Northampton, NN4 5EZ. All rights reserved. This publication and the content of the online system from which it is derived may not be reproduced, stored in a retrieval system or transmitted in any form or by any means electronic, mechanical, recording or otherwise without prior permission of the publisher. Please direct enquiries about this report or the Socitm Improve service to [email protected] (telephone: 01604 709456). Further information may be found on the Socitm website at www.socitm.net


Executive summary

Finance

Socitm members clearly faced a year of challenges in 2016/17. However, the impact of financial restraint was not consistently evident across all those participating in the process. While many members have cut back on numbers of employees and operate on budgets that have reduced over a number of years, others are only just “feeling the pinch”, and some have seen recent investment or even expansion in operations by working in partnership with others. Overall, the picture is not one of total “doom and gloom” – though financial pressures are still high in the minds of the IT heads taking part.

Schools

For those authorities that previously supported ICT within schools, the picture in England and Wales compared to Scotland is markedly different. In England and Wales, the involvement with schools is now very much either “arm’s length” or non-existent, while many Scottish respondents still have an active involvement with their schools’ ICT. For the most part, the expenditure and staffing involved in school support have been excluded from the survey so as to compare like with like.

Cloud computing

Migrating to the cloud is a current hot topic, but only one of this year’s group has successfully completed a wholesale cloud migration; most participants have only limited exposure to cloud services. The organisation that did conduct a wholesale transfer had an outsourced ICT service operated by a major provider of cloud services, and did so as part of the award of a new contract. It is, therefore, fair to say that whilst in this case a sensible business case exists, the same has not been evidenced elsewhere in the survey group.

VDI – Virtual Desktop Infrastructure

Virtualisation of the desktop has had a mixed year in the survey group, with a number of respondents replacing a Citrix desktop environment with more traditional PCs or newer laptops/tablets. User satisfaction results for those “ditching” Citrix appear to have improved, as reliability issues with under-resourced infrastructures appear to have led to a collective “loss of faith” in that type of delivery. Advances in the control of individual devices using SCCM for updates, together with reduced unit costs for small form factor PCs, have also brought into question some of the benefits that VDI previously addressed. The push by organisations to be more agile and to have a flexible/mobile workforce has also led to pressure for more capable devices.

Cybersecurity

Security concerns and the requirements of PSN (public services network) compliance have not seen a rise in network expenditure; in fact, the reverse is true. Reductions in the numbers of remote offices (and, in many cases, councils concentrating onto a single site), along with increases in the amount of available bandwidth, have seen many respondents lower network spending. This reduction in spending has not dented overall security – as was evidenced in 2017, with no participating authorities suffering from the WannaCry ransomware attack that seriously disrupted parts of the NHS. Years of quite aggressive PSN compliance do appear to have worked, even after recent concessions to allow more flexible use of equipment.


Outsourcing

With the exception of the previously mentioned and very successful cloud migration by one outsourced ICT service, several of the group cited mounting costs and the inflexibility of outsourced partners as reasons to bring services back in-house.

Along with rolled back VDI implementations, a number of the survey group also took services back in-house in 2016/17 and are taking part in Socitm Improve to help monitor their new services on an ongoing basis.

The breakdown of the respondents in recent years can be seen below, with 21 fully in-house, 18 mostly in-house and just three fully outsourced.

[Chart: breakdown of respondents by sourcing model – fully in-house; mostly in-house, some outsourced; mostly in-house, some shared service; mostly in-house, some outsourced, some shared service; fully outsourced; mostly outsourced, some in-house, some shared service; mostly shared service, some in-house, some outsourced; equal mix of in-house, outsourced and shared service; not specified – on a scale from 0 to 20 respondents]


Numbers of ICT staff

The size of the ICT function and its composition has been a major debating point over the 20 years of the benchmarking service, with discussions revolving around whether the ICT department actually contains all the people supporting ICT or not.

Nearly all those involved in this year’s survey have incorporated departmental ICT into a central service; however, in one or two organisations this move is still being actively considered, as they have significant resources outside the ICT service in system administrator roles. The number of people supported by each member of the ICT service varies in this year’s data, from 24 in a district council (4.1% of employees) to 107 in a metropolitan borough (0.9%), with 36 to 72 users to each member of ICT more likely to indicate a sensible size of ICT function (approximately 3% to 1.5% of all employees).

The deployment of ERP (enterprise resource planning) solutions, such as Oracle or SAP, also appears to have a bearing on the size of the ICT function, with those running such solutions more likely to have ICT staff numbers closer to 3% of the total number of users.

Size of ICT service desk and calls per user

The number of users supported by an ICT service desk varies considerably, with some organisations supporting nearly 900 people per service desk person. However, those with the best satisfaction ratings are closer to 500 people per service desk person. Likewise, the average number of times an individual calls ICT varies from over 22 times a year to as little as three or four times a year. The optimum number of contacts appears to be in the region of eight to 12 per year, striking a balance between being useful and easy to contact, as opposed to either being too difficult to contact or needing to contact ICT too many times.

The end of the desk phone?

Only one of the participants has taken the “plunge”, but Socitm is aware of other members who have removed what are now traditional IP telephony switches and replaced them with a “mobile only” solution, meaning each member of staff has a mobile phone only, with no desk phone.

The move from calling a desk to calling a person presents members with interesting options in supporting a more mobile or agile workforce, but clearly requires suitable mobile device management (MDM) software to underpin some of the functionality found in fixed telephony solutions.

Feedback from workshops

As part of the Improve process, the majority of participants took part in two workshops held in 2017 and 2018. These workshops explored the reasons behind the results, both in the core modules and in the user satisfaction surveys (whose participants, as part of their process, did not complete the background data to the same level).

The detailed feedback remains confidential, but the overall trends and underlying information derived from it inform the commentary in this report.


Key metrics of the participants

Socitm ref | Type of organisation | Population | ICT staff (FTE) | Supported users*
C1750 | C | 750,000 to 999,999 | 189.5 | 6,911
C1775 | C | 500,000 to 599,999 | 202.3 | 5,094
D1773 | D | 100,000 to 149,999 | 12 | 291
E1754 | E | 500,000 to 599,999 | 160.4 | 6,084
L1751 | L | 300,000 to 399,999 | N/A | 3,247
L1772 | L | 200,000 to 299,999 | 47.1 | 3,007
M1752 | M | 750,000 to 999,999 | 246.9 | 13,553
M1753 | M | 200,000 to 299,999 | 30.1 | 3,228
M1755 | M | 500,000 to 599,999 | 147.3 | 5,670
M1760 | M | 200,000 to 299,999 | 102.6 | 2,646
O1774 | O | N/A | 2.2** | 238
S1768 | S | Less than 50,000 | 24 | N/A
T1762 | T | N/A | 89 | 9,094
W1760 | W | 50,000 to 100,000 | 49.1 | 2,639

* Employed supported users
** This NHS agency operated with an outsource partner and was unable to allocate a set number of FTE to them, so the figure refers to commissioning and strategic ICT staff only.


User satisfaction

This year’s Business User Satisfaction (or BUS) process within the Improve programme used the established 1 to 7 rating system, in which users of an ICT service were asked to rate a number of different aspects of the service received by means of an independent online survey. The results can be interpreted as follows:

1 – Extremely Dissatisfied / Very Poor
2 – Dissatisfied / Poor
3 – Slightly Dissatisfied
4 – Neither Satisfied nor Dissatisfied
5 – Partially Satisfied
6 – Satisfied / Good
7 – Extremely Satisfied / Excellent

The application of simple maths dictates that average scores of 7 and 1 are not attainable unless all participants score the same, so results near the extremes are unlikely. However, with one council scoring over 6.0 this year, and two others over 5.5, it shows it is possible to impress a large majority of users. None of the respondents had surveys that dropped into the dissatisfaction range, despite individuals often returning low scores, as in all cases there were a number of positive responses to balance them. The overall trend is one of improvement.
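As a simple illustration of how averages behave on this scale (the response counts below are invented for the example and are not drawn from any participant’s survey), even a distribution with a visible minority of low scores produces a mean comfortably inside the satisfied range:

```python
# Hypothetical distribution of 1-7 satisfaction responses (score -> number of users).
# Invented for illustration; not data from any Improve participant.
responses = {1: 3, 2: 5, 3: 10, 4: 20, 5: 45, 6: 60, 7: 25}

total = sum(responses.values())
mean_score = sum(score * count for score, count in responses.items()) / total

print(f"{total} responses, mean score {mean_score:.2f}")  # 168 responses, mean score 5.26
```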

[Chart: user satisfaction scores compared to the last time the survey was taken (2017 vs previous), on the 1.00 to 7.00 scale, for S1768, D1763, C1750, S1770, M1764, S1765, S1769, S1766, S1771, S1767, D1757 and E1759]


Reasons for satisfaction

One of the principal interests in comparing satisfaction ratings is to try and learn what works well and what does not. In some cases, there are elements that may be entirely local, but it can be the case that general approaches or specific technologies have a bearing. The top three councils represent a Scottish unitary, an English district and an English county. With a similar pattern, the bottom three represent a Scottish unitary, an English district and an English unitary. It can, therefore, be deduced that size in this case does not matter.

It is also true to say that the highest scores correspond to those with previously low scores, so where you have come from is clearly a factor.

Main reasons for positive scores:

• New ICT management – “just do it” attitude
• Streamlined procedures, with more freedom for ICT operational staff to make decisions
• Recent investments in new equipment
• Having a robust five-year replacement/refresh cycle for equipment

Main reasons for negative or low scores:

• Ageing equipment/performance issues
• Lack of flexibility in a thin client environment
• Using Windows Mobile!
• Security restrictions causing problems
• Problems with deploying new tablet devices

Service improvement model

One of the key features of the Socitm Improve service is the ability to match expectations of service with actual delivery across a wide range of issues. Using this model, it is possible to show how close members are to delivering what is important to their service users. Generally, members with better satisfaction have scores closer to 100% in a number of key areas.

The two graphs below (adapted from the Socitm Improve Tableau online interface) demonstrate the differing results between a top performing council and a less well performing council in this year’s group.

[Graph: top performing council – appropriate training 62.55%, communications 74.50%, fitness for purpose 77.35%, reliable system 78.50%, support service delivery 78.36%, trusted management 74.63%]

[Graph: less well performing council – appropriate training 60.26%, communications 56.49%, fitness for purpose 55.19%, reliable system 54.23%, support service delivery 52.35%, trusted management 59.84%]


The gaps are obvious. Whilst training is similar in both organisations, the council that had the best overall satisfaction score (top graph) consistently outscored the less well performing council (bottom graph) by around 15-25%. So, the recipe for an excellent service is clear: just have wonderful staff who fix things first time, operating reliable systems that are fit for purpose and help the organisation!

Key driver analysis

In practice, the service improvement model is a means of establishing not only how you measure up to others but also, using the new online Tableau presentation of key drivers, which areas of the business are the source of the results. For example, when looking at the fitness for purpose score (of 55%) of the less well performing council, it can be seen that in one area the average rating was only 45% (see below).

This clearly indicates a significant issue in adult services for this authority, as other areas of the council have much higher ratings, and this area should be investigated. In this particular case, the relatively low score for “ability of ICT support staff to fix problems” also points to either a poor relationship with the system supplier or a skills gap that needs addressing.
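The drill-down itself is done interactively in the online Tableau view; purely to illustrate the idea, the sketch below groups a set of invented survey responses by segment and ranks the segments from lowest to highest average rating, so the one to investigate first surfaces at the top. The scores and their distribution are hypothetical, not the council’s actual figures.

```python
from collections import defaultdict

# Invented (segment, fitness-for-purpose rating as a fraction of the maximum) pairs,
# used only to illustrate the drill-down idea; not real Improve data.
responses = [
    ("Adult services", 0.45), ("Adult services", 0.43), ("Adult services", 0.47),
    ("Children's services", 0.58), ("Children's services", 0.62),
    ("Communities and place", 0.57), ("Resources and transformation", 0.61),
]

by_segment = defaultdict(list)
for segment, rating in responses:
    by_segment[segment].append(rating)

averages = {segment: sum(r) / len(r) for segment, r in by_segment.items()}

# Rank segments from lowest to highest average rating; the lowest is the one to investigate.
for segment, avg in sorted(averages.items(), key=lambda item: item[1]):
    print(f"{segment}: {avg:.0%}")
```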

[Chart: key driver ratings for the less well performing council, filtered to the adult services segment (filters cover role, segment, main device and mode of working). Drivers shown include K1 working relationships; K2 political and senior management commitment; K3 downtime; K8 speed of response to requests for assistance; K12 communication channels; K13 resource plans for new systems and developments; K15 lead times; K17 fitness for purpose of the hardware provided; K19 fitness for purpose of the business systems provided; and K20 the quality of training provided. Ratings range from 0.40 to 0.80; calculated from 41 responses]


Weighted gap analysis

Continuing the lessons of identifying areas to concentrate improvement, Socitm Improve helps identify the perceived gap between how important an area of delivery is and its actual delivery. Taking the same less well performing council, the results show big gaps in a number of key areas for this directorate (in this case, larger numbers equal a greater gap, so the larger the bar the greater the issue – see the ‘weighted gap analysis – fig.1’ graph on the next page). The high value for training is reflected across all organisations, and Socitm is looking to revise the way these questions are phrased in future, as all our members report a much lower take-up of training than the apparent demand. The two biggest gaps in delivery against importance for this adult services directorate are fitness for purpose of the business systems and fitness for purpose of the hardware provided, whereas other systems are clearly more acceptable (for instance, fitness for purpose of the office systems provided). This lends further evidence for reviewing both the business system in question and the hardware provided.

For those interpreting their own results, we have picked the detailed results of one of the top performing councils for comparison (see the ‘weighted gap analysis – fig.2’ graph on the next page). It can be seen that, overall, each of the individual scores is lower, indicating a closer relationship between the importance rating and delivery (the lower axis has a maximum of 16, not 20). However, within adult services it is clear the same issues with the business system and hardware provided exist in both councils, as both show as the top two issues (after training). In this case, the principal difference is the significantly lower gaps for “customer service skills of ICT support staff” and “ICT staff understanding of the user’s business”, together with “working relationships” – these point very clearly to why the overall gap is so much lower (between 2 and 4 in the better performing ICT service and between 9 and 10 in the service that is struggling more). So, in addition to reviewing the hardware and business system in the first council, it would be as well to look more closely at these areas of “soft skills” within the ICT service.
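The report does not set out the exact weighting formula behind these charts, so the sketch below is only a rough illustration of the idea: it assumes each driver’s weighted gap is the average importance rating multiplied by the shortfall of delivery against importance, so that a more negative number means a bigger issue. The ratings themselves are invented.

```python
# Hypothetical importance and delivery ratings on the 1-7 scale for a few drivers.
# The weighting below is an assumption for illustration, not Socitm's published method.
drivers = {
    "K20. The quality of training provided":                     (6.6, 4.0),
    "K19. Fitness for purpose of the business systems provided": (6.5, 4.1),
    "K17. Fitness for purpose of the hardware provided":         (6.4, 4.2),
    "K1. Working relationships":                                 (6.0, 5.6),
}

def weighted_gap(importance: float, delivery: float) -> float:
    """Assumed formula: importance-weighted shortfall (more negative = bigger problem)."""
    return -importance * max(importance - delivery, 0.0)

# List the drivers with the largest (most negative) weighted gaps first.
for name, (importance, delivery) in sorted(drivers.items(),
                                           key=lambda kv: weighted_gap(*kv[1])):
    print(f"{weighted_gap(importance, delivery):6.1f}  {name}")
```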

"Continuing the lessons of identifying areas to concentrate improvement, Socitm Improve helps identify the perceived gap between how important an area of delivery is and its actual delivery"


[Weighted gap analysis – fig.1: weighted gaps for the adult services segment of the less well performing council across the drivers listed above (K1 to K20), on a scale from 0 to -20; calculated from 41 responses]

[Weighted gap analysis – fig.2: the same view for one of the top performing councils, on a scale from 0 to -16; calculated from 41 responses]


Cost module

All organisations in the public sector have to demonstrate value for money. The Socitm Improve service helps in this process, but it is important to balance what services cost against the scope of the estate they support and the level at which they perform. In examining metrics from the cost module, we will attempt to highlight interactions with other factors. Local circumstances and priorities do affect the cost base of councils, so there can be no absolute right and wrong answers. Underinvestment in ICT services has a habit of occurring in cycles, with major problems often leading to investment and a lack of immediate problems leading to cost-cutting.

Key factors affecting costs are:

• Geography and the number of supported sites – having a lot of sites connected remotely adds significantly to base costs
• The level of outsourcing of other service areas, or partnership working
• Use of some systems that lead to significantly greater demands for support, configuration or development (such as SAP or Oracle ERP solutions)
• Whether or not the council has reduced desktops in line with flexible working policies
• How many remote devices have been issued or supported
• The level of involvement in supporting schools

Not all of these factors are recorded in the Improve service; however, we will attempt to draw lessons from relative size and expenditure.

ICT budget compared to organisational budget

The individual district council D1773 shows as one of the most expensive in relative terms (see the table below). This is almost certainly due to the lack of economies of scale, as it will require a disproportionately large fixed resource in the form of servers and staff to support many of the same systems as larger councils; as detailed later, its figure for network costs is disproportionately high and will need further investigation.

Socitm ref | Type of org. | Org. net expenditure (£m)* | ICT expenditure (£m) | ICT as % of org. expenditure
C1750 | C | 750 to 999 | 17 | 1.8%
C1775 | C | 500 to 749 | 19 | 2.9%
D1773 | D | Less than 50 | 1 | 3.7%
E1754 | E | 750 to 999 | 14 | 1.6%
L1772 | L | 500 to 749 | 8 | 1.4%
M1752 | M | 1,250 to 1,500 | 28 | 2.0%
M1753 | M | 300 to 499 | 4 | 1.0%
M1755 | M | 300 to 499 | 18 | 4.6%
M1760 | M | 500 to 749 | 8 | 1.4%
W1760 | W | 150 to 299 | 6 | 2.9%


Investigating the figures, there is an interesting observation regarding outsourcing. The highest percentage of expenditure was for M1755, a council that had only recently (in the year in question) brought its service back in-house, as well as supporting an Oracle ERP. By comparison, L1772 has recently re-let its outsourced contract, included a successful move to cloud-based services across all services, and shows as one of the lowest cost organisations in relative terms. These comparative pieces of information suggest that, whilst there is a trend to bring outsourced services back in-house to better control costs, there are examples of the outsourcing process working in the right circumstances. E1754 brought services back in-house a couple of years prior to this comparison and has reduced expenditure noticeably. Notwithstanding the caveats provided at the start of this section, an overall expenditure in the region of 1.5% to 2.5% would appear to be a sensible range. Site visits confirmed that, with only 1.0% expenditure, M1753 was struggling with underinvestment to deliver the services the council required.

"Local circumstances and priorities do affect the cost base of councils, so there can be no absolute right and wrong answers"


Size of ICT organisation

There are a number of ways of assessing the relative size of an ICT organisation. We have selected the relationship between the number of users and ICT staff to draw comparisons, as this, to some extent, evens out issues relating to which services of the organisation are outsourced and which are not. It is also a comparison our third sector partners can take part in (notwithstanding that they support a smaller number of applications than others in the Improve service).

Socitm ref | ICT staff (FTE) | Supported users | Users per ICT FTE
C1750 | 189.5 | 6,911 | 36.5
C1775 | 202.3 | 5,094 | 25.2
D1773 | 12 | 291 | 24.3
E1754 | 160.4 | 6,084 | 37.9
L1772 | 47.1 | 3,007 | 63.8
M1752 | 246.9 | 13,553 | 54.9
M1753 | 30.1 | 3,228 | 107.2
M1755 | 147.3 | 5,670 | 38.5
M1760 | 102.6 | 2,646 | 25.8
T1762 | 89 | 9,094 | 102.2
W1760 | 49.1 | 2,639 | 53.7

These figures suggest that an optimum number of ICT staff should be in the region of the average of these ratios, i.e. around 50 users for each member of the ICT team. As before, smaller councils are likely to suffer from a lack of economies of scale, so figures around 25 users may not be much of a surprise. The charity sector member may not be overstretched with 100-plus users per ICT employee, as the number of systems supported is smaller. However, other members at either extreme (around 25 or 100) are likely either to have potential for streamlining (C1775, M1760) or should examine whether they have sufficient resources to cover their business requirements (M1753).
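As a worked illustration of how the users-per-FTE ratio relates to the percentage view used earlier in the report (ICT staff as a share of the people they support), the sketch below recalculates both from a few of the figures in the table above; a ratio of around 50 users per ICT FTE is simply ICT staff amounting to about 2% of supported users.

```python
# ICT staff (FTE) and supported users, taken from the table above.
participants = {
    "D1773": (12.0, 291),
    "C1750": (189.5, 6911),
    "M1753": (30.1, 3228),
}

for ref, (ict_fte, users) in participants.items():
    users_per_fte = users / ict_fte
    ict_share = ict_fte / users  # ICT staff as a fraction of supported users
    print(f"{ref}: {users_per_fte:.1f} users per ICT FTE ({ict_share:.1%} of supported users)")

# The rule of thumb in the text: around 50 users per FTE is equivalent to 1/50 = 2.0%.
```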


Network expenditure

This area of cost is considerably influenced by geography and the number of locations supported.

Socitm ref | Network revenue expenditure (£m) | ICT revenue expenditure (£m) | Network as % of ICT expenditure
C1750 | 2.454 | 17.083 | 14.4%
C1775 | 1.426 | 18.917 | 7.5%
D1773 | 0.549 | 1.261 | 43.5%
E1754 | 1.852 | 13.750 | 13.5%
L1772 | 0.286 | 7.544 | 3.8%
M1752 | 1.305 | 27.962 | 4.7%
M1753 | 0.488 | 4.232 | 11.5%
M1755 | 0.900 | 17.573 | 5.1%
M1760 | 1.438 | 8.190 | 17.6%
W1760 | 0.634 | 6.299 | 10.1%

It is interesting to note that the lowest percentage of expenditure is with a London borough (L1772) that has carried out a cloud migration; this is perhaps counterintuitive. Councils C1750 and E1754 both cover significant land areas, and their relatively high ratios of expenditure are understandable. The two other members with relatively high ratios of expenditure (D1773 and M1760) should examine their networking costs more closely. The network cost per user is related and shows a strong relationship with the percentage expenditure; the largest single factor does appear to be the geography of the council in question.

[Chart: network expenditure per user, from £0 to £500, for M1760, C1750, E1754, C1775, W1760, M1755, M1753, M1752 and L1772; D1773 appears to have been excluded from the graph so as to aid presentation]


Expenditure per ICT employee

Various factors govern the expenditure per member of staff. Unfortunately, for commercial reasons, the outsourced London borough has not completed this section for comparison. These figures relate not only to pay rates but also to the level of overhead costs per employee.

[Chart: expenditure per ICT employee, from £0 to £70,000, for M1752, T1762, C1775, M1755, C1750, E1754, D1773, W1760 and M1760]

It is likely that overhead costs feature in the costs associated with M1752, as pay rates in the north of England (where all the M-prefix councils operate) are not normally regarded as high. The charity sector member T1762 does, however, operate in the Greater London area, so might expect this to be a greater part of its costs. It is surprising to see a district council not at the bottom of the chart; together with its previously mentioned high networking costs, this suggests its overheads are unusually high and require investigation. M1760, on the other hand, appears to be underpaying its staff for a metropolitan council.

Expenditure changes over last 10 years

One of the unique features of the Socitm Improve service is the access to data stretching back in time. The impact of the financial crisis of 2008/9 can clearly be seen in 2010/11, as can the introduction of austerity measures in 2014/15:

[Chart: % ICT spend by year, 2006-7 to 2016-17, on a scale from 0.00 to 2.50]

These figures are averages across all organisations, and the profile often differs. The expenditure percentages are also relative to the total expenditure of the organisations they support: in returning to pre-crash percentages, ICT is, of course, doing so with relatively less money, as host organisations cut back and become more efficient. This is a positive story, with evidence that ICT has been able to contribute to overall savings, having perhaps been “over cut” in response to pressures in 2010-11 and 2014-15.


Estate module

For most organisations, the direct comparisons of estate between organisations tell members very little other than that they are different! But what they can do is put into context the type of ICT service they operate. The analysis below concentrates on changes in the ICT estate over time and shows how the world has changed.

Rates of ICT staff turmoil or churn over the last 10 years

The most important factor in any ICT service is the employee base. Recent years have seen interesting fluctuations in the rate of turmoil, or churn, in numbers of ICT staff. It is now on average over 21%; in 2008/9 it was down to under 12%, as the initial impact of the financial crisis was first felt. The greater mobility of staff suggests that competition for skilled ICT staff has increased in recent years. ICT services now routinely have to deal with one in five staff leaving each year, as their skills are sought by businesses willing to pay more for them.

[Chart: ICT staff turmoil (churn) by year, 2006-7 to 2016-17, on a scale from 10% to 22%]

"One of the unique features of the Socitm Improve service is the access to data stretching back in time"


Users per ICT FTE over last 10 years

Cuts corresponding to national funding reductions also appear to align with the number of employees each member of an ICT service supports in the wider organisation – with larger numbers of people covered immediately after cuts, reducing to smaller levels as a reaction to “over-cutting” or as service areas themselves cut back on numbers of staff. This analysis tends to suggest the “axe” falls first on ICT and then elsewhere.

Net change in ICT service size over last 10 years

It can be seen that the overall trend in the size of ICT functions has been one of reduction year on year for each of the last 10 years, with an increase only in the last year measured. This reflects anecdotal feedback from consultancy visits: most ICT services have sufficient resources to “keep the lights on” but struggle to support major new projects in the rest of the organisation or the replacement of existing systems.

[Chart: users per ICT FTE by year, 2006-7 to 2016-17, on a scale from 40 to 65]

[Chart: net change in ICT service size by year, 2006-7 to 2016-17, on a scale from -7% to +3%]


Numbers of devices per person over the last five years

Despite cutbacks, there appears to have been something of an explosion in the number of devices supported per person; this is mostly explained by the increase in mobile devices, such as smartphones and tablets. On average, individuals now have at least three devices (a PC or thin client, an IP telephone and a smartphone or mobile device). Whereas in the past many staff appear to have shared limited numbers of fixed devices, the greater use of mobile devices does appear to be a factor. The reduction in the last year may point to some recent rationalisation.

Numbers of people sharing a printer

The introduction of multi-function devices (MFDs) over the last 10 years appears to have been something of a success, with the number of people sharing printers showing an increase over time. The variance in some years is due to differing numbers of larger authorities taking part in the survey. However, the trend is clear. Members should aspire to have more than 20 users per printer/MFD.

[Chart: number of devices per person, 2012-13 to 2016-17, on a scale from 0.0 to 5.0]

[Chart: users per printer, trend over the last six years (2011-12 to 2016-17), on a scale from 5 to 21]


Percentage of devices replaced (reviewed over last few years)

This measure can be used in a variety of ways. In some cases it can show a regular refresh cycle of four, five or six years (25%, 20% or 16.67% of devices replaced each year), or it can show an investment cycle in which there has been little or no investment in the year in question (less than 10%), or a major replacement in a single year (more than 30%).

With the majority of purchases being below the 15% mark, this would suggest that a lot of devices are being stretched beyond a life of seven years. As this trend appears to be consistent over the last few years, it points to many members “sweating their assets”. It can be seen that one respondent has had to make a significant investment in devices (39% replaced), but more typically somewhere around the four to five-year mark (20 to 25% of devices) would appear sensible, limiting the resources required for replacement in a single year.

[Chart: percentage of all devices purchased in the benchmarking year, by respondent, on a scale from 0% to 40%]
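The link between the percentage of devices replaced in a year and the implied refresh cycle is straightforward arithmetic; the short sketch below (illustrative only) converts between the two, using the thresholds mentioned above.

```python
def replacement_pct(cycle_years: float) -> float:
    """A steady refresh cycle of N years means replacing 100/N % of devices each year."""
    return 100.0 / cycle_years

def implied_cycle_years(replaced_pct: float) -> float:
    """Replacing X% of devices in a year implies, at a steady rate, a 100/X year cycle."""
    return 100.0 / replaced_pct

for years in (4, 5, 6, 7):
    print(f"{years}-year cycle -> {replacement_pct(years):.1f}% of devices replaced per year")

for pct in (39, 15, 10):
    print(f"{pct}% replaced this year -> implied cycle of roughly {implied_cycle_years(pct):.1f} years")
```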


Performance module

The ICT service is often the most easily criticised unit in any organisation, as people find it easier to vocalise frustration or dissatisfaction with ICT systems than they do with other people. However, this module provides some more objective measures to complement those of the user satisfaction module.

Service desk calls per user

Taking the total number of interactions for both service incidents and requests, it is possible to see how heavily used (or, for that matter, over- or under-used) an ICT service desk is. Whilst many ICT organisations strive to reduce service desk calls (for instance, by reducing password reset or account unlock requests), having a low number of calls is not always the best outcome, just as having a very high number (particularly if they are fault reports) is also not desirable. Like any service business, having a good service often encourages people to call, email or report online rather than trying to resolve their issue without using the ICT service! So, what is a “normal” level of calls? Looking at the data for last year, it would suggest that over 12 calls a year on average for all users (a call per month per user) is a bit excessive, and analysis of the types of call and any root causes is required (M1760 and, to a much lesser extent, M1755). Those with fewer than six calls per user per year (one call every other month) should really investigate the effectiveness of their service desk, as the low number of calls suggests difficulty contacting the service or active avoidance (M1753). Being in the region of eight calls per year per user would appear subjectively good.

[Chart: service desk calls per user, on a scale from 0 to 25, for M1760, M1755, C1750, L1772, C1775, D1773, W1760, E1754, T1762 and M1753]


Service desk incidents compared to requests

Most respondents now operate ITIL-based service desks (or at least a suitable subset of the full model). Whilst not consistent, it is generally the case that an incident is recorded when something goes wrong (it is a fault), and a request is recorded when ICT are required to do something that is not a fault (a password reset, a new user, a change in permissions or access rights, etc.). Balancing the desire for people to use your service for assistance against reducing the number of reasons for someone to call is difficult. W1760 may wish to check their categorisation of incidents as opposed to requests, as the balance appears high, with many more incidents than requests; the site visit data did not suggest unreliable systems, so it is more likely to be the categorisation of incidents. M1760 and C1750, on the other hand, may wish to check their request processes, as two thirds of all calls are requests against a high number of calls in total, suggesting business processes that require too many interactions with the ICT service.

[Chart: calls per user split between incidents and requests, on a scale from 0 to 80, for W1760, M1752, L1772, C1775, D1773, E1754, M1755, M1753, T1762, M1760 and C1750]


Size of service desk

Obviously, larger organisations will be able to achieve higher numbers of users per service desk FTE, and the role and definition of a service desk FTE will alter in larger organisations (the larger the ICT function, the more specialist the roles tend to be). The figures also reflect the approach of the organisation: whether it operates first line staff with a limited ability to resolve issues, or a more skilled “first contact fix” philosophy. Figures at the high end of the chart (L1772, T1762 and M1753) suggest a very limited role for first line staff – or, potentially, if coupled with high calls per user, difficulty in contacting the ICT service desk. C1750 has a very good user satisfaction score, so operating with over 500 users per service desk FTE does appear efficient and perhaps should be aspired to. Smaller organisations will find it difficult to achieve this (particularly if they only have 200 users or fewer and require their first line staff to carry out many other duties).

"...operating with over 500 users per service desk FTE does appear efficient and perhaps should be aspired to"

[Chart: users per service desk FTE, on a scale from 0 to 1,000, for L1772, T1762, M1753, C1775, M1755, E1754, C1750, M1760, M1752, W1760 and D1773]


Support for corporate network

Only one member, C1750, provided 24/7 support for their corporate network. The majority of respondents now only provide between eight and 11 hours a day, five days a week.

[Chart: supported hours per week for the corporate network, by respondent, on a scale from 0 to 180]

Network reliability

Only one respondent, E1754, had any significant downtime in the relevant year. All others had no downtime (outside of agreed or scheduled maintenance) or only a matter of a few hours.

Resolving issues at first point of contact

The trend over the last 10 years is very much one of “upskilling” first line service desk personnel to enable more issues to be resolved at first contact.

[Chart: percentage of issues resolved at first point of contact, by respondent, on a scale from 0% to 100%]


Conclusions and thanks

The changing financial climate over the last 10 years has driven many changes in ICT; the impact of security restrictions from PSN compliance and the conflicting requirements for more flexible working have certainly been challenging. This year’s Socitm Improve service does suggest, however, that many members are continuing to improve services despite these pressures. Some of the statistics suggest that a degree of “over-cutting” has taken place that has then required revised budgets, with those cut the earliest now requiring investment, and those who avoided cuts earlier having to suffer them. 2016/17 appears to be the first year in the last ten where ICT services stopped getting smaller on average.

The service deliberately does not attempt to provide a “beauty parade” and hold up individual authorities as beacons of good practice. The figures suggest that all members taking part can improve elements of their delivery. The important element of the new service is to allow each partner to monitor its own improvement from year to year – and where they need to make the case for changes, can compare their resources to those who may perform better.

The report author would like to thank all those who welcomed him into their organisations and provided their time and energy to collect all of the relevant data. He looks forward to seeing many of them again next year and at the next set of workshops. He would also like to thank the Socitm office for their support and organisational skills.

"The important element of the new service is to allow each partner to monitor its own improvement from year to year – and where they need to make the case for changes, can compare their resources to those who may perform better"


Socitm Improve 8a Basset Court, Grange Park, Northampton, NN4 5EZ Stay connected: Tel: 01604 709456 Email: [email protected] Website: improve.socitm.net Twitter: @Socitm Linkedin: Socitm KHub: KHub.net/Socitm
