Human-Computer Interaction Design
Ben Bedwell, Horizon Digital Economy Research Institute

HCI
•  A synthesis of tools, methods and knowledge from a range of disciplines that enable the practitioner to:
–  Evaluate user interaction with existing interactive systems
–  Reveal and understand the implications of new designs on user experience

the “Human Factors of Computers” = HCI sold short!

HCI
•  Beyond evaluation
–  What is the best design to support the intended activity?
–  What range of activities can be supported?
–  What haven't we already thought of?

•  Beyond display design
–  Multi-modal, embedded, wearable, mobile, tangible, paper-based …
–  Intentional and unintentional interaction

HCI
•  Beyond the lab
–  Bugs and opportunities will reveal themselves
–  Demand characteristics and novelty come second to pressures of real life

•  User centred (users as co-designers, not just subjects)
–  Subjectivity: users sometimes know themselves better than you do …
–  Ripple effects: identify and understand networks of stakeholders, not just users
–  Resistance: combat unfamiliarity through buy-in and evangelism

Research in HCI

Research in HCI
•  Requires application
–  This should not preclude rigour in methods or reliance on proven theory: good HCI research is not "build something fun and see what happens"
–  May require the researcher to trade off generalisability against innovation: good HCI may pave the way for others to prove your results, or build on and strengthen the vision of others

Research through design
1.  Understand and specify context of use
–  Ethnography → Thematic analysis
2.  Determine the user requirements
–  Participatory design → Functional requirements
3.  Produce design solution(s) that meet requirements
–  Agile development → Functional prototypes
4.  Evaluate solutions against requirements
–  Usability heuristics → Key performance indicators
5.  Iterate solutions around change as appropriate
–  Longitudinal study → Contextual change indicators

Research through design (cycle)
1. Understand and specify context of use → 2. Determine user requirements → 3. Produce design solutions → 4. Evaluate solutions → 5. Iterate solutions → back to 1

Horizon workplace energy

C-Aware onwards
"Digital innovation to improve users' awareness of their personal energy consumption, and modify their energy demand"
•  Focus on the workplace rather than the home or transport
•  New technologies for monitoring and feedback of behaviour related to energy consumption (not process improvement)

Context of use (research through design, stage 1: understand and specify context of use)

Context of use
Ethnography: "Immersion in local culture to understand the underlying social machinery" – James Colley, 2013
–  Attitudes and values
–  Roles
–  Relationships

Context of use
Ethnography
•  To support inductive analysis
–  What are the patterns of interaction in the local community?

Context of use
Ethnography
•  Contextual inquiry (directed)
•  Participant observation (hands-on)
•  Observational study (hands-off)
Qualitative: rich, subjective

Context of use
UoC Computer Lab & UoN Horizon
•  Identification of significant energy stakeholders, roles and interactions
–  5 working days' observational study
–  2 hours contextual inquiry (over 2 sessions with Facilities and Research managers)
–  2 hours unstructured interview (over 3 sessions with Facilities and Research managers)

Context of use
UoC Computer Lab & UoN Horizon
•  Identification of significant energy stakeholders, roles and interactions
–  40 hours of video, with transcriptions
–  300+ photos
–  Lots of field notes

Thematic Analysis
Collect data → Browse to become familiar → Generate starting codes
Patterns in the data (what, how, when, why, who, where)

Thematic Analysis
Code: Temperature
•  08:13 – In his office, [Facilities Manager] wakes computer running Building Management System and checks temperatures of each monitored space in building against desired levels (21°C); [FM] says: "some temperatures are lower, but those ones will rise later naturally due to heat that occupants give off when they arrive".
•  11:47 – Staff and students in café area appear to be clustered away from fire door that is currently wedged open for delivery man; can hear discussions about draughts.
•  17:22 – [FM] receives email on his phone during walk around building from [Professor X]: "Will meeting rooms on the first floor be heated at 6pm? I booked them for an evening event but they're currently pretty cold […]".

Thematic Analysis (full process)
Collect data → Browse to become familiar → Generate starting codes → Code data, tracking new codes → Iterate coding while new patterns emerge (reduction/complication) → Generate themes from codes (a model of relationships between codes)
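To make the reduction step concrete, the sketch below shows one way coded excerpts could be stored and rolled up into a candidate theme. It is a minimal, hypothetical illustration (Python; the excerpt data and code labels are invented, loosely echoing the temperature example above) of the mechanics only: choosing codes and themes remains the analyst's judgement.

```python
from collections import Counter
from dataclasses import dataclass, field

@dataclass
class Excerpt:
    """One fragment of field data (note, transcript line, photo caption) plus its codes."""
    source: str
    timestamp: str
    text: str
    codes: set[str] = field(default_factory=set)

# Invented excerpts, loosely based on the "Temperature" coding example above
excerpts = [
    Excerpt("field notes", "08:13", "FM checks BMS temperatures against 21C target",
            {"temperature", "control mechanisms", "organisation"}),
    Excerpt("field notes", "11:47", "Staff cluster away from wedged-open fire door",
            {"temperature", "working conditions"}),
    Excerpt("field notes", "17:22", "Professor emails FM about cold meeting rooms",
            {"temperature", "communication", "organisation"}),
]

# A candidate theme is a set of related codes chosen by the analyst, not computed
theme = {"working conditions", "control mechanisms", "organisation",
         "communication", "motivation"}

# Reduction: gather the excerpts that evidence the theme and count code occurrences
evidence = [e for e in excerpts if e.codes & theme]
code_counts = Counter(c for e in evidence for c in e.codes if c in theme)

print(f"{len(evidence)} excerpts support the theme")
print(code_counts.most_common())
```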

Thematic Analysis
Theme: imbalance in agency over working conditions
•  Related codes
–  Working conditions: temperature, lighting, noise, physical space, …
–  Control mechanisms: door lock, thermostat, air conditioning system, radiators, booking system, …
–  Organisation: operations management, facilities management, colleagues, neighbours, …
–  Communication: support ticket, email, complaint, request, operations meeting, gossip, …
–  Motivation: reduce cost, improve efficiency, do my job, feel uncomfortable, …

Thematic Analysis
Theme: imbalance in agency over working conditions
•  Thematic links
–  Ideal working conditions defined and maintained by management
•  Facilities Manager centrally controls air flow and chilled-beam systems to attempt to maintain working conditions at "ideal levels" determined in management meetings (based on trade-off between acceptable comfort and operational costs); air and chilled-beam systems are set to operate automatically within limits unless FM needs to intervene
–  Staff have awareness and control over only certain elements of infrastructure
•  Staff can control heating systems locally (via thermostats); most non-management staff appear unaware that temperature is affected by both local heating and centrally controlled air-flow/chilled-beam systems
–  Communication channels between staff and FM exist, but have politics attached
•  Facilities Manager uses mass emails to inform staff of any changes to operation of building infrastructures; staff can contact FM indirectly via ticketing system, or directly via email, phone or by visiting his office; it appears that only management staff contact FM directly, sometimes as a result of staff complaining to their line manager
–  Staff take unexpected measures to affect working conditions
•  There is a history of staff affecting air flow (e.g. by jamming windows) to affect air quality, or regulate temperature rapidly; FM must manually adjust the air-flow/chilled-beam systems to compensate for knock-on effects on temperature, and fix damage caused by staff DIY

Thematic Analysis
Result?
•  Model of the current state of the culture into which technology can be placed
–  Policies, regulations and incentives that technology can align with/alter
–  Existing technology specifications and uses that technology must function alongside, or can extend/replace
–  Key user-groups of the technology, and relationships it can augment

User requirements (research through design, stage 2: determine the user requirements)

User requirements
Participatory design: "an approach to the assessment, design, and development of technological and organizational systems that places a premium on the active involvement of workplace practitioners (usually potential or current users of the system) in design and decision-making processes"

User requirements
Participatory design
•  User-design (design by users)
•  Co-design (design with users)
•  User-centred design (design shaped by users)
Usability & user empowerment

Future workshop
•  Design activities with mixed group of user representatives
•  Shares the advantages of focus groups
–  raise and explore locally-salient issues
–  expose local language and culture
•  And benefits of participatory design
–  empowerment of participants through treatment as experts and involvement with researchers
–  increase vested interests in technology
–  instill understanding of process of development

Future workshop
Three phases
•  Critique
–  Assess utility of archetypal technological solutions to encourage reduced energy consumption in the workplace
•  Unpack positive and negative elements of design
•  Determine how utility varies across user groups
•  Fantasy
–  Consider what would encourage reduced energy consumption in this workplace
•  Determine how widespread the benefits would be: who is the user?
•  Implementation
–  Evaluate the feasibility of implementing fantasy designs
•  Does the user exist?
•  How much would it cost?
•  How would the workplace need to change?
•  Does the technology change or align with the workplace culture?

Future workshop Critique •  What do we need to avoid doing?

Future workshop Critique •  “Demonisation” of energy consumption –  Energy consumption != waste: “don’t I consume more if I work harder?” –  One size does not fit all: “how can anyone else really judge what it is necessary for me to consume?” –  Morality: “I don’t like someone else making me feel guilty about doing my job”

Future workshop
Critique
•  Scope of behaviour change
–  Targeting individuals: "how much can I really change by myself?"
–  Hidden infrastructure: "don't hold me responsible for consumption by infrastructure that I cannot control"
–  Apportionment: "it isn't fair that blame and reward are distributed evenly when I have to consume more or less than others. We don't get paid the same amount!"

Future workshop Fantasy •  What should we try to do?

Future workshop
Fantasy
•  Bringing colleagues to a consensus on waste
–  Visibility is necessary: "we only really take action when we cause each other inconvenience"
–  Cooperation makes actions significant: "I think it's hard to make a difference individually, but when we get together we can kick up a fuss"
–  Waste is a common issue: "we all agree that waste is bad, despite not really agreeing on the green side of things"

Future workshop
Fantasy
•  Revealing management motivations
–  Skepticism born of ignorance: "we're likely to be suspicious if everything gets decided behind closed doors; they might be telling the truth or lying"
–  Organisational pride: "now that I know what good things [FM] is trying to do, I'd really like to help, I just didn't know about it, and still don't really know how I'd have found out otherwise"
–  Local knowledge: "I think we know far more about how this place really works"

Future workshop Implementation •  What should we be sensitive to?

Future workshop
Implementation
•  Political sensitivity
–  Cohabiting with existing incentives: "some policies and incentives get made for political reasons or for PR: that's part of running a business and we can't contradict them"
–  De-skilling: "this would make my job easier, but I'd be worried that it makes me more dispensable"
–  Flattening hierarchies: "it's great talking like this in this workshop, but on a normal day I think it would make me look like a trouble-maker"

Future workshop
Implementation
•  Cost
–  Installation: "the outlay for installing new infrastructure is always a big deal, and it's always more costly than we expect"
–  Maintenance: "it's hard enough keeping the basics running day-to-day"
–  Proving benefits: "we would always need to show on paper that we had made the right decision to procure a new system; can we do that in this case?"

Design solutions (research through design, stage 3: produce design solutions)

Design solutions
(Light) Agile Development: "promotes adaptive planning, evolutionary development and delivery, a timeboxed iterative approach, and encourages rapid and flexible response to change"

Design solutions
(Light) Agile Development
•  Cross-functional team(s)
•  Regular contact with/participation by end-users
•  Simple, quick iterations (time-boxing)
•  Feature addition/extension immediately but only when necessary

LAD
•  Development roles
–  Development team
•  "Technician"
•  "Developer"
•  "Interaction designer"
–  User testers
•  "Technical user"
•  "Dumb user"
•  "Marketer"

LAD
•  Development roles
–  Development team includes users
•  "Technician"
•  "Developer"
•  "Interaction designer"
•  "Technical user"
•  "Dumb user"
•  "Marketer"

LAD
•  Development roles
–  Development team includes users and roles/skills are shared
•  "Electrical technician who can develop software, but is also the departmental manager"
•  "Developer who is learning technical skills as well as social science methods"
•  "Interaction designer who has experience of studying organisational change"
•  "Technical user who has a background in commercial marketing"
•  "Dumb user with strong social links to management and staff and an interest in electronics"
•  "Academic marketer with a doctorate in Psychology"

uTherm
From PoV of staff
•  Increasing transparency of temperature management/opening access to management decisions
From PoV of management
•  Monitoring local preferences, not just local environment ("staff as sensors")

uTherm
Component 1: local temperature monitoring
•  Distributed, independent infrastructure
–  Local, not management control
–  Low power (avoid the "energy hypocrisy")
•  Mapped to structure of local responsibilities and relationships
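A minimal sketch of how a single monitoring node in Component 1 might behave, assuming a hypothetical read_temperature() sensor driver and a locally owned store; the node identifier, peer-group name and sampling interval are invented for illustration, and a real deployment would run on low-power embedded hardware rather than desktop Python.

```python
import time

def read_temperature() -> float:
    """Hypothetical sensor driver; on real low-power hardware this would be a
    MicroPython/embedded routine rather than desktop Python."""
    raise NotImplementedError("replace with an actual sensor read")

# Each node is mapped to a local peer group, not to the building-wide BMS
NODE = {"node_id": "lab2-window", "peer_group": "Lab 2", "interval_s": 300}

def monitor(store: list) -> None:
    """Sample periodically and record locally; there is no central control path."""
    while True:
        store.append({
            "node": NODE["node_id"],
            "peer_group": NODE["peer_group"],
            "celsius": read_temperature(),
            "at": time.time(),
        })
        time.sleep(NODE["interval_s"])  # a long interval keeps power draw low
```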

uTherm
Component 2: personal preference lobbying
•  Simple (esp. quick and convenient!)
•  Allow anonymous contribution to sentiment of local peer group
–  I trust the sentiment of my peer group
–  I value being a part of that group
–  I do not want to be singled out over the sentiment of that group
•  Provide understanding of decision-maker's (facilities manager) view
–  The preferences we have expressed
–  The state reported by FM's monitoring system
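A hypothetical sketch of the lobbying interaction in Component 2: contributions are recorded without identities, aggregated per peer group, and shown alongside the state reported by the FM's monitoring system. The function names, sentiment labels and example figures are invented for illustration.

```python
from collections import Counter

# Votes are stored without identities: only the peer group and the sentiment
# ("too_cold" / "ok" / "too_warm") are kept, so no individual is singled out.
votes: list[tuple[str, str]] = []   # (peer_group, sentiment)

def lobby(peer_group: str, sentiment: str) -> None:
    """One-tap anonymous contribution to the group's sentiment."""
    assert sentiment in {"too_cold", "ok", "too_warm"}
    votes.append((peer_group, sentiment))

def group_view(peer_group: str, fm_reading_c: float) -> dict:
    """What a staff member sees: the group's sentiment next to the FM's monitored state."""
    counts = Counter(s for g, s in votes if g == peer_group)
    return {"peer_group": peer_group,
            "sentiment": dict(counts),
            "fm_monitored_celsius": fm_reading_c}

# Example: three colleagues in Lab 2 lobby, then check the shared view
lobby("Lab 2", "too_cold"); lobby("Lab 2", "too_cold"); lobby("Lab 2", "ok")
print(group_view("Lab 2", fm_reading_c=19.5))
```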

uTherm
Component 3: community preference mapping
•  Analytical views to inform central decision-making
–  View to provide detail on distinct peer-groups
–  View to allow comparison between peer-groups
–  View to allow historical comparison
•  Independent of building management system
–  Mobile to allow reference while surveying the building
–  Supplementary to (not replacement for) existing feedback systems
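One possible shape for the analytical views in Component 3, sketched with invented data: a comparison view across peer groups and a historical view for a single group. The week/score encoding (-1 = too cold, 0 = ok, +1 = too warm, derived from Component 2's votes) is an assumption made only to keep the example small.

```python
from collections import defaultdict
from statistics import mean

# Hypothetical records: (week, peer_group, sentiment_score)
records = [
    (1, "Lab 2", -1), (1, "Lab 2", -1), (1, "Cafe", 0),
    (2, "Lab 2", 0),  (2, "Cafe", 1),   (2, "Cafe", 0),
]

def by_group(records):
    """Comparison view: average sentiment per peer group."""
    groups = defaultdict(list)
    for _, group, score in records:
        groups[group].append(score)
    return {g: mean(scores) for g, scores in groups.items()}

def history(records, group):
    """Historical view: how one group's sentiment changes week to week."""
    weeks = defaultdict(list)
    for week, g, score in records:
        if g == group:
            weeks[week].append(score)
    return {w: mean(scores) for w, scores in sorted(weeks.items())}

print(by_group(records))          # compare peer groups
print(history(records, "Lab 2"))  # track one group over time
```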

Evaluation (research through design, stage 4: evaluate solutions)

Evaluation
Participatory monitoring: "the systematic recording and periodic analysis of information that has been chosen and recorded by insiders with the help of outsiders"

Evaluation
Participatory monitoring
•  Collectively-defined KPIs for deployed system
•  Evidence-based
•  Regular

Participatory monitoring
Defining KPIs
•  What were the goals of deploying the system: how can success be measured?
•  Collectively, KPIs should represent the PoVs of all stakeholders
–  Are communal goals being met?
•  Individually, KPIs should represent specific PoVs of different stakeholders
–  Are certain stakeholder groups being favoured?
•  Method for measurement of each KPI should make sense to all stakeholders
–  May require knowledge transfer to achieve a shared understanding
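As a rough illustration of how collectively defined KPIs might be made explicit and checkable, the sketch below pairs each indicator with an agreed measurement function and target. The KPI names, thresholds and evidence figures are invented; the point is that every stakeholder can see exactly how each indicator is computed.

```python
from dataclasses import dataclass
from typing import Callable

@dataclass
class KPI:
    name: str
    stakeholder: str                  # "communal", "staff", "facilities management", ...
    measure: Callable[[dict], float]  # agreed, documented way of computing the indicator
    target: float
    higher_is_better: bool = True

    def met(self, evidence: dict) -> bool:
        value = self.measure(evidence)
        return value >= self.target if self.higher_is_better else value <= self.target

# Hypothetical KPIs; the evidence dict stands in for the digital data archive
kpis = [
    KPI("weekly lobbying participation rate", "communal",
        lambda e: e["voters"] / e["staff"], target=0.3),
    KPI("open comfort complaints per month", "facilities management",
        lambda e: e["open_tickets"], target=5, higher_is_better=False),
]

evidence = {"voters": 21, "staff": 60, "open_tickets": 3}
for kpi in kpis:
    print(kpi.name, "->", "met" if kpi.met(evidence) else "not met")
```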

Participatory monitoring
Evidence collection
•  Focus on digital data
–  Foster an understanding of how organisations/systems expect evidence
–  Easy to archive for comparison over time
–  Easy to reformat to determine a suitable shared language
•  Encourage local community to be reflexive
–  Do the perceived problems actually exist?
–  Can another community be expected to understand them?

Participatory monitoring
Regular evaluation
•  Longitudinal process
–  Change due to technology may manifest gradually over time/be triggered by a situation in the future
–  Changes due to other factors (e.g. changes in season) may have knock-on effects for technology
•  Reflexive process
–  Impact of technology and evaluation process should be reassessed
•  KPIs change to adapt to changes in technology/context

Iterate and adapt (research through design, stage 5: iterate solutions)

Iterate and adapt
HCI design for research is rarely a perfectly sequential process with atomic stages
•  Evaluation, LAD and design are likely to overlap, if not be concurrent

BUT conditions for an end-state for iteration should be specified (expectation management)

Iterate and adapt
Expectation management
•  Iteration may continue until major changes occur
–  Organisational change that makes adaptation infeasible
•  Human resources (redundancies, restructurings, transfers, etc.)
•  Funding
•  Policy
–  Technology is rendered obsolete
•  By changes to infrastructure
•  By changes in user expectations

Iterate and adapt
Expectation management
•  Iteration may continue until a goal is reached
–  Evaluation targets
•  KPI thresholds
–  Research objectives
•  Quantity of qualitative data
•  Quantitative proof of hypotheses
•  Negative results/method failures
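A small, hypothetical sketch of how the end-state conditions above might be written down explicitly, so everyone can see when iteration stops: either the agreed KPI targets are all met, or a recorded contextual change makes further adaptation infeasible. The event names and KPI labels are invented for illustration.

```python
# End-state check for the iteration loop: stop on blocking contextual change,
# or when every agreed KPI target has been met; otherwise keep iterating.
def should_stop(kpi_results: dict[str, bool], context_events: list[str]) -> tuple[bool, str]:
    blocking = {"restructuring", "funding_withdrawn", "infrastructure_replaced"}
    hit = blocking.intersection(context_events)
    if hit:
        return True, f"contextual change: {', '.join(sorted(hit))}"
    if all(kpi_results.values()):
        return True, "all KPI targets met"
    return False, "continue iterating"

print(should_stop({"participation": True, "complaints": False}, []))
print(should_stop({"participation": True, "complaints": True}, []))
print(should_stop({"participation": False}, ["restructuring"]))
```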

Outputs •  Customised research process –  Subset of available methods that has achieved useful goals for this context •  Generalisability: is the same process suitable for any other contexts?

•  Functional technologies –  As above

•  “Improvements” to culture –  –  –  – 

New relationships between/within user-groups Incentives aligned more closely with local values Clarity over local norms (soft-landing for newcomers) Boundaries of communities defined

•  Instilled understanding of design and evaluation methods, and of motivations for change