A LONG WAY TO GO … A Study on the implementation of the learning-outcomes based approach in the EU and the USA

Tim Birtwistle and Robert Wagenaar


This study, conducted by the International Tuning Academy Groningen, was commissioned by the European Commission and Lumina Foundation for Education. The report reflects only the opinions of its authors. The European Commission and Lumina Foundation may not be held responsible for any use made of the information contained herein.


Table of contents

1. Introduction
2. Executive Summary
   a) Approach
   b) Major findings
3. History and Context
4. Methodology
   a. Research methodology
   b. Structure
5. Terminology
6. Survey results
7. Visits process and results
8. Comparison of EU and US outcomes
9. Examples of good practice
10. Conclusions and Next steps

Appendix A - Visit Documents
Appendix B - List of Countries, States and Subject Areas

Research team

Robert Wagenaar, Coordinator of EU study
Tim Birtwistle, Liaison EU (International Tuning Academy) - USA (Lumina Foundation for Education)
Courtney Brown (Lumina Foundation), Coordinator of USA study

Researchers:
Ingrid van der Meer and Robert Wagenaar, University of Groningen, The Netherlands
Tim Birtwistle, Professor Emeritus, Leeds, United Kingdom
Edurne Bartolomé Peral, University of Deusto, Bilbao, Spain
Anna Serbati, University of Padova, Italy


1. Introduction

More than a decade has passed since 2003, when, as part of the Bologna Process and through the Berlin Communiqué, the Ministers of Education encouraged the member States "to elaborate a framework of comparable and compatible qualifications for their higher education systems, which should seek to describe qualifications in terms of workload, level, learning outcomes, competences and profile". As a consequence of this call, European Higher Education Institutions were urged to re-define their degree programmes in output-based terms; in other words, to make these programmes student-centred so as to prepare graduates best for their future role in society. This approach became the axiom for modernizing higher education in Europe.

Three years earlier a project had already been launched, with the support of the European Commission, by a significant group of renowned universities to develop an approach that would offer the instruments to make the required modernization a reality. This project was named Tuning Educational Structures in Europe, in short Tuning [1]. Since 2003 the methodology which was developed has spread gradually over the globe. In 2009 Tuning USA was initiated by the Lumina Foundation [2], involving Higher Education Institutions in four US states.

By 2010 the need was felt to check whether in two world regions, the USA and Europe, the intended modernization was actually taking place and how this process was perceived by its main stakeholders. To find this out, a study was set up and implemented during the period 2011-2012, the purpose of which was to develop robust evaluation survey instruments [3]. It resulted in two quantitative instruments: one questionnaire for students, and one for university staff - academic staff, management, student counsellors, etc. Already during the implementation of this first study, the need was felt to extend the set to other stakeholder groups, graduates and employers, and to enhance and deepen the existing instruments. This resulted in a follow-up study, which covered the period July 2013 – January 2016.

This report offers the outcomes of the second phase of this challenging study. Again, as was the case with the first study, this study was based on close cooperation between the International Tuning Academy Groningen, the Netherlands, and Lumina Foundation for Education, Indianapolis, USA. The US mirror project started several months later, with some of the groundwork carried out for Lumina by staff of the Institute for Evidence-Based Change (IEBC) [4], based in California. IEBC was also responsible for supporting the Lumina project Tuning USA. The EU part of the study was co-financed by the European Union [5], the USA part by Lumina Foundation. The report presented here focuses in particular on the findings in Europe, referenced against those of the USA.

The aim of this follow-up study was to further develop, test, improve and finalise the robust evaluation instruments developed in the framework of the first phase. These were designed to gather information and thus provide evidence of the relative impact on the learning environment of the Tuning process and comparable initiatives and activities. In terms of impact this should be evidenced by changes in behaviour brought about by adopting the Tuning process or comparable learning-outcomes based processes, changes in learning and teaching strategies and methodologies, and the provision of learning opportunities and assessment of student learning.
This is set against the overall objective of the Tuning approach: to prepare graduates better for their role in society, both in terms of employability and citizenship.

[1] Tuning website: http://www.tuningacademy.org/
[2] See: www.luminafoundation.org
[3] Tender reference first phase of study: Negotiated procedure EAC-2010-1243
[4] More information can be found at: http://www.iebcnow.org
[5] Tender reference second phase of study: Negotiated procedure EAC-03/2013


The Tuning Process reflects the paradigm shift from input or staff/expert centred learning to output-based, student-centred learning, which it has promoted in the framework of the Bologna Process and in reform processes that Tuning has initiated in other parts of the world. Although the Tuning approach has been well received and is widely used today, there is only limited evidence of how effective the student-centred approach is in practice for today's and tomorrow's society.

In the framework of this second phase, the quantitative instruments - the questionnaires for students and for academic staff and institutional management resulting from the first phase - were supplemented by two more 'inner instruments', questionnaires for graduates and employers focussing on the effectiveness of the Tuning approach for career development, and by qualitative or 'outer instruments'. These outer instruments, structured interviews and focus groups involving Higher Education Institutions' management, teaching staff (including student counsellors) and students, were designed, as part of the mixed methodology, to enable greater depth of information gathering and analysis. The materials gathered in this way should allow for analysis to test anecdotal evidence from Tuning projects across the globe, moving from anecdote to evidence, and to test whether the approach does indeed address current issues better than traditional forms of education. This responds to the expressed need, both in the USA and Europe, for hard data collected using a single methodology allowing analysis by project, subject, institution, region and group.

The outcomes of the first phase showed that the methodology used needed to be culturally sensitive, linguistically accurate, targeted and user friendly, whilst at the same time probing. To this end great care has been taken in this follow-up study with regard to the initial drafting and the subsequent first pilot and post-pilot analysis and adjustment. A second pilot was set up to make the necessary further adjustments to the evaluation instruments and to take these to scale. The application of the outer instruments allowed for regional variation whilst maintaining the core evaluation instruments data sets.

Implementation of the second phase processes has proven to be rather time consuming. Cooperation of Higher Education Institutions was not always easy to organise. Many institutions and their staff that were approached were reluctant to discuss the state of affairs in their institution, which hampered the collection of data. Institutions did not promote participation in the surveys, for whatever reason. This also proved to be the case for the application of the outer instruments, that is the visits to selected individual higher education institutions to hold interviews and focus group meetings. This applied to both Europe and the USA. It proved necessary to extend the original project period of the study to meet the planned objectives.

Nevertheless, the outcomes presented here offer - in the view of the research team - a reliable picture of the actual situation regarding the implementation of the modernisation of Higher Education. Although the team found excellent examples of good practice, the overall picture is worrying. It seems that the discourse related to the paradigm shift is now landing, but that overall the actual implementation is very slow to commence or, indeed, not taking place at all.
Only in places where tailored action has taken place, initiated by individuals because they were involved in Tuning, Thematic Network Programmes (TNPs) and/or ECTS related activities or through projects, has serious progress been made. When the findings in this study are compared to the Bologna Process Implementation Report 2015 [6], the European University Association (EUA) Trends 2015: Learning and Teaching in European Universities report [7] and the European Students' Union (ESU) Bologna with Student Eyes 2015: Time to meet the expectations from 1999 report [8], it can be seen that the state of implementation at Higher Education institutional level is even weaker than is stated in those reports. It is worth noting in this respect that in the ESU Peer Assessment of Student Centred Learning 'Putting students at the heart of learning' (2015) [9], it is observed that "Institutional reviews (…) rarely signify the aspect of teaching and learning as a core one, which also gives a false signal to the institutional leadership about priorities of management". What has to be noted is the disconnect between the various tiers of Higher Education, from the Ministers to the students, regarding the actual penetration of a student-centred/learning outcomes approach and the education experience of the students. In that sense this report should be perceived as a wake-up call for intensified action in which governmental structures, as well as Higher Education Institutions and their staff and students, should take responsibility. It is expected that the information gathered for and presented in this study will assist stakeholders, inter alia, in curriculum planning and development, resource allocation, pedagogy/andragogy, assessment and transparency of information. This should lead to a structure - in the form of tailored tools supported by evidence - to enhance study programmes on a regular basis to meet the changing requirements of society in terms of the most appropriate mix of knowledge, (new) skills and wider competences.

[6] European Commission/EACEA/Eurydice, 2015. The European Higher Education Area in 2015: Bologna Process Implementation Report. Luxembourg: Publications Office of the European Union.
[7] European University Association, 2015. Trends 2015: Learning and Teaching in European Universities, by Andrée Sursock.
[8] http://www.esu-online.org/asset/News/6068/BWSE-2015-online.pdf
[9] http://pascl.eu/publications/overview-on-student-centred-learning-in-higher-education-in-europe/


2. Executive Summary

a) Approach:

The initial project statement was that there is a need for evidence to substantiate claims about the penetration of the learning outcomes approach in Higher Education. To this end a robust methodology had to be designed, piloted, refined and used. A mixed methodology approach was adopted, combining surveys and visits.

(a) The surveys developed were on-line, with skip logic, and were aimed at:
- staff, management and academics;
- students.
Subsequently a Graduate survey was developed (and is being piloted) and an Employers survey is now being developed, because this will complete the loop of all stakeholders by involving the world of work. It was clear that institutions, staff and students feel there is overload regarding questionnaires, so they have to be made as user friendly as possible (the skip logic achieves this). The system has been kept open to enable a continuation of data gathering; the mechanisms allow for analysis in many ways, including analysis over time.

(b) The visits were very time consuming to set up and met some resistance from institutions because of uncertainty about the progress made (a reason actually given by some when refusing to take part). However, the visits were very informative and indeed revealing (see below).

The marriage of surveys and visits gave a very balanced insight, revealed a great many things and allowed for a cross-checking analysis between opinions (the surveys) and reflection (the visits), both within an institution and between institutions in a European setting. The mixed methodology seems to be the only reliable way to get an accurate picture of the higher education landscape because it allows a comparison of data. Furthermore, the ability to compare across continents using the same methodology gave a unique insight into higher education.

b) Major findings

- There is a lack of consistent use of terminology, leading to a great deal of confusion with implementation of the student-centred, output-based approach.
- Staff development, where it takes place, has all too often focused on process rather than on the concepts and benefits of the learning outcomes approach.
- There is a need to develop the skills base of those who are to undertake staff development; a proper infrastructure is needed, with recognition that this is both important and essential if the paradigm shift is to be achieved. It is unfortunate but true to say that the vast majority of staff have not undertaken any training or development for higher education teaching and all the needs of the new environment. Without this, any large-scale improvement will not take place.
- The discourse is taking place, but largely at ministerial and university and faculty/school leadership levels. The disconnect becomes apparent at other implementation and delivery levels.
- Students do feel the disconnect - lack of penetration to the user and a distinct lack of communication and involvement regarding the paradigm shift are evident in the vast majority of countries and institutions.
- Teaching staff are struggling to adjust to the new concepts and paradigm shift. This is difficult for them in terms of no longer being the "knowledge owner" but being required to be a facilitator of learning and analysis - acquisition of facts is palpably easy in the digital age; what to do with those facts is the difficult skill.


- Staff are having to adjust to close cooperation with their colleagues in terms of writing learning outcomes and then developing learning activities and assessment. This is also the case with regard to developing learning pathways.
- Institutions and systems are caught between the rankings and the need for research excellence on the one hand and the policy drivers towards teaching/learning excellence on the other. Most career enhancement still comes from research.
- The visits show that in every institution there are pockets of good practice. In all cases these are based on external stimulus through participating in relevant initiatives (for example ECTS implementation and its alignment to learning, Tuning subject and sectoral projects, Thematic Network Programmes (TNPs), trans-national integrated programmes/joint degrees, national level programmes for implementation). All actions need continued participation and stimulus - this is to ensure development and to provide for incoming members of staff.
- In most places there was a clear differentiation between the level of interest, engagement and willingness of young/new staff compared with established older staff. It was often found that in places there was clear, outspoken resistance to these changes, making penetration of the concepts and processes difficult for those staff (often junior) who wanted to be involved.
- The level of time and investment needed to achieve the results sought is not to be disregarded - much effort has to be put in to get the outcomes, which is difficult for already overburdened staff.
- The results to date of this research are significant in showing the level of penetration and the need, if this shift is to be achieved, for a renewed effort for staff development. The evidence base allows for policy to be further developed and aligned to this need. The study allows for evidence-based decision-making based on evidence gained from a robust methodology and a continuing growth of the data.
- This research is unique in that it uses the same methodology (adjusted culturally) in both the European Union and the United States of America. The findings, when looking across the results for both continents, are to a very large extent consistent. This report, unless stated otherwise, refers to the European Union findings.
- The distinction between learning outcomes (what a student knows and is able to do) and the outcomes of learning (the overall change, maturation and development that an individual personally gains) needs to be understood.


3. History and Context

The study originates from the co-operation between Tuning experts from Europe and Lumina Foundation for Education to enable the development of Tuning USA. The continued collaboration over the years has led to a fertile exchange of ideas and projects. One of the issues that came up was the mutual need felt for instruments to analyse the effectiveness of the many projects implemented over time. It was concluded that this would require a robust set of instruments that should answer the questions: what to evaluate, how to evaluate, what to seek to determine, etc. In 2010 the embryonic discussions culminated in a fully defined study proposal. This proposal was submitted to the European Commission DG Education and Culture as well as to the Lumina Foundation and endorsed by both organisations. The title of this first study was 'Co-operation in Higher Education between the United States of America and the European Union to produce a robust methodology to evaluate the application of the Tuning approach'. The Final Report for that work was submitted on 6 July 2012 to the European Commission and accepted.

As stated before, the two initiators of this first study were Tuning and Lumina Foundation. To understand the initiative, it is relevant to highlight the mission of both organisations.

Tuning Educational Structures in Europe, launched in 2000, is a university-driven process which offers a universal approach to implementing the Bologna Process at the level of higher education institutions and subject areas. The Tuning approach consists of a methodology to (re-)design, develop, implement and evaluate study programmes for each of the three Bologna cycles. It serves as a platform for developing reference points at subject area level. This work is based on a wide stakeholder consultation, including employers, graduates, students and academic staff. The reference points are relevant for making programmes of studies comparable, compatible and transparent. They are expressed in terms of competences (distinguishing between general, transversal and subject-specific ones) and learning outcomes. Tuning contributes to the development and enhancement of high-quality competitive study programmes by focussing on fitness of purpose (to meet expectations) and fitness for purpose (to meet aims), as well as providing a "living" assessment and pedagogical learning environment that is applicable to the "4ever" learners: whoever they may be, wherever they may be, however they learn, whenever they learn. The methodology transcends "delivery" and encompasses all learners.

The private Lumina Foundation for Education has, as part of its Goal 2025 to achieve 60% of Americans with high-quality degrees (by 2025), funded a number of analytical tracts on the Bologna Process [10] and projects (Tuning USA), as well as discussion working documents [11], with the help of U.S. and European higher education experts. Tuning USA [12] started with a pilot project (2009) involving three states and six disciplines, with a mix of two-year, four-year, public and private institutions. The initial pilot project was completed in August 2010. Tuning USA 2 was launched in early 2012 with a combination of more states and disciplines, plus taking the subject area of history deeper and wider with the American Historical Association (AHA). Now more projects are taking place and the Degree Qualifications Profile (DQP) and Tuning are being more closely aligned.
Although the feedback from the projects implemented in Europe and the US was positive, the evidence of their effects has been mainly anecdotal, as has been stated above. It became obvious that more substantiated data were needed to find out whether the (Tuning) competences/learning outcomes approach enabled an identifiable change to higher education.

[10] Adelman 2009: "The Bologna Process for U.S. Eyes: Re-learning Higher Education in the Age of Convergence".
[11] McKiernan and Birtwistle 2010: "Making the Implicit Explicit: Demonstrating the Value Added of Higher Education by a Qualifications Framework".
[12] See www.tuningusa.org.


It was therefore natural for the research and development, and subsequent testing, associated with the evaluation process to be set up as a joint venture led by the Tuning teams in Europe and the United States. To this end Lumina's Director of Organizational Performance and Evaluation and the European Lumina Consultant were crucially involved in Phase 1 and later also in Phase 2. Although the initiative was limited to Europe and the USA, it was clearly understood that it should be structured in such a way as to allow - at a later stage - the whole "Tuning Family" in all of its aspects to be encompassed: the nuclear family, the extended family, the dispersed family and the disenchanted family, stretching around the globe. It is obvious that local contexts, conditions, traditions and imperatives affect the way in which the (Tuning) competence/learning outcomes based approach develops. Whether implemented in Africa, Canada, China, Russia, Central Asia, the United States, Latin America or Europe (or indeed in any of the other areas where Tuning is being used), the need for evidence-based analysis is there, requiring a robust evaluation process that can be tailored to the local, national or regional context.


4. Methodology

a. Research methodology

As explained above, the evaluation process reflected in this study is based on two pillars: quantitative and qualitative instruments. The quantitative or inner instruments are based on a set of surveys: (1) skip logic questionnaires for academic staff and institutional management, (2) a questionnaire for students, (3) skip logic questionnaires for graduates and (4) skip logic questionnaires for employers. Questionnaires 1 and 2 were developed as part of the first phase of this study and focus on the reception and implementation of the approach. They were piloted twice before going to scale as part of the second phase of the study. The questions included in the questionnaires were the result of intense cooperation between the EU and the US team. During this process sensitivities regarding educational models and use of terms came to light and required accommodation. Having started with common models, it was then decided to split these into European and US versions while keeping the same methodology and core questions about the educational process. Questionnaires 3 and 4 were mainly developed during the second phase and focus on the effectiveness of the (Tuning) competences/learning outcomes approach for career development. They both need further field-testing before going to scale. The skip logic approach was applied to the three larger questionnaires to make these as user friendly as possible. The operational questionnaires can be accessed via the Tuning websites.

Involving institutions and their staff and students in completing the questionnaires proved not to be a simple process of distribution. In January 2014 tailored action was required by the EU Steering Group to identify more institutions to be involved. A spreadsheet was set up to track contacts and responses. The following was done:
- identify (from Tuning Academy updates 2013) those HEIs and personnel still involved with Tuning and willing to participate. From this a sample on-line look was undertaken. Where there was an obvious unit/department on curriculum development and learning, teaching and assessment (including learning outcomes and competences), these were noted (work done, contact point etc.). A number of institutions (with a geographic spread) were identified to be approached first;
- approach various representative bodies, namely EUA, ESU and EMA;
- stimulate the use of an "open invitation" for any who see themselves as involved in Student Learning Outcomes (SLOs). This was done through websites such as LinkedIn, the EAIE Forum and, again, those of the representative bodies.

The second pillar was introduced in phase 2 of this study: the qualitative or outer instruments. For this part the research teams in the US and Europe were both extended with researchers. The team in Europe was made up of 5 members, covering 4 nationalities, to be able to operate in pairs. In the original set-up of the study, it was foreseen that the "outer instrument" sessions (focus groups, interviews etc.) would be conducted initially by two members, an expert and a graduate assistant, and then by the graduate assistant only, with periodic sampling and validation of the process by a Steering Committee member. In practice it proved necessary to involve two experienced researchers for each session, because of the size of the groups to interview, the complexity of the issues at stake and the note taking. For each visit a report had to be drawn up. This obviously had budgetary consequences.
The approach used in Europe was mirrored in the United States. The reports from these sessions have been the input for this final report. They are constructed around the following headings: 1. Introduction; 2. General information about the visit / Basic information; 3. Level of implementation of the LO/competences approach at institutional/programme/course unit level; 4. Kind of information/support for teachers provided by the institution to use the Learning Outcomes/competences approach; 5. Strengths, weaknesses and main challenges that occurred in teaching, learning and assessment strategies through using the Learning Outcomes/competences approach; 6. Changes and impact of the LO/competence approach on student performance; 7. Students' perspective on the LO/competence approach and its utility for them in finding a suitable job; 8. "Tuning" dissemination in the institution (projects, materials, implementation, etc.); 9. Main conclusions of the visit, including recommendations.

These qualitative instruments should provide information about the behaviour(s) and attitude(s) of key stakeholders regarding redesigning/enhancing curricula; formulating competence and learning outcomes statements and their practical use; learning opportunities and structures; assessment of students; communication of learning outcomes to students and other stakeholders, etc. This should lead to clear evidence of whether the use of the (Tuning) competence/learning outcomes approach has a (positive) effect on student and staff motivation and performance, resulting in higher success rates. Even the data collected from the first pilot provided indicators of change.

To apply the outer instrument of site visits, a list of universities was drawn up with potential experience in the application of student-centred learning and the competences/learning outcomes approach. For the site visits a clear set of documentation was prepared [13]. Scrutiny of this documentation led to further improvements of the material. During the lifespan of the second phase the EU team was able to conduct 14 site visits, spread over Higher Education Institutions from as many countries [14]. The available budget did not allow for more visits.

b. Structure

To implement the study a work organization was set up; in Europe this consisted of a Steering Group (coordinator, manager and EU-US liaison Tuning expert) and a research team consisting of the Steering Group plus a senior researcher and a postdoc. The co-ordinators from the European Union and the United States collaboratively developed:

a) a programme of meetings for both the Steering Group and the Steering Committee. The 5 members of the EU research team met twice, a first time in Groningen (September 2013) and a second time in Washington (February 2014), together with the US coordinator and the US research team.
b) a timetable for drafting the methodology in the light of Pilot 1 and 2 results and dialogue with relevant parties, plus for the "outer instruments" Pilot A through to implementation and the carrying out of the instruments. For this purpose meetings took place in July 2013 (Washington), September 2013 (Istanbul; linked to the EAIE Conference), September 2014 (Prague; linked to the EAIE Conference), February 2015 (Washington; linked to the AIEA Conference), July 2015 (Washington) and September 2015 (Glasgow; linked to the EAIE Conference). In addition to these meetings there was regular contact by phone and e-mail.
c) a dissemination plan for the results, for example publications and conference papers. Presentations were delivered at the EAIE Conferences of 2014 (Prague) and 2015 (Glasgow) and the AIEA Conference (Washington, February 2015).
d) an overall final report on the outcomes of the study, including recommendations for possible follow-up actions, and an executive summary.

[13] Documentation for site visits: Appendix A.
[14] List of countries, states and subject areas: Appendix B.


This report is the result of that final objective.


5. Terminology

The use of consistent terminology and well and broadly understood concepts is a crucial element in making reforms successful - in this case the paradigm shift from expert-driven education to student-centred education based on the use of the competences/learning outcomes based approach. The outcomes of this study show that there is (still) a lot of confusion about both the terminology and the concepts applied. The reasons for this are manifold. Terminology is to a large extent culturally and historically bound. In the framework of the Bologna Process it has been agreed to use English as the lingua franca. However, using an English term does not automatically imply that such a term has the same meaning and connotation in other countries. A good example is the term 'competences'. In the UK this term is traditionally associated with more applied forms of education, such as vocational education and training, while in the USA and continental Europe it is perceived as encompassing knowledge, skills and (personal) attributes. Differences in understanding and interpretation of terms have led to many misunderstandings, also due to the way these terms have been translated into other languages. These misunderstandings have been amplified by the definition and practical use of terminology in different European documents: two competing European Qualifications Frameworks, the ECTS Users' Guide, Tuning documents, etc.

The many websites, course catalogues and course manuals studied by the research team reflect the confusion in the use of terminology. Concepts (and terms) such as competences, learning objectives and programme and module/unit learning outcomes are, in the vast majority of documents, mixed up and used interchangeably. Misunderstanding also exists about the term student-centred education, which does not mean a cafeteria model, but flexible programmes focusing on a particular field of study that prepare students most effectively for their future role in society.

In this study the definitions were used as defined by Tuning and applied world-wide. For the sake of clarity the most relevant definitions [15] are offered here:

Competences: Represent a dynamic combination of cognitive and metacognitive skills, knowledge and understanding, interpersonal, intellectual and practical skills, and ethical values. Fostering these competences is the object of all educational programmes. Competences are developed in all course units and assessed at different stages of a programme. Some competences are subject-area related (specific to a field of study), others are generic (common to any degree course). It is normally the case that competence development proceeds in an integrated and cyclical manner throughout a programme.

Learning outcomes: Statements of what a learner is expected to know, understand and be able to demonstrate after completion of a process of learning. Learning outcomes are expressed in terms of the level of competence to be obtained by the learner. They relate to level descriptors in national and European qualifications frameworks.

Learning objectives: Clear and concise statements that describe what the teacher intends the students to learn by the end of the course. They outline the material intended to be covered or the questions related to the discipline that the class will address. This approach means in practice that the focus is on the teaching process (instead of the learning process) and on knowledge transfer from the teacher to the students.

[15] See Jenneke Lokhoff et al., A Tuning Guide to Formulating Degree Programme Profiles. Including Programme Competences and Programme Learning Outcomes. Bilbao, Groningen, The Hague, 2010.


Student-centred learning: An approach or system that supports the design of learning programmes which focus on learners' achievements, accommodate different learners' priorities and are consistent with a reasonable student workload (i.e. a workload that is feasible within the duration of the learning programme). It allows for learners' greater involvement in the choice of content, mode, pace and place of learning.

What does not seem to be sufficiently understood from the methodological point of view is the difference between 'learning outcomes' (see above for the definition used) and the 'outcomes of learning'. The latter is a very broad evaluation of the total gain made by a learner throughout their studies. This includes formal, informal and non-formal learning. This is a very relevant distinction, because the institution is manifestly responsible for the learning outcomes of its programmes; it can only be partly responsible for the total experience of learning, social interaction, maturation, etc. It became apparent during the course of the visits, in particular the interviews with the students, that there is a disconnect between the levels of communication regarding student learning outcomes and the value that students place, for obvious reasons, on their total learning experience, including other activities: group work, project work, work experience, etc. Students need to pass the hurdles to obtain their reward, but they also want a rounded total experience in order to be more employable.


6. Survey results

Surveys process and results

Process: As outlined in the Methodology (Chapter 4), the surveys were the first strand of the mixed methodology. There were two surveys, one for university staff and management ('Teaching, Learning and Assessment: Process and Impact') and one for students ('Student Learning Outcomes Survey'), both of which were piloted, the returns analysed and adjustments made, on two occasions. Careful attention had to be paid to the terminology used and the framing of the questions. The pilots quickly showed that recognition of the term Tuning was very limited (unless people had taken part in a Tuning project or, for management, had attended conferences); it was therefore necessary to build in a line of questions (basically asking the same points) that referred to 'the learning outcomes/competences approach'. The university survey had skip logic built into it, such that if respondents said they had never heard of Tuning they were taken to a series of questions that did not use the term Tuning but referred to 'the learning outcomes/competences approach', or if they designated themselves as 'management' they were taken to that series of questions and skipped those relating to classroom behaviour. The surveys, as with all parts of the research process, were used in both the European Union and the United States, with, for the latter, an exercise in 'translation' of terminology and context, for example referring to Community Colleges, faculty (not academic staff), etc. This report shows the EU data first, then the total results, but also cross-refers the European data to the European university visits.

The intention of this exercise was to gather data from as wide a range of relevant persons as possible. To facilitate this, the links to the surveys were put on to the Tuning website (http://www.unideusto.org/tuningeu/component/content/article/385-euus-researchproject.html); to try to spread the reach, this was notified to the European University Association members via a link from the news section of its website, and the European Students' Union latterly agreed to notify its members. All universities to be visited were given the links and requested to pass these on to their staff and students. As the completed surveys were posted, the returns were managed by our American partner, Lumina Foundation (where the data resides). The final surveys used had been through the rigorous piloting process described above, with adjustments made at each stage in the light of the feedback received. The surveys remain 'live' and returns are still being received. It is intended to keep the two current surveys 'live' and to continue to evaluate the data received periodically. In addition, two new surveys have been drafted and adjusted and are waiting to go 'live', for Employers and for Graduates, thus closing the loop between the universities (academic staff and management), their current students, and then former students and employers.

Findings: The first parts of the surveys were used to establish the context within which the respondent worked/studied: institution, post, how long in post, subject area, cycle of study, year of study, etc. This data is of use to the researchers because it enables a helicopter view of where the responses are coming from and thus an overview of the project's spread. Let it be said that the responses came from a wide range of countries, institutions, post-holders, cycles of study and subject areas.
The raw data is available but voluminous and is thus not appended to this report; if needed, it can be accessed via a request to the project leaders.
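The skip-logic routing described in the Process paragraph above can be pictured as a simple branching rule over question blocks. The sketch below is illustrative only: the question and block identifiers are hypothetical and are not taken from the actual survey instruments; it merely shows the kind of routing the on-line surveys applied.

# Minimal sketch of skip-logic routing, using hypothetical question/block ids.
def next_block(question_id: str, answer: str) -> str:
    """Return the id of the next question block, given the answer just recorded."""
    if question_id == "heard_of_tuning" and answer == "no":
        # Respondents who have never heard of Tuning are routed to the parallel
        # questions phrased as 'the learning outcomes/competences approach'.
        return "lo_competences_block"
    if question_id == "role" and answer == "management":
        # Management respondents skip the questions about classroom behaviour.
        return "management_block"
    return "default_block"

# Example: a respondent who has never heard of Tuning.
print(next_block("heard_of_tuning", "no"))  # -> lo_competences_block

Routing of this kind keeps the questionnaire short for each respondent, which is how the questionnaire-overload concern noted earlier was mitigated.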


The report deals with the data and the findings from the data in the following ways:

SURVEY 1: the universities and their staff - 'Teaching, Learning and Assessment: Process and Impact' - interpretations and data summary.
SURVEY 2: the students - overall data and findings.

SURVEY 1: 'Teaching, Learning and Assessment: Process and Impact' - Interpretations and Data Summary

The following shows the university academic/faculty and staff data in a variety of formats to try to give the best and most useful representation of the returns. At times the data is combined, sometimes the EU data is shown separately and sometimes both sets of data are shown separately. Not all questions are reported here; omissions include highly contextual ones. In most cases the question numbers from the surveys are shown with the data. Data from EU respondents is shown as 'EU'; otherwise the figures are 'Total' (i.e. an aggregation of the data from the EU and US surveys), so that the differences can be seen.

Respondents were from a range of colleges and universities. Within the EU these institutions were from various countries and in the US these institutions were from various states. To illustrate the context of the respondents, the following shows the responses to: What is your role at your institution? Of the EU respondents, 70% were academic staff, 20% were management and leadership and 10% were student advisors or counsellors. However, in the EU many respondents wore multiple hats, as both academic staff and management and leadership, so there is some overlap where a respondent could be counted in both the academic staff and other categories. Of the American respondents, 42% were faculty members, 46% were adjunct/contingent faculty, 2% were deans, 6% were department chairs and 4% wore a variety of other hats. 83.5% of the academics/faculty completing the survey had been in post for more than 5 years; for administrators and other staff this was the case for 54.8%.

Most respondents were informed of the expectations for their courses.

Were you informed of the expectations for your courses as they relate to the discipline and/or degree programs?
  Yes    EU: 53.85%    Total: 72.3%
  No     EU: 46.15%    Total: 27.7%

EU institutions give credit across a wide range of learning; US institutions give credit mostly for formal prior learning, but some less traditional learning modes are gaining ground.

Does your institution award students credit for any of the following? (Check all that apply, thus the numbers do not add up to 100.)
  Recognition of prior learning (informal)    EU: 29.27%    Total: 18.3%
  Recognition of prior learning (formal)      EU: 85.37%    Total: 51.4%
  Massive Open Online Course (MOOC)           EU: 14.63%    Total: 6.0%
  Experiential learning                       EU: 22.76%    Total: 12.8%
  Other                                       EU: 5.69%     Total: 4.0%

Academic/faculty members mostly use campus-based learning models, but flipped classrooms, blended learning and online education are also used; this is more prevalent in the EU.

What modes of instructional delivery do you use in your teaching? (Select all that apply, thus the numbers do not add up to 100.)
  Campus-based learning (lectures, seminars, etc.)                          EU: 93.67%    Total: 72.7%
  Flipped classroom learning (lectures/notes online, discussion in class)   EU: 60.76%    Total: 39.1%
  Massive Open Online Course (MOOC)                                         EU: 7.59%     Total: 3.5%
  Blended learning (mix of online and in class)                             EU: 50.63%    Total: 36.8%
  Online only (distance education)                                          EU: 28.48%    Total: 20.8%
  Other                                                                     EU: 2.53%     Total: 3.5%

Most academics/faculty members take student academic workload into account when planning their courses, but this is more prevalent in the EU.

Do you take into consideration student academic workload when planning your courses?
  Yes    EU: 96.18%    Total: 82.4%
  No     EU: 3.82%     Total: 17.6%

When asked how they define their curriculum, the vast majority of respondents indicated that they defined it through learning outcomes and competencies.

Is your curriculum defined by…
  learning outcomes and competencies          EU: 80.27%    Total: 78.2%
  aims and objectives                         EU: 12.93%    Total: 12.6%
  felt curriculum was defined by neither      EU: 3.4%      Total: 2.2%
  other                                       EU: 3.4%      Total: 7.1%

Of those who stated that they define their curricula on the basis of learning outcomes/competencies, most academics/faculty gathered information to help define these through discussions with colleagues at their institution, but some also frequently gathered information from discussions with colleagues at other institutions as well as from students at their institution.

How did you gather information to help define the learning outcomes and/or competencies? (Please check all that apply.)
  Discussions with current students                                                          EU: 48.72%    Total: 22.1%
  Discussions with discipline academic staff at my institution                               EU: 81.2%     Total: 41.6%
  Discussions with faculty across subject areas/disciplines at my institution                EU: 58.12%    Total: 25.1%
  Discussions with faculty in my subject area/discipline in other institutions and sectors   EU: 45.3%     Total: 27.3%
  Discussions with professional organizations and/or discipline specific associations        EU: 30.77%    Total: 19.3%
  Discussions with other stakeholders (employers, alumni, community members, etc.)           EU: 42.74%    Total: 16.0%
  Discussion has not been initiated                                                          EU: 5.98%     Total: 2.8%

Those academics/faculty acquainted with the learning outcomes/competences approach strongly agreed that they know the expected learning outcomes for each of their courses. Many also felt that the degree programme has been discussed and agreed by academics/faculty and that their discussions involve learning outcomes and competencies.

As a result of using a learning outcomes/competencies approach, to what extent do you agree with the following? (1 = Yes, 2 = Somewhat, 3 = No, 4 = I don't know)

The curriculum is designed in a collaborative way
  EU:    Yes 48.15%   Somewhat 45.37%   No 3.70%   Don't know 2.78%   EU Mean 1.50
  Total: Yes 54.1%    Somewhat 37.6%    No 5.0%    Don't know 3.2%    USA Mean 1.49

The degree program has been discussed and agreed by academic staff
  EU:    Yes 66.36%   Somewhat 29.09%   No 1.82%   Don't know 2.73%   EU Mean 1.31
  Total: Yes 65.6%    Somewhat 22.5%    No 7.3%    Don't know 4.6%    USA Mean 1.36

Faculty discussions involve student learning, degree outcomes, and competencies
  Total: Yes 62.8%    Somewhat 25.7%    No 6.0%    Don't know 5.5%    USA Mean 1.35   EU Mean 1.47

There has been a change in how we talk about curriculum in the department. It is more about degree outcomes, student learning, and competence.
  EU:    Yes 51.38%   Somewhat 36.70%   No 6.42%   Don't know 5.50%   EU Mean 1.48
  Total: Yes 47.3%    Somewhat 27.3%    No 15.5%   Don't know 10.0%   USA Mean 1.63

None of the above differences are significant. (Note: "I don't know" responses were eliminated in statistical tests.)
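The USA and EU means reported in these tables use the 1 = Yes, 2 = Somewhat, 3 = No coding, with 'I don't know' (coded 4) excluded, as the note above states. A minimal sketch of that calculation is given below; the response list is hypothetical and this is not the project's actual analysis code.

# Responses coded as in the survey: 1 = Yes, 2 = Somewhat, 3 = No, 4 = I don't know.
def scale_mean(responses):
    # "I don't know" (4) answers are dropped before averaging, as in the report's note.
    valid = [r for r in responses if r in (1, 2, 3)]
    return sum(valid) / len(valid) if valid else None

# Hypothetical example: eight respondents, one of whom answered "I don't know".
print(round(scale_mean([1, 1, 2, 1, 3, 2, 1, 4]), 2))  # -> 1.57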

High percentages of respondents acquainted with the learning outcomes/competences approach agreed that, as a result of using a learning outcomes approach, learning outcomes are more integrated in the classroom, that course learning outcomes align with degree program learning outcomes, and that the syllabus references learning outcomes. Respondents felt less strongly that the course catalogue reflects the learning outcomes for each course.

As a result of using a learning outcomes/competencies approach, to what extent do you agree with the following? (1 = Yes, 2 = Somewhat, 3 = No, 4 = I don't know)

The course catalogue reflects the learning outcomes for the degree
  EU:    Yes 56.36%   Somewhat 39.09%   No 0.9%    Don't know -        EU Mean 1.33*
  Total: Yes 23.6%    Somewhat 56.4%    No 6.0%    Don't know 14.0%    USA Mean 2.13

The course catalogue reflects the learning outcomes for each course
  EU:    Yes 62.39%   Somewhat 26.61%   No 8.26%   Don't know 2.75%    EU Mean 1.49*
  Total: Yes 47.8%    Somewhat 29.5%    No 14.7%   Don't know 8.0%     USA Mean 2.22

Learning outcomes are integrated in assessment, learning, and teaching
  EU:    Yes 47.71%   Somewhat 41.28%   No 6.42%   Don't know 4.59%    EU Mean 1.57
  Total: Yes 40.2%    Somewhat 33.8%    No 7.8%    Don't know 18.3%    USA Mean 1.63

Advising and information materials describe the learning outcomes for subject areas and degrees
  EU:    Yes 56.05%   Somewhat 42.20%   No 0.0%    Don't know 2.75%    EU Mean 1.38*
  Total: Yes 68.0%    Somewhat 26.9%    No 4.1%    Don't know 0.9%     USA Mean 1.24

My course learning outcomes are consistent with degree program learning outcomes
  EU:    Yes 74.45%   Somewhat 18.18%   No 0.91%   Don't know 5.45%    EU Mean 1.17
  Total: Yes 80.4%    Somewhat 12.3%    No 0.9%    Don't know 6.4%     USA Mean 1.34

My syllabus states learning outcomes/competencies
  EU:    Yes 79.25%   Somewhat 14.15%   No 3.77%   Don't know 2.83%    EU Mean 1.22*
  Total: Yes 87.2%    Somewhat 9.2%     No 2.3%    Don't know 1.4%     USA Mean 1.09

I discuss learning outcomes with students
  EU:    Yes 51.85%   Somewhat 39.81%   No 7.41%   Don't know 0.93%    EU Mean 1.55*
  Total: Yes 66.7%    Somewhat 26.0%    No 6.8%    Don't know 0.5%     USA Mean 1.30

*"I don't know" responses were eliminated in statistical tests.

Also as a result of using a learning outcomes approach, the majority of respondents felt that student learning is an indicator of quality, that the learning outcomes/competencies approach drives the way they structure their courses and that assessments are based on learning outcomes. Fewer participants felt that they had tailored their specialization to the needs of the degree program.

As a result of using a learning outcomes/competencies approach, to what extent do you agree with the following? (1 = Yes, 2 = Somewhat, 3 = No, 4 = I don't know)

The learning outcomes/competencies approach drives the way I structure my courses
  EU:    Yes 56.07%   Somewhat 39.25%   No 3.74%    Don't know 0.93%    EU Mean 1.45
  Total: Yes 61.3%    Somewhat 32.3%    No 5.5%     Don't know 0.9%     USA Mean 1.43

I make adjustments throughout the term in my teaching when I see the students are not achieving the learning outcomes
  EU:    Yes 34.86%   Somewhat 43.12%   No 21.10%   Don't know 0.92%    EU Mean 1.88*
  Total: Yes 52.1%    Somewhat 35.2%    No 12.3%    Don't know 0.5%     USA Mean 1.42

I have broadened my perspective of the entire curriculum by tailoring my specialization to the needs of the degree program
  EU:    Yes 34.91%   Somewhat 49.06%   No 12.26%   Don't know 3.77%    EU Mean 1.76
  Total: Yes 39.7%    Somewhat 38.8%    No 15.9%    Don't know 5.6%     USA Mean 1.74

My assessments are based on learning outcomes
  EU:    Yes 62.62%   Somewhat 31.78%   No 2.80%    Don't know 2.80%    EU Mean 1.31
  Total: Yes 71.4%    Somewhat 21.2%    No 5.1%     Don't know 2.3%     USA Mean 1.33

Student engagement has improved
  EU:    Yes 34.26%   Somewhat 34.26%   No 20.37%   Don't know 11.11%   EU Mean 1.83
  Total: Yes 38.5%    Somewhat 34.9%    No 15.6%    Don't know 11.0%    USA Mean 1.69

Student learning is a central indicator of quality
  EU:    Yes 55.66%   Somewhat 37.74%   No 4.72%    Don't know 1.89%    EU Mean 1.43
  Total: Yes 66.5%    Somewhat 26.0%    No 3.7%     Don't know 3.7%     USA Mean 1.29

There is an opportunity for an end of course open dialogue with students to discuss the extent to which learning outcomes have been achieved
  EU:    Yes 45.37%   Somewhat 38.89%   No 12.04%   Don't know 3.70%    EU Mean 1.67*
  Total: Yes 40.5%    Somewhat 28.9%    No 27.5%    Don't know 3.2%     USA Mean 2.0

*"I don't know" responses were eliminated in statistical tests.

Respondents felt that the most positive impact from applying a learning outcomes approach came from the way they assess learning, the way they present their course materials, and the way they teach.

Please select up to 5 areas where you think applying a learning outcomes/competencies approach has had the most positive impact.
  The way I assess learning                                                       EU: 40.74%    Total: 36.1%
  The way I present my course materials                                           EU: 48.15%    Total: 29.8%
  The way I state my course outcomes                                              EU: 50.93%    Total: 23.1%
  The way I teach                                                                 EU: 55.56%    Total: 26.7%
  The discussions I have with students                                            EU: 49.07%    Total: 21.4%
  On student engagement                                                           EU: 31.48%    Total: 18.3%
  The quality assurance mechanisms                                                EU: 28.79%    Total: 11.8%
  The common language that has developed in my discipline on learning outcomes    EU: 19.44%    Total: 12.0%
  The alignment of the curriculum and courses to the learning outcomes            EU: 43.52%    Total: 20.9%
  The way it has changed discussions with faculty in my discipline                EU: 24.07%    Total: 16.6%
  The quality of the program                                                      EU: 41.67%    Total: 15.9%


SURVEY 2: the EU students

86% of the respondents were from the first or second cycles (53% and 33% respectively); short cycle students, doctoral candidates and 'traditional' long or single cycle students were also represented. Respondents were also from every year of study (1 to 6) and from across the spectrum of subject areas (architecture to zoology). In terms of the responses, the following illustrate the direct questions regarding the curriculum.

Q7: Does your course unit description clearly state ... (please select all that apply)
  Learning outcomes    67.11%
  Objectives           70.25%
  Competences          57.05%
  Don't know           13.65%

In only two cases do more than 50% of the students believe 'very much' that they are getting a clear explanation of what they need to do and why they need to do it to achieve their degree. 'Somewhat' figures large in all categories but the visits show that often 'somewhat' is a kind way of saying 'no'. Once again the 'disconnect' begins to show - academics and management believe one thing and the students perceive and believe something different. Q8: To what extent do you agree with the following statements ..... Not all

at

Some-

Very much

what When I was advised on course unit selection

Don't know

10.29%

53.91%

26.17%

9.62%

My discipline/degree programme has a clear statement of expectations

4.70%

39.15%

52.57%

3.58%

I understand why I am required to take the course units needed to earn my degree

6.05%

34.98%

56.05%

2.91%

My workload is appropriate to achieve the learning outcomes of the course unit

10.07%

37.58%

49.77%

3.58%

Advisors are able to provide a clear explanation of how course units fit into a bigger picture

14.09%

46.76%

33.11%

6.04%

The course catalogue states the learning outcomes for each unit

10.36%

36.71%

46.17%

6.76%

The course catalogue states the learning

9.23%

38.29%

44.37%

8.11%

There Was a focus on the competences I would gain

23

A Study on the implementation of the learning-outcomes based approach in the EU and the USA

outcomes for my degree Progression routes to a degree are clearly stated and explained

13.48%

35.96%

43.37%

7.19%

The level of discussion of learning outcomes in class and at the end of the course is disappointing. The connection between the learning outcomes and the assignments is slightly stronger, but even so disappointing (once again the meaning of 'somewhat' is a problem). In Q20, 51% of academic staff state they discuss learning outcomes with students 'very much' and 39% 'somewhat', compared to the 23.49% and 51.01% respectively felt to be the case by the students. In Q21, 45.37% of academic staff state that there is 'very much' an opportunity for an open discussion with students at the end of the course, whereas only 24.44% of the students feel this is the case. The 'disconnect' is writ large.

Q9: To what extent do you agree with the following statements ...
  Learning outcomes are discussed in class
    Not at all 24.38%   Somewhat 51.01%   Very much 23.49%   Don't know 1.12%
  Objectives are discussed in class
    Not at all 15.02%   Somewhat 47.53%   Very much 34.53%   Don't know 2.91%
  My class assignments are based on learning outcomes
    Not at all 12.81%   Somewhat 40.00%   Very much 41.80%   Don't know 5.39%
  There is an opportunity for an end of course open dialogue to discuss the extent to which learning outcomes have been achieved
    Not at all 30.04%   Somewhat 39.69%   Very much 24.44%   Don't know 5.83%


7. Visits process and results

Process:

As outlined in Methodology (Chapter 4), the institutional visits were the second strand of the mixed methodology. These, as with all parts of the research process, took place in both the United States and the European Union; this report focuses on the European Union results but cross-refers to the United States. This exercise in the gathering of qualitative data was to be analysed both for each visit and then across the visits to give a cumulative set of findings. The visit data was then compared with the survey data. The original work plan envisaged visits across a sample of institutions and countries/states, investigating the level of penetration of the learning outcomes methodology and student-centred learning in the institutions. As stated before, each visit was undertaken by two members of the research team, with the actual personnel involved depending on availability and, at times where it was felt necessary, on the need for linguistic backup.

It is worth repeating that setting up the visits proved to be very difficult. Some institutions stated that they felt they were not ready for such "scrutiny" (their term); we kept stressing at every stage of communication with all those approached that these were research visits and not, in any way, shape or form, linked to validation or to providing feedback to any outsiders or agencies – they were learning opportunities because of the feedback. Others prevaricated to the point of time running out (giving a feeling of not wanting to take part), and some made every effort to accommodate the visit and to lay themselves open to analysis in the true spirit of the visits and the research objectives.16 In the end 14 visits across the EU took place, from north to south and east to west, from research-intensive universities to those with a teaching-only mission, encompassing a wide breadth of missions and sizes. Where we missed out was in not being able to visit a private for-profit institution, but this was not for lack of asking.

In the set-up phase each institution approached was sent the same information and a suggested format for a single full-day visit. The categories of persons the team hoped to see were stated, but who the team did see was of course up to the institution, as were which subject areas were seen. This led to a wide range of subject staff and students being seen, but also to some repetition of subject areas. This did not matter, because the original assessment had been that, apart from subjects directly involved in Tuning, Thematic Network Programmes (TNPs) or ECTS projects at a particular institution, the methodology was unlikely to have been influenced other than by national policies (the national qualifications framework, quality assurance mechanisms, diploma supplement, continuing professional development requirements, etc.).

Once the visit date had been agreed (and researchers allocated from a calendar of availability), an internet search of the institution took place. This looked at references to the national qualifications framework, the diploma supplement (examples and availability), quality assurance mechanisms (internal and external), in-house staff development availability, the degree profile, the curriculum, unit learning outcomes, any sample assessments, etc. This formed Part 2 of the institutional report and informed the researchers of the public face of the institution.

16 The aim of the EU-US Study on the implementation of the Learning Outcomes/Competences approach is to determine the extent to which universities have adopted it. The methodology uses a variety of instruments to find evidence (mixed methodology: online questionnaire plus in-depth interviews).


At the end of each visit the researchers gave informal feedback to the institution; to whom this was given varied by institution, as it was for them to decide. The next step was that a draft report was sent for correction of factual elements. Following any required amendments of fact, the final report was sent. It is important to note that anonymity was promised: no institution or individual would be identified or identifiable. Each institution will receive a copy of this final report after it has been accepted. As the research developed, dissemination started to take place through the medium of conference presentations (see Chapter 4).

Findings:

There are certain recurring themes from the visits, and these do show, to varying extents but nonetheless consistently, across the continents. The main headlines are:

• Varied institutions display varied behaviour: Higher education activity still largely falls into three categories: teaching, research and administration. The nuances of each of these have changed over the years and continue to change. Institutions have proliferated and with that (and the change in most places to mass participation systems, even where there is still selection based on prior educational achievement) the variety of missions has changed, as has the mix of the elements. However, there are students in universities and they are there to learn. The mission of the university will impact on the learning process, as will funding patterns, the political will of the state, the background of the student population, etc. Some institutions visited were highly micro-managed; this impacted upon the curriculum, staff development, the mix of workload for staff, student-staff ratios, the assessment calendar, appraisal systems, internal quality assurance, etc. Across the spectrum, then, there were: central macro-management, devolved management, and self-management within institutional parameters. All styles, leading to varied operating environments.

What is clear is that there is a disconnect between what different tiers of responsibility believe or imagine the higher education landscape to be and what those who actually participate in the learning process experience. This shows to some extent in each and every institution visited. When one looks at 2015 statements regarding the Bologna Process, at the higher policy levels it is writ large, with the font size diminishing slightly down the levels until, in some places it has to be said, there is a total lack of actual experience by the students of any active knowledge of, and participation in, the learning outcomes process. From the policy level, examples of the perception of success in implementing a student-centred learning approach are:

Implementation Report 2015: "lack of recognition of the value of student evaluation, independent learning and the use of learning outcomes"

Trends 2015: "not all these positive developments are common everywhere and, therefore, more progress is needed"

Bologna With Student Eyes 2015: "there has clearly been some progress ……… 50% of respondents think that progress is slow…..the other half….are still not convinced that student-centred learning has been made a priority in higher education…."

This does recognize the lack of progress but, as the research results show, not to the extent of the actual lack of progress.

• Insufficient learning alignment: By learning alignment is meant the continuum of the learning environment from learning outcomes (LO) to the learning activities (LA) to the all-essential learning assessment (LA). None of these segments is free-standing, and none can make a meaningful contribution to the learning process without the other two. Learning outcomes are not a passive, ossified artefact but must be active (and thus subject to re-evaluation and change after an appropriate feedback loop). The learning activities must reflect the learning outcomes and are now required (European Standards and Guidelines 1.3, 2015) to "encourage students to take an active role in creating the learning process", leading on to learning assessment that "reflects the approach" (that is, reflects the student involvement).

Once again there was a disconnect here; it varied in magnitude, as the institutions varied. Although a few institutions were making very positive (in some cases strident) requirements of their staff to engage in all aspects of learning alignment, there remained a lack of report back from students that they could see the connection and that there had been continued efforts both to engage them in the process and to communicate with them continually. So, even where efforts were clear and demonstrable there was still a lack of meaningful penetration. Imagine how disappointing it was where there was no management drive or institutional buy-in to ensuring that the learning outcomes approach and learning alignment were embedded in the warp and weft of the learning experience. It was totally obvious and showed no signs of there being a "learning spring" around the corner. In some sessions the lack of engagement with the learning outcomes approach by staff involved in pedagogics was clear ("what do we want to know about learning outcomes for?") and thus shocking.

Where staff development was taking place (addressing the why and the how rather than, as staff put it, the process of form filling to document something without engaging with it), this made a difference. Where there was active engagement in mentoring/coaching, that made a difference. Where there had been involvement in projects such as Tuning or, in the past, ECTS, that made a difference. Where there was institutional indifference or lip service, that made a difference – a negative one.

• Vocabulary, semiotics, messaging and communication: Any systematic search through university websites reveals much. Of course there are claimed problems with updating, editing and proof-reading. However, the evidence on the websites (gathered prior to a visit) is then confirmed by the visits: there is a lack of consistency in the use of terminology and vocabulary, and documents, web pages, course handbooks, study manuals and discussions confirm this. The question asked, of course, is "does it matter?". The answer is "Yes, it does matter". Why? Confusion abounds when terms are used inconsistently, interchangeably and incorrectly. There is no single definition of terms such as 'competences', 'learning outcomes', 'learning alignment' or 'student-centred learning', but there are recognized definitions used consistently in policy documents and working documents such as the "ECTS Users' Guide 2015". Adherence to these more commonly used and available definitions, with the phrase "for the purpose of this document we use the following definitions", would start to eliminate wider confusion and would certainly limit internal institutional confusion.

At meetings on the visits staff commonly used "competence" and "learning outcome" as interchangeable terms. Slipping back into the language of the former paradigm (expert-driven delivery), for example "learning goals", rather than the language of the new paradigm, for example "learning outcome", is more than a slip of the tongue. The semiotics of this is one of confusion, lack of clarity, lack of determination to join the paradigm shift and therefore lack of consistency. This confusion is commonplace. The lack of consistent messaging and communication does lead the stakeholders (across the spectrum) to lack belief that a paradigm shift is underway, let alone that it has been achieved. This also leads to the question (see above) of how there can be learning alignment when there is a lack of clarity as to what it is that is being aligned. These are more than issues of editing and proof-reading; they are issues of true buy-in to the paradigm shift.

• Staff development: As was said in the section above on learning alignment: "Where staff development was taking place (addressing the why and the how rather than, as staff put it, the process of form filling to document something without engaging with it), this made a difference. Where there was active engagement in mentoring/coaching, that made a difference. Where there had been involvement in projects such as Tuning or, in the past, ECTS, that made a difference. Where there was institutional indifference or lip service, that made a difference – a negative one."

Staff – at least those who want to engage with and master the learning outcomes approach, and many interviewed were of that mind – felt stranded both by a lack of training and by the pull towards research and away from teaching as a career enhancement. It was often mentioned that at the outset of the introduction of their national qualifications frameworks and learning outcomes there had been some training. From what was said, such development was then either viewed as a done deal, or any attempt to deal with concepts, benefits, etc. was abandoned and replaced by process training. This was anathema to the staff. They want concepts, benefits, links, etc., not form filling to comply with internal QA and audit requirements. Where new projects were launched (for example joint degrees, centres of excellence in teaching, etc.) there did tend to be a reinvigoration of training; often what was much liked was in-house mentoring/coaching and peer-to-peer evaluation of documents. These ventures were both cost-effective and engendered a collegial spirit.

A main challenge for higher education institutions is the lack of a well-established unit for staff development. Some examples of excellent staff development provision were found, either at university or at faculty/school level; some provision was also at country level. In general it has to be noted, however, that there is low priority for establishing and sustaining such centres. In many institutions there was a lack of informed trainers. As mentioned above, staff will not accept sub-standard, process-driven 'training'. They want to understand the concept and benefits of the new paradigm. Without this it is feared that the shift will not take place. Use should be made of examples of good practice, which, for some of the countries visited, will be outside those countries and will therefore require an international endeavour.

• Student reaction: All meetings with students were interesting and stimulating, and regrettably confirmed beyond reasonable doubt the disconnect that exists between even the most pessimistic of the 2015 reports (BWSE, linked to the ESU country coordinator reports) and the reality shown on the ground by the responses in the student interviews. The disconnect was confirmed by the consistent themes that the students disclosed, namely: a lack of (perceived) communication; a lack of understanding of the gains to be had from having a good understanding of their studies and of what they would know, understand and be able to do on completing units of learning; and learning behaviour immersed in the former paradigm – what are we told, what information do we have, what are the past assessments, how can we best get through this subject. Thus, in terms of the learning outcomes approach, at first cycle there was in the vast majority of cases only evidence of a lack of penetration and understanding, while at second cycle there was some evidence of impact, particularly amongst mature students. In terms of student-centred learning, the European Standards and Guidelines 1.3 (2015) is of course too recent to have impacted on process but, notwithstanding this, at first cycle level there was very limited evidence of it, while at second cycle there were some green shoots of development.

Students were not convinced that there was any link between what was demanded of them and any description or analysis of what outcomes they would achieve by the end of their learning. Some knew that they had been told by some staff of the learning outcomes at the start of their studies, but few felt there was consistent communication and messaging about this. Those who did placements (work-based learning, internships, stages, etc.) did not make any link between learning outcomes and the skills/competences that they could offer an employer. Even where they had been provided with CV-writing guidance this link had not been made, nor had the simple benefits they would gain by using such language and demonstrating the competences gained from their studies been pointed out.

In terms of their studies there was little perceived link between workload and the credits allocated to a unit of study. Some students did know what the norm should be (28 hours per credit being oft quoted) but few felt this was in any sense realistic (a short worked illustration of this norm follows the list of findings below). Most felt that the workload demanded of them was less than that quoted, but there was a general feeling that the smaller the credit allocation, the heavier the workload required to achieve the learning outcomes (in their terminology, "to pass").

All institutions operated a post-learning review in one form or another; this varied from the very tightly prescribed in terms of scheduling, analysis of responses and feedback, to rather haphazard process and follow-up, with all shades of process in between. All students felt that if their views were sought (which they were) then there should be some clear line of follow-up: analysis of returns, discussion of the data, an action plan, action, and communication of what had happened and why. Once again the extent to which this line of action was in place varied greatly: at one end of the spectrum staff were replaced if the feedback and data were very negative, at the other no action appeared to be taken or follow-up communication made.


• Impact of the National Qualifications Framework and ECTS: In particular, management and senior staff with management experience or duties acknowledged the impact that the introduction of their national framework had made. The link to ECTS in terms of programme structure and profile was also acknowledged. However, those engaged in the teaching did not often see this; of course, if the university regulations required a certain format then that was enough (and often this was the case). The frameworks had been, without exception, a catalyst for change in terms of levels and outcomes (the Dublin Descriptors were often cited as a significant agent of change) and, of course, in creating a fundamental and often fraught change to a three-cycle system, with the consequences of this still reverberating around some country systems.

• Impact of Tuning: Senior management at all institutions were aware of Tuning, some simply because of having received the documents for the initial approach and others because of involvement over the years with projects or having attended conferences. Staff who had already undertaken the on-line survey had some awareness of Tuning or, as with management, had been involved in projects; others, however, were not aware of the process. Students were unaware of the process, as they were largely unaware of the learning outcomes approach. There was little brand awareness of Tuning, but where there was awareness, and where there had been participation in projects, there was great brand loyalty – much more so to Tuning than to any passing knowledge of the learning outcomes approach.

• Disconnect: This term has become the by-word for the overall findings of the research. By it is meant the inability to have, throughout the tiers of a higher education institution (and indeed beyond that, throughout the European Higher Education Area), a consistent awareness of, let alone 'buy-in' and adherence to, the learning outcomes approach. Given that this is a core element of ECTS, of the frameworks and of the European Standards and Guidelines, this has to be both disappointing and indeed a shock and a wake-up call.
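As a worked illustration of the credit-workload norm referred to in the student meetings (an arithmetic sketch only, using the oft-quoted figure of 28 hours per credit and the standard 60-credit full-time year; the ECTS Users' Guide 2015 allows a range of roughly 25-30 hours per credit):

$$
5~\text{credits} \times 28~\tfrac{\text{hours}}{\text{credit}} = 140~\text{hours for a typical unit},
\qquad
60~\text{credits} \times 28~\tfrac{\text{hours}}{\text{credit}} = 1680~\text{hours for a full-time year}.
$$

The students' reported experience, that smaller units demand disproportionately more work per credit, is precisely a departure from this linear relationship.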


8. Comparison of EU and US outcomes

The methodology used in both the EU and the US was the same (Chapter 4), using the online surveys and the visits (list of countries, states and subject areas).17 There were some responses to the surveys, and some findings from the visits, that clearly showed divergent development patterns, and others that showed a remarkable congruence. In terms of the profiles of the institutions and the staff within them the overall picture was one of symmetry: the range of institutional missions (from research-intensive to teaching-intensive), the offering of awards/qualifications/credentials, etc. However, a methodological difference was the access of participants to the surveys and the institutions participating in the visits. For the former, the US offering was 'closed' whereas the EU's was 'open' (see Chapter 4). For the latter, the institutions visited in the US were only those who had participated in, or were participating in, Tuning projects of one type or another (subject area or state), whereas in the EU the approaches had been made to a wide range of institutions (not 'just friends').

When asked whether they had been informed about the expectations for their courses, US faculty gave a much higher 'Yes' percentage than academic staff based in the EU. However, when asked about recognition of prior learning, massive open online courses or experiential learning, the positive response rate from the EU was much higher than that from the US. Yet when it came to words associated with the learning outcomes process/Tuning there was great similarity. Some of the responses might indicate that when a project or change is underway it is at the forefront of thinking and discussions, but as the launch recedes into middle history so does the thinking: the learning outcomes approach and its effect recede from the academics' minds. Thus, when asked whether academic/faculty discussions involve student learning, degree outcomes and competences, the EU answers were below those from the US (attrition had taken place), but when asked whether a change had taken place in the way in which the curriculum, degree outcomes, student learning and competences are discussed in the department, the EU came out on top. In a similar vein, but with a different emphasis again, the results from the US were higher than those from the EU in terms of the learning outcomes approach acting as a driver for the way in which courses are structured, basing assessments on learning outcomes, and tailoring a specialism to fit the needs of the degree programme as a whole.

The similarities and differences were equally visible in the reports of the institutional visits. The understanding of the terms, and indeed the vocabulary used, is inconsistent (in both areas). Too often there is a return to 'old language' and talk of learning objectives; in the unit descriptions there is also a lack of attention to the use of verbs, context and level, and therefore academic and cognitive progression ('ratcheting up') does not happen as it should. There is a lack of learning alignment, with assessment too often being the forgotten element (in both areas). There is little staff development, although the need for it is recognised in both areas; indeed, the common statement was that what is needed is knowledge and understanding of the concepts, the benefits and the manner in which to do it well, not just the process of how to account for it, lay an audit trail and document what has been done.

There are always pockets of good practice within an institution but rarely – in fact never, regrettably – is this seen across an entire institution. One report from the US stated that where 2-year and 4-year colleges align their learning outcomes "transfer is easier and there is a shorter time to degree". However, the estimation of "time to degree" was seen as 7 years to get "both the Associate Degree and the Bachelors". A comment must be made: given (a) the widespread belief in Europe that transfer is easy in the US and (b) the misconception that a 4-year degree takes 4 years to obtain (when of course the federal government estimates 6 years to be the time most students take to acquire a '4-year degree'), this may send some shock waves through the "US watchers". One visit report from the US also backs up the earlier reference to the difference between 'learning outcomes' and the 'outcomes of learning'. Students had said that they want to know the connection between what they are learning, why they are learning it, and how this fits into the broader picture of "the outside world".

All visits showed that there is a belief that learning outcomes are widespread, but so too is the disconnect between what policy makers, strategy makers, planners and authorities believe is taking place and what those at the learning/delivery end of the supply chain perceive as happening. One suspects that if the visit sample had, in both areas, been greater and even more diverse, this would have shown to an even greater extent. The EU and US do have tremendous higher education systems, and they also have large resources (even now), but in both areas there is still a preponderance of academics/faculty who (for whatever reasons, be it seeking promotion/tenure track, pure love of research, etc.) place research above the assessment, learning and teaching of their students. Greater investment in staff development in this area will pay dividends; the visits showed that where good staff development took place the dividends were reaped. This is a long game, but for the successful development of the greatest asset the two areas have – their people and the brainpower that resides within them (there is no knowledge economy without extensive development of all the assets available) – careful, targeted investment will bring about positive results.

17 List of countries, states and subject areas (see Appendix B).


9. Examples of good practice

On the basis of the visits the team has been able to identify a number of good practices that are relevant to the whole sector. Each institution had examples of good practice, but not one was exemplary. Nevertheless, from these instances it proved possible to aggregate cognate areas and thus produce the following list:

a) A well-defined university policy on learning, teaching and assessment, in accordance with the mission of the institution. This policy must, however, be put into action right through the institution. Having the policy is not sufficient; the institution has to be sure that there is wide acceptance of, and indeed 'buy-in' to, the policy and the action resulting from it. Good communication is essential to ensure that all stakeholders are involved, aware and committed to the actions. It can be noted that where a clear policy has been defined and followed through, a shift of paradigm is underway; however, even in these institutions this remains patchy at implementation level. This means that constant attention to policy implementation is required for continuing development and success.

b) Some universities are working with fixed templates for describing the curriculum as well as its modules and units. These require statements of the profile of the programme and its learning outcomes, as well as the learning outcomes for individual units, plus the learning and teaching methods and the forms of assessment (a schematic sketch of such a template follows this list). It is crucial that these are shared with potential as well as actual students. In the set-up phase it is essential that staff view these as something more than just a 'tick-box administrative task', namely as an integral part of curriculum development.

c) Staff development is an essential component for enhancing study programmes and their delivery so that they meet the needs of all stakeholders (internal to the university, including its students, as well as external, for example employers and professional organisations). Staff development can take many different forms. What seemed to work best was a central policy underpinned by central funding: the actual staff (who took part in training, advising, mentoring and supporting) were based in a central unit but with well-organised and defined links to individual departments, faculties, etc. These staff should, of course, be well versed in the paradigm shift taking place and able to communicate it, while fully understanding the university policies and their place within the wider world. They often acted as ambassadors for the university in national and regional bodies and activities. Decentralised models do exist, and where there was alignment with university policies, excellent internal communication and some central co-ordination, they too worked effectively. Activities that these models might deliver include: international staff mobility, courses, workshops, peer mentoring, continual professional development, learning gatherings (often 'learning lunches'), team building, and allotting credits to activities to enable staff to accumulate credit towards a qualification.

d) For activities such as curriculum development, the building of teams (including staff, students, a central staff development representative, employers, professional body representatives, etc.) to take responsibility for defining, organising, implementing and delivering the learning in all of its aspects. This ensures collegial 'buy-in'.
e) Structured links to employment and the world of work, including: alumni tracking, visiting lecturers, CV coaching, staff communication on learning outcomes, competences and professional standards, relations with employers, internships/placements, and entrepreneurship. All of these assist students to understand their place within their studies and how best to present themselves when applying for internships/placements, jobs or further studies.

f) National initiatives – these can provide impetus and re-launch the conversation about the paradigm shift. New initiatives are needed on a regular basis, because otherwise other new ideas push the 'older' ones down the memory and down institutional/personal priorities. Such initiatives have included: centres of excellence, 'lecturer of the year', 'best university', etc.
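To make concrete what is meant in item b) above, the sketch below illustrates the kind of fields such a fixed template typically captures. It is purely illustrative (the field names and the Python form are assumptions made for this edition, not drawn from any institution visited); the point is that programme-level and unit-level learning outcomes, teaching methods and assessment forms sit in one shared structure that can be checked for completeness rather than treated as a tick-box task.

```python
# Illustrative sketch of a fixed programme/unit description template.
# Field names are hypothetical; they mirror the elements named in item b).
from dataclasses import dataclass, field
from typing import List


@dataclass
class UnitDescription:
    title: str
    ects_credits: int                # signals expected workload (roughly 25-30 hours per credit)
    learning_outcomes: List[str]     # what the learner will know, understand and be able to do
    teaching_methods: List[str]      # e.g. lectures, seminars, placements
    assessment_forms: List[str]      # each form should test one or more of the outcomes


@dataclass
class ProgrammeDescription:
    profile: str                     # orientation and mission of the degree programme
    programme_learning_outcomes: List[str]
    units: List[UnitDescription] = field(default_factory=list)

    def incomplete_units(self) -> List[str]:
        """Return titles of units whose description omits outcomes or assessment forms."""
        return [u.title for u in self.units
                if not u.learning_outcomes or not u.assessment_forms]
```

A template of this kind is only useful if it is shared with students and treated as part of curriculum development; the simple completeness check at the end hints at how learning alignment (outcomes, activities, assessment) can be monitored rather than merely asserted.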


10. Conclusions and Next steps

'A long way to go …' reflects the findings of this study, in terms of the findings of both research instruments, the surveys and the site visits. It is fair to conclude that the discourse about the shift of paradigm is taking place, particularly amongst management and staff, but less so amongst students. There is a long way to go, and there is no certainty that the shift will be achieved; indeed, it seems finely balanced and could, without additional and continued support, fail. Making it work is the responsibility of all levels involved and cannot simply be left to the academic staff responsible for delivering the programmes.

The evidence clearly shows the disconnect between the rhetoric, the political ambitions and reality. This has already been reported in the 2015 analyses of progress (Trends, Implementation Report, BWSE) and is confirmed by this study; in fact the actual level of penetration is lower than that stated in those documents. The main cause of this has been insufficient communication between the political players and university hierarchies on the one hand and the academic staff on the other; in fact this was highlighted in the Yerevan Communiqué. There has been a failure to engage with and convince academic staff of the necessity and advantages of this paradigm shift. Many initiatives have been taken in terms of national and international cooperation, but these have not received the endorsement and support required from the political policy makers. Seed-corn funding has proven to be of help in the launch of relevant activities, but a long-term commitment is the only way to achieve changes of this magnitude across such a broad spectrum of higher education systems.

It has been underestimated by all involved in the process how crucial a commitment to staff training and development is. It must be remembered that most staff in higher education have had no pedagogic/andragogic education and training; most staff are indeed 'driving without a licence', basing their own teaching on their own experiences as a student. The world has changed, but not the training for life as a university academic involved in facilitating learning and then assessing the achievement of the learning outcomes. What came as a shock was that many 'trainers/professionals' interviewed were actually themselves still operating in, and indeed wedded to, the old paradigm of expert-driven delivery. Many institutions proved not to have any form of well-working staff development unit with a focus on the new paradigm and all that it entails, including the many benefits to both staff and students. If this is not remedied the future looks bleak. However, any such units must be positive, well informed, truly engaged and truly serve the needs of the staff and their students in line with institutional policies. They must not be perceived as a 'side show'. Recognition of such a unit's value and its ability to enhance and add value to the learning is vital. Success without these factors is unlikely.

Full engagement by all actors is a sine qua non for success. Without engaging students and employers in programme design, implementation, delivery and quality assurance there will not be the required level of progress. Good initiatives in this respect exist, but they form a patchwork rather than being all-pervasive. Given the financial situation, students show, for obvious reasons, concern about their future role in society. What they observe is a flexible labour market in which they are expected to demonstrate a sufficiently wide range of general competences and, where possible, some work experience. They know they need subject-specific knowledge and skills, but they also desire the wider outcomes of learning. In today's ever-changing job market and challenged society it is of crucial importance to involve employers and societal leaders in the educational process, if possible in a structured way. They should be seen as advisers in this process, not decision makers in what should be taught and learnt; that is a collective responsibility, but one that must have the academic staff at its core. Nevertheless their involvement as guest lecturers and placement/internship providers adds great value. Many institutions have already recognised this and taken appropriate steps in that direction.


To achieve these enhancements, follow-up steps are required. The programme of visits was able to engage the institutions once more with the required paradigm shift and to re-launch the dialogue, as well as allowing the researchers to analyse the state of play and the needed enhancements. The most important of these are:

• A stronger commitment at national level to achieving the paradigm shift, which is, in any case, in the national interest in terms of economic prosperity and a sustainable society.

• European, as well as national, support to create better conditions for success. This can be both organisational and financial. It also implies a well-defined strategy for communicating the benefits of the paradigm shift at national, institutional and personal level, which might require tailored taskforces to operate at all those levels.

• Renewed institutional commitment and stronger leadership to achieve the paradigm shift, including adopting those good practices that already exist. This requires serious investment in targeted staff development and effective structures for curriculum development and learning, backed by an effective quality culture.

• A systematic approach to analysing the reality of what is happening in practice. This could make use of the robust instruments developed, tested and used in the framework of this study. Site visits by an international team have proven to be of great value, both in the analysis that takes place and in the heightened awareness created; they seem to be the best way to obtain a reliable picture of what is happening and allow for relevant and useful constructive feedback.


Appendix A - Visit Documents

Invitation letter

Dear …….

As you are aware, there is an on-going global debate in higher education about shifting from an expert-driven model to a learner-driven model based on Learning Outcomes/Competences. Higher education institutions are at various stages of introducing this model, which matches the European Qualifications Framework for LLL as well as National Qualifications Frameworks. To develop a better understanding of the level of implementation, an EU-US research project has been set up, co-financed by the European Commission and Lumina Foundation and supported by the US national authorities. This research project is jointly undertaken by the International Tuning Academy Europe and the Lumina Foundation. It aims to determine the state of implementation of the use of the Learning Outcomes/Competences model in both Europe and the US.

The project contains two main elements. The first is a consultation of the main stakeholders: university management at various levels, academic staff, student advisors and students. The second is a selective sample of in-depth interviews with the stakeholders mentioned. We invite your institution to participate in the second part of this important project. More detailed information about the procedures and implementation is attached.

We thank you for your cooperation and look forward to working with you.

Yours sincerely,

Robert Wagenaar
Director, International Tuning Academy Groningen


Explanatory note

The aim of the EU-US Study on the implementation of the Learning Outcomes/Competences approach is to determine the extent to which universities have adopted it. A mixed methods approach will be used to find evidence. There are on-line questionnaires addressing different stakeholder groups: university management at various levels, academic staff, student advisors and students. These questionnaires will go to a wide variety of institutions across Europe and the United States. In addition to the questionnaires there is a selective sample of in-depth interviews with the stakeholders mentioned, in which around 15 higher education institutions on each continent are invited to participate.

Procedure

Each selected institution will be contacted by e-mail and/or phone. The proposed procedure is as follows:

• A one-day visit to the institution by one or two researchers.
• The aim of the visit will be the following:
  - Interviews with the university management at various levels, including policy officers if appropriate (to be agreed with the researcher prior to the visit);
  - Small focus groups with academic staff, including student advisors if appropriate (from two disciplines/subject areas);
  - A small focus group with students (from at least the same areas as the academic staff);
  - Informal feedback will be offered following the visit.
• A written report outlining the findings of the interviews and focus groups will be provided at a later date. This report is confidential.
• A synthesis report for the project will be published and the participating institutions will receive a copy. No institution will be identified in the report; anonymity is guaranteed.

Further information

If you wish to access further information, the following URLs will be of use:
Lumina Foundation: http://www.luminafoundation.org
Tuning USA: http://www.tuningusa.org
Tuning Europe: www.unideusto.org/tuningeu

Contact points

Robert Wagenaar: [email protected]
Tim Birtwistle: [email protected]


DRAFT FOR THE VISIT TO THE UNIVERSITY OF XXX

General aim of the study: The aim of the EU-US Study on the implementation of the Learning Outcomes/Competences approach is to determine the extent to which universities have adopted it. The methodology uses a variety of instruments to find evidence (mixed methodology: online questionnaire plus in-depth interviews). In this visit, the second part of the study (the qualitative one, with interviews and focus groups) will be conducted.

Proposed schedule of meetings:

Date
9.00 – 10.00   Interview(s) with the university management (selected people): Rector or Vice-Rector or policy officer (to be followed later in the day, or the following morning, by preliminary informal feedback and agreement about the written report outlining the findings)
10.30 – 12.00  Small focus group with academic staff from xxx, including student advisors if appropriate; the optimal number of people is 5/6
12.00 – 13.30  Small focus group with academic staff from yyy, including student advisors if appropriate; the optimal number of people is 5/6
13.30 – 15.00  Break
15.00 – 17.00  Small focus group with students; the optimal number of people is 10/12, approximately 5/6 from xxx and 5/6 from yyy, all combined in the same session
17.00 – 18.00  Further interview(s) with other staff members who may be interested and available
18.30 – 19.30  Informal feedback to the internal organizer and to management staff available (which can be postponed to the following morning)

A written report outlining the findings of the interviews and focus groups will be provided at a later date. This report is confidential. A synthesis report for the project will be published and the participating institutions will receive a copy. No institution will be identified in the report; anonymity is guaranteed.


Note: Please note that this is a draft schedule and it can be changed and tailored within the whole day, depending on the availability of academic staff, teachers and students.

MAIN TOPICS OF THE INTERVIEWS AND FOCUS GROUPS

MANAGEMENT:
• Level of information and implementation regarding the LOs/competences approach
• Support for teachers in using LOs/competences
• Previous knowledge or experience of Tuning (if applicable)
• Strengths and weaknesses that the academic community faced in using the LOs/competences approach
• Influence of LOs/competences on student performance and their future entry into the workplace
• Evaluation and quality assurance of course design and delivery

ACADEMIC STAFF:
• Level of information and implementation regarding the LOs/competences approach
• Support for teachers in using LOs/competences
• Previous knowledge or experience of Tuning (if applicable)
• Influence of LOs/competences on teaching, learning and assessment strategies as well as on student performance and their future entry into the workplace
• Workload of learning activities and LOs/competences
• Main challenges in using this approach and the opportunity to share them with colleagues
• Evaluation and quality assurance of course design and delivery

STUDENTS:
• Level of information about the LOs/competences approach in the attended courses
• Workload of learning activities and assessment formats
• Opportunity to reflect on learning and to find a suitable job through the use of the LOs/competences approach
• Procedures for student evaluation of teaching
• Previous knowledge or experience of the term "Tuning" (if applicable)


LISTS OF QUESTIONS TO FOCUS GROUPS

FOCUS GROUP FOR ACADEMICS
1. How long have you been teaching? How long have you been teaching at the university?
2. What level(s) of courses do you teach?
3. To what extent are you informed about the LOs/competences approach?
4. How was this approach introduced to the faculty/academic staff (for example, briefings, through memos, verbally, etc.)?
5. Were you provided with some kind of support (seminars, materials, peer support, memos, mentoring, etc.) or feedback?
6. Have the learning outcomes/competences of your curriculum/a and its individual units been formulated by a team of academics?
7. How do you use the LOs/competences approach in the course units you teach?
8. Have you heard of the term "Tuning"? If "Yes", what does it mean to you?
9. How has the use of learning outcomes/competences changed your teaching, learning and assessment strategies? Could you provide us with examples?
10. Do you believe that the LO/competence approach improves student performance? Why? Do you believe that students are better prepared for finding a suitable job?
11. In your opinion, which are the main challenges when using this approach in your classes? Is there any formal mechanism to share these challenges with colleagues?
12. Are the learning outcomes/competences of your course presented and explained in class?
13. Does your university/faculty apply a system of evaluation of your course design and delivery?

FOCUS GROUP FOR STUDENTS
1. In which degree programme(s) are you enrolled? Which year?
2. Have you been informed about the LOs/competences approach, and to what extent?
3. Do you know, for each of your courses, the expected learning outcomes/competences? Did the lecturers share this information with you at the beginning of and during the course?
4. Do your assessments test the learning outcomes?
5. Do you believe that the workload of each learning activity is appropriate (this may tie in with credits)?
6. Do you believe that the LO/competence approach has given you an understanding of what you are learning and why? Do you believe that this approach helps you to find a suitable job?
7. Do you evaluate your course units using standardised procedures (teaching, learning, assessment, use of learning outcomes)?
8. Have you heard of the term "Tuning"? If "Yes", what does it mean to you?


INTERVIEW FOR MANAGEMENT
1. How long have you been in your present position and in the university?
2. To what extent has the institution been introduced to the LOs/competences approach? What is the level of implementation of this approach?
3. Besides the national regulations, does the institution have an internal procedure for incorporating the learning outcomes/competences approach into the (bachelor, master and doctorate) programmes?
4. As an institution, do you provide faculty members with some kind of staff development support (seminars, materials, peer support, memos, mentoring, etc.)?
5. Have you heard of the term "Tuning"? If "Yes", what does it mean to you? Are you using the Tuning approach in your institution?
6. How has the academic community reacted to the application of learning outcomes/competences? Which are the main strengths and weaknesses they faced in using them?
7. Do you believe that the LO/competence approach improves student performance? Why? Do you believe that students are better prepared for finding a suitable job?
8. Does your university/faculty apply a system of evaluation of course design and delivery?


Appendix B - List of Countries, States and Subject Areas

List of Countries
Germany
Netherlands
Slovenia
Norway
Romania
Spain
Italy
Austria
Portugal
Ireland
Poland
Lithuania
Belgium
Sweden

List of States
Texas
Maryland
California
Michigan
Utah
New York
Indiana
North Carolina

List of Subject Areas
Administration
Aeronautics
Architecture
Arts
Banking and Finance
Biology
Biotechnology
Business
Business Administration
Chemistry
Christianity
Computer Science
Economics
Electrical Engineering
Electronics
Engineering
Facility Management
Foreign Languages
Gender Studies
History
Information Technology
International Business
Mathematics
Mechanical Engineering and Mechatronics
Media (TV & Radio)
Medieval & Early Modern History
Modern British History
Pedagogy
Philosophy
Physics
Physiotherapy
Psychology
Religious Studies
Sociology
Strategic Management
US History
Veterinary Medicine
Veterinary Sciences