Interactive online resources to teach Physics: quality criteria and teachers’ expectations

Jorge Andrade Silva(1), José Paulo Cravino(2)(3), Norberto Jorge Gonçalves(2)(4)

(1) Henrique Medina Secondary School, Esposende, Portugal
(2) Physics Department, School of Sciences and Technology, University of Trás-os-Montes e Alto Douro, Vila Real, Portugal
(3) Research Centre for Didactics and Technology in Teacher Education, Univ. of Aveiro, Aveiro, Portugal
(4) GCEP-Center of Physics, Univ. of Minho, Braga, Portugal

E-mail: [email protected], [email protected], [email protected]

Abstract

Current reforms in Physics education call for the integration of digital technologies into teaching, advocating that students learn Physics content and processes through technology (Dani and Koenig, 2008), and the Internet offers a rich source of potential learning resources (Richards, 2005). However, most Physics teachers find it very difficult to select valuable resources and to understand how to optimize their pedagogical potential in formative situations. To address this gap, we propose the development of a feasible list of criteria to evaluate interactive online resources, based on four contributions dealing with the evaluation of interactive materials from the perspective of their use by teachers (CEISMC1, MERLOT2, MPTL3 and SACAUSEF4). We also show the preliminary results of a survey of a sample of Portuguese Physics teachers concerning the way they find, use, and evaluate the quality of interactive online resources. The proposed list of criteria for the evaluation of interactive online resources is compared with the results of this survey to determine whether it matches the teachers’ needs and perceptions about this issue.

Interactive online resources to teach Physics

In the universe of educational resources for Physics in digital format, we classify as an interactive online resource for the teaching of Physics any software application available on the Internet that promotes and facilitates interaction with the user, through software specifically designed and intended to be used in educational situations in Physics (Ramos, 1998). A search on any Internet search engine reveals the existence of a large number of interactive resources (Clinch & Richards, 2002). However, this abundance of resources has not translated into educational use in classrooms, despite the continuous growth of computer facilities in schools (Eteokleous, 2008) and the existence of many resources to teach Physics on the Internet (Richards, 2005).

1 http://www.ceismc.gatech.edu
2 http://www.merlot.org
3 http://www.mptl.eu
4 http://www.crie.min-edu.pt/index.php?section=92

Despite an often instinctive skepticism, many teachers consider the Internet a rich source of learning resources, specifically interactive ones, which present different levels of abstraction, help students gain a better understanding (Altherr et al., 2004) and allow them to become actively involved in learning (Renkl & Atkinson, 2007). However, some teachers are simply unaware of their existence, while others find it very difficult to select valuable resources and to understand how to optimize their pedagogical potential in formative situations. In fact, one cannot expect most teachers to effectively find, evaluate, and use these resources on top of their already high workload (Mathelitsch, 2008). Against this background, criteria for the validation of resources appear as a decisive step to promote their use, by equipping teachers with reliable and easily accessible tools. Overcoming this barrier will allow a better judgment and critical evaluation of digital resources in the settings for which they were created, in different contexts (Bratina et al., 2002) and learning environments. This step seems crucial to the development of existing and new resources.

Evaluation of interactive online resources

The evaluation of the quality of interactive online resources has established itself as a central issue, because there remains doubt about the educational value of many available products that can be used in school settings and for educational purposes (Dani & Koenig, 2008). In this sense, Costa (2005) argues that the assessment should guide teachers towards knowledge and possible uses of the resources available to them, enabling effective integration into the curriculum, with pedagogical sense and attention to the specific educational processes.
Lê & Lê (2007) report that there is a paucity of systematic studies on the educational use and validation of interactive online resources, and that those that exist are constructed from the perspective of evaluators, not users. They advocate that the use of educational resources reflects the teacher's view of how the student is supposed to learn, rather than the impact the action may (or may not) have on student learning. Mathelitsch (2008) argues that the evaluation of interactive online resources to teach Physics must be framed within a more ambitious process, including the selection of resources, their inclusion in a database, and their evaluation using consolidated and standardized quality criteria.

Existing criteria lists

The evaluation of interactive online resources is usually supported by previously developed grids and scales, produced in accordance with the points that are critical to their viability for teaching. For example, in the United States in 1994, CEISMC (Center for Education Integrating Science, Mathematics and Computing) established an evaluation grid for virtual educational resources (available at http://www.ceismc.gatech.edu/MM_Tools/ERC.html), which records the evaluator's opinion on issues related to instructional design, cosmetic design, and program functionality, on a scale of 10 items. The Portuguese evaluation and certification system, SACAUSEF, adopted a descriptive evaluation with a mixed grid, which differentiates five areas: technical, content, pedagogical, linguistic, and values and attitudes. The evaluator, a teacher specialized in evaluating educational resources in his or her curricular area, is also responsible for conducting an evaluation with students in the context of the classroom. The MERLOT project defined a list of criteria for resource assessment divided into three areas: quality of content, effectiveness as a Teaching-Learning Tool, and ease of use. The grid, available at http://physics.merlot.org/PeerReviewCriteria.html, also specifies a set of criteria to be assessed by the reviewers in each field of analysis. An article by Altherr et al. (2004), arising from the work of the MPTL project team, reviews the state of the art of multimedia resources for teaching physics, particularly the phases of research, selection and evaluation. This publication sets out a list of criteria used to evaluate the quality of the resources, illustrated by applying the criteria to two practical examples. It was one of the few scientific publications found during our research phase that relates explicitly to how materials supporting physics classes available on the Internet are searched, selected and evaluated by teachers. The list proposed by the MPTL group (Altherr et al., 2004) establishes three criteria areas: motivation, content and method. The first focuses on empathy with the user and ease of navigation, the second on scientific correctness, and the third on the applicability of the resource and its suitability for the intended pedagogical use. The evaluation domains and the specific criteria of the aforementioned institutions are shown in Table 1.

CEISMC (USA)
Evaluation domains:
1. Instructional design
2. Cosmetic design
3. Program functionality
Specific criteria:
1.1. This IMM provides learners with a clear knowledge of the program objectives.
1.2. The instructional interactions in this IMM are appropriate for the objectives.
1.3. The instructional design of this IMM is based on sound learning theory and principles.
1.4. The feedback in this IMM is clear.
1.5. The pace of this IMM is appropriate.
1.6. The difficulty level of this IMM is appropriate.
2.1. The screen design of this IMM follows sound principles.
2.2. Color is appropriately used in this IMM.
2.3. The screen displays are easy to understand.
3.1. This IMM operated flawlessly.

MERLOT (USA/CAN)
Evaluation domains:
1. Quality of content
2. Effectiveness as a Teaching-Learning Tool
3. Ease of use
Specific criteria:
1.1. Does the material present valid (correct) concepts, models, and results?
1.2. Does the material present important physics concepts or models?
1.3. Does the material help develop conceptual understanding of physics?
1.4. Does the material make effective use of graphics and multimedia?
1.5. Is the material flexible?
2.1. Does the material relate to the learner's knowledge and needs?
2.2. Does the material promote knowledge development?
2.3. Does the material provide a quality learning experience?
2.4. Does the material provide learners with quality feedback?
3.1. Does the material operate in an understandable manner?
3.2. Is the general layout of the material consistent and intuitive?
3.3. Does the material provide effective feedback?
3.4. Is the material documented and does it have useful instructions?

SACAUSEF (PT)
Evaluation domains:
1. Technical
2. Content
3. Pedagogical
4. Linguistic
5. Values
Specific criteria:
1.1. Software compatibility
1.2. Design
1.3. Existence of useful instructions
1.4. Available functionalities
1.5. User help
2.1. Scientific correctness
2.2. Matching to target group
2.3. Content appropriate to learners' needs
3.1. Appropriateness to curricular program goals
3.2. Possibility of curriculum integration
3.3. Respect for different learning rhythms
4.1. Grammatical correctness
4.2. Language clarity
5.1. Absence of stereotypes and preconceptions
5.2. Promotion of male and female equality
5.3. Absence of incitement to violence
5.4. Nature and environment friendliness

MPTL (EU)
Evaluation domains:
1. Motivation
2. Content
3. Method
Specific criteria:
1.1. User-friendliness
1.2. Attractiveness
1.3. Clear description of purpose and work assignment
2.1. Relevance
2.2. Scope
2.3. Correctness
3.1. Flexibility
3.2. Matching to target group
3.3. Realization
3.4. Documentation

Table 1: Lists of quality criteria of educational resources.

Despite the different terminology used, a deeper analysis shows that there is a common core among the criteria adopted by each entity. The content area appears explicitly in all four lists, which also share the specific criteria of scientific correctness and appropriateness to the theme addressed. The technical field is likewise common to all lists, even under different terminology: "Ease of use" in MERLOT and "Motivation" in MPTL, while in CEISMC the corresponding criteria are spread over the "Cosmetic design" and "Program functionality" domains. All lists set pedagogical criteria, though there are differences both in the names adopted and in their relative weight. In this regard, in CEISMC, the oldest of the four, the pedagogical criteria have their own domain and are also partly included in the content area, while the MPTL list disperses them across the "Motivation" and "Method" domains. The list adopted by SACAUSEF adds the areas of "Language" and "Values and attitudes" to the three already mentioned. The criteria for these areas do not appear explicitly in the other lists, although they could fit within the content area. Table 2 shows that the criteria of the four referenced lists can be framed within the content, pedagogical and technical domains.

CEISMC (USA)
Content domain:
1.1. This IMM provides learners with a clear knowledge of the program objectives.
1.6. The difficulty level of this IMM is appropriate.
2.3. The screen displays are easy to understand.
Pedagogical domain:
1.2. The instructional interactions in this IMM are appropriate for the objectives.
1.3. The instructional design of this IMM is based on sound learning theory and principles.
1.4. The feedback in this IMM is clear.
Technical domain:
1.5. The pace of this IMM is appropriate.
2.1. The screen design of this IMM follows sound principles.
2.2. Color is appropriately used in this IMM.
3.1. This IMM operated flawlessly.

MERLOT (USA/CAN)
Content domain:
1.1. Does the material present valid (correct) concepts, models, and results?
1.2. Does the material present important physics concepts or models?
1.3. Does the material help develop conceptual understanding of physics?
1.4. Does the material make effective use of graphics and multimedia?
1.5. Is the material flexible?
Pedagogical domain:
2.1. Does the material relate to the learner's knowledge and needs?
2.2. Does the material promote knowledge development?
2.3. Does the material provide a quality learning experience?
2.4. Does the material provide learners with quality feedback?
Technical domain:
3.1. Does the material operate in an understandable manner?
3.2. Is the general layout of the material consistent and intuitive?
3.3. Does the material provide effective feedback?
3.4. Is the material documented and does it have useful instructions?

SACAUSEF (PT)
Content domain:
2.1. Scientific correctness
2.2. Matching to target group
2.3. Content appropriate to learners' needs
4.1. Grammatical correctness
4.2. Language clarity
5.1. Absence of stereotypes and preconceptions
5.2. Promotion of male and female equality
5.3. Absence of incitement to violence
Pedagogical domain:
3.1. Appropriateness to curricular program goals
3.2. Possibility of curriculum integration
3.3. Respect for different learning rhythms
5.4. Nature and environment friendliness
Technical domain:
1.1. Software compatibility
1.2. Design
1.3. Existence of useful instructions
1.4. Available functionalities
1.5. User help

MPTL (EU)
Content domain:
1.3. Clear description of purpose and work assignment
2.1. Relevance
2.2. Scope
2.3. Correctness
Pedagogical domain:
3.1. Flexibility
3.2. Matching to target group
3.3. Realization
3.4. Documentation
Technical domain:
1.1. User-friendliness
1.2. Attractiveness

Table 2: Lists of quality criteria separated by content, pedagogical and technical domains.

The compatibility between the lists is supported by the joint evaluation of resources by the MERLOT and MPTL projects, each one using its own list. Mathelitsch (2008) states that this process involves two stages: a preliminary assessment, made by one entity responsible for the triage of the resources that will undergo a full evaluation, and a second phase, in which the resources validated in the preliminary stage undergo an independent evaluation by each entity, with a final consensus reached between the results of each evaluator. Geissinger (1997) argues in this respect that organizations dealing with these resources should work towards the establishment of standardized criteria and terminology.

Proposal for a quality criteria list

In pursuit of uniformity of the criteria and terminology used for the evaluation of educational resources available on the Internet, and given the specificity of the materials adopted for study in this work (interactive resources in the field of physics), we propose the adoption of a list of criteria (see Table 3).

RESOURCE EVALUATION (each item is rated NA, 1, 2, 3 or 4)

A. Content
A1. Scientific correctness
A2. Clarity in the use of physics concepts
A3. Appropriateness of language and linguistic correctness
A4. Absence of prejudices and stereotypes

B. Pedagogical effectiveness
B1. Integration into the curriculum
B2. Suitability to target audience
B3. Existence of guidelines for didactic exploration
B4. Links to other related resources
B5. Inclusion of formative assessment tools
B6. Sufficiency of the metadata

C. User friendliness
C1. Ease of navigation
C2. Available features

Global appreciation:

Legend: NA = Not applicable, 1 = Strongly disagree, 2 = Disagree, 3 = Agree, 4 = Strongly agree

Table 3: Proposed list of quality criteria.

The list of criteria proposed to assess the quality of online interactive resources for teaching Physics is distributed across three general categories: content, pedagogical effectiveness and user friendliness. Each of these areas contains indicators to be evaluated on a scale with five response options, where NA means not applicable and the numerals 1-4 correspond to the legend shown in Table 3. The NA (Not Applicable) option accounts for the fact that, for some resources, evaluation under a given criterion does not apply or is not justified. The adoption of these three evaluation areas follows the classification already agreed upon by the entities discussed above. In the Content field, each application should be scientifically correct, clear and accurate in its use of language, and free of racial or gender prejudice and of incitement to violence of any kind. The Pedagogical Effectiveness field is designed to ensure the adequacy of the teaching activities, by assessing items such as the resource's fit with the curriculum guidelines of the intended education level, the adequacy of its degree of difficulty, and its ease of integration with the supporting materials of the class (or classes) in which it is supposed to be used. The User Friendliness domain assesses the ease of navigation and the available features, such as audio/video and the ability to print results or export them to other programs. The evaluation process culminates in a space where the evaluator can describe other important aspects that have not been adequately covered by the list of criteria.
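As an illustration only, and not part of the authors' proposal, the grid of Table 3 can be represented as a simple data structure in which each criterion receives either NA or a score from 1 to 4. The criterion codes are taken from the paper; the function names and the aggregation rule (the mean of the applicable scores, ignoring NA) are assumptions, since the paper leaves the "global appreciation" open-ended:

```python
# Hypothetical sketch of the Table 3 rubric. Criterion codes follow the paper;
# the mean-of-applicable-scores aggregation is an assumption for illustration.

CRITERIA = {
    "A": ["A1", "A2", "A3", "A4"],               # Content
    "B": ["B1", "B2", "B3", "B4", "B5", "B6"],   # Pedagogical effectiveness
    "C": ["C1", "C2"],                           # User friendliness
}

def domain_score(ratings, domain):
    """Mean of the 1-4 ratings in a domain, ignoring NA (None); None if all NA."""
    scores = [ratings[c] for c in CRITERIA[domain] if ratings.get(c) is not None]
    return sum(scores) / len(scores) if scores else None

def global_appreciation(ratings):
    """Mean over all applicable criteria across the three domains."""
    scores = [v for v in ratings.values() if v is not None]
    return sum(scores) / len(scores) if scores else None

# Example evaluation of a single resource (None stands for NA).
ratings = {"A1": 4, "A2": 4, "A3": 3, "A4": None,
           "B1": 3, "B2": 3, "B3": 2, "B4": None, "B5": 2, "B6": 2,
           "C1": 4, "C2": 3}

print(round(domain_score(ratings, "A"), 2))  # content domain mean -> 3.67
print(global_appreciation(ratings))          # overall mean -> 3.0
```

Treating NA as the absence of a score, rather than as a zero, keeps inapplicable criteria from dragging down the global appreciation, which mirrors the stated purpose of the NA option in the grid.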

Survey of a sample of Portuguese Physics teachers

In order to validate (or not) the quality criteria suggested in the list, a sample of Portuguese Physics teachers was asked about the way they find, use, and evaluate the quality of interactive online resources. The preliminary results of this exploratory study are presented here.

Survey presentation

The questionnaire was created on a Microsoft Excel spreadsheet and sent to teachers to complete and return via e-mail. There were three groups of questions, relating to: a) how Physics teachers find and select valuable resources; b) their opinion about the relative importance of evaluation criteria; c) their suggestions for promoting the use of online interactive resources for teaching Physics. The questionnaire was sent as an attached file to 182 Physics teachers' e-mail addresses. The e-mail contained a text presenting the project, requesting help in its implementation, ensuring anonymity, and assuring that the data collected would be used only for research purposes. From this sample, 50 teachers answered the questionnaire, 31 of them female. On average, the teachers were about 43 years old and at about the midpoint of their careers. Most respondents have a higher education degree and mainly teach at secondary school levels.

Main conclusions of the survey

Regarding the main difficulties in using interactive online resources, respondents pointed to the lack of computer equipment in classrooms, followed in importance by the difficulty in finding reliable resources, the lack of time to prepare and implement activities with the resources, and the lack of quality assessment of the materials. More specific operational aspects of classroom use were considered less difficult.
In the ranking of quality criteria for the evaluation of resources in the content area, respondents clearly established scientific correctness as the first priority, followed by clarity in the use of physical terms and concepts and by the contribution to the conceptual understanding of physics. Aspects such as the language used, the presentation of goals, the absence of prejudices and stereotypes, and the inclusion of graphs and multimedia elements were rated as less important by the teachers in the sample. In the pedagogical domain, the primary criterion is the promotion of the development of learning, followed by integration into the curriculum, appropriateness to the target audience, and the flexibility of the teaching resource. Factors such as the existence of appropriate guidelines for teaching, the establishment of links to other materials that reinforce or complement the resource, the availability of tools that contribute to formative evaluation, and the adequacy of the descriptors of the resource's characteristics (metadata) are less important for teachers. In the technical field, the most important criteria are the accuracy and clarity of the application's operating mode and the ease of navigation. The attractiveness of the layout is clearly considered a secondary factor. Comparing the preliminary results of this study with the proposed criteria list, we can conclude that in all domains there is a very strong correspondence for the more theoretical and generic criteria, at the expense of the more specific and operational ones. To promote the use of resources, respondents value teacher training in planning and implementing classes with these kinds of resources, as well as access to feedback from other teachers' use of the resources. They also considered it important that resources be evaluated by their peers and that a resource for a particular program content can be searched for quickly and efficiently. The need for suggestions on how to implement interactive online resources in teaching and learning was also strongly expressed. Three quarters of the teachers surveyed expressed their willingness to participate in a project to create a community of physics teachers who use, evaluate and propose new resources for inclusion.

Conclusion

The ease of finding and accessing interactive online resources is a critical requirement to promote their use in the classroom. One of the most important conditions for Physics teachers to select and use these resources as didactic tools is that they undergo an efficient and pragmatic evaluation process, and that teachers have access to feedback from other teachers' implementations in class. We present a list of 12 evaluation criteria distributed across three general categories: content, pedagogical effectiveness and user friendliness. The results of a survey of a sample of Portuguese Physics teachers show that the criteria proposed in the list strongly match the teachers' needs and perceptions about this issue, particularly the more generic ones. The pedagogical potential of online interactive resources for the teaching of Physics can only be fully realized if they satisfy the quality criteria and are accompanied by guidelines matching the curricula and teaching practices.

References

Altherr, S., Wagner, A., Bodo, E., & Jodl, H. J. (2004). Multimedia material for teaching physics (search, evaluation and examples). European Journal of Physics, 25, 7–14.

Bratina, T. A., Hayes, D., & Blumsack, S. L. (2002). Preparing Teachers To Use Learning Objects. The Technology Source. Available online (17/08/2010) at http://ts.mivu.org/default.asp?show=article&id=1034

Clinch, J., & Richards, K. (2002). How can the Internet be used to enhance the teaching of Physics? Physics Education, 37(2), 109–114.

Costa, F. A. (2005). Avaliação de software educativo. Ensinem-me a pescar! Cadernos SACAUSEF I, 45–53.

Dani, D. E., & Koenig, K. M. (2008). Technology and Reform-Based Science Education. Theory Into Practice, 47(3), 204–211.

Eteokleous, N. (2008). Evaluating computer technology integration in a centralized school system. Computers & Education, 51(2), 669–686.

Geissinger, H. (1997). Educational software: Criteria for evaluation. In R. Kevill (Ed.), Proceedings of ASCILITE'97 on what works and why? (pp. 219–225). Perth, WA: Curtin University of Technology.

Lê, Q., & Lê, T. (2007). Evaluation of Educational Software: Theory into Practice. Available online (12/08/2010) at http://eprints.utas.edu.au/1328/1/11-Le-P.pdf

Mathelitsch, L. (2008). Multimedia in Physics Teaching and Learning. In B. G. Sidharth, F. Honsell, O. Mansutti, K. Sreenivasan & A. De Angelis (Eds.), Proceedings of the International Conference “Frontiers on Fundamental and Computational Physics” (pp. 217–220). American Institute of Physics.

Ramos, J. L. (1998). A criação e utilização de micromundos de aprendizagem como estratégia de integração do computador no currículo do ensino secundário. PhD dissertation, University of Évora.

Renkl, A., & Atkinson, R. K. (2007). Interactive Learning Environments: Contemporary Issues and Trends. An Introduction to the Special Issue. Educational Psychology Review, 19, 235–238.

Richards, C. (2005). The design of effective ICT-supported learning activities: exemplary models, changing requirements, and new possibilities. Language Learning & Technology, 9(1), 60–79.