INQUIRY BASED SCIENCE EDUCATION BRINGING THEORY TO PRACTICE

Kinga Orwat, Paweł Bernard, Karol Dudek
Jagiellonian University in Krakow, Poland
kinga.orwat@uj.edu.pl, bernard@chemia.uj.edu.pl, karol.dudek@uj.edu.pl

ABSTRACT
Teaching methods based on inquiry are increasingly widely used in teaching the natural sciences. In accordance with the new curriculum introduced to Polish schools in 2008, students should gain new knowledge through the open-inquiry method. This implies a change in the teacher's approach to the didactic process and a change in the assessment system. This article attempts to analyse how a teacher who has theoretical knowledge of the IBSE teaching method, and of evaluating students working in this manner, puts that knowledge into practice at the level of upper secondary school, during chemistry classes. The research was a case study based on materials and teachers trained in the framework of the SAILS project.

KEYWORDS
Formative assessment, IBSE, open inquiry, case study, galvanic cells

INTRODUCTION
Inquiry Based Science Education (IBSE) can be defined as an intentional process of diagnosing problems, critically analysing experiments, searching for alternative solutions, planning research, testing hypotheses, searching for information, constructing models, discussing with colleagues and formulating coherent arguments (Linn, Davis and Bell, 2004). Unlike in traditional teaching, in this approach the teacher gives up the leading role and instead acts as a trainer and lesson moderator (Anderson, 2002), while the students' own activity plays a much greater role. According to the NRC (Olson and Loucks-Horsley, 2000), the essential features of classroom inquiry are the following:
- Learners are engaged by scientifically oriented questions.
- Learners give priority to evidence, which allows them to develop and evaluate explanations that address scientifically oriented questions.
- Learners formulate explanations from evidence to address scientifically oriented questions.
- Learners evaluate their explanations in the light of alternative explanations, particularly those reflecting scientific understanding.
- Learners communicate and justify their proposed explanations.

Various models of instruction can be used to implement these principles in practice (Atkin and Karplus, 1962; Martin, Sexton and Gerlovich, 1999). For example, the 5E learning cycle covers the stages Engage, Explore, Explain, Elaborate (or Extend) and Evaluate (Bybee, 2002). According to this model, an interest in the subject is developed first and the students' existing knowledge is then activated; only after that is there time for experiments and research, after which the knowledge is generalised and applied in new situations. The final stage involves the students' self-assessment with reference to the acquired information. This cycle is an effective and popular tool for organising lessons (Bernard et al., 2012).

One can also adopt a model based on the students' activities. A detailed description of the skills that can be developed through the practice of inquiry, divided into stages of scientific inquiry, was given by Wenning (2007). A simplified version of this model is a classification of students' skills into four major groups, each composed of complex skills and arranged according to the time at which they are developed: planning; experiment execution; conclusions and evaluation; and skills developed at various stages. In every group the basic skills can be specified:
1. Planning: defining the research problem; formulating a research hypothesis together with its justification; selecting variables; drafting a variable control method (a description of mathematical dependencies); drafting a raw data collection method (a description of the experimental procedure).
2. Experiment execution: collecting raw data; processing the collected data and calculating measurement errors; presenting results.
3. Conclusions and evaluation: formulating conclusions; evaluating the plan and the execution of the experiment; proposing modifications or improvements to the experiment.
4. Skills developed at various stages: searching for information; team work; safety and hygiene at work.

Introducing teaching based on IBSE also results in changes in the form and method of student evaluation (INQUIRE project). In classes taught with the IBSE method practically all inquiry skills should be subject to evaluation. The evaluation can take various forms and serve various purposes (Olson and Loucks-Horsley, 2000), and the skills should be subject to both summative and formative assessment. How a given student is assessed has an enormous impact on how teachers teach and on how effective they are (Barron and Darling-Hammond, 2008). Currently, a much greater role in the assessment of IBSE skills is played by summative assessment. This is related to the assessment of students during school-leaving and diploma examinations, and to the fact that students are evaluated in projects comparing student skills across countries. Unfortunately, such assessment is usually limited solely to written answers to a presented problem. Even though these tasks are based on IBSE elements, they cannot test all the skills developed in the course of scientific inquiry, such as team-work or data-collection skills. There is therefore a clear need to increase the role of formative assessment.

According to Wynne Harlen (2000), formative assessment occupies a special place in the evaluation of inquiry skills. It enables an on-going evaluation of how the student has mastered practically all the skills trained during IBSE. Owing to it, the student obtains information about which skills he or she has already mastered completely and which ones still require work (Black and Harrison, 2004). Such assessment makes it possible to plan changes in the teaching and learning process in order to improve students' achievements (Harrison, 2014). Formative assessment can be carried out at any stage of teaching, unlike summative assessment, which usually occurs towards the end (Olson and Loucks-Horsley, 2000) and gives information about the student's progress against a selected population (Harlen, 2000).

The tools used to evaluate students' inquiry skills, according to Llewellyn (2007), include multiple-choice and constructed-response questions, rubrics, monitoring charts, concept maps, journal entries, lab reports, oral presentations, and performance and self-assessment. Other tools include Lawson-type tests and check-lists. Barron and Darling-Hammond (2008) describe rubrics as one of the most effective student evaluation tools. Two kinds of rubrics can be distinguished: analytical rubrics, which assess one selected skill, and holistic rubrics, which cover a number of skills in a single assessment. The individual tools can serve different purposes: for example, multiple-choice questions can be used to determine the level of understanding of selected content, while rubrics or oral presentations make it possible to assess the student's inquiry skills (Llewellyn, 2007). The above tools can be used for both summative and formative assessment, since the type of evaluation depends primarily on its objective rather than on the tools used (Scriven, 1967; Black and Harrison, 2014).

One of the challenges facing teachers using IBSE is collecting information and evidence on the degree to which their students have mastered particular skills. Such assessment can be based on observation of students' work (observation sheets, video recordings) or on written work such as notes taken by students in the course of their work, research reports or tests. Another source of information can be self-assessment, or mutual evaluation by students working in a team (Black et al., 2003).

In 2009 a new curriculum for lower and upper secondary schools was approved in Poland. In accordance with it, "(…) The student gains chemical knowledge in the research manner: he or she observes, tests, verifies, draws conclusions and generalisations (…)" (Polish Core Curriculum, 2008). Under the respective Regulation, an important role is to be played by developing students' skills and research competencies, which is compliant with the IBSE teaching methodology. The guidelines provided for in the Act are also reflected in the new examination requirements: the form and content of examination tasks verify students' research skills to a growing extent. Despite the introduction of these legal changes, the implementation of teaching based on IBSE is very slow. This is related to limiting factors such as the poor equipment of school laboratories and the unadjusted form and content of textbooks, but also the poor preparation of teachers to work with this method (Bernard et al., 2012).

Even though the IBSE method has been identified as one that offers the potential to increase students' interest in natural science subjects at various educational levels, few teachers have had an opportunity to become acquainted in practice with this way of working during lessons, and few of them have any knowledge of how to use inquiry in the course of their lessons (Olson and Loucks-Horsley, 2000). Moreover, a teacher working with open-inquiry methods faces difficulties in assessing his or her students. Within the EU 7th Framework Programme a number of projects were implemented to develop the methodology and a base of didactic tools for IBSE (Establish; The Fibonacci Project; Primas; Profiles; INQUIRE), as well as projects aimed at developing IBSE assessment tools (SAILS; Assist-Me). The majority of these projects included training for teachers on the application of the method.

The objective of the research was to check how a teacher who has theoretical knowledge of teaching with the IBSE method, and of assessing students working with this method, actually implements the two in practice: what difficulties he encounters, which problems he can solve on his own, and what motivates him to keep going. The research was a case study based on materials and teachers taking part in the SAILS project.

METHODOLOGY OF THE RESEARCH
The object of the research was the implementation of teaching based on inquiry, together with an evaluation of selected IBSE aspects, at the level of upper secondary school during chemistry classes. The implementation was carried out by a teacher, a SAILS project participant, who had theoretical knowledge of IBSE and had completed a training course on the methodology of teaching classes according to the IBSE method and on assessing students working with this method. The research was conducted during the teacher's first attempt to introduce IBSE in practice; for this reason the teacher had no prior experience either in teaching according to IBSE or in assessing students working with this method.

The teacher was provided with didactic materials containing a description of a didactic unit on teaching about galvanic cells, prepared in the course of the above-mentioned project. The unit "Galvanic cells" proposes a strategy for the educational path and specifies which IBSE skills can be subject to assessment in the course of the lesson. The materials also include proposals on methods of collecting information and on the types of tools that could be applied during the assessment; however, they contain no concrete tools, examples or evaluation standards. The teacher was free to choose the class in which to teach the lesson, and was asked to prepare the educational path scheme, choose the skills subject to assessment and the tools for collecting data for the students' assessment, and prepare the criteria. The teacher was informed that the assessment should be formative. The teacher reported the implementation results in the form of a short report and was subsequently interviewed.

For the implementation the teacher chose the first grade (17-18 years old) of an upper secondary school, studying chemistry according to an extended curriculum. The students had some basic knowledge about galvanic cells. The class was composed of 6 students, who worked in two teams of three. The classes lasted 180 minutes, i.e. the equivalent of 4 lessons in the Polish educational system, and were completed within one day. The students worked with the open-inquiry method for the first time.

The teacher prepared the tools and criteria for assessing the students himself; the drafted materials were consulted with the research authors. The teacher chose to assess the following skills: a) formulating a hypothesis, b) designing an experimental procedure, c) drawing conclusions and evaluation. The assessment of the chosen skills was based on the students' reports, prepared in the course of the classes. The students were not familiar with the assessment criteria, but they knew which skills would be evaluated. The teacher decided that the skill "formulating a hypothesis" would be assessed with a check-list (Table 1).

Table 1. The tool to assess the "formulating a hypothesis" skill.

Aspects of the hypothesis evaluation (to be marked YES or NO):
- The hypothesis is formulated in a simple and clear way.
- The hypothesis constitutes an adequate answer to the stated problem / research question.
- The hypothesis is at the appropriate scientific level.
- The hypothesis is verifiable with the use of the available resources / materials.
- The hypothesis is not obvious.
- The hypothesis has a justification that is adequate for the given level of education.

The teacher decided that the skill "Designing an experimental procedure" would be assessed with a four-level rubric (Table 2).

Table 2. The assessment tool for the "Designing an experimental procedure" skill.

POOR: partially selects reagents and laboratory equipment, inadequately or with the help of a teacher; develops an experimental method that does not take into account the dependent or independent variable, or does not develop any method; presents an incomplete or inadequate sequence of causes and effects.

CORRECT: selects reagents and laboratory equipment with the help of a teacher; develops an experimental method that takes into account the dependent and independent variable; presents an incomplete sequence of causes and effects, but takes into account health and safety regulations.

VERY GOOD: correctly selects reagents and laboratory equipment; develops an experimental method that takes into account the dependent and independent variables and some control variables; presents a logical but incomplete sequence of causes and effects.

EXCELLENT: correctly selects reagents and laboratory equipment; develops an experimental method that takes into account all variables; presents a complete sequence of causes and effects that takes into account all experimental conditions.

A rubric was also designed to assess the skill "Drawing conclusions and evaluation" (Table 3).

Table 3. The tool to assess the "Drawing conclusions and evaluation" skill.

POOR: draws inadequate conclusions, or draws conclusions with the teacher's help; proposes unrealistic modifications of the experimental plan, or modifications that have no influence on the obtained results.

CORRECT: draws some conclusions that are adequate to the obtained results; draws conclusions that allow the verification of the hypothesis; lists some measurement errors; proposes modifications to the plan of the experiment with the teacher's help.

VERY GOOD: draws conclusions that are adequate to the obtained results; draws conclusions that clearly verify the hypothesis; lists all the measurement errors; proposes adequate modifications to the plan of the experiment.

EXCELLENT: draws conclusions that are adequate to the obtained results; draws conclusions that clearly verify the hypothesis; presents a full and proper discussion of measurement errors; proposes adequate and realistic modifications to the plan of the experiment.

RESULTS
The classes began with the teacher formulating a general problem / research question: "How to obtain the highest voltage of a cell?" The teacher made available to the students the materials they could use for their experimental work, among others metal plates (Al, Cu, Zn, Pb), solutions with a concentration of 0.5 mol/dm³ (CuSO4, Al(NO3)3, Pb(NO3)2, ZnCl2), voltmeters and beakers. The students' first task was to formulate detailed research questions and hypotheses. The students wrote their answers in the report as their work progressed. Below are the students' replies within the scope of the assessed skills, together with the teacher's analysis of them.

Hypotheses noted down by the students:

Group A: The greatest voltage can be obtained from the aluminium-copper cell at the following concentrations: CuSO4 0.5 mol/dm³ and Al(NO3)3 0.2 mol/dm³.

Group B: By constructing a zinc-copper cell, we obtain the greatest voltage.

Teacher's comment: The hypotheses proposed by groups A and B are at the appropriate level; they can be verified with the use of the available materials, they are not obvious, and they provide an adequate answer to the research question. However, they have no justification. The proposed hypotheses therefore comply with the guidelines of the assessment tool except for one property, i.e. the justification of the hypothesis. This is due to the fact that a hypothesis in itself does not have to contain an explanation: the students did not omit the justification because they did not know how to formulate it, but because they were not aware of the need to do so. Since I said "The formulation of the hypothesis will be assessed", the students focused on the hypothesis alone rather than on explaining it.
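
For illustration, the reasoning behind both hypotheses can be made explicit with a short worked calculation; the standard reduction potentials below are approximate textbook values assumed for this sketch and were not part of the lesson materials or the students' reports. With E°(Cu²⁺/Cu) ≈ +0.34 V, E°(Zn²⁺/Zn) ≈ -0.76 V and E°(Al³⁺/Al) ≈ -1.66 V, the standard EMF of a cell is

\[
E^{\circ}_{\mathrm{cell}} = E^{\circ}_{\mathrm{cathode}} - E^{\circ}_{\mathrm{anode}},
\]
\[
E^{\circ}(\mathrm{Zn\text{-}Cu}) \approx 0.34 - (-0.76) = 1.10\ \mathrm{V},
\qquad
E^{\circ}(\mathrm{Al\text{-}Cu}) \approx 0.34 - (-1.66) = 2.00\ \mathrm{V},
\]

which illustrates why pairing copper with a metal of a much lower (more negative) standard potential is expected to give a high cell voltage.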

The next stage of the students' work was specifying the variables. Defining variables is not included in the Polish curriculum for any subject; for this reason the teacher decided not to assess this skill and instead introduced the definition of variables during the lesson. Even so, defining the variables for the planned experiments turned out to be very difficult. The students then moved on to designing the experimental procedure.

Students' answers:

Group A: Laboratory equipment: multimeter, two beakers, salt bridge, copper plate, aluminium plate, measuring cylinder. Reagents: CuSO4, Al(NO3)3, NaNO3. Plan: A half-cell was constructed from the aluminium plate inserted into a beaker with a solution of Al(NO3)3 at a concentration of 0.2 mol/dm³. Then the second half-cell was constructed from a copper plate and a beaker into which CuSO4 at a concentration of 0.5 mol/dm³ was poured. The two half-cells were connected with the salt bridge filled with a solution of NaNO3, and a multimeter was connected to the cell.

Teacher's comment: The plan covers appropriate reagents and laboratory equipment and presents an incomplete but logical sequence of causes and effects. Unfortunately, the plan does not refer directly to the variables (in the case of both group A and group B). This may result from the fact that in the Polish curriculum defining variables is not covered by any subject; despite my introducing the variables properly during the unit, the students still had a big problem with defining them. It seems that the skills assessed on the basis of such a plan do not fully reflect the students' abilities in this field. With the proposed rubric I was not able to determine the level of the students' skills clearly, as their work showed features of the excellent, very good and poor levels at the same time. Although the students do not define variables, they do use them: during the experiment they also constructed a second cell consisting of the same half-cells as above, but with the concentration of CuSO4 changed from 0.5 to 0.25 mol/dm³. However, they did not include this in the plan of the experiment.

The drafted plans were the basis for the execution of the experiment and were subsequently used to formulate conclusions.

Students' answers:

Group B: To obtain the highest voltage, metals with the highest possible difference in their potentials should be selected, for example by forming a zinc-copper cell: Zn | Zn²⁺ || Cu²⁺ | Cu. To obtain an even greater EMF of the cell, one can construct an aluminium-silver cell: Al | Al³⁺ || Ag⁺ | Ag. The concentrations of the solutions can also be changed in order to obtain an even greater electromotive force.

Teacher's comment: As in the case of the planning skills, the work presented by the students was compared against the assessment tool (Table 3). The students do not draw conclusions from the experiment but from theoretical considerations (the electrochemical series), disregarding the results of the experiment, which clearly demonstrated the relationship between the concentration and the voltage. The students write that "(…) you can change the concentration of solutions" but do not write how to do it: whether to increase the concentration or to dilute the solutions. During the experiment the students changed the solution concentration and found a dependence between the concentration and the voltage; nevertheless, this fact is not included or explained in their conclusions. The formulated conclusions are not adequate to the hypotheses and do not allow their final verification. Neither of the groups considers any measurement errors or corrections. The students wrote "Multimeter's precision: 0.001 V", but they do not refer to this information in their conclusions or evaluation. The omission may be due to the fact that the students have never dealt with such data and are not aware of its presence or of the need to discuss it. As with the previous tool, this one is not very useful for assessing students who are working with the inquiry method for the first time; the proposed tool is too advanced for such students.
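
For illustration, the concentration dependence that the students observed but did not analyse can be sketched with the Nernst equation; this relation is standard textbook material rather than part of the teacher's materials, and concentrations are used here in place of activities. For a cell with a Cu²⁺/Cu cathode at 25 °C,

\[
E_{\mathrm{cathode}} = E^{\circ}(\mathrm{Cu^{2+}/Cu}) + \frac{0.0592\ \mathrm{V}}{2}\,\log[\mathrm{Cu^{2+}}],
\]

so halving the CuSO4 concentration from 0.5 to 0.25 mol/dm³, as group A did, lowers the cell voltage by roughly

\[
\Delta E \approx \frac{0.0592\ \mathrm{V}}{2}\,\log 2 \approx 0.009\ \mathrm{V},
\]

a small change, but still well above the multimeter precision of 0.001 V noted by the students.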

Teacher's summary: The assessment of students by rubrics is very difficult and requires a lot of teaching experience. The key issue seems to be discussing the assessment criteria with the students before the beginning of a lesson. Thanks to this, students are aware of what is expected of them and may pay more attention to planning the experiment, so that the work is performed more carefully and in more detail. In addition, the evaluation of students working with open inquiry for the first time should rather be based on basic, individual skills. In the case of the proposed assessment of "Designing an experimental procedure", four of the students' skills are evaluated at once. For me, and probably for other teachers, it would be easier to evaluate two separate basic skills rather than all of them at the same time.

CONCLUSIONS
Even though the teacher taught the class with the IBSE method for the first time, he did well. He assumed the role of the lesson's moderator and allowed the students to draft detailed research questions on their own and to propose and test solutions, without imposing his own opinion. The teacher himself decided which skills would be assessed and in what form: a check-list and rubrics. While preparing the rubrics, however, the teacher did not relate their content to the level of the students participating in the lesson; as a result, the tools included skills that the students had never had any contact with before, such as the control of variables or the analysis of measurement errors. A further error on the part of the teacher was that he did not discuss the assessment criteria with his students. Black and Wiliam (1998) claim that awareness of the guidelines and assessment criteria increases the student's learning efficiency. In the case of students working with the open-inquiry method for the first time, discussing the content of the rubrics with them would be a good solution.

One can see from the teacher's assessment that the students were assessed negatively for "Designing an experimental procedure" and "Drawing conclusions and evaluation". In the case of the assessment of formulating a hypothesis, the evaluation was very mild, and the tool itself made it possible to assess only some selected aspects of the hypothesis. Despite the poor results, the teacher tried to give positive grades in order not to discourage the students from working with the inquiry method: "(…) the drafted tools gave a negative assessment of the students; nevertheless, seeing their curiosity and involvement, I tried to raise it. When developing the next assessment tool I will try to focus on a tool that will enable a positive assessment of the student (…)".

In this research we tested the efficiency of three tools for assessing various skills. The presented tool for the assessment of formulating a hypothesis enables the evaluation of a student's work; nevertheless, the research demonstrates that one of the properties of a hypothesis, i.e. its justification, is omitted in the students' work. According to the teacher: "(…) I would propose a change to the very name of the evaluated skill/tool, from 'Formulating a hypothesis' to 'Formulating and justifying a hypothesis' (…)". The remaining assessment tools, in the form of rubrics, did not allow the work of the selected teams of students to be assessed. The proposed tools are potentially suitable for student assessment, but in the initial stage of teaching with the open-inquiry approach it would probably be more advisable to focus on the assessment of less complex skills, such as the selection and definition of variables. "(…) A beginner student is not aware of the need to include various pieces of information resulting from the research; for this reason I propose that in the initial stage of teaching with the IBSE method students should be allowed to work with a work sheet on which the consecutive research points are marked. Moreover, if we want to assess such skills as, for example, conclusions and evaluation, it would be good to specify the elements of that assessment, such as: conclusions; verification of the hypothesis; measurement errors; any amendments / changes (…)". What is more, the teacher draws attention to the advancement level of the proposed rubrics: "(…) With the proposed tools it is very difficult to assess students. These rubrics allow the assessment of a number of skills simultaneously. In the case of a poor student this is impossible, because some of these skills may be mastered at a good level while others are not mastered at all. Therefore we have a problem: into what category should the student be placed? (…)".

Rubrics are one of the tools most frequently used to assess students working with open-inquiry methods. However, the research demonstrates that they are not always appropriate for this purpose. Taking into consideration the varying degrees of complexity of the assessed skills, it would be better to propose more expanded rubrics, in which a single skill is characterised across many levels of advancement; this in turn would enable teachers to match the student's skill level to the rubric's level. On the basis of the conducted research one can say that it is possible to work with the open-inquiry method with students who have never before experienced even simple methods based on inquiry. A teacher who is to evaluate students in such a situation should focus on the assessment of their basic skills. If rubrics are used as the assessment tool, their content and complexity should be correlated with the advancement level of the students, so that they describe the students' behaviour in a clear way.
It is also important that the assessment tools be discussed in detail by the teacher and understood by students.

REFERENCES
Anderson, R. D., 2002. Reforming Science Teaching: What Research Says about Inquiry. Journal of Science Teacher Education, 13(1), pp. 1-12.
Assist-Me Project. [online] Available at: <http://assistme.ku.dk/> [Accessed 17 July 2014].
Atkin, J. M. and Karplus, R., 1962. Discovery or invention? The Science Teacher, 29(5), pp. 45-51.
Barron, B. and Darling-Hammond, L., 2008. Teaching for Meaningful Learning: A Review of Research on Inquiry-Based and Cooperative Learning. In: L. Darling-Hammond, B. Barron, P. D. Pearson, A. H. Schoenfeld, E. K. Stage, T. D. Zimmerman and J. L. Tilson, Powerful Learning: What We Know About Teaching for Understanding. San Francisco: Jossey-Bass.
Bernard, P., Białas, A., Broś, P., Ellermeijer, T., Kędzierska, E., Krzeczkowska, M. and Szostak, E., 2012. Podstawy metodologii IBSE. In: I. Maciejowska and E. Odrowąż, eds., Nauczanie przedmiotów przyrodniczych kształtujące postawy i umiejętności badawcze uczniów. Kraków: Uniwersytet Jagielloński.
Bernard, P., Dudek, K., Maciejowska, I., Odrowąż, E. and Geoghegan, R., 2012. Introduction of inquiry based science education into Polish science curriculum - general findings of teachers' attitude. Chemistry-Didactics-Ecology-Metrology, 17(1-2), pp. 49-59.
Bell, B. and Cowie, B., 2001. The characteristics of formative assessment in science education. Science Education, 85(5), pp. 536-553.
Black, P. and Harrison, C., 2004. Science Inside the Black Box: Assessment for Learning in the Classroom. London: NFER Nelson.
Black, P. and Harrison, C., 2014. Assessment in the Pedagogy of Inquiry. SMEC 2014, Dublin, unpublished.
Black, P. and Wiliam, D., 1998. Assessment and Classroom Learning. Assessment in Education: Principles, Policy & Practice, 5(1), pp. 7-74.
Black, P., Harrison, C., Lee, C., Marshall, B. and Wiliam, D., 2003. Assessment for Learning: Putting It into Practice. Buckingham, United Kingdom: Open University Press.
Bybee, R., 2002. Learning Science and the Science of Learning. Arlington: NSTA Press.
Establish Project. [online] Available at: <http://www.establish-fp7.eu/> [Accessed 17 July 2014].
Harlen, W., 2000. Assessment in the Inquiry Classroom. In: Foundations, Volume 2: Inquiry: Thoughts, Views, and Strategies for the K-5 Classroom. National Science Foundation.
Harrison, C., 2014. Assessment of Inquiry Skills in the SAILS Project. Science Education International, 25(1), pp. 112-122.
INQUIRE Project. [online] Available at: <http://www.inquirebotany.org/en/news/taking-ibse-intosecondary-education-188.html> [Accessed 07 July 2014].
Linn, M. C., Davis, E. A. and Bell, P., 2004. Internet Environments for Science Education. London: Lawrence Erlbaum Associates.
Llewellyn, D. J., 2007. Inquire Within: Implementing Inquiry-Based Science Standards in Grades 3-8. 2nd ed. Thousand Oaks: Corwin Press.
Maciejowska, I., 2012. Kształtowanie postaw i umiejętności badawczych ucznia w nowej podstawie programowej. In: IBSE Inquiry Based Science Education. Nauczanie przedmiotów przyrodniczych kształtujące postawy badawcze ucznia. Kraków: Uniwersytet Jagielloński.
Martin, R., Sexton, C. and Gerlovich, J., 1999. Science for All Children: Lessons for Constructing Understanding. Needham Heights: Allyn & Bacon.
National Research Council, 1996. National Science Education Standards. Washington: National Academy Press.
Olson, S. and Loucks-Horsley, S., 2000. Inquiry and the National Science Education Standards: A Guide for Teaching and Learning. Washington, D.C.: National Academy Press.
Polish Core Curriculum, 2008. Act of the Polish Parliament. Regulation of the Minister of Education, Dz.U. 2008 Nr 4, poz. 17. Podstawa programowa z komentarzami, tom 5: Edukacja przyrodnicza w szkole podstawowej, gimnazjum i liceum.
Primas Project. [online] Available at: <http://www.promas.gr/en/servicessupport_en.htm> [Accessed 07 July 2014].
Profiles Project. [online] Available at: <http://www.profiles-project.eu/> [Accessed 17 July 2014].
SAILS Project. [online] Available at: <http://www.sails-project.eu/portal> [Accessed 17 July 2014].
Scriven, M., 1967. The methodology of evaluation. In: R. W. Tyler, R. M. Gagné and M. Scriven, eds., Perspectives of Curriculum Evaluation. Chicago: Rand McNally.
The Fibonacci Project. [online] Available at: <http://fibonacci-project.eu/> [Accessed 17 July 2014].
Wenning, C. J., 2007. Assessing inquiry skills as a component of scientific literacy. Journal of Physics Teacher Education Online, 4(2), pp. 21-24. [online] Available at: <http://www.phy.ilstu.edu/jpteo/issues/win2007.html> [Accessed 17 July 2014].