Learners' performance in context-based science tasks requiring higher-order cognitive activity

B. Putsoa, C. Dlamini, N. Dlamini, T. Dube, E. Khumalo, T. Masango, F. Ndlela, L. Nhlengethwa, S. Tsabedze

Abstract
A team of teachers has revived a set of assessment tasks they developed and administered to a sample of junior secondary school learners some two years ago. The poor performance of learners in that study urged the team to investigate the cognitive demands of the context-based tasks and the details of how learners tackle such tasks. An item analysis technique recently developed by members of the original team that formulated Bloom's taxonomy was used. Learners' performance was analyzed both quantitatively and qualitatively.

Contextualisation and cognition in school science
Bennett et al. (2003) stated recently that one of the most significant changes taking place in science education during the last twenty years has been "the development of a wide range of materials that use contexts and applications as a starting point for developing an understanding of scientific ideas" (p. 6). These developments, which emphasize the importance of knowledge application, extend into the assessment dimension. For instance, the Organisation for Economic Co-operation and Development (OECD, 1999) has stated that its new international assessment framework focuses on the abilities of learners to apply their understanding of scientific ideas to situations relevant to personal, social and economic life. Similarly, Baker et al. (1993), in reference to alternative assessment approaches, such as those referred to in the literature as `authentic' or `performance' assessment, stated that these seek to assess learners' ability to apply knowledge and to solve real-life problems. Bennett et al.
(2003) reported mainly on developments and effects of the context-based approach in Western countries; however, the literature does show that a number of Southern African countries are keen designers and implementers of home-grown contextualised school science materials for teaching, learning, and for assessing performance in scientific abilities (Lubben et al., 1995; Putsoa and Maphalala, 1996; Kasanda et al., 2002; Koosimile, 2004). The main reason for interest in this approach is its capacity to improve the relevance of science for learners, making it more real to their lives, and therefore more interesting and possibly less difficult. The latter is a motivational feature of special interest because it enhances the chances of including reasoning tasks in classrooms and during assessments. Reasoning here refers to those cognitive activities that go beyond knowledge acquisition and basic understanding to those associated with higher cognitive activity, as used in Bloom's taxonomy of educational objectives (Krathwohl, 2002), or in reference to critical thinking and problem solving. Lewin and Caillods (2001) have advised developing countries on the need to enhance such analytic and abstract thinking abilities and attitudes among learners because they are highly desirable lifelong attributes that are critical for national development.

The problem
The literature on assessment in science is replete with instances of low performance among learners. This phenomenon is particularly marked in tasks that require reasoning (Lewin, 1992; Putsoa, 1992; Ogunniyi, 1996). A prominent reason proffered for the difficulty experienced by learners is the low relevance of the science taught to their everyday circumstances. The continued development of curriculum materials that relate to local contexts remains a challenge for many African countries. On the other hand, various pitfalls associated with contextualization have
been pointed out, such as threats to the validity of assessment items as adequate reflections of the intended science concepts (Ahmed and Pollitt, 2001). The same authors warned against the danger of contexts distracting performers from the intended scientific ideas by being regarded by learners as ends in themselves. These hazards are recognized, and efforts were made in developing the tasks used in this study to be wary of this problem. Putsoa et al. (2003) reported on a comparison of learners' performance in eight contextualised questions (C-tasks) against a corresponding set of eight non-contextualised questions (NC-tasks). That study showed generally poor performance on the reasoning aspects of the tasks, in that the percentage of correct responses in all C- and NC-tasks ranged from 13% to 56%. Nonetheless, the results showed that learners had performed better in most (75%) of the C-tasks and had scored better in only two of the eight NC-tasks. Although the difference in the respective means of 31.7% and 25.6% for C- and NC-tasks was not statistically significant, the research group surmised that learners' higher performance in context-based tasks had to do with higher motivation, arising from the challenge of tackling familiar scenarios of the local environment and the urge to succeed. The nature of the tasks is aligned with some of the characteristics of `authentic' or `performance' assessments. Baker et al. (1993) stated that these tend to be open-ended tasks; focus on higher-order skills; employ context-sensitive strategies; and often use complex problems requiring several types of performance and significant student time. In addition, Custer (1993) is of the view that authentic assessments incorporate a wide variety of techniques designed to correspond as closely as possible to `real world' student experiences (p. 660).
Thus, the C-tasks were regarded by the research team as more difficult than the NC ones because they demanded a preliminary cognitive act of interpreting the issues embedded in the local context before the learner determined an appropriate scientific idea to apply in tackling the given task. Conversely, for the NC-tasks, the need for an initial interpretation of a real situation did not arise. In the current study, the group analyzed the attributes of each of the eight C-tasks to identify elements of higher-order cognition demanded of learners as a result of contextualization. This enabled the team to determine learners' competencies and difficulties in tackling science-based reasoning tasks. The research questions addressed three concerns, stated below.
1. What higher-order cognitive activity characteristics are contained in the set of context-based assessment tasks?
2. How do learners perform in the higher-order cognitive activities?
3. What alternative responses do learners give to the scientific reasoning tasks?

Methodology
The eight context-based assessment tasks used in the 2003 study were re-analyzed, with a view to determining factors that contribute to the observed low performance of learners. The questions focused mainly on the scientific topic of Force. Scripts of 16 learners were selected randomly from the pile from the sample schools used in the earlier study. The data were collected as follows:
1. Using the revised Bloom's Taxonomy of Educational Objectives (Krathwohl, 2002), the research group identified the cognitive category to which each item of each task belonged. The technique used for analysing the questions is illustrated with two of the eight tasks in Table 1. Learners' scripts on these two tasks, `Maize cob' and `Litter', are displayed below for ease of reference. The number of question items contained in each task varied.
The names or titles of the tasks appear in Tables 2 and 3, which present the quantitative and qualitative analyses of learners' performance, respectively.
2. Items with the same cognitive demand were grouped together, and an overall average performance of the 16 candidates in each category was calculated.
3. Thereafter, performance in task items of Bloom's first and second cognitive categories was separated from performance in task items requiring higher-order cognition. The latter were found to require learners to apply, analyze and evaluate information.
Thus, two techniques were used in the analysis: quantitative, to determine the percentage of correct responses in each cognitive category; and qualitative, to determine the alternative responses given by learners, as opposed to the ones expected by the questions asked. The outcome of the qualitative analysis is shown in Table 3.

Results
(c) The handles of Sibusiso's wheelbarrow were broken and replaced with longer ones. Do you think this wheelbarrow can still be used as before? What can you say about the effort needed to raise the same load, to help support your idea?

Table 1: Illustrative classification of tasks using Bloom's revised framework (knowledge dimension against the cognitive process dimension: Remember, Understand, Apply, Analyze, Evaluate, Create), illustrated with the `Maize cob' task
A. Factual knowledge: Item (c)
B. Conceptual knowledge: Items (b), (a) and (c)
C. Procedural knowledge: Items (a) and (b)
D. Metacognitive knowledge: no items

Table 2: Performance in cognitive categories (N = 16 candidates). For each cognitive category the table records the tasks involved and the number of items asked in the category, the marks for all items of a task, the total marks expected from the 16 candidates, the total marks obtained, the marks as a percentage of the expected total, and the average performance across all items in the category.
Remember (5 items: Boiled egg, Gogo): marks 6; total marks expected 96; correct responses 20; item percentages 21 and 19; category average 20%.
Understand (11 items: Boiled egg, FJB, Gogo, Good food): category average 22%.
Apply (16 items: Boiled egg, Cranes, Galaza, Gogo, FJB, Maize cob): category average 31.8%.
Analyze (3 items: one each in Cranes, Maize cob and Galaza).
Evaluate (1 item): marks 2; total marks expected 32; correct responses 8; category average 25%.
Create: no items.

Table 3: Qualitative data analysis of higher-order tasks/items (task; question asked; expected response; examples of alternative responses)

Boiled egg
Question: Evaporate the product of the eggshell and vinegar reaction. What happens?
Expected response: The solid form of the salt remains.
Alternative responses: Reverting to reactants, e.g. the eggshell remains.

Crane(s)
Question: a) Show the position of the fulcrum, from various parts, in different types of crane. b) Identify the crane part for lifting the load.
Expected response: a) Arrow pointing at the crane part allowing beam movement. b) The one with a fulcrum permitting vertical movement.
Alternative responses: a) Arrows pointing at other parts, e.g. the load, the pulley. b) Pointers to various unrelated parts.

Fruit juice bottle
Question: Reason why crowned bottles break when opened.
Expected response: The crowning force used (at the factory) is greater than the force required to open the bottle.
Alternative responses: Referred to forces being large without specifying which force; or referred to increased heat or energy.

Galaza
Question: a) Show where force is applied on a three-legged pot lifted by a boy and a girl using a rod. b) Identify who exerts the greater force, and why.
Expected response: a) Arrow drawn on the rod for lifting the pot, facing downwards. b) Sipho, the boy, is closer to the centre of the rod than the girl, Ncamsile.
Alternative responses: a) Arrows facing various directions on different parts of the diagram. b) Focus on gender difference, i.e. Sipho is stronger.

Gogo
Question: Fermenting maize meal: a) Keep the fermenting mixture in the fridge? Why? b) Can Gogo get back the original maize meal after it has fermented?
Expected response: a) Keep the fermenting mixture out of the fridge; it needs warmth. b) Fermented maize meal cannot be changed back to its original state.
Alternative responses: Did not recognize or use the characteristics of a chemical reaction.

Maize cob
Question: a) Show the direction of the force applied by the fingers on a maize cob. b) Relate different tools for shelling maize to the types of force applied.
Expected response: a) Arrow on the right thumb bending northwest to show a twisting force. b) Push with a rough stone, twist with a knife, pull with the teeth.
Alternative responses: a) Arrows on the right or left hand, pointing straight up. b) Stated the everyday use of the tools, e.g. press with a rough stone, cut with a knife, grind with the teeth.
Question (tools and wheelbarrow items): a) Show the positions of the fulcrum, load and effort on tools doing work. b) Decide whether a change of structure (longer handles) affects the working of the tool, and why, in terms of effort.
Expected response: a) Labels according to the position of the work done by the tool or hand. b) Yes; less effort is applied if the wheelbarrow handles are longer.
Alternative responses: a) Most positions correctly interpreted; the error was often in identifying the fulcrum. b) Most stated that no difference occurs; others that more effort is needed.

Discussion and conclusions
The team's findings with respect to the first research question, on the extent to which the set of eight contextualised tasks assessed reasoning abilities, are displayed in Table 2. Data related to the second research question, on learners' performance in such tasks, are also reflected in Table 2. Quantitatively, the table shows that seven out of the eight tasks contained at least one item requiring higher-order cognition. `Good food' was the only task whose items were all found to belong to the category of understanding. Further, the question items were clearly skewed towards the categories of higher cognition, those of applying, analysing and evaluating information, with a total of 20 such items. This is in contrast to the total of 16 items found to require learners to recall and show understanding.
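The marks-to-percentage aggregation behind Table 2 can be sketched in a few lines of code. All category, item and mark values in this sketch are hypothetical placeholders, not the study's actual data; only the method (expected total = summed item marks times 16 candidates, performance = obtained marks as a percentage of that total) follows the table's legend.

```python
# Sketch of the Table 2 aggregation: for each cognitive category, the
# total marks expected are the summed item marks times the 16 candidates,
# and performance is the marks obtained as a percentage of that total.
# Item names and mark values below are hypothetical placeholders.

N_CANDIDATES = 16

# category -> list of (item, max marks per candidate, marks obtained by all 16)
scores = {
    "Remember": [("Boiled egg (a)", 2, 20), ("Gogo (a)", 1, 9)],
    "Apply": [("Maize cob (b)", 3, 15), ("Cranes (a)", 2, 8)],
}

def percent_correct(items):
    """Marks obtained as a percentage of the total expected from 16 scripts."""
    expected = sum(max_marks for _, max_marks, _ in items) * N_CANDIDATES
    obtained = sum(got for _, _, got in items)
    return 100.0 * obtained / expected

for category, items in scores.items():
    print(f"{category}: {percent_correct(items):.1f}% correct")
```

Grouping items by category before averaging, rather than averaging per task, is what allows a single performance figure per Bloom level as reported in Table 2.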
Therefore, it can be said that the contextualising of tasks promoted the inclusion of reasoning activities relating to real situations in the learners' environment. Although the analysis technique used in this study differed from that used two years ago, the performance average of 31.8% obtained for application tasks compares well with the mean of 31.7% obtained for all C-tasks by the larger sample of two years ago. Thus, it can again be inferred from the Table 2 data that learners experience difficulty in coping with the higher cognitive demands of analysing and evaluating situations. With respect to the third research question, which sought to reveal the nature of the difficulties experienced by learners in tackling reasoning tasks, the team's findings are annotated in Table 3. In summary, the limitations that contributed to the generally low performance were the prevalence of:
insufficient attention and concentration on the question asked, or on the elements contained in a given situation or context;
inadequate or absent use of the imagination about a given situation, for instance with three-dimensional diagrams, as was required for the tasks on `Cranes', `Galaza' (a big three-legged pot), and `Maize cob'.
The above are examples of what Lewin and Caillods (2001) regard as critical lifelong attributes that must be given attention alongside the acquisition of scientific knowledge in classrooms.

References
Ahmed, A. and Pollitt, A. (2001). Improving the validity of contextualised questions. Paper presented at the British Educational Research Association Annual Conference, Leeds, UK.
Baker, E. L., O'Neil, H. F. and Linn, R. L. (1993). Policy and validity prospects for performance-based assessment. American Psychologist, 48(12), 1210-1218.
Bennett, J. and Lubben, F. (2003). A systematic review of the effects of context-based and Science-Technology-Society (STS) approaches in the teaching of secondary science. London: Evidence for Policy and Practice Information and Co-ordinating Centre (EPPI-Centre).
Custer, R. L. (1993). Performance-Based Education Implementation Handbook. Columbia: Instructional Materials Lab, University of Missouri.
Kasanda, C. D., !Gaoseb, N. and Lubben, F. (2002). A comparison of the use of everyday contexts by Mathematics and Science teachers in Namibian secondary schools. Proceedings of the 10th Annual Conference of the Southern African Association for Research in Mathematics, Science and Technology Education (SAARMSTE), Durban.
Koosimile, A. T. (2004). Out-of-school experiences in science classes: problems, issues and challenges in Botswana. International Journal of Science Education, 26.
Krathwohl, D. R. (2002). A revision of Bloom's taxonomy: an overview. Theory Into Practice, 41(4), 212-218. College of Education, The Ohio State University.
Lewin, K. (1992). Science education in developing countries: issues and perspectives for planners. Paris: International Institute for Educational Planning, UNESCO.
Lewin, K. M. and Caillods, F. (2001). Financing secondary education in developing countries: strategies for sustainable growth. Paris: International Institute for Educational Planning, UNESCO.
Lubben, F., Campbell, B. and Dlamini, B. (1995). In-service support for a technological approach to science education. Report to the Overseas Development Administration (ODA).
Ogunniyi, M. B. (1996). Science, technology and Mathematics: the problem of developing critical human capital in Africa. International Journal of Science Education, 18, 267-281.
Organisation for Economic Co-operation and Development (1999). Measuring students' knowledge and skills: a new framework for assessment. Paris: OECD Publications Service.
Putsoa, B. (1992). Investigating the ability to apply scientific knowledge, through process skills, among high school leavers in Swaziland. Unpublished D.Phil. thesis, University of York.
Putsoa, B., Dlamini, C., Dlamini, E., Dlamini, N., Dube, T., Khumalo, E., Masango, T., Ndlela, F., Nhlengetfwa, L. and Tsabedze, S. (2003). Comparing learners' performance in contextualised and non-contextualised science tasks. Proceedings of the 11th Annual Conference of the Southern African Association for Research in Mathematics, Science and Technology Education (SAARMSTE), Mbabane.
Putsoa, B. and Maphalala, T. P. (1996). Linking School Science with Local Industry and Indigenous Technology (LISSIT) in Swaziland. Proceedings of the Annual Conference of the Southern African Association for Research in Mathematics and Science Education (SAARMSE).