
Source: The Conversation (Au and NZ) – By Stewart Riddle, Associate Professor, School of Education, University of Southern Queensland


Most Australian students who took part in the last OECD Programme for International Student Assessment (PISA) said they would have invested more effort if the test had counted towards their school marks.

This is a finding from a recent report by the Australian Council for Educational Research (ACER). The data came from a questionnaire students filled out at the end of the two-hour PISA test in 2018. They were asked to rate how much effort they would have invested if they had known their results would count towards their school marks.

Some 73% of students indicated they would have put in more effort had that been the case.

While 56% of Australian students claimed to put in “high effort” in the PISA tests, this would have risen to 91% had the results counted towards their school marks.

We spend a lot of time focusing on debates about curriculum (what is being taught to students) and pedagogy (how it is being taught). Data from standardised tests such as PISA and NAPLAN are often used as evidence of declining standards, falling outcomes and failing teachers.

But the above results show yet again that schooling is more complex than politicians like to admit. Methods to lift standards such as going “back to the basics” – as the then education minister, Dan Tehan, vowed to do after the last PISA results came out – or encouraging the “best and brightest” to become teachers – a goal of the current education minister, Alan Tudge – are too simplistic for the real world.

What is PISA?

Every three years, PISA tests how 15-year-old students in dozens of countries apply reading, science, maths and other skills to real-life problems.

PISA generates much attention from policymakers and the media. It is often used as a proxy for making judgements about the quality of teaching and learning in Australian schools.




Read more: PISA doesn’t define education quality, and knee-jerk policy proposals won’t fix whatever is broken


But there are important questions regarding what exactly the PISA tests measure and how useful the results are for informing policymaking and education debates.

Is it knowledge or effort?

The ACER report showed levels of effort in PISA were higher for female students, those attending metropolitan schools, non-Indigenous students and students from backgrounds of relatively high socioeconomic advantage.

But, averaged across all students, nearly half of those who sat the 2018 PISA test admitted they did not try their best.

These results are broadly in line with the OECD average, where 68% of students said they would have tried harder on the PISA tests had they counted towards their school grades. In contrast, students in the highest-performing education systems of Beijing, Shanghai, Jiangsu and Zhejiang (China) reported very high levels of effort. There could be several reasons why low stakes appear to depress effort less in these Chinese systems, such as a more strongly competitive academic culture.




Read more: The PISA world education test results are about to drop. Is Australia getting worse?


Educational psychologists in Australia have long studied the links between motivation, self-efficacy (students’ beliefs that they can perform at the level required) and academic achievement.

For example, expectancy–value theory suggests, put simply, that the lower the perceived value or usefulness of a task, the less motivated a person is to put effort into it.
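
To illustrate the logic, a schematic sketch of the multiplicative shorthand often used to summarise expectancy–value theory (an illustration of the general idea, not a formula taken from the ACER report):

\[
\text{Motivation} \;\propto\; \text{Expectancy} \times \text{Value}
\]

On this reading, if a student judges the value of the test to be close to zero because it does not count for anything, their motivation collapses regardless of how confident they are that they could do well.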

Motivation to do the task is determined by its perceived value.
Shutterstock

Perhaps one of the unintended side effects of assuring participating students that PISA is a low-stakes task — it does not count towards their school grades — is the potential for downward pressure on performance.

The year 9 slump

Another potential reason for the lack of motivation in students taking the PISA test is the well-documented slump in engagement and motivation during the middle years of schooling.

NAPLAN data have consistently shown a pronounced drop in performance from year 7 to year 9, when students are 14–15 years old. For example, 9.1% of year 7 students didn’t meet the national minimum standard in the 2013 NAPLAN writing task. Two years later in the NAPLAN 2015 writing task, nearly twice as many (17.7%) year 9 students didn’t meet the minimum standard.

At the higher end of performance, the proportion of students above the national minimum standard dropped from 72.2% in 2013 to 59% in 2015.

The pattern is persistent. The results from the year 9 NAPLAN writing task in 2019 clearly demonstrate a dramatic drop in performance. The percentage of students in year 9 meeting or exceeding the national minimum standard was 82.9%, compared to 95% of the same student cohort in the 2013 year 3 writing task.

Research has shown the middle years of schooling are a challenging time for many students. Their bodies and minds are changing rapidly, the demands of high school and their social lives become more complex, and the level of disengagement and disaffection with school rapidly escalates.




Read more: The missing middle: puberty is a critical time at school, so why aren’t we investing in it more?


What does this mean for school policy?

Instead of policies such as going back to basics, student motivation and engagement must be part of the education policy landscape.

This means paying closer attention to the lives, knowledges, experiences, hopes, fears, challenges and opportunities facing young people.

Educators and policymakers must consider complex factors of social, economic and educational disadvantage and advantage to meet the Mparntwe Declaration goals of educational excellence and equity. This includes the interplay of socioeconomics, location, culture and community, school resourcing and access for all young people to housing, health, economic and social stability, and quality schooling.

The Conversation

Stewart Riddle does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

ref. Yes, Australia’s PISA test results may be slipping, but new findings show most students didn’t try very hard – https://theconversation.com/yes-australias-pisa-test-results-may-be-slipping-but-new-findings-show-most-students-didnt-try-very-hard-172050
