Student reporting: Where to from here? 

Hilary Hollingsworth and Jonathan Heard


Across Australia, as in many other locations, there is a long tradition of schools engaging in activities intended to communicate information about student learning each year. Given the tremendous investment of effort in these activities by teachers and principals, questions of great interest include: are these activities providing quality information about student learning, and are there alternative designs for these activities that might provide ‘better’ information about student learning?

Over the last three years, we have investigated these questions through the Australian Council for Educational Research (ACER) Communicating Student Learning Progress project (Hollingsworth, Heard & Weldon, in press). Our investigation has focused on the national research, policy and practice landscape related to how information about student learning is communicated, and in particular, what student reporting looks like.

This article presents some of the insights into student reporting revealed through our investigation. A brief summary of some of the prevailing issues related to student reporting is presented first to provide a context for current policies and practice. This is followed by a discussion of the growing use by schools of electronic systems and tools to communicate student learning, and the opportunities and possibilities that these systems offer for reporting. 

Prevailing issues in student reporting


Students’ school reports have long been the cornerstone of communication to parents: a much anticipated document that offers a final reckoning of a child’s achievement each semester. However, in Australia the shift from syllabus- to outcomes- to standards-based curricula, along with other educational trends over the years, has left its mark on student reporting, with the ‘traditional’ semester report frequently a target of criticism and its perceived inadequacies a topic of much debate.

Three issues related to student reporting practice that receive consistent airtime in this debate are: the language used in reports, the grading and ranking of students, and the timing of reporting. 

Various approaches to describing student performance have been applied by schools and systems in student reports over the years. During the transition towards outcomes-based education in the 1990s, for example, the objective, descriptive language of the curriculum increasingly came to replace the subjective and evaluative voice of the teacher, as assessment started to focus less on grading a student’s performance on tasks and more on measuring their individual attainment of expected learning outcomes. National and state reviews from the mid-90s to the mid-2000s (Cuttance, Stokes, & Department of Education, Training and Youth Affairs, 2000; Eltis & New South Wales Department of Training and Education Co-ordination, 1995; Reporting to Parents Taskforce & Tasmania Education Department, 2006) revealed that parents who were unfamiliar with and unaccustomed to this new style of reporting found it opaque and inaccessible. This community perception prompted the federal government in 2004 to announce that education funding to states and territories would be tied to a requirement that schools write “plain language” reports at least twice yearly, and that a child’s performance in their subjects be graded using an A to E (or similar five-point) scale.

While few argued with the need to improve the language used in reports, the reintroduction of the A to E scale proved – and remains – a sticking point. Educators have questioned the impact of A to E grades on student motivation, given the potential for reports to define students by their position on the scale. Doubt has also been cast on whether the A to E scale can be, and is, applied with any level of consistency. Recently, the inadequacy of A to E grades for the reporting of learning progress has been reflected in the Gonski 2.0 review, which recommends the introduction of new reporting arrangements that focus on both attainment and gain (Australian Government Department of Education and Training, 2018).

Another concern with the tradition of semester reporting is that it does not provide timely enough information to be useful. School reporting processes often require such extensive lead time for the preparation and finalisation of reports that by the time they are made available they lack currency. In our investigation, calls for greater flexibility in student reporting processes to increase the currency of information were made by students, parents and teachers.

These and other issues highlight the limitations of inherited models and legacy practices related to student reporting, and provoke a timely reimagining of the purpose and form of student reporting (for a more extensive review, see Hollingsworth et al., in press). What follows is a discussion of the ways that some schools are utilising electronic systems to support, extend and reconceive their student reporting practices.

The end of the semester report?

In the last 10 years, schools have increasingly adopted sophisticated electronic management systems with multi-user functionality, which, if thoughtfully managed, may help to address some of the limitations of traditional reporting. Variously referred to as School Management Systems (SMS), Student or School Information Systems (SIS), Learning Management Systems (LMS) or Virtual Learning Environments (VLE), what unites these commercially available products is that they provide the capacity for schools to report on student learning, both to students and to parents. Many of these products allow schools to generate semester reports automatically, simply by collating and aggregating learning data and teacher feedback comments stored in a teacher’s online ‘mark book’.

According to interviews with several product providers, the vast majority of their client schools still produce semester reports, satisfying what many interpret as the mandated government requirement that all schools produce two summative written reports per year. However, all acknowledge that the semester report is quickly changing. In place of detailed comments and information about a student’s performance, many schools are publishing more succinct, auto-generated academic transcripts, which are sometimes little more than graphs and grades.

James Leckie, co-founder and director of Schoolbox, observes that “there is certainly a trend towards schools removing the requirement for teachers to enter any additional information at the end of the semester”. The primacy of the traditional semester report as the main vehicle for communicating to parents about their child’s learning seems to be “a paradigm that is changing”. So says Daniel Hill, director of sales at Edumate, who – from his dealings with client schools – suspects many would drop the practice of semester reporting entirely, were it not for the mandated requirements.

Continuous reporting

The waning effort put into the production of semester reports in some schools is explained by an increasing preference for the new reporting functionality these electronic tools offer – continuous online reporting. Continuous reporting refers to the practice of reporting in regular instalments. Typically, at key moments throughout the semester, teachers provide updated assessment information to the system online, which is then made visible to students and parents.

The main benefit schools perceive in continuous reporting (sometimes referred to as progressive reporting) is the timely manner in which parents are informed of their child’s achievement. It is often seen as ‘too late’ at the end of semester for a parent to be formally notified of how their child has been performing. In addition, the capacity to upload annotated copies of the student’s work, to include a copy of the assessment rubric, and to type feedback comments of unlimited length to the student (visible also to parents) is seen as vastly more informative than the restrictive summary comments usually offered in a semester report.

Despite the potential for continuous reporting to be seen as burdensome for teachers, many schools are instead seeing it as a trade-off of teacher time, particularly in secondary settings. Assessing several tasks and providing feedback to students throughout the semester is already established practice in secondary schools. Travis Gandy, general manager of operations at Compass, suggests that one of the more popular aspects of continuous reporting for teachers is the lack of an “end of semester rush to get the reports out”. If detailed feedback offered to the student mid-term can also, by a click of a button, be made visible to parents as well, then avoiding a re-hash of this feedback for the parents at the end of semester can be seen as a win-win. 

Progressive reporting versus reporting progress

Beyond the detail, frequency, and the timeliness of information that continuous online reporting allows, product providers go to significant lengths to add new reporting features and functions. In line with much of the academic research into high-impact teaching, these features enable such things as the creation of electronic rubrics, differentiated assessment tasks, narrative teacher feedback, digital annotation of student work and student self-evaluation and reflection. 

However, research suggesting that, in any given class, the most advanced learners can be as much as five or six years ahead of the least advanced learners, has put mounting pressure on schools to be able to assess – and communicate – not simply a child’s performance against age-based curriculum standards, but the progress they make in their learning from the point at which they start. This was reflected in the recent Gonski 2.0 review, which includes the following recommendation: “Introduce new reporting arrangements with a focus on both learning attainment and learning gain, to provide meaningful information to students and their parents and carers about individual achievement and learning growth.” (Australian Government Department of Education and Training, 2018, p.31).

Such a recommendation presents significant challenges for schools. In an ACER Research Conference paper (Hollingsworth & Heard, 2018), we identify two key findings from our early analysis of samples of student reports collected from schools and sectors around the country. One of these findings is that while schools often use the word ‘progress’ within reports, most of what they report tends to focus on performance or attainment rather than learning gain. As we outline in that paper, one possible explanation for this is that in some schools, a child’s performance over time is considered synonymous with their progress over time. In the absence of assessment and reporting measures that can monitor and communicate a child’s increasing proficiency within an area of learning, a sense of progress can only be (erroneously) inferred from whether a child’s performance is improving, maintaining or declining. This is a concern both for perennially low-performing students who might still make significant learning progress each year, and for high-performing students who are not being extended.

Consultation with providers of LMS and SMS-like products would appear to support this view. Demand still exists for electronic reporting features such as grade point averages, student-to-cohort comparison charts, and ‘learning alerts’ that track the performance of students in their assessments longitudinally and notify teachers of any scores a student obtains that fall significantly outside their usual performance. It is conceivable that schools might use functions such as these to track or compare student performance, but construe them as indicating how a student is ‘progressing’. 

By implication, it may even be the case that by simply engaging in continuous – or progressive – instalments of reporting, schools misconceive this also as reporting ‘progress’. While providers of continuous reporting technologies are seeking to find solutions that would help schools to represent learning gain as well as performance in their reports, many schools are, according to Daniel Hill, “just at the beginning of that journey”. 

James Leckie agrees. While he has worked with client schools to assist them in using electronic rubrics for assessment that could theoretically measure – and track – progress, he notes that “it does rely on schools implementing standards-aligned rubrics” in the first place. Other providers, such as Sentral, enable schools to collate work sample portfolios for continuous assessment, which could theoretically be used to demonstrate gains in student learning and skill development over time. Both Sentral and Compass provide continuum tracker applications which, Travis Gandy at Compass explains, “allows schools to tick off particular [achievement standard] outcomes” over time. Gandy also points to Compass’ data analytics tool which stores external standardised assessment data (such as NAPLAN, PAT and OnDemand testing), useful for making assessments of progress along a learning or curriculum continuum. However, Gandy notes: “We often find it under-utilised by our schools”. Our own analysis of school reports in the Communicating Student Learning Progress project revealed only one school that reported standardised testing data to parents. 

Nevertheless, the functionality that electronic systems and tools already offer provides some exciting opportunities that, if re-purposed, may lead schools towards improvements in reporting along the lines of the recommendations made in the Gonski report. Teachers can report in regular instalments in place of (or perhaps in addition to) semester reports; collate digital samples of student work in electronic portfolios as evidence of growth over time; digitally annotate this work to describe the features that show increased proficiency; track these gains on a curriculum continuum or digital rubric that reflects a progression of learning within a subject; and correlate such evidence with stored data obtained from external standardised testing. In other words, schools are already well-equipped with electronic tools to communicate learning progress in the true sense: the growth in understanding, skills and knowledge a student makes over time, irrespective of their starting point.

What limits schools in this endeavour, therefore, is not the reporting technology, but the curriculum design, delivery, and assessment practices inherited from an industrial model of schooling, with its narrow focus on students’ performance and achievement against year level expectations, rather than the gains that learners make over time. A broader focus, coupled with a reimagining of how existing technologies could be used, provides a clear first step towards the improvements in reporting that would enable us to better value and communicate the growth our students make.


Australian Government Department of Education and Training (2018). Through Growth to Achievement: Report of the Review to Achieve Educational Excellence in Australian Schools. Canberra: Commonwealth of Australia. Retrieved May 2018.

Cuttance, P., Stokes, S. A. & Department of Education, Training and Youth Affairs (DETYA) (2000). Reporting on student and school achievement. Canberra: DETYA.

Eltis, K. J. & New South Wales Department of Training and Education Co-ordination (1995). Focusing on learning: Report of the review of outcomes and profiles in New South Wales Schooling. Sydney: NSW Department of Training and Education Co-ordination.

Hollingsworth, H., Heard, J. & Weldon, P. (in press). Communicating student learning progress. Melbourne: Australian Council for Educational Research.

Heard, J. & Hollingsworth, H. (2018). Continuous student reporting – the next step? Teacher Magazine.

Hollingsworth, H., & Heard, J. (2018). Communicating Student Learning Progress: What does that mean, and can it make a difference? Paper presented at ACER Research Conference 2018, Sydney, August 2018.

Reporting to Parents Taskforce & Tasmania Education Department (2006). Report to the Minister for Education Hon David Bartlett MHA.


This article is an adaptation of a paper first published in Teacher magazine in 2018 as part of a series related to ACER’s Communicating Student Learning Progress project (Heard & Hollingsworth, 2018). 


Jonathan Heard is a Research Fellow at the Australian Council for Educational Research in the Educational Monitoring and Research Division. He has 15 years’ experience as a high school teacher, holding leading teacher positions in instructional coaching and pedagogy. His current work at ACER involves the assessment of reading, verbal reasoning and critical thinking. Jonathan is also a member of ACER’s Centre for Assessment Reform and Innovation (CARI) Communicating Student Learning Progress project.

Dr Hilary Hollingsworth is a Principal Research Fellow in the Educational Monitoring and Research Division at ACER. She has 30 years’ experience working in a wide range of national and international educational contexts including schools, universities, research organisations, government education departments, and private education service organisations. Her expertise is in teaching and learning, teacher education and professional development, classroom observation frameworks and the use of video, teacher feedback, teaching quality, assessing student learning, and communicating student progress. Hilary is a member of ACER’s Centre for Assessment Reform and Innovation (CARI) and led the Communicating Student Learning Progress project.

This article appears in Professional Voice 13.1: Mental health, reporting and education futures.