
e-asTTle – Reports

e-asTTle produces reports that support teachers, students, whānau and family, and leaders. Find information on generating and interpreting individual and group reports.


Tags

  • Audience: Kaiako, School leaders, Professional development providers
  • Resource language: English

About this resource

e-asTTle is an online assessment tool, developed to assess students’ achievement and progress in reading, mathematics, writing, and in pānui, pāngarau, and tuhituhi.

This section of the e-asTTle collection explains how to use e-asTTle to generate and interpret individual and group reports for students.

To log in to e-asTTle, go to: e-asTTle - Welcome (education.govt.nz)


e-asTTle – Reports

e-asTTle can produce reports that support teachers, students, whānau and family, and leaders.

See the sections below to find out more about the following reports:

  • Individual Learning Pathway (ILP) – gives a clear one-page summary of a student’s overall score and includes their strengths and gaps, showing you what their next learning steps could be.
  • Individual Question Analysis (IQA) – a breakdown for each question showing correct/incorrect, the objective, and the ILP quadrant.
  • Tabular report – a summary of e-asTTle scores and levels. It is a .csv file that can be opened in Excel or imported into your student management system (SMS). You can report on up to 2000 students at a time.
  • Console and Console – Comparisons – dashboards (with dials and box-and-whisker plots) showing group/class/cohort performance against e-asTTle norms.
  • Progress, Progress – Comparisons, and Progress by Term – box-and-whisker plots showing longitudinal data against e-asTTle norms.
  • Curriculum Levels – bar graphs showing the distribution of students by curriculum level for each curriculum strand, with students’ names listed in groupings by curriculum level and sublevel.
  • Group Learning Pathway – group-level formative information on the curriculum objectives.

There are several group reports available in e-asTTle. These help to answer the question “how are groups (of learners) going with their learning?” 

  • The Group Learning Pathway provides formative information on each of the curriculum objectives tested and groups the students according to their curriculum and sublevel. This enables you to see the spread in your group/class/cohort.
  • The Curriculum Levels report groups your students for you according to curriculum and sublevel for each curriculum strand.

Generating a multi-test report

To generate a multi-test report, go through the following steps.

  1. Select “View Reports” from the left menu.
  2. Use filters to search for tests.
  3. Select "Search".
  4. Select the tests you wish to view reports for.
  5. Select "View Reports".
  6. On the next screen, select the options (for example “Year” or “Group”) that suit the multi-test report you want to create.
  7. Choose the report type from the thumbnail.
  8. If prompted, select the options you wish to have displayed, for example “Comparisons” or “Compare Results on the X-axis”.
  9. If prompted, select the “Scaling Options".
  10. Click “View Report”.
  11. Click “OK” when the popup appears saying “PDF ready – Click OK to download”.

 

Guidelines for reporting across multiple tests

Aside from the Tabular report, there are six other report types that allow you to look at multiple tests simultaneously. They fall into two categories – reports designed to combine results for tests sat closely together, and reports designed to assess progress over time.

Tests administered within three months

 

The Console, Console–Comparisons and Group Learning Pathway are intended for tests that were administered around the same time. For example, when you have created three Reading tests targeted at different abilities, and students sat the tests at roughly the same time, you can display the aggregated results on these reports.

Generating a Console, Console–Comparisons or Group Learning Pathway for these tests taken closely together will combine the results as if they all came from one test. On the report, you cannot separate out which results came from which test.

Although e-asTTle does not restrict you to using Console, Console – Comparisons, and Group Learning Pathways only for tests administered within three months, use caution when aggregating tests over a longer time period. In particular, Group Learning Pathways collected over long periods may misrepresent students’ current strengths and weaknesses.

Tests administered more than three months apart

 

Progress, Progress–Comparisons, and Progress by Term reports are intended for tests administered at least three months apart. Progress by Term automatically combines all results from multiple tests in the same school term. Each term on the report will have one box-and-whisker representing the combination of results.

Progress and Progress–Comparisons display results from different tests as separate box-and-whisker plots. This means that even if two tests have been sat within three months, two separate box-and-whisker plots display. Note that if you assign the same test within three months, these results are aggregated and reported as one box-and-whisker plot.

The "Tabular Report" lists your students’ ID numbers, year and demographic information, and all scores and levels. When multiple tests are selected to report on, the report is divided into sections, each representing one test.


Image description: Tabular report, excel spreadsheet with data populating these fields: Test ID, Test name, Test type, Date modified, Student Master ID, NSN ID, Last name, First name, Funding year, Date tested, Gender, Language, Ethnicity, Overall score, Overall level surface, Surface score, Deep score, Deep level.

 

Tabular reports are available across the different subjects. You can include up to 2000 student results at a time. Tabular reports with over 300 results are generated overnight.
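If you would like to explore a Tabular report outside Excel or your SMS, it can be read like any ordinary CSV file. The sketch below is a minimal Python example, assuming a single-test export; the file name and column names are taken from the image description above and may differ from your actual export, so check your file’s header row first.

```python
import pandas as pd

# Minimal sketch: load a single-test Tabular report export for your own
# analysis. The file name and column names are assumptions based on the image
# description above - check the header row of your actual export first.
report = pd.read_csv("tabular_report.csv")

print(len(report), "student results")
print(report[["Last name", "First name", "Overall score"]].head())
```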

 

Dashes on reports

A dash indicates that e-asTTle could not generate a score. If a student gets fewer than three questions correct in a strand or in the test overall, e-asTTle cannot generate the corresponding score. Reading and maths tests also have a guessing function. In some cases, a student may have answered three questions correctly but still have a dash, because some of their correct answers were removed as likely guesses.
See the e-asTTle – Reference material page for more information on the guessing function.
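When working with an exported Tabular report, it can help to treat dashes as missing values so that students without a score are excluded from any averages you calculate. The Python sketch below illustrates this; the file name and column name are illustrative assumptions, not the exact e-asTTle layout.

```python
import pandas as pd

# Sketch: treat dashes as missing values so students without a score are
# excluded from averages. File and column names are illustrative only.
report = pd.read_csv("tabular_report.csv", na_values=["-"])

scored = report["Overall score"].dropna()
print(f"{len(scored)} of {len(report)} students received an overall score")
print("Mean overall score:", round(scored.mean(), 1))
```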

 

Types of Tabular reports

A Tabular Report comes in two types: the standard Tabular Report (SMS compatible) and the Tabular Report by Group.


Image description: e-asTTle Report Version screen with Tabular Report (SMS-Compatible) radio button selected, and Tabular Report by Group radio button (not selected). Below this are Cancel button and View Report button.

 

The “Tabular Report by Group” differs from the standard “Tabular Report” in that it contains either one or two additional group columns:

  • Assignment group (will always display).
  • Assignment subgroup (this column applies if the test has been assigned to a "group of groups").

The Tabular Report by Group may not be compatible with your SMS.

Group information


Assignment Group column 

This column will always display. It contains the name of the group that your test was assigned to. 

Assignment Subgroup column 

This column will show up if you assign your test to a "group of groups". In this situation, the group is broken down into subgroups.

Importing a Tabular report into your SMS

 

The standard Tabular Report (SMS compatible) can be imported into your SMS. The Tabular Report by Group may not be compatible with your SMS.

Visit the KAMAR website for steps on importing an e-asTTle file (note: you will need to log in to the KAMAR website).

If you are using a different SMS, contact the SMS vendor for steps to import a Tabular Report from e-asTTle.
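Before importing, it can be worth checking that the export contains the columns your SMS mapping expects. The sketch below is a simple pre-import check in Python; the expected column names come from the image description above and are only assumptions, so adjust the list to your SMS vendor’s requirements.

```python
import csv

# Sketch: a quick pre-import check that the export contains the columns your
# SMS mapping expects. The column names below come from the image description
# above and are assumptions - adjust them to your SMS vendor's requirements.
EXPECTED = {"NSN ID", "Last name", "First name", "Date tested", "Overall score"}

with open("tabular_report.csv", newline="") as f:
    header = set(next(csv.reader(f)))

missing = EXPECTED - header
print("Missing columns:", ", ".join(sorted(missing)) if missing else "none")
```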

Progress report

This report addresses the question, “How are my students progressing?” Using this report, you can track your students’ progress over time, including across schooling years. To generate a Progress report, you will need to select more than one test (taken at least three months apart).

The Progress report displays box-and-whisker plots showing the centre and range of students’ overall results. The box-and-whisker plots are joined by lines between the medians to indicate the rate of progress.

See the e-asTTle – Reference material page, under the "Interpreting data" tab, for more information.

 

Progress – Comparisons report

This is a variation of the Progress report that allows you to compare the progress of certain cohorts of students with those of equivalent gender, ethnicity, or language (“like with like”). This report addresses the questions: 

  • “How are students of a specific gender, ethnicity and/or language progressing, compared to similar students in New Zealand?”
  • “How are my students progressing compared to those in similar schools?” 

Only the groups you selected are included in the report, and the e-asTTle norms refer only to students of your chosen group. For example, you could compare your Year 6 male students with the e-asTTle norms for Year 6 males.

If your group’s results are below average on this report, that performance is unlikely to be due to group membership, because similar students in other schools are performing better than your group. The difference may instead be due to instructional experience.

Remember to consider the number of students in your group with these attributes. If your group is small, interpret your results with caution. Smaller groups are more likely to be significantly different from the e-asTTle norm by chance.
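As a rough illustration of why group size matters, the sketch below shows how the uncertainty in a group’s average score shrinks as the group grows. The spread of 100 e-asTTle points is an assumed, illustrative figure, not an official e-asTTle statistic.

```python
import math

# Rough illustration only: the uncertainty in a group's mean score shrinks as
# the group grows. The spread of 100 e-asTTle points is an assumed figure for
# illustration, not an official e-asTTle statistic.
assumed_spread = 100  # assumed standard deviation of individual scores

for n in (5, 20, 80):
    standard_error = assumed_spread / math.sqrt(n)
    print(f"group of {n:>2}: mean uncertain by roughly +/- {2 * standard_error:.0f} points")
```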

Your school is assigned to one of 20 e-asTTle clusters, based on its location, decile, type, and size. Using the "Schools Like Mine" option will show your students’ performance compared to students in your e-asTTle cluster. The report header shows which cluster your school is assigned to.

Please note: The profile and demographics of your school may have changed considerably since the school was allocated to a cluster, so interpret the "Schools Like Mine" comparative data with some caution.

 

Progress by Term reports 

The Progress by Term report is like the Progress report, but it aggregates all results from all tests that were sat within one school term. Each term on the report will have one box-and-whisker representing the combination of results. The right-hand y-axis displays the e-asTTle levels, which correspond to the scores on the left-hand y-axis. This report often provides the cleanest view of group results and progress.


Image description: Screenshot showing “Progress by Term: Reading” for Year 7-8, period tested 2023-2024. On the left-hand side are four line graphs arranged vertically. The first is yellow and shows “Processes and Strategies [13]”, the second is green and shows “Purposes and Audiences [13]”, the third is blue and shows “Ideas [13]”, and the fourth is grey and labelled “Language features [0]” (there is no data on the graph). In the centre is a box-and-whisker graph titled “Overall Scores”, with “NZ Performance” displayed in blue and “Your Group Performance” displayed in red. The y-axis scores range from 1250 to 1650 in increments of 50. The x-axis shows Term 1 2023 (2 results), Term 2 2023, Term 3 2023, Term 4 2023 (5 results), and Term 1 2024 (7 results). The right-hand y-axis shows levels from L6 down to L3. The graph data shows: for Term 1 2023, the blue box (NZ Performance) spans 1375-1490 and its whiskers extend the length of the graph, while the red (Your Group Performance) whisker extends from 1325 to 1405. For Term 4 2023, the blue box spans 1400-1510 with whiskers extending the length of the graph; the red box spans 1375-1500, with the bottom whisker extending from 1350 to 1375 and the top whisker from 1500 to 1550. For Term 1 2024, the blue box spans 1410-1520 with whiskers extending the length of the graph; the red box spans 1405-1525, with the bottom whisker extending from 1375 to 1405 and the top whisker from 1525 to 1530. The graph shows that overall group performance is below NZ performance from Term 1 2023 to Term 4 2023, and group performance in Term 1 2024 is just above NZ performance. On the far right are four grey boxes displayed vertically. The first is labelled “Structure [0]” with no data; the three grey boxes below it also contain no data.

 

This report is only available if the tests you selected were sat within a 15-month period. 

Use caution when interpreting reports with few results. In the graph above, the first box-and-whisker only has two results, the second five, and the final box-and-whisker has seven results.

Please note: By selecting the “Progress by Term” report you will be able to see your class progress over time divided into term increments. Each term will have one red box-and-whisker which represents the aggregated scores from all the tests that your class sat in that term.

Console report

This report addresses the feedback question, "How are my students doing compared to others of the same year level who are sitting e-asTTle tests?"

It shows your students’ e-asTTle scores – overall, depth and strand – relative to the e-asTTle norms.

 

Curriculum function (strand) dials


    Image description: Two curriculum functions dials. On the left is the “Number Knowledge [17]” dial which ranges from 1400-1800. The pointer is in the blue section of the dial and points to just below 1550. On the right is the “Number Sense & Operations [17]” dial which ranges from 1400-1800. The pointer is in the blue section of the dial and points to just above 1550. 

     

     Explanation of the parts of the dials 

    • The red pointer shows your students’ achievement.
    • The blue shaded area shows the average strand score (or ‘norm’) for the students’ year level. When multiple year levels are included on the report, weighted averages are used to allow fair comparisons. 
    • The number between square brackets is the number of results displayed. 
    • The width of the red pointer indicates an average e-asTTle measurement error. If there is white or coloured space between the pointer and the edge of the blue area, you can assume there is a statistically meaningful difference between your students’ results and the e-asTTle norm.
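One way to think about the last point is as a comparison of two bands: the pointer band (your group’s score plus or minus the measurement error) and the blue norm band. The sketch below, using made-up numbers, applies that reading of the dial; it is an illustration only, not e-asTTle’s internal calculation.

```python
# Sketch with made-up numbers: the pointer covers the group score plus or
# minus the average measurement error. If that band sits entirely outside
# the blue (norm) band, the difference is likely to be meaningful.
group_score = 1580        # red pointer centre (illustrative)
measurement_error = 20    # half the pointer width (illustrative)
norm_low, norm_high = 1490, 1545   # blue shaded area (illustrative)

pointer_low = group_score - measurement_error
pointer_high = group_score + measurement_error

overlaps_norm = pointer_low <= norm_high and pointer_high >= norm_low
print("Statistically meaningful difference:", not overlaps_norm)
```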

    Important points to note

    • Strand scores must be interpreted carefully. They represent scores for students who got at least three questions right in that strand after any likely guesses were removed. When the test contains relatively few questions in a strand, the strand results will be biased towards the higher-achieving students.
    • Be cautious about interpreting dials when there are not many results (that is, a small number in the square brackets). 
    • Students without strand scores are not included. For e-asTTle to have enough evidence to generate a valid score, at least three questions must be answered correctly. Unfortunately, this means if there are only four algebra questions, only students with 3 out of 4 or 4 out of 4 correct are reported. Typically, when some strands don’t have many questions, these dials are biased towards your higher achieving students.
    • Overall results are organised by year level. Up to eight year levels with matching norms can be displayed. In instances where more than eight year levels have sat the test, the years with the most students are selected and represented on the report.
    • The matching blue boxes show the "surface" and "deep" achievement of other e-asTTle students in the same year level. When multiple year levels are included on the report, the data points on the box-and-whisker are weighted to allow fair comparisons.  

    What else do I need to know? 

    • Surface and deep plots are not available for writing tests. 
    • Be cautious about interpreting surface/deep box-and-whisker plots when there are not many results (that is, a small number in the square brackets).

    Attitude score bar

    • The red circle shows your students’ average attitude score. 
    • The blue bar shows scores for other e-asTTle students in the same year level. 

    For multi-test Console reports, when different tests have been created with different attitude domains, the attitude bar does not display.

     

    Console report options

    Choosing a scaling option 

    The scaling option you choose determines which numbers are used as the y-axis on the report. 

    • Auto-scale (Default) – this option "zooms in" to fit the lower and upper quartiles of your group and of the norms. It is the best option when you are looking at just one report at a time. While the scale will always use either 50-point or 100-point increments, its minimum and maximum vary depending on the scores. 
    • Comparative scale – this option "zooms out" to include the full e-asTTle score scale. This is useful when you have two reports from different time periods (or different groups) and you want to compare them side by side. 

    Choosing an x-axis 

    Your choice of x-axis determines how scores are grouped together.

    Year is the default option. This gives separate box-and-whisker plots for each year level. For all other options, you will have to select one year at a time.

    How to choose an x-axis

    1. Select the radio button for the option you wish to use. 
    2. If you have selected Year/Gender, Year/Ethnicity, Year/Language Groups or Year/Student Groups, a drop-down will appear showing the year levels of students who have completed the test.  
    3. Select one year level (note that if you select Year/Student Groups and students belong to fewer than seven groups, the report will automatically generate with these groups. If students belong to eight or more groups, a separate panel will display. You must select up to seven groups and add these groups to the Selected Groups box.) 
    4. Select "View Report" to view a PDF of the report. 

    Important points

    • Pānui, Pāngarau, and Tuhituhi tests have options of Gender or Student Groups comparisons only, as there are limited norms for these subjects. 
    • Be mindful of which groups you select to report on. If a student belongs to more than one of the groups that you selected, their result will display in more than one box-and-whisker plot. For example, if Johnny is in both the Room 2 and Tigers Reading groups, selecting both groups will result in Johnny’s score displaying on both box-and-whisker plots.

    x-axis options: Year – Gender/Ethnicity/Language at Home 


    These report options help you work out which effects are due to group membership and which are due to instructional experience. For example, you might generate a Console Year x Ethnicity report for Year 9. 

    If your Pacific students are below the norm for Pacific, this result may not be related to ethnicity because Pacific students in other schools are achieving higher on e-asTTle tests. It may be that the difference is due to instructional experience.


      Image description: Screenshot of e-asTTle “View reports: Reading 4P English A”. The information section at the top shows: Level 3, 4, 5, Strand PS, LF, Total Test Time: 38 minutes, Delivery method: onscreen, Date Created: 05 Apr 2018, Owner: Simone Bayley. Below this is the heading “Compare Results on the X-axis" with the following series of radio buttons listed vertically: Year, Year (Gender), Year (Ethnicity), Year (Language at Home), Year (Student Groups). The Year radio button is selected. Below the radio buttons is the heading “Scaling options” with these radio buttons: Auto scale – best for single reports, Comparative scale – best for comparing multiple reports. Auto-scale is selected. At the bottom are “Cancel” and “View Report” buttons.

       

      Adding an extra layer of comparisons with Console-Comparisons

      Console and Console-Comparisons reports look very similar. Both compare your students based on year level and demographics. However, Console-Comparisons allows you to add an extra layer of comparisons. 
      Console-Comparisons addresses the following questions: 

      • How are students of a specific gender, ethnicity and/or language doing compared to similar students? 
      • How are my students doing compared to students in similar schools? 

      Only the groups you selected are included in the report, and e-asTTle norms refer to only students of your chosen group. For example, selecting "female" will only include your female students. They will be reported against the female e-asTTle norms.

      Using x-axis options with Console-Comparisons options 

       

      Your choice of the comparisons and x-axis options determines how scores are grouped together. 

      Year is the default option. This gives separate box-and-whisker plots for each year level. For all other options, you will have to select one year at a time. 

      Steps for choosing comparison options:

      1. Select the demographic option(s) you wish to use. You can select one or two. 
      2. Choose "Select x-axis". 
      3. Select the radio button for the option you wish to use. 
      4. If you have selected Year/Gender, Year/Ethnicity, Year/Language Groups or Year/Student Groups, a drop-down box will appear showing the year levels of students who have completed the test. Select one year level.  
        •  Note that if you select Year/Student Groups and students belong to fewer than seven groups, the report will automatically generate with these groups. If students belong to eight or more groups, a separate panel will display. You must select up to seven groups and add these groups to the Selected Groups box.
      5. Select "View Report" to view a PDF of the report.

      Curriculum Levels report

      This report addresses the question, “How are my students performing relative to curriculum level targets?” 

      It shows you the distribution of students across curriculum levels, for each strand you selected during test creation. This report lets you monitor how teaching and learning activities affect your students’ progress between curriculum levels. 

      The "Curriculum Levels" report only displays a maximum of three levels. The three levels selected are those with the majority of students. Students who fall outside these levels will be displayed as > or <. For example, if the lowest displayed level is 3, students with scores in level 2 will be presented as <3B. 

      To see lists of students that fall in the various sub-levels for the different curriculum areas, either select the curriculum strand link below the graphs or scroll down. 

      Only writing reports contain sub-levels 1B, 1P, and 1A. For other subjects, students achieving below Level 2 are represented as <2B.
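To illustrate how the listings under the graphs are organised, the short sketch below groups a made-up set of students by curriculum sublevel. The names and sublevels are invented for the example and do not come from e-asTTle.

```python
from collections import defaultdict

# Sketch with made-up data: list students by curriculum sublevel, similar to
# the groupings shown under the Curriculum Levels graphs. Names and sublevels
# are invented for the example.
students = [
    ("Aroha", "3B"), ("Ben", "3A"), ("Cara", "4P"),
    ("Dev", "3B"), ("Emere", "<3B"), ("Finn", "4B"),
]

by_sublevel = defaultdict(list)
for name, sublevel in students:
    by_sublevel[sublevel].append(name)

for sublevel, names in by_sublevel.items():
    print(sublevel, "-", ", ".join(names))
```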


      Image description: Screenshot of Curriculum Levels Report for Test: 5 Literacy A. Group: All test candidates. Date tested. Three bar graphs show scores for these curriculum functions: Processes and strategies, Ideas, Language features.

      Group Learning Pathway report

      This report addresses the question, “What are the strengths and weaknesses of my group?” 

      The "Group Learning Pathway" is a group summary of the "Individual Learning Pathway" reports. 

      The question numbers from the test are noted in brackets. If the test was adaptive, the question number is prefixed with the stage number. For example, (1:4) indicates question 4 of stage 1.

      The report is formatted as a horizontal cumulative bar graph. The graph shows the percentage of occasions that your students have Gaps, To Be Achieved, Achieved, or Strengths in questions within each curriculum objective.

      Pay most attention to objectives where there are at least four questions. If an objective with fewer than four questions shows an interesting result, you may wish to create a new test focusing on that strand. This will provide you with a more robust estimate of student ability in that strand.
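To illustrate how the percentages on each bar can be read, the sketch below tallies a made-up set of question results for one curriculum objective into the four ILP quadrants. It is an illustration of the idea only, not e-asTTle’s exact calculation.

```python
from collections import Counter

# Sketch with made-up data: tally one objective's question results into the
# four ILP quadrants and express each as a percentage, as on a GLP bar.
responses = [
    "Achieved", "Gaps", "Achieved", "Strengths", "To be achieved",
    "Achieved", "To be achieved", "Gaps", "Achieved", "Strengths",
]

counts = Counter(responses)
total = len(responses)
for quadrant in ("Gaps", "To be achieved", "Achieved", "Strengths"):
    print(f"{quadrant:>15}: {100 * counts[quadrant] / total:.0f}%")
```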


      Image description: Annotated Group Learning Pathways Report for Test. Group is “All test candidates". Group size is 11.

      The key shows stacked bars: Red “Gaps %,” Blue “To be achieved %,” Green “Achieved %,” Yellow “Strengths %".

      There are five Processes and Strategies bars: 1. Find, select, & retrieve information, 2. Skim/scan for information, 3. Knowledge of strategies to solve unknown words and gain meaning, 4. Respond using understandings and information, 5. Make use of prior knowledge.

      There are three Purposes and Audience bars: 1. Empathise with characters and situations, 2. Explore author’s purpose & question intent, 3. Evaluate author’s purpose & intent.

      There are six Ideas bars: 1. Consistently read for meaning, 2. Identification and understanding of main ideas, 3. Make inferences, 4. Select and retrieve accurate and coherent information, 5. Understand and interpret information accurately, 6. Understand meanings or ideas.

      Annotations on the left-hand side of the graph.

      Annotation pointing to the Find, select, & retrieve information bar, “Each bar is drawn in proportion to the percentage of the responses".

      Annotation pointing to the Make use of prior knowledge bar, “Curriculum objective".

      Annotation pointing to the Evaluate author’s purpose & intent bar which shows 45% of students with gaps, “Mainly red bar: Area where group needs remedial instruction".

      Annotation pointing to the Understand and interpret information accurately blue section of the bar which shows 45% of students in the “To be achieved” section of the bar, “Mainly blue bar: Area where instruction should be planned".

      Annotations on the right-hand side of the graph.

      Annotation pointing to the number 59, on the Find, select, & retrieve information green section of the bar, “Number indicates percentage of group that had questions for that objective in each quadrant. Note: if the test was adaptive, the report will be displayed like an Aggregate GLP (see below)".

      Annotation pointing to the Make use of prior knowledge bar which shows 64% of students Achieved, “Mainly green bar: Area where further instruction is not required at this level".

      Annotation pointing to Empathise with characters and situations (6, 7, 10, 24, 27, 29) bar title question numbers, “Question number in the test related to curriculum objective".

      Annotation pointing to the Understand and interpret information accurately bar showing the yellow 45% strengths section, “Mainly yellow bar: Area where group has significant strength".

      Key:

      • Red = Gaps percentage
      • Blue = To be achieved percentage
      • Green = Achieved percentage
      • Yellow = Strengths percentage

       

      Aggregate (Multi-test) Group Learning Pathway Report 

      The most noticeable difference between the multi-test and the single-test version is that there are no question numbers. Instead, the aggregate report displays the total number of questions and total number of students related to each achievement objective bar. The bars, percentages and coloured sections can be interpreted in the same way as a single-test GLP report.