
Pre-Use Validation

  • Writer: Specialised VET Services
  • Jun 27
  • 4 min read

Updated: 5 days ago

"Where in the Standards does it say we have to do pre-use validation? It just says we have to review the assessment tools prior to use..."


What is "pre-use" validation?
What is "pre-use" validation?

The answer is this:

The National Vocational Education and Training Regulator (Outcome Standards for NVR Registered Training Organisations) Instrument 2025 (the 'Outcome Standards') does not use the term "pre-use validation", but Outcome Standard 1.3.1 states:


The assessment system is fit-for-purpose and consistent with the training product


The performance indicators at 1.3.2 then outline that an RTO must demonstrate:


(a) the assessment is consistent with the requirements of the training product;

(b) assessment tools are reviewed prior to use to ensure assessment can be conducted in a way that is consistent with the principles of assessment and rules of evidence set out under Standard 1.4; and

(c) the outcomes of any such reviews inform any necessary changes to assessment tools.



Before we break down each of these components, let's pause for a moment to consider the requirements in a broad sense.


Setting aside for a moment the definition of validation in the Outcome Standards (quoted below), we can understand that in general, 'to validate' something means to check, prove or confirm that it is true and/or correct; to verify it.


In the Outcome Standards:


validation means the review of the assessment system to ensure that:

(a) assessment tools are consistent with the training product and the requirements set out in this instrument; and

(b) assessments and assessment judgements are producing consistent outcomes.



Definitions aside for a moment, let's think about the activities that might underpin the Outcome Standard requirement.


Thinking about Outcome Standard 1.3.2a, how would you demonstrate that assessment is consistent with the requirements of the training product?


Given the assessment tool drives the assessment, what would you do to demonstrate the assessment tool is meeting the requirements of the unit/s of competency?


Our suggestion is to validate the mapping document. Analyse:

  • The mapping document (as the record of assessment questions and tasks against the unit)

  • The assessment questions and tasks, and

  • The unit of competency.


Does the mapping document present an accurate account of how the assessment covers the unit requirements? In validating that mapping document, can you confirm that the questions and tasks collect sufficient, valid evidence against what the unit requires - across all of the knowledge and skills requirements? Can you verify that the assessment conditions will be met?


Next, thinking about Outcome Standard 1.3.2b, how would you demonstrate that assessment tools are reviewed prior to use to ensure assessment can be conducted in a way that is consistent with the principles of assessment and rules of evidence set out under Standard 1.4?


To meet this, it seems that the following is required:

An analysis of the assessment tool before it is put into use to confirm - to validate - that it contains the necessary elements to:

  • support assessment to be conducted according to the principles of assessment (fairness, flexibility, validity, reliability) and

  • support assessment decisions to be made based on the rules of evidence (validity, sufficiency, authenticity, currency)


Here, people with the expertise to understand the application of the 'principles' and 'rules' should be involved - as this is what the benchmark is: "...that assessment can be conducted in a way that is consistent with the principles of assessment and rules of evidence set out under Standard 1.4"


Lastly, thinking about Outcome Standard 1.3.2c, how would you demonstrate that the outcomes of any such reviews inform any necessary changes to assessment tools?


We would suggest this is clear cut - document the analyses arising from the validation and/or use the RTO's continuous improvement register to record what the identified issue was, and how it was fixed. Documentation might be an overall summary report of the activity and/or the mapping sheet and/or tool marked up with comments.



What is interesting is one of the specific suggestions from ASQA in their Practice Guide linked to Outcome Standard 1.3. Their suggestion is to:


"...demonstrate how you have reviewed your assessment tools prior to use, for example by: consulting with industry to confirm that the content of the tool is correct and relevant to the workplace..."


We don't disagree with this action as one that contributes to the quality of the tools; however, we are not quite sure where it fits in meeting any of 1.3.2a, 1.3.2b or 1.3.2c. It seems more aligned with Outcome Standard 1.2.1, which is:


"Engagement with industry, employer and community representatives effectively informs the industry relevance of training offered by the NVR registered training organisation."


At any rate, the Outcome Standards are not prescriptive, which means RTOs may find different ways to meet the requirements - as long as they do meet them. Further, ASQA have confirmed that they do not intend for the Practice Guides to impose any legal or compliance obligations on RTOs.


In terms of the requirements in the performance indicators under Outcome Standard 1.3, RTOs must validate that the tools they are about to use will work within an overall system that meets compliance and training product requirements. Therefore, the act of validating/confirming/verifying pre-use is required.



Need a template to help document meeting the requirements? Try the Training Tools Pre-use Validation of Assessment Tools template.


This article is AITA Scale Rating 1

What is the AITA Scale?
