Quality assurance processes
Quality assurance is integrated into our processes and computer systems and applied throughout the data-matching cycle.
These assurance processes include:
- registering the intention to undertake a data-matching program on an internal register
- risk assessment and approval from the data steward and relevant senior executive service (SES) officers prior to any data-matching program being undertaken
- conducting program pilots or obtaining sample data to ensure the data-matching program will achieve its objectives prior to full datasets being obtained
- notifying the OAIC of our intention to undertake the data-matching program and seeking permission to vary from the data-matching guidelines (where applicable)
- restricting access to the data to approved users, with access management logs recording details of who has accessed the data
- quality assurance processes embedded into compliance activities, including:
  - review of risk assessments, taxpayer profiles and case plans by senior officers prior to client contact
  - ongoing reviews of cases by subject matter technical experts at key points during the life cycle of a case
  - regular independent panel reviews of samples of case work to ensure our case work is accurate and consistent.
These processes ensure data is collected and used in accordance with our data-management policies and principles, and complies with the OAIC's Guidelines on data matching in Australian Government administration.
How we ensure data quality
The data is sourced from providers' systems and may not be available in a format that can be readily processed by our systems. We apply extra levels of scrutiny and analytics to verify the quality of the data.
This includes but is not limited to:
- meeting with data providers to understand their data holdings, including their data use, data currency, formats, compatibility and native systems
- sampling data to ensure it is fit for purpose before obtaining full datasets from providers
- verifying data on receipt against confirming documentation, then using algorithms and other analytical methods to refine the data.
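As an illustration, a receipt-verification step of this kind could be sketched as follows. This is a minimal, hypothetical example: the idea of checking a received file's record count and checksum against the provider's confirming documentation is the general technique, and the function name and parameters are assumptions, not an actual ATO process.

```python
import hashlib

def verify_receipt(path, expected_records, expected_sha256):
    """Check a received data file against its confirming documentation.

    Returns True only if both the record count and the SHA-256 checksum
    match the values the provider supplied with the file.
    """
    sha = hashlib.sha256()
    records = 0
    with open(path, "rb") as f:
        for line in f:
            sha.update(line)   # hash the file's content as we count records
            records += 1
    return records == expected_records and sha.hexdigest() == expected_sha256
```

A file that fails either check would be queried with the provider before any matching takes place.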
Data is transformed into a standardised format and validated to ensure it contains the required data elements before being loaded into our computer systems. We undertake program evaluations to measure effectiveness before deciding whether to continue collecting future years of the data or to discontinue the program.
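A standardise-then-validate step of this kind could be sketched as below. The required elements and field names are illustrative assumptions, not the actual schema used:

```python
import csv
import io

# Hypothetical required data elements for illustration only.
REQUIRED_ELEMENTS = {"provider_id", "client_name", "date_of_birth", "amount"}

def validate_records(raw_csv):
    """Standardise field names and separate records missing required elements."""
    reader = csv.DictReader(io.StringIO(raw_csv))
    valid, rejected = [], []
    for row in reader:
        # Standardise: lower-case the field names, trim whitespace from values.
        record = {k.strip().lower(): (v or "").strip() for k, v in row.items()}
        missing = {e for e in REQUIRED_ELEMENTS if not record.get(e)}
        (rejected if missing else valid).append(record)
    return valid, rejected
```

Only records in the valid set would be loaded; rejected records would be returned to the provider or excluded from matching.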
The ATO’s enterprise data quality (DQ) approach champions 6 core DQ dimensions:
- Accuracy – the degree to which the data correctly represents the actual value.
- Completeness – whether all expected data in a data set is present.
- Consistency – whether data values in a data set are consistent with values elsewhere within the data set or in another data set.
- Validity – data values are presented in the correct format and fall within a predefined set of values.
- Uniqueness – whether duplicate files or records exist in the data set.
- Timeliness – how quickly the data is available for use from the time of collection.
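To make the dimensions concrete, checks for three of them (completeness, validity and uniqueness) could be sketched as simple metrics over a record set. The field names and the date format used for the validity check are assumptions for illustration:

```python
import re

def dq_summary(records):
    """Return illustrative scores for three data quality dimensions."""
    total = len(records)
    # Completeness: share of records with all expected fields populated.
    complete = sum(1 for r in records if all(r.get(f) for f in ("id", "date")))
    # Validity: share of dates in the predefined YYYY-MM-DD format.
    valid = sum(1 for r in records
                if re.fullmatch(r"\d{4}-\d{2}-\d{2}", r.get("date", "")))
    # Uniqueness: 1.0 means no duplicated identifiers in the data set.
    unique_ids = len({r.get("id") for r in records})
    return {
        "completeness": complete / total,
        "validity": valid / total,
        "uniqueness": unique_ids / total,
    }
```

Scores below an agreed threshold on any dimension would prompt a query back to the data provider before matching proceeds.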
To assure data is fit for consumption and maintains integrity throughout the data-matching program, the following data quality elements are also applied:
- Currency – how recent the time period covered by the data set is.
- Precision – the level of detail of a data element.
- Privacy – access control and usage monitoring.
- Reasonableness – whether data values fall within the bounds of common sense or the specific operational context.
- Referential integrity – whether all intended references within a data set, or to other data sets, are valid.
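The referential integrity element above can be sketched as a simple cross-data-set check. The data-set and field names are hypothetical; the point is that every reference in one data set should resolve to a record in the data set it refers to:

```python
def check_referential_integrity(transactions, providers):
    """Return transaction records whose provider_id has no matching
    record in the provider reference data set (i.e. broken references)."""
    known = {p["provider_id"] for p in providers}
    return [t for t in transactions if t["provider_id"] not in known]
```

An empty result indicates all references between the two data sets are valid.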