Data compilation

DEFINITION: Operations performed on data to derive new information according to a given set of rules.

CONTEXT: In quality assurance frameworks, “Data compilation” refers to the description of the statistical procedures used to produce intermediate data and final statistical outputs. Data compilation covers, among other things, the use of weighting schemes, methods for imputing missing values or source data, statistical adjustment, balancing/cross-checking techniques and the relevant characteristics of the specific methods applied.

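To make the notion more concrete, the minimal Python sketch below shows what a simple compilation step could look like: missing source values are imputed with the observed mean and a weighting scheme is then applied to derive a weighted total and a weighted mean. The data, weights and method choices are invented for illustration and are not taken from the SDMX Glossary.

# Illustrative sketch only: toy source data and hypothetical survey weights.
values = [12.0, None, 15.5, 9.0, None, 11.2]   # source data with missing values
weights = [1.0, 2.5, 1.0, 3.0, 1.5, 2.0]       # hypothetical weighting scheme

# Impute missing values with the mean of the observed values.
observed = [v for v in values if v is not None]
mean_observed = sum(observed) / len(observed)
imputed = [v if v is not None else mean_observed for v in values]

# Apply the weights to derive the final statistical outputs.
weighted_total = sum(w * v for w, v in zip(weights, imputed))
weighted_mean = weighted_total / sum(weights)

print(f"Imputed series: {imputed}")
print(f"Weighted total: {weighted_total:.2f}")
print(f"Weighted mean:  {weighted_mean:.2f}")
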
SOURCE: SDMX Glossary (Version 1.0, February 2016), SDMX.

HYPERLINK: http://sdmx.org/