Although large-scale assessments (LSAs) of school achievement claim to measure domain-specific achievement, they have been criticized for primarily measuring domain-general abilities. Numerous studies provide evidence that LSAs of mathematical as well as verbal achievement cover both general cognitive abilities (GCA) and domain-specific achievement dimensions. We extend previous research by analyzing a standards-oriented and a literacy-oriented LSA in the domain of science to determine how these two assessment types relate to domain-general abilities. While literacy-oriented assessments focus on the knowledge and skills students need to meet the demands of modern societies, standards-oriented assessments focus on national educational standards and curricula. A sample of 1722 students completed three assessments: (a) the PISA scientific literacy assessment; (b) a standards-oriented assessment based on the German National Educational Standards in biology, chemistry, and physics developed by the Institute for Educational Quality Improvement (IQB); and (c) a GCA test. Comparisons of competing structural models showed that models differentiating between domain-specific achievement and GCA best represented the structure of the assessments. Furthermore, the standards-oriented and literacy-oriented LSAs in science shared common variance with GCA but also comprised specific variance. In addition to a factor representing students' GCA, we identified one science literacy-oriented factor and two standards-oriented factors. Relations with school grades in various STEM and non-STEM subjects were mixed and provided only partial evidence for the specificity of science LSAs. Our findings are important for understanding and interpreting results of LSAs in the contexts of GCA and science. We discuss our outcomes with respect to educational monitoring practices.