The implementation of mathematical competencies in school curricula requires assessment instruments that are aligned with this new view of mathematical mastery. However, there are concerns about whether existing assessments capture the wide variety of cognitive skills and abilities that constitute mathematical competence. The current study applied an explanatory item response modelling approach to investigate how teacher-rated mathematical competency demands account for the variation in item difficulty among mathematics items from the Programme for International Student Assessment (PISA) 2012 survey and a Norwegian national grade 10 exam. The results show that the rated competency demands explain slightly more than half of the variance in item difficulty for the PISA items and less than half for the exam items. This provides some empirical evidence for the relevance of the mathematical competencies to solving the assessment items. The results also show that for the Norwegian exam, only two of the competencies, Reasoning and argument and Symbols and formalism, appear to influence item difficulty, which calls into question the extent to which the exam items capture the variety of cognitive skills and abilities that constitute mathematical competence. We argue that this type of empirical evidence from psychometric modelling should be used to improve assessments and assessment items, as well as to inform, and possibly further develop, theoretical conceptions of mathematical competence.
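As a rough illustration of the variance-explained idea behind this kind of explanatory modelling, the sketch below regresses synthetic item difficulties on synthetic competency-demand ratings and reports an R² value. All data, the number of competencies, the rating scale, and the weights are invented for the example; a full explanatory item response model (e.g. an LLTM) would instead decompose difficulty within a model of individual item responses, which this toy least-squares version does not attempt.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical data: 40 items, each rated on 6 competency demands
# (ratings on a 0-2 scale; both numbers are illustrative choices).
n_items, n_comps = 40, 6
demands = rng.integers(0, 3, size=(n_items, n_comps)).astype(float)

# Synthetic "true" difficulties driven by only two of the competencies
# plus noise, loosely echoing the exam finding reported in the abstract.
true_weights = np.array([0.0, 0.8, 0.0, 0.0, 0.6, 0.0])
difficulty = demands @ true_weights + rng.normal(0.0, 0.5, n_items)

# Decompose difficulty into demand effects: least-squares fit with an
# intercept, then compute the proportion of variance explained (R^2).
X = np.column_stack([np.ones(n_items), demands])
coef, *_ = np.linalg.lstsq(X, difficulty, rcond=None)
fitted = X @ coef
r2 = 1.0 - ((difficulty - fitted) ** 2).sum() / (
    (difficulty - difficulty.mean()) ** 2
).sum()
print(f"Variance in item difficulty explained by rated demands: R^2 = {r2:.2f}")
```

In the study's terms, an R² around one half would correspond to the competency demands accounting for roughly half of the variance in item difficulty.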