This study introduces a framework for producing very short versions of the MacArthur–Bates Communicative Development Inventories (CDIs) by combining the Bayesian-inspired approach introduced by Mayor and Mani (2019) with item response theory–based computerized adaptive testing that adapts to each child's ability, in line with Makransky et al. (2016).
We evaluated the performance of our approach, which dynamically selects maximally informative words from the CDI and combines parental responses with prior vocabulary data, by conducting real-data simulations using four CDI versions with varying sample sizes on Wordbank, the online repository of digitized CDIs: American English (a very large data set), Danish (a large data set), Beijing Mandarin (a medium-sized data set), and Italian (a small data set).
Real-data simulations revealed that correlations exceeding .95 with full CDI administrations were reached with as few as 15 test items, with high levels of reliability, even for languages (e.g., Italian) with few digitized administrations on Wordbank.
The current approach establishes a generic framework for producing very short (fewer than 20 items) adaptive early vocabulary assessments, thereby considerably reducing administration time. The approach remains robust even when a CDI has a relatively small sample in online repositories, for example, around 50 administrations per month of age.
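The adaptive core of such a procedure can be sketched in a few lines. The sketch below is only an illustration, not the authors' implementation: it assumes a two-parameter logistic (2PL) item response model, selects at each step the remaining item with maximal Fisher information at the current ability estimate, and updates a posterior over ability from each parental response. The item parameters, the uniform starting prior (a vocabulary-based prior could be substituted), and the `respond` callback are all hypothetical.

```python
import math
import random

def p_know(theta, a, b):
    """2PL probability that a child of ability theta knows an item
    with discrimination a and difficulty b."""
    return 1.0 / (1.0 + math.exp(-a * (theta - b)))

def fisher_info(theta, a, b):
    """Fisher information contributed by an item at ability theta."""
    p = p_know(theta, a, b)
    return a * a * p * (1.0 - p)

def run_cat(items, respond, n_items=15):
    """Adaptive test: repeatedly pick the most informative remaining
    item, record the (parental) response, and update the expected
    a posteriori (EAP) ability estimate on a discrete theta grid."""
    grid = [i / 10.0 for i in range(-40, 41)]  # abilities from -4 to 4
    post = [1.0] * len(grid)                   # uniform prior (hypothetical)
    remaining = list(items)
    theta = 0.0
    for _ in range(n_items):
        # Maximum-information item selection at the current estimate.
        item = max(remaining, key=lambda it: fisher_info(theta, it[0], it[1]))
        remaining.remove(item)
        y = respond(item)  # 1 = parent reports the child knows the word
        a, b = item
        # Bayesian update of the posterior over ability.
        for i, t in enumerate(grid):
            p = p_know(t, a, b)
            post[i] *= p if y else (1.0 - p)
        z = sum(post)
        post = [w / z for w in post]
        theta = sum(t * w for t, w in zip(grid, post))  # EAP estimate
    return theta

# Simulated administration with a hypothetical item bank and child.
random.seed(0)
items = [(random.uniform(0.8, 2.0), random.uniform(-3.0, 3.0))
         for _ in range(200)]
true_theta = 1.0
respond = lambda it: 1 if random.random() < p_know(true_theta, *it) else 0
est = run_cat(items, respond, n_items=15)
```

After 15 adaptively chosen items, `est` is a point estimate of the simulated child's ability; in a full simulation study, such estimates would be correlated with scores from complete CDI administrations.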