Whereas previous research has revealed the interactional variability that occurs in oral assessments and demonstrated how it can undermine test validity (e.g., A. Brown, 2003), little has been published on how language programs can mine and analyze video recordings of in-house oral placement tests for validation purposes. Addressing this need, this article demonstrates how action research, supplemented by an interventionist conversation analysis (CA) approach, can be implemented as a tool for localized and cyclical test validation. The article reports on how a team of oral placement test interviewers and CA researchers sought to iteratively improve interviewer training materials and testing protocols based on observed talk during placement test interactions. The study revolves around sample analyses of data excerpts that exemplify two data comparison procedures: comparing different interviewers within the same iteration (i.e., horizontal comparison) and the same interviewer across different iterations after participation in a training session (i.e., vertical comparison). It showcases this ongoing collaborative process with the goal of offering transferable insights to those who design, implement, and revise in-house oral placement tests. The authors contend that a holistic and unified approach to test validation, involving continued dialogue between researchers and practitioners, is crucial for in-house oral placement tests (Kunnan, 1998; Messick, 1989).