Developing a Language Placement Test for Italian

In 2012, David Skica of Loyola University approached BYU about joining forces to develop an Italian webCAPE. For the past two years, the two institutions have been working diligently to complete the exams.

When Jerry Larson, Ph.D., developed webCAPE, he did it for the languages most commonly taught in high school: Spanish, French, and German. Today those are still the most common languages, but language education in elementary, middle, and high schools is shifting. As more languages are taught in schools, more placement testing is needed. It seems a week doesn’t go by without a request to develop another language placement exam. Trending today are Chinese, Arabic, and Portuguese. But for the past 10 years, the most common request by far has been for an Italian webCAPE.

As cultural boundaries erode and international marriages grow, we are also seeing many students who have grown up mastering two languages since early childhood. The demand for accurate placement has never been higher. We would love to someday offer placement in many more languages, but as I summarize below, it’s a tedious process. For now, we are happy to say we are working on the Italian version of our placement test.

When Will the Placement Exam Be Ready?

While delivering a placement exam may seem easy on the surface, a lot goes into it. First, you have to create the questions, the answers, and sensible wrong answers: anywhere between 900 and 1,000 questions. You also have to make sure the questions range in proficiency from easy to hard across all aspects, from grammar to reading, spelling, and syntax. Then all of the questions must be peer-reviewed. Second, you have to randomize the selection of questions and test many, many students. Third, you review the results and analyze the questions, removing redundant ones and adjusting others. Then you test again. Finally, when you believe you have the right bank of questions, you assign each one a value and place the bank into the adaptive engine. If you don’t have an adaptive engine that can manage scoring and question selection, you have to build that as well.
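To make that last step more concrete, here is a minimal sketch, in Python, of the kind of select-score-adjust loop an adaptive engine runs. This is not the actual webCAPE code; the Question class, the 1-to-10 difficulty scale, and the step sizes are all hypothetical illustrations.

```python
from dataclasses import dataclass

@dataclass
class Question:
    prompt: str
    answer: str
    difficulty: int  # assumed scale: 1 (easy) to 10 (hard)

def run_adaptive_test(bank, num_items=20):
    """Administer a simple adaptive test: after each response, move the
    ability estimate up or down and pick the closest remaining question."""
    ability = 5.0   # start in the middle of the difficulty scale
    step = 2.0      # how far to move the estimate after each answer
    asked = []

    for _ in range(num_items):
        # Choose the unasked question whose difficulty is nearest the estimate.
        remaining = [q for q in bank if q not in asked]
        if not remaining:
            break
        question = min(remaining, key=lambda q: abs(q.difficulty - ability))
        asked.append(question)

        response = input(f"{question.prompt} ")
        correct = response.strip().lower() == question.answer.lower()

        # Raise the estimate on a correct answer, lower it on a miss,
        # and shrink the step so the estimate settles over time.
        ability += step if correct else -step
        step = max(0.5, step * 0.8)

    return ability  # an institution would map this to course-placement cutoffs
```

A production engine would draw on the calibrated item values and statistical analysis described above rather than a fixed difficulty scale, but the basic loop of scoring each response and using it to choose the next question is the idea.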

For Italian, Loyola has already created the bank of questions. Below is a high-level overview of the project and the anticipated timeline:

Early 2013 – BYU & Loyola entered into a joint-effort agreement to work on the Italian webCAPE
Mid 2013 – Loyola performed a preliminary analysis of question types, ranges, scores and purpose
Late 2013 – Entered the bank of questions into a CAPE-like engine that randomizes the questions
Early 2014 – Performed an Alpha test with the bank of questions
Mid 2014 – Analyzed the data and made the appropriate adjustments
Late 2014 – (Current) Performing a Beta1 & Beta2 test of the exams
Early 2015 – Analyze the results and make question bank adjustments
Mid 2015 – (Path A) Determine the need for a Beta 3 test period
Mid 2015 – (Path B) Approve Italian webCAPE exams
Late 2015 – Release to the public.

We would like to thank the many volunteers (and students) who have helped during our beta testing, including:

Ohio University
Fairfield University
Lake Erie College
Montclair State University
Southeastern Louisiana University
University of Massachusetts – Dartmouth
Gonzaga University

Please stay tuned for additional updates!