Design of Corpus-Generated EFL Placement and Progress Tests for University Students
Marina Kustova, National Research University – Higher School of Economics (Russian Federation)
Abstract
The main aim of this paper is to introduce a tool that generates questions for an EFL placement test and for the progress tests required in the course Academic Writing in English. The source of the test questions is REALEC, a learner corpus made up of Higher School of Economics student essays and other pieces of academic writing in which mistakes have been highlighted and annotated by the students' English instructors. The system (working title RETM – REALEC English Test Maker) automatically generates a pool of tasks from the corrected sentences in the corpus; these tasks are then reviewed by an expert, and those judged pedagogically sound are divided into three sets according to the difficulty of the correction. During the test, the correction offered by a test-taker is compared with the correction given by an annotator (an EFL instructor). The suggestion is either automatically accepted or rejected, and the next question is accordingly presented at a higher or lower level of difficulty, or stays at the same level if the highest or lowest level has already been reached. Even though setting up a test is not fully automated and requires human effort at some stages, it is still far more time-saving than traditional ways of composing tests. In conclusion, I outline the advantages of this tool for both instructors and students.
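The adaptive step described above can be illustrated with a minimal sketch. The Python fragment below is not part of RETM itself: the function names, the simple string comparison, and the three-level scale are assumptions introduced purely to show how an accepted or rejected correction could move a test-taker between difficulty levels.

```python
# Minimal, hypothetical illustration of the adaptive level selection described
# in the abstract. Names and the three-level scale are assumptions, not RETM's API.

LEVELS = (1, 2, 3)  # three sets of tasks, from easiest to hardest


def is_accepted(test_taker_correction: str, annotator_correction: str) -> bool:
    """Toy acceptance check: compare the test-taker's correction with the
    annotator's correction, ignoring case and surrounding whitespace."""
    return test_taker_correction.strip().lower() == annotator_correction.strip().lower()


def next_level(current_level: int, accepted: bool) -> int:
    """Move up after an accepted correction, down after a rejected one,
    staying at the same level once the lowest or highest level is reached."""
    step = 1 if accepted else -1
    proposed = current_level + step
    return min(max(proposed, LEVELS[0]), LEVELS[-1])


if __name__ == "__main__":
    level = 2  # start the test-taker at the middle level
    # (annotator's correction, test-taker's correction) pairs -- invented examples
    attempts = [
        ("He goes to university every day.", "He goes to university every day."),
        ("The information was useful.", "The informations was useful."),
    ]
    for annotator_fix, answer in attempts:
        accepted = is_accepted(answer, annotator_fix)
        level = next_level(level, accepted)
        print(f"accepted={accepted}, next level={level}")
```

The only design point the sketch tries to capture is the clamping at the boundary levels: once a test-taker reaches the easiest or the hardest set, further rejected or accepted corrections keep the test at that level rather than moving outside the scale.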