The Methodology: Quantitative study
Data: Written IELTS test score; performance in English 101 (the entry-level English class), measured by percentage grade in the course; and performance in the entry-level class in the student’s degree area, also measured by percentage grade in the course.
Why: I want to see whether the performance band a student scores in on the IELTS consistently correlates with their classroom performance, in order to determine whether it is an accurate predictor of student outcomes and of the ability to meet the standards of performance in the classroom setting (see the analysis sketch below).
Data Collection: Questionnaires with closed-answer questions regarding scores and level of performance.
The data we are collecting is protected personal information, so it will be easiest to obtain directly from students via a simple questionnaire. We will collect only quantitative data regarding scores, and will collect it anonymously in the hope of receiving more honest answers.
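To illustrate how the collected questionnaire data could be analysed, the following is a minimal sketch in Python. The file name and column names (ielts_survey.csv, ielts_band, english_101_pct, degree_course_pct) are hypothetical placeholders for however the anonymised responses end up being tabulated, and Spearman’s rank correlation is used here only as an illustrative assumption (IELTS bands are ordinal); it is not a fixed analysis plan.

    import pandas as pd
    from scipy.stats import spearmanr

    # Load the anonymised questionnaire responses (hypothetical file and column names).
    responses = pd.read_csv("ielts_survey.csv")

    # Correlate IELTS writing band with percentage grade in the entry-level English class.
    rho_eng, p_eng = spearmanr(responses["ielts_band"], responses["english_101_pct"])

    # Correlate IELTS writing band with percentage grade in the entry-level degree-area class.
    rho_deg, p_deg = spearmanr(responses["ielts_band"], responses["degree_course_pct"])

    print(f"IELTS band vs. English 101 grade: rho = {rho_eng:.2f}, p = {p_eng:.3f}")
    print(f"IELTS band vs. degree-area grade: rho = {rho_deg:.2f}, p = {p_deg:.3f}")

A strong, consistent correlation in both courses would support the band as a predictor of classroom performance; weak or inconsistent correlations would point toward the gap between test scores and classroom ability described in the literature below.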
Published research will primarily be used to provide background information and to inform the hypothesis through the literature review:
References:
Cooper, Trish. "Can IELTS Writing Scores Predict University Performance? Comparing the Use of Lexical Bundles in IELTS Writing Tests and First-year Academic Writing." Stellenbosch Papers in Linguistics Plus 42.0 (2014): 63. Web.
This study asks whether there is a strong connection between IELTS writing scores and classroom performance. It aims to create meaningful data around the way that IELTS scores can be used in the university setting to determine what a student is actually capable of, and to what degree educators can expect them to engage in academic-level writing. This is similar to our own hypothesis, except that we are looking at performance in specific courses, while this study looked more generally at first-year academic writing. It is still useful, however, in defending our own research question.
Feast, Vicki. “The Impact of IELTS Scores on Performance at University.” International Education Journal 3.4 (2002). Print.
Rather than simply establishing correlation, this study works to build a cause-and-effect relationship between IELTS testing and university performance, which means that mis-scoring could actually lead to student failure. This is interesting in relation to our work because of the cause-and-effect case they build; I want to use this specifically to defend the importance of the testing and of the scores being accurate. If a student scores well on the IELTS but is not actually prepared for university-level work, it can significantly damage their ability to perform at the expected level in classes. This is a trend I would expect to see in at least some of the data that we collect through our research process.
Moore, Tim, and Janne Morton. "Dimensions of Difference: A Comparison of University Writing and IELTS Writing." Journal of English for Academic Purposes 4.1 (2005): 43-66. Web.
This study is in some ways similar to Müller’s study of the gap between IELTS scores and performance, but it instead considers the gap between IELTS testing tasks and classroom tasks. In other words, it considers the way in which the IELTS measures a different set of skills than academic work requires. This is very significant to our study: if the tasks are comparable, then we should see a strong correlation in the data we collect in our research process, but if they are not comparable, then we will see variance, which would be indicative of a major flaw in the testing system.
Müller, Amanda. "The Differences In Error Rate And Type Between IELTS Writing Bands And Their Impact On Academic Workload." Higher Education Research & Development 34.6 (2015): 1207-1219. Academic Search Complete. Web. 9 Mar. 2016.
This study meaningfully demonstrates that IELTS scores do not always accurately reflect students’ academic performance level. It quantitatively considers the gap between what is demonstrated on the standardized test and what students are actually capable of doing within the academic classroom.
Uysal, Hacer Hande. "A Critical Review Of The IELTS Writing Test." ELT Journal: English Language Teaching Journal 64.3 (2010): 314-320. Academic Search Complete. Web. 9 Mar. 2016.
Uysal critically reviews the reliability of the IELTS test, giving special attention to various flaws in the testing procedure that could lead to a lack of reliability, and trying to determine whether they have created unreliability in practice. Uysal also provides suggestions for changes that should be made in order to increase the reliability and usability of the test. This is similar to our own purpose, but from a slightly different perspective, and so it is useful for the literature review and for homing in on exactly what we hope to determine in the course of the study.