
Articles

Vol. 27 No. 1 (2024)

Developing and validating a post-admission screening-diagnostic assessment procedure to offer language support in diploma programs

DOI
https://doi.org/10.37213/cjal.2024.33259
Submitted
December 8, 2022
Published
March 15, 2024

Abstract

As post-secondary institutions assume more responsibility for the language abilities of their graduates, more attention is being paid to post-admission language support as a way to enhance student success. Previous research has indicated that a post-admission language diagnostic assessment procedure, when coupled with language support services, can be an effective model for helping students meet language expectations in post-secondary settings. This paper outlines the development and validation of a screening-diagnostic assessment procedure used to recommend students to language support services in diploma programs. Our key findings suggest that vocabulary testing can be an effective measure for screening language abilities, and that students who receive a recommendation through the procedure and subsequently attend language support classes earn higher communication grades than those who do not attend. These results offer validity evidence for the use of the procedure, while ongoing research continues to validate its testing measures.
