RLA. Revista de lingüística teórica y aplicada

On-line version ISSN 0718-4883

RLA vol.61 no.1 Concepción jul. 2023

http://dx.doi.org/10.29393/rla61-7eprn20007 

ARTICLE

ENGLISH PEDAGOGY STUDENTS' MOTIVATIONS TOWARDS THE C1 ADVANCED TEST AS A TOOL TO MEASURE ENGLISH LANGUAGE PROFICIENCY

LA MOTIVACIÓN DE ESTUDIANTES DE PEDAGOGÍA EN INGLÉS SOBRE EL EXAMEN C1 ADVANCED PARA MEDIR COMPETENCIAS LINGÜÍSTICAS DE LA LENGUA INGLESA

ROGER RAMIREZ DRAUGHN1 
http://orcid.org/0000-0003-3806-8564

NICOLÁS CARDENAS TAMBURINI2 
http://orcid.org/0000-0003-2684-4431

1Universidad San Sebastián, Santiago, Chile. roger.ramirez@uss.cl

2World Learning, USA. nico.cardenas@worldlearning.org

ABSTRACT

The following study aims to explore and describe the opinions and motivations of senior English pedagogy undergraduate students from a private university in Chile towards the C1 Advanced test, a standardized English language proficiency test by Cambridge. In order to answer the research questions, a study following both a quantitative and qualitative approach with a descriptive design was carried out. This included a questionnaire that was completed by 82 students from the four regions where the English pedagogy program currently has students enrolled. The main results show that students nationwide generally agree with the importance of taking international standardized English language proficiency tests, although the degree of agreement varies somewhat among students from different regions. Students also report high levels of anxiety, stress, and nerves as the predominant feelings they experience leading up to taking the exam.

Keywords: Standardized testing; language proficiency; student perceptions; English pedagogy

RESUMEN

El presente estudio tiene como objetivo explorar y describir las percepciones y motivación de estudiantes de último año de la carrera de Pedagogía en Inglés en una universidad privada en Chile con respecto al examen C1 Advanced, un examen estandarizado de dominio del inglés ofrecido por Cambridge. Para responder las preguntas de investigación, se implementó un estudio descriptivo de índole cuantitativo y cualitativo. Se incluyó un cuestionario, que fue completado por 82 estudiantes de cuatro regiones del país diferentes donde el programa en Pedagogía en Inglés tiene estudiantes matriculados. Los principales resultados muestran que a nivel nacional los estudiantes encuestados generalmente están de acuerdo con la importancia de rendir un examen internacional estandarizado de dominio de la lengua inglesa, aunque existen algunas discrepancias entre los niveles de acuerdo, basado en las regiones de procedencia de los estudiantes. Los participantes de la investigación también indican altos niveles de ansiedad, estrés y nervios como emociones predominantes mientras se preparan para realizar el examen.

Palabras clave: Pruebas estandarizadas; dominio de una lengua; percepción de estudiantes; pedagogía en inglés

1. INTRODUCTION

Students, educators and professionals around the world, including in Chile, often find themselves needing to certify their English language proficiency for both academic and professional purposes (Brown, 2019). The specific reasons range from participating in international exchange programs and taking a gap year to working abroad or applying for a job that requires a certain level of English.

This study focuses on senior undergraduate students from four different regions of Chile enrolled in an English pedagogy program at a private university. For these students, certifying their English proficiency is part of the last year of the academic program as well as a common requirement among educational institutions that employ English language teachers in Chile. The expectation of the university program and the general requirement from educational institutions hiring English language teachers is to have a certified C1 level of English. This proficiency level corresponds to an advanced use of the language based on the Common European Framework of Reference for Languages (Council of Europe, 2022). In order to obtain such a certification, there are a number of international standardized tests that can be implemented. At the time of this study, the English pedagogy university program that was considered for this research administered the C1 Advanced exam (formerly known as CAE) by Cambridge English Language Assessment.

Considering the high-stakes nature of this international standardized language exam, students from the English pedagogy program are faced with a considerable challenge that can have an important impact on their academic and professional outcomes. For this reason, it is of great significance to learn more about students' perspectives, thoughts, and experiences around this exam.

The objective of this study was to identify the perceptions and motivations of senior undergraduate students of English Pedagogy from four different regions of a private university in Chile towards the C1 Advanced exam by Cambridge. Based on this objective, this study addressed the following research questions:

  • What are the motivations of students towards the C1 Advanced exam?

  • Are there differences between students' motivations towards the C1 Advanced exam based on the region they are from?

  • Is there a relationship between the variables examined in regard to students' perception of the different language skills evaluated through the C1 Advanced exam?

  • Are there any differences between the motivations of students towards the C1 Advanced exam based on the level of English they believe they have?

  • Are there differences between the feelings and thoughts students experienced leading up to taking the C1 Advanced exam based on the region they are from?

2. LITERATURE REVIEW

2.1. What is standardized testing?

2.1.1. General introduction

Testing has been used throughout history by institutions and organizations to generate a scalar notion of individuals' skills and knowledge, typically to decide who has the qualifications to become a part of them. Advocates of testing have claimed its usefulness lies in its potential to inform teachers, students, parents, administrative staff and even policymakers on how a given population of test takers is meeting certain standards, and thus to make decisions related to programs, practices, and opportunities (Brown & Abeywickrama, 2021). Standardized testing is an example that has become massively commonplace. In the area of English language aptitude, international standardized English language proficiency tests (henceforth ISELPTs) have emerged as the proficiency assessment instruments most widely acknowledged by universities, workplaces, and other score users. As the definition by Katz (2014) indicates, proficiency tests differ from other forms of assessment in that they are meant to gauge "students' ability in a language independent of a curriculum or specific course content" (p. 321). Following this worldwide trend, a steadily increasing number of Chilean universities are requiring their students to take an ISELPT such as TOEFL, TOEIC, IELTS or one of the Cambridge suite of tests to certify their level of proficiency before they graduate.

2.1.2. The case of C1 Advanced (formerly known as CAE)

The C1 Advanced test is one of the exams offered by Cambridge. Relatively popular in Chile as a way to measure language proficiency, the C1 Advanced test is administered at the end of the last year of the English Pedagogy degree across a variety of universities. Although it is not a requirement for graduation, and it does not affect students' grades directly or indirectly, the results of this test can be an important milestone for pre-service teachers. For many, it is their first encounter with an ISELPT, and it represents an outside measurement of the language skills they have developed throughout their university program.

Intrinsically tied to the CEFR proficiency level in its name, a "C" grade (a score of 180-192 on the Cambridge English Scale) is the cut-off for being considered to have a C1 level (Green, 2018). The C1 Advanced test is made up of four "papers" (one for each language skill), lasts around four hours, and can be taken in either paper-based or computer-based format. Table I shows further details and the specific duration of each paper, taken from the Cambridge English website:

Table I Papers, features, and duration of C1 Advanced exam by Cambridge. 

2.2. Why testing at the end of a teacher education degree?

2.2.1. National guidelines & education market

Taking a language proficiency test at the end of a teaching degree has become a more frequent practice in the past decade, influenced by factors coming from different parts of the education system. Universities seek to examine both how pre-service teachers meet the proficiency benchmarks set by their programs, and whether these students align with the exit profile expected by national and international standards (Cisterna-Zenteno et al., 2016). In Chile, the Ministry of Education releases and updates teaching standards that schools and educators are meant to observe. This initiative was born in 2004, as part of a series of strategic reforms to improve international business connections across language barriers (Matear, 2008). In a national diagnostic proficiency test administered the same year, the students with the best scores were positively correlated with teachers with high levels of proficiency (Ministerio de Educación/SIMCE/Cambridge ESOL Examinations, 2004), a finding that became a flagship for the development of teacher quality expectations.

The current standards for teachers of English place the ideal educator at a C1 level in the section on communicative competence, at both the production and comprehension levels (Ministerio de Educación, 2021). In practice, these nationwide standards issued by the Ministry of Education are strong recommendations; they are not a mandate that establishments are required to follow. Nevertheless, the strong demands of the education market lead institutions to embrace these guidelines. Public and private school administrations set their own requirements when hiring educators, and it is not uncommon for them to include a C1 level of proficiency on the CEFR scale among the prerequisites for English language teaching positions.

2.2.2. Suggestions by researching institutions

As part of their study results, institutions researching the teaching and learning of English around Latin America have suggested that Ministries of Education and teacher-training programs strengthen the quality of teacher education by implementing stricter screening and recruiting processes, such as ISELPTs (see, for example, Cronquist & Fiszbein, 2017; Bruns & Luque, 2015).

All of the reasons above point towards the implementation of an ISELPT in the last semester of a teaching degree, in the months leading to graduation.

2.3. Does standardized testing reflect proficiency levels?

The concept of standardized testing might mistakenly seem synonymous with a fair, objective measurement of a subject's level of proficiency. Test results should effectively reflect what the person is capable of doing with their language skills and be a predictor of relative success using the language. In reality, the relationship between how test takers perform and their level of proficiency is a complicated one (Benavides, 2015; Jenkins, 2016). Although ISELPTs are implemented in a controlled environment and are founded on the basis of continual improvement by the institutions that generate them, it is not entirely correct to assume that a unidimensional score scale can faithfully reflect a multidimensional construct such as language aptitude. The field of standardized language testing is still in continual evolution. Although the quality of the processes and products associated with it has allegedly improved in recent decades, research remains necessary and significant. Listening to test takers is one step in this direction.

2.4. How does standardized testing affect higher education students?

2.4.1. State of research

Even though testing acts as a gatekeeper for job opportunities and ultimately for the socioeconomic trajectory of many students' and pre-service teachers' lives, the field of test takers' perceptions has been severely under-researched (Suryaningsih, 2014; Brown, 2019). Most of the literature in this regard comes from the last decade, indicating a growing interest in the human element of ISELPTs. Test takers' perceptions should be critical for an evaluation instrument that decidedly influences their careers, since they are the ones who prepare for it, undergo the process of taking it, and live with the results. Consequently, their voices should not be left out of the equation (Kirkland, 1971; Shohamy, 2015).

The authors found that some research has been conducted to understand the influence of standardized testing on students and their perceptions, but only a small fraction considers higher education students. From these, a much smaller portion focuses on pre-service educators. Although specific, this area of investigation should be especially relevant since teachers draw on their own experiences as learners to form their beliefs around education (Borg, 2003; Zheng, 2009). Students' perception of their learning experiences can mold how they learn, and for future teachers this holds true for aspects such as assessment and test preparation (Struyven et al., 2005). Teachers' values and belief systems tend to inform their actions in the classroom consciously or unconsciously, making their own interpretations of reality a cascade effect that might affect their students (Pérez Andrade, 2019).

2.4.2. The effect on test-takers' motivation

The effect of standardized testing on student motivation is mixed and inconclusive, and seems to vary with factors such as personality, culture, beliefs, and experience with test preparation and implementation. Studies have found that higher education students can feel motivated to learn when faced with the challenge of taking an ISELPT as part of their program (for example, see Ockey & Gokturk, 2019; Li et al., 2012; Rohman et al. 2019; Vongpumivitch, 2012). Conversely, the motivation that many test takers feel tends to coexist with feelings of stress, anxiety, and frustration (for example, see Suryaningsih, 2014; Castro Acosta, 2009; Pan, 2009).

2.4.3. The effect on test-takers' learning

The literature consulted rarely focused on what higher education students learned about their own language ability by taking an ISELPT, or how they process the feedback offered through the results. Nonetheless, it has been reported that students tend to struggle most with extensive listening and extensive reading (Cisterna-Zenteno et al., 2016; Castro Acosta, 2009).

2.4.4. Test-takers' perception of ISELPTs

Research on younger students in the school system emphasizes that high-stakes testing can be understood as necessary, but it is also perceived as identity-defining and shame-inducing (Taylor, 2013). The perception of higher education students tends to lean towards an acknowledgement of why institutions implement ISELPTs, combined with discomfort at not being familiar enough with the test items, the target culture, and the perceived level of challenge (Jenkins, 2016; Ockey & Gokturk, 2019).

Across ages, students' perceptions of ISELPTs and standardized testing at large seem to be influenced by their performance in them. Students who obtain higher scores are more likely to be satisfied with the notion of testing and their overall test-taking experience (Tsai & Tsou, 2009; Kirkland, 1971).

3. METHODOLOGY

The purpose of this exploratory research is to identify the motivations of senior undergraduate students from a private university in Chile towards the C1 Advanced exam by Cambridge. Based on this intention, non-experimental research has been carried out following a quantitative and qualitative approach with a descriptive design (Cohen et al., 2018).

3.1. Context and participants

This study took place at a private university in Chile with campuses in Puerto Montt, Valdivia, Concepción, and Santiago. At this university, English pedagogy students in the last semester of their undergraduate studies are required to take an ISELPT to certify their English language proficiency; they are not, however, required to achieve a certain score in order to graduate. The English pedagogy undergraduate program currently lasts eight semesters. A recently updated version of the program lasts ten semesters, but the participants of this study were all students of the eight-semester version. Throughout this program, students are required to complete two semester-long courses aimed at preparing them to take an international standardized exam: one focused on test-taking strategies and one focused on understanding and practicing a variety of mock ISELPTs.

At the time of this research, the C1 Advanced exam by Cambridge, formerly known as Cambridge English: Advanced (CAE), was the test being implemented. This standardized test is administered by an external institution in Santiago that is licensed by Cambridge. The test is offered to English pedagogy students at no additional cost besides what is already paid through their university tuition. The directors and academic staff of the university are the ones who coordinate all the administrative logistics, including registration and scheduling, with the external test provider.

For this study, a total of 98 students were invited to participate in a questionnaire focused on their perceptions and motivations towards the C1 Advanced test as a tool to measure their English language proficiency. When these senior undergraduate students were asked to complete the questionnaire, they had already taken the C1 Advanced test. The questionnaire was completed anonymously using Google Forms. No personal details like name, sex, age, phone number or emails were collected through the questionnaire. Since one of the researchers of this project is the director of the undergraduate program of the university considered, participants were contacted through local teachers of the program.

3.2. Instrument and data collection

An online questionnaire was designed and administered to gather senior English pedagogy undergraduate students' motivations towards the C1 Advanced exam by Cambridge. In order to obtain this information, the questionnaire included 12 Likert scale questions (Muthén & Kaplan, 1985). The Likert-type responses included: strongly agree, agree, neutral, disagree, and strongly disagree. In addition to these items, participants were asked to answer 10 multiple-choice questions in which they could select more than one answer. Eight of the ten questions were included to check the validity of the questionnaire, since these items addressed information already covered in the Likert scale questions. The remaining two questions aimed to collect data on concrete motivations for taking the C1 Advanced exam and on the feelings and thoughts participants experienced leading up to taking it. Finally, the questionnaire included one open-ended question for respondents to expand and offer more details on their perception of the overall value of taking an international standardized English language test at the end of an undergraduate English pedagogy program.
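Purely as an illustration (this is not part of the authors' procedure, and the column names and separator are hypothetical), responses of this kind could be coded for analysis as sketched below, assuming a tabular export in which each Likert item is a text column and each multi-select item lists the chosen options separated by semicolons.

```python
# Illustrative sketch only; not the authors' instrument or export format.
import pandas as pd

LIKERT_SCORES = {
    "strongly disagree": 1, "disagree": 2, "neutral": 3,
    "agree": 4, "strongly agree": 5,
}

# Toy data standing in for the real 82 responses (hypothetical column names).
df = pd.DataFrame({
    "region": ["Santiago", "Valdivia", "Puerto Montt", "Concepción"],
    "q1_importance_of_test": ["agree", "strongly agree", "neutral", "agree"],
    "motivations": [
        "apply for a teaching job; international certificate",
        "master's program; international certificate",
        "apply for a teaching job",
        "exchange program; understand my language skills",
    ],
})

# Numeric coding of a Likert item and the share of (strong) agreement.
df["q1_score"] = df["q1_importance_of_test"].str.lower().map(LIKERT_SCORES)
agreement_pct = (df["q1_score"] >= 4).mean() * 100
print(f"Agreement with item 1: {agreement_pct:.1f}%")

# Multi-select ("select all that apply") answers become indicator columns.
motivations = df["motivations"].str.get_dummies(sep="; ")
print(motivations.mean().mul(100).round(1))  # % of respondents selecting each option
```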

To confirm the relevance and clarity of the questions, the questionnaire was reviewed by two external experts on the topic of ISELPTs. Afterwards, it was piloted with a group of 18 students from the same university in Santiago, Chile. The pilot was carried out in a physical classroom at the university with volunteer students over a period of 45 minutes: participants completed the virtual questionnaire in fifteen minutes, followed by a thirty-minute focus group on their reactions and feedback towards the research instrument. These participants made specific observations on the number of open-ended questions included in the questionnaire. All 18 students agreed that three of the four open-ended questions felt redundant, since they found themselves writing the same answers they had already selected in the previous Likert and multiple-choice questions. Participants from the pilot also suggested using the acronym CAE throughout the questionnaire instead of C1 Advanced, since the test's new title was not as familiar to them. Students also mentioned that the questionnaire felt concise and straightforward, and they confirmed that it could be completed within fifteen minutes, especially if two or three of the open-ended questions were removed.

Based on the results of the pilot and the focus group, a new version of the questionnaire was constructed in which only one open-ended question was included at the end for participants to share further opinions and thoughts on the C1 Advanced test. The questionnaire was also modified so that the use of the test title CAE was consistent throughout most of the instrument. Finally, when applicable, a line was added to some questions of the survey clarifying that more than one answer could be selected.

3.3. Population

The population for this research was composed of senior English pedagogy undergraduate students from four different regions of a private university in Chile. The total population of these students reached 98 people. For the purposes of this study, a convenience sampling technique was used (Griffee, 2012; Rea & Parker, 2014; Warner, 2013). The sample consisted of 82 students from all four regions. Specifically, the sample included 20 students from Santiago, 21 from Valdivia, 21 from Puerto Montt, and 20 from Concepción.

3.4. Procedure

The questionnaire was sent to all 98 students enrolled in the last year of the English pedagogy undergraduate program across the four regions where it is offered. Even though participation was highly encouraged, it was completely voluntary. Respondents were assured of the confidential and anonymous nature of their responses. Those who agreed to participate in the study signed an informed consent form that had already been approved by the Scientific Ethics Committee of the university.

3.5. Data analysis

The data collected through the questionnaire were analyzed using descriptive statistics (Gordon, 2012). Cross-tabulation and Pearson's chi-square tests were conducted to measure whether and to what extent students from different regions had significantly different perceptions and motivations towards the C1 Advanced test. These analyses were performed using the SPSS Statistics 26.0 software.
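The authors report running these analyses in SPSS Statistics 26.0. As an illustration only, a roughly equivalent cross-tabulation and Pearson's chi-square test could be computed in Python as sketched below; the data and column names are hypothetical stand-ins for the questionnaire responses.

```python
# Illustrative sketch only; the study's analyses were run in SPSS 26.0.
# Hypothetical toy data: one row per respondent, with region and one Likert item.
import pandas as pd
from scipy.stats import chi2_contingency

responses = pd.DataFrame({
    "region": ["Santiago", "Valdivia", "Puerto Montt", "Concepción"] * 6,
    "importance_of_test": ["strongly agree", "agree", "agree",
                           "neutral", "disagree", "agree"] * 4,
})

# Cross-tabulation of region against the responses to one item.
table = pd.crosstab(responses["region"], responses["importance_of_test"])
print(table)

# Pearson's chi-square test of independence between region and response.
chi2, p_value, dof, expected = chi2_contingency(table)
print(f"chi2 = {chi2:.2f}, df = {dof}, p = {p_value:.3f}")
```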

4. RESULTS AND DISCUSSION

4.1. Descriptive analysis of students' opinions towards the C1 Advanced test by Cambridge

As shown in Table II, the descriptive analyses convey that students from all four regions where the university runs the program generally agree with the importance of taking the C1 Advanced test as senior English pedagogy undergraduate students. The items that generated the highest positive ratings (agree and strongly agree) were based on the following statements: as a senior undergraduate student of English pedagogy, I believe it is important to take the CAE C1 test (88.2%), and the CAE C1 test is an effective tool for measuring my English language proficiency as a senior undergraduate student of English pedagogy (84.4%). On the other hand, the item with which students least agreed concerns the alignment of their English language preparation at the university with the exam: a total of 75% of the respondents believe their university preparation does not align well with the C1 Advanced exam.

Table II Descriptive analysis of students' opinions towards the C1 Advanced test. 

According to the analysis of the information collected during the research and the questions established for this study, we can conclude that the students generally agree that it is important to take an ISELPT like the C1 Advanced test. In addition, over 60% of respondents indicated that they became more aware of their speaking and listening skills after taking the C1 Advanced test. This shows that participants are interested in and inclined towards having a better understanding of their own language skills. In contrast, reading and writing were the two language skills that students felt they became the least aware of by taking the C1 Advanced test and receiving its results. This result provides a clear sign of the areas in which participants might need more support throughout their English language development at the university and in preparation for taking the C1 Advanced test, or any other standardized language test.

A total of 75% of the respondents also indicated that the results of the C1 Advanced test motivated them to continue improving their English language skills and proficiency levels. Through this question, students demonstrate a keen interest in working on their English language skills based on their own results from the standardized language test. It also became evident through the survey that students feel the C1 Advanced test is reputable among employers in Chile and that it is an effective way of certifying students' English language skills (70% overall agreement). This result is a clear indication of the importance of this standardized test in the Chilean teaching work environment and of the high regard in which students hold the test.

4.2. Descriptive analysis of students' motivations and feelings towards the C1 Advanced test by Cambridge

The descriptive analyses of students' motivations for taking the C1 Advanced test show that the students, from all regions of the university, had a variety of motivational sources ranging from work to personal purposes (Table III).

A full 70% of participants revealed that what motivated them to take the C1 Advanced test was to receive an international certification of their language proficiency level, and 74% expressed that they were motivated to take it in order to apply for a job in the Chilean teaching context. These two results may be indicative of the weight of language certifications for educators at a national level and of how important it is for employers that language teaching candidates hold a valid English language proficiency certificate.

A more intrinsic and personal motivational source emerges from the results of two particular items: 70% of respondents indicated that what drove them to take the C1 Advanced test was to have a better understanding of their English language skills and to validate their own language learning process. These results are highly encouraging since they reflect the importance of self-awareness and proactive learning on the part of students. In addition, the results demonstrate that students do not take the C1 Advanced test only because it is required, but also because they are personally interested in seeing how their English language skills fare against the CEFR (Common European Framework of Reference).

Participants also seem to have an important interest in taking the C1 Advanced test to further their education and pursue post-graduate studies with 56% of respondents revealing that they were motivated to take the C1 Advanced test to apply for a master's degree. Another 52% mentioned that they were motivated to take the C1 Advanced test to potentially participate in an exchange program abroad. These results show that students have motivations that expand beyond just immediate work purposes. Students also hope to travel abroad, have intercultural experiences, and continue to learn through a master's program. The great majority of exchange programs and master's programs do require a certified C1 level of English. For this reason, the C1 Advanced test seems to be an effective instrument for most students.

Finally, it is important to mention that the research survey included a statement where students could select as an option that they were not motivated to take the C1 Advanced test or that they took the test just because it was mandatory. A mere 2% selected both items, revealing that 98% of students do have sources of motivation to take the C1 Advanced test which include work, study, travel, and personal reasons.

Table III Descriptive analysis of students' motivations towards the C1 Advanced test. 

In terms of how students felt in the weeks and days leading up to taking the C1 Advanced test, the results indicate an overwhelming inclination towards negative and stressful feelings (Table IV). Respondents selected stress, anxiety, and nerves as the predominant feelings they experienced in the weeks before taking the test (80%, 78%, and 74%, respectively). This result is reinforced by 52% of students pointing out that they felt concern and a lack of confidence. Conversely, only 26% of respondents indicated that they felt curious and positively challenged, and 18% revealed that they felt excited as they prepared for the test. These results highlight a general concern among students, who feel anxious, stressed, unprepared, and unconfident in the timeframe prior to taking the C1 Advanced test. Even though it may be difficult to avoid stress and anxiety before taking such a high-stakes exam, there is significant work to be done and emotional support to be provided to students preparing for the C1 Advanced test. This is a challenge that lies ahead both for the leadership of the English pedagogy undergraduate program and for the students who will be taking the C1 Advanced test or a similar one in the future.

Table IV Descriptive analysis of students' feelings towards the C1 Advanced test. 

4.3. Differences between students' opinions and motivations towards the C1 Advanced test based on regional background

With respect to regional differences, respondents from all four regions included in this research (Puerto Montt, Valdivia, Concepción, and Santiago) generally selected similar answers, with different degrees of agreement, for most items of the survey regarding their opinions and motivations towards the C1 Advanced test (Table V). The items shared in Table V were selected based on the results showing the most salient differences, similarities, and opportunities for further discussion. For example, regarding statements connected to the importance of taking the C1 Advanced exam as senior undergraduate students of English pedagogy, students from all four regions agreed on the importance of this exam (Puerto Montt: 94%; Valdivia: 96%; Concepción: 92%; Santiago: 68%). A noteworthy difference is that, even though most students from all regions agree, the agreement rate among students from Santiago is at least 24 percentage points lower than in the other three regions.

Regarding students' motivation indicators for taking the C1 Advanced test, there is a similar difference. Students from Puerto Montt, Valdivia, and Concepción agree more strongly that the results of the C1 Advanced exam motivate them to improve their personal English language skills (80%, 78%, and 76%, respectively). Respondents from Santiago, however, report lower agreement with being motivated to improve their English language skills based on their C1 Advanced results, with a 60% agreement rate.

When asking participants about their opinion on the reputation of the C1 Advanced test, there are once again similar differences among students from the same regions. Respondents from Santiago agree with this statement at a rate of 52%, while participants from Puerto Montt, Valdivia, and Concepción show higher agreement rates for the same statement (86%, 84%, and 80%, respectively).

In terms of factors that motivate students to take the C1 Advanced test, students from all four regions agreed on work-related purposes, post-graduate studies, and to better understand their own English language skills as their three most important sources of motivation. These results are reflected in the following items: to have an international certificate (Puerto Montt: 70%; Valdivia: 66%; Concepción: 72%; Santiago: 76%); to apply for a teaching job (Puerto Montt: 80%; Valdivia: 68%; Concepción: 72%; Santiago: 74%); to apply to a master's program (Puerto Montt: 52%; Valdivia: 50%; Concepción: 60%; Santiago: 62%); to have a better understanding of my English language skills (Puerto Montt: 72%; Valdivia: 70%; Concepción: 72%; Santiago: 64%). Students all seem to have similar motivational sources for taking the C1 Advanced test.

When it comes to how students felt as they got ready to take the C1 Advanced test, the results are generally similar across all four regions. They indicate that most students emphasized feelings of stress, anxiety, and nerves in the weeks leading up to the standardized C1 Advanced test, with most rates above 74% across the four regions. These results are reflected in the following items: stress (Puerto Montt: 82%; Valdivia: 78%; Concepción: 80%; Santiago: 80%); anxiety (Puerto Montt: 80%; Valdivia: 76%; Concepción: 80%; Santiago: 78%); nerves (Puerto Montt: 84%; Valdivia: 76%; Concepción: 74%; Santiago: 62%). The regional comparison of the emotional dimension of students' experience with the C1 Advanced test once again highlights the need to work on this area and to strategize around supporting participants as they prepare for this standardized language test. Participants' C1 Advanced results may be directly linked to the feelings and emotions they experience in the days and weeks prior to taking the test (Suryaningsih, 2014; Castro Acosta, 2009; Pan, 2009). Improving students' emotional state of mind and overall confidence as they prepare for the C1 Advanced test could have a meaningful impact on their results.

In a more controversial item, participants were asked to indicate whether, if it were up to them to decide, they would make the C1 Advanced test (or any other ISELPT) a requirement for graduating. The results for this question reflected the most significant differences between Santiago and the other three regions. 87.5% of participants from Santiago responded against making an international standardized language test mandatory, while most students from the other three regions agreed on making a test of this nature mandatory (Puerto Montt: 72%; Valdivia: 66%; Concepción: 70%). These results show that students from Santiago do not agree with an international standardized language test being a graduation requirement, while students from the other regions largely do. These are highly interesting results that may be due to students' dissatisfaction with the test, their results, the nature of standardized testing, their ability to apply to jobs with other requirements, or a disagreement with the ability of standardized tests to truly reflect language proficiency, among other possibilities. This particular result calls for further research and exploration with students from Santiago to better understand how they differ from students from other regions of Chile.

Table V Differences in students' opinions and motivations towards the C1 Advanced test based on regional backgrounds. 

Regarding students' overall perception on the value of taking an international standardized English language test at the end of an undergraduate English pedagogy program, they seemed to agree on the test being a great opportunity to demonstrate proficiency levels and linguistic progress by the end of the program. However, they also seem to consistently express that they do not feel emotionally prepared for such an exam. In the final open-ended question of the survey, one participant responded: "The test should and must be an obligation for English teachers and for their results to be at least C1". Another responder mentioned: "The test functions as evidence of the progress made during the program but because the pressure and nerves some don't show their real potential and language proficiency".

4.4. Research limitations

The authors acknowledge the limitations of the research instrument and the results obtained in this study. First, the participants of this research attend classes in one university and in the central and central-south regions of the country, including Santiago, Concepción, Valdivia, and Puerto Montt. This geographical indicator leaves out students from the north and extreme south areas. Further research with students from other regions of Chile and from other universities in the country would contribute to a deeper and more inclusive understanding of students' opinions and motivations towards standardized English language proficiency tests.

Also, the research instrument was reworked to consist of mostly Likert scale items, with little room for participants to expand on their choices. The lack of more open-ended prompts could limit the complexity and depth of answers, opinions, and perceptions of participants of this research.

Finally, due to the nature of the survey instrument implemented in this research, the study would benefit from having individual interviews with volunteer participants who wish to elaborate and expand on their experience with the ISELPT in focus.

5. CONCLUSION

This study aimed to explore the perceptions, attitudes and motivations that senior students from an English pedagogy major at a Chilean private university hold as test takers toward an ISELPT, in this case Cambridge's C1 Advanced test, taken as part of their last year. Since these tests can be gatekeepers in the careers of future educators, the authors consider it critical to observe how pre-service teachers are equipped to have a successful experience that reflects their language proficiency as accurately as possible. Program authorities are encouraged to evaluate the preparation students currently receive as they get ready for an ISELPT. This evaluation should include a deep look into how current course programs integrate preparation focused on question types, thinking skills, and the test-taking skills needed to successfully take an ISELPT.

The findings reflect that students consider such a high-stakes test important for measuring and understanding their level of proficiency, but they also unequivocally associate the time leading up to the experience with negative feelings such as nervousness, stress and anxiety. This is consistent with the literature on standardized testing at large, and might seem natural considering the influence an ISELPT can have on these students' career paths. The results suggest that program authorities would greatly benefit from examining how learners are being supported throughout the degree, and particularly in the months leading up to the selected ISELPT. This may include the introduction and practice of meta-affective strategies, perspective-taking, and a separation between test results and students' perceived self-worth.

Especially in regions other than the capital (Santiago), most students consider that an ISELPT should be mandatory as part of the last year in an English pedagogy program, and perhaps controversially, a requirement for graduation. This further solidifies the complex perception of ISELPTs as a necessary benchmark, even when it may lead to undesirable negative feelings. It also unveils a difference in attitude towards this assessment tool between non-metropolitan areas and Santiago. Further research is needed to outline the extent of this discrepancy, and whether it encompasses other areas regarding evaluation or its connection to an external perceived power.

Furthermore, a significant portion of the students surveyed also identified this test-taking instance as a chance for self-assessment and validation of what they have learned throughout the program. This also aligns with previous studies, which have established how standardized testing can influence students' self-perception and, potentially, their self-worth. The results indicate that these future educators would greatly benefit from an overt dialogue to make meaning of their experience and to better comprehend the reality of standardized testing, as they prepare for their own test-taking instances and also prepare to support young students in taking a number of these during their time in the school system.

REFERENCES

Benavides, J. E. (2015). Las pruebas estandarizadas como forma de medición del nivel de inglés en la educación colombiana. In J. A. Bastidas & G. Muñoz (Eds.), Fundamentos para el desarrollo profesional de los profesores de inglés (2nd ed., pp. 19-38). Editorial Universitaria.

Borg, S. (2003). Teacher cognition in language teaching: A review of research on what language teachers think, know, believe, and do. Language Teaching, 36(2), 81-109. https://doi.org/10.1017/S0261444803001903

Brown, J. D. (2019). World Englishes and international standardized English proficiency tests. In C. L. Nelson, Z. G. Proshina, & D. R. Davis (Eds.), The Handbook of World Englishes (2nd ed., pp. 703-724). Wiley-Blackwell.

Brown, H. D., & Abeywickrama, P. (2021). Language assessment: Principles and classroom practices. New York: Pearson.

Bruns, B., & Luque, J. (2015). Great teachers: How to raise student learning in Latin America and the Caribbean. Washington, D.C.: World Bank Group.

Cambridge English Language Assessment. (2017). C1 Advanced. Cambridge English. https://www.cambridgeenglish.org/exams-and-tests/advanced/

Castro Acosta, F. (2009). Percepciones de la comunidad de la Licenciatura en lenguas modernas de la Pontificia Universidad Javeriana respecto a los exámenes TOEFL, IELTS y CAE durante el periodo 2005-2008 [Bachelor's thesis, Pontificia Universidad Javeriana]. Repositorio PUJ. https://repository.javeriana.edu.co/bitstream/handle/10554/5901/tesis705.pdf?sequence=1&isAllowed=y

Cisterna-Zenteno, C., Soto-Hernández, V., & Díaz-Larenas, C. (2016). Medición de habilidades de comprensión lectora y auditiva en estudiantes de pedagogía en inglés de una universidad chilena. Revista Electrónica Educare, 20(1), 1-21. https://doi.org/10.15359/ree.20-1.8

Cohen, L., Manion, L., & Morrison, K. (2018). Research methods in education (8th ed.). London: Routledge.

Council of Europe (2014). The CEFR Levels. COE. https://www.coe.int/en/web/common-european-framework-reference-languages/level-descriptions

Cronquist, K., & Fiszbein, A. (2017, September). English language learning in Latin America. https://www.thedialogue.org/wp-content/uploads/2017/09/English-Language-Learning-in-Latin-America-Final.pdf

Gordon, R. (2012). Applied Statistics for the Social and Health Sciences. New York, NY: Routledge.

Green, A. (2018). Linking tests of English for academic purposes to the CEFR: The score user's perspective. Language Assessment Quarterly, 15(1), 59-74. https://doi.org/10.1080/15434303.2017.1350685

Griffee, D. T. (2012). An introduction to second language research methods: Design and data. Berkeley: TESL-EJ Publications.

Jenkins, J. (2016). International tests of English: Are they fit for purpose? In X. Liao (Ed.), Critical reflections on foreign language education: Globalization and local interventions. The Language Training and Testing Center.

Katz, A. (2014). Assessment in second language classrooms. In M. Celce-Murcia, D. M. Brinton, & M. A. Snow (Eds.), Teaching English as a second or foreign language (pp. 320-337). Australia: National Geographic Learning.

Kirkland, M. C. (1971). The effect of tests on students and schools. Review of Educational Research, 41(4), 303-350. https://doi.org/10.2307/1169441

Li, H., Zhong, Q., & Suen, H. (2012). Students' perceptions of the impact of the College English Test. Language Testing in Asia, 2(3), 77-94. https://doi.org/10.1186/2229-0443-2-3-77

Matear, A. (2008). English language learning and education policy in Chile: Can English really open doors for all? Asia Pacific Journal of Education, 28(2), 131-147. https://doi.org/10.1080/02188790802036679

Ministerio de Educación/SIMCE/Cambridge ESOL Examinations (2004, September). Resultados nacionales del diagnóstico de inglés aplicado en el año 2004. https://bibliotecadigital.mineduc.cl/bitstream/handle/20.500.12365/17754/PPT040014.pdf?sequence=1&isAllowed=y

Ministerio de Educación (2021, August). Estándares de la profesión docente. Carreras de pedagogía en inglés. Educación básica y media. https://bibliotecadigital.mineduc.cl/bitstream/handle/20.500.12365/17599/ingles.pdf?sequence=1&isAllowed=y

Muthén, B., & Kaplan, D. (1985). A comparison of some methodologies for the factor analysis of non-normal Likert variables. British Journal of Mathematical and Statistical Psychology, 38, 171-189. https://doi.org/10.1111/j.2044-8317.1992.tb00975.x

Ockey, G. J., & Gokturk, N. (2019). Standardized language proficiency tests in higher education. In X. Gao (Ed.), Second handbook of English language teaching (pp. 1-17). Springer. https://doi.org/10.1007/978-3-319-58542-0_25-1

Pan, Y. C. (2009). Test impact: English certification exit requirements in Taiwan. TEFLIN Journal, 20(2), 119-139. http://dx.doi.org/10.15639/teflinjournal.v20i2/119-139

Pérez Andrade, G. (2019). Language ideologies in English language teaching: A multiple case study of teacher education programmes in Chile [Doctoral dissertation, University of Southampton]. University of Southampton Institutional Repository.

Rea, L., & Parker, A. (2014). Designing and conducting survey research: A comprehensive guide (4th ed.). San Francisco, CA: Jossey-Bass.

Rohman, K. A., Budiana, H., & Hartini, N. (2019). The students' perceptions on a test of English proficiency as graduation requirement. LLT Journal: A Journal on Language and Language Teaching, 22(2), 171-181. https://doi.org/10.24071/llt.2019.220204

Shohamy, E. G. (2015). The power of tests: A critical perspective on the uses of language tests. London: Routledge.

Struyven, K., Dochy, F., & Janssens, S. (2005). Students' perceptions about evaluation and assessment in higher education: A review. Assessment and Evaluation in Higher Education, 30(4), 325-341. https://doi.org/10.1080/02602930500099102

Suryaningsih, H. (2014). Students' perceptions of International English Language Testing System (IELTS) and Test of English as a Foreign Language (TOEFL) tests [Master's thesis, Indiana University of Pennsylvania]. Knowledge Repository.

Taylor, M. (2013). Student perceptions of standardised testing: Survey exploring the attitudes of university undergraduate students towards the traditional methods of student evaluation in formal education [Master's thesis, Lancaster University]. e-space, Manchester Metropolitan University's Research Repository. https://espace.mmu.ac.uk/576631/1/Taylor%20%28Matthew%29%202013%20%28Lancaster%29%20Quantitative%20doc.pdf

Tsai, Y., & Tsou, C. (2009). A standardized English language proficiency test as the graduation benchmark: Student perspectives on its application in higher education. Assessment in Education: Principles, Policy & Practice, 16(3), 319-330. https://doi.org/10.1080/09695940903319711

Vongpumivitch, V. (2012). Motivating lifelong learning of English? Test takers' perceptions of the success of the general English proficiency test. Language Assessment Quarterly, 9(1), 26-59. https://doi.org/10.1080/15434303.2011.627521

Warner, R. M. (2013). Applied statistics: From bivariate through multivariate techniques. Thousand Oaks, CA: Sage Publications.

Zheng, H. (2009). A review of research on EFL pre-service teachers' beliefs and practices. Journal of Cambridge Studies, 4(1), 73-81. https://doi.org/10.17863/CAM.1579

1This article is the result of an internal research project titled "Student Perceptions and Motivations towards the Advanced C1 Test", approved by the Scientific Ethics Committee (126-22) at Universidad San Sebastián.

Received: December 10, 2022; Accepted: June 25, 2023

This is an open-access article distributed under the terms of the Creative Commons Attribution License.