
ESL Undergraduate Learners’ View of Virtual Assessments During the Corona Virus Pandemic

Dr. Mohamed Hasan Al Kassem, English Language Education

Email: elkassim1972@yahoo.com, kassimmuhammad72@gmail.com

Abstract

The last quarter of 2019 marked the outbreak of the Corona Virus Pandemic, the disease that entirely transformed the face of the world. That transformation necessitated the shift from onsite, face-to-face learning to virtual learning. Because learning and assessment are profoundly interconnected and interdependent, instructors were obliged to conduct assessments virtually too. As a case study, this research aimed to investigate ESL undergraduate learners’ views of virtual assessment during the Corona Virus Pandemic. The research investigated thirty learners who were studying ESL education courses during their first undergraduate semester of the academic year 2021-2022 at a franchise private university in Tyre district, South Lebanon. The researcher used a Google Form questionnaire and virtual interviews as research tools. Data collected from the questionnaire and the interviews were organized and analyzed qualitatively. Findings showed that virtual assessments were partially effective owing to their contribution to evaluating learners’ achievements and distributing the ESL course materials properly, despite reports of challenges such as learners’ demotivation, connectivity disruptions, cheating, and plagiarism. The research ends with recommendations for further investigation.

Keywords:

Virtual Learning: Cojocariu et al. (2014) defined it as the type of learning “having in common the ability to use a computer connected to a network, that offers the possibility to learn from anywhere, anytime, in any rhythm, with any means”.

Virtual Assessment (hereafter referred to as VA): Dixon & Worrell (2016) defined it as any means of evaluating learners’ achievements, providing feedback, or improving learners’ proficiency level in “fully online credit courses”.

Article Type: Case Study

  1. Introduction

Virtual learning has mandated virtual assessments and assessment tools. Recently, exams, quizzes, presentations, written work, etc. have been turned digital, and many educational institutions around the world have relied on them for conducting diagnostic, formative, and summative assessments. Various research studies have shown the significant role technological devices play in delivering learning instruction and conducting assessments. Studies like those conducted by Johnston (2004), Baleni (2015), and Ebrahimzadeh & Alavi (2017) were concerned with the impact of technology on facilitating assessments, while others like Silviyanti (2014) and Indrayana & Sadikin (2020) stressed online methods of delivering instruction, including assessments, primarily in higher education. Currently, VA refers to the type of evaluation administered via any technological device or international network. Research studies have pointed to it as an alternative to the conventional pen-and-paper method. In their discussion supporting VA, Khairil & Mokshein (2018) claimed that it can provide “direct feedback and scoring, practices and effective time”. They also recommended it as a method that reduces logistical expenses such as photocopying and answer sheets. There has also been discussion of the variety of assessment tools that instructors can rely on, such as Google Forms, quizzes, essays, etc., which have left remarkable impacts on the educational process, primarily because they enable instructors to evaluate learners even though they are not in the classroom. In this respect, most higher education institutions employed special platforms to deliver virtual learning and VA, such as Moodle, Google Meet, Microsoft Teams, Blackboard, etc., with both synchronous and asynchronous features.

Lebanese private universities had to deliver instruction and assessments to undergraduate learners during the Corona Virus Pandemic. Both instructors and learners had to rely on these platforms despite limited experience with their practices, tools, delivery, monitoring, and implementation, and they encountered various challenges. In the university where this research was conducted, instructors utilized Moodle to conduct assessments in various modes such as multiple-choice, fill-in-the-blanks, matching, and open-ended questions. Throughout the assessment periods, ESL instructors of the undergraduate program assessed learners via these modes. Many reported them as a success for enabling them to recognize learners’ improvement and their ability or inability to achieve the delivered online learning objectives. Others complained about facing several impediments like connectivity, proctoring, and validity. As for learners, the whole issue of virtual learning was novel. All of them came from the same socio-economic background and almost the same public and private schools, which had limited experience in virtual learning and assessment as well. Thus, the issue of VA was novel too, and their experience with it was very limited.

This case study was piloted in the same private university to investigate the views of ESL undergraduate learners of the VA approach implemented during the frequent lockdown periods caused by the Corona Virus Pandemic. The researcher hoped that the findings and results could help explore the views of this sample towards VA, so that recommendations could be proposed to instructors, assessors, and program designers.

Review of the Related Literature

This review is introduced to offer ESL instructors pedagogical assistance by exploring relevant literature and studies’ findings.

  1. VA Overview

Dixon & Worrell (2016) defined VA as an online procedure to evaluate learners’ accomplishment, provide them with feedback, and improve their performance in any assigned course. They clarified that these assessments could be conducted totally online, as online exams or online submissions. Similarly, Robles & Braathen (2004) defined VA as a method to measure learners’ progress online while preserving the basics of assessment. For this purpose, they confirmed that assessors should adapt assessment tasks so as to offer learners effective feedback, achieve accountability, and validate quality. In addition, they viewed VA as a method because it involves evaluating many constituents as well as measuring learners’ academic accomplishment.

  2. VA Types

As in conventional assessments, formative and summative assessments are the two fundamental types of VA. Colman (2021) argued that a formative assessment can be adopted for the purpose of deciding “how well a student is learning the material”, and showed that a formative assessment is more effective if it is “ongoing, consistent, and provides critical feedback to learners”. She spoke about performing summative assessments in online learning, showing that such assessments take the form of final exams conducted to evaluate learners’ achievement at the end of a course. She also illustrated various methods through which learners can be evaluated online, such as online quizzes, essay questions, drag-and-drop tasks, interviews, etc. In illustrating the best way to utilize these methods, she showed that the choice primarily depends on the learning needs and objectives. To her, an online quiz is the more suitable assessment tool if the purpose is to provide a fast check of the learners’ understanding; however, if the purpose is to evaluate learners’ interviewing abilities, a dialogue simulation will be more applicable. Whatever type assessors adopt, VA should have the same academic consistency as face-to-face assessment; it should “align with course and program learning outcomes, provide valuable learning opportunities for students, and have a level of excellence for students to work toward” (Vlachopoulos, 2016).

  3. VA Tools

Prior to the Covid-19 Pandemic, various assessment tools were provided by educationalists for formative and summative assessments. Hunt et al. (2007) spoke about using online devices such as mobiles, tablets, and computers. One familiar tool is Google Forms, a free online tool that permits assessors to generate forms, surveys, quizzes, etc., with the aim of evaluating learners’ prior knowledge, setting goals, and collecting data. According to Keeler (2015), it helps assessors get instant and simultaneous answers and offers diverse features such as multiple versions, question banks, setting of time limits, etc. Another tool is ProProfs Quiz, a wide-ranging online quiz tool that enables any user to generate, share, and score VAs. Capterra (2019) conveyed that ProProfs Quiz is an interactive quiz favored by “educators, trainers, educational institutions, and business”. Socrative is also a free VA tool. It enables users to generate tests and download others from the web. Via Socrative, assessors can merge formative and summative assessments in snapshot polls and/or polls, in addition to grading learners’ work automatically, consuming less time and effort. This tool also enables assessors to evaluate students’ learning in real time via surveys, polls, quizzes, etc. Moodle is another VA tool: an open-source Course Management System (CMS), also known as an LMS or virtual learning environment (Teachtaught.com, 2015). It is a VA approach that affords ongoing feedback on learners’ achievement. According to Padayachee et al. (2018), Moodle comprises the capability to manage teaching, students’ learning, and all sides of assessment.
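To make the workflow concrete, quiz items for a platform like Moodle can be authored as plain text and imported in bulk through its question bank. The short Python sketch below writes two invented multiple-choice items in Moodle’s GIFT import format; the questions, options, and file name are hypothetical and serve only as an illustration.

```python
# Minimal sketch: generating invented ESL quiz items in Moodle's GIFT
# import format. GIFT marks the correct option with "=" and each
# distractor with "~"; a blank line separates items.

items = [
    ("Q1", "She ___ to school every day.", "goes", ["go", "going", "gone"]),
    ("Q2", "Choose the correct plural of 'child'.", "children", ["childs", "childes"]),
]

def to_gift(title, stem, correct, distractors):
    options = "=" + correct + " " + " ".join("~" + d for d in distractors)
    return f"::{title}:: {stem} {{{options}}}"

with open("esl_quiz.gift", "w", encoding="utf-8") as f:
    for item in items:
        f.write(to_gift(*item) + "\n\n")
```

Once a file like this is imported, the platform itself handles delivery, option shuffling, and automatic scoring.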

  4. VA Practices

The pandemic’s frequent lockdowns mandated VA. To most ESL assessors, it was introduced as a new trend in the educational domain, and they practiced it differently based on various academic, social, and technical factors. Oncu & Cakir (2011) clarified that many VA practices were hard to implement due to the absence of face-to-face contact with learners. The literature has introduced scores of these practices to virtual pedagogy. One of these tools is polls. As indicated by Ozcan-Deniz (2017), polls are online tools used to collect data on learners’ perceptions of online content delivery at the beginning, middle, or end of the semester, with the possibility of using the same questions for comparison purposes; this permits assessors to identify the areas where learners have achieved and the areas that demand improvement and attention. Another form of VA practice is discussion boards. Discussion boards are conversation initiators with easy accessibility, control, and recording. Assessors can easily score them with a grading rubric and follow up with learners who require ongoing attention and awareness of what and when to post. A third practice tool for VA is quizzes. Most assessors use quizzes for formative assessment purposes. As ongoing assessment tools, quizzes provide instructors with clear data on learners’ progress. Online quizzes can also be delivered in video-lecture form: when recording, instructors can use the PowerPoint record option and insert the quiz questions between the slides. Many educational institutions around the world utilize projects in both face-to-face and online classrooms. In online classroom assessment, assessors use projects to give learners the opportunity to collaborate. In online classrooms, learners do not know each other well, so interaction and collaboration are not naturally enhanced; online projects nevertheless allow learners to communicate, though in a different way, and they can use other tools while working on projects. For instance, some educators advised students to keep their finalized work in Google Drive and share files with others. In addition, they can take part in conferences. Because projects typically use real case studies, research indicates that they can enhance learners’ critical thinking as well as enabling learners’ responses to inform the virtual classroom’s teaching strategies. Assessors are advised to adopt specific rubrics to produce an effective evaluation of these projects.

  5. VA Benefits

The implementation of VA offers various benefits on all sides. Gaytan & McEwen (2007) believe in the multiple values that VA has, noting that it enables assessors to provide accurate test results that learners can instantly access in an electronic gradebook. They also claim that VA enhances a learner-centered environment, decreases economic costs, and contributes to providing immediate feedback. As shown by Seifert & Feliks (2018), conducting assessments online is of economic worth: it saves paper, printing, photocopying, and energy consumption. Similarly, Khairil and Mokshein (2018) indicate that VA has both economic and ecological benefits. They argued that “paperless” tests are environmentally friendly and cost less, because the time and materials utilized are devoted to knowledge acquisition while analysis is performed automatically, which reduces administrative work. Seifert & Feliks (2018) explain that VA saves time and reduces the exclusiveness of the classroom as a setting for assessment: VA is accessible anywhere, anytime, and, most significantly, via any device. Seifert & Feliks (2018) also reveal that through VA assessors can provide valuable and immediate feedback as well as computerized grading and reporting systems, which allows assessors or assessment administrators to enhance results, give feedback, process grades, recognize learners’ improvement, select questions, etc. just by pressing a button. Similarly, Khairil and Mokshein (2018) introduce VA as a good approach that affords automatic correction and grading: assessors can generate any assessment using online platforms that correct learners’ answers automatically and provide scores instantly. All these options have high accuracy and, most importantly, are free of charge. Also, Khairil and Mokshein (2018) argue that VA enables assessors to provide oral and written feedback; assessors have the chance to provide quality, detailed feedback that learners can take advantage of instantly. They show that VA questioning, grading systems, and measurement tools are reliable and valid, leaving no place for errors or unfair results among any test or quiz candidates. Khairil and Mokshein (2018) applaud VA for its practicality: it can be performed at any time and any place based on the assessor’s choice and the surrounding circumstances, with considerable flexibility in the learning pace. Part of VA’s practicality lies in the capability of the technological device to handle much more work than is possible via printed materials. Finally, Khairil and Mokshein (2018) believe that VA can generate distinctive situations that leave an impact on learners’ enthusiasm and attitude, agreeing that VA is “more unique, fun and absolutely meets the demands with what needed in 21st century”, where “successful students are influenced by individual differences in motivation and achievement”.
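At their core, the automatic correction and instant scoring described above reduce to comparing a learner’s submission against an answer key. The following Python sketch illustrates that logic under simple assumptions; the question IDs, answers, and percentage scale are invented for illustration and do not reflect any particular platform’s implementation.

```python
# Minimal sketch of auto-scoring: compare a submission to an answer key
# and return an instant percentage score with per-item feedback.
ANSWER_KEY = {"q1": "goes", "q2": "children", "q3": "b"}

def score_submission(submission):
    feedback = {}
    correct = 0
    for qid, expected in ANSWER_KEY.items():
        given = submission.get(qid, "").strip().lower()
        if given == expected:
            correct += 1
            feedback[qid] = "correct"
        else:
            feedback[qid] = f"incorrect (expected '{expected}')"
    score = 100 * correct / len(ANSWER_KEY)
    return score, feedback

score, notes = score_submission({"q1": "goes", "q2": "childs", "q3": "B"})
print(f"Score: {score:.1f}%")  # -> Score: 66.7%
print(notes)
```

Real platforms layer timing, shuffling, partial credit, and manual review on top of this basic comparison, which is why open-ended responses often still require instructor feedback.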

  6. VA Challenges

Like all types and forms of assessment, VA encounters diverse challenges. Hricko & Howell (2006) identified several challenges that learners and assessors might face while being exposed to any VA. One challenge, according to Hricko & Howell (2006), is that relying on VA necessitates awareness of certain technological skills, applications, programs, etc., such as typing and managing multiple screens. They add that some assessors and learners cannot endure reading passages on a computer or tablet screen, mainly long ones, which may lead to fatigue. They also report that some learners are unable to see the entire passage, question, or exam through the screen “because some items require[s] scrolling horizontally and vertically to get an entire graphic on the page”. Laitusis (2020) highlights the variety of educational environments as a VA challenge: as instruction adapts to the new virtual context, assessment should do so too. Laitusis (2020) shows that parents and their views are excluded when considering VA, although they should be “called upon to provide appropriate support and supervision for their children”, and believes that parents require guidance and resources to offer settings “conducive to learning”. Another challenge introduced by Laitusis (2020) is that VA is unfair to learners with certain disabilities. VA evidently does not take into consideration learners with intellectual or learning challenges or visual problems; accommodations such as test-time extension are excluded, and these learners are not aided with any individualized support. Furthermore, Laitusis (2020) confirms that VAs were unfair since many learners did not have reliable internet or technology access: it has been revealed that around 30% of online learning children do not have adequate access to the internet.

  7. VA Recent Research Review

During the last two decades, researchers have conducted numerous studies to investigate VA efficacy in education in general and in the ESL context in particular. Such research has covered countless issues related to VA, such as effectiveness, challenges, perspectives, advantages, principles, and practices. This research reviewed the most recent studies to attain a comprehensive perception of the fundamental inferences in the field of virtual assessment.

To start with, Ozden et al. (2004) conducted a descriptive study to investigate learners’ perceptions of utilizing computer-assisted assessments and to examine the possibility of relying on learners’ reviews to validate assessments. For this purpose, the researchers generated a website for implementing assessments at Kocaeli University, Turkey. As a qualitative study, the researchers used surveys and interviews to collect data from the participants, who were third-year students. Results showed that “instant feedback, randomized question order, item analysis of questions, and obtaining scores immediately” were the most significant features of VA. Analysis also indicated that despite the presence of many challenges and the dire need for improvements, participants perceived VA as effective and accepted it, mainly computer-friendly learners. In a different context, Birch & Volkov (2007) surveyed University of Southern Queensland learners’ perceptions of utilizing online discussion boards as assessment tools. This qualitative study was conducted on 70 ESL and EFL students via an electronic survey. Respondents reported their perceptions regarding “a compulsory assessment item”, including contributions to online assessment. Findings indicated that most participants “enjoyed the assessment item” and agreed that VA helped them achieve some social and cognitive learning outcomes. Other findings showed that participants viewed VA as beneficial because they were able to share experience with peers and “reduce the feelings of isolation”. In 2014, Johnson & Palmer, in their study at an undergraduate state university, examined whether linguistics is more suitable for face-to-face than online contexts, surveying assessment scores and participants’ perceptions of the efficacy of an introductory linguistics course. In analyzing the survey’s outputs, they concluded that linguistics, and perhaps other courses, are appropriate for both online and face-to-face contexts; they also showed that participants viewed the linguistic content and assessment as effective. Later, Brown & Lally (2018), in a joint international project between two higher education institutions in Finland and Ireland, investigated issues regarding online assessments. The research relied on surveys and interviews as data collection tools, with participants from the two institutions placed in various discussion interviews. Findings revealed that many participants showed low confidence and low awareness of ongoing assessment and encountered challenges when they conducted online assessments. Other results revealed that participants’ “perceptions of efforts and reward” differed from those of their instructors. On the other hand, Hussain et al. (2020) piloted a mixed-methods study to examine learners’ perceptions of online assessment during the first wave of the Corona Virus Pandemic. The participants were 302 learners in the UAE during their second semester. The research adopted both qualitative and quantitative tools to collect data. The analysis made evident that learners with high GPAs were less contented with online assessment; the researchers noted an inverse correlation between learners’ GPAs and their degree of satisfaction, as well as learners’ preference for combining online oral exams with on-campus testing. In the same year, though in a different context, Adanir et al. (2020) conducted a mixed-methods study that included 370 undergraduate participants from Turkey and Kyrgyzstan who were taking online courses for the first year. The aim was to study learners’ perceptions of online assessment. The study utilized a survey to collect qualitative data on learners’ perceptions, while quantitative tools were used for analysis. Findings revealed that learners’ perceptions vary according to many factors like gender, major, and experience in online learning. Analysis also indicated that Turkish learners’ perceptions differed from those of Kyrgyz learners: Turkish learners viewed online assessments as less stressful, more reliable, and fairer than conventional assessments. Recently, Meccawi et al. (2021) piloted a cross-sectional descriptive mixed-methods study to investigate both learners’ and instructors’ perceptions of online assessment. The participants were 547 undergraduate students and 213 instructors at King Abdul-Aziz University, KSA. To collect qualitative data, the researchers prepared two different questionnaires for learners and instructors. Major findings highlighted participants’ complaints about various problems such as cheating and plagiarism, in addition to learners’ low awareness of exam ethics. The last study in this review was conducted in 2021 by Yulianto & Mujtahin, who implemented a case study to explore instructors’ perceptions and practices of online assessment during the Corona Virus Pandemic. The researchers utilized an open-ended questionnaire and online interviews with participants who were instructors in the ELT context. This case study’s findings reflected instructors’ negative attitudes towards online assessment because they perceived various impediments such as connectivity, validity, and learners’ lack of motivation. Yet, results showed that online assessment contributed to distributing materials and testing learners’ accomplishments.

To sum up, the research studies conducted in the last two decades have investigated the implementation of online assessment with special emphasis on the employed strategies and learners’ attitudes. Some disclosed promising results, as presented in the studies of Ozden et al. (2004), Birch & Volkov (2007), Johnson & Palmer (2014), and Adanir et al. (2020), who highlighted the effectiveness and usefulness of online assessment as an approach that could replace conventional methods. However, others, such as those conducted by Brown & Lally (2018), Meccawi et al. (2021), and Yulianto & Mujtahin (2021), reported negative attitudes and the presence of challenges that limited the significance of online assessment, which necessitates further investigation and research. The current qualitative case study intends to provide further investigation to ascertain whether online assessment is viewed positively or negatively by ESL undergraduate learners. The researcher anticipates that this study could help fill the gap, primarily in research on undergraduate language instruction, and provide ESL academics and assessors with beneficial recommendations.

Methodology
  1. Research Method

This study adopted the case study methodology. The researcher found the qualitative case study approach appropriate for this research because it served its purposes well. As indicated by Bonney (2015), a case study is an approach founded on the description of “a real or hypothetical situation that requires a solution or action”. On the other hand, Duff (2014) argues that the case study methodology enables researchers to study a case profoundly, so that they can attain “understanding of individuals’ experiences”. Duff (2014) also shows that this approach paves the way for a comprehensive understanding of the case’s perception, improvement, or achievement within a certain “linguistic, social, or educational context”. These motives led the researcher to adopt this methodology in this research.

  2. Research Context

This case study was conducted at a franchise private university in Tyre district, South Lebanon, where English is used as the first foreign language of instruction in most majors. This private university adopts an international English language curriculum designed primarily for learners studying English as a second language, with placement based on a validated test that determines learners’ proficiency levels. The study took place during the academic year 2021-2022 and lasted 15 weeks.

 

  3. Research Instruments

As a qualitative case study, this research utilized two tools to collect data. The first was a Google Form questionnaire to investigate participants’ insights into the implemented online assessments. The second, adopted to verify participants’ responses and probe their insights more deeply, was a structured interview in which the interviewer followed a formalized list of questions. The researcher took advantage of the kinds of English course assessments conducted throughout the semester to generate questions and conversation starters that enhanced mutual communication. Each interview lasted around 15 minutes, during which participants shared their experiences and reflected on their attitudes towards virtual assessment.

 

  4. Research Participants

The research participants were thirty students studying ESL as a foundation course during their second undergraduate semester at a franchise private university in Tyre district, South Lebanon. The participants’ ages ranged from 18 to 21. Nineteen participants were female and eleven were male; however, gender was not regarded as a variable in this case study. All participants had studied English as a second language since the primary educational stage in public and private schools, and all came from the same socio-economic background.

  5. Research Analysis and Discussion

During the investigation period, the researcher had participants complete an online Google Form questionnaire to collect data on their insights into the implementation of online assessment in the undergraduate foundation ESL classrooms. In addition, interviews were conducted with participants to collect further data and involve them in deeper discussion, so as to reach more accurate accounts of their perceptions.

  6. Data Collected from the Questionnaire

The responses to the Google Form questionnaire that participants completed online were collected and grouped (see the attached table).

The statistics conveyed by the questionnaire clearly indicate participants’ partial contentment with online assessment. It is evident that many participants found online assessment practical and beneficial on various levels. Regarding accessibility, around half of the participants (46%) agreed that online assessments were easily accessible and 14.25% strongly agreed with this statement, while only 24% disagreed and 15.25% strongly disagreed. Most participants reported that the implemented online assessments served the same purposes as conventional, paper-based assessments: 40.25% agreed with this assertion and 45% strongly agreed, while only 11.25% disagreed and 3.5% strongly disagreed. In addition, a considerable number of participants regarded online assessment as capable of providing the same results as the conventional methods: 39% of them agreed and 47% strongly agreed with this point, while 10.5% disagreed and 3.5% strongly disagreed. Furthermore, 21.5% agreed and 76.25% strongly agreed with the statement that online assessments provide error-free scores, while only 1.25% disagreed and 1% strongly disagreed. There was near consensus on the idea that online assessments always provide questions with clear and specific instructions: 66.25% strongly agreed and 26.25% agreed with this point, while only 6.25% disagreed and 1.25% strongly disagreed. Moreover, most participants admitted that correction and results were fast, as the percentages show: 11% agreed and 87% strongly agreed, with only 2% who disagreed.

On the other hand, many participants criticized online assessments due to the various difficulties learners faced. For instance, many participants rejected the idea that this new form of assessment was fair to all learners (37.25% disagreed and 27% strongly disagreed, while only 21% agreed and 14.25% strongly agreed). Many participants also criticized online assessments for being time-consuming: only 17.25% agreed and 4.25% strongly agreed with the statement that online assessments consumed less time than conventional assessments, while 63.5% disagreed and 15% strongly disagreed. For the statement that online assessments required more preparation than conventional assessments, only 7% agreed and 3.75% strongly agreed, while 8.25% disagreed and 81% strongly disagreed. In response to the statement that instructors’ online feedback was clear and comprehensive, only 10% of respondents agreed and 18% strongly agreed, while about 53.75% disagreed and 18.25% strongly disagreed. Also, only 30.5% agreed and 26.25% strongly agreed with the idea that online assessment feedback was helpful, whereas 39.5% disagreed and 4.25% strongly disagreed. As for improving learners’ abilities, only 26% agreed and 9% strongly agreed, against 52.5% who disagreed and 12.5% who strongly disagreed. About 13.75% of learners agreed and 83.25% strongly agreed that online assessments paved the way for cheating and evading proctors’ monitoring, while only 2% disagreed and 1% strongly disagreed. Regarding network connectivity and power supplies, 23.25% agreed and 71.25% strongly agreed that these were always a problem in online assessments, whereas 4.25% disagreed and 1.25% strongly disagreed. On the question about motivation, only 13% agreed and 4% strongly agreed that online assessments increased their motivation to learn, whereas 61.5% disagreed and 21.25% strongly disagreed.
The most important item in the questionnaire concerned repeating the same experience with online assessment: only 9.1% agreed and 15% strongly agreed that they felt interested in being assessed online the next semester, 32% were undecided, while 43.5% disagreed and 0.4% strongly disagreed.
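For readers who wish to reproduce this kind of tabulation from a Google Form export, the short Python sketch below turns a list of Likert-scale responses into percentages of the kind reported above; the response labels and sample answers are invented for illustration, not the study’s data.

```python
# Minimal sketch: tabulating Likert-scale responses into percentages.
# The sample responses below are invented, not the study's data.
from collections import Counter

LIKERT = ["Strongly agree", "Agree", "Disagree", "Strongly disagree"]

responses = ["Agree", "Agree", "Strongly agree", "Disagree",
             "Agree", "Strongly disagree", "Agree", "Disagree"]

counts = Counter(responses)
total = len(responses)
for label in LIKERT:
    pct = 100 * counts.get(label, 0) / total
    print(f"{label:>17}: {pct:5.2f}%")
```

With a real export, the same loop would run per questionnaire item, one response column at a time.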

  7. Data Collected from the Interviews

 

Data extracted from the interviews showed various attitudes towards the implementation of online assessment. These can be categorized into four points.

  1. Participants’ Insights

All participants applauded the fact that online assessments enabled instructors to evaluate them and to grade the conducted formative and summative assessments. They also considered them “a new trend” that involved learners in “new learning experiences”. In addition, participants revealed that the online assessments helped them take their exams even under the Corona Virus Pandemic’s strict lockdown periods, which was better than postponing exams for months or to the next semester or year. They reflected a positive attitude towards the idea that conducting assessments online contributed to keeping social distancing and avoiding the risk of infection. However, various responses reflected many negative attitudes and complaints. It was clear that both assessors and learners encountered electricity failures while taking the assessments: the Corona Virus Pandemic lockdowns coincided with an economic crisis in Lebanon during which the Lebanese faced long periods of electricity failure. Many of the interviewees confirmed having “no electricity at the time of the exam”. This problem coincided with connection disruption and instability. Most learners reported the inability to access the online assessments on time, or even being disconnected while the assessment was in progress. These “put us down”, as one interviewee replied. Another interviewee replied with an angry tone, “I prefer exposing myself to Corona Virus than making an exam online”. Another showed that throughout the whole semester she did not complete one exam without electricity or connectivity problems. The participants’ negative attitude also had roots related to experience. Almost all the interviewees confirmed that it was their first experience with online assessment. One participant said, “everything was new to me even typing my answers using the keyboard was new”. Participants showed a negative attitude towards online assessments for being deprived of the chance to have appropriate training on using the platforms, such as Moodle, and reported their lack of skills in these platforms. Many argued that they were against being tested via the platforms, and the majority confirmed that they had received only brief information, one day before the exam, about the required skills such as logging in and out, inserting answers, moving from one page to another, making modifications, etc. Some others explained their negative insights through the stressful situations and worry that online assessments put them in. There was worry about “losing all answers”, “fail to submit answers”, or “the instructor will not find my answers”. These kept learners “anxious all the time”, as many confirmed. These were not the only issues behind the participants’ negative attitude. Many reported the instructors’ lack of experience in setting online tests or quizzes regarding timing, types of questions, and number of question sets. The majority of participants indicated that their instructors should have considered the time issue while preparing the exams and should have balanced multiple-choice questions, fill-in-the-blanks, and open-ended questions, since each type has its own conditions, grades, and allotted time; many instructors disregarded these facts, students reported. As a result, participants encountered many issues that affected their acceptance of online assessments, though there was confirmation of their worth, primarily in evaluating learners’ achievements and grading their work.

  2. Learners’ Motivation

During the pandemic, online assessment was proposed as the only available solution to evaluate learners’ achievements and work, and many academics proposed varying instructional and assessment methods and tools to enhance learners’ motivation. As a new trend in the private university where this study was conducted, online assessment failed to serve this purpose. The many problems that learners encountered, such as electricity failure, connectivity, and lack of experience, made them less enthusiastic. For instance, being unable to access the assigned test or quiz on time demotivated many learners. One of the interviewed participants reported that her peers were “stressful all the time” and “got panicked the moment they heard about any upcoming evaluation”. They attributed this state to anticipating an “endless list of technical problems”. Many reported their worry about submitting assignments and tasks on time because some of them had “only one laptop at home where there were many children waiting to have their work done online”. This issue, as many participants confirmed, was also “applicable on online learning not only on online assessment”. To enhance learners’ motivation, some instructors were flexible regarding timing. For instance, many participants admitted that instructors extended task or test submission times to relieve the stress and worry students encountered, so that they became more enthusiastic. Yet many of the interviewees showed that even with the extension of assessment time limits, learners were overwhelmed and burdened with the same worries. As a measure taken by most educational institutions around the world, a pass/fail status was applied instead of the grading system normally adopted for each course before the pandemic. Even with this measure, learners were not motivated: a considerable number of the interviewees showed a lack of enthusiasm, and many said that they were still “afraid of the same result”, showing that “even with the pass or fail result, we fear of not passing the course in the end”. Moreover, after contacting some of the participants’ instructors, they confirmed adopting some strategies to motivate learners, such as involving them in preparing certain tasks or projects to be presented as parts of ongoing assessment. To a limited extent, this worked with some learners, who practiced well before performing online; some others, as instructors reported, recorded their presentations and sent them as asynchronous material. The rest of the learners remained in the same state of “enthusiasm deficiency”, as one interviewed participant called it. In reference to the above results reflecting the participants’ motivation, it can clearly be stated that learners were less motivated to have their assessments online due to the ongoing worry they encountered, despite the various measures taken in this respect by the university administration and instructors.

  3. Online Assessment Efficacy

Assessment is the procedure through which instructors or assessors gauge the information, understanding, and skills of learners. Assessment also plays a significant part in the learning process, and therefore in learning opportunities. As discussed by Davis (2010), an assessment should meet certain values or outcomes to attain its efficacy. Davis (2010) stressed that an assessment should provide learners with high-quality feedback information that contributes to self-correction. Many of the interviewed participants confirmed that the feedback they got from instructors was limited. Some reported that many of their answers, mainly those to open-ended questions, were only graded; instructors left no direct or indirect feedback. Hence, participants lost the opportunity to reach the stage of practicing self-correction. In addition, some of the participants noted that the conducted assessments, formative and summative, affected their learning proficiency positively, showing that these online assessments were supportive of their skills and understanding. However, many disagreed with this view, showing that the exams and assignments they had for evaluation were mere recitations of the information delivered during classroom instruction. Participant NF said that she “got so little from the assessments” and the only thing she gained was “only the grade”. Interviews also showed that the online assessments failed to encourage all learners to interact and engage in dialogue around learning. They lost the opportunity of feedback dialogue in the form of peer feedback or instructor feedback. Many claimed that the assessments were not proactive and lacked the needed interaction among learners. Many interviewees admitted that they “got instructions, studied the learnt materials, sat for an exam, and got the results. Nothing else was done for us”. This shows that assessment went off its course; it should have involved learners in interaction and dialogue. Ten of the interviewees stated that online assessments involved them in decision making about the assessment strategies and practices, showing that in some of the quizzes instructors asked them about their preference for multiple-choice questions, filling in the blanks, or essay answers. However, more than twenty participants had no role in the instructors’ choice of the assessment strategy, and four of them had no idea about the instructor’s intended plan. They showed that being informed about the assessment policy would have contributed to better results; many spoke about this point, confirming that they had lost the opportunity, whereas “during the conventional paper-based assessments everything was so clear”. As indicated by Davis (2010), an assessment is expected to encourage positive motivational beliefs and self-esteem. According to data collected from the interviews, no participant had a positive response when asked about this point. All participants complained about their inability to feel enthusiastic about the performed assessments, showing that the impact of these assessments was very limited and that they failed to enhance their motivation to learn. Furthermore, many responses from participants reported that the online assessments and assignments did not provide instructors and assessors with the appropriate information that could be utilized to improve their teaching strategies and practices.
On this point, one participant commented, “nothing used to be changed after each assessment, just some oral feedback and comments on our mistakes. In the next session, everything is done in the same way”. There were also negative responses to the question about the impact of online assessment in supporting the development of learning groups and learning communities, together with its inability to clarify what good performance means in relation to objectives, criteria, and standards, as shown in Davis (2010). Based on all the above, it was obvious that the efficacy of the conducted online assessments was quite limited.

  4. Online Assessment Evaluation

The worthiness of online assessments during the Corona Virus Pandemic was unquestionable, and there were many positive attitudes in the interviews that applauded them. Many of the interviewees stated that the assessments they took had an impact on the learning process in general. The conducted assessments, as many participants reported, enabled instructors to follow up on learners’ progress, especially through the formative assessments represented by the short quizzes and assignments done during the learning sessions. They contributed to the instructors’ ability to evaluate at any time and place and to track learners’ performance at another time (Seifert & Feliks, 2018). Participants also confirmed that the assessments contributed to providing a broad, though not accurate, image of their achievement of the course’s predetermined objectives. Still, there was confirmation that learners were kept aware of the seriousness of assessments and evaluations, with frequent announcements that students would get a pass or fail result when the semester terminated. However, the interviews revealed learners’ dissatisfaction with the assessments’ efficacy. Many participants disclosed the ineffectiveness of the measurements instructors carried out. Learners complained about the instructors’ inability to monitor all students during assessments, which led to inaccurate results and evaluations. Also, most participants complained about their inability to submit their finalized work on time due to the frequent internet connectivity problems. They even complained about their frequent failure to edit or revise their answers after submission due to the lack of experience and practice. During the interviews, most respondents reported problems in restoring their answers whenever they moved from one place to another in the assessment. Participant LM confirmed that he lost all his answers in the first two assessments he had, and in both cases the instructor intervened to help tackle the problem. At least fifteen participants, including LM, encountered similar problems in various assessments, “primarily in the first part of the semester”, and it was “due to the instructor’s cooperation and understanding that we passed them”.

Conclusions

To wrap up, this case study was piloted to explore undergraduate ESL learners’ views of online assessment during the Corona Virus Pandemic. A thorough investigation of the participants’ views clearly revealed that learners viewed online assessments as partially effective, disclosing that such assessments provided alternatives to the conventional paper-based assessments. However, most respondents focused on the challenges they encountered throughout the semester, which contributed to their holding negative attitudes towards online assessments in general. Specifically, findings revealed that instructors succeeded in utilizing various assessment approaches to evaluate and measure learners’ achievements and progress, such as quizzes, essay questions, and forum posts, as discussed by Colman (2021). Other tools were also used, such as Google Forms (Keeler, 2015), ProProfs Quiz (Capterra, 2019), Socrative, and Moodle, with several practices such as polls, discussion boards, quizzes, projects, and conferences. To a certain extent, the frequent implementation of these enabled both assessors and instructors to achieve learning objectives. Findings of this research reflected this through the participants’ confirmation that instructors evaluated their work in various ways and provided scores with feedback to improve future work. There was confirmation from learners of the value of VA, such as doing assessments from home at any time or place without being tied to a fixed work schedule and setting. Many voices also applauded online assessments for enabling assessors to provide automatic scoring and feedback with reliable and valid measurements. These findings agreed with Ozden et al. (2004), Birch & Volkov (2007), and Johnson & Palmer (2014), whose research reflected positive perceptions from instructors and learners towards VA. Yet the same findings revealed many opponents of the idea of online assessment efficacy due to the frequent challenges they encountered. For many participants, online assessments were unfair to all learners due to the numerous logistical and technical problems they frequently had. They also complained that online assessments consumed more time than conventional ones because most learners, instructors, and assessors lacked experience in utilizing “this new trend”, as some participants in the interviews liked to call it. Furthermore, a considerable percentage of participants believed that doing assessments online required more preparation, and many showed that the feedback provided by instructors was unclear, incomprehensible, and unhelpful. They also showed that the assessments went off their course and did not serve their purpose since they did not improve the learners’ proficiency levels, as most participants revealed in the questionnaire. The most striking finding was that most participants did not favor the experience with online assessment; they did not get motivated, and they refused to be involved again in a similar experience. These findings agree with Adanir et al. (2020), Meccawi et al. (2021), and Yulianto & Mujtahin (2021), who showed the limited efficiency of online assessments and the negative attitudes both instructors and learners held.
As a result, this study showed that conducting online assessments during the Corona Virus Pandemic was not completely efficient for learners or instructors, who encountered numerous challenges that should be considered by academics and assessors in conducting any future online assessments. It is worth acknowledging that these results could have been different if the sample had been more representative: the research was conducted in a limited area and encompassed only thirty ESL undergraduate learners from one private university in Tyre district, South Lebanon, so the findings rest solely on the perceptions of this limited group. For further research, therefore, the researcher recommends increasing the number of participants, involving instructors, and covering more universities to make the sample more representative. It is also recommended to utilize other research instruments, such as observations, to achieve more validated results.

References

Adanir, G. A., et al. (2020). Learners’ Perceptions of Online Exams: A Comparative Study in Turkey and Kyrgyzstan. International Review of Research in Open and Distributed Learning, 21(3). http://www.irrodl.org/index.php/irrodl/article/view/4679/5331

Bahar, M., & Asil, M. (2018). Attitude towards e-assessment: influence of gender, computer usage and level of education. Open Learning: The Journal of Open, Distance and e-Learning, 00(00), 1–17. Retrieved from https://doi.org/10.1080/02680513.2018.150352.

Baleni, Z. G. (2015). Online formative assessment in higher education: Its pros and cons. Electronic Journal of e-Learning, 13, 228-236.

Birch, D., & Volkov, M. (2007). Assessment of online reflections: Engaging English second language (ESL) students. Australasian Journal of Educational Technology, 23(3), 291-306.

Bonney, K. M. (2015). Case study teaching method improves student performance and perceptions of learning gains. Journal of Microbiology & Biology Education, 16(1), 21–28. https://www.ncbi.nlm.nih.gov/pmc/articles/PMC4416499/

Brown, K., & Lally, V. (2018). Rhetorical relationships with students: A higher education case study of perceptions of online assessment in mathematics. Research in Comparative and International Education, 13(1), 7-26. doi:10.1177/1745499918761938

Cojocariu, V.-M., Lazar, I., Nedeff, V., Lazar, G. (2014). SWOT analysis of e-learning educational services from the perspective of their beneficiaries. Procedia-Social and Behavioral Sciences, 116, 1999–2003.

Colman, H. (2021). 9 Assessment Methods for Using Online Learning (ispringsolutions.com). https://www.ispringsolutions.com/blog/8-ways-to-assess-online-student-learning

Davis, S. (2010). Effective Assessment in a Digital Age: A guide to technology-enhanced assessment and feedback. http://survey.jisc.ac.uk/digiassess

Duff, P. (2014). Case Study Research on Language Learning and Use. Annual Review of Applied Linguistics, 34, 233-255. doi:10.1017/S0267190514000051

Dixon, D.D., & Worrell, F.C. (2016). Formative and summative assessment in the classroom. Theory Into Practice, 55(2). doi-org.ezproxy.lib.ucalgary.ca/10.1080/00405841.2016.1148989

Duncan & Cohen (2011). Exploring Learner-Centered Assessment: A Cross-Disciplinary Approach. International Journal of Teaching and Learning in Higher Education, 23(2), 247.

Ebrahimzadeh, M., & Alavi, S. (2017). The effect of digital video games on EFL students’ language learning motivation. Teaching English with Technology, 17(2), 87-112.

Gaytan, J., & McEwen, B. C. (2007). Effective Online Instructional and Assessment Strategies. American Journal of Distance Education, 21(3), 117–132. https://doi.org/10.1080/08923640701341653

Hunt, M., et al. (2007). The use of ICT in the assessment of modern languages: The English context and European viewpoints, 37–41.

Hricko, M., & Howell, S. L. (2006). Online Assessment and Measurement. United States of America: Information Science Publishing.

Hussain, T., et al. (2020). Students’ Perception of Online Assessment During the COVID-19 Pandemic: The Case of Undergraduate Students in the UAE. 21st International Arab Conference on Information Technology (ACIT), 2020, pp. 1-6. doi: 10.1109/ACIT50332.2020.9300099

Indrayana, B., & Sadikin, A. (2020). Penerapan E-Learning Di Era Revolusi Industri 4. 0 Untuk Menekan Penyebaran Covid-19. 2(1), 46–55.

Inoue, M., & Pengnate, W. (2018). Belief in foreign language learning and satisfaction with using Google classroom to submit online homework of undergraduate students. 5th International Conference on Business and Industrial Research (ICBIR) (pp. 618-621). IEEE.

Johnson, David & Palmer, Chris C. (2014). Comparing Student Assessments and Perceptions of Online and Face-to-Face Versions of an Introductory Linguistics Course. https://citeseerx.ist.psu.edu/viewdoc/download?doi=10.1.1.892.8014&rep=rep1&type=pdf.

Johnston, T. C. (2004). Online homework assessments: Benefits and drawbacks to students. Academy of Educational Leadership Journal, 8(3), 29-40.

Khairil & Mokshein (2018). 21st Century Assessment: Online Assessment. International Journal of Academic Research in Business and Social Sciences, 8(1), 659–672. https://doi.org/10.6007/IJARBSS/v8-i1/3838

Koç, S., et al. (2015). Assessment in Online and Blended Learning Environments. USA: Information Age Publishing, Inc.

Kim, N., et al. (2008). Assessment in Online Distance Education: A Comparison of Three Online Programs at a University. 16.

Laitusis, V. (2020). Assessment Challenges in a Remote Learning Environment. https://www.hmhco.com/people/vytas-laitusis

Meccawi, Z., et al. (2021). Assessment in ‘survival mode’: Student and faculty perceptions of online assessment practices in HE during the Covid-19 pandemic. International Journal for Educational Integrity, 17:16.

Oncu, S., & Cakir, H. (2011). Research in online learning environments: Priorities and methodologies. Computers & Education, 57(1), 1098–1108. https://doi.org/10.1016/j.compedu.2010.12.009

Ozcan-Deniz, Gulbin. (2017). Best practices in assessment: a story of online course design and evaluation.

Ozden, M., et al. (2004). Students’ Perceptions of Online Assessment: A Case Study. 19.

Padayachee, P., et al. (2018). Online Assessment in Moodle: A Framework for Supporting Our Students. 32(5), 211–235.

Palloff, R. M., & Pratt, K. (2009). Assessing the Online Learner. United States of America: Jossey-Bass.

Robles & Braathen (2004). Online Assessment Techniques. Vol. XLIV, No. 1, 39–47.

Seifert, T., & Feliks, O. (2018). Online self-assessment and peer-assessment. Assessment & Evaluation in Higher Education, 1–17.

Silviyanti, T. M. (2014). Looking into EFL students’ perceptions in listening by using English movie videos on YouTube. Studies in English Language and Education, 1(1), 42-58.

Vlachopoulos, D. (2016). Assuring quality in e-learning course design: The roadmap. International Review of Research in Open and Distributed Learning, 17(6). https://doi.org/10.19173/irrodl.v17i6.2784

Yulianto, D., & Mujtahin, N. M. (2021). Online Assessment during Covid-19 Pandemic: EFL Teachers’ Perspectives and Their Practices. Journal of English Teaching, 7(2), 229-242. https://doi.org/10.33541/jet.v7i2.2770
