Part 4: What Tools or Resources Can We Use?

The Cost of Respect? Surprisingly Little

Joseph Kennedy and Albert Kagan

Online and distance learning classes have grown dramatically in the last decade. Flexibility, access, cost, class variety, and instructor engagement, as well as technological advances, have driven this expansion of online offerings, even as traditional class delivery across higher education decreases (Bailey et al., 2018; Snyder et al., 2019). During the fall 2020 semester, 72.8% of students enrolled at United States post-secondary institutions were taking distance education courses; a year earlier the figure was 37.2% (U.S. Department of Education [DoE], 2022; DoE, 2021). Much of the increase in online offerings is related to COVID-19; however, recent U.S.-wide survey data indicate that students’ desire for flexibility in course offerings is likely to lead many institutions to offer more online courses than they did pre-pandemic (Venable, 2022, p. 36).

As online and distance learning have grown, concerns regarding academic integrity have grown with them. Some concerns revolve around the lack of in-person contact and the apparent ease of cheating. Numerous studies have documented the occurrence of cheating in online classes in disciplines as diverse as business, engineering, and nursing (Daffin & Jones, 2018; Dendir & Maxwell, 2020; Dyer et al., 2020; Harton et al., 2019). Such results extend the pattern in students’ attitudes toward cheating in an Internet-connected world that King et al. (2009) identified when they found that almost three-quarters of business students at their institution believed cheating is easier in online courses. Together, these findings indicate that students do not feel “cheating” is “cheating” unless the instructor specifically defines what constitutes cheating. This attitude may be due to the prevalence of websites that provide quick answers to questions often asked on tests, the ubiquity of resources that eliminate the need for students to learn and memorize basic facts, or a shift in learners’ conceptions of what is truly worth remembering. Regardless of the cause, even before the COVID-19 pandemic there was a growing disconnect between instructors’ expectations of student behavior on assessments and students’ understanding of which cheating behaviors were unethical (Bruff, 2011; Dendir & Maxwell, 2020).

When the COVID-19 pandemic started to spread in the winter and spring terms of 2020, many higher education institutions adopted online tools rapidly. Often the hasty implementation was fraught with faculty/staff unpreparedness, technology deficiencies, limited course material availability, and student confusion. Coupled with the immediacy of the online transition were ongoing concerns regarding appropriate implementation of Universal Design for Learning (UDL) principles in online courses (Evmenova, 2018), integrity within the online model (Palmer et al., 2019), and a lack of clarity regarding the definition of academic integrity itself. Suryani and Sugeng (2019) reported that the academic integrity policies of many institutions were difficult to locate even pre-pandemic. Some schools that offered only a few courses online had poorly developed technology infrastructure, and many of those schools were concerned about the cost of purchasing new technology tools in a short time span.

Meanwhile, students challenged the new teaching methodologies; there was frustration regarding poor communication from faculty and institutions, confusion over differing proctoring methods, and considerable concern about student privacy. While some of these challenges could be addressed through clearer communication (Bozkurt et al., 2020), student concerns regarding privacy are pervasive, enduring, and multi-faceted. The student anger revealed in one editorial (The Editorial Board, 2021) goes beyond resentment of the invasion of privacy to argue that test proctoring software creates inequity. Other student voices contend that such technologies are not only insulting and anxiety-producing but also fail to enhance academic integrity (Poster, 2021).

Traditional residential liberal arts institutions that did not already support a robust online class presence were particularly challenged as the pandemic became pervasive in the spring of 2020. The required training of faculty and students, course material development, grading alterations, workload modifications, technology upgrades, and enforcement of academic integrity standards challenged not just institutional capacity and budgets but also instructional models. The integrity standards had to address class applications, written assignments, group activities, and exam/assessment methodologies in an online setting. In this context, the paradigm of respect for learning, the students, the class, the instructor, and the institution had to be balanced with the concern for assessment security and the potential intrusiveness of any toolkit adopted by an institution.

This paper discusses the implementation of online assessment mechanisms using pre-existing tools at a small liberal arts college in the upper Midwest region of the United States, precipitated by the shift to distance learning driven by the COVID-19 pandemic. A set of online processes has been in place since the spring term of 2020, with continual refinement of online delivery methods, learning management system (LMS) practices, faculty training, and student instruction within a model of academic integrity preservation. The attainment of mutual respect across the system regarding faculty goals and student needs is a guiding principle supporting the institutional mission during the transition to online class delivery.

The Pandemic’s Impact on Online Tool Usage: A Case Study

The authors’ institution is a residential liberal arts college located in the upper Midwest with an approximate enrollment of 2,000 undergraduate students. The institution emphasizes in-person instruction and had only begun incorporating some hybrid and online courses into the curriculum in the prior four years. During the spring semester of 2020, 19 of 598 courses had been planned as online or primarily online courses. The institutional use of educational technology tools to deliver performance assessments was relatively under-developed as well; fewer than one-third of courses taught in the fall of 2019 used quiz and test tools through the school’s LMS. At this time, approximately 75% of the faculty members had no experience teaching courses with online components, and approximately one-half had limited experience using online assessment tools. These figures are based on course offerings at the institution from 2017-2020 supplied by the registrar’s office, content available on the college LMS, and the semester-by-semester notes of one of the authors, who manages the LMS at the institution. This lack of familiarity with online course development is not isolated to this institution; multiple authors note limited online experience at many colleges in the spring of 2020 (Al-Freih, 2021; Cutri & Mena, 2020; Haslam et al., 2020; Johnson et al., 2020).

In late January 2020, this institution’s classroom technologies coordinator, a member of Information Technology Services (ITS), argued that ITS needed to begin preparing the institution’s infrastructure for the possible impact of COVID-19.  His prescient warning allowed the department to procure some hardware ahead of the international rush and to begin preparing ITS staff to deploy and explain new technologies and learning approaches to the campus community.

During February of 2020, instruction continued in person. Some faculty members began to inquire individually about tools and approaches for distance learning and sought advice from colleagues with prior experience teaching online courses. Meanwhile, the rapid advance of COVID-19 led to a turbulent and ever-shifting set of policies; as of March 11, 2020, the institution had announced it had no plans to switch to online/distance learning. Two days later, the institution’s president announced a new plan, which included a six-day “pause” in instruction, followed by a shift to fully remote learning. Given the institution’s emphasis on face-to-face instruction, the institution’s instructional designer felt this was an inadequate amount of time to prepare for online course continuation. This opinion was not unique to this college; other studies allude to the abruptness of the online conversion faced by many institutions (Dendir & Maxwell, 2020; Dyer et al., 2020).

Development of Procedures to Safeguard Academic Integrity

Although faculty members had some concerns regarding academic integrity, most focused on learning new teaching techniques and the tools necessary to deliver instruction in a fully online environment. Therefore, relatively few faculty members engaged in discussions regarding academic integrity approaches and safeguards. Those who did engage in such discussions with the instructional designer identified the following issues, per his contemporaneous notes:

  • Given the chaotic transition to online instruction, students would likely both feel increased pressure to violate academic integrity and have greater opportunities to do so.
  • The institution needed a system that provided confidence when a potential academic violation was identified, so that Type I errors would not occur; in other words, no student should be punished for an academic integrity violation that had not actually occurred.
  • It was important to avoid systems that were overly intrusive, which could lead to students feeling it was assumed they would cheat.
  • The faculty wished to implement systems that were transparent and effective at deterring academic integrity violations.
  • Systems that required students to use expensive equipment or tools would be inequitable, especially as the institution lacked sufficient funds to purchase hardware for each student.

Therefore, faculty members and administrators sought to balance effective deterrence and detection systems with respect for student motivation and privacy. Given the residential, face-to-face focus of the institution, faculty members needed to be acutely aware of the danger of appearing overly intrusive. Thus, an early decision was made to avoid eye-tracking and non-college human proctoring systems. The desired balance also had to preserve the academic rigor of the courses, which meant that some systems of control and accountability were considered.

Unfortunately, with only one week to implement a radically different instructional model, faculty had little time to consider a holistic, institutional-culture-based approach to academic integrity. While the institution generally followed principles such as those articulated by Kitahara et al. (2011), namely that the problem of academic integrity violations must be addressed and solved at the societal level, the timeframe was clearly insufficient to determine the impact of new tool adoption on the shared cultural understanding of academic integrity. Instructors’ focus on rapidly learning new tools and approaches meant they were often unable to clearly communicate procedures and objectives to students, as the faculty members themselves were unclear about the mechanisms being employed. As University of Florida Instructional Assistant Professor D. Mani posits, the combination of the stress of the “unknown” and the breakdown in communication was likely one cause of the resulting academic misconduct (personal communication, April 5, 2022).

Ultimately, several approaches were adopted. The authors implemented a secure-browser online assessment approach in the courses they taught and managed; during subsequent semesters, the authors managed ten courses with an approximate enrollment of 200 students. Eventually, this process would become a recording-proctored model, with assessments delivered through the LMS using a secure online browser together with video recording software originally designed as a performance assessment tool.

Additionally, the authors segmented the capstone paper for each class into three parts; expanded each quiz by five questions while allowing students to skip any five; altered the participation mechanisms; and replaced letter/number indicators in multiple-choice answer sets with plain bubbles. These changes were designed to provide greater support to students in line with UDL principles and to demonstrate respect for their individual learning autonomy, while also mitigating the risk of academic integrity violations. Breaking the capstone project into three segments allowed the instructor to provide rapid feedback, including redirecting students struggling with citation concepts as well as with content. It also minimized the value of any student’s purchase of a “paper mill” submission. These refinements and the development of the recording-proctored assessment mechanism are described further in the following section.
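To make the skip-any-five change concrete, the arithmetic reduces to grading each quiz against its original question count while ignoring unanswered items. The following Python sketch is purely illustrative (the authors’ LMS handled this internally); the function name, data shape, and numbers are assumptions:

```python
# Hypothetical sketch of the "skip any five" rule described above: each quiz
# gains five extra questions, and scores are computed against the original
# question count, so skipping any five questions carries no penalty.

def grade_quiz(responses: dict[str, bool], base_question_count: int) -> float:
    """Score a quiz out of `base_question_count` points.

    `responses` maps question IDs to whether the answer was correct;
    skipped questions are simply absent from the dict.
    """
    correct = sum(responses.values())
    # Cap credit at the original question count so a student who answers
    # everything correctly cannot exceed 100%.
    return min(correct, base_question_count) / base_question_count

# Example: a 25-question quiz expanded to 30 items; a student skips 5
# and answers 22 of the remaining 25 correctly.
responses = {f"q{i}": i < 22 for i in range(25)}
print(f"{grade_quiz(responses, 25):.0%}")  # prints 88%
```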

Development and Maturation of this Design

The authors have been collaborating on course design and electronic tool incorporation in multiple courses offered in the School of Business since the fall of 2015; Course 1 was taught each fall and spring semester during the years discussed in this chapter. The original assessment design of the courses reflected UDL considerations in multiple ways:

  • Only two of the three mid-semester exams were included in the semester grade calculation, ensuring that a student struggling on a particular day could still earn full marks.
  • Most courses included a group presentation to allow students to demonstrate their knowledge and ability to operate in a group setting.
  • Students added to their classmates’ knowledge through both in-class participation and individual presentations covering current issues in the subject, allowing different modes of participation.
  • The capstone (term) paper assignment was heavily scaffolded.

Fortuitously, the instructor had already planned to begin offering courses online in the spring of 2020; these courses were consequently modified mid-semester. In subsequent semesters, the courses were modified further, as described below, to reflect student feedback and to ensure that methods of instruction and assessment better demonstrated respect for students as learners while supporting non-intrusive monitoring practices.

Course 1

In the fall of 2019, Course 1 was offered in two sections as an in-person class.  While the instructor used the institution’s LMS to keep materials organized for students, the primary mechanism for demonstrating respect for students was face-to-face interaction, where the instructor could respond to stated and unstated student concerns in the moment.  Relevant elements of this course included:

  • Three multiple-choice/essay mid-semester assessments, taken in person; each student’s lowest grade of the three was dropped
  • Study guides for each assessment
  • A term paper with three defined parts, submitted as a single assignment near the end of the semester
  • A group presentation of a case study, presented in person to the entire class
  • An individual short presentation on a topic currently relevant to the course, during a week chosen by each student
  • A participation grade, composed of interaction with student presentations and in-person attendance
  • A final essay-based exam

In the spring of 2020, the instructor reimagined Course 1 as an online course, collaborating with the instructional designer to modify elements in a manner that would give students greater autonomy without sacrificing rigor or compromising academic integrity. For each chapter, students were provided a study guide as well as a voluntary online quiz in a format identical to the unit assessment. The unit assessments became online tests with both multiple-choice and essay components; each student’s lowest grade of the three was still discarded. In accordance with UDL principles regarding flexible assignments, each assessment was available during a four-hour window, but an attempt had to be completed within two hours of starting. This flexibility again demonstrated respect for students but was not so extended in duration as to raise exam security concerns, consistent with the findings of Cluskey et al. (2011) and Munoz and Mackay (2019). Students requiring any timing exception were accommodated on a case-by-case basis. Based on LMS logs of the time students actually spent completing the assessments and student feedback in synchronous online sessions, the authors found this approach met students’ needs.
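The timing rule can be made concrete with a short sketch modeling the relationship between the availability window and the per-attempt limit. This is a hypothetical illustration, not the institution’s actual LMS configuration; the class, field names, and dates are invented:

```python
# Sketch of a four-hour availability window with a two-hour attempt limit:
# an attempt ends at the earlier of its own time limit or the window close.
from dataclasses import dataclass
from datetime import datetime, timedelta

@dataclass
class AssessmentWindow:
    opens: datetime
    window: timedelta = timedelta(hours=4)      # availability window
    time_limit: timedelta = timedelta(hours=2)  # per-attempt limit

    def deadline(self, started_at: datetime) -> datetime:
        closes = self.opens + self.window
        return min(started_at + self.time_limit, closes)

window = AssessmentWindow(opens=datetime(2020, 4, 6, 13, 0))
# A student who starts 30 minutes in gets the full two hours...
print(window.deadline(datetime(2020, 4, 6, 13, 30)))  # 2020-04-06 15:30:00
# ...while one who starts three hours in is cut off when the window closes.
print(window.deadline(datetime(2020, 4, 6, 16, 0)))   # 2020-04-06 17:00:00
```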

Other class activities were also modified.  The term paper was broken into three separate components, and the instructor provided timely feedback on each part.  This process provided constructive feedback quickly, both increasing student understanding and decreasing the chances a student could easily violate academic integrity using a paper mill (Rodchua, 2017).  The group presentation was discarded, as the logistics of a group presentation in an online course were challenging with respect to time availability and cost/benefit outcomes.  More emphasis was placed on the individual presentation, which could be a recorded presentation; students then asynchronously engaged in discussions moderated by the presenters.  The final exam was configured identically to the first three midterm assessments.  None of the assessments deployed any browser security.

Debriefing during the summer of 2020, the authors concluded that the existing academic integrity safeguards were insufficient, a concern shared by many educators across the country (The Wiley Network, 2020). The institution saw hundreds of students fall under suspicion of cheating in online exams, where fewer than a dozen students had faced such suspicion in prior semesters. At the same time, it was clear that students were facing numerous stressors, which influenced the authors to build in more class supports as well as academic safeguards.

In the fall of 2020, further changes were incorporated to support students and maintain academic integrity. Class progress checklists were added at the top of the LMS course to prompt students to install and familiarize themselves with the set of free apps and programs they would need. All four assessments, including the final exam, incorporated randomization of multiple-choice questions. All assessments were administered through a secure browser, which locked students’ computers into kiosk mode, preventing access to any resources other than the exam. A review of log files from the prior semester’s assessments supported adjusting the time limit to 100 minutes. To reduce confusion and respect the remote nature of attendance, students were assigned specific presentation weeks and time blocks during which they were responsible for engaging in an online critique as part of the participation requirement.
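A log review of this kind can be approximated by computing percentiles of observed completion times. The sketch below is hypothetical — the sample values are fabricated, and the authors worked from the LMS’s own attempt logs — but it shows how a figure such as 100 minutes might be derived:

```python
# Choose a time limit from observed completion times: take the 95th
# percentile and round it up to a clean figure.
import statistics

# Minutes students spent on the prior semester's assessments (fabricated).
completion_minutes = [42, 55, 58, 61, 63, 70, 74, 78, 81, 85, 88, 92, 95]

p95 = statistics.quantiles(completion_minutes, n=100, method="inclusive")[94]
print(f"Median: {statistics.median(completion_minutes)} min, "
      f"95th percentile: {p95:.0f} min")

# Round the 95th percentile up to the nearest 10 minutes.
limit = 10 * -(-int(p95) // 10)
print(f"Suggested time limit: {limit} minutes")  # 100 minutes
```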

Midway through the semester, as students struggled to stay current with required activities, animated GIFs were added to the class LMS page to remind students of imminent due dates. A graphic indicator of completion progress was also added to the class page; this completion taskbar proved so effective that every student who was falling behind either caught up or contacted the instructor within 24 hours of its appearance.

Course 2

In the spring of 2020, Course 2 began in person.  Assessments included three open-book, take-home exams; a term paper; short individual presentations with ensuing discussions; a participation component based upon attendance and contribution to discussions; and a final exam which was also an open-book take-home assessment with three days allocated for completion.  When the institution pivoted to distance learning, the discussions and class meetings incorporated synchronous online tools. No additional exam security was implemented, as the exams were already open resource in nature.  This process was chosen as the best option for course continuity due to the technical nature of the class and the limited time available to develop a more synchronous approach.

Course 3

In the fall of 2020, Course 3, which had initially been designed as an asynchronous course, was retooled in accordance with College guidelines to include optional biweekly synchronous meetings. The term paper was changed from a single-submission document to a three-assignment project, animated GIF reminders were implemented as in Course 1, and students chose their presentation and critique weeks. This course involved more computational activities than the other courses, and the multiple-choice questions on all assessments were randomly selected from question banks, so no browser security measures were implemented.
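Question-bank selection of this kind can be sketched as drawing a fixed number of questions per bank with a per-student seed, so that each student receives a different but reproducible form. This is an illustrative reimplementation, not the LMS’s actual mechanism; the bank contents, sizes, and seeding scheme are assumptions:

```python
# Per-student random selection from question banks, plus an overall shuffle.
import random

question_banks = {
    "ch1": [f"ch1-q{i}" for i in range(20)],
    "ch2": [f"ch2-q{i}" for i in range(20)],
    "ch3": [f"ch3-q{i}" for i in range(15)],
}

def build_assessment(banks: dict[str, list[str]],
                     per_bank: int, seed: int) -> list[str]:
    """Draw `per_bank` questions from each bank; seeding by student ID
    keeps a given student's form stable if the page is reloaded."""
    rng = random.Random(seed)
    selected = []
    for bank in banks.values():
        selected.extend(rng.sample(bank, per_bank))
    rng.shuffle(selected)  # randomize overall question order as well
    return selected

print(build_assessment(question_banks, per_bank=5, seed=1001))  # one form
print(build_assessment(question_banks, per_bank=5, seed=1002))  # another
```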

Development of Secure Assessment Mechanism

In the spring of 2021, in response to on-campus concerns regarding academic integrity, and cognizant of concerns raised by other research (Holden et al., 2021) and of the need for some measure of accountability, the authors modified the assessment procedure in both Course 1 and Course 2 to include live virtual proctoring using Zoom. After the first assessment, students in both classes made it abundantly clear that live proctoring via Zoom felt both invasive and logistically complex and increased their anxiety. The instructor asked them to try it one more time, thinking that perhaps the newness of the procedure was at fault. It was not; students protested even more vociferously after the second assessment, again contacting the instructor via email, text, the class LMS chat, and during class discussion time. In response, the authors changed the proctoring mechanism to recordings captured by the institution’s existing performance assessment tool; assessments were still administered within a secure browser.

Students were given three weeks’ notice of the new procedure, and the College provided spaces where students could take their exams if they did not want their personal location to be recorded. Only the instructor and the LMS administrator had access to the recordings, which were viewed only if separate indicators of potential academic integrity violations were observed and which were retained in accordance with institutional privacy policies.

By the end of this term, many faculty members had moved away from using Zoom as a proctoring tool. To safeguard academic integrity, these faculty members instead relied on mechanisms such as more stringent question randomization and selection, alternate question types, or entirely different types of assessments. Twenty-two faculty members teaching courses incorporating various methods of online assessment, as well as three faculty members whose courses had only in-person assessments, agreed to engage students with a voluntary post-course survey on technology use in exams. Across the 25 courses, 77 students responded to an email request from their professor to take a Qualtrics survey following the college’s Institutional Review Board procedures; the low return rate is attributed to the necessary timing of the survey, which was solicited after the final exam was completed. In this survey, students self-reported the assessment mechanism of their course and then responded to two Likert-scale prompts. Twenty-four students indicated their assessment had been a take-home assignment, 16 students’ assessments were delivered via the LMS without security, and 15 students were unsure what mechanism had been used. Seven students were administered in-person assessments, while two students described their assessment as “secure”; the remaining 14 students categorized their assessment more specifically as either Zoom-proctored or recording-proctored. The responses are summarized in Table 1 (perceived intrusiveness) and Table 2 (perceived academic integrity assurance).


Table 1

Student Perception of Intrusiveness of Assessment Mechanism (Spring 2021)

Prompt: “I felt the process was intrusive and/or invasive.”

Assessment Mechanism | Strongly disagree | Somewhat disagree | Uncertain / Neutral | Somewhat agree | Strongly agree | Total
Assignment | 16 | 3 | 3 | 1 | 1 | 24
Moodle | 9 | 2 | 2 | 3 | 0 | 16
Unsure | 3 | 4 | 6 | 1 | 1 | 15
Record-proctor | 2 | 3 | 4 | 2 | 1 | 12
In-person | 4 | 1 | 2 | 0 | 0 | 7
Zoom-proctor | 1 | 0 | 0 | 1 | 0 | 2
Secure | 0 | 0 | 0 | 1 | 0 | 1
Total | 35 | 13 | 17 | 9 | 3 | 77

Table 2

Student Faith in Academic Integrity Assurance of Assessment Mechanism (Spring 2021)

Prompt: “I believe this approach helped ensure . . . Academic Integrity.”

Assessment Mechanism | Strongly disagree | Somewhat disagree | Uncertain / Neutral | Somewhat agree | Strongly agree | Total
Assignment | 15 | 8 | 1 | 0 | 0 | 24
Moodle | 10 | 5 | 1 | 0 | 0 | 16
Unsure | 1 | 2 | 2 | 4 | 6 | 15
Record-proctor | 4 | 6 | 2 | 0 | 0 | 12
In-person | 5 | 2 | 0 | 0 | 0 | 7
Zoom-proctor | 0 | 1 | 0 | 1 | 0 | 2
Secure | 0 | 1 | 0 | 0 | 0 | 1
Total | 35 | 25 | 6 | 5 | 6 | 77

The survey targeted courses that employed the full range of final exam administrations: in-person exams, take-home assignments, and unsecured and secured online exams. Data from the survey were combined with conversations from the authors’ classrooms and asynchronous forums. This was especially important given the limited number of responses from students who indicated their assessment had been Zoom-proctored; anecdotal comments from colleagues led to the conclusion that many of them, like the authors, had switched away from Zoom-proctoring midway through the semester due to student complaints.

Survey results, combined with the student and colleague comments, led the authors to conclude that the recording-proctored model was relatively unintrusive compared to the other methods of secure online delivery (Zoom-proctoring, secure, and Moodle/LMS). This conclusion relied particularly on comments from students in the authors’ courses who had experienced both Zoom proctoring and recording proctoring; students largely preferred the latter. The authors also concluded that recording proctoring is viewed as likely to preserve academic integrity. Most importantly, this method appears to strike a reasonable balance between the desired characteristics of non-intrusiveness and maintenance of exam security.
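The intrusiveness comparison can be re-derived directly from the counts in Table 1. The short sketch below computes, for each mechanism, the share of respondents who somewhat or strongly agreed that the process felt intrusive; given the small cell sizes (notably Zoom-proctor and Secure), these percentages should be read cautiously:

```python
# Share of respondents per mechanism who agreed the process felt intrusive.
# Tuples hold the Table 1 counts: (strongly disagree, somewhat disagree,
# uncertain/neutral, somewhat agree, strongly agree).
table1 = {
    "Assignment":     (16, 3, 3, 1, 1),
    "Moodle":         (9, 2, 2, 3, 0),
    "Unsure":         (3, 4, 6, 1, 1),
    "Record-proctor": (2, 3, 4, 2, 1),
    "In-person":      (4, 1, 2, 0, 0),
    "Zoom-proctor":   (1, 0, 0, 1, 0),
    "Secure":         (0, 0, 0, 1, 0),
}

for mechanism, counts in table1.items():
    total = sum(counts)
    agreed = counts[3] + counts[4]  # somewhat agree + strongly agree
    print(f"{mechanism:>14}: {agreed}/{total} felt intrusive "
          f"({agreed / total:.0%})")
```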

Therefore, in the fall of 2021, both Course 1 and Course 3 retained the recording-proctored online assessment method, although prior student comments were heeded to clarify directions and streamline sample assessments. Separate student comments regarding the unusual stressors of remote learning were addressed as well: on each exam, students were allowed to skip any five questions. Few students expressed frustration with the assessment procedures that semester.

In the spring of 2022, faculty members across the college implemented some of the assessment techniques developed over the prior three semesters. Many faculty members added progress bars to their own online courses and randomized multiple-choice questions in online exams; additionally, at least two departments implemented the recording-proctored approach to online assessments.

Discussion

Towards the end of the spring 2021 semester, a student remarked to one of the authors that the recording-proctored method felt even less intrusive than an in-class exam, because “even if my professor watches me taking the test, it’s not while I’m actually taking the test.” Similar student comments, plus anecdotal observations by other faculty members, indicate that students feel reassured they are not presumed to be cheaters and perceive the intrusiveness of this method to be no greater than that of other methods. In both the fall 2021 and spring 2022 semesters, students stated that the assessment procedures were well explained, and the survey results and follow-up discussions with students in the fall of 2021 revealed that students are more confident in this mechanism’s ability to safeguard academic integrity than in that of other methods. An online assessment mechanism that centers both student comfort and academic integrity protection can achieve both, and achieving both is necessary for equitable and reliable assessment outcomes.

This is reassuring, since the institution looks toward full participation in intercollegiate course-sharing consortia in the coming years; to do so, the college will need confidence in its online assessment mechanisms. The past three years have demonstrated that faculty and administrators will accept long-term changes to practice and procedure, even when such changes affect fundamental aspects of the institution. In the 2023 academic year, one of the institution’s three Schools plans to make the recording-proctored model the only acceptable online assessment model. The institution’s registrar has created a room prioritization system that privileges instructors who teach in-person classes while encouraging a mixed physical/virtual attendance policy.

The development of this model also makes it clear that implementing a secure method of delivering assessments is not, by itself, sufficient to secure academic integrity; technology tools can only support a culture of academic integrity.  Transparency with students regarding the rationale and implementation of such tools demonstrates the centrality of students and enhances existing, mutually respectful academic cultures.

Future Considerations

As online course offerings expand, higher education institutions in general, and residential liberal arts institutions in particular, will continue to grapple with concerns regarding academic integrity while demonstrating respect for student perspectives. While these topics are broad, pervasive, and far-reaching, it is possible to address them in a cost-efficient manner. Doing so requires careful consideration of technological controls for academic integrity, the unique environment of small liberal arts colleges, and tool-specific applications in a continuing academic atmosphere of change.

While technology alone cannot ensure academic integrity, a low-cost structure can sustain the necessary respect for students to promote an ongoing culture of academic integrity.  It is important that institutions explore methods where technology used to enforce integrity rules also prioritizes respect for student autonomy and considers the impact on student anxiety (Conijn et al., 2022).  This is especially recommended in relation to concepts of Universal Design for Learning, as technology tools can be particularly helpful in facilitating multiple ways of presenting material and multiple means of demonstrating mastery of concepts (Rogers-Shaw et al., 2018).  Additionally, while technology tools can protect academic integrity in specific assessment situations, such as online testing, institutions must take care that the deployment of such tools does not damage the academic integrity relationship by creating a perception that no student is to be trusted.

Any academic integrity discussion must both contextualize and recognize the institution’s culture. Campus culture at residential liberal arts colleges differs from that at larger institutions. The lived experiences of faculty members and administrators at smaller institutions may lead to an incomplete understanding of resource intensity, and thus the cost of support resources can quickly overwhelm budgets. Liberal arts colleges may find that the time-on-technology required of faculty using online resources is at odds with extant practices that prioritize the face-to-face experience (Rust, 2019). Scenarios such as these must be considered whenever an institution finds it necessary to adopt online tools.

The culture of institutions that emphasize in-person learning can also inadvertently clash with the accommodations necessary to use online learning tools effectively. Institutions may not be prepared for the flexibility of scheduling that students expect in online and hybrid courses, and many students may still view online courses as “easy” courses (Baker et al., 2021), which can lead to increased academic integrity violations when it becomes clear such courses are, in fact, not “easy.” Colleges and universities must explore student perceptions as well as prevailing institutional culture in concert with the administrative and faculty ethos, or an imbalance of respect and academic rigor may surface.

Finally, as institutions implement online assessment tools, some degree of standardization and familiarity with tool-specific considerations must be part of the decision process. The performance assessment tool used at this institution does not provide a mechanism for passing the status of an in-progress recording to the LMS; because the instructor needed to verify each student’s recording status, students could not seamlessly begin their examination. Instead, they had to complete a multi-step process that demanded their concentration, adding stress during the evaluation. However, the use of this tool, which is already in common use in more than two-thirds of the departments on campus, appears to have fostered student trust in the method overall. Institutions should take note: specific security-enhancing tools must be adopted across the institution, or the lack of standardization will frustrate students and potentially damage the culture of academic integrity. In essence, a seamless design for monitoring online class performance should integrate ease of use, academic integrity preservation, and data retrieval to support integrity concerns in a non-intrusive implementation.

Summary

The movement to online courses and delivery methods necessitated by the pandemic demonstrated that many universities and colleges were not adequately prepared for such a transition, especially within so short a time frame. Still, it appears possible for any institution, regardless of its size or pre-pandemic focus, to leverage existing educational technology tools and find a balance between respecting students’ autonomy and preserving academic integrity.

At the authors’ small residential liberal arts college, faculty began to use existing tools to provide rigorous student assessment within a context of non-intrusiveness and integrity preservation, while also pursuing cost effectiveness, student acceptance, and faculty comfort. The process discussed in this example was molded to fit the institutional culture, and the current implementation appears to present a successful path forward. An overarching assumption is that the process in operation will be subject to continual modification predicated upon technology changes, student tolerance, academic rigor, integrity maintenance, cost parameters, and administrative support. This process did, and can continue to, demonstrate that the cost of respecting students is surprisingly little.

References

Al-Freih, M. (2021). The impact of faculty experience with emergency remote teaching: An interpretive phenomenological study. IAFOR Journal of Education, 9(2), 7–23. https://doi.org/10.22492/ije.9.2.01

Bailey, A., Vaduganathan, N., Henry, T., & Laverdiere, R. (2018).  Making digital learning work. Boston Consulting Group. https://edplus.asu.edu/sites/default/files/BCG-Making-Digital-Learning-Work-Apr-2018%20.pdf

Baker, D. A., Unni, R., Kerr-Sims, S., & Marquis, G. (2021, April).  An examination of the factors leading to students’ preferences and satisfaction with online courses.  International Journal for Business Education, 161, 112-129.

Bozkurt, A., Jung, I., Xiao, J., Vladimirschi, V., Schuwer, R., Egorov, G., . . . Paskevicius, M. (2020). A global outlook to the interruption of education due to COVID-19 pandemic: Navigating in a time of uncertainty and crisis. Asian Journal of Distance Education, 15(1), 1-126. https://doi.org/10.5281/zenodo.3878572

Bruff, D. (2011, February 28). Why do students cheat? Vanderbilt University. Retrieved May 10, 2022, from https://cft.vanderbilt.edu/2011/02/why-do-students-cheat/

Cluskey, G. R., Ehlen, C., & Raiborn, M.  (2011, July).  Thwarting online exam cheating without proctor supervision.  Journal of Academic and Business Ethics, 4(1), 1-7.

Conijn, R., Kleingeld, A., Matzat, U., & Snijders, C. (2022).  The fear of big brother: The potential negative side-effects of proctored exams.  Journal of Computer Assisted Learning. https://doi.org/10.1111/jcal.12651

Cutri, R. M., & Mena, J. (2020). A critical reconceptualization of faculty readiness for online teaching. Distance Education, 41(3), 361–380.  https://doi.org/10.1080/01587919.2020.1763167

Daffin Jr., L. W., & Jones, A. A. (2018). Comparing student performance on proctored and non-proctored exams in online psychology courses. Online Learning, 22(1). https://doi.org/10.24059/olj.v22i1.1079

Dyer, J. M., PettyJohn, H. C., & Saladin, S. (2020). Academic dishonesty and testing: How student beliefs and test settings impact decisions to cheat. Journal of the National College Testing Association, 4(1).

Evmenova, A. (2018).  Preparing teachers to use Universal Design for Learning to support diverse learners.  Journal of Online Learning Research, 4(2), 142-171.

Harton, H. C., Aladia, S., & Gordon, A. (2019). Faculty and student perceptions of cheating in online vs. traditional classes. Online Journal of Distance Learning Administration, 22(4).  Retrieved July 19, 2022, from https://ojdla.com/archive/winter224/hartonaladiagordon224.pdf

Haslam, C. R., Madsen, S., & Nielsen, J. A. (2020). The ISPIM Innovation Conference – Innovating in Times of Crisis. In Event Proceedings: LUT Scientific and Expertise Publications.

Holden, O. L., Norris, M. E., & Kuhlmeier, V. A. (2021). Academic integrity in online assessment: A research review. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.639814

Johnson, N., Veletsianos, G., & Seaman, J. (2020). US faculty and administrators’ experiences and approaches in the early weeks of the COVID-19 pandemic. Online Learning, 24(2), 6–21. https://doi.org/10.24059/olj.v24i2.2285

King, C. G., Guyette Jr, R. W., & Piotrowski, C. (2009). Online exams and cheating: An empirical analysis of business students’ views. Journal of Educators Online, 6(1), 1-11.  https://files.eric.ed.gov/fulltext/EJ904058.pdf

Kitahara, R., Westfall, F., & Mankelwicz, J. (2011, May). New, multi-faceted hybrid approaches to ensuring academic integrity. Journal of Academic and Business Ethics, 3(1).

Munoz, A., & Mackay, J. (2019). An online testing design choice typology towards cheating threat minimisation. Journal of University Teaching and Learning Practice, 16(3). https://doi.org/10.53761/1.16.3.5

Palmer, A., Pegrum, M., & Oakley, G. (2019). A wake-up call? Issues with plagiarism in transnational higher education. Ethics & Behavior, 29(1), 23-50. https://doi.org/10.1080/10508422.2018.1466301

Poster, G. (2021, April 30). Lockdown browsers fail to create a culture of academic integrity. The Retriever. Retrieved April 8, 2022, from https://retriever.umbc.edu/2021/04/lockdown-browsers-fail-to-create-a-culture-of-academic-integrity-they-invade-student-privacy-and-harm-student-health/

Rodchua, S. (2017). Effective tools and strategies to promote academic integrity in e-learning. International Journal of e-Education, e-Business, e-Management and e-Learning, 7(3), 168–179. doi: 10.17706/ijeeee.2017.7.3.168-179

Rogers-Shaw, C., Carr-Chellman, D.J., & Choi, J. (2018).  Universal Design for Learning: Guidelines for accessible online instruction.  Adult Learning, 29(1). https://doi.org/10.1177/1045159517735530

Rust, J. (2019).  Toward hybridity: The interplay of technology, pedagogy, and content across disciplines at a small liberal-arts college.  Journal of the Scholarship of Teaching and Learning, 19(2). https://doi.org/10.14434/josotl.v19i1.23585

Dendir, S., & Maxwell, R. S. (2020). Cheating in online courses: Evidence from online proctoring. Computers in Human Behavior Reports, 2, 100033. https://doi.org/10.1016/j.chbr.2020.100033

Snyder, T. D., de Brey, C., & Dillow, S. A. (2019). Digest of Education Statistics 2018 (NCES 2020-009).  National Center for Education Statistics, Institute of Education Sciences, U.S. Department of Education, Washington, D.C.

Suryani, A. W., & Sugeng, B. (2019).  Can you find it on the Web?  Assessing university websites on academic integrity policy.  2019 International Conference on Electrical, Electronics and Information Engineering (ICEEIE) 6, 309-313. https://doi.org/10.1109/ICEEIE47180.2019.8981405

The Editorial Board. (2021, March 18). Test proctoring software that films students is an invasion of privacy and presents problems with equity. El Camino College The Union. https://eccunion.com/opinion/editorials/2021/03/17/test-proctoring-software-that-films-students-is-an-invasion-of-privacy-and-presents-problems-with-equity/

The Wiley Network. (2020, July 22).  Is student cheating on the rise? How you can discourage it in your classroom. The Wiley Network. https://www.wiley.com/en-us/network/education/instructors/teaching-strategies/is-student-cheating-on-the-rise-how-you-can-discourage-it-in-your-classroom

U.S. Department of Education, National Center for Education Statistics. (2021, January). Digest of Education Statistics.  Retrieved April 12, 2022, from https://nces.ed.gov/programs/digest/d20/tables/dt20_311.15.asp

U.S. Department of Education, National Center for Education Statistics, Integrated Postsecondary Education Data System (IPEDS). (2022).  Percent of students enrolled in distance education courses, by state and distance education status of student: 2020 [Data Visualization Tool].  Retrieved April 12, 2022, from https://nces.ed.gov/ipeds/TrendGenerator/app/build-table/2/42?rid=6&cid=85

Venable, M. A. (2022). 2022 Online Education Trends Report. BestColleges.com.  https://www.bestcolleges.com/research/annual-trends-in-online-education/

License


The Cost of Respect? Surprisingly Little Copyright © 2023 by Joseph Kennedy and Albert Kagan is licensed under a Creative Commons Attribution-NonCommercial 4.0 International License, except where otherwise noted.