Introduction
As those who teach in the online Distance Learning
(DL) environment refine their instruction materials,
procedures and policies, an alarming number of
students are arming themselves with a plethora of
weapons, employing both new and old strategies and
technologies, to obtain an unfair advantage over the
rest of their classmates. In this work the authors
highlight the current state of these affairs and
review one university’s approach to regaining
control of academic integrity in its DL offerings.
The paper introduces the relevant issues with an
example of Troy University’s published statement of
its Standards of Conduct, illustrates disturbing
trends in dishonesty among the current student
population with a recent case study, presents a
brief survey of the literature to explore the extent
of “the problem”, highlights Troy University’s
approach to resolving many of the issues, identifies
several pressing questions arising from this
research, and concludes with a plan for continued
research in this increasingly important area.
Standards of Conduct
A university communicates its attitudes and policies
regarding the standards of behavior expected from
its student population through a section of its
catalog or student handbook typically entitled
“Standards of Conduct.” This section usually
includes definitions of misconduct, identifies
corresponding administrative responsibilities,
outlines procedures for disciplinary actions, lists
potential penalties for misconduct, and defines the
rights of accused students. Excerpts from Troy
University’s Undergraduate Catalog (Troy University,
2006) are presented below as these sections apply to
proper student behavior and as they relate to this
case study.
. . . "A student is subject to disciplinary action if:
. . . In connection with the taking of, or in contemplation of the taking of any examination by any person:
a. A student knowingly discovers or attempts to discover the contents of an examination before the contents are revealed by the instructor;
b. A student obtains, uses, attempts to obtain or use, or supplies or attempts to supply to any person, any unauthorized material or device;
c. A student uses, attempts to use, or supplies or attempts to supply to any person unapproved materials or devices dishonestly.
. . . Penalties for Misconduct:
. . . Any student who has committed an act of misconduct . . . may be subject to one or more of the following penalties:
a. A student's grade in the course or on the examination affected by the misconduct may be reduced to any extent, including a reduction to failure.
b. A student may be suspended from the University for a specific or an indefinite period, the suspension to begin at any time."
(Troy University Undergraduate Catalog, 2005-2006)
Such standards appear to be clear, reasonable and
“common sense” statements of the type of behavior
all institutions of higher learning expect from
their students and the potential penalties for
improper behavior. Additionally, Troy University relies on a student Honor Code system to instill the academic honor, trust, and integrity that it views as fundamental to its academic policy. This case study illustrates the new and often-blatant assaults upon academic integrity that institutions of higher learning must now confront, particularly in the DL environment. Perhaps the
most disturbing of these assaults are shifts in
society’s attitudes toward academic integrity and
corresponding views of what is acceptable and
ethical behavior. The ongoing struggle between new
implementation and security technologies embedded in
the DL delivery systems and the counter-technologies
that defeat them complicates the enforcement of
academic standards. Additionally, our litigious
society may add even more impediments to maintaining
a university’s academic integrity.
Case Study – Two Distance Learning Courses
This case summarizes experiences from recent
offerings of two quantitative courses in Troy University's core business program sequence: QM3341 Business Statistics II and MGT3373 Operations Management. The emphasis in MGT3373 was a balanced
presentation of general principles and several
quantitative techniques most often encountered in
the business world. For each of these courses
students were required to take seven online quizzes
consisting of 20 multiple-choice questions randomly
drawn from a large test bank. They were also
required to take an online final examination
consisting of 50 multiple-choice questions randomly
drawn from the test bank. Finally, both courses
included a proctored examination (PE) for which the
students were subject to specific rules for personal
identification, control of the exam environment, and
security. The PE consisted of two parts: part one contained 25 multiple-choice questions; part two contained five quantitative problems (QM3341) or
five essay questions (MGT3373). Officials from the
University’s DL office pre-approved students’
choices of proctors and defined rules for
establishing a secure environment before
examinations were distributed to them.
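As background on this assessment design, the sketch below illustrates, in Python, how questions might be drawn at random from a test bank. It is an illustration only (not the BlackboardTM mechanism itself), and the bank size and question labels are assumptions.

```python
import random

# Hypothetical test bank of question identifiers; the real bank size is not
# stated in the paper, so 500 questions is assumed purely for illustration.
test_bank = [f"Q{n:03d}" for n in range(1, 501)]

def draw_questions(bank, n_questions, seed=None):
    """Randomly draw n_questions distinct questions from the bank."""
    rng = random.Random(seed)
    return rng.sample(bank, n_questions)

weekly_quiz = draw_questions(test_bank, 20)   # each online quiz: 20 MC questions
final_exam = draw_questions(test_bank, 50)    # online final: 50 MC questions
print(weekly_quiz[:5], len(final_exam))
```

Because each student receives an independent random draw, identical near-perfect submissions completed in a couple of minutes stand out sharply against historical norms.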
Examinations, instructor materials and author test
banks are not intended for general consumption,
particularly by students. Troy University, its
instructors and textbook publishers cooperatively
implement multiple strategies to mitigate cheating
and/or unethical behavior. These strategies
typically include:
a) The BlackboardTM delivery system provides controls that force students to complete assessments once they are entered
b) BlackboardTM provides instructor controls that are designed to prevent students from printing copies of all exams
c) Instructors provide instructions with each examination that typically state that:
- Students must take assessments separately to prevent copying or collusion
- Students may not make copies of exams
- Proctors must return all test question sheets in addition to all answer sheets
d) Publishers screen applicants for instructor materials and author test banks to prevent students from obtaining copies
The authors were first alerted to potential
violations of the University’s Standards of Conduct
by the unusual and unreasonable quiz timings of six
QM3341 students for the first three quizzes. Each
online quiz had a 1-hour time limit and these six
students were completing them in 2-3 minutes with
near perfect scores. Historically students averaged
30-40 minutes on these quizzes. All six students
were registered at the same university site. The
authors sent each of these students an e-mail
inquiring about the unusual timings; few responded.
One student claimed all his timings were
reasonable. No student admitted to possessing
unapproved sources of information. The authors
began collecting utilization and performance data on
these students and changed test bank utilization
procedures. While preparing the assessment test banks, the authors had inserted a "code" into every test bank question to facilitate correlation of the (randomly selected) quiz questions to the test banks. For quiz four the authors shuffled those codes. As shown in Table 1, the timings on quiz four for all six students immediately jumped to historical levels. For all remaining quizzes the codes were completely removed, and student timings remained roughly at historical averages.
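Screening for such timing anomalies can be automated; the following minimal sketch assumes a hypothetical historical average and flagging threshold rather than the course's actual figures.

```python
# Minimal sketch of flagging implausibly fast quiz attempts.
# The historical average and threshold fraction below are assumptions.
HISTORICAL_AVG_MIN = 35.0      # typical completion time, in minutes
SUSPICIOUS_FRACTION = 0.2      # flag attempts faster than 20% of that average

quiz_timings = {               # student -> completion time in minutes (invented data)
    "Student A": 2.3,
    "Student B": 33.4,
}

def flag_suspicious(timings, avg=HISTORICAL_AVG_MIN, frac=SUSPICIOUS_FRACTION):
    """Return students whose completion time falls below avg * frac."""
    threshold = avg * frac
    return [student for student, minutes in timings.items() if minutes < threshold]

print(flag_suspicious(quiz_timings))   # -> ['Student A']
```

In practice such a threshold would be calibrated against the full distribution of historical completion times rather than a fixed fraction.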
To minimize student anxiety over memorizing formulas and to facilitate the use of mathematical tables in many of the examination problems, the Proctored Exam (PE) for this course was given open-book. To allow students to leverage the efforts expended in their homework assignments, the exam was also given open-notes. Because of the irregularities noted above, the authors ensured that all the questions for both parts of the exam differed from all exams
given in all previous years. Table 2 summarizes the
peculiarities experienced on the PE for these same
six students compared to the rest of the class and
to the historical performance of students in all
past online offerings of the same course.
Table 1. Summary of Quiz Timing Irregularities
QM3341 Business Statistics II – Term 2/05
Timings (Minutes:Seconds) and Scores (*Administratively changed to 0)

| QM3341 | Quiz 1 | Quiz 2 | Quiz 3 | Quiz 4 | Quiz 5 | Quiz 6 | PE | Quiz 8 | Final |
| Student 1 | 2:19 / 100 | 2:03 / 100 | 3:53 / 100 | 33:24 / 95 | 36:11 / 75 | 8:46 / 95 | 56/0* | 17:07 / 100 | 35:36 / 96 |
| Student 2 | 1:57 / 100 | 3:30 / 90 | 3:02 / 100 | 26:14 / 95 | 24:49 / 95 | 12:59 / 90 | 56/0* | 21:10 / 95 | 37:37 / 92 |
| Student 3 | 2:57 / 100 | 2:25 / 95 | 1:47 / 100 | 25:50 / 20 | 22:55 / 65 | 23:25 / 85 | 56/0* | 11:15 / 95 | 30:39 / 94 |
| Student 4 | 2:01 / 100 | 3:25 / 100 | 5:37 / 100 | 39:23 / 90 | 42:57 / 85 | – | 56/0* | 38:48 / 40 | 46:48 / 94 |
| Student 5 | 1:47 / 100 | 2:05 / 100 | 1:53 / 100 | 53:53 / 90 | 56:05 / 85 | 35:56 / 90 | 41/0* | 9:05 / 95 | 73:38 / 60 |
| Student 6 | 1:38 / 100 | 1:51 / 100 | 1:47 / 100 | 48:04 / 15 | 71:11 / 95 | 31:32 / 90 | 41/0* | 3:56 / 100 | 46:39 / 94 |
Table 2. Proctored Exam Irregularities – QM3341
QM3341 Business Statistics II – Term 2/05
Proctored Exam Answer Sheet Analyses

| QM3341 Answer Sheets | Group A (Students 1, 2, 3, 4) | Group B (Students 5, 6) | Rest of Class | Historically: "similar test" (same format, authors' test bank) |
| Part 1: 25 Multiple Choice (MC) | Same 2 errors; same 2 erroneous choices | Same 2 errors (same as Group A) | No student missed both questions as A, B; no other pair had identical MC sheets | No pair had identical sheets |
| Part 2: 5 Quantitative Problems | Virtually identical: answers, layout, detail, errors, omissions, inclusions, decimal-place rounding, wording (90%) | Virtually identical: answers, layout, detail, errors, omissions, inclusions, decimal-place rounding, wording (90%) | None matched A, B; no other pair had matching answer sheets; no other student presented an answer to ANY of the problems which matched these students' responses in style or format | No pair had matching answer sheets |
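The multiple-choice comparison summarized in Table 2 amounts to a pairwise check for shared wrong answers; the sketch below uses an invented answer key and invented responses purely to illustrate the idea.

```python
from itertools import combinations

# Invented answer key and responses for a 25-question multiple-choice part.
answer_key = "ABCDA" * 5
responses = {
    "Student 1": "ABCDA" * 4 + "ABCDB",   # one wrong answer ('B' on question 25)
    "Student 2": "ABCDA" * 4 + "ABCDB",   # the same wrong choice on the same question
    "Student 7": "ABCDA" * 5,             # a perfect sheet
}

def shared_wrong_answers(a, b, key):
    """Questions on which two students gave the same incorrect choice."""
    return [i + 1 for i, (x, y, k) in enumerate(zip(a, b, key))
            if x == y and x != k]

for (s1, r1), (s2, r2) in combinations(responses.items(), 2):
    common = shared_wrong_answers(r1, r2, answer_key)
    if common:
        print(f"{s1} and {s2} share identical wrong answers on questions {common}")
```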
When confronted with the PE irregularities observed
in QM3341, only three of the students responded. Two students, identified here as Students 1 and 2, vehemently denied "illegal" or wrongful activity and both immediately threatened lawsuits. Those two
students were also taking MGT3373 Operations
Management in the DL format in Term 2/05 and their
behavior in that course (Table 3) was similar to
their behavior in QM3341. Both submitted identical
answer sheets for the multiple-choice portion of the
proctored exam. Both submitted virtually identical
answer sheets for the five essay questions. When
confronted with these additional irregularities,
they voiced the same denials and threats of lawsuits
as they did for QM3341.
Table 3. Proctored Exam Irregularities – MGT3373
MGT3373 Operations Management – Term 2/05
Proctored Exam Answer Sheet Analyses

| MGT3373 Answer Sheets | Students 1, 2 | Rest of Class | Historically: "similar test" (same format, authors' test bank) |
| Part 1: 25 Multiple Choice | Same 2 errors; same 2 erroneous choices | No student sheet matched Students 1, 2; no other pair had identical MC sheets | No pair had identical sheets |
| Part 2: 5 Short Essay Questions | Virtually identical: answers, detail, omissions, inclusions, examples, wording (90%) | None matched Students 1, 2; no other pair had matching answer sheets | No pair had matching answer sheets |
Records from QM2241 Business Statistics I from Term
1/05 revealed that the same two students exhibited
the same irregularities in quiz and examination
performance in that course as well. They completed
1-hour quizzes in 2-3 minutes, completed the 2-hour
final exam in five minutes and submitted identical
answer sheets for both parts of their proctored
exams. That term those irregularities
escaped detection.
For QM3341 Students 1 and 2 had arranged to take
their proctored examinations under the supervision
of a specific university professor in the Business
department at the Troy campus. They asked that
proctor to allow them to take the examination
together since they had developed “common notes”
they wished to share. When that professor denied
their request and offered instead to allow them to
duplicate the notes so that each would have a copy,
the students never returned to take the examination and instead arranged for another university official to serve as their proctor.
All six offending students were initially assigned
failing grades for the courses in question.
However, because the early attention over these
violations of academic integrity focused upon the
unreasonable assessment timings and possession of
unapproved sources, there was concern that the
open-notes policy for the exam might provide a legal
loophole in a court of law. Consequently, the grades were changed to ones determined strictly from "items submitted." However, upon further review and investigation, none of the students could explain the
degree of similarity among their PE answer sheets,
particularly on the quantitative and essay portions
of the exam(s). These similarities could not be
explained by simply having common notes,
irrespective of their sources. Therefore, their
final grades were administratively reassigned as
failing and the six students were apprised of their
rights to appeal their grades.
During the early stages of their appeals, two of the students admitted to possessing unapproved materials. One claimed they all had copies of all the examination questions the authors had given in the past. Another admitted they had the textbook test bank, which they had found on an open website. Because the exam was given open-notes, they claimed that they had included these materials in their notes and therefore had done nothing wrong or illegal. From an academic perspective, common sense dictates that examinations are meant to assess a student's understanding of the material being examined, not the extent or accuracy of his or her databases, irrespective of the manner in which those databases were obtained. Additionally, the assessments are meant to reflect the individual student's knowledge and original work, without help from or collusive activities with others. Furthermore, Troy University's statement of its Standards of Conduct clearly identifies these activities as specific violations. As a matter of policy, the authors included in all of the course
syllabi appropriate excerpts from these published
Standards as well as recommendations for students to
read the full set of Standards. Astonishingly,
despite the fact that the behavior of the students
cited was in clear violation of these published
standards, they did not view their actions as
infractions of academic integrity.
This case raises other troublesome concerns over the
preservation of academic integrity particularly for
courses offered in the DL environment. Students
such as those observed in this case do not believe
having an author test bank is “cheating.”
Disappointingly, despite clear copyright
restrictions from the publisher, the textbook test
bank for QM3341 was posted in its entirety on
another university’s website by an instructor in a
manner accessible to the entire Web public. The offending students were brazenly defiant about their actions and their use of materials available on the public Internet. They were willing to take their cases to the highest levels within the University, including the Chancellor. In the end, all
the failing grades were upheld by the University and
all appeals were denied.
Extent of the Problem – Examination of the
Literature
For several decades the popular and academic press
has published startling reports on the scope and
extent of the cheating problem. Many reports
suggest that the propensity to gain an unfair
advantage in the academic environment begins at the
elementary school level and grows increasingly more
prevalent as students progress through secondary and
higher levels of education (Slobogin, 2002; McCabe,
2005; ETS Research Center, 2006; Overholser, 1999;
Vos Savant, 2006). The problem is pervasive and has
increased dramatically over the past 30 years
(Harding et al., 2001; McCabe et al., 2001).
Some of the shocking findings (Fellgurth, 2003; Smith, 2006) indicate:
a) In 1996, an American Psychological Association survey showed that 50% of undergraduates admitted to having cheated more than once.
b) A 1999 survey by Donald McCabe of Rutgers University indicated that on most campuses over 75% of students admit to some form of cheating.
c) A 2002 survey by McCabe found that 74% of high school students admitted to cheating on a test or paper at least once.
d) A 2003 national survey found that 41% of students sampled said plagiarism happened "often" or "very often."
e) Other national surveys show that cheating at colleges is on the rise and is occurring at both the undergraduate and graduate levels.
f) Other research reported an incidence rate of cheating of 40% among graduate students.
As data storage, access, distribution and communication technologies have advanced, so too has the sophistication of the methods by which offending students practice their deceptions (Conradson & Hernandez-Ramos, 2004; Argetsinger, 2003).
Many investigators have found interesting
correlations between the propensity to cheat and a
multiplicity of factors that may constitute
predictive variables in certain cases. The observed trends include: underclassmen cheat more than upperclassmen; students with lower grade point averages (GPAs) cheat more than those with higher GPAs; cheating is more prominent among fraternity and sorority members and athletes; students who perceive that peers cheat without getting caught are more likely to cheat themselves; younger students tend to cheat more than older students; and substantially less cheating occurs at institutions employing strong academic honor codes (Butterfield et al., 1999; McCabe & Klebe Trevino, 1997).
Levels of mastery and extrinsic factors strongly
influence cheating as do perceived social norms
regarding cheating, knowledge of institution policy
regarding cheating, and student attitudes toward
cheating (Jordan, 2001). The research on gender as a discriminator for cheating has yielded mixed results and may require consideration of secondary gender-related factors (McCabe et al., 2006; Ruegger & King, 1992).
Whatever the influencing variables, most research indicates that cheaters are generally less mature, less reactive to observed cheating, less deterred by social stigma and guilt, less personally invested in their education, and more likely to be receiving scholarships but performing more poorly (Diekhoff et al., 1996). Not surprisingly, cheaters tend to shun accountability for their actions and blame their parents and teachers for widespread cheating, citing increased pressure on them to perform well (Greene & Saxe, 1992). Worse yet, society as a whole has become increasingly tolerant and even accepting of the practice of cheating, often citing the need to survive in today's competitive environment as justification for that shift in attitude (Slobogin, 2002; Vos Savant, 2006; Callahan, 2004).
The new technology tools and distorted societal
attitudes towards cheating make the job of
maintaining academic integrity within the
educational environment much more challenging.
While interesting technological solutions, such as
Troy University’s Securexam Remote ProctorTM
described below, are now being implemented,
additional non-technology based strategies may be
required to make the Distance Learning environment
less vulnerable to today’s sophisticated cheaters.
For instance, some research has found that universities that have implemented a Student Honor Code have experienced decreased levels of cheating among their student bodies (McCabe, 1995; McCabe et al., 2001; McCabe & Klebe Trevino, 1993; Gray, 1998).
The research suggests that the long-term solution to
curtailing academic cheating must include
well-defined standards, a strong sense of
accountability and properly focused “community”
attitudes, above and beyond complex high-technology
attempts to establish a secure testing environment
(McCabe & Klebe Trevino, 1993; Gray, 1998; Hendershott et al., 1999; McCabe & Pavela, 2000).
In order to minimize the unethical students’
inclination and ability to cheat, faculty,
administration and the responsible student
population must work together. Establishing a proper climate to achieve this goal requires unwavering administrative support of faculty efforts to maintain ethical standards for academic integrity (Heberling, 2002).
One Approach – Troy University
Troy University has historically pursued a
multi-faceted approach to curbing academic
dishonesty among its student body. The approach
included traditional methods for controlling the
examination environment, “policing” the work and
behavior of its students for both in-class and
take-home assignments, and instilling a sense of
honesty and ethics through a well-published Academic
Code and Student Honor Code. These practices were
also incorporated into its
Distance Learning course offerings and modified as
the delivery medium required. The issues that these techniques commonly address include:
a) Verifying that the proper student is taking the exam
b) Copying others' work
c) Receiving assistance from others
d) Using unapproved crib notes, electronic devices, or storage media
e) Using unapproved materials such as copies of instructors' past examinations
f) Helping others commit illicit acts
g) Collusion
Quite often, this broad-based approach included the use of
proctored examinations with the requirement for the
physical presence of a pre-approved human proctor.
Many of Troy University's DL courses currently require that at least one proctored examination be used for student assessment. Each of these proctored examinations imposes predictable logistical, scheduling and security restrictions, particularly in the DL environment, where students may be distributed across remote locations and vastly differing time zones all over the world.
With the incorporation of new technologies into its
strategies to prevent dishonest behavior, Troy
University has adopted a blended approach to DL
courses within its eCampus. In addition to its
efforts to establish a University-wide culture of
honesty and ethics, and in collaboration with the Cambridge, Massachusetts-based company Software Secure, the University has developed a set of hardware and software tools to replace the need for human proctors and to eliminate the associated logistical and implementation issues. This approach includes a hardware/software solution called the Securexam Remote ProctorTM (Johnson, 2006). The
system will allow Troy faculty members to monitor
online test takers and provide students the
flexibility to take exams anywhere and at any time.
The hardware module connects to a computer’s USB
port and does not contain the student’s personal
information, thereby allowing sharing of the
hardware. The target cost for the remote proctor system is on the order of $100. A fingerprint
sensor is built into the base of the unit, and
instructors may specify the time and frequency at
which students must identify themselves before and
during the examination. The system incorporates a
small video camera with a 360-degree field-of-view
and an omni-directional microphone to detect unusual
or unapproved activity. When such detections occur,
alerts are generated and suitable prompts may be
sent to the instructor and appropriate data
recorded. For these detections real time audio and
video will be remotely recorded for viewing and
processing at any time. Securexam Remote Proctor
TM will include software tools that control
student activity so that students taking exams
cannot access any unauthorized material online or
use any other software while taking the exam. This
state-of-the-art system is completing its final test
phases and is being implemented in several of Troy’s
eCampus courses in the fall of 2007.
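Purely as a conceptual illustration of the periodic identity and environment checks described above, and not a description of Software Secure's actual implementation, the sketch below models how such checks with alerting might be scheduled; every function name and interval in it is an assumption.

```python
# Conceptual sketch only; all names, intervals, and checks are hypothetical.
CHECK_INTERVAL_SEC = 600   # an instructor-specified re-identification frequency

def fingerprint_matches() -> bool:
    """Placeholder for a biometric check against the enrolled template."""
    return True

def environment_is_clear() -> bool:
    """Placeholder for camera/microphone analysis of the exam environment."""
    return True

def run_identity_checks(exam_length_sec: int):
    """Walk the exam window in fixed intervals and collect any alerts raised."""
    alerts = []
    for elapsed in range(0, exam_length_sec + 1, CHECK_INTERVAL_SEC):
        if not fingerprint_matches():
            alerts.append(("identity", elapsed))
        if not environment_is_clear():
            alerts.append(("environment", elapsed))
    return alerts

print(run_identity_checks(exam_length_sec=7200))   # two-hour exam, no alerts here
```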
By creating computer-based accountability, Troy University is taking a proactive approach to ensuring online academic quality while making its online programs and courses available to students worldwide, meeting associated government mandates, and creating the proper framework for maintaining the highest standards of academic integrity and fairness.
While many academics are excited about the potential of these technologies to suppress the cheating problem, others are concerned about possible ethical issues, such as unwarranted intrusion into the lives of individuals. There is currently little published
work on the effectiveness of such automated
proctoring systems and the overall impact of the
additional requirements they impose upon DL students
and instructors. Troy University is one of the
first to adopt and integrate this technology-rich
environment into its distance learning offerings.
Although interesting technological solutions, such
as Troy University’s Securexam Remote Proctor
TM, are now being implemented, they will
likely require non-technology based, companion
strategies to make the Distance Learning environment
less vulnerable to today’s sophisticated cheaters.
In addition to this state-of-the-art set of hardware
and software tools for controlling the testing
environment, Troy University has launched an
eCampus-wide project to redesign all of its eCampus
courses in a phased sequence over the next two
years. Course Redesign teams, consisting of
subject-matter and instructional-design experts,
have been formed to conduct the course redesign
efforts. The ultimate goal is to have all eCampus
courses designed to common delivery and student
learning objective (SLO) standards. Each Redesign
team will determine a common textbook to be used for
all sections of each eCampus course as well as
common “course templates.” The structure and detail
of each course template will be determined by the
corresponding course Redesign team, depending upon
the nature of the course material. Part of the
redesign effort will include restructuring and
redesign of course quizzes, examinations, projects
and assignments to fully exploit the Securexam
Remote Proctor TM, eliminating all human
proctors and conducting all assessments online.
Summary and Conclusions
These experiences have led to the following
observations and comments. Surprisingly, even
without the open-notes policy formerly used in the
proctored examinations for the DL courses of the
case study, there are real concerns that the Standards of Conduct may not be defensible in a court of law. It is clear that:
1) Students who wish to obtain an unfair advantage over other students are now armed with new and interesting opportunities, tools and resources with which to obtain that unethical edge.
2) These new technologies and tools make the collective job of the University, instructors, course delivery system designers and publishers much more difficult.
3) In today's society students are more apt to wield a weapon they believe is omnipotent – the threat of lawyers and lawsuits. They have been conditioned to believe that with such threats the university will ultimately back down.
4) In the face of these difficulties much more thought, time and energy must be spent in designing DL courses to maintain academic integrity.
While the challenge to protect Academic Integrity is
common to course offerings in both the online and
traditional (in-class) environments, courses presented in a purely DL environment raise special concerns for the implementation of protective measures. Anecdotally, this case raises several pressing questions over the preservation of academic integrity, particularly for courses offered in the DL environment:
1) If a student has resources that give him/her an unfair advantage over other students, does this constitute unethical behavior, violations of the University's Standards of Conduct, or cheating? Is a student obligated to reveal the possession of such sources when queried by the instructor? How do these facts relate to the student's Honor Code?
2) If a student obtains instructor materials, such as the authors' test banks for the course textbook, which give him/her prior knowledge of examination questions and therefore an unfair advantage over other students, does this constitute unethical behavior or cheating? If this material can be obtained from an open website, does this change the fact that its possession is a clear violation of the University's Standards of Conduct?
3) What degree of collusive activity in examinations, if any, is acceptable? To what level of certainty must an instructor prove that such collusion did in fact occur? Upon whom does the "burden of proof" fall? Are the standards for burden of proof the same in academic cases as those in a civil court of law?
4) What is the appropriate statement of the University's Standards of Conduct? How can such dishonest activity be controlled in the DL format? Will purely technology-based solutions be sufficient?
5) In today's ever-expanding hi-tech environment, is it possible to write a statement of the Standards of Conduct which is comprehensive and which will withstand the scrutiny of attorneys in a court of law? What are the bounds of "academic freedom?"
6) How must these standards be communicated in a course syllabus? Obviously it is not possible or practical to include in a reasonable course syllabus an exhaustive list of possible means and mechanisms an unethical student may employ to circumvent the Academic Code. Do these omissions from a course syllabus constitute a legal loophole that allows students to behave in an unchecked manner?
7) What are the legal implications? To what extent does enforcement of these standards put the University at risk for lawsuits? To what extent is an instructor personally liable if his/her actions to enforce the Academic Code are taken without malice, prejudice or bias and are not conducted in an arbitrary or capricious manner?
8) Have student and society attitudes toward academic integrity changed? If so, at what point in a student's development and education does this change occur? What factors influence these changes in attitude?
9) Are there differing views of academic integrity among the student, academic and working professional populations?
10) What are the special implications of these issues for courses taught in the DL format? Are DL students more likely to commit actions in violation of the Academic Code? If so, what factors influence that disturbing trend?
Troy University's multi-faceted approach to controlling the Distance Learning testing environment, using the Securexam Remote ProctorTM to replace the human proctor at the testing site, combined with its well-published Academic Code, strong Student Honor Code, and redesigned course structures and templates, is a fully integrated attempt to enhance the academic integrity of its online programs. The overall
effectiveness of this approach is yet to be
determined and will be closely scrutinized as
students with high-tech tools make new assaults upon
eCampus academic integrity. More research is
required to determine if this approach will be
successful, particularly for quantitatively intense
material. In the example cited in this paper, the evidence used to conclusively prosecute the offending students was, in large part, based upon detailed analysis of hand-written answer sheets, comparing their quantitative and qualitative content. The analysis included factors such as the
extent and nature of the quantitative detail
presented, specific material included or omitted in
each student’s responses, the nature and precision
of the numbers displayed, and the totality of the
content compared to other students’ answer sheets.
Designing online examinations which provide this
level of detail and diagnostic capability will be
yet another challenge for this delivery medium.
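A rough, automated proxy for one part of that comparison is a wording-similarity score between written responses. The sketch below uses invented answers and Python's standard difflib module to illustrate the idea; it is not the method the authors actually applied.

```python
from difflib import SequenceMatcher

# Invented written responses to the same exam problem.
answer_a = "The mean is 42.37 because we sum the observations and divide by n = 30."
answer_b = "The mean is 42.37 because we sum the observations and divide by n=30."
answer_c = "Average computed as total over count, giving approximately 42.4."

def wording_similarity(x: str, y: str) -> float:
    """Character-level similarity ratio between two answers (0.0 to 1.0)."""
    return SequenceMatcher(None, x.lower(), y.lower()).ratio()

print(f"A vs B: {wording_similarity(answer_a, answer_b):.2f}")  # near-identical wording
print(f"A vs C: {wording_similarity(answer_a, answer_c):.2f}")  # independently worded
```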
Future Research
In order to answer the many pressing questions cited above, the authors propose the following plan for future research:
1) The authors will conduct a review of Troy University's history with respect to violations of its Standards of Conduct, sorted by appropriate demographic factors. These experiences will be compared to those of other universities by searching the appropriate literature and surveying institutions that wish to contribute. These efforts will attempt to identify significant trends in academia's views of ethical behavior, if any.
2) Through multi-university collaborations with other researchers in the field, the authors will compare various approaches to establishing effective (technology-based and non-technology-based) solutions.
3) As the Remote Proctor system is fully implemented for all its eCampus offerings, statistical data will be gathered to determine the impact, if any, of the introduction of the technology into the DL environment.
4) A set of surveys will be used to determine differences in views of academic integrity between the current student and instructor populations. The survey will include students and instructors in both the traditional (in-class) and Distance Learning environments. The survey will identify those acts that both populations consider to be unethical, violations of the Academic Code, or cheating.
5) Statistical analyses will be used to identify meaningful correlations and trends in student and instructor views sorted by appropriate demographic factors (e.g., age, sex, university major, membership in fraternities or sororities, participation in athletics, employment).
6) Local businesses will be surveyed to determine employer attitudes towards academic integrity. The survey will include sufficient demographic information to determine which factors influence employer attitudes towards ethical behavior, cheating and academic integrity. These results will be contrasted with those obtained from surveys of academia.
7) Statistical analyses will be used to identify significant trends and changes in an individual's attitudes as he/she transitions from pupil to university student and ultimately to working professional.
8) Based upon this research, potential revisions to university policies toward Standards of Conduct, Academic Codes and Student Honor Codes will be formulated, as well as corresponding adjustments to course syllabi and student handbooks, particularly for DL courses.
References
Argetsinger, A. (2003). U-Maryland Says Students Use
Phone to Cheat – Text Messaging Delivers Test
Answers. Retrieved on June 15, 2006 from the
Washington Post website.
Butterfield, K.; McCabe, D.; Klebe Trevino, L.
(1999). Academic Integrity in Honor Code and
Non-Honor Code Environments: A Qualitative
Investigation. Journal of Higher Education, Vol.
70.
Callahan, D. (2004). The Cheating Culture: Why
More Americans Are Doing Wrong to Get Ahead.
Harcourt Publishers.
Conradson, S., Hernandez-Ramos, P. (2004).
Computers, the Internet, and Cheating Among
Secondary School Students; Some Implications for
Educators. Retrieved on June 15, 2006 from Santa
Clara University website.
Diekhoff, G., LaBeff, E. Clark, R., Williams, L.,
Francis, B., Haines V. (1996). College Cheating: Ten
Years Later. Research in Higher Education, Vol.
37, No. 4, pp 487-502.
Educational Testing Service Research Center (2006).
Academic Cheating Fact Sheet. Retrieved on June 15, 2006 from the ETS website.
Fellgurth, J. (2003). Cyber Cheating. Cheatweb, www.hartnell.edu. Accessed November 10, 2003.
Gray, S. (1998). Maintaining Academic Integrity in
Web-Based Instruction. Education Resources
Information Center, Vol. 35, No. 3, pp 186-188.
Greene, A.S., Saxe, L. (1992). Everybody (Else) Does It: Academic Cheating. Retrieved on June 15, 2006 from the Education Resources Information Center website.
Harding, T.S.; Carpenter, D.D.; Montgomery, S.M.;
Steneck, N.H. (2001). The Current State of Research
on Academic Dishonesty Among Engineering Students.
31st Annual Frontiers in Education
Conference.
Heberling, M., (2002). Maintaining Academic
Integrity in Online Education. Retrieved on June 15,
2006 from the Online Journal of Distance Learning
Administration, Vol. 5, No. 1.
Hendershott, A., Drinan, P., Cross, M. (1999).
Toward Enhancing a Culture of Academic Integrity.
NASPA Journal, Vol. 37, No. 4, pp 587-598.
Johnson, S. U. (2006). Striving for Excellence – Ensuring Online Testing Integrity. Distance Learning Administration Annual, University of West Georgia.
Jordan, A. (2001). College Student Cheating: The
Role of Motivation, Perceived Norms, Attitudes, and
Knowledge of Institutional Policy. Ethics and
Behavior, Vol. 11, No. 3, pp 233-247.
McCabe, D., Klebe Trevino, L. (1993). Academic
Dishonesty: Honor Codes and Other Contextual
Influences. Journal of Higher Education, Vol. 64.
McCabe, D. (1995). Honor Codes and Student
Cheating. Retrieved June 15, 2006 from Rutgers
University website.
McCabe, D., Klebe Trevino, L. (1997). Individual and
Contextual Influences on Academic Dishonesty: A
Multicampus Investigation. Research in Higher
Education, Vol. 38, No. 3, pp 379-396.
McCabe, D.; Klebe Trevino, L.; Butterfield, K. (2001). Cheating in Academic Institutions: A Decade of Research. Ethics and Behavior, Vol. 11, No. 3, pp 219-232.
McCabe, D., Pavela, G., (2000). Some Good News About
Academic Integrity. Change, Vol. 32, No. 5, pp
32-38.
McCabe, D. (2005). Levels of Cheating and Plagiarism
Remain High, Honor Codes and Modified Codes are
Shown To Be Effective in Reducing Academic
Misconduct. Retrieved on June 15, 2006 from the Center
for Academic Integrity, Duke University website.
McCabe, C., Ingram, R., Conway Dato-on, M. (2006).
The Business of Ethics and Gender. Journal of
Business Ethics; Vol. 64, No. 2, pp 101-116.
Overholser, G. (1999). Case Study: Minnesota’s
Basketball Cheating Scandal. Retrieved on June 15,
2006 from the Des Moines Register website.
Ruegger, D., King, E. (1992). A Study on the Effect
of Age and Gender Upon Student Business Ethics.
Journal of Business Ethics, Vol. 11, No. 3, pp
179-186.
Slobogin, K. (2002). Many Students Say Cheating's OK
– Confessed Cheater: “What’s important is getting
ahead”. Retrieved on June 15, 2006 from CNN
website.
Smith, S. (2006). At What Age Do Children Start Cheating? Clearinghouse, Missouri Western University. Accessed November 6, 2006.
Troy University (2006). Troy University Undergraduate Catalog, 2005-2006.
Vos Savant, M. (2006). Ask Marilyn. Parade
Magazine, April 9, 2006.