Become a Writer Today

Essays About Cheating: Top 5 Examples and 9 Writing Prompts

Essays about cheating show the value of honesty. See our top picks for examples and prompts you can use in your writing.

In the US, 95% of high school students have admitted to participating in some form of academic cheating, including cheating on exams and plagiarism. However, cheating doesn’t only occur in schools; it’s also prevalent among couples, with psychologists attributing 50% of divorce cases in the country to infidelity. Other forms of cheating exist as well, such as cheating on a diet or in a business deal.

Because cheating is an intriguing subject, many want to read about it. However, to write essays about cheating appropriately, you must first pick a subtopic you’re comfortable discussing. Therefore, we have selected five simple but exemplary pieces you can read to get inspiration for writing your paper.

See below our round-up of top example essays about cheating.

1. Long Essay On Cheating In School By Prasanna

2. The Reality Of Cheating In College Essay by Writer Kip

3. Why Cheating Is Wrong by Bernadette McBride

4. What Counts As Cheating In A Relationship by Anonymous on GradesFixer

5. Emotional Cheating by Anonymous on PapersOwl

The nine writing prompts covered further below are: 1. Types of Cheating; 2. I Was Cheated On; 3. Is Cheating a Mistake or a Choice?; 4. Tax Evasion and Cheating; 5. When I Cheated; 6. Cheating in American Schools and Universities; 7. Review a Famous Book or Film About Cheating; 8. A Famous Cheating Quote; 9. Causes and Effects of Cheating.

“Cheating is a false representation of the child’s ability which he may not be able to give without cheating. It is unfair to everyone involved as it deprives the true one of the chance to come on the top.”

Prasanna begins the essay by defining cheating in schools and then incorporates how this unethical behavior occurs in reality. She further delves into the argument that cheating is not learning but an addiction that can result in students losing self-confidence, sanity, and integrity. 

Apart from showing the common causes and harmful effects of cheating on students, Prasanna also adds parents’ and teachers’ critical roles in helping students in their studies to keep them from cheating.

“It’s human nature to want to win, and some of us will go against the rules to do so. It can be harmless, but in many cases, it is annoying, or even hurtful.”

Kip defines cheating as human nature and focuses his essay on individuals who are hell-bent on wanting to win in online games. Unfortunately, these players’ desire to be on top is all-consuming, and they’re willing to go against the rules and disregard their integrity.

He talks about his experiences of being cheated in a game called AoE. He also incorporates the effects of these instances on newbies. These cheaters will humiliate, dishearten, and traumatize beginners who only want to have fun.


“A cheater is more than likely lying to themselves more than to the people around them. A person can only go so far before their lies catch up to them, begin to accumulate, and start to penalize you.”

McBride dedicates her essay to answering why cheating is wrong, no matter the circumstance. She points out that there will always be a definite punishment for cheaters, whether or not they get caught. McBride believes that students who cheat, copy, or have someone else do their work are lazy and irresponsible, and that these students will never gain knowledge.

However, she also acknowledges that some cheaters are desperate, while some don’t realize the repercussions of their behaviors. At the end of the essay, she admits to cheating but says she’s no longer part of that vicious cycle, promising she has already realized her mistakes and doesn’t want to cheat again.

“Keep in mind that relationships are not based on logic, but are influenced by our emotions.”

The author explains how challenging it is to define cheating in a relationship because everyone views the topic differently: what some consider an affair may be acceptable to others. The essay examines partners’ interactions with other people, such as flirting, sleeping in the same bed, or spending time together, alongside each individual’s personality.

The essay further explains experts’ opinions on why men and women cheat and how partners heal and rebuild their trust. Finally, examples of different forms of cheating are discussed in the piece to give the readers more information on the subject. 

“…emotional cheating can be described as a desire to engage in another relationship without physically leaving his or her primary relationship.”

There’s an ongoing debate about whether emotional cheating should be labeled as such. The essay digs into the causes of emotional cheating to answer this issue. These reasons include lack of attention to each other, shortage of affectionate gestures, and misunderstandings or absence of proper communication. 

All of these may lead a partner to compare their relationship with others. Soon, they fall out of love and fail to maintain boundaries, which leads to insensitivity and selfishness. When a person in a relationship feels any of these, it can become a reason to look for someone else who values them and their feelings.

9 Helpful Prompts in Writing Essays About Cheating

Here are some cheating subtopics you can focus your essay on:

1. Types of Cheating

Some types of cheating include deception, fabrication, bribery, impersonation, sabotage, and professional misconduct. Explain their definitions and have examples to make it easier for readers to understand.

You can use this prompt even if you don’t have any personal experience of being cheated on. You can instead relay events from a close friend or relative. First, narrate what happened and why. Then add what the person did to move on from the situation and how it affected them. Finally, incorporate lessons they’ve learned.

This topic is still widely debated, so take a position: is cheating a redeemable mistake, or a choice with consequences? Express your opinion on the matter, and gather reliable evidence, such as studies and research findings, to support your claims and increase your essay’s credibility.

Tax evasion is a crime with severe penalties. Explain what it is and its punishments through a famous tax evasion case your readers can immediately recognize. For example, you can use Al Capone, who received an 11-year prison sentence and owed $215,000 in back taxes. Explain why he was charged and add your opinion. Ensure you have adequate and reliable sources to back up your claims.

Start with a five-paragraph essay to better organize your points.

Some say everyone will cheat at some point in their life. Talk about a time you cheated, whether on a school exam, at work, or on a diet. Describe the perspective that made cheating seem reasonable at the time. Did you feel guilt? What did you do afterward, and did you cheat again? Answer these questions in your essay for an engaging and thrilling piece of writing.

Since academic cheating is notorious in America, use this topic for your essay. Find out which areas have high rates of academic cheating. What are the penalties? Why is cheating widespread? Include any measures academic institutions have put in place to curb it.

Cheating is a frequent cause of conflict on small and big screens. Watch a film or read a story and write a review. Briefly summarize the plot, critique the characters, and add your realizations after finishing the piece. 

Goodreads has a list of books related to cheating. Currently, Thoughtless by S.C. Stephens has the highest rating.

Use this prompt as an opportunity to write a unique essay by explaining a quote based on your understanding. It can be a quote from a famous personality or one that resonates with you and your experiences.

Since the causes and effects of cheating are a standard prompt, center your essay on an area unrelated to academics or relationships. For instance, write about cheating on your diet or cheating yourself out of the opportunities life presents you.

Create a top-notch essay with excellent grammar. See our list of the best grammar checkers.


Maria Caballero is a freelance writer who has been writing since high school. She believes that being a writer is not only about excellent syntax and semantics but also about knowing how to weave words together to communicate effectively with any reader.



Buying College Essays Is Now Easier Than Ever. But Buyer Beware

Tovia Smith


Concern is growing about a burgeoning online market for essays that students can buy and turn in as their own work. And schools are trying new tools to catch it. (Illustration: Angela Hsieh/NPR)

As the recent college admissions scandal is shedding light on how parents are cheating and bribing their children's way into college, schools are also focusing on how some students may be cheating their way through college. Concern is growing about a burgeoning online market that makes it easier than ever for students to buy essays written by others to turn in as their own work. And schools are trying new tools to catch it.

It's not hard to understand the temptation for students. The pressure is enormous, the stakes are high and, for some, writing at a college level is a huge leap.

"We didn't really have a format to follow, so I was kind of lost on what to do," says one college freshman, who struggled recently with an English assignment. One night, when she was feeling particularly overwhelmed, she tweeted her frustration.

"It was like, 'Someone, please help me write my essay!' " she recalls. She ended her tweet with a crying emoji. Within a few minutes, she had a half-dozen offers of help.

"I can write it for you," they tweeted back. "Send us the prompt!"

The student, who asked that her name not be used for fear of repercussions at school, chose one that asked for $10 per page, and she breathed a sigh of relief.

"For me, it was just that the work was piling up," she explains. "As soon as I finish some big assignment, I get assigned more things, more homework for math, more homework for English. Some papers have to be six or 10 pages long. ... And even though I do my best to manage, the deadlines come closer and closer, and it's just ... the pressure."

In the cat-and-mouse game of academic cheating, students these days know that if they plagiarize, they're likely to get caught by computer programs that automatically compare essays against a massive database of other writings. So now, buying an original essay can seem like a good workaround.

"Technically, I don't think it's cheating," the student says. "Because you're paying someone to write an essay, which they don't plagiarize, and they write everything on their own."

Her logic, of course, ignores the question of whether she's plagiarizing. When pressed, she begins to stammer.

"That's just a difficult question to answer," she says. "I don't know how to feel about that. It's kind of like a gray area. It's maybe on the edge, kind of?"

Besides, she adds, she probably won't use all of it.

Other students justify essay buying as the only way to keep up. They figure that everyone is doing it one way or another — whether they're purchasing help online or getting it from family or friends.

"Oh yeah, collaboration at its finest," cracks Boston University freshman Grace Saathoff. While she says she would never do it herself, she's not really fazed by others doing it. She agrees with her friends that it has pretty much become socially acceptable.

"I have a friend who writes essays and sells them," says Danielle Delafuente, another Boston University freshman. "And my other friend buys them. He's just like, 'I can't handle it. I have five papers at once. I need her to do two of them, and I'll do the other three.' It's a time management thing."

The war on contract cheating

"It breaks my heart that this is where we're at," sighs Ashley Finley, senior adviser to the president for the Association of American Colleges and Universities. She says campuses are abuzz about how to curb the rise in what they call contract cheating. Obviously, students buying essays is not new, but Finley says that what used to be mostly limited to small-scale side hustles has mushroomed on the internet to become a global industry of so-called essay mills. Hard numbers are difficult to come by, but research suggests that up to 16 percent of students have paid someone to do their work and that the number is rising.

"Definitely, this is really getting more and more serious," Finley says. "It's part of the brave new world for sure."

The essay mills market aggressively online, with slickly produced videos inviting students to "Get instant help with your assignment" and imploring them: "Don't lag behind," "Join the majority" and "Don't worry, be happy."

"They're very crafty," says Tricia Bertram Gallant, director of the Academic Integrity Office at the University of California in San Diego and a board member of the International Center for Academic Integrity.

The companies are equally brazen offline — leafleting on campuses, posting flyers in toilet stalls and flying banners over Florida beaches during spring break. Companies have also been known to bait students with emails that look like they're from official college help centers. And they pay social media influencers to sing the praises of their services, and they post testimonials from people they say are happy customers.

"I hired a service to write my paper and I got a 90 on it!" gloats one. "Save your time, and have extra time to party!" advises another.

"It's very much a seduction," says Bertram Gallant. "So you can maybe see why students could get drawn into the contract cheating world."

YouTube has been cracking down on essay mills; it says it has pulled thousands of videos that violate its policies against promoting dishonest behavior.

But new videos constantly pop up, and their hard sell flies in the face of their small-print warnings that their essays should be used only as a guide, not a final product.

Several essay mills declined or didn't respond to requests to be interviewed by NPR. But one answered questions by email and offered up one of its writers to explain her role in the company, called EduBirdie.

"Yes, just like the little birdie that's there to help you in your education," explains April Short, a former grade school teacher from Australia who's now based in Philadelphia. She has been writing for a year and a half for the company, which bills itself as a "professional essay writing service for students who can't even."

Some students just want some "foundational research" to get started or a little "polish" to finish up, Short says. But the idea that many others may be taking a paper written completely by her and turning it in as their own doesn't keep her up at night.

"These kids are so time poor," she says, and they're "missing out on opportunities of travel and internships because they're studying and writing papers." Relieving students of some of that burden, she figures, allows them to become more "well-rounded."

"I don't necessarily think that being able to create an essay is going to be a defining factor in a very long career, so it's not something that bothers me," says Short. Indeed, she thinks students who hire writers are demonstrating resourcefulness and creativity. "I actually applaud students that look for options to get the job done and get it done well," she says.

"This just shows you the extent of our ability to rationalize all kinds of bad things we do," sighs Dan Ariely, professor of psychology and behavioral economics at Duke University. The rise in contract cheating is especially worrisome, he says, because when it comes to dishonest behavior, more begets more. As he puts it, it's not just about "a few bad apples."


"Instead, what we have is a lot ... of blemished apples, and we take our cues for our behavior from the social world around us," he says. "We know officially what is right and what's wrong. But really what's driving our behavior is what we see others around us doing" or, Ariely adds, what we perceive them to be doing. So even the proliferation of advertising for essays mills can have a pernicious effect, he says, by fueling the perception that "everyone's doing it."

A few nations have recently proposed or passed laws outlawing essay mills, and more than a dozen U.S. states have laws on the books against them. But prosecuting essay mills, which are often based overseas in Pakistan, Kenya and Ukraine, for example, is complicated. And most educators are loath to criminalize students' behavior.

"Yes, they're serious mistakes. They're egregious mistakes," says Cath Ellis, an associate dean and integrity officer at the University of New South Wales, where students were among the hundreds alleged to have bought essays in a massive scandal in Australia in 2014.

"But we're educational institutions," she adds. "We've got to give students the opportunity to learn from these mistakes. That's our responsibility. And that's better in our hands than in the hands of the police and the courts."

Staying one step ahead

In the war on contract cheating, some schools see new technology as their best weapon and their best shot to stay one step ahead of unscrupulous students. The company that makes the Turnitin plagiarism detection software has just upped its game with a new program called Authorship Investigate.

The software first inspects a document's metadata, like when it was created, by whom it was created and how many times it was reopened and re-edited. Turnitin's vice president for product management, Bill Loller, says sometimes it's as simple as looking at the document's name. Essay mills typically name their documents something like "Order Number 123," and students have been known to actually submit it that way. "You would be amazed at how frequently that happens," says Loller.

Using cutting-edge linguistic forensics, the software also evaluates the level of writing and its style.

"Think of it as a writing fingerprint," Loller says. The software looks at hundreds of telltale characteristics of an essay, like whether the author double spaces after a period or writes with Oxford commas or semicolons. It all gets instantly compared against a student's other work, and, Loller says, suspicions can be confirmed — or alleviated — in minutes.

"At the end of the day, you get to a really good determination on whether the student wrote what they submitted or not," he says, "and you get it really quickly."

Coventry University in the U.K. has been testing out a beta version of the software, and Irene Glendinning, the school's academic manager for student experience, agrees that the software has the potential to give schools a leg up on cheating students. After the software is officially adopted, "we'll see a spike in the number of cases we find, and we'll have a very hard few years," she says. "But then the message will get through to students that we've got the tools now to find these things out." Then, Glendinning hopes, students might consider contract cheating to be as risky as plagiarizing.

In the meantime, schools are trying to spread the word that buying essays is risky in other ways as well.

Professor Ariely says that when he posed as a student and ordered papers from several companies, much of it was "gibberish" and about a third of it was actually plagiarized.

Even worse, when he complained to the company and demanded his money back, they resorted to blackmail. Still believing him to be a student, the company threatened to tell his school he was cheating. Others say companies have also attempted to shake down students for more money, threatening to rat them out if they didn't pay up.

The lesson, Ariely says, is "buyer beware."

But ultimately, experts say, many desperate students may not be deterred by the risks — whether from shady businesses or from new technology.

Bertram Gallant, of UC San Diego, says the right way to dissuade students from buying essays is to remind them why it's wrong.

"If we engage in a technological arms race with the students, we won't win," she says. "What are we going to do when Google glasses start to look like regular glasses and a student wears them into an exam? Are we going to tell them they can't wear their glasses because we're afraid they might be sending the exam out to someone else who is sending them back the answers?"

The solution, Bertram Gallant says, has to be about "creating a culture where integrity and ethics matter" and where education is valued more than grades. Only then will students believe that cheating on essays is only cheating themselves.


Why Students Cheat—and What to Do About It

A teacher seeks answers from researchers and psychologists. 

“Why did you cheat in high school?” I posed the question to a dozen former students.

“I wanted good grades and I didn’t want to work,” said Sonya, who graduates from college in June. [The students’ names in this article have been changed to protect their privacy.]

My current students were less candid than Sonya. To excuse her plagiarized Cannery Row essay, Erin, a ninth-grader with straight As, complained vaguely and unconvincingly of overwhelming stress. When he was caught copying a review of the documentary HyperNormalisation, Jeremy, a senior, stood by his "hard work" and said my accusation hurt his feelings.

Cases like the much-publicized (and enduring) 2012 cheating scandal at high-achieving Stuyvesant High School in New York City confirm that academic dishonesty is rampant and touches even the most prestigious of schools. The data confirms this as well. A 2012 report from the Josephson Institute's Center for Youth Ethics revealed that more than half of high school students admitted to cheating on a test, while 74 percent reported copying their friends' homework. And a survey of 70,000 high school students across the United States between 2002 and 2015 found that 58 percent had plagiarized papers, while 95 percent admitted to cheating in some capacity.

So why do students cheat—and how do we stop them?

According to researchers and psychologists, the real reasons vary just as much as my students’ explanations. But educators can still learn to identify motivations for student cheating and think critically about solutions to keep even the most audacious cheaters in their classrooms from doing it again.

Rationalizing It


First, know that students realize cheating is wrong—they simply see themselves as moral in spite of it.

“They cheat just enough to maintain a self-concept as honest people. They make their behavior an exception to a general rule,” said Dr. David Rettinger , professor at the University of Mary Washington and executive director of the Center for Honor, Leadership, and Service, a campus organization dedicated to integrity.

According to Rettinger and other researchers, students who cheat can still see themselves as principled people by rationalizing cheating for reasons they see as legitimate.

Some do it when they don’t see the value of work they’re assigned, such as drill-and-kill homework assignments, or when they perceive an overemphasis on teaching content linked to high-stakes tests.

“There was no critical thinking, and teachers seemed pressured to squish it into their curriculum,” said Javier, a former student and recent liberal arts college graduate. “They questioned you on material that was never covered in class, and if you failed the test, it was progressively harder to pass the next time around.”

But students also rationalize cheating on assignments they see as having value.

High-achieving students who feel pressured to attain perfection (and Ivy League acceptances) may turn to cheating as a way to find an edge on the competition or to keep a single bad test score from sabotaging months of hard work. At Stuyvesant, for example, students and teachers identified the cutthroat environment as a factor in the rampant dishonesty that plagued the school.

And research has found that students who receive praise for being smart—as opposed to praise for effort and progress—are more inclined to exaggerate their performance and to cheat on assignments , likely because they are carrying the burden of lofty expectations.

A Developmental Stage

When it comes to risk management, adolescent students are bullish. Research has found that teenagers are biologically predisposed to be more tolerant of unknown outcomes and less bothered by stated risks than their older peers.

“In high school, they’re risk takers developmentally, and can’t see the consequences of immediate actions,” Rettinger says. “Even delayed consequences are remote to them.”

While cheating may not be a thrill ride, students already inclined to rebel against curfews and dabble in illicit substances have a certain comfort level with being reckless. They’re willing to gamble when they think they can keep up the ruse—and more inclined to believe they can get away with it.

Cheating also appears to be almost contagious among young people, and it may even serve as a kind of social adhesive, at least in environments where it is widely accepted. A study of military academy students from 1959 to 2002 revealed that students in communities where cheating is tolerated easily cave in to peer pressure, finding it harder to resist cheating for fear of losing social status.

Michael, a former student, explained that while he didn’t need to help classmates cheat, he felt “unable to say no.” Once he started, he couldn’t stop.


Technology Facilitates and Normalizes It

With smartphones and Alexa at their fingertips, today’s students have easy access to quick answers and content they can reproduce for exams and papers.  Studies show that technology has made cheating in school easier, more convenient, and harder to catch than ever before.

To Liz Ruff, an English teacher at Garfield High School in Los Angeles, students’ use of social media can erode their understanding of authenticity and intellectual property. Because students are used to reposting images, repurposing memes, and watching parody videos, they “see ownership as nebulous,” she said.

As a result, while they may want to avoid penalties for plagiarism, they may not see it as wrong or even know that they’re doing it.

This confirms what Donald McCabe, a Rutgers University Business School professor,  reported in his 2012 book ; he found that more than 60 percent of surveyed students who had cheated considered digital plagiarism to be “trivial”—effectively, students believed it was not actually cheating at all.

Strategies for Reducing Cheating

Even moral students need help acting morally, said  Dr. Jason M. Stephens , who researches academic motivation and moral development in adolescents at the University of Auckland’s School of Learning, Development, and Professional Practice. According to Stephens, teachers are uniquely positioned to infuse students with a sense of responsibility and help them overcome the rationalizations that enable them to think cheating is OK.

1. Turn down the pressure cooker. Students are less likely to cheat on work in which they feel invested. A multiple-choice assessment tempts would-be cheaters, while a unique, multiphase writing project measuring competencies can make cheating much harder and less enticing. Repetitive homework assignments are also a culprit, according to research , so teachers should look at creating take-home assignments that encourage students to think critically and expand on class discussions. Teachers could also give students one free pass on a homework assignment each quarter, for example, or let them drop their lowest score on an assignment.

2. Be thoughtful about your language.   Research indicates that using the language of fixed mindsets , like praising children for being smart as opposed to praising them for effort and progress , is both demotivating and increases cheating. When delivering feedback, researchers suggest using phrases focused on effort like, “You made really great progress on this paper” or “This is excellent work, but there are still a few areas where you can grow.”

3. Create student honor councils. Give students the opportunity to enforce honor codes or write their own classroom/school bylaws through honor councils so they can develop a full understanding of how cheating affects themselves and others. At Fredericksburg Academy, high school students elect two Honor Council members per grade. These students teach the Honor Code to fifth graders, who, in turn, explain it to younger elementary school students to help establish a student-driven culture of integrity. Students also write a pledge of authenticity on every assignment. And if there is an honor code transgression, the council gathers to discuss possible consequences. 

4. Use metacognition. Research shows that metacognition, a process sometimes described as “ thinking about thinking ,” can help students process their motivations, goals, and actions. With my ninth graders, I use a centuries-old resource to discuss moral quandaries: the play Macbeth . Before they meet the infamous Thane of Glamis, they role-play as medical school applicants, soccer players, and politicians, deciding if they’d cheat, injure, or lie to achieve goals. I push students to consider the steps they take to get the outcomes they desire. Why do we tend to act in the ways we do? What will we do to get what we want? And how will doing those things change who we are? Every tragedy is about us, I say, not just, as in Macbeth’s case, about a man who succumbs to “vaulting ambition.”

5. Bring honesty right into the curriculum. Teachers can weave a discussion of ethical behavior into curriculum. Ruff and many other teachers have been inspired to teach media literacy to help students understand digital plagiarism and navigate the widespread availability of secondary sources online, using guidance from organizations like Common Sense Media .

There are complicated psychological dynamics at play when students cheat, according to experts and researchers. While enforcing rules and consequences is important, knowing what’s really motivating students to cheat can help you foster integrity in the classroom instead of just penalizing the cheating.

Academic Dishonesty: 5 Methods of Identifying Cheating and Plagiarism


One aspect of teaching that can leave an instructor feeling pessimistic and disheartened is when a student attempts to gain an unfair advantage.  Most of the time this is labeled simply as cheating, defined as intentionally using or attempting to use unauthorized materials on any academic exercise, or plagiarism, the appropriation or use of another person's ideas, results, or words without giving appropriate credit, but we also see instances of fabrication and other acts of dishonesty.  What can you do to combat acts of academic dishonesty?  This article is meant to help faculty members at any level, including teaching assistants, identify possible occurrences of academic dishonesty.

1. Know Your School’s Policies & Be Transparent with Your Students

Whether you are a new faculty member at an institution, taking on a more extensive teaching role at your current institution, or a long-time teacher implementing curriculum changes, you must identify and know the school’s policy and rules regarding academic honesty and creating a fair classroom environment.  Each faculty member may enforce the rules differently, but it’s critical that students know your classroom rules and expectations upfront. A few key items to consider:

  • Do you want them to work with other students on their homework?
  • What rules and procedures do you have for assignments, reports, and exams?
  • Put this information in your syllabus and discuss it transparently with students on Day 1 of your course.

If one of your students performs an act of academic dishonesty in your course, this will allow you to enforce the sanctions professionally.  If you don’t know where to find this information, ask your faculty mentor or your university’s appropriate administrative office.  These offices are usually the academic honor office, the department or college office, or the Dean of Faculties office, depending on the institution.

2. Watch for the Methods Students Use to Cheat and Plagiarize

The reasons why students cheat have not changed, but how students cheat has changed dramatically.  Typically, there is an assumption that most cheaters are bad or failing students, but students cheat for a multitude of reasons: poor time management skills, a tough class schedule, stress and anxiety, or poor communication of the rules by their faculty members.  The use of social media and other electronic resources has changed academia over the last 20 years. A few examples of cheating methods to watch out for include:

  • Social Media Communication: Students discuss test questions and individual assignments via social media and other chat apps to give their friends and colleagues academic advantages. 
  • Smartphones: Many students take pictures of their answers with their smartphones and send them to others using text messages.
  • Smartwatches: Recently, smartwatches have become more prevalent and allow communication and internet browsing without the use of a cell phone.  They allow students to access study files and answers that were not authorized by the faculty member. 
  • Groups that Share Tests: Many student organizations have tests and assignments from previous semesters that allow students to look up questions from a faculty member or specific class. 
  • Unauthorized Help: Tutoring services will discuss how to “beat a test” or “write the perfect paper” by giving students unauthorized aid. This can also include groups or individuals who may offer to write a paper or take a test for a fee on behalf of the student.

Being a smart faculty member means knowing that these outside resources are available and identifying when they are being used improperly.

3. Be Proactive, Not Just Reactive

In some instances of academic dishonesty, the origin of the problem traces back to the faculty member not taking a proactive role in preventing it.

  • Fully Established Boundaries: The first place for immediate improvement is discussing unacceptable acts on the first day of class and in the syllabus.  Many faculty members include only the minimum required statement in their syllabus, which does not properly set boundaries for student academic honesty.  Establishing such boundaries might include informing students of the use of plagiarism detection software, describing acceptable behavior and communication about assignments on social media, and clarifying what help is acceptable on homework, essays, and reports.
  • Variety in Assessment: Another place where faculty can improve is in writing different assignments or multiple forms of exams.  Changing how you ask questions, which essay prompts you use, and creating different exam forms can be time-consuming. However, this effort rewards students with a fair and objective assessment.  If you are concerned about academic dishonesty in your course, putting in some work early will benefit your course in the long run.

4. Grade Assignments, Reports, and Essays Attentively

Most of the time, trust your instincts when looking for possible occurrences of academic dishonesty.  When grading assignments, work that seems more advanced than the student's level, or that does not follow the question prompt, can be a strong indication of plagiarism. A few ways to validate these concerns and provide either "proof" of or deterrents to this behavior include:

  • Show Your Work: Require multiple drafts of a paper and give feedback regarding citation standards throughout the writing process. 
  • Side-by-Side Grading: If you have research papers or lab reports in which students worked with a partner or in a group, grade the assignments side-by-side.  While the data or general content may be the same, direct copying will be more apparent. 
  • Online Plagiarism Checkers: Technology has been developed to help identify plagiarism.  Websites such as Turnitin.com, Unicheck, PlagarismSearch, and others have students upload their essays/reports and then compare each submission against other online resources and against papers turned in for other courses or at other institutions (a simplified sketch of this overlap idea follows this list).  Many schools have licenses for this technology, and you should utilize it on any type of critical thinking or writing assignment.
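
The following is a minimal sketch of the overlap idea behind such checkers. It is not how Turnitin, Unicheck, PlagarismSearch, or any other named service actually works; the reference corpus, texts, and the 30% threshold are invented for illustration.

```python
# Simplified sketch of overlap-based plagiarism screening.
# Not the algorithm used by any real service; the reference corpus and
# the 30% threshold below are assumptions chosen for illustration.

def ngrams(text: str, n: int = 5) -> set:
    """Return the set of word n-grams in a text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap_score(submission: str, source: str, n: int = 5) -> float:
    """Fraction of the submission's n-grams that also appear in the source."""
    sub = ngrams(submission, n)
    if not sub:
        return 0.0
    return len(sub & ngrams(source, n)) / len(sub)

# Hypothetical reference corpus: prior submissions and online sources.
corpus = {
    "prior_submission_1": "The Great Depression ended largely because of wartime mobilization ...",
    "web_source_1": "Euthanasia amounts to taking the place of God according to some critics ...",
}

submission = "Some historians argue the Great Depression ended largely because of wartime mobilization and spending."

for name, source in corpus.items():
    score = overlap_score(submission, source)
    if score > 0.30:  # flag for human review above an arbitrary 30% overlap
        print(f"possible match with {name}: {score:.0%} of 5-grams shared")
```

Real services compare submissions against billions of web pages and previously submitted papers and use far more robust matching, but the core signal, shared stretches of text, is the same.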

5. Manage Exam Administration and Proctoring

Most attention to deterring cheating is focused on exams.  A few methods that can specifically help discourage academic dishonesty during these high-stakes assessments include:

  • Assigned Seats: A good first step is to assign seats for each exam. While this might be challenging in a large lecture hall, it minimizes the chance of friends and study partners sitting next to each other, thereby limiting student interaction.  It also allows faculty or proctors to know who is present to take the exam.
  • Variety & Alterations by Section: As mentioned before, having multiple forms of an exam can be a great deterrent to cheating.  Different exam forms with the same questions in a different order, or with similar questions about the same material, are small changes that can promote an honest testing environment.

One topic of test administration that does not get enough attention is proctoring.  In a small classroom, there may be only one adult in a 20-40 student class.  For larger lectures containing 200-400 students, teaching assistants help faculty make sure students are taking their exams honestly.  How can proctors create an honest environment? 

  • They must proctor actively:  Many proctors distribute exams and then ignore the students to grade other assignments, work on their computers, look at their cell phones, or even leave the room.  After you pass out the exams, walk around, check for anything suspicious, and watch for students looking at other exams.  If you spot any of these behaviors, make an immediate change. 
  • Reminders About the Rules: Announcements reminding students to keep their eyes on their own papers can only help so much, so moving students may be necessary to correct behavior.  Another set of eyes and another presence in the room, even for a brief time, can also correct behavior. 
  • Instructor Collaboration: Faculty members who have test proctors should meet with them before the exam, explain the correct protocols, and describe past experiences or issues that have occurred during exams.  This five-minute discussion will help a test proctor handle a situation they have never faced and keep them actively involved during the exam session.

While cheating and plagiarism can frustrate many faculty members, giving your students a fair testing environment and objective assessments is the goal of all successful educators.



Cheating, Inc.: How Writing Papers for American College Students Has Become a Lucrative Profession Overseas


By Farah Stockman and Carlos Mureithi

Tuition was due. The rent was, too . So Mary Mbugua, a university student in Nyeri, Kenya, went out in search of a job. At first, she tried selling insurance policies, but that only paid on commission and she never sold one. Then she sat behind the reception desk at a hotel, but it ran into financial trouble.

Finally, a friend offered to help her break into “academic writing,” a lucrative industry in Kenya that involves doing school assignments online for college students in the United States, Britain and Australia. Ms. Mbugua felt conflicted.

“This is cheating,” she said. “But do you have a choice? We have to make money. We have to make a living.”

Since federal prosecutors charged a group of rich parents and coaches this year in a sprawling fraud and bribery scheme , the advantages that wealthy American students enjoy in college admissions have been scrutinized. Less attention has been paid to the tricks some well-off students use to skate by once they are enrolled.

Cheating in college is nothing new, but the internet now makes it possible on a global, industrial scale. Sleek websites — with names like Ace-MyHomework and EssayShark — have sprung up that allow people in developing countries to bid on and complete American homework assignments.

Although such businesses have existed for more than a decade, experts say demand has grown in recent years as the sites have become more sophisticated, with customer service hotlines and money-back guarantees. The result? Millions of essays ordered annually in a vast, worldwide industry that provides enough income for some writers to make it a full-time job.

The essay-for-hire industry has expanded significantly in developing countries with many English speakers , fast internet connections and more college graduates than jobs, especially Kenya, India and Ukraine. A Facebook group for academic writers in Kenya has over 50,000 members .

After a month of training, Ms. Mbugua began producing essays about everything from whether humans should colonize space (“it is not worth the struggle,” she wrote) to euthanasia (it amounts to taking “the place of God,” she wrote). During her best month, she earned $320, more money than she had ever made in her life. The New York Times is identifying Ms. Mbugua by only part of her name because she feared that the attention would prevent her from getting future work.

It is not clear how widely sites for paid-to-order essays, known as “contract cheating” in higher education circles, are used. A 2005 study of students in North America found that 7 percent of undergraduates admitted to turning in papers written by someone else, while 3 percent admitted to obtaining essays from essay mills. Cath Ellis, a leading researcher on the topic, said millions of essays are ordered online every year worldwide.


“It’s a huge problem,” said Tricia Bertram Gallant , director of the academic integrity office at the University of California, San Diego. “If we don’t do anything about it, we will turn every accredited university into a diploma mill.”

When such websites first emerged over a decade ago, they featured veiled references to tutoring and editing services, said Dr. Bertram Gallant, who also is a board member of the International Center for Academic Integrity, which has worked to highlight the danger of contract cheating . Now the sites are blatant.

“You can relax knowing that our reliable, expert writers will produce you a top quality and 100% plagiarism free essay that is written just for you, while you take care of the more interesting aspects of student life ,” reads the pitch from Academized, which charges about $15 a page for a college freshman’s essay due in two weeks and $42 a page for an essay due in three hours.

“No matter what kind of academic paper you need, it is simple and secure to hire an essay writer for a price you can afford,” promises EssayShark.com. “Save more time for yourself.”

In an email, EssayShark’s public relations department said the company did not consider its services to be cheating, and that it warned students the essays are for “research and reference purposes only” and are not to be passed off as a student’s own work.

“We do not condone, encourage or knowingly take part in plagiarism or any other acts of academic fraud,” it said.

A representative for UvoCorp, another of the companies, said its services were not meant to encourage cheating. “The idea behind our product design is to help people understand and conform to specific requirements they deal with, and our writers assist in approaching this task in a proper way,” the representative said in an email. “According to our policies, customers cannot further use any consultative materials they receive from us as their own.”

Representatives for Academized and Ace-MyHomework did not return emails and phone calls seeking comment.

A major scandal involving contract cheating in Australia caused university officials there to try to crack down on the practice. A similar effort to confront the industry has emerged in Britain, but not in the United States.

Contract cheating is illegal in 17 states, but punishment tends to be light and enforcement rare . Experts said that no federal law in the United States, or in Kenya, forbids the purchase or sale of academic papers, although questions remain about whether the industry complies with tax laws .

“Because American institutions haven’t been whacked over the head like Australian schools were, it’s easier to pretend that it’s not happening,” said Bill Loller, vice president of product management for Turnitin, a company that develops software to detect plagiarism . “But it’s absolutely happening .”

Mr. Loller said he had worked with some colleges that have students who have never shown up for class or completed a single assignment. “They’ve contracted it all out,” he said.

Contract cheating is harder to detect than plagiarism because ghostwritten essays will not be flagged when compared with a database of previously submitted essays; they are generally original works — simply written by the wrong person . But this year, Turnitin rolled out a new product called Authorship Investigate, which uses a host of clues — including sentence patterns and a document’s metadata — to attempt to determine if it was written by the student who turned it in.

Some of the websites operate like eBay, with buyers and sellers bidding on specific assignments. Others operate like Uber, pairing desperate students with available writers. Either way, the identities and locations of both the writers and the students are masked from view, as are the colleges the assignments are for .

Still, in some of the assignments that Ms. Mbugua provided to The Times, names of colleges that the essays were meant for became clear. One assignment asked students to write about a solution to a community problem, and the essay Ms. Mbugua provided described difficulties with parking around Arizona State University. “Students could always just buck up and take the walk,” the paper said.

Bret Hovell, a spokesman for Arizona State University, said the school was not able to determine whether the essay had been turned in.

In Kenya, a country with a per capita annual income of about $1,700, successful writers can earn as much as $2,000 a month, according to Roynorris Ndiritu, who said he has thrived while writing academic essays for others.

Roynorris Ndiritu , 28, who asked that only part of his name be used because he feared retribution from others in the industry in Kenya, graduated with a degree in civil engineering and still calls that his “passion.” But after years of applying unsuccessfully for jobs, he said, he began writing for others full time. He has earned enough to buy a car and a piece of land, he said, but it has left him jaded about the promises he heard when he was young about the opportunities that would come from studying hard in college.

“You can even get the highest level of education, and still, you might not get that job,” he said.

In interviews with people in Kenya who said they had worked in contract cheating, many said they did not view the practice as unethical.

As more foreign writers have joined the industry, some sites have begun to advertise their American ties, in a strange twist on globalization and outsourcing. One site lists “bringing jobs back to America ” as a key goal. American writers, who sometimes charge as much as $30 per page, say that they offer higher-quality service, without British spellings or idioms that might raise suspicion about an essay’s authorship.

Ms. Mbugua, the Kenyan university student, worked for as little as $4 a page. She said she began carrying a notebook, jotting down vocabulary words she encountered in movies and novels to make her essays more valuable.

Ms. Mbugua, 25, lost her mother to diabetes in 2001, when she was in the second grade. She vowed to excel in school so that she would one day be able to support her younger brother and sister.

A government loan and aunts and uncles helped her pay for college. But she also worked, landing in an office of 10 writers completing other people’s assignments , including those of American students. The boss stayed up all night, bidding for work on several sites, and then farmed it out in the morning .

“Any job that is difficult, they’re like, ‘Give it to Mary,’” she said.

There were low points. During summer break , work slowed to a trickle. Once, she agonized so much over an American history paper about how the Great Depression ended that she rejected the job at the last minute, and had to pay an $18 fine.

But Ms. Mbugua said she loved learning, and sometimes wished that she were the one enrolled in the American universities she was writing papers for. Once, when she was asked to write an admissions essay for a student in China who was applying to the Eli Broad College of Business at Michigan State University, she said she dreamed of what it would be like to go there herself.

Eventually, Ms. Mbugua said, she decided to strike out on her own, and bought an account from an established writer with UvoCorp. But UvoCorp forbids such transfers, and Ms. Mbugua said the account she had purchased was shut down.

Now Ms. Mbugua finds herself at a crossroads, unsure of what to do next. She graduated from her university in 2018 and has sent her résumé to dozens of employers. Lately she has been selling kitchen utensils.

Ms. Mbugua said she never felt right about the writing she did in the names of American students and others.

“I’ve always had somehow a guilty conscience,” she said.

“People say the education system in the U.S., U.K. and other countries is on a top notch,” she said. “I wouldn’t say those students are better than us,” she said, later adding, “We have studied. We have done the assignments.”


Essay cheating: How common is it?

  • Published 3 May 2018


A BBC investigation has found that prominent YouTube stars are encouraging students to buy essays.

Passing off a custom-made essay as your own is a form of plagiarism known as contract cheating.

It involves a student ordering an essay, usually through a website, for a fee.

But it could also be friends or family members writing an essay on a student's behalf.

Companies offering these sort of services are known as essay mills.

The fee will usually depend on the essay subject, length and deadline.


Some essay mills - including EduBirdie - claim that the essays they provide are "100% plagiarism free". But even if the essay you buy doesn't necessarily contain copied material, the act of submitting it as your own is itself a form of plagiarism - according to the Quality Assurance Agency (QAA), which monitors standards in UK higher education.

A student caught doing this could face serious penalties - including expulsion.

EduBirdie says that there is a disclaimer on its site suggesting that the work it provides should be used only as a sample or a reference.

What's the scale of contract cheating?

The QAA told Reality Check that it believes contract cheating is on the rise.

In its 2016 report, the agency said leaflets advertising essay services had been handed out on campuses. There were also reports of adverts appearing on university notice boards.

One UK essay writing company boasts that it has helped more than 25,000 students over the past 15 years.

But we don't know how many of those students who used the service went on to submit the essays as their own.

The QAA also referred Reality Check to a 2016 Times investigation. Based on Freedom of Information requests, the newspaper unearthed 50,000 cases of cheating in UK universities over the previous three years.

This works out at 17,000 per year, or 0.7% of students.


The problem with this number, which the QAA acknowledges, is that it includes all forms of cheating - not just contract cheating.

But even if we did know how many students were caught contract cheating, we still wouldn't know how many cases went undetected.

For that we have to rely on survey data, where students are asked if they have ever cheated by submitting an essay written by someone else.

The most recent UK study was carried out in 2012 and found that 29.5% of participants agreed that they had "submitted work taken wholly from an internet source (free or paid) as your own".

Elsewhere, the QAA cites a 2014 study from Saudi Arabia, which found that 22% of students reported having paid someone to write an essay.

Prof Phil Newton, from Swansea University, is an expert on contract cheating. He says that with surveys of this nature, there's always a likelihood that respondents may not give accurate answers - especially if they are owning up to deviant behaviour.

So, we have to treat survey data with a degree of scepticism.

Are these services legal?

At the moment there's nothing, legally speaking, to stop websites selling essays.

In fact many websites contain disclaimers that say students shouldn't pass off the essays as their own and that they should only be used as study aids.

That said, in March 2018, the Advertising Standards Authority (ASA) upheld a complaint about claims appearing on a UK essay mill website.

The ASA said the website gave the misleading impression that "consumers would be able to submit purchased essays as their own without repercussion".

But even if legislation was brought in, the QAA says it is unlikely to solve the problem.

It says many of these websites are offshore and even if they were closed down they can easily re-emerge.

So what can be done about it?

Contract cheating can be very difficult to spot. As the essays are bespoke they're unlikely to be picked up by software which some universities use to detect plagiarism.

Last October, guidance was issued to institutions on how to deal with contract cheating. It recommends that universities:

  • Have fewer assessments by essays
  • Block essay writing websites from IT systems
  • Get familiar with student writing styles and try to spot any changes
  • Have clear procedures to report suspected cheating
  • Support struggling students with their writing skills

Why do students do it?

There are many reasons - it could be as simple as laziness or a lack of confidence in writing ability.

In April, Prof Newton, along with colleagues, published a study into contract cheating in Australia.

The study focused on students who had asked either friends or family members to write essays on their behalf.

The researchers found that there were three factors which increased the likelihood of contract cheating:

  • Students who spoke a language other than English
  • A dissatisfaction with their learning environment
  • Where students perceived there were opportunities to cheat



Doing away with essays won’t necessarily stop students cheating


Honorary Fellow, The University of Melbourne

Disclosure statement

Julie Hare does not work for, consult, own shares in or receive funding from any company or organisation that would benefit from this article, and has disclosed no relevant affiliations beyond their academic appointment.

University of Melbourne provides funding as a founding partner of The Conversation AU.


It’s never been easier for university students to cheat. We need only look to the scandal in 2015 that revealed up to 1,000 students from 16 Australian universities had hired the Sydney-based MyMaster company to ghost-write their assignments and sit online tests.

It’s known as contract cheating – when a student pays a third party to undertake their assignments, which they then pass off as their own. Contract cheating isn’t new – the term was coined in 2006. But it’s becoming more commonplace because new technologies, such as the smartphone, are enablers.

Cheating is taken seriously by universities and the national regulator, the Tertiary Education Quality and Standards Agency. Much of the focus has been on changing assessment tasks to ones deemed to be harder for a third party to undertake. This is called “authentic assessment”.

This type of assessment has been widely adopted at universities. It comprises tasks that evaluate knowledge and skills by presenting students with real-world scenarios or problems relevant to the kinds of challenges they would face after graduation. But new research found authentic assessment may be as vulnerable to cheating as other, more obvious formats, such as essays.

What the research shows

This new study was conducted by academics from six universities, led by Tracey Bretag and Rowena Harper from the University of South Australia. The research – part of the federal government’s Contract Cheating and Assessment Design project – surveyed 14,086 students and 1,147 staff.

The goal of this research was to collect and understand students’ perceptions of the likelihood of cheating on 13 different assessment tasks. The research then asked teaching staff which of the 13 tasks they used.

The researchers have previously reported from this data set that 6% of students admitted to cheating. The purpose of the current round of analysis was not to understand the extent of cheating, but perceptions of how easily it might be done, and if that correlated with the tasks educators set.

They found that, for both students and teachers, assessments with a short turnaround time and a heavy weighting in the final mark were perceived as the tasks most likely to attract contract cheating.

Assessments perceived as the least likely to attract contract cheating were in-class tasks, personalised and unique tasks, vivas (oral explanations of a written task) and reflections on practical placements. But these tasks were the least likely to be set by educators, presumably because they’re resource and time intensive.

Contract cheating and assessment design

The research confirms the relationship between contract cheating and assessment design is a complex one. There were no assessment tasks for which students reported a 0% likelihood of contract cheating. Students who engage in contract cheating both see and look for opportunities to cheat regardless of the assessment task.

For universities, that means they must assume cheating is always possible and simply changing what assessments they use will not combat the problem.

Many experts have advocated the use of supervised exams to combat cheating. But this new research adds to a growing body of evidence that exams provide universities and accrediting bodies with a false sense of security. In fact, previous data has shown students reported engaging in undetected cheating on supervised exams at higher rates than other types of cheating.

Another common approach is to use a series of small, graded tasks, such as spontaneous in-class tests, sometimes called continuous assessment. Even here, students indicated these were the third most likely form of assessment to be outsourced.

Who’s most likely to cheat?

There has been much attention, particularly during the MyMaster scandal, on international students’ use of contract cheating. The new research suggests both international students and domestic students from non-English speaking backgrounds are more likely to engage in contract cheating than other students.

The research also found business and commerce degrees were more likely to be perceived as attracting contract cheating. Engineering was also particularly vulnerable to cheating.

Students from non-English speaking backgrounds hypothesised cheating would be most likely to occur in assessments that required research, analysis and thinking skills (essays), heavily weighted assignments and assessments with short turnaround times.

Perhaps unsurprisingly, students who indicated they were satisfied with the quality of teaching were less likely to think breaches of academic integrity were likely. In other words, this confirms previous research which showed students dissatisfied with their educational experience are more likely to cheat.

So what do we do about it?

This research provides yet more compelling evidence that changes to curriculum and teaching strategies, together with early intervention, must be employed to support students’ academic endeavours.

The researchers also point out that high levels of cheating risk undermining the reputation and quality of Australia’s A$34 billion international education export sector.

The data demonstrates that assessment tasks designed to develop relevant professional skills, which teachers are highly likely to set, were perceived by students as tasks that can easily be cheated on. These might include asking accounting students to prepare memorandums, reports or other communications for stakeholder groups, such as shareholders. In fact, among students from a non-English speaking background, the risk of cheating might actually increase for these tasks. This means authentic assessment may run an increasing risk of being outsourced.

This research shows the relationship between contract cheating and assessment design is not a simple product of cause and effect. In fact, the nature of the task itself may be less relevant to the prevalence of cheating than other factors, such as a student’s non-English speaking background, perceived opportunities to cheat, or satisfaction with the teaching and learning environment.

All educators must remain vigilant about cheating. Teachers must be properly resourced by their universities to ensure they can create rich learning environments which uphold the integrity of the higher education system.

Burdened with large debts and facing a precarious job market after graduation, it’s perhaps unsurprising some students, particularly those who are struggling academically, take a transactional approach to their education. This new research provides clearer evidence that contract cheating is a systemic problem requiring a sector-wide response.

Original article | Open access | Published: 14 November 2017

Detecting contract cheating in essay and report submissions: process, patterns, clues and conversations

Ann M. Rogerson

International Journal for Educational Integrity, volume 13, Article number 10 (2017)

Abstract

Detecting contract cheating in written submissions can be difficult beyond direct plagiarism detectable via technology. Successfully identifying potential cases of contract cheating in written work such as essays and reports is largely dependent on the experience of assessors and their knowledge of the student. It is further dependent on their familiarity with the patterns and clues evident in sections of body text and reference materials that reveal irregularities. Consequently, some knowledge of what the patterns and clues look like is required. This paper documents how to identify some of the patterns and clues observed in essay and report submissions. Effective assessment design with specific contextual requirements makes irregularities easier to detect and interpret. The irregularities identified were confirmed as instances of contract cheating through conversations held with postgraduate students. An essential element of the conversations was the evidence presented for discussion. Irregularities were noted on a pro-forma specifically developed for this purpose. Patterns identified include misrepresented bibliographic data, inappropriate references, irrelevant material and generalised text that did not address the assessment question or grading criteria. The validated patterns formed the basis of identifying potential instances of contract cheating in later submissions. Timely conversations with students before the end of semester are essential to determining whether the patterns and clues link to poor knowledge of academic writing conventions or should be classified as contract cheating, necessitating the application of appropriate penalties under institutional policies and procedures.

Introduction

Detecting situations where students have not fully authored their own written submissions is an ongoing challenge for educators and institutions. This includes detecting work that is the result of various forms of contract cheating. Contract cheating has extended beyond earlier definitions used to describe students outsourcing assessable work to external parties (Clarke & Lancaster, 2006) to include other behaviours such as sharing, trading, ghosting and impersonation (Bretag et al. 2017). While the range of identified contract cheating behaviours continues to expand, and our understanding of the cause and prevalence of the issue improves, the methods to detect their occurrence are still largely reliant on the person charged with the responsibility for grading the work (Bretag & Mahmud, 2009; Dawson & Sutherland-Smith, 2017; Lancaster & Clarke, 2007; Rogerson, 2014; Rogerson & McCarthy, 2017).

Educators grade assessable student work against rubrics, discipline criteria and task specifications. They are also required to determine if the students’ work is their own. Due to the continuing and evolving practices of contract cheating, there is a need for an evolutionary approach to enhance assessor evaluation skills beyond discipline related practices and academic writing conventions. What is also necessary is an approach that can streamline the methods of determining irregularities and documenting evidence for evaluation and discussion. Conversation and interpretive skills are also required to distinguish between plagiarised, repurposed, purchased, ghosted or traded work and students whose work is the result of a poor understanding of academic writing conventions (Rogerson & Bretag, 2015 ). Differentiating between purposeful misrepresentation of authorship and a genuine lack of academic writing expertise is reliant on the skills and experience of the assessor in addition to their ability to identify and interpret clues and patterns (Rogerson & Bassanta, 2016 ).

Identifying the patterns and clues in essays and report assessments can be a challenge in itself due to the random and erratic nature of encountering students trying to cheat the system. Studies such as Coughlin (2015) are aligned to post-completion investigations without the benefit of the student voice and where a grade or outcome is already recorded. Dawson and Sutherland-Smith (2017) reported that academics can identify some forms of contract cheating, but again their experiment was outside of the time pressures of providing grading and feedback within sessional requirements, and without the need to discuss irregularities with students. Other studies focus on how to classify the seriousness of incidents and apply consistent penalty decisions once issues such as plagiarism are identified (Carroll & Appleton, 2005; Yeo & Chien, 2007). Studies such as these improve our understanding of some issues related to contract cheating, yet do not capture or examine the cheating behaviours as and when they occur within a teaching session, nor do they include student insights.

In order to support academic integrity principles, detection of irregularities indicating potential contract cheating is ideally required at the time the student is taking a class. This means identifying, examining and evaluating submissions for indicators of contract cheating before releasing grades to students. Returning work with a grade and feedback indicates to the student that the work has passed the academic integrity test, and where a student has used contract cheating to pass, it encourages them to risk repeating, or even promoting, the behaviour. Once a grade is released to a student it is more difficult (but not impossible) to apply the penalties for academic misconduct and change a pass to a fail for a paper, subject or degree. Retrospective application of penalties leaves the institution open to appeals and public enquiries about standards and processes, all of which are additional burdens in terms of time, resources, and reputation. A more effective and efficient approach is to confront the issue through effective assessment design and communication, and to address potential contract cheating issues as and when they occur.

This paper takes up the challenge to provide a practical process to identify irregularities and to approach students for conversations that allow a determination of whether the submitted work is actually contract cheating or reflects a genuine poor understanding of academic writing practices. The examples discussed and presented here are the result of irregularities identified during the grading process of some postgraduate coursework submissions. The focus on postgraduates was the consequence of the author’s teaching allocations. The student insights and explanations are the result of conversations held to evaluate irregularities. Evidence of the irregularities identified during the grading process was noted on a template, which was subsequently used to document relevant insights resulting from conversations held with students about their submissions. Retrospective ethics approval was granted to examine the notes and evidence once the material was matched and de-identified, and all students had completed their course of study or had left the university.

Detecting if a student submission involves contract cheating – What do I look for?

Manual observation skills and academic judgement are required to assess written work in order to detect unoriginal submissions (Bretag & Mahmud, 2009 ). Detection of unoriginal materials in essays and reports through manual observation is reliant on the identification of irregularities or patterns of concern (Rogerson, 2014 ) as at this time technology can only detect some but not all cases of plagiarism and contract cheating (Dahl, 2007 ; Rogerson & McCarthy, 2017 ). There is also the issue that some instances of contract cheating may appear on the surface to be very similar to instances of poor academic practice (Dick et al., 2002 ). Consequently, a process approach is required to identify, document, and investigate irregularities using technological, interpretive, and conversational means. A practical process approach augments many of the methods already used by individuals grading assessment submissions but incorporates them in a more systematic way.

Figure  1 depicts a process that is a continuous cycle where the areas of preparation, examination and grading of submissions, and the evaluation stage feed into each other. The approach outlined in Fig.  1 was developed and refined by the author using an action research approach to address a cohort situation where a larger than normal number of irregular submissions were identified (see Rogerson, 2014 ). Action research in education seeks to improve teaching strategies as well as institutional practices (Kember & Gow, 1992 ) incorporating steps of planning, action, observation and evaluation of strategies tried out in practice (Lewin, 1946 ). This approach includes reflection as people learn from their own experiences (McTaggart, 1991 ).

Fig. 1 Process for assessment preparation, grading and evaluation

The method outlined in Fig. 1 was, and continues to be, successful in identifying, examining, evaluating and confirming cases of contract cheating, and in differentiating allegations of contract cheating from cases where there is a poor or underdeveloped understanding of academic writing conventions. Ongoing use of the cycle establishes a spiral of continuous improvement and refinement. The stages do not and cannot prevent students from cheating, but can discourage the practice while being successful in reducing the use of contract cheating behaviours. Using a combination of process and reflection on experience has resulted in contract cheating behaviours becoming more obvious and therefore easier to detect.

Preparation phase

The preparation phase is essential to set meaningful assessment tasks that deliver learning outcomes. It is a starting point but, as indicated in Fig. 1, it should draw on observations and on insights gained through previous sessions, student interactions, data analytics, training and development, reflection and feedback. This phase involves reviewing assessment tasks and grading criteria, ensuring that any refinements align with the curriculum as well as institutional policies and assessment strategies.

Review assessments, criteria and curriculum

Assessment and curriculum design can have an influence on contract cheating behaviours (Hrasky & Kronenberg, 2011). Other influences include the frequency, volume and scheduling of assessment tasks within the session (Bretag et al., 2017; Gijbels, van de Watering, & Dochy, 2005). When a series of assessment tasks are due on a similar date/time, students are required to be more diligent in their scheduling and time management. The self-scheduling skills necessary in higher education are not necessarily developed in different educational environments, such as in the transition from high school to university. A lack of preparation and planning may see students seeking short cuts, leading to the use of contract cheating practices. Tight scheduling, large classes, and other workload requirements also place pressure on individuals grading work, who have limited time to turn around student submissions. Consideration of some of these aspects when designing assessments and curriculum can benefit both the students and the academics.

Assessment task questions should be refreshed each session and cross-checked on the Internet, with removal requests issued (as per the DMCA protocol) where copies are found. This means Googling proposed assessment questions, in addition to checking for uploaded assessments on file-sharing sites such as www.coursehero.com or www.thinkswap.com. Where responses are found, it indicates that the question needs changing, reframing, or contextualisation. The inclusion of contextual factors (such as specific criteria and/or situations) makes it more difficult for sites selling assignments to address. Bretag et al. (2017) reported that the inclusion of contextual or individualised requirements and outlining specific instructions reduces the motivation for students to outsource assessable work which can be detected by Turnitin®. A reliance on or use of textbook questions, or the repeated use of a particular case study from session to session, has a greater potential for previously submitted assignments to be reused by students. It should also be noted that instructors’ guides with model answers are readily available for purchase or access on the Internet; therefore, using questions from set texts is more likely to lead to a student being tempted to cheat.
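
Part of this cross-checking can be automated against a local archive of previously used questions. The sketch below is illustrative only: the file names, archive format (one question per line) and the 0.8 similarity threshold are assumptions, and it uses Python's standard difflib rather than any particular institutional tool.

```python
# Illustrative sketch: flag proposed assessment questions that closely resemble
# questions used in previous sessions. File names, the plain-text archive format
# and the 0.8 similarity threshold are assumptions.
from difflib import SequenceMatcher
from pathlib import Path

SIMILARITY_THRESHOLD = 0.8  # assumed cut-off for "too similar to a previous question"

def load_questions(path: Path) -> list[str]:
    """Read one question per line from a plain-text archive file."""
    return [line.strip() for line in path.read_text(encoding="utf-8").splitlines() if line.strip()]

def flag_reused_questions(proposed: list[str], archived: list[str]) -> list[tuple[str, str, float]]:
    """Return (proposed, archived, ratio) triples whose similarity exceeds the threshold."""
    flagged = []
    for new_q in proposed:
        for old_q in archived:
            ratio = SequenceMatcher(None, new_q.lower(), old_q.lower()).ratio()
            if ratio >= SIMILARITY_THRESHOLD:
                flagged.append((new_q, old_q, ratio))
    return flagged

if __name__ == "__main__":
    proposed = load_questions(Path("proposed_questions.txt"))   # hypothetical file
    archived = load_questions(Path("previous_questions.txt"))   # hypothetical file
    for new_q, old_q, ratio in flag_reused_questions(proposed, archived):
        print(f"{ratio:.2f} match:\n  new: {new_q}\n  old: {old_q}")
```

A flagged question is simply a prompt to reframe or contextualise it; it says nothing about whether the question has been uploaded to external sites, which still requires manual searching.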

Embedding discussion in lectures and tutorials

As a further step in preparation, embedding discussion about criteria and assessment requirements in lectures and tutorials can contribute to limiting attempts to cheat (Bretag et al., 2017). Embedding skills for students sees observations from previous sessions used to establish preventative measures in current or future sessions (Kelley, Tong, & Choi, 2010). This approach establishes an authentic learning environment (Meyers & Nulty, 2009) and is considered best practice in developing student capabilities (McWilliams & Allan, 2014). When embedded learning elements are complemented by information about known cheating behaviours in lecture and tutorial based discussion, it can lead to a reduction in attempts to cheat (Dick et al., 2002), particularly when information about the severity of penalties is included (LaSalle, 2009).

Some skill development sessions can be embedded into lectures and tutorials, including how to identify and cite quality reference sources, in addition to exercises on academic writing conventions such as structuring arguments and paraphrasing. Academic skill development is shown to be effective as a deterrent to contract cheating behaviours when embedded in course material (Divan, Bowman, & Seabourne, 2015; Jones & Maxwell, 2015), but is also dependent on students engaging with classes either online or through actual attendance. Class discussion about assessment requirements should also include highlighting when and what type of collaboration is and is not permitted within the class and/or assessment task (Seals, Hammons, & Mamiseishvili, 2014). Clarifying complex terms such as collusion counteracts another form of academic misconduct covered in more recent institutional policies, as some terms are not necessarily understood by students (Gullifer & Tyson, 2014). Annotated exemplars accompanied by explanatory dialogue assist students in understanding the relationship between submitted work, grading criteria and descriptors (Bell, Mladenovic, & Price, 2013). Providing students with exemplars discourages them from searching the Internet, or posting questions on private Facebook® groups, in attempts to see what an assessment response looks like.

It is also beneficial to discuss how to use originality checking software such as Turnitin®. Spending a short time discussing what Turnitin® originality reports show has reduced instances of direct cut and paste plagiarism (Buckley & Cowap, 2013 ; McCarthy & Rogerson, 2009 ). It can also lead to some interesting questions about “free” plagiarism checking software promoted on the Internet. For example: the Viper program is promoted as a free plagiarism checker. However, the software is accessed from sites known to sell assessments such as UK Essays ( https://www.ukessays.com/plagiarism-scanner/download-viper.php ). UK Essays retains a copy of any assignment checked by Viper and after a period (currently three months) publishes the essay in their free resources section.

“When you scan a document, you agree that 3 months after completion of your scan, we will automatically upload your essay to our student essays database which will appear on one of our network of websites so that other students may use it to help them write their own essays.” Source: https://www.ukessays.com/plagiarism-scanner/terms-and-conditions.php#storage

Unless students read the fine print closely under the terms and conditions, they will actually contribute to contract cheating resources. Class discussions tied to assessment tasks also provide an opportunity to highlight to students that free “resources” such as those promoted by UK Essays ( https://www.ukessays.com/resources/ ) are not reliable or credible reference sources, and that institutions know this type of site exists.

Embedding academic literacy activities and initiating short discussions about contract cheating in lectures and tutorials has an additional benefit. Openly discussing the issue reduces the excuses students can proffer in the evaluation phase about the lack of originality of their work as assessment criteria, requirements and academic integrity principles are made explicit.

Examination and grading phase

In the examination and grading phase, it is important to note any observed irregularities. Making notes while grading provides a basis for comparing observations within a cohort. To facilitate this, a template can form the basis of note taking, which also provides a basis for evidence should it be required for a future conversation with a student to examine irregularities. Based on previous experiences in identifying unoriginal work where a detailed examination of a range of irregularities within a particular student cohort was required (Rogerson, 2014 ), a template process was trialled and implemented.

Due to the large number of irregularities identified when grading an assessment task in one session, the author created a template to document irregularities as they were observed. An example of the template created is provided as Additional file 1 (page 1) and Additional file 2 (page 2). Common irregularities are listed with yes/no responses for ease of circling (Turnitin® matches, differences in English expression, referencing and citation issues), with an “Other” category to capture any other issues of note. Areas on the form include space for listing examples of the irregularities observed (for example: percentage of match, page numbers, and citation details). The template facilitates note taking about any irregularities identified, which is useful in the evaluation phase. The template only takes a few minutes to complete, including noting the student details, circling irregularities and making brief notes about examples. It is only used in situations where irregularities are observed. The next sections outline some of the irregularities observed during the grading process, working through the areas in an ordered way.
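
The pro-forma described here is a paper form, but its categories translate readily into a small digital record that can later be collated across a cohort. The sketch below is a minimal, hypothetical rendering: the field names mirror the categories named in the text (Turnitin® matches, differences in English expression, referencing and citation issues, “Other”, examples), while everything else is an assumption rather than a description of the author’s actual template.

```python
# Minimal sketch of a digital irregularity pro-forma. Field names mirror the
# categories mentioned in the text; the structure itself is hypothetical.
from dataclasses import dataclass, field

@dataclass
class IrregularityRecord:
    student_id: str
    assessment: str
    turnitin_match: bool = False          # unusual Turnitin® similarity result?
    expression_shift: bool = False        # differences in English expression
    referencing_issues: bool = False      # referencing and citation irregularities
    other: str = ""                       # any other issue of note
    examples: list[str] = field(default_factory=list)  # page numbers, % matches, citations

    def has_concerns(self) -> bool:
        return self.turnitin_match or self.expression_shift or self.referencing_issues or bool(self.other)

# Example usage with hypothetical values:
record = IrregularityRecord(
    student_id="s1234567",
    assessment="Essay 2",
    referencing_issues=True,
    examples=["p. 4: citation with no matching reference entry"],
)
print(record.has_concerns())  # True
```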

Identifying concerns using technology

The promise of technological means of detecting unoriginal written submissions is partially effective in cases where text is directly taken in whole or in part from publicly accessible Internet sources, reused by a student, or shared between subject instances. However, as Ellis ( 2012 ) highlights, the widespread use and adoption of “digital detection tools” can establish an over-reliance on them as the sole means of detecting cheating at the expense of trusting personal judgement (Ellis, 2012 , p.50). Consequently, some knowledge of the practices of students, in addition to the more subtle means of detecting irregularities using text-matching technology can be useful to identify instances of potential contract cheating.

Turnitin® similarity reports and originality percentages

Turnitin® is one company providing a suite of online educative and evaluation tools ( www.turnitin.com ), including an area that checks the originality of work submitted to the system. Materials such as written assessments and presentations uploaded to Turnitin® are checked against a database of assignments lodged in previous sessions, in addition to other published and Internet-based works. Turnitin® generates a similarity percentage score to indicate the amount of material in the submission matched to other sources. An accompanying report highlights where the matches are in the submission, gives the percentage of each individual match, and indicates the source of the match. The reports are an indicator but require interpretation, as there may be false positives (Baggaley & Spencer, 2005) and some types of contract cheating may be missed (Lines, 2016; Rogerson, 2014; Rogerson & McCarthy, 2017).

A common question asked by students in discussions about the use of Turnitin® is “what is a good score to aim for?” The most common misconception about Turnitin® is that a zero similarity percentage score (0%) is good, inferring that no plagiarism is identified. The reality is an overall similarity score of 0 %, or an unusually low score is a cause for concern and an indicator that some irregularity is evident (Lines, 2016 ; Rogerson, 2014 ). For example: A good quality reference list or bibliography using academic journals will match to the original sources and result in an overall similarity index somewhere in the range of 20%–30%, or even higher depending on the number of citations/references included and the ratio between word and citation counts. A zero score can indicate issues such as falsified reference material, use of paraphrasing tools, or inappropriate use of embedded files. Other situations with zero or low scores may be cases where .jpg or .png images of texts or reference lists have been included in a document. Text matching algorithms cannot currently detect these embedded file types. It is a deliberate form of deception to hide direct copies of materials. In order to identify this type of deception, documents uploaded to Turnitin® need to be downloaded and reviewed. Where submissions are lodged as portable document format (PDF) documents, it is necessary to unlock the PDF to open the original text for review of properties (to see who actually authored the document) and body text elements. Documents without authoring information may be the result of a purchased assignment where all tracking information has been removed. Reviews of this nature have also revealed unacknowledged and inappropriate embedded items such as paragraphs, pages and/or bibliographic entries.
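
The heuristic described above — a 0% or unusually low overall score warrants a closer look, while a well-referenced submission typically sits somewhere in the 20%–30% range once the reference list is included — can be expressed as a simple triage rule. The sketch below is illustrative only: the 5% and 60% cut-offs are assumptions, not fixed rules, and the function operates on a number you supply rather than connecting to any Turnitin® service.

```python
# Illustrative triage heuristic only. The 5% lower bound and 60% upper bound are
# assumptions; the text notes that reference lists alone usually contribute
# roughly 20-30% (or more) once included in the calculation.
def similarity_flag(overall_similarity: float, references_included: bool = True) -> str:
    """Classify an overall similarity percentage for manual review."""
    if overall_similarity == 0:
        return ("review: zero similarity is itself an irregularity "
                "(possible embedded images, paraphrasing tools, falsified references)")
    if references_included and overall_similarity < 5:
        return "review: unusually low score given references are included in the calculation"
    if overall_similarity > 60:
        return "review: very high score - check for large verbatim matches"
    return "no flag from the score alone - still requires academic judgement"

print(similarity_flag(0.0))
print(similarity_flag(24.0))
```

As the last branch makes explicit, the score is only ever a prompt for interpretation of the full report, never a verdict.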

Discussions with institutional colleagues have indicated they prefer to exclude references/bibliography checks from the calculations so that students do not see higher scores resulting from reference material matches. However, experience has demonstrated that it is better to have the references/bibliography included: firstly, to highlight to the student where good quality references have been used, and secondly, to identify where students have either shared, or used, a reference list previously submitted by another student in an earlier session. Switching off the matching function may reduce the initial score, but would mean that shared or copied reference lists are not identified. An example of this was found in a later session where two students had matching reference lists, but no Turnitin® matches in the body text of the assessment task. This is highly unusual. Discussions were held with both students, where it was revealed that one student had expressed their personal difficulty in understanding faculty referencing requirements. The other student had offered to assist and requested an entire copy of the assignment to enter the reference data. Comparing the two assignments side by side, the placement of the in-text citations was identical, yet there were no matches in the body text of the originality reports. The student who had “helped” with the referencing had actually taken the assignment and changed all of the sentences (beyond synonym replacement), left the in-text citations in the same places and submitted the work as their own. If the bibliographic measure in Turnitin® had been set to ignore in the default settings, this incident of contract cheating by sharing (Bretag et al., 2017) would never have been detected, except through memorising student reference lists when grading.
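
Because that case was only caught by recognising a repeated reference list, a simple automated comparison of reference lists across a cohort could act as a safety net. The sketch below is illustrative only: the input format (one reference string per line, per student) and the 0.9 overlap threshold are assumptions, and heavy overlap is a prompt to compare citation placement, not evidence of misconduct in itself.

```python
# Illustrative sketch: flag pairs of submissions with heavily overlapping
# reference lists. Input format and the 0.9 Jaccard threshold are assumptions.
from itertools import combinations

def jaccard(a: set[str], b: set[str]) -> float:
    return len(a & b) / len(a | b) if (a or b) else 0.0

def flag_shared_reference_lists(ref_lists: dict[str, list[str]], threshold: float = 0.9):
    """ref_lists maps a student identifier to that student's list of reference strings."""
    normalised = {sid: {r.strip().lower() for r in refs if r.strip()} for sid, refs in ref_lists.items()}
    for (s1, refs1), (s2, refs2) in combinations(normalised.items(), 2):
        overlap = jaccard(refs1, refs2)
        if overlap >= threshold:
            yield s1, s2, overlap

# Hypothetical usage:
cohort = {
    "student_a": ["Smith, J. (2015) Service quality.", "Lee, K. (2012) Consumer trust."],
    "student_b": ["Smith, J. (2015) Service quality.", "Lee, K. (2012) Consumer trust."],
}
for s1, s2, overlap in flag_shared_reference_lists(cohort):
    print(f"{s1} and {s2} share {overlap:.0%} of their reference lists - compare citation placement")
```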

To assist students in further evaluating their submissions and originality reports, the visual trigger of “rainbows” in the reference list has been used with great success. A reference list linking to academic journals will show a range of colours matching to sources, as shown in Fig. 2. Rainbows in the reference list are promoted as a positive indicator to help students understand what their similarity reports are showing (Rogerson, 2016). This visual clue, in addition to the ability to view reports with and without bibliographic materials in the originality calculation, has provided students with assurance of what is and is not appropriate and taken away the pressure to achieve a ‘zero’ per cent similarity score. Students are shown in class time how to use the filter functions available in the originality report settings so that they can see the originality percentage with, and without, bibliographic input. Providing students with clear referencing criteria (minimum-maximum number, type of references, select sources) gives clear guidelines and expected percentages that can be discussed in class, and highlighting how measures can be switched off and on allows for student self-feedback prior to due dates.

Fig. 2 Turnitin® Originality Report reference list rainbows

If there are no reference matches in a Turnitin® similarity report, some other form of plagiarism detection avoidance may have been used. This can include embedded reference lists as images (such as .png or .jpeg file extensions as discussed earlier) or using other cheat approaches where spacing, punctuation additions, hidden or recoloured characters are used to try to circumvent similarity checking algorithms. Turnitin® originality reports can reveal just as much by what they do not show, as what they do show. It comes down to knowledge and building experience in interpreting originality reports.
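
Some of the avoidance tricks mentioned here — zero-width or hidden characters and similar insertions intended to defeat text matching — leave traces that can be scanned for directly in the extracted text. The sketch below is illustrative; the character set checked is a partial assumption, and a hit is a prompt for manual inspection rather than proof of misconduct.

```python
# Illustrative sketch: count characters commonly used to defeat text matching
# (zero-width spaces/joiners, soft hyphens). The list is partial and assumed.
import unicodedata

SUSPICIOUS = {
    "\u200b": "zero-width space",
    "\u200c": "zero-width non-joiner",
    "\u200d": "zero-width joiner",
    "\u00ad": "soft hyphen",
    "\ufeff": "zero-width no-break space",
}

def scan_for_hidden_characters(text: str) -> dict[str, int]:
    counts: dict[str, int] = {}
    for ch in text:
        if ch in SUSPICIOUS:
            counts[SUSPICIOUS[ch]] = counts.get(SUSPICIOUS[ch], 0) + 1
        elif unicodedata.category(ch) == "Cf":  # other invisible "format" characters
            name = unicodedata.name(ch, f"U+{ord(ch):04X}")
            counts[name] = counts.get(name, 0) + 1
    return counts

sample = "The marketing mix com\u200bprises four ele\u200bments."
print(scan_for_hidden_characters(sample))  # {'zero-width space': 2}
```

Recoloured or white-on-white characters cannot be seen in plain extracted text, so a scan of this kind complements, rather than replaces, opening the submitted file itself.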

Other referencing and citation irregularities

After reviewing any digital detection tool outputs, the next step is to check for other referencing and citation irregularities. The quality, range, accuracy, relevance and presentation of referencing and citation data provide a good indicator of what can be expected in the body content of the assessment task. For example, a well-presented reference list with matching in-text citations is representative of attention to detail, whereas a poorer presentation may be indicative of a more casual or ill-informed approach to acknowledgement of sources. Table 1 outlines some of the irregularities observed in referencing and citation data.

Lines’ (2016) review of ghost-writing services also highlighted how inadequate referencing could be noted in purchased sources, with references either irrelevant to the question, inappropriate, inadequate or insufficient. A visual scan of the reference list can determine whether disciplinary and subject related references are listed and used.
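
A first pass over some of these irregularities — in-text citations with no matching reference entry, or listed references that are never cited — can be automated before manual review. The sketch below is a rough illustration for author-date (Harvard/APA-style) citations only; the regular expression is a deliberate simplification and will miss many legitimate citation styles, so its output is a prompt to look, not a finding.

```python
# Illustrative sketch: match author-date citations such as "(Smith, 2015)" or
# "Jones (2012)" against surname/year pairs found in the reference list.
# The regex is deliberately simple and is an assumption, not a standard.
import re

CITATION_RE = re.compile(r"([A-Z][a-z'-]+)[^\d]{0,15}?(\d{4})")

def extract_citations(body_text: str) -> set[tuple[str, str]]:
    return {(m.group(1).lower(), m.group(2)) for m in CITATION_RE.finditer(body_text)}

def extract_reference_keys(reference_list: list[str]) -> set[tuple[str, str]]:
    keys = set()
    for entry in reference_list:
        m = CITATION_RE.search(entry)
        if m:
            keys.add((m.group(1).lower(), m.group(2)))
    return keys

def citation_mismatches(body_text: str, reference_list: list[str]) -> dict[str, set]:
    cited = extract_citations(body_text)
    listed = extract_reference_keys(reference_list)
    return {"cited_but_not_listed": cited - listed, "listed_but_never_cited": listed - cited}

# Hypothetical usage: Jones (2012) is cited but missing from the reference list.
body = "Service quality drives loyalty (Smith, 2015), as Jones (2012) also argues."
refs = ["Smith J (2015) Service quality. J Mark 1(1):1-10"]
print(citation_mismatches(body, refs))
```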

Reviewing language usage and consistency

After reviewing referencing and citation for irregularities, the body text can be examined from both a presentation and content perspective. The Centre for Academic Development at Victoria University in Wellington (New Zealand-Aotearoa) provides a summary list of some formatting, style and content questions that can be considered while examining body text (Centre for Academic Development, n.d .). This includes shifts in font size, gaps within the document, unusual sentence and paragraph fragmentation, and changes in grammar, tense and spelling. While their list is labelled as a Plagiarism Detection List , many of the questions posited are reported to be just as relevant to identifying contract cheating behaviours (Doró, 2016 ).

Descriptors such as language shifts are useful markers but they can indicate a number of other issues. Other issues may be related to poor skills in a second language (Abasi, Akbari, & Graves, 2006 ), use of Internet based translation (Somers, Gaspari, & Niño, 2006 ) or back translation (Jones & Sheridan, 2015 ), use of paraphrasing tools or article spinners (Rogerson & McCarthy, 2017 ), or a lack of understanding of discipline academic writing standards. It can also indicate the partially or fully borrowed, stolen or repurposed work of others that are forms of contract cheating.

Answering the question and addressing criteria

The content of assessment tasks is another area for identifying irregularities. This includes determining whether the student addressed the actual assessment question. An earlier examination of a number of submissions found that purchased assessments were bland, lacked relevance to the topic area or theory, and failed to meet requirements for the use of examples (Rogerson, 2014). Further irregularities identified include the misuse of terminology, incorrect definitions, misattribution of theoretical concepts, or a lack of citations within areas clearly referring to ideas or phenomena. These observations are supported by a recent pilot study in which contract cheating was correctly identified in cases where submissions “did not address key questions”, or had “poor conceptualisation” (Dawson & Sutherland-Smith, 2017, p.4). Providing students with clear criteria requirements as part of the question can assist in identifying situations where a student may not have written their own submission. Current evidence suggests that many contract cheating sites are not adept at addressing specific details in essay and report writing tasks.

Evaluation phase

The evaluation phase comprises two distinct steps. The first is comparing irregularities across the cohort to identify any that align with expected patterns and to classify issues beyond the students’ control. The second is initiating conversations with students to explore irregularities, differentiate between contract cheating behaviours (which are forms of academic misconduct) and underdeveloped writing skills, and determine appropriate next steps.

Comparing irregularities across the cohort

Once observations are documented during the examination phase (by using a pro-forma such as those provided in the appendices, or similar), notes should be compared within the class/across the cohort. Comparing observations will highlight whether some or any of the inconsistencies noted relate to a particular group of students (for example, students new to the institution, differences in disciplinary backgrounds).
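
If irregularity notes are captured digitally (as in the hypothetical pro-forma sketch earlier), the cohort comparison reduces to a simple tally by category, which makes repeated patterns stand out from one-off events. The sketch below is illustrative and assumes the IrregularityRecord structure shown in that earlier sketch.

```python
# Illustrative sketch: tally irregularity categories across a cohort so that
# repeated patterns stand out. Assumes the hypothetical IrregularityRecord
# structure from the earlier pro-forma sketch.
from collections import Counter

def cohort_summary(records) -> Counter:
    """Count how many flagged records fall into each irregularity category."""
    counts: Counter = Counter()
    for rec in records:
        if rec.turnitin_match:
            counts["turnitin_match"] += 1
        if rec.expression_shift:
            counts["expression_shift"] += 1
        if rec.referencing_issues:
            counts["referencing_issues"] += 1
        if rec.other:
            counts["other"] += 1
    return counts

# A category appearing once may be a one-off; the same category appearing across
# many submissions suggests a broader issue with the task, the cohort, or
# word-of-mouth spread of a cheating approach.
```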

Comparing cohort or class data can also identify if there is a pattern to the irregularities that may not necessarily be the result of the quality of the students’ work, or their behaviours. It may even be the result of our own actions in the way we have set an assessment task, clarified (or not specified) criteria and expectations, the result of someone new taking on a subject, or influenced by other factors such as technology glitches and/or corruptions. One off irregularities can appear to be a random event leading to students being given the benefit of the doubt and likely to be considered as the student attempting (but not yet realising) their own writing voice. Alternatively, a repeated pattern across a number of submissions may indicate an undercurrent of a broader institutional issue of admission standards and/or more sinister cheating behaviours spread by word-of-mouth.

Conversations about irregularities or concerns provide an opportunity to evaluate and calibrate observations to ensure that all students are provided a fair and equitable opportunity to complete the assessment task, while reducing the need for appeals on other extraordinary grounds which are beyond the students’ control.

Consider prior student performance

When determining whether irregularities are part of a pattern of performance or an unusual occurrence, a quick view of the student’s prior performance can provide further insight. Past performance in academic study has been reported as being a good indicator of ongoing and future performance (Ayán & García, 2008; House, Hurst, & Keely, 1996). Where electronic grading and assessment are used (or electronic copies of earlier assessments are held in repositories such as Turnitin®), later assessments can be compared against earlier ones (for example, comparing results between assessments 1 and 2). Bretag et al. (2017) identified that knowledge of a student, and the relationship that a teacher has with that student, is one of the key methods of identifying potentially unoriginal work.

Observations, such as language shifts either within a submission or between submissions, can highlight discrepancies in the way language is used, inconsistency in the writing style, and extreme shifts from a very poor to a very high level of expression. Reviewing a student’s prior performance may actually result in some irregularities being reclassified as consistent with the student’s performance, or provide further evidence that what is being observed is in fact irregular. For example, a student enrolled in a degree where a large number of calculations are required may perform well in any mathematical or statistical course but struggle when it comes to writing an essay or report. Any clues or patterns noted at this stage can assist in exploring observations with students.
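
Where earlier submissions are available electronically, a crude comparison of surface writing features between two pieces of work can help decide whether an apparent language shift is worth raising in conversation. The sketch below is illustrative only: the two features (average sentence length and type-token ratio) and the 50% flagging rule are assumptions, not validated stylometric measures.

```python
# Illustrative sketch: compare two submissions on crude surface features.
# Large differences are only a prompt for a conversation with the student;
# the features and the 50% threshold are assumptions.
import re

def style_profile(text: str) -> dict[str, float]:
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text.lower())
    return {
        "avg_sentence_length": len(words) / max(len(sentences), 1),
        "type_token_ratio": len(set(words)) / max(len(words), 1),
    }

def language_shift(earlier: str, later: str, rel_change: float = 0.5) -> list[str]:
    """Flag features that changed by more than rel_change (50%) between submissions."""
    p1, p2 = style_profile(earlier), style_profile(later)
    flags = []
    for feature in p1:
        if p1[feature] and abs(p2[feature] - p1[feature]) / p1[feature] > rel_change:
            flags.append(f"{feature}: {p1[feature]:.2f} -> {p2[feature]:.2f}")
    return flags

# Hypothetical usage with two short excerpts from the same student:
assignment_1 = "The market is big. People buy things. It is good for firms."
assignment_2 = ("Contemporary consumer behaviour, shaped by heterogeneous digital "
                "touchpoints, necessitates a fundamentally reconceptualised segmentation strategy.")
print(language_shift(assignment_1, assignment_2))  # flags the jump in sentence length
```

As the surrounding text stresses, such a shift may equally reflect legitimate improvement or editing help, which is exactly why the next step is a conversation rather than an allegation.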

Exploring irregularities through conversations with students

After noting and recording observations, and examples of where any irregularity is identified, the only way of exploring the issue is to interact with the student to gain further insight. Discussing concerns with students about the quality and originality of their writing does not mean accusing them immediately of plagiarism, contract cheating, or copying from other students. It means stating that there are some concerns and irregularities observed in their submission and presenting the evidence that causes an assessor to have those concerns (Rogerson & Bretag, 2015 ). Explanations can then be sought from the student as to why the observations appear the way they do, ensuring they have the opportunity to express their point of view, and bring to light other contextual factors which are not apparent when reviewing a written submission. Conversations of this nature enlighten us and provide opportunities to explore and evaluate learning practice and differentiate between contract cheating and poor academic writing skills.

Questions about lecture and tutorial content

When presenting evidence of irregularities for discussion, questions can also be asked about relevant content, an approach which Dick et al. (2002, p.180) describe as a “verification process”. This provides a further opportunity for the student to demonstrate what they have (or have not) learned. This can be a useful approach for international students, who can sometimes explain what they have learned verbally but find it more difficult to explain in writing (Sowden, 2005). An ESL speaker, even one who has achieved a level 7 IELTS (which is required for some medical services workers in Australia), will still have some elements in their speech or writing that are grammatically incorrect, particularly in terms of plurals, conjunctions and expression. However, some of these issues can be corrected in written work by using spelling and grammar checks or external (or friend/family member) editing, as was the case with one student. This is where initiating conversations with students is important. In this situation, a student was married to an Australian-born English as a first language speaker. After doing poorly in their first assignment, the student used the feedback to ensure they addressed all criteria and then asked their spouse to edit their second assignment submission. A few questions about the assessment and topic matter clearly demonstrated that the student had an excellent understanding of the material, could articulate how they had improved between the two assignments, and that the explanation for the language shift was the result of editing assistance. This case is more closely aligned with sharing behaviours (Bretag et al., 2017) than contract cheating. The insights gained through this conversation could not have been gleaned from the submitted content alone.

Denials of wrongdoing

While students confronted with suspicions of cheating are likely to proffer a vehement denial as to purchasing or reusing the work of others, in many of these interviews students seemed somewhat relieved to admit to what they had done. An honest admission in an open evaluative conversation provided an opportunity to drill down to why the student had done what they had, discuss the implications for dishonest academic behaviour, and plan support to (hopefully) prevent a recurrence. Personalised feedback delivered face-to-face is highly valued by students, but students vary in what type of feedback they are seeking with some wanting only praise, while others seek guidance on how to improve (Evans & Waring, 2011 ).

Referencing irregularities

Where referencing or citation irregularities are identified, some students were asked to demonstrate how they found their references. This is one way of testing whether or not students are comfortable searching for legitimate reference materials, and can also assist in identifying contract cheating sites such as the bibliographic mashups discussed previously under the referencing heading. Some small allowance is necessary in meetings for students who may feel nervous about searching in front of an assessor. However, when students cannot locate the reference in question, or have difficulty retrieving any of the references, it becomes clear that the reference may not exist or that the student has resorted to copying, borrowing, or purchasing materials from elsewhere.

Closing the loop: Using outcomes to inform and improve practice

Addressing observations from previous sessions and incorporating them into subsequent sessions is effective in modifying curriculum in an incremental way rather than mandating major modifications (Kelley et al., 2010). Insights drawn from discussions with students can highlight common misconceptions or points of confusion. Clarifying those points in future classes, in addition to openly discussing how contract cheating can be identified ensures that we are using feedback to inform and improve our practices. This approach also develops our own experience in detecting and limiting contract cheating behaviours.

Implications for practice: Working with staff

Speaking directly with a student about irregularities provides insight into the submission in addition to the student’s understanding of what they have learnt. It can be both easier and more convenient for time-pressed educators to comment on a need for writing improvement, providing feedback and recommending support materials for students to consider when they next prepare a piece of writing. The alternative approach, discussing irregularities in a face-to-face consultation, is time consuming and a potentially confronting interaction, particularly when an academic may suspect misconduct but not necessarily know how to identify the evidence within a submission and present it to the student for explanation. Conversations with students can be very positive experiences and mean the difference between a student’s future success, and whether they are motivated and committed enough to continue with their studies. Conversations also provide direct feedback on our own practices that can inform future subject instances.

Where electronic means of detection are employed a level of academic judgement is still required due to expected matches such as topic areas; restating an essay question; use of specific materials, references, materials, formulae, design, theory; and/or disciplinary specific terms such as treatment methods, chemical compounds, jurisdiction or curricula. Academic judgement is also required as text matching algorithms rely on word and character matches, as at this time, semantic meaning cannot be matched through electronic assessment checking (Rogerson & McCarthy, 2017 ). Identifying patterns of irregularities and clarifying interpretations through conversations develops and refines individual judgement.

A further and necessary element of this type of academic skill development is sharing discoveries and learnings with both colleagues and students. This includes the dissemination of patterns and clues that can alert assessors to work submitted because of contract cheating behaviours and the methods employed by students seeking to avoid writing their own work. Institutions should seek to implement methods to discourage situations where potential cases of contract cheating are ignored or bypassed due to unclear procedures or where it is considered too much work (Doró, 2016 ). The design of assessment tasks can make some of the practices and observations outlined in this paper more obvious, particularly where specific criteria and contextual factors are stipulated.

Implications for practice: Working with students

Open discussion is required. This includes discussing what we, as assessors, know about cheating, fraud, plagiarism and misappropriation practices while promoting academic integrity. Conversations with students to determine the reason behind concerns identified in assignments can lead to a greater understanding of student approaches and to valuable discussions about academic integrity and its links to personal integrity. Including elements of academic integrity when discussing assessment tasks alerts those interested in cheating that, collectively, the institution and the sector are aware of the issues. Open dialogue reassures the diligent student that academic integrity is taken seriously. An upfront approach also ensures that students cannot claim ‘naivety’ as an excuse for noncompliance with academic integrity requirements and the consequences of inappropriate actions when the issues have been openly discussed in class. Short discussions in all classes can clarify any expected matches relevant to the assessment task or subject matter. Repeating this in multiple classes reinforces the message and does not leave the responsibility in the hands of a few. Students weighing up the risk are less likely to attempt an action if responses are consistent between classes and across the institution, and where teachers follow through on suspect actions (East, 2009). Discussions in class about irregularities identified in previous sessions, and the penalties imposed on previous infractions, provide a practical and real application of academic policies, and allow students to ask questions about them to gain a deeper understanding of the principles and application of academic integrity.

Where irregularities are the result of students employing less legitimate approaches to assessment tasks where work has actually been purchased, traded, borrowed or taken, the lack of action in following through about what is observed can actually reinforce the behaviour. For students who are attempting to cheat their way through their education, a lack of detection or action gives the student the confidence to repeat the action and share their success with others leaving institutions open to the possibility of other students adopting the practice. The greater level of action and follow through in the grading and assessment process discourages those considering whether to ‘try it out’ as others appear to be getting away with it. Unless conversations are initiated with students where irregularities are determined to be the results of a lack of skills or confidence in academic writing, they have no cause to understand that their approach to using the writing of others in whole or in part is inappropriate while contravening academic integrity policies. Ignoring contract cheating behaviours only amplifies the range of issues that teachers, assessors and institutions have to manage.

A further approach taken to address these issues in skill development is supporting students learning to use a citation manager such as EndNote®. Taking the pressure off collecting, collating and formatting reference material has improved the quality of submissions, required less time in giving feedback about incorrect referencing formats, and reduced the need for students to copy and paste or borrow references. Consideration can be given to using a reference library developed by a student as part of a portfolio of evidence of achievement in addition to encouraging students to build links between sources across a degree curriculum instead of confining a reference source to a particular subject or course.

Conclusions and recommendations for future research

Irregularities in submissions are not a reason to accuse a student of academic misconduct and/or to imply that they participate in contract cheating behaviours. Irregularities are clues or flags about what is outside of the disciplinary norm. In addition, noting irregularities is a way of identifying evidence for evaluation and, potentially, discussion with a student to gain an understanding of why the patterns and/or discrepancies appear in the assessment submission. Issues beyond the control of the student may be involved, including accepting the student into a course of study that may be beyond the level of their individual capabilities or language skills. If the student is accepted into a course beyond their level of capability, the institution should provide adequate and appropriate support to develop the skills necessary for the student to succeed. Where patterns of submission indicate the possibility of broader issues, institutions should ensure that poor assessment design and/or repetition of questions is not a contributing factor.

Further studies into detection and detection behaviours are required to build a comprehensive approach to addressing the issue of contract cheating including approaches used to discuss actual cases with students. Implementing this type of study would also be beneficial in determining the influence of conversations on future revisions to assessment tasks and the prevention and detection of contract cheating in written submissions.

Moving beyond identification to using the findings to inform students of examples of inappropriate scholarship while improving student academic practice is a positive step in placing boundaries around temptations available to students to participate in contract cheating behaviours. It is not only educating ourselves about the patterns, markers and clues, but also educating students about what is and is not acceptable practice to countermand the use and prevalence of methods of contract cheating in written submissions. Encouraging a practical and systematic approach to the preparation, grading, examination and evaluation of assessments for evidence of contract cheating as presented in this paper will assist in a proactive way to address contract cheating behaviours at an institutional level.

Abasi AR, Akbari N, Graves B (2006) Discourse appropriation, construction of identities, and the complex issue of plagiarism: ESL students writing in graduate school. J Second Lang Writ 15(2):102–117. doi: 10.1016/j.jslw.2006.05.001

Ayán MNR, García MTC (2008) Prediction of university students' academic achievement by linear and logistic models. The Spanish journal of psychology 11(1):275–288

Baggaley J, Spencer B (2005) The mind of a plagiarist. Learning, Media and Technology 30(1):55–62. doi: 10.1080/13581650500075587

Bell A, Mladenovic R, Price M (2013) Students’ perceptions of the usefulness of marking guides, grade descriptors and annotated exemplars. Assessment & Evaluation in Higher Education 38(7):769–788. doi: 10.1080/02602938.2012.714738

Bretag T, Harper R, Ellis C, Newton P, Rozenberg P, Saddiqui S, van Haeringen K (2017) Preliminary findings from a survey of students and staff in Australian higher education. In: Contract cheating and assessment design OLT project Retrieved from www.cheatingandassessment.edu.au/resources/

Bretag T, Mahmud S (2009) A model for determining student plagiarism: electronic detection and academic judgement. Paper presented at the 4th Asia Pacific Conference on Educational Integrity (4APCEI), Wollongong

Buckley E, Cowap L (2013) An evaluation of the use of Turnitin for electronic submission and marking and as a formative feedback tool from an educator's perspective. Br J Educ Technol 44(4):562–570. doi: 10.1111/bjet.12054

Carroll J, Appleton J (2005) Towards consistent penalty decisions for breaches of academic regulations in one UK university. Int J Educ Integr 1(1):1–11

Centre for Academic Development. (n.d.). Recognising plagiarism checklist. Retrieved from www.cad.vuw.ac.nz/wiki/images/d/d7/PlagiarismChecklist.doc

Clarke, R., & Lancaster, T. (2006). Eliminating the successor to plagiarism? Identifying the usage of contract cheating sites. Paper presented at the Proceedings of 2nd International Plagiarism Conference

Coughlin PE (2015) Plagiarism in five universities in Mozambique: Magnitude, detection techniques, and control measures. Int J Educ Integr 11(1):2. doi: 10.1007/s40979-015-0003-5

Dahl S (2007) Turnitin®: the student perspective on using plagiarism detection software. Act Learn High Educ 8(2):173–191

Dawson P, Sutherland-Smith W (2017) Can markers detect contract cheating? Results from a pilot study. Assessment & Evaluation in Higher Education:1–8. doi: 10.1080/02602938.2017.1336746

Dick M, Sheard J, Bareiss C, Carter J, Joyce D, Harding T, Laxer C (2002) Addressing student cheating: definitions and solutions. In: Paper presented at the ACM SigCSE Bulletin

Divan A, Bowman M, Seabourne A (2015) Reducing unintentional plagiarism amongst international students in the biological sciences: an embedded academic writing development programme. J Furth High Educ 39(3):358–378. doi: 10.1080/0309877X.2013.858674

Doró K (2016) To see or not to see: Indentifying and assessing plagiarism in non-native students’ academic writing without using text-matching software. EDULINGUA 2(1):15–26

East J (2009) Aligning policy and practice: an approach to integrating academic integrity. J Academic Language and Learning 3(1):A38–A51

Ellis C (2012) Streamlining plagiarism detection: the role of electronic assessment management. Int J Educ Integr 8(2):46–56

Evans C, Waring M (2011) Exploring students' perceptions of feedback in relation to cognitive styles and culture. Res Pap Educ 26(2):171–190. doi: 10.1080/02671522.2011.561976

Gijbels D, van de Watering G, Dochy F (2005) Integrating assessment tasks in a problem-based learning environment. Assessment & Evaluation in Higher Education 30(1):73–86. doi: 10.1080/0260293042003243913

Gullifer JM, Tyson GA (2014) Who has read the policy on plagiarism? Unpacking students' understanding of plagiarism. Stud High Educ 39(7):1202–1218. https://doi.org/10.1080/03075079.2013.777412

House JD, Hurst RS, Keely EJ (1996) Relationship between learner attitudes, prior achievement, and performance in a general education course: a multi-institutional study. Int J instructional media 23(3):257–271

Hrasky S, Kronenberg D (2011) Curriculum redesign as a faculty-centred approach to plagiarism reduction. Int J Educ Integr 7(2):23–36

Jones L, Maxwell J (2015) Engaging midwifery students in academic integrity through a multi-faceted, integrated approach. Journal of Education and Human Development 4(4):32–38

Jones M, Sheridan L (2015) Back translation: an emerging sophisticated cyber strategy to subvert advances in ‘digital age’ plagiarism detection and prevention. Assessment & Evaluation in Higher Education 40(5):712–724. doi: 10.1080/02602938.2014.950553

Kelley C, Tong P, Choi BJ (2010) A review of assessment of student learning programs at AACSB schools: a Dean's perspective. J Educ Bus 85(5):299–306

Kember D, Gow L (1992) Action research as a form of staff development in higher education. High Educ 23(3):297–310

Lancaster T, Clarke R (2007) Assessing contract cheating through auction sites–a computing perspective. In: Paper presented at the proceedings of 8th annual conference for information and computer sciences

LaSalle RE (2009) The perception of detection, severity of punishment and the probability of cheating. J Forensic Studies in Accounting & Business 1(2):93–112

Lewin K (1946) Action research and minority problems. J Soc Issues 2(4):34–46

Lines L (2016) Ghostwriters guaranteeing grades? The quality of online ghostwriting services available to tertiary students in Australia. Teach High Educ 21(8):889–914. doi: 10.1080/13562517.2016.1198759

McCarthy G, Rogerson AM (2009) Links are not enough: using originality reports to improve academic standards, compliance and learning outcomes among postgraduate students. Int J Educ Integrity 5(2):47–57

McTaggart R (1991) Principles for participatory action research. Adult Educ Q 41(3):168–187

McWilliams R, Allan Q (2014) Embedding academic literacy skills: towards a best practice model. J University Teaching and Learning Practice 11(3):1–20

Meyers NM, Nulty DD (2009) How to use (five) curriculum design principles to align authentic learning environments, assessment, students’ approaches to thinking and learning outcomes. Assessment & Evaluation in Higher Education 34(5):565–577

Rogerson, A. M. (2014, 16–18 June 2014). Detecting the work of essay mills and file swapping sites: some clues they leave behind . Paper presented at the 6th International Integrity and Plagiarism Conference Newcastle-on-Tyne

Rogerson AM (2016) Being AWARE about academic integrity: a framework to promote discussion, identification and recall. Paper presented at the Higher Education Compliance and Quality Forum, Melbourne

Rogerson AM, Bassanta G (2016) Peer-to-peer file sharing and academic integrity in the internet age. In: Bretag T (ed) Handbook of academic integrity. Springer, Singapore, pp 273–285

Rogerson AM, Bretag T (2015) Developing the confidence to intervene. In: Paper presented at the 7APCEI 7th Asia Pacific Conference on Educational Integrity. Charles Sturt University, Albury

Rogerson AM, McCarthy G (2017) Using internet based paraphrasing tools: original work, patchwriting or facilitated plagiarism? Int J Educ Integr 13(1):1–15. doi: 10.1007/s40979-016-0013-y

Seals M, Hammons JO, Mamiseishvili K (2014) Teaching assistants' preparation for, attitudes towards, and experiences with academic dishonesty: lessons learned. Int J Teaching and Learning in Higher Education 26(1):26–36

Somers, H., Gaspari, F., & Niño, A. (2006, 19-20 June 2006). Detecting inappropriate use of free online machine-translation by language students-a special case of plagiarism detection . Paper presented at the 11th annual conference of the European Association for Machine Translation–Proceedings, Oslo, Norway

Sowden C (2005) Plagiarism and the culture of multilingual students in higher education abroad. ELT J 59(3):226–233. doi: 10.1093/elt/cci042

Yeo S, Chien R (2007) Evaluation of a process and proforma for making consistent decisions about the seriousness of plagiarism incidents. Qual High Educ 13(2):187–204. doi: 10.1080/13538320701629202

Download references

Acknowledgments

The author would like to thank her colleagues Dr. Celeste Rossetto and Dr. Oriana Price for their useful feedback on early manuscript drafts, and the two anonymous reviewers for their constructive feedback on the original review version of this manuscript.

Author information

Authors and affiliations

Faculty of Business, University of Wollongong, Northfields Avenue, Wollongong, NSW, 2522, Australia

Ann M. Rogerson


Corresponding author

Correspondence to Ann M. Rogerson .

Ethics declarations

Competing interests

The author declares that they have no competing interests.

Publisher’s Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Additional files

Additional file 1:

Example template for documenting irregularities page 1. (PDF 66 kb)

Additional file 2:

Example template for documenting irregularities page 2. (PDF 26 kb)

Rights and permissions

Open Access This article is distributed under the terms of the Creative Commons Attribution 4.0 International License ( http://creativecommons.org/licenses/by/4.0/ ), which permits unrestricted use, distribution, and reproduction in any medium, provided you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons license, and indicate if changes were made.


About this article

Cite this article.

Rogerson, A.M. Detecting contract cheating in essay and report submissions: process, patterns, clues and conversations. Int J Educ Integr 13 , 10 (2017). https://doi.org/10.1007/s40979-017-0021-6


Received : 03 September 2017

Accepted : 31 October 2017

Published : 14 November 2017

DOI : https://doi.org/10.1007/s40979-017-0021-6


Keywords: Contract cheating, Student conversations, Assessment design

International Journal for Educational Integrity

ISSN: 1833-2595


A systematic review of research on cheating in online exams from 2010 to 2021

Fakhroddin Noorbehbahani

Faculty of Computer Engineering, University of Isfahan, Azadi square, 8174673441 Isfahan, Iran

Azadeh Mohammadi

Mohammad Aminazadeh

In recent years, online learning has received more attention than ever before. One of the most challenging aspects of online education is student assessment, since academic integrity can be violated through various cheating behaviors in online examinations. Although a considerable number of literature reviews exist about online learning, no review study provides comprehensive insight into cheating motivations, cheating types, cheating detection, and cheating prevention in the online setting. The current study is a review of 58 publications about online cheating, published from January 2010 to February 2021. We present a categorization of the research and show topic trends in the field of online exam cheating. The study can be a valuable reference for educators and researchers working in the field of online learning to obtain a comprehensive view of cheating mitigation, detection, and prevention.

Introduction

Today, distance education has largely moved into online settings, and the COVID-19 pandemic has increased online learning significantly across the world. The pandemic forced the closure of face-to-face teaching worldwide, with 1.5 billion students and 63 million educators shifting to online learning. This situation has revealed the strengths and weaknesses of the digital transformation of education (Valverde-Berrocoso et al., 2020).

Martin et al. (2020) showed that the number of online learning publications increased continuously from 2009 to 2018 and that course assessment is one of the leading research themes. Course assessment is particularly challenging in online learning because of the lack of direct control over students and educators.

For an educational institution, assessment integrity is essential because it affects institutional reputation. Supporting assessment integrity in online exams requires traditional cheating detection and prevention methods as well as new digital monitoring and validation techniques (Fluck, 2019).

Watson and Sottile (2010) reported that students are remarkably more likely to obtain answers from others during online exams or quizzes than in live (face-to-face) ones. Preserving the integrity of online exams is therefore more challenging. Strategies for mitigating online exam cheating include holding proctored face-to-face exams, developing cheat-resistant questions (e.g., using subjective rather than objective measures), and lowering the share of the overall course grade contributed by the exam.

Traditional cheating methods include hiding notes in a pencil case, behind a ruler, or in clothing; writing on arms or hands; leaving the room; and so on (Curran et al., 2011). Technological advances and online learning have enhanced education; however, they have also facilitated cheating (Turner & Uludag, 2013). For instance, an examinee could use a mobile phone to text someone for an answer. Although this would be difficult in an exam hall, some examinees can text without looking at the phone. Scientific calculators, MP3 players, and wireless equipment such as earphones and microphones are other tools that facilitate cheating in offline exams (Curran et al., 2011).

Although cheating motivations in online and offline exams are not significantly different (Turner & Uludag, 2013), detecting and mitigating online cheating can be more intricate. In addition to traditional cheating methods, which can also be exploited in online exams, various technologies and tools make cheating in online exams easier, for example, remote desktop and screen sharing, searching for solutions on the Internet, and using social networks.

Cheating in an online setting is more convenient than in a traditional offline exam. Accordingly, detecting and preventing online cheating is critical for online assessment, and it is one of the biggest challenges facing summative assessment in MOOCs (Massive Open Online Courses).

Recent research implies that academic dishonesty and cheating are critical issues in online education. Today, paid services exist that impersonate students in online courses. In recent years, proctoring technologies such as identity authentication, keystroke recognition, and webcam proctoring have been extended to secure online exams (Xiong & Suen, 2018). Apart from direct proctoring, there are techniques such as controlling the browser, limiting exam time, and randomizing questions and choices. Even so, cheating in online courses appears to be quite common (Dendir & Maxwell, 2020).

Although one of the most critical challenges in online learning is to mitigate and handle cheating, there is no comprehensive literature review and classification in this field. Hence, in this paper, we present a systematic mapping review of researches in online examination cheating. The research questions are as follows:

  • RQ1: What are the publication trends in online cheating?
  • RQ2: What are the main reasons for online cheating?
  • RQ3: What are the cheating types in online exams?
  • RQ4: How can online cheating be detected?
  • RQ5: How can online exam cheating be prevented?

The paper is structured as follows. In Section 2 , the research method is described, including study selection criteria, databases and search strategy, and study selection. Section 3 presents review results and provides the answers to research questions. Sections 4 and 5 discuss the results and conclude the paper, respectively.

The current study is a literature review about cheating in online exams. A literature review identifies, selects, and synthesizes primary research studies in order to provide a picture of the topic under investigation. According to (Page et al., 2021 ), a record is the title or abstract (or both) of a report indexed in a database or website, and a report is a document (in paper or electronic format) supplying information about a particular study. It could be a journal article, preprint, conference abstract, study register entry, clinical study report, dissertation, unpublished manuscript, government report, or any other document providing relevant information. The current literature search has been performed based on the well-established PRISMA principles (Page et al., 2021 ).

Inclusion and exclusion criteria

The main criteria for the articles considered in the current review are as follows.

Inclusion criteria:

  • Studies should be written in English.
  • Records should be retrieved utilizing the designed search query.
  • Studies should be published between January 2010 and February 2021.
  • In cases where several papers reported the same study, only the most recent ones were included (i.e., theses and papers extracted from theses, extended version of papers published in journals).

Exclusion criteria:

  • Papers merely related to methods applicable to traditional cheating types, detection, and prevention are eliminated.
  • Studies not related to research questions are ignored.
  • Articles only related to cyber-attacks to online exam systems are excluded.
  • Low-quality research is discarded (e.g., studies published by non-reputable publishers without peer review or with an implausibly short review time, and studies with a poor theoretical background, experimental evaluation, or structure).

Databases and search strategy

We used a wide range of databases as our primary sources, including Google Scholar, Web of Science, and Scopus, and also added publications that cited the extracted records. Records were searched using the following search terms in the title, keywords, and abstract sections.

(Cheat OR e-Cheating OR Fraud OR Dishonesty OR Anti-cheating OR Cheat-resistant OR Abnormal behavior OR Misconduct OR Integrity OR Plagiarism) AND

(Electronic OR Online OR Digital OR Virtual OR Cyber OR Academic) AND

(Exam OR e-Exam OR Course OR e-Course OR Assessment OR e-Assessment OR Test OR e-Test OR Environment OR e-Environment) AND

(Prevent OR Detect OR Mitigate OR Reduce OR Minimize OR Monitor OR Proctor OR Reason OR Motivation OR Type OR Deter OR Control).
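To make the strategy concrete, the four term groups can be combined programmatically into a single boolean string. The following is a minimal Python sketch; the `build_query` helper is purely illustrative and not tied to any particular database's query API.

```python
# Compose the boolean search string from the four term groups listed above.
# The helper is illustrative only; each database has its own query syntax.
TERM_GROUPS = [
    ["Cheat", "e-Cheating", "Fraud", "Dishonesty", "Anti-cheating", "Cheat-resistant",
     "Abnormal behavior", "Misconduct", "Integrity", "Plagiarism"],
    ["Electronic", "Online", "Digital", "Virtual", "Cyber", "Academic"],
    ["Exam", "e-Exam", "Course", "e-Course", "Assessment", "e-Assessment",
     "Test", "e-Test", "Environment", "e-Environment"],
    ["Prevent", "Detect", "Mitigate", "Reduce", "Minimize", "Monitor",
     "Proctor", "Reason", "Motivation", "Type", "Deter", "Control"],
]

def build_query(groups):
    """Join the terms of each group with OR and the groups with AND."""
    return " AND ".join("(" + " OR ".join(g) + ")" for g in groups)

print(build_query(TERM_GROUPS))
```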

Study selection

The search returned 289 records, 26 of which were duplicates and were deleted. From the 263 screened records, 54 were excluded by examining either the title or the abstract. In the next step, 12 reports were eliminated because they could not be retrieved. Finally, after full-text eligibility checking, 144 reports were excluded according to the inclusion and exclusion criteria mentioned earlier.

This resulted in 53 reports which, along with 5 further reports obtained from citation searching and assessed for eligibility, were finally selected for the literature review about online cheating. The flow of information through the different phases of the review is presented in the PRISMA flow diagram in Fig. 1.
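The record flow can be re-traced with simple arithmetic; the sketch below merely recomputes the counts reported above to show they are consistent.

```python
# Re-trace the PRISMA record counts reported in the text.
identified = 289
duplicates = 26
screened = identified - duplicates                            # 263 records screened
excluded_by_title_abstract = 54
sought_for_retrieval = screened - excluded_by_title_abstract  # 209 reports sought
not_retrieved = 12
assessed_full_text = sought_for_retrieval - not_retrieved     # 197 reports assessed
excluded_full_text = 144
from_databases = assessed_full_text - excluded_full_text      # 53 reports
from_citation_searching = 5
included = from_databases + from_citation_searching           # 58 studies in the review

assert (screened, from_databases, included) == (263, 53, 58)
print(included)
```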

Fig. 1. The PRISMA flow diagram

After selecting the 58 studies, three domain experts were asked to assign a Credibility Score (CS) to each study. After evaluating each study, the experts agreed on a credibility score ranging from 0 to 5 based on the following criteria: publisher credibility, number of citations per year, theoretical and experimental quality, and organization and structure. The CS statistics are: mean = 3.81, SD = 0.79, min = 2.5, max = 5.

A summary of the online cheating research papers and their study themes is presented in Table 1 (Appendix 1).

Online cheating studies

Several findings emerged from the research synthesis of the selected fifty-eight records on online cheating. The selected studies were categorized into four main topics, namely cheating reasons, cheating types, cheating detection, and cheating prevention, as shown in Fig. 2. All subsequent classifications reported in this paper were provided by the authors. The studies under each of the four main topics were investigated by three experts, and a list of items was extracted for each category; notably, some studies corresponded to multiple main topics. Next, several brainstorming sessions were conducted to refine the classification of each main topic. The classifications were extracted with XMind, a professional and popular mind-mapping tool.

Fig. 2. Online cheating research classification

In the following sub-sections, the detailed analysis of the review results is described according to the five research questions we defined to drive the research.

Publication trends

Figure 3 displays the number of publications per year (in this study, the final publication date is used); the greatest number of the reviewed studies were published in 2017. As shown in Fig. 4, the dominant publication type is journal papers, accounting for 53% of the total publications. In terms of average citations per publication class, journal papers rank highest with an average of 19.65 citations (see Fig. 5).

Fig. 3. Number of publications per year

Fig. 4. Distribution of publications per type

Fig. 5. Average citations per publication type

In total, 747 works cite the selected studies. As displayed in Fig. 6, the greatest and smallest shares of the total citations pertain to journal articles and theses, respectively. The number of publications per research theme is shown in Fig. 7; cheating prevention and cheating detection are the most prevalent research themes in online cheating. In the following four subsections, the studies under each research theme are described and classified thoroughly.

Fig. 6. Distribution of publications according to citations

Fig. 7. Number of publications per research theme

Cheating reasons

The primary reason for cheating is that examinees feel the rewards outweigh the risks (Lancaster & Clarke, 2017). There is a wide variety of reasons why candidates decide to cheat, but they can be grouped into four general categories: teacher-related, institutional, internal, and environmental reasons. The complete classification of cheating reasons is displayed in Fig. 8 and described in the following sections.

Fig. 8. Classification of cheating reasons

Teacher-related reasons

All the reasons related to the teacher or course instructor fall into this category. Maeda (2019) observed that the student’s relationship with the teacher has a crucial influence on academic integrity. Teachers’ unethical behaviors, such as favoring students who have bribed them or who attended their private tutoring sessions, motivate the disadvantaged students to cheat. The author also found that teachers’ low interest in students’ depth of learning, which results in a poor pedagogical style, can be an important reason motivating students to engage in unethical behavior (Maeda, 2019).

Course difficulty could motivate the examinees to cheat. Some students blamed their teachers for complicated and complex course materials. In some specific cases, this reason could be a consequence of students’ lack of perseverance. They find cheating as a way to relieve these difficulties (Amigud & Lancaster, 2019 ).

Because distributed learning relies on online courses and examinations, Moten et al. (2013) observed that students feel isolated in an online environment. They often become frustrated when they do not get the help they need immediately, for instance the night before an exam. This situation depends strongly on how much time the teacher is present in online communication environments.

Some teachers refrain from punishing cheaters appropriately because of ethical concerns or sympathy. After listening to the cheater’s excuses and justifications, the teacher might give them a second chance. Sometimes teachers worry about the consequences of punishment and the pressure it places on cheaters, so they either do not punish the cheater or the punishment is too lenient.

This emboldens students to cheat during online exams, since the risk of being punished after being caught is reduced, and it signals that cheating penalties are insignificant in the long run (Topîrceanu, 2017).

Exam design is one of the most important factors contributing to cheating. Weakly designed exams, for example with identical multiple-choice questions for every examinee or with solutions easily found on the web, make cheating easy. On the other hand, questions that are too complex or irrelevant to the course materials push students toward cheating during online exams (Srikanth & Asmatulu, 2014).

Institutional reasons

Maeda (2019) observed that the rules and policies of an institution are directly related to how often unethical behavior occurs. Institutions with stricter regulations and a stronger commitment to academic integrity face much less cheating among their students. Institutional policies not only create an anti-cheating atmosphere but also make dishonest academic behavior harder to carry out. Likewise, Backman (2019) emphasizes that if it becomes easy for students to cheat, they will cheat.

Impulsiveness is a crucial reason why students try to cheat during online examinations. They feel isolated and disconnected, so they may imagine they won’t get caught or the instructor does not care if they commit academic dishonesty. Unethical behaviors have a direct relationship with the student’s impulsiveness (Moten et al., 2013 ).

Moreover, in an isolated environment without face-to-face communication, students have much less respect for their teachers, which leads to increased misbehavior. That is why teachers should personalize the online environment for students, for example by calling them by name or listening to their voices, so that online classes become more engaging and interactive (Moten et al., 2013).

Dobrovska (2017) noted that the poor quality of an institution’s online learning system discourages students from learning the course materials and makes learning difficult, so they are more motivated to cheat.

Disregarding academic aptitude is one of the most important and underrated reasons leading students to misbehave. When educational institutions treat all students alike, they ignore students’ unique abilities, skills, and differing levels of preparedness for a specific task. Unprepared students then feel frustrated with that task or course, which leads them to seek help from more talented and better-prepared students in that context (Amigud & Lancaster, 2019).

Internal reasons

Another category of cheating reasons is internal motivators. The motivators over which the candidate has complete control, including intrinsic factors, personality and psychological characteristics, lie in this category. The internal reasons are divided into three subcategories as follows.

Student’s academic performance

One significant internal factor is the student’s academic performance. Several issues can result in poor academic performance: lacking the learning skills needed to find resources, unwillingness to follow recommended practices, inability to seek appropriate help, procrastination, poor time management (Dobrovska, 2017), and lack of confidence in one’s ability to learn the course materials (Norris, 2019).

Low intrinsic interest in the course materials

Low intrinsic interest in the course is another reason mentioned in (Dobrovska, 2017 ), which could be caused by a lack of sufficient interest in course materials and subjects or the mindset that these materials and knowledge are unnecessary and unimportant for future life (Norris, 2019 ).

Personal characteristics

There is a strong relationship between students’ moral attitudes toward cheating and their level of participation in academic misbehaviors (Maeda, 2019 ). Therefore, conscientious belief is considered as an internal reason stopping students from unethical behaviors. However, it has been shown that religious beliefs do not necessarily lower cheating behaviors (Srikanth & Asmatulu, 2014 ).

Other reasons mentioned in the studies are the student’s laziness about preparing sufficiently at home before the exam (Dobrovska, 2017), competition with others and the desire to get ahead (Amigud & Lancaster, 2019), the desire to help peers (Moten et al., 2013), and the thrill of taking risks (Hylton et al., 2016).

Environmental reasons

The reasons in this section depend strongly on the atmosphere and type of environment a student is in, either during the online exam or beforehand, on social media or in communication with other people. We put these reasons into four major categories: peers’ behavior, parents’ attitudes, personal issues, and societal factors.

Peers’ behavior

Peers could influence individuals in a manner that their cheating motivations are increased. In an academic environment, however, it is primarily because of the competing objectives, such as the desire to get ahead in scores. This depends on the amount of competition in the academic environment (Amigud & Lancaster, 2019 ).

Experimental research among Cambodian students found that being in a group of cheaters psychologically drives students to repeat their peers’ actions and cheat. In addition, there is strong pressure on those who do not collaborate with peers or refuse to participate in their group work; they are blamed for being odd and unkind (Maeda, 2019).

According to (Srikanth & Asmatulu, 2014), being in an environment where peers’ cheating remains undetected gives non-cheaters the feeling that they are falling behind in scores and are unfairly disadvantaged compared with the cheaters.

Parents’ attitude

Parents’ acceptance of cheating massively affects a student’s mindset toward these behaviors. As expressed in (Maeda, 2019), parents’ responses to their child’s cheating vary from complete unacceptance to active involvement and support. Another parent-related reason is the pressure placed on children to achieve good or above-average grades (Backman, 2019).

Personal issues

Personal issues could be mental and physical health problems (Amigud & Lancaster, 2019 ), problems within the family (e.g., parents arguing, separation and divorce, etc.), and fear of failure in exams and its further consequences like financial and time setbacks (Hylton et al., 2016 ).

Societal factors

Poor economic conditions and the development level of a country are examples of societal factors affecting students’ motivation to cheat and achieve academic success (Maeda, 2019 ).

Countries with different cultures, social expectations, and attitudes exhibit different behaviors regarding academic performance. In some countries, academic performance and grades are considered crucial for success in life, whereas in others academic performance is valued relatively little. This range of expectations placed on students leads to different social beliefs and behaviors toward cheating (Maeda, 2019). Research presented in (Holden et al., 2020) shows that a primary reason can be the existence of a cheating culture. Some students may cheat because they want to present a better image of themselves to their society (Norris, 2019). Another societal factor influencing cheating behavior is technological evolution, which strengthens cheating motivation (Maeda, 2019) by increasing access to cheating resources; search engines and social media in particular make it easier for students to cheat.

Cheating types and facilitators

To mitigate cheating behaviors effectively and efficiently, cheating methodologies, types, and facilitators must be understood. Cheating is performed either individually or in cooperation with others (group cheating). Figure 9 displays the complete classification of cheating types.

Fig. 9. Cheating types

Individual cheating

Individual cheating is carried out without assistance from any other person. It can be divided into using forbidden materials and other types, described as follows.

Using forbidden materials

Individual cheating can occur by using forbidden materials during the exam, such as looking at a textbook or a cheat sheet (Fontaine et al., 2020 ), (Holden et al., 2020 ), searching the web, using offline electronic resources such as images, voices, etc. (Korman, 2010 ), (Holden et al., 2020 ), or even using objects in the exam room to hide notes.

Other types

Other types of individual cheating include accessing the questions and solutions before the exam, which Korman ( 2010 ) refers to as “unauthorized intelligence”. Another dishonest behavior is social engineering, which is grade negotiation with the teacher through fake facts and exploiting personal sympathy.

Group cheating

Cheating through cooperation with others can be categorized into impersonation and collaboration.

Impersonation

Impersonation means employing someone to take the exam for the examinee, either the whole exam or some parts of it (Korman, 2010 ), (Holden et al., 2020 ). It can occur in forms of voice conversion, face presentation attack and face impersonation, fake identity matching to a stored biometric, and attack on the keystroke dynamics (Chirumamilla & Sindre, 2019 ). These are attacks on the biometric system to bypass the authentication mechanisms. The other impersonation techniques include remote desktop control by a third party (Kasliwal, 2015 ), (Gruenigen et al., 2018 ), sharing the screen with a third party (Gruenigen et al., 2018 ), (Bawarith, 2017 ), and credential sharing, which is impersonation via shared username and password of an academic account or LMS (Learning Management System) (Dobrovska, 2017 ).

Collaboration

Collaboration is defined as getting any kind of help from others to answer the exam questions. It could be in the form of sign language communications that come in numerous forms, such as foot-tapping, pencil or any object dropping during the proctored exam, abnormal coughing, or suspicious actions (Srikanth & Asmatulu, 2014 ).

Listening to a third party’s whispers behind the camera (Chirumamilla & Sindre, 2019 ), any type of communication which is unauthorized such as sending or receiving messages, or voice and video calls (Korman, 2010 ), are also considered as collaborative cheating.

Other cheating methods in this category are remote desktop control (Kasliwal, 2015 ) and sharing the screen with others to collaborate with others about questions (Gruenigen et al., 2018 ), applying small hidden micro cameras to capture images and record videos for sharing with other peers (Bawarith, 2017 ), and finally, organizational cheating which is a result of institution’s personnel corruption (Korman, 2010 ).

The last one, as Korman ( 2010 ) showed, can take place when personnel help candidates to cheat. Changing the exam grade or exam answers after the exam (exam integrity corruption), giving the solutions to the candidate during the exam, or just bribing the proctor not to report the cheating or not to punish after being caught (Kigwana & Venter, 2016 ) are instances of organized cheating.

Contract work is a type of collaboration that means doing work with the help of someone else under the obligations of a contract. Contract workers may provide some or all of the exam answers. In this case, sometimes impersonating the student through the whole academic course is reported (Chirumamilla & Sindre, 2019 ).

Cheating facilitators

The methods discussed here act as cheating facilitators that support the process of cheating; in other words, they can be applied to carry out any kind of cheating. A study presented in (Peytcheva-Forsyth et al., 2018) indicates that technology in general is the leading facilitator of cheating practices. Cheating facilitators are classified as shown in Fig. 10.

Fig. 10. Classification of cheating facilitators

Three different methodologies are used by students to facilitate cheating, either individually or in a group, described as follows.

Interrupting to get more time

Sometimes examinees try to buy more time to work on their answers. For instance, an examinee may report an error in the exam system or proctoring software to convince the teacher to restart the exam session; while the session is closed, the candidate gains extra time to cheat and find solutions (Moten et al., 2013). Another interruption method is to submit corrupted answer files. The teacher reports that the files were corrupted and asks the candidate to resubmit them; because at least a day usually passes between the first and second submission, the candidate gains at least one more day to answer the exam questions (Moten et al., 2013).

Other more classical methods to interrupt are toilet requests during the exam (Chirumamilla & Sindre, 2019 ), communication break and delay in answering oral exam right after a question is asked (Chirumamilla & Sindre, 2019 ), circumventing the exam process at a specific time with different excuses, and postponing taking the exam (Fontaine et al., 2020 ), (Korman, 2010 ). By deferring taking the exam, students can buy more time to become more prepared, either by studying more, or getting access to the exam questions and solutions.

Employing multiple devices

In proctored exams, either by a camera or software, students try to use multiple devices and answer the questions with the primary one while cheating via the secondary device. Several types of devices could be employed as the second device, such as computers and laptops (Moten et al., 2013 ), smartwatches (Wong et al., 2017 ), smart glasses such as Google glasses (Srikanth & Asmatulu, 2014 ), smartphones and tablets (Korman, 2010 ), programmable and graphical calculators to store notes and formulas (Kigwana & Venter, 2016 ), and tiny earpieces for remote voice support during the exam (Bawarith, 2017 ).

Other facilitators

Redirecting the webcam to hide something from its field of view (Sabbah, 2017 ), (Srikanth & Asmatulu, 2014 ), or disabling the webcam or microphone completely (Srikanth & Asmatulu, 2014 ) are other tricks used to facilitate cheating.

By using a virtual machine, the user can run a second, virtual operating system on top of the primary one. This hides the activities performed in the virtual operating system from the software or the human proctoring the primary operating system (Kasliwal, 2015).

Corrupting the exam system’s integrity to change the results after the exam has been held (e.g., changing scores or answers) is another notable case (Korman, 2010). Lastly, Parks et al. (2018) found that social media and the channels operating on them can act as cheating facilitation environments.

Cheating detection

Cheating detection methods can be categorized into during-exam and after-exam methods. Further classification of the cheating detection methods is presented in Fig. 11.

Fig. 11. Classification of cheating detection methods

Cheating detection during the exam

To ensure academic integrity in online examinations, it is essential to detect cheating during the exam. Cheating detection can be partitioned into two main categories, namely, continuous authentication and online proctoring. Continuous authentication methods verify the identity of test-takers, and online proctoring monitors the examinees to detect any misbehavior during the exam. In the following, we will mention different techniques in each category.

Continuous authentication

One of the main types of cheating is impersonation. Therefore, it is essential to authenticate students before exam registration and prevent unauthorized candidates from taking the examination. In addition, it is necessary to continuously validate the identity of the test-taker during the exam. Continuous authentication systems are mainly based on biometric or behaviometric modalities and can be categorized into unimodal and multimodal schemes.

Unimodal authentication is the automatic recognition and identification of candidates using a unique characteristic. This characteristic could be either static (physiological) such as the face, fingerprint, hand geometry, and iris, or could be dynamic (behavioral) such as voice, handwriting, keystroke, and mouse dynamics (Chirumamilla & Sindre, 2019 ).

As a unimodal authentication system, Arnautovski ( 2019 ) designed a face recognition system, which captures the image of the test-taker at random time intervals. The facial recognition module continuously verifies the examinee’s identity by comparing captured images to the image from the exam registration process. In (Aisyah et al., 2018 ), an Android-based online exam application is implemented that takes photos of the examinee with random intervals and a web-based application lets the admin or supervisor of examination validate pictures of participants. In addition, Idemudia et al. ( 2016 ) proposed a system that tracks and detects faces continuously to verify the candidates. If the authentication failure remains for more than a few seconds, the system will stop the examination.
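A minimal sketch of this kind of random-interval face verification is shown below. It assumes the open-source `face_recognition` and `opencv-python` packages and a locally stored registration photo; the file name, check intervals, and failure policy are illustrative assumptions rather than details of the systems cited above.

```python
# Sketch: verify the examinee against the registration photo at random intervals.
# Assumes the face_recognition and opencv-python packages are installed.
import random
import time

import cv2
import face_recognition

# Encoding of the photo captured at exam registration (hypothetical file name).
reference = face_recognition.load_image_file("registration_photo.jpg")
reference_encoding = face_recognition.face_encodings(reference)[0]

camera = cv2.VideoCapture(0)  # default webcam

def verify_once():
    """Capture one frame and compare it with the registration encoding."""
    ok, frame = camera.read()
    if not ok:
        return False
    rgb = cv2.cvtColor(frame, cv2.COLOR_BGR2RGB)
    encodings = face_recognition.face_encodings(rgb)
    if not encodings:
        return False  # no face visible in the frame
    return face_recognition.compare_faces([reference_encoding], encodings[0])[0]

failures = 0
while failures < 3:                      # stop after repeated consecutive failures
    time.sleep(random.randint(30, 120))  # random interval between checks
    failures = 0 if verify_once() else failures + 1
print("Authentication failed repeatedly; exam session flagged or terminated.")
```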

In (Sabbah, 2017), a scheme called ISEEU is proposed, in which each examinee’s session is streamed using a webcam. A proctor monitors the video screens and can generate alerts when suspicious actions are detected. He et al. (2018) proposed an anti-ghostwriter system based on face recognition: ghostwriters may merge their own photo with the student’s to produce a fake one, or change their appearance to mislead the examiners. The experimental results in (He et al., 2018) indicate that the proposed framework can detect ghostwriters with an acceptable level of accuracy.

Since some candidates may refuse to use a camera due to privacy concerns, Bilen et al. (2020) suggested that instructors offer their students two options. An examinee can agree to use a camera during the exam. In this situation, the record will be used as evidence if they are accused of cheating. However, if the examinee doesn’t accept using a camera, the instructor can claim cheating without providing evidence to the student.

In (Bawarith, 2017 ), the system authenticates the examinees continuously through an eye tracker. The data obtained from the eye tracker are translated into a set of pixel coordinates so that the presence or absence of eyes in different screen areas can be investigated.

Multimodal biometric authentication systems utilize different biometric or behaviometric traits simultaneously, which makes impersonating more difficult. In this regard, Bawarith et al. ( 2017 ) proposed a system that utilizes fingerprint and eye-tracking for authentication. The eye tribe tracker is used to continuously ensure that test-takers are the ones they are claiming to be. Whenever the system detects the examinee is no longer present in front of the screen, the system is locked, and the test-taker must be authenticated again via fingerprint.

In (Sabbah, 2017 ), a multimodal scheme called SABBAH is proposed, which adds continuous fingerprint and keystroke dynamics to the ISEEU scheme (Sabbah, 2017 ). In contrast to ISEEU, SABBAH uses an automatic system to detect fingerprint, keystroke, or video violations. Traore et al. ( 2017 ) proposed a system that continuously authenticates examinees using three complementary biometric technologies, i.e., face, keystroke, and mouse dynamics. In this system, test-takers are continuously authenticated in the background during the exam, and alarms are created and sent to the instructor through the proctoring panel.

Online proctoring

Online proctoring is essential to promote academic integrity. Alessio et al. ( 2017 ) reported significant grade disparities in proctored versus un-proctored online exams. Online proctoring can be categorized into human and automated proctoring. In human proctoring, a human proctor monitors the students remotely to detect suspicious behavior. In contrast, in automated proctoring, the cheating behaviors are flagged or detected automatically by the proctoring system.

Recently, several technologies have been developed to facilitate proctoring online exams remotely. For example, Kryterion™ Live Video Monitoring and ProctorU allow users to be monitored by a human proctor via a webcam during examination (Hylton et al., 2016 ). In (Reisenwitz, 2020 ), substantial support for online proctoring is provided. The results show a significant difference between the scores of exams that were not proctored and those proctored using ProctorU software.

Some systems can capture screenshots of the candidates’ PCs at random times during the examination (Migut et al., 2018 ). Consequently, if examinees use any forbidden resource on their computer, it will be shown to the proctor. Alessio ( 2018 ) applied video proctoring via a webcam at Miami University. The results demonstrate that students are less likely to cheat when monitored with a webcam during online testing.

In another study, kiosk-based remote online proctored examinations are compared with tests administered under a traditional proctoring environment. In kiosk-based proctoring, the test is taken on special computer kiosks located at accessible places such as libraries. The kiosks are equipped with enhanced webcams and are supervised online by a live remote proctor. The results indicated that examinees’ scores obtained under online kiosk-based proctoring are comparable to examinations taken in test centers with onsite proctors (Weiner & Hurtz, 2017 ).

A different approach to cheating detection is the class mole: the instructor enrolls in student groups under another name to detect and combat collusion, discovering dishonest students when they discuss cheating amongst themselves (Moten et al., 2013).

Human proctoring is costly and labor-intensive. Therefore, different automated proctoring systems are proposed to monitor the students during the examination and detect unauthorized behavior. In the following, we discuss several automated methods.

Chuang et al. proposed a semi-automatic proctoring system that employs two factors, namely, time delay in answering the questions and head-pose variation, to detect suspicious behavior. Afterward, a human proctor could use more evidence to decide whether a student has cheated (Chuang et al., 2017 ).

Garg et al. ( 2020 ) proposed a system to detect the candidate’s face using Haar Cascade Classifier and deep learning. If the examinee’s face moves out of the examination frame or multiple faces are detected in the frame, the test will automatically be terminated, and the administrator will receive a notification. In (Fayyoumi & Zarrad, 2014 ), a two-second candidate video is taken during the examination period. The images in the video are analyzed to verify whether the examinee is looking somewhere other than their screen. If the test-taker doesn’t focus on their screen, it may indicate cheating behaviors such as looking at an adjacent PC or reading from an external source.
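As a rough illustration of such a face-count check (a sketch in the spirit of the cited systems, not their published implementations), the snippet below uses OpenCV's bundled Haar cascade; the termination policy is an assumption.

```python
# Sketch of a webcam face-count check using OpenCV's bundled Haar cascade.
import cv2

cascade = cv2.CascadeClassifier(
    cv2.data.haarcascades + "haarcascade_frontalface_default.xml")
camera = cv2.VideoCapture(0)

def faces_in_current_frame():
    """Return the number of faces detected in the current webcam frame."""
    ok, frame = camera.read()
    if not ok:
        return 0
    gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
    faces = cascade.detectMultiScale(gray, scaleFactor=1.1, minNeighbors=5)
    return len(faces)

count = faces_in_current_frame()
if count == 0:
    print("No face in frame: notify administrator / terminate the test.")
elif count > 1:
    print("Multiple faces detected: notify administrator / terminate the test.")
```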

In (Hu et al., 2018 ), the proposed system uses a webcam to monitor candidates' head posture and mouth state to detect abnormal behavior. Through the rule-based reasoning method, the system can detect suspicious behavior such as turning heads and speaking during the online examination.

Prathish et al. ( 2016 ), developed a multimodal system for online proctoring. The system captures audios and videos of the candidates as well as their active windows. If yaw angle variations, audio presence, or window changes are detected in any time frame, it can be considered an indicator of cheating. Consequently, the captured video, audio, and system usage are fed into a rule-based inference system to detect the possibilities of misbehaviors. ProctorTrack is another automated online exam proctoring product that employs facial and audio recognition, body movements, and computer activity monitoring to detect any suspicious action during examination (Norris, 2019 ).
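A rule-based combination of such indicators might look like the following sketch; the indicator names, weights, and alert threshold are illustrative assumptions, not parameters of the systems cited above.

```python
# Hedged sketch of rule-based suspicion scoring over per-time-frame indicators
# (yaw-angle deviation, audio presence, active-window change). Weights and the
# alert threshold are illustrative assumptions.
def suspicion_score(yaw_deviation_deg, audio_detected, window_changed):
    score = 0.0
    if yaw_deviation_deg > 30:   # looking well away from the screen
        score += 0.4
    if audio_detected:           # speech picked up by the microphone
        score += 0.3
    if window_changed:           # examinee switched away from the exam window
        score += 0.5
    return score

frame = {"yaw_deviation_deg": 42, "audio_detected": True, "window_changed": False}
if suspicion_score(**frame) >= 0.6:
    print("Time frame flagged for proctor review")
```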

Atoum et al. (2017) developed a system that can detect a wide variety of cheating behaviors during an online exam using a webcam, a wearcam, and a microphone. The wearcam makes it possible to monitor what the student observes, helping to detect a prohibited phone or text in the testing room as well as another form of cheating, reading from books, notes, and similar materials. Furthermore, the system can estimate the test-taker’s head gaze by combining information from the webcam and the wearcam. Yet another form of cheating is getting verbal assistance from a person in the same room or remotely via a phone call; the system detects this using the microphone and speech detection. Considering all these aspects, the proposed multimedia system can perform automatic online exam proctoring.

Saba et al. ( 2021 ), developed an automatic exam activity recognition system, which monitors the body movements of the students through surveillance cameras and classifies activities into six categories using a deep learning approach. The action categories are normal performing, looking back, watching towards the front, passing gestures to other fellows, watching towards left or right, and other suspicious actions. Movement recognition based on video images is highly dependent on the quality of images. Therefore, Fan et al. ( 2016 ), employed a Microsoft Kinect device to capture the examinee’s gesture. The duration and frequency of the detected action events are then used to distinguish the misbehavior from the normal behavior.

The system presented in (Mengash, 2019 ) includes a thermal detector attached with a surveillance camera and an eye movement tracker. When examinees intend to cheat, their body will emit a specific range of heat, and the emitted heat will trigger the camera to focus and detect the candidate’s face. Then the eye tracker detects eye movements, and the system detects the cheating intentions of the test-taker. There are other biometric-based methods for cheating detection. For example, keystroke and linguistic dynamics can detect stress, which indicates suspicious behavior (Korman, 2010 ).

Diedenhofen and Musch (2017) developed a JavaScript application called PageFocus, which can be added to the test page and run in the background. Whenever the examinee switches to a page other than the test page, a defocusing event is registered; the script captures when and how frequently defocusing and refocusing events occur on the test page. Another method is to permit students to access only a small set of whitelisted sites. If the examinee tries to open a site that is not allowed (one on the blacklist), the instructor is informed through an Android application or over the Internet (Kasliwal, 2015).

Tiong and Lee ( 2021 ), proposed an e-cheating intelligent agent composed of two modules, namely the internet protocol (IP) detector and the behavior detector. The first module could monitor the examinees’ IP addresses and enable the system to alert if a student changes their device or location. The second module detects abnormal behavior based on the speed of answering questions. Another method for cheating detection is comparing the IP addresses of the examinees to check whether two participants are in the same place (Bawarith, 2017 ).

Cheating detection after the exam

Even though various methods are employed to prevent students from cheating, some will still cheat during the examination. Consequently, a number of techniques have been proposed to detect cheating students after the exam, improving the reliability of online assessments. In the following, we discuss different methods of cheating detection after the exam.

Video monitoring

The University of Amsterdam has developed a system that records the student’s video screen and the environment during the exam. Later a human proctor views the recording and flags and reports any suspicious behavior (Norris, 2019 ). Proctoring software proposed in (Alessio et al., 2017 ), records everything students do during the examination. After the exam, the recordings can be reviewed by the professor, teaching assistants, or employees of the proctoring vendor to identify cheating behaviors.

Human proctoring is a tedious and time-consuming process. To reduce the time and cost of proctoring, an automatic system can be employed to detect and flag suspicious events using machine learning methods. In this regard, Cote et al. ( 2016 ) proposed a system for the automatic creation of video summaries of online exams. The proposed method employs head pose estimations to model a normal and abnormal examinee’s behavior. Afterward, a video summary is created from sequences of detected abnormal behavior. The video summaries can assist remote proctors in detecting cheating after the exam.

Jalali and Noorbehbahani ( 2017 ), implemented an automatic method for cheating detection using a webcam. During the exam, images are recorded every 30 seconds by a webcam for each candidate. After the exam, the recorded images are compared with reference images of that student. If the difference exceeds a threshold, the image will be labeled as a cheating state.
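The threshold comparison described above can be sketched as a mean-absolute-difference test between a captured frame and a stored reference image; the threshold value and the toy images below are illustrative assumptions.

```python
# Sketch of threshold-based image comparison: mean absolute pixel difference
# between a captured frame and a reference image of the same student.
import numpy as np

def differs_from_reference(frame, reference, threshold=40.0):
    """Label a frame as a possible cheating state if it deviates too much."""
    diff = np.abs(frame.astype(np.float32) - reference.astype(np.float32))
    return float(diff.mean()) > threshold

# Toy example with random 8-bit grayscale images instead of real webcam frames.
rng = np.random.default_rng(0)
reference = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
frame = rng.integers(0, 256, size=(120, 160), dtype=np.uint8)
print(differs_from_reference(frame, reference))
```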

Li et al. ( 2015 ), proposed a Massive Open Online Proctoring framework that consists of three components. First, the Automatic Cheating Detector (ACD) module uses webcam video to monitor students, and automatically flag suspected cheating behavior. Then, ambiguous cases are sent to the Peer Cheating Detector (PCD) module, which asks students to review videos of their peers. Finally, the list of suspicious cheating behaviors is forwarded to the Final Review Committee (FRC) to make the final decision.

Other methods

There are various ways of cheating, and therefore, different methods are used to detect cheating after the exam. For example, one of the cheating behaviors is to collude and work on tests together. However, most learning management systems allow the instructor to view IP addresses. Therefore, if different students submit their assessments by the same IP address in a short time frame, it could be detected and considered as a sign of collusion (Moten et al., 2013 ).
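A minimal sketch of this shared-IP check over submission logs follows; the field names, example data, and the 15-minute window are assumptions for illustration.

```python
# Flag submissions that share an IP address within a short time window.
from collections import defaultdict
from datetime import datetime, timedelta

submissions = [  # (student, ip, submission time) -- illustrative data
    ("alice", "10.0.0.5", datetime(2021, 5, 1, 10, 2)),
    ("bob",   "10.0.0.5", datetime(2021, 5, 1, 10, 9)),
    ("carol", "10.0.0.7", datetime(2021, 5, 1, 11, 30)),
]

WINDOW = timedelta(minutes=15)
by_ip = defaultdict(list)
for student, ip, when in submissions:
    by_ip[ip].append((when, student))

for ip, entries in by_ip.items():
    entries.sort()
    for (t1, s1), (t2, s2) in zip(entries, entries[1:]):
        if t2 - t1 <= WINDOW:
            print(f"Possible collusion: {s1} and {s2} submitted from {ip}")
```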

In addition, statistical methods can be used to analyze student responses to assessments and detect common errors and the similarities of answers (Korman, 2010 ). Mott ( 2010 ) stated that the distribution of identical incorrect responses between examinee pairs is a Polya distribution. The degree of cheating for each examination will follow the skewness or third central moment of the distribution.
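As a worked illustration of that statistical idea (without fitting a Polya distribution), the sketch below counts identical incorrect answers for each examinee pair and computes the third central moment of the resulting distribution; all answer data are fabricated.

```python
# Count identical incorrect responses per examinee pair and compute the
# third central moment of that distribution. All data are illustrative.
from itertools import combinations
import statistics

key     = {"q1": "a", "q2": "c", "q3": "b", "q4": "d"}
answers = {
    "s1": {"q1": "b", "q2": "c", "q3": "a", "q4": "d"},
    "s2": {"q1": "b", "q2": "c", "q3": "a", "q4": "c"},
    "s3": {"q1": "a", "q2": "d", "q3": "b", "q4": "d"},
}

def identical_wrong(a, b):
    return sum(1 for q, correct in key.items()
               if a[q] == b[q] and a[q] != correct)

counts = [identical_wrong(answers[x], answers[y])
          for x, y in combinations(answers, 2)]
mean = statistics.mean(counts)
third_moment = sum((c - mean) ** 3 for c in counts) / len(counts)
print(counts, third_moment)
```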

Predictive analytics systems implicitly collect data while the students interact with the virtual learning environment. The collected data, which include student’s location, access patterns, learning progress, device characteristics, and performance, is used to predict trends and patterns of student behavior. Consequently, any unusual pattern may indicate suspicious behavior (Norris, 2019 ). Answering an examination takes a reasonable amount of time. Therefore, another indicator of dishonest behavior is an extremely short interval between the access time and the completion of the assessments, which can be detected by log time analysis (Moten et al., 2013 ).
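Flagging implausibly fast completions from access and submission timestamps can be sketched in a few lines; the 10-minute minimum and the sample log entries are illustrative assumptions.

```python
# Log-time analysis: flag assessments completed implausibly fast.
from datetime import datetime, timedelta

MIN_DURATION = timedelta(minutes=10)

attempts = [  # (student, access time, completion time) -- illustrative data
    ("dave", datetime(2021, 5, 1, 9, 0), datetime(2021, 5, 1, 9, 4)),
    ("erin", datetime(2021, 5, 1, 9, 0), datetime(2021, 5, 1, 9, 48)),
]

for student, opened, completed in attempts:
    if completed - opened < MIN_DURATION:
        print(f"{student}: completed in {completed - opened}; review for dishonesty")
```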

In (Bawarith et al., 2017 ), an E-exam management system is proposed that classifies participants as cheating or non-cheating based on two parameters, namely the total time and the number of times the examinee is out of the screen. The focus of the test-taker is recorded using an eye tracker during the exam.

Kasliwal (2015) designed an online examination tool that captures network traffic during the exam using a Kismet server. The captured packets can then be analyzed to determine the frequency of URLs accessed by students. If one of the URLs is accessed much more frequently, or only very rarely, it can be considered suspicious.
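A simplified sketch of such URL-frequency analysis over captured traffic follows; the request counts and the frequency rule are illustrative assumptions, not the cited implementation.

```python
# Flag URLs accessed far more often than typical, or only once, in exam traffic.
from collections import Counter

counts = Counter({  # URL -> number of requests (illustrative data)
    "lms.example.edu/exam": 40,
    "solutions-site.example.com": 180,
    "search.example.com": 1,
})

mean = sum(counts.values()) / len(counts)
for url, n in counts.items():
    if n > 2 * mean or n == 1:
        print(f"Suspicious access frequency for {url}: {n} requests")
```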

To detect plagiarism in papers or essay-type questions, platforms such as DupliChecker.com or Turnitin.com can be used. These websites compute a similarity index and show all potential plagiarism. Based on the similarity index, the instructor decides on further action (Moten et al., 2013).
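A toy similarity index can be computed with the Python standard library alone, as sketched below; commercial services use far more sophisticated matching, so this is only a conceptual illustration.

```python
# Toy similarity index between two submissions using difflib.
from difflib import SequenceMatcher

submission_a = "Online proctoring is essential to promote academic integrity."
submission_b = "Online proctoring is essential for promoting academic integrity."

similarity = SequenceMatcher(None, submission_a, submission_b).ratio()
print(f"Similarity index: {similarity:.0%}")  # instructor decides on follow-up
```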

A weakness of similarity detection software is that it only computes the resemblance of a submitted assessment to other works; it cannot detect an original text written by someone else for the student in question. Stylometry addresses this issue by checking the consistency of the submitted content with other texts written by the same student. If the style of a text does not match the student's previous work, it may indicate that someone else wrote it (Chirumamilla & Sindre, 2019). Opgen-Rhein et al. (2018) presented an application that employs machine learning to learn students' programming styles, based on the assumption that each student's programming style is unique, so the model can be used to verify the author of assignments.
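
Neither of the cited works is reproduced here; as a minimal sketch of the stylometric idea, one could compare a few crude style features of a new submission against the mean features of a student's earlier texts. The feature set, distance measure, and threshold below are illustrative, and the sketch assumes non-empty texts.

```python
import numpy as np

def style_features(text):
    """Crude stylometric features: average word length, average sentence
    length (in words), and type-token ratio."""
    words = text.split()
    sentences = [s for s in text.replace("!", ".").replace("?", ".").split(".") if s.strip()]
    return np.array([
        float(np.mean([len(w) for w in words])),
        len(words) / max(len(sentences), 1),
        len({w.lower() for w in words}) / len(words),
    ])

def matches_history(new_text, previous_texts, max_distance=1.5):
    """Compare a submission's style vector with the mean vector of the
    student's previous work; a large distance suggests a different author."""
    history = np.array([style_features(t) for t in previous_texts])
    distance = float(np.linalg.norm(style_features(new_text) - history.mean(axis=0)))
    return distance <= max_distance, distance
```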

Another detection approach is a cheating trap: websites created so that students find them when searching for answers. The solutions on these trap websites are incorrect, so dishonest students can be identified (Korman, 2010). However, this method conflicts with professional ethics.

In addition, the teacher can periodically search the internet by hand for web pages that provide solutions matching the exam questions. This approach can be used to build a pool of potential solutions from the internet for plagiarism detection after the exam (Norris, 2019).

Cheating prevention

Having discussed and analyzed examinees' motivations for cheating and the factors that directly or indirectly drive them to commit unethical actions during online examinations, we now turn to how to decrease cheating in online exams and lower the probability of such actions taking place.

We categorized cheating prevention into two major types, namely before-exam prevention and during-exam prevention. Figure 12 displays the classification of the cheating prevention methods.


Before-exam prevention

Several methods can be implemented before the exam is held to prevent examinees from cheating. Each is discussed in detail below.

Exam design

Wherever prevention is the concern, a proven and low-cost approach is a “cheat-resistant” design: a design that inherently prevents specific types of cheating from happening. This is why exam design is so critical; a cheat-resistant exam design, by its nature, stops a range of possible forms of cheating from occurring.

One way of achieving a good design is to develop a personalized exam for each candidate. There are several ways to do so: parameterization, i.e., a set of fixed questions with variable assumption values (Manoharan, 2019); selecting questions randomly from a large question bank (Manoharan, 2019; Norris, 2019); or implementing an AI-based method to produce unique exams (Chua & Lumapas, 2019).
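
The following is only a minimal sketch of personalization by random selection and parameterization, not the macro system described by Manoharan (2019); the question bank, parameter ranges, and seeding scheme are illustrative assumptions.

```python
import random

QUESTION_BANK = [
    # Parameterized items: one fixed stem with per-student assumption values.
    {"stem": "A train travels {d} km in {t} hours. What is its average speed?",
     "params": {"d": range(60, 241, 20), "t": range(2, 6)}},
    {"stem": "Compute {a} * {b} mod 7.",
     "params": {"a": range(3, 20), "b": range(3, 20)}},
    {"stem": "Explain the difference between formative and summative assessment.",
     "params": {}},
]

def personalized_exam(student_id, n_questions=2):
    """Draw a per-student random subset of the bank, fill in parameter values,
    and shuffle the order. Seeding with the student id makes each candidate's
    paper reproducible but different from the others."""
    rng = random.Random(student_id)
    chosen = rng.sample(QUESTION_BANK, n_questions)
    exam = []
    for question in chosen:
        values = {name: rng.choice(list(options))
                  for name, options in question["params"].items()}
        exam.append(question["stem"].format(**values))
    rng.shuffle(exam)
    return exam

print(personalized_exam("student-42"))
```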

Li et al. (2020) designed a method for randomizing the question order for each candidate. The general idea is to show the questions one by one, with each student receiving a different question at any given time; the authors prove mathematically that examinees cannot obtain much cheating gain under this scheme.

In Manoharan (2019), the author investigates an approach to personalizing multiple-choice examinations using macros. A macro is a small program fragment that stores data and, given a particular set of inputs, generates random exams from a question bank. This method brings freedom and flexibility to exam design, but it requires basic programming skills.

Another aspect of exam design concentrates specifically on question design. Some of the most valuable methods are listed below.

  • Using novel questions: these questions are so unique in design and phrasing that they are very difficult to plagiarize, even by searching the web (Nguyen et al., 2020).
  • Using knowledge-based rather than information-based questions: these questions test understanding; their answers are not found on the web or in reference books and require critical thinking and reasoning (Nguyen et al., 2020).
  • Using essay questions rather than multiple-choice questions: during an online exam, multiple-choice questions are highly susceptible to cheating, so long essay questions are preferred (Varble, 2014).
  • Using questions with specific assumptions and facts: although extra, unhelpful facts may mislead any candidate, including those taking the exam honestly, they considerably reduce the possibility of web-based plagiarism by making the question harder to search for online (Nguyen et al., 2020).
  • Having an open-book exam: open-book questions should test students’ understanding, critical reasoning, and analytical skills. Since the answers are not found directly in any source, open-book exams may reduce the opportunity to cheat (Varble, 2014; Backman, 2019).

Finally, other methods not placed into the above categories are mentioned below.

Showing questions one by one, without the option of going back, is effective in cheating prevention. If it is employed together with strict time limits and randomized question order, collaborative cheating becomes quite challenging (Chirumamilla & Sindre, 2019; Backman, 2019). With strict time limits, students do not have enough time to coordinate cheating, so cheating attempts are reduced (Backman, 2019).

Cluskey et al. (2011) emphasize low-cost approaches to addressing online exam cheating and introduce online exam control procedures (OECP) to achieve this. Examples of these procedures include taking the exam only at a defined time, without postponement for any reason, and changing at least one-third of the questions in the next exam.

Authentication

Authentication mainly serves to prevent impersonation before examinations. It can be done classically, by checking school ID badges or government-issued ID over the webcam (Moten et al., 2013), or with more modern biometric approaches such as fingerprint or palm vein scans (Korman, 2010), eye vein scans (Kigwana & Venter, 2016), and voice or keystroke biometrics (Norris, 2019).

An interesting prevention method is presented in Moten et al. (2013): students call the instructor at a predetermined time to get the password. Once the instructor recognizes a student’s voice, the student is authenticated and receives a random password for exam entry. The password is valid only until the end of the exam time limit, which makes cheating more difficult.

The last authentication method, discussed in Norris (2019), uses challenge questions, i.e., questions only the student can answer, such as their student ID or other personal information. In Ullah (2016), an approach is proposed that creates and consolidates a student’s profile during the learning process. The information is collected in the form of questions and answers, which are either pre-defined or extracted from the student’s learning activities. A subset of these questions is used for authentication, and the student must answer them correctly to gain access to the online examination. This ensures that the person taking the exam is the same one who completed the course.
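
Ullah’s profile-building and question-selection details are not reproduced here; a minimal sketch of the challenge-question gate could look like the following, where the profile data structure, the matching rule, and the thresholds are illustrative assumptions.

```python
import random

def challenge_authentication(profile, get_answer, n_questions=3, required=3):
    """profile: dict mapping a challenge question to its expected answer,
    consolidated during the course. A random subset is asked before the exam;
    access is granted only if enough answers match (case-insensitive here)."""
    asked = random.sample(list(profile.items()), n_questions)
    correct = sum(
        1 for question, expected in asked
        if get_answer(question).strip().lower() == expected.strip().lower()
    )
    return correct >= required

# Usage sketch: pass input() (or a web-form handler) as get_answer.
# allowed = challenge_authentication(student_profile, input)
```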

Clustering

Clustering means partitioning students into several groups based on a predefined similarity measure. In Topîrceanu (2017), random and strategic clustering methods are proposed as cheating prevention techniques that break friendships during the exam. The advantages of random clustering are its time and cost efficiency; however, it is imprecise, and some clusters may still contain unbroken friendships.

Breaking friendships through clustering relies on two hypotheses (Topîrceanu, 2017 ):

  • Students tend to communicate and cheat with the people they know and feel close to.
  • An individual’s relationships with others on social networks are closely related to their real-life relationships with people.

Regarding the second hypothesis, social network analysis can identify students’ close friends and acquaintances. After clustering the students, a unique set of exam questions is prepared for each cluster, which makes collaboration between friends during the online exam challenging.
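
Topîrceanu’s strategic clustering algorithm is not reproduced here; the sketch below is only an illustrative greedy heuristic that places each student in the cluster currently containing the fewest of their friends, after which each cluster would receive its own question set. The names and data are made up.

```python
def break_friendships(students, friendships, n_clusters):
    """Greedily assign each student to the cluster that currently contains
    the fewest of their friends. friendships: a set of frozenset pairs."""
    clusters = [set() for _ in range(n_clusters)]
    for student in students:
        def friends_in(cluster):
            return sum(1 for other in cluster
                       if frozenset((student, other)) in friendships)
        best = min(range(n_clusters), key=lambda i: friends_in(clusters[i]))
        clusters[best].add(student)
    return clusters

students = ["ana", "ben", "cem", "dora", "eli", "fay"]
friendships = {frozenset(p) for p in [("ana", "ben"), ("ben", "cem"), ("dora", "eli")]}
print(break_friendships(students, friendships, n_clusters=3))
# Each resulting cluster then receives its own unique set of exam questions.
```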

Lowering cheating motivation

The approaches in this section address the mental and psychological factors that drive students toward academic misbehavior, and the work being done to reduce such behavior by acting on those drivers.

There are several tactics for developing students’ moral beliefs and encouraging them to avoid unethical behavior. For instance, implementing honor systems helps build a healthy and ethical environment (Korman, 2010). Another tactic is clarifying academic integrity and morality ideals through educational integrity programs (Korman, 2010).

As Korman (2010) further investigated, changing students’ perception of the goal of studying can decrease cheating. This can be done by reminding them why learning matters and how it affects their future success. In Varble (2014), it is stated that emphasizing the actual value of education leads to the same result.

Varble (2014) indicates that improving students’ skills, such as time management, strongly enhances their academic performance and, accordingly, reduces academic misbehavior. The risk of being caught and the severity of punishment are inversely related to students’ motivation to cheat.

Varble (2014) also mentions that applying formative rather than summative assessment effectively reduces examinees’ desire to cheat by improving their learning outcomes. Formative assessments aim to enhance candidates’ learning rather than to test them, whereas summative assessments mainly measure candidates’ knowledge and are used to check whether they are eligible to pass the course.

Expanding on how to make formative assessment work, Nguyen et al. (2020) mention that increasing exam frequency forces students to study the course materials repeatedly, resulting in longer retention of information and knowledge, which in turn reduces candidates’ motivation to cheat. Varble (2014) also suggests that reducing the value of each test lowers the reward cheaters gain from any single test, so the motivation to cheat declines.

A cost-efficient and effective method of lowering cheating motivation is to declare the cheating policy to examinees before the exam starts (Moten et al., 2013). Warning students of the consequences of being caught makes them nervous and can significantly decrease cheating. A confirmation button is necessary so that cheaters cannot make excuses after the exam. This measure is so effective that in two experiments it reduced the number of cheating incidents by 50% (Corrigan-Gibbs et al., 2015). It is worth mentioning that in the online environment, an honor system is much less effective than warning about the consequences of being caught cheating (Fontaine et al., 2020).

During-exam prevention

Most cheating prevention methods were discussed in the before-exam section; still, there exist some during-exam prevention tactics, which are presented in this sub-section.

Think-aloud request

A rarely mentioned method, the think-aloud request, is discussed in Chirumamilla and Sindre (2019). At random times during the exam, a request is sent to the student to think aloud about a specific subject (or the current question). The student must respond orally, and the voice is recorded for further investigation and cheating detection (e.g., slow responses or voice impersonation). This mechanism forces students to be continuously ready to respond, which reduces the chance of cheating. The authors also mention that the system and its questions could be implemented by an AI agent.

Cheat-resistant systems

Cheat-resistant systems inherently prevent some kinds of cheating, although they are costly to implement (Korman, 2010). One example is a browser tab locker (Chua & Lumapas, 2019), which prevents unauthorized navigation and detects it by sniffing network packets. Another is the use of wireless jammers (Chirumamilla & Sindre, 2019) to disrupt radio signals (and thus Internet access) in an area, usually the examination hall, during semi-online exams.

In Chirumamilla and Sindre (2019), some valuable suggestions are given for oral exams. One is conducting the oral exam as a flow of short questions and answers instead of one long initial question followed by an extended answer, because a flowing dialogue significantly reduces the chance of the examinee following someone else’s cues for the solution. They also suggest asking the examinee to respond quickly; if candidates delay, they may be considered suspicious. If the instructor finds a candidate suspicious, it is advisable to interrupt the current question with a new one, which neutralizes any effort by a third party to help the candidate answer.

Another suggestion presented in Chirumamilla and Sindre (2019) is to prepare a large pool of questions for oral exams to prevent question repetition. As a result, candidates cannot prepare based on the questions asked of previous candidates.

Bribery is a kind of organizational cheating. Kigwana and Venter (2016) indicate that assigning a random human proctor to the exam right before it starts makes bribery and prior arrangements between examinee and proctor practically impossible.

There is no doubt that online education has changed significantly in recent years. One of the main challenges in online education is the validity of assessment; during the COVID-19 pandemic in particular, the integrity of online examinations became a significant concern. Cheating detection and prevention are hot topics in online assessment, and more research is needed on cheating motivation and cheating types. In this research, we review and classify online exam cheating comprehensively.

In this review, only publications written in English were investigated. This could introduce review bias; however, reviewing studies in all languages is infeasible, and many systematic mapping studies likewise consider only publications in English (Nikou & Economides, 2018; Martin et al., 2020; Noorbehbahani et al., 2019; Wei et al., 2021).

Figure 3 indicates that the publication trend is decreasing, contrary to the expectation that online learning research is rising, especially with the emergence of COVID-19. Notably, this study reviews online cheating research, so Fig. 3 specifically reflects online cheating publications, not online learning studies in general. However, more investigation of online cheating studies from February 2021 onwards is required to further analyze the trend.

Several reviewed studies make no distinction between cheating detection and prevention (Bawarith, 2017; Bawarith et al., 2017; Korman, 2010; Tiong & Lee, 2021). They employ detection methods to identify dishonest behavior and then perform preventive actions, such as alerting the student or closing the browser tab, to deter further cheating. Under that definition of prevention, several studies use the two terms interchangeably, which confuses the reader. In this study, we define cheating prevention as strategies and methods that try to prevent cheating from occurring in online exams in the first place; with this definition, we have attempted to provide a better review and a clearer classification for the reader.

One limitation in this domain is the lack of statistics on the prevalence of the different cheating types, methods, and tools. In Sabbah (2017), the most common cheating behaviors and their average risks are discussed; however, the results are limited to 10 cheating types. Hence, more investigation is required to determine the prevalence of each cheating type and motivation.

An important reason for cheating that researchers have overlooked is learning styles. Students and educators have different preferred learning styles (auditory, visual, kinesthetic, and read/write). If teachers and educational institutions do not consider this, the course will not be comprehensible for some students, who will consequently be motivated to cheat.

Another issue that should be addressed is the feasibility of cheating detection and prevention methods. If the equipment required to secure online exams is expensive, students cannot afford it, so this factor should be considered when developing such methods. Cluskey et al. (2011) argue that some solutions for detecting cheating during online exams (e.g., proctors) are too costly, and that their costs outweigh their benefits in some cases. Therefore, cost-effective systems and methods should be implemented.

Privacy and convenience are also vital for examinees. If the security mechanisms employed in online exams violate privacy or disturb students’ convenience, the evaluation will not be practical because of the stress they induce. Accordingly, these aspects should be considered in cheating detection and prevention systems.

In this study, cheating in online exams is reviewed and classified comprehensively, providing the reader with valuable and practical insights for addressing it. To mitigate student cheating, it is first necessary to understand cheating motivations, types, and technologies. Furthermore, detection and prevention methods are needed to combat dishonest actions; detection without prevention cannot be effective. As detection and prevention methods evolve, new cheating types and technologies emerge as well, so no system can mitigate all kinds of cheating in online exams, and ever more advanced methods must be employed. The most efficient overall strategy appears to be lowering the motivation to cheat.

It should be mentioned that we have not covered studies related to technical attacks on and intrusions into online exam systems and teacher devices. This topic could be the subject of another review study.

The impact of COVID-19 on online learning and cheating in online exams could be analyzed in future work.

Another future work is to explore how ignoring students’ learning styles in teaching and assessment could affect cheating motivation.

Privacy issues, user convenience, and enforced costs of cheating detection and prevention technologies need to be examined in other studies.

In this study, publications from 2010 to 2021 have been reviewed. More investigations are required to review accepted but unpublished studies and publications in 2022.


Declarations

The authors declare that they have no competing interests.

1 http://www.duplichecker.com

2 http://www.turnitin.com

Publisher's note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Contributor Information

Fakhroddin Noorbehbahani, Email: ri.ca.iu.gne@inahabhebroon .

Azadeh Mohammadi, Email: [email protected] .

Mohammad Aminazadeh, Email: [email protected] , Email: [email protected] .

  • Aisyah, S., Bandung, Y., & Subekti, L. B. (2018). Development of Continuous Authentication System on Android-Based Online Exam Application. In 2018 International Conference on Information Technology Systems and Innovation, ICITSI 2018 (pp. 171–176). Padang, Indonesia: IEEE. 10.1109/ICITSI.2018.8695954
  • Alessio H. The Impact of Video Proctoring in Online Courses. Journal on Excellence in College Teaching. 2018; 29 (3):1–10. [ Google Scholar ]
  • Alessio HM, Malay N, Maurer K, Bailer AJ, Rubin B. Examining the Effect of Proctoring on Online Test Scores. Online Learning. 2017; 2013 (1):1–16. [ Google Scholar ]
  • Amigud A, Lancaster T. 246 reasons to cheat: An analysis of students’ reasons for seeking to outsource academic work. Computers and Education. 2019; 134 :98–107. doi: 10.1016/j.compedu.2019.01.017. [ CrossRef ] [ Google Scholar ]
  • Arnautovski, L. (2019). Face recognition technology in the exam identity authentication system - implementation concept. In 2nd International Scientific Conference MILCON’19 (pp. 51–56). Olsztyn, Poland.
  • Atoum Y, Chen L, Liu AX, Hsu SDH, Liu X. Automated Online Exam Proctoring. IEEE Transactions on Multimedia. 2017; 19 (7):1609–1624. doi: 10.1109/TMM.2017.2656064. [ CrossRef ] [ Google Scholar ]
  • Backman, J. (2019). Students’ Experiences of Cheating in the Online Exam Environment.
  • Bawarith, H. R. (2017). Student Cheating Detection System in E-exams . KING ABDULAZIZ UNIVERSITY.
  • Bawarith R, Basuhail A, Fattouh A, Gamalel-din PS. E-exam Cheating Detection System. International Journal of Advanced Computer Science and Applications. 2017; 8 (4):176–181. doi: 10.14569/IJACSA.2017.080425. [ CrossRef ] [ Google Scholar ]
  • Bilen E, Matros A. Online Cheating Amid COVID-19. Journal of Economic Behavior & Organization. 2021; 182 :196–211. doi: 10.1016/j.jebo.2020.12.004. [ CrossRef ] [ Google Scholar ]
  • Chirumamilla, A., & Sindre, G. (2019). Mitigation of Cheating in Online Exams: Strengths and Limitations of. In Biometric Authentication in Online Learning Environments (pp. 47–68). IGI Global. 10.4018/978-1-5225-7724-9.ch003
  • Chua, S. S., & Lumapas, Z. R. (2019). Online Examination System with Cheating Prevention Using Question Bank Randomization and Tab Locking. 2019 4th International Conference on Information Technology (InCIT) , 126–131.
  • Chuang CY, Craig SD, Femiani J. Detecting probable cheating during online assessments based on time delay and head pose. Higher Education Research and Development. 2017; 36 (6):1123–1137. doi: 10.1080/07294360.2017.1303456. [ CrossRef ] [ Google Scholar ]
  • Cluskey GR, Jr, Ehlen CR, Raiborn MH. Thwarting Online Exam Cheating without Proctor Supervision. 2011; 4 :1–7. [ Google Scholar ]
  • Corrigan-Gibbs, H., Gupta, N., Northcutt, C., Cutrell, E., & Thies, W. (2015). Deterring cheating in online environments. ACM Transactions on Computer-Human Interaction , 22 (6). 10.1145/2810239
  • Cote, M., Jean, F., Albu, A. B., & Capson, D. (2016). Video Summarization for Remote Invigilation of Online Exams. In 2016 IEEE Winter Conference on Applications of Computer Vision (pp. 1–9). NY, USA.
  • Curran K, Middleton G, Doherty C. Cheating in Exams with Technology. International Journal of Cyber Ethics in Education. 2011; 1 (2):54–62. doi: 10.4018/ijcee.2011040105. [ CrossRef ] [ Google Scholar ]
  • Dendir S, Maxwell RS. Cheating in online courses: Evidence from online proctoring. Computers in Human Behavior Reports. 2020; 2 :100033. doi: 10.1016/j.chbr.2020.100033. [ CrossRef ] [ Google Scholar ]
  • Diedenhofen B, Musch J. PageFocus: Using paradata to detect and prevent cheating on online achievement tests. Behavior Research Methods. 2017; 49 (4):1444–1459. doi: 10.3758/s13428-016-0800-7. [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Dobrovska D. Technical Student Electronic Cheating on Examination. In: Auer ME, Guralnick D, Uhomoibhi J, editors. Interactive Collaborative Learning. Springer International Publishing; 2017. pp. 525–531. [ Google Scholar ]
  • Fan, Z., Xu, J., Liu, W., & Cheng, W. (2016). Gesture based Misbehavior Detection in Online Examination. In The 11th International Conference on Computer Science & Education (pp. 234–238). Nagoya, Japan.
  • Fayyoumi A, Zarrad A. Novel Solution Based on Face Recognition to Address Identity Theft and Cheating in Online Examination Systems. Advances in Internet of Things. 2014; 4 (April):5–12. doi: 10.4236/ait.2014.42002. [ CrossRef ] [ Google Scholar ]
  • Fluck AE. An international review of eExam technologies and impact. Computers & Education. 2019; 132 :1–15. doi: 10.1016/j.compedu.2018.12.008. [ CrossRef ] [ Google Scholar ]
  • Fontaine S, Frenette E, Hébert M. Exam cheating among Quebec’s preservice teachers : the influencing factors. International Journal for Educational Integrity. 2020; 16 (14):1–18. [ Google Scholar ]
  • Garg, K., Verma, K., Patidar, K., Tejra, N., & Petidar, K. (2020). Convolutional Neural Network based Virtual Exam Controller. In Proceedings of the International Conference on Intelligent Computing and Control Systems, ICICCS 2020 (pp. 895–899). Secunderabad, India. 10.1109/ICICCS48265.2020.9120966
  • Gruenigen, D. Von, de Azevedo e Souza, F. B., Pradarelli, B., Magid, A., & Cieliebak, M. (2018). Best practices in e-assessments with a special focus on cheating prevention. In 2018 {IEEE} Global Engineering Education Conference, {EDUCON} 2018, Santa Cruz de Tenerife, Tenerife, Islas Canarias, Spain, April 17-20, 2018 (pp. 893–899). IEEE. 10.1109/EDUCON.2018.8363325
  • He, H., Zheng, Q., Li, R., & Dong, B. (2018). Using Face Recognition to Detect “ Ghost Writer ” Cheating in Examination. In Edutainment, Lecture Notes in Computer Science (Vol. 11462, pp. 389–397). Springer International Publishing. 10.1007/978-3-030-23712-7
  • Holden, O., Kuhlmeier, V., & Norris, M. (2020). Academic Integrity in Online Testing: A Research Review. 10.31234/osf.io/rjk7g
  • Hu, S., Jia, X., & Fu, Y. (2018). Research on Abnormal Behavior Detection of Online Examination Based on Image Information. In 10th International Conference on Intelligent Human-Machine Systems and Cybernetics (IHMSC) (Vol. 02, pp. 88–91). Hangzhou, China: IEEE. 10.1109/IHMSC.2018.10127
  • Hylton K, Levy Y, Dringus LP. Computers & Education Utilizing webcam-based proctoring to deter misconduct in online exams. Computers & Education. 2016; 92–93 :53–63. doi: 10.1016/j.compedu.2015.10.002. [ CrossRef ] [ Google Scholar ]
  • Idemudia, S., Rohani, M. F., Siraj, M., & Othman, S. H. (2016). A Smart Approach of E-Exam Assessment Method Using Face Recognition to Address Identity Theft and Cheating. International Journal of Computer Science and Information Security , 14 (10), 515–522. Retrieved from https://sites.google.com/site/ijcsis/
  • Jalali, K., & Noorbehbahani, F. (2017). An Automatic Method for Cheating Detection in Online Exams by Processing the Students Webcam Images. In 3rd Conference on Electrical and Computer Engineering Technology (E-Tech 2017), Tehran, Iran (pp. 1–6). Tehran, Iran.
  • Kasliwal, G. (2015). Cheating Detection in Online Examinations.
  • Kigwana, I., & Venter, H. (2016). Proposed high-level solutions to counter online examination fraud using digital forensic readiness techniques. Proceedings of the 11th International Conference on Cyber Warfare and Security, ICCWS 2016 , 407–414.
  • Korman, M. (2010). Behavioral detection of cheating in online examination. Retrieved from https://pure.ltu.se/ws/files/31188849/LTU-DUPP-10112-SE.pdf
  • Lancaster, T., & Clarke, R. (2017). Rethinking Assessment By Examination in the Age of Contract Cheating. Plagiarism Across Europe and Beyond 2017 .
  • Li, M., Sikdar, S., Xia, L., & Wang, G. (2020). Anti-cheating Online Exams by Minimizing the Cheating Gain, (May). 10.20944/preprints202005.0502.v1
  • Li, X., Yueran, K. C., & Alexander, Y. (2015). Massive Open Online Proctor: Protecting the Credibility of MOOCs Certificates, 1129–1137.
  • Maeda, M. (2019). Exam cheating among Cambodian students: when, how, and why it happens. Compare: A Journal of Comparative and International Education, 1–19. 10.1080/03057925.2019.1613344
  • Manoharan S. Cheat-resistant multiple-choice examinations using personalization. Computers and Education. 2019; 130 :139–151. doi: 10.1016/j.compedu.2018.11.007. [ CrossRef ] [ Google Scholar ]
  • Martin F, Sun T, Westine CD. A systematic review of research on online teaching and learning from 2009 to 2018. Computers & Education. 2020; 159 :104009. doi: 10.1016/j.compedu.2020.104009. [ PMC free article ] [ PubMed ] [ CrossRef ] [ Google Scholar ]
  • Mengash, H. (2019). Automated Detection for Student Cheating During Written Exams: An Updated Algorithm Supported by Biometric of Intent. In First International Conference on Computing (pp. 303–3111). Riyadh, Saudi Arabia. 10.1007/978-3-030-36368-0
  • Migut, G., Koelma, D., Snoek, C. G., & Brouwer, N. (2018). Cheat Me Not: Automated Proctoring Of Digital Exams On Bring-Your-Own-Device. In The 23rd Annual ACM Conference On Innovation And Technology In Computer Science Education (p. 388). New York, NY, USA.
  • Moten JM, Jr, Fitterer A, Brazier E, Leonard J, Brown A, Texas A. Examining Online College Cyber Cheating Methods and Prevention Measures. Electronic Journal of E-Learning. 2013; 11 (2):139–146. [ Google Scholar ]
  • Mott JH. The Detection and Minimization of Cheating During Concurrent Online Assessments Using Statistical Methods. Collegiate Aviation Review. 2010; 28 (2):32–46. [ Google Scholar ]
  • Nguyen, J. G., Keuseman, K. J., & Humston, J. J. (2020). Minimize Online Cheating for Online Assessments During COVID-19 Pandemic. 10.1021/acs.jchemed.0c00790
  • Nikou SA, Economides AA. Mobile-based assessment: A literature review of publications in major referred journals from 2009 to 2018. Computers & Education. 2018; 125 :101–119. doi: 10.1016/j.compedu.2018.06.006. [ CrossRef ] [ Google Scholar ]
  • Noorbehbahani, F., Salehi, F., & Jafar Zadeh, R. (2019). A systematic mapping study on gamification applied to e-marketing. Journal of Research in Interactive Marketing , 13 (3). 10.1108/JRIM-08-2018-0103
  • Norris M. University online cheating - how to mitigate the damage. Research in Higher Education Journal. 2019; 37 :1–20. [ Google Scholar ]
  • Opgen-Rhein, J., Küppers, B., & Schroeder, U. (2018). An application to discover cheating in digital exams. In ACM International Conference Proceeding Series . Koli, Finland. 10.1145/3279720.3279740
  • Page, M. J., Moher, D., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., & Mckenzie, J. E. (2021). PRISMA 2020 explanation and elaboration: Updated guidance and exemplars for reporting systematic reviews. BMJ, 372 ,. 10.1136/bmj.n160 [ PMC free article ] [ PubMed ]
  • Parks RF, Lowry PB, Wigand RT, Agarwal N, Williams TL. Why students engage in cyber-cheating through a collective movement: A case of deviance and collusion. Computers and Education. 2018; 125 :308–326. doi: 10.1016/j.compedu.2018.04.003. [ CrossRef ] [ Google Scholar ]
  • Peytcheva-Forsyth, R., Aleksieva, L., & Yovkova, B. (2018). The impact of technology on cheating and plagiarism in the assessment – The teachers’ and students’ perspectives. In AIP Conference Proceedings 2048 (Vol. 020037, pp. 1–11).
  • Prathish, S., Athi Narayanan, S., & Bijlani, K. (2016). An intelligent system for online exam monitoring. In Proceedings - 2016 International Conference on Information Science, ICIS 2016 (pp. 138–143). Dublin, Ireland. 10.1109/INFOSCI.2016.7845315
  • Reisenwitz TH. Examining the Necessity of Proctoring Online Exams. Journal of Higher Education Theory and Practice. 2020; 20 (1):118–124. [ Google Scholar ]
  • Saba T, Rehman A, Jamail NSM, Marie-Sainte SL, Raza M, Sharif M. Categorizing the Students’ Activities for Automated Exam Proctoring Using Proposed Deep L2-GraftNet CNN Network and ASO Based Feature Selection Approach. IEEE Access. 2021; 9 :47639–47656. doi: 10.1109/ACCESS.2021.3068223. [ CrossRef ] [ Google Scholar ]
  • Sabbah, Y. W. (2017). Security of Online Examinations. In Data Analytics and Decision Support for Cybersecurity (pp. 157–200). Springer International Publishing.
  • Srikanth M, Asmatulu R. Modern Cheating Techniques, Their Adverse Effects on Engineering Education and preventions. International Journal of Mechanical Engineering Education. 2014; 42 (2):129–140. doi: 10.7227/IJMEE.0005. [ CrossRef ] [ Google Scholar ]
  • Tiong, L. C. O., & Lee, H. J. (2021). E-cheating Prevention Measures: Detection of Cheating at Online Examinations Using Deep Learning Approach -- A Case Study, XX (Xx), 1–9. Retrieved from http://arxiv.org/abs/2101.09841
  • Topîrceanu A. Breaking up friendships in exams: A case study for minimizing student cheating in higher education using social network analysis. Computers and Education. 2017; 115 :171–187. doi: 10.1016/j.compedu.2017.08.008. [ CrossRef ] [ Google Scholar ]
  • Traore, I., Nakkabi, Y., Saad, S., & Sayed, B. (2017). Ensuring Online Exam Integrity Through Continuous Biometric Authentication. In Information Security Practices (pp. 73–81). Springer International Publishing. 10.1007/978-3-319-48947-6
  • Turner, S. W., & Uludag, S. (2013). Student perceptions of cheating in online and traditional classes. Proceedings - Frontiers in Education Conference, FIE , (October 2013), 1131–1137. 10.1109/FIE.2013.6685007
  • Ullah, A. (2016). Security and Usability of Authentication by Challenge Questions in Online Examination . University of Hertfordshire.
  • Valverde-Berrocoso, J., Garrido-Arroyo, M. del C., Burgos-Videla, C., & Morales-Cevallos, M. B. (2020). Trends in Educational Research about e-Learning: A Systematic Literature Review (2009–2018). Sustainability , 12 (12). 10.3390/su12125153
  • Varble, D. (2014). Reducing Cheating Opportunities in Online Tests, 3 (3).
  • Watson, G., & Sottile, J. (2010). Cheating in the Digital Age: Do Students Cheat More in Online Courses?. Online Journal of Distance Learning Administration , 13 (1).
  • Wei X, Saab N, Admiraal W. Assessment of cognitive, behavioral, and affective learning outcomes in massive open online courses: A systematic literature review. Computers & Education. 2021; 163 :104097. doi: 10.1016/j.compedu.2020.104097. [ CrossRef ] [ Google Scholar ]
  • Weiner JA, Hurtz GM. A comparative Study of Online Remote Proctored Vs Onsite Proctored. Journal of Applied Testing Technology. 2017; 18 (1):13–20. [ Google Scholar ]
  • Wong, S., Yang, L., Riecke, B., Cramer, E., & Neustaedter, C. (2017). Assessing the usability of smartwatches for academic cheating during exams. Proceedings of the 19th International Conference on Human-Computer Interaction with Mobile Devices and Services, MobileHCI 2017 . 10.1145/3098279.3098568
  • Xiong Y, Suen HK. Assessment approaches in massive open online courses: Possibilities, challenges and future directions. International Review of Education. 2018; 64 (2):241–263. doi: 10.1007/s11159-018-9710-5. [ CrossRef ] [ Google Scholar ]

Education: Why Do Students Cheat? Essay

Introduction.

Cheating is a common phenomenon among students at all levels of education. It happens in high schools, colleges, and universities. In addition, it occurs in both traditional and online settings of learning. Students have sufficient time and resources that give them the opportunity to work hard and pass their exams through personal effort (Davis et al. 35). This begs the question: why do students cheat?

Research has revealed that several reasons and factors are responsible for cheating in schools. A study conducted to determine the prevalence of cheating in colleges found that approximately 75 percent of college students cheat at least once in the course of their stay at school (Davis et al. 36).

There is a need to find a lasting solution because cheating does not reflect students’ real potential, and its effects later show in students’ performance in the workplace. Students cheat because many schools define excellence through grades, because of a lack of confidence in their own ability, because of pressure from parents and teachers to do well, and because of poor teaching methods that do not fulfill the goals of learning (McCabe et al. 51).

Students cheat because many institutions of learning value grades more than the attainment of knowledge (Davis et al. 36). Many school systems place more value on performing well in tests and examinations than on the process of learning. When assessment tests and examinations play a key role in determining a student’s future, cheating becomes an appealing way to perform well (McCabe et al. 51).

Few institutions encourage mastery of learning materials rather than tests. In such institutions, students develop a positive attitude towards education because they are not worried about their performance in tests (Davis et al. 37). They focus more on the attainment of knowledge and skills. Psychologists argue that placing high value on tests teaches students to value short-term effects of education and ignore the long-term effects.

True or false questions, multiple choice questions, and matching tests are examples of assessments used by institutions that value grades (McCabe et al. 53). On the other hand, essay questions, research papers, and term papers are methods used to teach in institutions that value the learning experience and attainment of knowledge more than grades (Davis et al. 39).

Lack of confidence in their abilities motivates students to cheat. Inadequate skills and knowledge are among the reasons that lead students to lose confidence. According to McCabe et al.,

“Teachers who focus more on grades have poor methods of teaching compared to teachers who value knowledge.” (51).

Students who think that they are not smart enough are more likely to cheat in order to get good grades. Learning that puts emphasis on grades involves repetition and memorization of learning materials (Davis et al. 41). Students forget much of the knowledge gained after sitting their exams. Bored students have little or no connection to their teachers and are therefore likely to cheat because they are never prepared.

Such learning methods make learning boring and uninteresting (McCabe et al. 53). They do not motivate students to work hard and attain knowledge that could be useful in their careers. Interactive learning endows students with confidence, which makes them believe in their ability to handle all kinds of challenges and situations (Davis et al. 42).

Students cheat because of pressure exerted on them by their parents and teachers to attain good grades (McCabe et al. 54). Many teachers and parents gauge the abilities of students by their grades. Many colleges use grades as a way of choosing the students who are qualified to join college. Self-efficacy is an important aspect of learning because it gives students the confidence to handle various tasks (McCabe et al. 55).

Teachers can cultivate a sense of self-efficacy in students by believing in all students regardless of their grades. However, many teachers alienate students who get low grades and give more attention to students that get high grades. On the other hand, many parents promise to take their children to college only if they get high grades. This motivates students to cheat in order to gain entry into college.

It is important for teachers and parents to identify the weaknesses and strengths of all students and help them exploit their potential. Sidelining some students is wrong and gives them a reason to cheat.

Another reason why students cheat is poor learning and teaching methods (Davis et al. 44). Good learning methods involve movement, invention, creativity, discussion, and interaction. These methods improve comprehension among students and facilitate the proper sharing of knowledge. However, many teachers find these methods tedious and time-consuming.

The aftermath is resentment from students because the teachers use methods that make learning boring. People learn through various methods, and different students have different learning needs (McCabe et al. 56). Therefore, using a single teaching method does not serve the needs of all students. Some students develop a negative attitude towards learning and their teacher.

These students are likely to cheat in exams. Teachers should evaluate their students in order to develop teaching methods that cater to them all (McCabe et al. 58). Otherwise, some students might feel neglected if they fail to comprehend certain subjects or disciplines.

Finally, students cheat because of laziness and lack of focus. According to Parker, students cheat because of a lack of good morals and laziness:

“A startling number attributed variously to the laziness of today’s students, their lack of a moral compass, or the demands of a hypercompetitive society.” (McCabe et al. 59)

She further argues that society demands much of students. This leads to cheating because students feel under pressure to perform well. Laziness is common among students. Students who waste their time on unimportant things have little time to study and do their homework (McCabe et al. 62).

They are unprepared during exams and resort to cheating in order to perform well. In addition, many employers gauge the capabilities of potential employees based on their grades, which further motivates students to cheat in order to get high grades.

Reasons for cheating include lack of self-confidence in one’s ability to perform well, pressure from parents and teachers, and poor teaching methods that do not fulfill the learning needs of all students. In addition, many learning institutions place great value on grades rather than the acquisition of knowledge. Cheating is a common phenomenon among students at different levels of learning.

More research needs to be conducted in order to ascertain why students cheat. Further research is necessary because different students cheat for various reasons. Moreover, it is important for teachers to lay more emphasis on the acquisition of knowledge and skills rather than good grades.

Students have different learning needs that are satisfied using different teaching and learning methods. Teachers should evaluate their students in order to determine the most important teaching methods that cater to the learning needs of all students.

Works Cited

Davis, Stephen, Drinan Patrick, and Gallant Tricia. Cheating in School: What We Know and What We Can Do . New York: John Wiley & Sons, 2011. Print.

McCabe, Donald, Butterfield Kenneth, and Trevino Linda. Cheating in College: Why Students Do It and What Educators Can Do About It. New York: JHU Press, 2012. Print.


IvyPanda. (2023, October 31). Education: Why Do Students Cheat? https://ivypanda.com/essays/education-why-do-students-cheat/

"Education: Why Do Students Cheat?" IvyPanda , 31 Oct. 2023, ivypanda.com/essays/education-why-do-students-cheat/.

IvyPanda . (2023) 'Education: Why Do Students Cheat'. 31 October.

IvyPanda . 2023. "Education: Why Do Students Cheat?" October 31, 2023. https://ivypanda.com/essays/education-why-do-students-cheat/.

1. IvyPanda . "Education: Why Do Students Cheat?" October 31, 2023. https://ivypanda.com/essays/education-why-do-students-cheat/.

Bibliography

IvyPanda . "Education: Why Do Students Cheat?" October 31, 2023. https://ivypanda.com/essays/education-why-do-students-cheat/.

  • Why People Cheat
  • Why Students Cheat in Public Schools?
  • Why College Students Cheat: Discussion
  • Marginal Analysis of Cheating
  • "Why We Cheat" by Fang Ferric and Arturo Casadevall
  • Why Kids at Harvard Cheat
  • Cheating in High Schools: Issue Analysis
  • Is Cheating Okay or Not: Discussion
  • Academic Integrity: Cheating and Plagiarism
  • Cheating Plagiarism Issues
  • St. Louis City Charter Schools Analysis
  • Taxes and Education: A Cooperation That Went Awry
  • Understanding Youth: Consumption, Gender, and Education
  • Challenges Faced by Young Immigrants
  • Education Issues: The Shrinking Enrollment Problem

What Is Grammarly, and Is It Cheating?


What is Grammarly? We take a deep dive into this online writing resource. How does it help students, and is it ok to use this tool?

Key Takeaways

  • What is Grammarly? It’s a web-based automated writing assistant that improves writing. Grammar, punctuation, and spelling mistakes are corrected, but the writer still has the responsibility of ensuring the substance of their work.
  • Both students and educators can use the grammar checker to improve their writing style. While it may seem like cheating, it isn’t, because material modifications aren’t automatically made to the document being checked, and the writer doesn’t become lazy just by using the assistant.
  • Grammarly isn’t just for academic writing either! Professional writers, including business writers, use Grammarly for emails, press releases, and other official documents.

Grammarly is an automated writing assistant, an online resource designed to help students spot and correct errors in grammar, spelling, and punctuation. In the simplest terms, Grammarly is a web-based editing application that helps students improve the quality of their writing.

And boy do they need it. I should know. For a decade, I made a living helping students cheat. I worked for an array of contract cheating websites, where students would pay writers like me to complete their book reports, research projects, creative writing assignments, admission essays, thesis statements, and even doctoral dissertations.

So believe me when I tell you that students at every single level, and with every kind of professional ambition, struggle to write grammatically competent sentences. Grammarly aims to help. But does it help? And more importantly, does it help too much? In other words, is using Grammarly cheating?

If you squint your eyes, Grammarly might look a little bit like the shady custom paper writing sites where I once earned my living. Like Grammarly, most paper writing companies describe their services as editorial assistance. But this claim is a thin veil for what paper writing companies actually sell: the opportunity to outsource your academic responsibilities wholesale to a hired gun.

By contrast, Grammarly’s offer of editorial assistance seems to be genuine. In fact, there’s a good reason that Grammarly looks, on the surface, like many of the illicit services where you can buy tailor-made papers. It’s because both services address the same need. That is, both custom paper writing services and web-based editing assistance programs recognize that far too many students don’t know how to write.


Students at every level of education—from high school English students to doctoral candidates trudging through dissertations—struggle with grammar, diction, and punctuation. They struggle to organize their ideas, cite their sources, or build a case around a cohesive argument. Writing is an educational requirement and yet, for too many students, it is a source of anxiety and dread. Grammarly can’t necessarily fix all of these issues for you, but it can help you write better, and unlike customer paper writing companies, it isn’t cheating.

Grammarly At a Glance

The popular grammar checker, available in both a free and a paid version, is offered to students by numerous colleges and universities. This is strong evidence that the Grammarly online editor is no longer considered cheating among educators and students.

Furthermore, educators can use the grammar checker to improve their students’ writing style and thus help them develop their communication skills. By combining traditional feedback mechanisms with Grammarly’s automated feedback, teaching strategies aimed at improving students’ writing skills become more effective.

How, you ask?

Grammatical mistakes become a thing of the past for students. Grammarly’s review features, a virtual writing assistant that adheres to widely accepted grammar rules and a plagiarism checker, also make life easy in this respect!

Perhaps what makes this application even more appealing is that it can be integrated easily with the programs we constantly use, like email, texting, tweeting, and word processing.

How Teachers’ Feedback Complements Grammarly’s Automated Feedback

The ability to effectively communicate ideas, opinions, and answers in written form is an essential skill among college students. Effective writing skills are also vital for academic progress, professional development, and personal gain among undergraduate and graduate students.

For educators, providing constructive feedback on students’ writing is common practice, with the aim of improving their writing style, too.

At its core, the teacher’s feedback aids in bridging the gap between what students know and what areas need improvement, from grammar mistakes to style mistakes. This is true whether in an English composition class or in a business management course, both of which demand effective writing skills.

But teachers deal with more than a few challenges in providing constructive feedback on their students’ writing skills! The process itself requires significant time and effort, and it’s made more complicated by contextual issues, large class sizes, and excessive workloads.

This is where the effective integration of traditional teacher feedback with automated feedback from Grammarly comes in. On one hand, Grammarly provides a wide range of writing tools that check language-related errors, including the use of prepositions and determiners, issues of wordiness and conciseness, and grammar. These are considered low-order mistakes that thorough human editing could also spot.

On the other hand, teacher feedback tends to cover high-order concerns that focus on content, substance, and organization of ideas. But, of course, teachers are also aware of the language-related errors that Grammarly detects and provides suggestions for immediate corrections. With the use of Grammarly, they can devote more time and effort to offering feedback on high-order concerns and, thus, make their feedback more effective and efficient.

Teachers are also more able to provide constructive criticism on content, praise the students’ written work, and ask for and provide relevant information. Students benefit from the use of Grammarly since surface errors can be detected and corrected immediately, before the work ever reaches the teacher’s scrutiny. There is also a greater acceptance of automated feedback among students, perhaps because it is seen as less personal.

The result of the complementary relationship between teacher’s feedback and Grammarly’s automated feedback: More successful revisions and, thus, more cohesive and substantive written work among students!

How Teachers Can Use Grammarly to Improve Their Students’ Writing Skills

Teachers can use the writing tools on Grammarly for a wide range of purposes. Furthermore, it isn’t just neurotypical students who will benefit from these strategies—students with learning difficulties, special education students, and even advanced learners can improve their writing skills!

Further benefits of using Grammarly:

  • “Grammarly Goals” set specific writing goals for each student
  • Provides consistent, constructive and fast feedback to the students
  • Shows each student their growth by pointing out the fewer mistakes they are making over time
  • Explains the errors and their possible solutions
  • Gives mini-lessons in content, substance, and organization

Teachers and students must work together to get the most out of Grammarly’s cost and maximize the tool’s features. This way, it can truly serve its purpose for both parties!

Universities That Provide Grammarly Services to Their Students

The popularity of Grammarly Premium and Grammarly for Education versions among colleges and universities, as well as K-12 schools, continues to grow! Here are several examples of four-year institutions of higher education that provide Grammarly services to their faculty and staff members as well as current students.

  • National Louis University
  • Chapman University
  • Liberty University
  • University of Arizona Global Campus
  • University of Utah Graduate School
  • Walden University
  • Iowa State University
  • Lone Star College
  • Marshall University
  • Southern University of New Orleans

What Is Grammarly?

Grammarly is a free online writing assistant, though you can pay for an enhanced level of assistance (which we’ll get to in a minute). Grammarly is one of the leading entities in a writing enhancement software sector that includes competitors like ProWritingAid and Ginger.

The primary function of Grammarly is to help users identify grammatical errors, improper sentence structure, punctuation mistakes, and spelling typos in their writing. It can best be described as an editorial tool, one that can improve the user’s ability to produce grammatically correct writing.

How Does Grammarly Work?

Grammarly can be used either directly on the service’s website, or it can be added as a free extension to your browser. In either environment, you can write your document in real-time, or you can paste text that you’ve already written into the text editor for review. As you enter content into the text editor provided by the Grammarly extension—or directly on the Grammarly website—an automated editor will highlight spelling, grammar, and punctuation errors. The editor will also offer explanations for why these errors have been flagged, and will consequently offer suggestions for how you can correct your mistakes.


These services are completely free of charge, and research suggests they have the potential to be quite valuable for students and instructors alike. According to a 2019 study in the Journal of Academic Language & Learning , Automated Writing Evaluation (AWE) tools like Grammarly have potential value in improving “time effectiveness” for instructors and students. The study finds “that students made more revisions if they used an AWE...[and that] these revisions were more likely to be surface-level revisions relating to form, suggesting that automated tools are more appropriate for grammar or spelling reviews than for higher level language issues.”

In other words, the free application can be very helpful in addressing the basic mechanics issues that students experience in their writing. For help with higher level language issues, you can pay Grammarly a monthly premium ($11.66 at the time of writing). This will give you access to a wide array of editorial services, including support in the following areas:

  • Sounding fluent in English
  • Communicating your ideas clearly
  • Avoiding plagiarism
  • Using more dynamic synonyms
  • Refining tone and delivery
  • Writing with concision

If you need even more personalized support, Grammarly also provides access to professional contract editing and proofreading services for paying customers. And perhaps it is this service offering that is likeliest to raise an eyebrow. Just how intensive are these editorial services? And to what extent do writing assistants—whether through an automated application or independently-contracted humans—undermine the creation of original work?

This is where educators may be given pause. We can all agree that improving the basic use of grammar and punctuation is a positive development, no matter how one comes by it. And adding a layer of editorial polish can certainly make the work a more digestible experience for the grader. But where is the line drawn between editorial polish and contract cheating? What’s the difference between Grammarly and something like writemypaper4me.org, for instance?

Well, for one thing, there are no grammar errors on Grammarly’s homepage. But it goes deeper than that...

What’s The Difference Between Using Grammarly and Cheating?

The answer is actually readily found in Grammarly’s origin story. The online tool has its roots in the anti-plagiarism business. According to The Stock Dork , “Ukrainian Co-founders Alex Shevchenko, Max Lytvyn, and Dmytro Lider started Grammarly in 2009. The development of Grammarly began with the co-founders’ 2004 plagiarism detection start-up called MyDropbox. This software was sold to universities in 2007 as a licensed product. This sale provided funding for the development of the browser extension we know today.”

This initial source of revenue makes Grammarly more akin to something like plagiarism detection leader turnitin.com than to custom cheating services like writemypaper4me. And true to its roots, Grammarly’s suite of services works more like a helpful advisor standing over your shoulder than an outsourced laborer delegated to do the work for you.


At its heart, Grammarly is a utility that you can use to improve your writing, but it won’t provide you with the substance at the heart of this writing. That’s still your job. And that’s what separates Grammarly from the much larger online market of cheating services. At most schools (unless expressly forbidden), editorial support is encouraged (or at least it should be).

The Office of Academic Integrity at Johns Hopkins University Bloomberg School of Public Health offers a useful summation on editorial assistance, noting that it is indeed permissible to enlist the services of an editor for course and capstone work, “whether or not the editor receives any compensation in exchange for their work.”

Importantly, the Office of Academic Integrity specifies that “using an editor is only permissible if the editor provides stylistic and not substantive modifications to the course, capstone, or thesis related assignment.”

Stylistic modifications, says the Bloomberg School, include support with spelling, grammar, punctuation, clarity, referencing, and alternative phrasing. All of these editorial inputs are considered acceptable.

By contrast, substantive modifications , says Johns Hopkins, include writing new sentences which introduce new information, rewriting content to introduce new materials, adding or deleting references, or “any other modification that changes the meaning of what you’ve written in a material way.”


Whether you simply use Grammarly’s free browser extension to spot-check typos, you pay for its premium service to spruce up your wordflow, or you go as far as commissioning the assistance of a professional proofer or editor, Grammarly will not conduct your research, produce your ideas, or build your arguments. And this matters a great deal.

That’s because, by contrast, these are exactly the types of substantive contributions that independently-contracted cheaters will make on behalf of their student customers. Contract paper writers conduct research, produce new ideas, craft arguments, and construct novel sentences to support these arguments. This is materially different from the stylistic services offered by Grammarly.

The Helpful Professor , a blog which, in the interest of full disclosure, offers its author a commission for link-throughs, assures readers that Grammarly won’t do any of the following:

  • “Tell you what to write about to get higher grades.
  • Give answers to your assignment questions.
  • Get grammar right every time.
  • Automatically make changes to your work.”

Again, outside of getting grammar right every time, custom paper-writing companies literally do all of these things.

Should students use Grammarly?

According to Grammarly’s own research , internal surveys reveal that “75% of its users are afraid of being misunderstood.”

This is a powerful imperative driving people to its services. And it’s also the one thing that Grammarly users and contract cheating customers do have in common. They are both contending with a real and palpable fear. Language and writing deficiencies are rampant at every level of education, and at startling levels even in the upper reaches of the ivory tower.

There are many ways to manage this fear. Hiring a cheating service is certainly one way. But Grammarly presents a far more advisable way to manage the fear, and possibly even to vanquish it.

Should educators use Grammarly?

Put us solidly in the camp of those who advocate the use of Grammarly, not just for students, but for educators as well, especially those working in higher education. At this level of instruction, we’re guessing you haven’t the time, energy, or inclination to police punctuation, correct spelling, and train students in the basic rules of grammar. These are skills students should have learned on the way to the university.

Unfortunately, many don’t. So unless the goal of each and every class is to grade compositional ability, it’s clear that many students simply need this resource. In fact, there’s a compelling argument that instructors who decline to assist students in basic compositional matters should make Grammarly a mandatory part of the writing process, at least for students who demonstrate the need.

To return briefly to the business of contract cheating, it’s clear to anybody in this illicit sector that the client base is made up primarily of those who demonstrate such a need, whether because English is a second language, or because they simply lack the necessary academic tools to write. Whatever the reason, the reality is that colleges are not in the business of teaching students how to write. Writing is a building block skill. Students are supposed to have mastered this skill before reaching a level of education where deeper thinking is required. But in the absence both of this skill, and the academic assistance required to attain this skill, many students resort to contract cheating.


By contrast, automated writing assistance gives the student a chance to focus on the actual substance of an assignment, instead of the implementation of rules which the student has already struggled to master for the better part of a 20-year education.

There is a case to be made that much deep thinking in college (and probably in the professional world) is prevented, or at least garbled, by the basic anxiety and distraction of writing incompetence. Grammarly seems like a fantastic way to offset that anxiety, and perhaps even make students more competent writers by simply exposing them to regular, continuous, and real-time feedback on their errors.

This underscores the core benefit of Grammarly to educators, insofar as it does a job that most college-level instructors either lack the time to do themselves or that they may even see as beneath their station as educators. In other words, unless you’re here to coach your students in their writing, be glad that Grammarly is there to do the job.

For study starters, influential books, and much more, check out our full collection of study guides.

Or get tips on studying, student life, and much more with a look at our Student Resources.

How Common is Cheating in Online Exams and did it Increase During the COVID-19 Pandemic? A Systematic Review

  • Open access
  • Published: 04 August 2023


  • Philip M. Newton (ORCID: orcid.org/0000-0002-5272-7979) & Keioni Essex


Academic misconduct is a threat to the validity and reliability of online examinations, and media reports suggest that misconduct spiked dramatically in higher education during the emergency shift to online exams caused by the COVID-19 pandemic. This study reviewed survey research to determine how common it is for university students to admit cheating in online exams, and how and why they do it. We also assessed whether these self-reports of cheating increased during the COVID-19 pandemic, along with an evaluation of the quality of the research evidence which addressed these questions. 25 samples were identified from 19 studies, including 4672 participants, going back to 2012. Online exam cheating was self-reported by a substantial minority (44.7%) of students in total. Pre-COVID this was 29.9%, but during COVID cheating jumped to 54.7%, although these samples were more heterogeneous. Individual cheating was more common than group cheating, and the most common reason students reported for cheating was simply that there was an opportunity to do so. Remote proctoring appeared to reduce the occurrence of cheating, although data were limited. However, there were a number of methodological features which reduce confidence in the accuracy of all these findings. Most samples were collected using designs which make it likely that online exam cheating is under-reported, for example convenience sampling, a modest sample size, and insufficient information to calculate a response rate. No studies considered whether samples were representative of their population. Future approaches to online exams should consider how the basic validity of examinations can be maintained, given the substantial numbers of students who appear to be willing to admit engaging in misconduct. Future research on academic misconduct would benefit from using large representative samples and guaranteeing participant anonymity.


Introduction

Distance learning came to the fore during the global COVID-19 pandemic. Distance learning, also referred to as e-learning, blended learning or mobile learning (Zarzycka et al., 2021 ) is defined as learning with the use of technology where there is a physical separation of students from the teachers during the active learning process, instruction and examination (Armstrong-Mensah et al., 2020 ). This physical separation was key to a sector-wide response to reducing the spread of coronavirus.

COVID prompted a sudden, rapid and near-total adjustment to distance learning (Brown et al., 2022 ; Pokhrel & Chhetri, 2021 ). We all, staff and students, had to learn a lot, very quickly, about distance learning. Pandemic-induced ‘lockdown learning’ continued, in some form, for almost 2 years in many countries, prompting predictions that higher education would be permanently changed by the pandemic, with online/distance learning becoming much more common, even the norm (Barber et al., 2021 ; Dumulescu & Muţiu, 2021 ). One obvious potential change would be the widespread adoption of online assessment methods. Online exams offer students increased flexibility, for example the opportunity to sit an exam in their own homes. This may also reduce some of the anxiety experienced during attending in-person exams in an exam hall, and potentially reduce the administrative cost to universities.

However, assessment poses many challenges for distance learning. Summative assessments, including exams, are the basis for making decisions about the grading and progress of individual students, while aggregated results can inform educational policy such as curriculum or funding decisions (Shute & Kim, 2014). Thus, it is essential that online summative assessments can be conducted in a way that allows for their basic reliability and validity to be maintained. During the pandemic, universities shifted in-person exams to an online format very rapidly, with limited time to ensure that these methods were secure. There were subsequent media reports that academic misconduct was now ‘endemic’, with universities supposedly ‘turning a blind eye’ towards cheating (e.g. Henry, 2022; Knox, 2021). However, it is unclear whether this media anxiety is reflected in the real-world experience of universities.

Dawson defines e-cheating as ‘cheating that uses or is enabled by technology’ (Dawson, 2020, p. 4). Cheating itself is then defined as the gaining of an unfair advantage (Case and King 2007, in Dawson, 2020, p. 4). Cheating poses an obvious threat to the validity of online examinations, a format which relies heavily on technology. Noorbehbahani and colleagues recently reviewed the research literature on a specific form of e-cheating: online exam cheating in higher education. They found that students use a variety of methods to gain an unfair advantage, including accessing unauthorized materials such as notes and textbooks, using an additional device to go online, collaborating with others, and even outsourcing the exam to be taken by someone else. These findings map onto the work of Dawson (2020), who found a similar taxonomy when considering ‘e-cheating’ more generally. These behaviours can be driven by a variety of motivations, including a fear of failure, peer pressure, a perception that others are cheating, and the ease with which they can do it (Noorbehbahani et al., 2022). However, it remains unclear how many students are actually engaged in these cheating behaviours. Understanding the scale of cheating is an important pragmatic consideration when determining how, or even if, it could or should be addressed. There is an extensive literature on the incidence of other types of misconduct, but cheating in online exams has received less attention than other forms of misconduct such as plagiarism (Garg & Goel, 2022).

One seemingly obvious response to concerns about cheating in online exams is to use remote proctoring systems, wherein students are monitored through webcams and use locked-down browsers. However, the efficacy of these systems is not yet clear, and their use has been controversial, with students feeling that they are ‘under surveillance’, anxious about being unfairly accused of cheating, or about technological problems (Marano et al., 2023). A recent court ruling in the USA found that the use of a remote proctoring system to scan a student’s private residence prior to taking an online exam was unconstitutional (Bowman, 2022), although, at the time of writing, this case is ongoing (Witley, 2023). There is already a long history of legal battles between the proctoring companies and their critics (Corbyn, 2022), and it is still unclear whether these systems actually reduce misconduct. Alternatives have been offered in the literature, including guidance for how to prepare online exams in a way that reduces the opportunity for misconduct (Whisenhunt et al., 2022), although it is unclear whether this guidance is effective either.

There is a large body of research literature which examines the prevalence of different types of academic dishonesty and misconduct. Much of this research is in the form of survey-based self-report studies. There are some obvious problems with using self-report as a measure of misconduct; it is a ‘deviant’ or ‘undesirable’ behaviour, and so those invited to participate in survey-based research have a disincentive to respond truthfully, if at all, especially if there is no guarantee of anonymity. There is also some evidence that certain demographic characteristics associated with an increased likelihood of engaging in academic misconduct are also predictive of a decreased likelihood of responding voluntarily to surveys, meaning that misconduct is likely under-reported when a non-representative sampling method such as convenience sampling is used (Newton, 2018).

Some of these issues with quantifying academic misconduct can be partially addressed by the use of rigorous research methodology, for example using representative samples with a high response rate, and clear, unambiguous survey items (Bennett et al., 2011 ; Halbesleben & Whitman, 2013 ). Guarantees of anonymity are also essential for respondents to feel confident about answering honestly, especially when the research is being undertaken by the very universities where participants are studying. A previous systematic review of academic misconduct found that self-report studies are often undertaken with small, convenience samples with low response rates (Newton, 2018 ). Similar findings were reported when reviewing the reliability of research into the prevalence of belief in the Learning Styles neuromyth, suggesting that this is a wider concern within survey-based education research (Newton & Salvi, 2020 ).

However, self-report remains one of the most common ways that academic misconduct is estimated, perhaps in part because there are few other ways to meaningfully measure it. There is also a basic, intuitive objective validity to the method; asking students whether they have cheated is a simple and direct approach, when compared to other indirect approaches to quantifying misconduct, based on (for example) learner analytics, originality scores or grade discrepancies. There is some evidence that self-report correlates positively with actual behaviour (Gardner et al., 1988 ), and that data accuracy can be improved by using methods which incentivize truth-telling (Curtis et al., 2022 ).

Here we undertook a systematic search of the literature in order to identify research which studied the prevalence of academic dishonesty in summative online examinations in Higher Education. The research questions were as follows:

1. How common is self-report of cheating in online exams in Higher Education? (This was the primary research question, and studies were only included if they addressed this question.)

2. Did cheating in online exams increase during the COVID-19 pandemic?

3. What are the most common forms of cheating?

4. What are student motivations for cheating?

5. Does online proctoring reduce the incidence of self-reported online exam cheating?

The review was conducted according to the principles of the PRISMA statement for reporting systematic reviews (Moher et al., 2009), updated for 2020 (Page et al., 2021). We adapted this methodology from previous work systematically reviewing survey-based research in education, misbelief and misconduct (Fanelli, 2009; Newton, 2018; Newton & Salvi, 2020), reflecting the limited nature of the outcomes reported in the included studies (i.e. the percentage of students engaging in a specific behaviour).

Search Strategy and Information Sources

Searches were conducted in July and August 2022. Searches were first undertaken using the ERIC education research database (eric.ed.gov) and then with Google Scholar. We used Google Scholar since it covers grey literature (Haddaway et al., 2015), including unpublished Masters and PhD theses (Jamali & Nabavi, 2015) as well as preprints. The Google Scholar search interface is limited, and the search returns can include non-research documents such as citations, university policies and handbooks on academic integrity, and multiple versions of papers (Boeker et al., 2013). It is also not possible to exclude the results of one search from another. Thus it is not possible for us to report accurately the numbers of included papers returned from each term. ‘Daisy chaining’ was also used to identify relevant research from studies that had already been identified using the aforementioned literature searches, and recent reviews on the subject (Butler-Henderson & Crawford, 2020; Chiang et al., 2022; Garg & Goel, 2022; Holden et al., 2021; Noorbehbahani et al., 2022; Surahman & Wang, 2022).

Selection Process

Search results were individually assessed against the inclusion/exclusion criteria, starting with the title, followed by the abstract and then the full text. If a study clearly did not meet the inclusion criteria based on the title then it was excluded. If the author was unsure, then the abstract was reviewed. If there was still uncertainty, then the full text was reviewed. When a study met the inclusion criteria (see below), the specific question used in that study to quantify online exam cheating was then itself also used as a search term. Thus the full list of search terms used is shown in Supplementary Online Material S1 .

Eligibility Criteria

The following criteria were used to determine whether to include samples. Many studies included multiple datasets (e.g. samples comprising different groups of students, across different years). The criteria here were applied to individual datasets.

Inclusion Criteria

Participants were asked whether they had ever cheated in an online exam (self-report).

Participants were students in Higher Education.

Reported both total sample size and percent of respondents answering yes to the relevant exam cheating questions, or sufficient data to allow those metrics to be calculated.

English language publication.

Published 2013-present, with data collected 2012-present. We wanted to evaluate a 10 year timeframe. In 2013, at the beginning of this time window, the average time needed to publish an academic paper was 12.2 months, ranging from 9 months (chemistry) to 18 months (Business) (Björk & Solomon, 2013 ). It would therefore be reasonable to conclude that a paper published in 2013 was most likely submitted in 2012. Thus we included papers whose publication date was 2013 onwards, unless the manuscript itself specifically stated that the data were collected prior to 2012.

Exclusion Criteria

Asking participants whether they would cheat in exams (e.g. Morales-Martinez et al., 2019), or not allowing for a distinction between self-report of intent and actual cheating (e.g. Ghias et al., 2014).

Phrasing of survey items in a way that does not allow for frequency of online exam cheating to be specifically identified according to the criteria above. Wherever necessary, study authors were contacted to clarify.

Asking participants ‘how often do others cheat in online exams’.

Asking participants about helping other students to cheat.

Schools, community colleges/further education, MOOCS.

Cheating in formative exams, or studies that did not distinguish between formative and summative assessments (e.g. quizzes vs. exams) (e.g. Alvarez et al., 2022; Costley, 2019).

Estimates of cheating from learning analytics or other methods which did not include directly asking participants if they had cheated.

Published in a predatory journal (see below).

Predatory Journal Criteria

Predatory journals and publishers are defined as “ entities which prioritize self-interest at the expense of scholarship and are characterised by false or misleading information, deviation from best editorial and publication practices, a lack of transparency, and/or the use of aggressive and indiscriminate solicitation practices .” (Grudniewicz et al., 2019 ). The inclusion of predatory journals in literature reviews may therefore have a negative impact on the data, findings and conclusions. We followed established guidelines for the identification and exclusion of predatory journals from the findings (Rice et al., 2021 ):

Each study which met the inclusion criteria was checked for spelling, punctuation and grammar errors as well as logical inconsistencies.

Every included journal was checked against open access criteria;

If the journal was listed on the Directory of Open Access Journals (DOAJ) database (DOAJ.org) then it was considered to be non-predatory.

If the journal was not present in the DOAJ database, we looked for it in the Committee on Publication Ethics (COPE) database (publicationethics.org). If the journal was listed on the COPE database then it was considered to be non-predatory.

Only one paper met these exclusion criteria: it contained logical inconsistencies and its journal was not listed on either DOAJ or COPE. For completeness we also searched an informal list of predatory journals ( https://beallslist.net ) and the journal was listed there. Thus the study was excluded.

All data were extracted by both authors independently. Where the extracted data differed between authors then this was clarified through discussion. Data extracted were, where possible, as follows:

Author/date

Year of Publication

Year study was undertaken . If this was a range (e.g. Nov 2016-Apr 2017) then the most recent year was used as the data point (e.g. 2017 in the example). If it was not reported when the study was undertaken, then we recorded the year that the manuscript was submitted. If none of these data were available then the publication year was entered as the year that the study was undertaken.

Publication type. Peer reviewed journal publication, peer reviewed conference proceedings or dissertation/thesis.

Population size. The total number of participants in the population, from which the sample is drawn and supposed to represent. For example, if the study is surveying ‘business students at University X’, is it clear how many business students are currently at University X?

Number Sampled. The number of potential participants, from the population, who were asked to fill in the survey.

N . The number of survey respondents.

Cheated in online summative examinations . The number of participants who answered ‘yes’ to having cheated in online exams. Some studies recorded the frequency of cheating on a scale, for example a 1–5 Likert scale from ‘always’ to ‘never’. In these cases, we collapsed all positive reports into a single number of participants who had ever cheated in online exams. Some studies did not ask for a total rate of cheating (i.e. cheating by any/all methods) and so, for analysis purposes the method with the highest rate of cheating was used (see Results).

Group/individual cheating. Where appropriate, the frequency of cheating via different methods was recorded. These were coded according to the highest level of the framework proposed by Noorbehbahani (Noorbehbahani et al., 2022 ), i.e. group vs. individual. More fine-grained analysis was not possible due to the number and nature of the included studies.

Study Risk of Bias and Quality metrics

Response rate . Defined as “ the percentage of people who completed the survey after being asked to do so” (Halbesleben & Whitman, 2013 ).

Method of sampling. Recorded as one of the following: convenience sampling, where all members of the population were able to complete the survey but data were analysed only from those who voluntarily completed it; or ‘unclassifiable’, where it was not possible to determine the sampling method based on the data provided (no other sampling methods were used in the included studies).

Ethics. Was it reported whether ethical/IRB approval had been obtained? (note that a recording of ‘N’ here does not mean that ethical approval was not obtained, just that it is not reported)

Anonymity . Were participants assured that they were answering anonymously? Students who are found to have cheated in exams can be given severe penalties, and so a statement of anonymity (not just confidentiality) is important for obtaining meaningful data.

Synthesis Methods

Data are reported as mean ± SEM unless otherwise stated. Datasets were tested for normal distribution using a Kolmogorov-Smirnov test prior to analysis and parametric tests were used if the data were found to be normally distributed. The details of the specific tests used are in the relevant results section.
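The synthesis approach described above (a normality check followed by the choice of test) can be sketched in a few lines of code. This is an illustrative sketch only, not the authors' analysis code: the per-sample percentages are hypothetical, and the non-parametric fallback (Mann-Whitney U) is an assumption, since the paper does not name one.

```python
# Sketch: test each group of per-sample cheating percentages for normality
# (Kolmogorov-Smirnov), then use a parametric unpaired t-test if both groups
# look normal. Values below are hypothetical placeholders.
import statistics
from scipy import stats

pre_covid = [10.2, 18.5, 25.0, 31.4, 36.8]      # hypothetical % cheating per sample
during_covid = [40.1, 52.3, 61.0, 70.5, 78.2]   # hypothetical % cheating per sample

def is_normal(data, alpha=0.05):
    """KS test against a normal distribution with the sample's own mean/SD."""
    mean, sd = statistics.mean(data), statistics.stdev(data)
    _, p = stats.kstest(data, 'norm', args=(mean, sd))
    return p > alpha

if is_normal(pre_covid) and is_normal(during_covid):
    t, p = stats.ttest_ind(pre_covid, during_covid)      # parametric test
    print(f"unpaired t-test: t = {t:.3f}, p = {p:.4f}")
else:
    u, p = stats.mannwhitneyu(pre_covid, during_covid)   # assumed non-parametric fallback
    print(f"Mann-Whitney U: U = {u:.1f}, p = {p:.4f}")
```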

25 samples were identified from 19 studies, containing a total of 4672 participants. Three studies contained multiple distinct samples from different participants (e.g. data were collected in different years (Case et al., 2019; King & Case, 2014), were split by two different programmes of study (Burgason et al., 2019), or by whether exams were proctored or not (Owens, 2015)). Thus, these samples were treated as distinct in the analysis since they represent different participants. Multiple studies asked the same groups of participants about different types of cheating, or the conditions under which cheating happens. The analysis of these is explained in the relevant results subsection. A summary of the studies is in Table 1. The detail of each individual question asked to study participants is in supplementary online data S2.

Descriptive Metrics of Studies

Sampling method.

23/25 samples were collected using convenience sampling. The remaining two did not provide sufficient information to determine the method of sampling.

Population Size

Only two studies reported the population size.

Sample Size

The average sample size was 188.7 ± 36.16.

Response Rate

Fifteen of the samples did not report sufficient information to allow a response rate to be calculated. The ten remaining samples returned an average response rate of 55.6% ±10.7, with a range from 12.2 to 100%.

Eighteen of the 25 samples (72%) stated that participant responses were collected anonymously.

Seven of the 25 samples (28%) reported that ethical approval was obtained for the study.

How Common is Self-Reported Online Exam Cheating in Higher Education?

44.7% of participants (2088/4672) reported engaging in some form of cheating in online exams. This analysis included those studies where total cheating was not recorded, and so the most commonly reported form of cheating was substituted in. To check the validity of this inclusion, a separate analysis was conducted of only those studies where total cheating was recorded. In this case, 42.5% of students (1574/3707) reported engaging in some form of cheating. An unpaired t -test was used to compare the percentage cheating from each group (total vs. highest frequency), and returned no significant difference ( t (23) = 0.5926, P = 0.56).
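To illustrate the arithmetic behind these headline figures, here is a minimal sketch of the pooled-percentage calculation and the sensitivity check described above. The per-sample counts and the reported/substituted split are hypothetical placeholders, not the review's extracted data.

```python
# Sketch of the pooled prevalence figure and the sensitivity check described above.
# Each tuple is (number admitting cheating, sample size) for one sample; the counts
# and the "reported a total rate" flags are hypothetical placeholders.
from scipy import stats

samples = [(30, 80), (55, 120), (12, 60), (90, 150), (40, 100)]
reported_total_rate = [True, True, False, True, False]

pooled_pct = 100 * sum(c for c, n in samples) / sum(n for c, n in samples)
print(f"Pooled self-reported cheating: {pooled_pct:.1f}%")

# Sensitivity check: compare per-sample percentages between samples that reported a
# total cheating rate and those where the most common form of cheating was
# substituted in, using an unpaired t-test.
pct = [100 * c / n for c, n in samples]
total_group = [x for x, ok in zip(pct, reported_total_rate) if ok]
substituted_group = [x for x, ok in zip(pct, reported_total_rate) if not ok]
t, p = stats.ttest_ind(total_group, substituted_group)
print(f"unpaired t-test: t = {t:.3f}, p = {p:.3f}")
```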

Did the Frequency of Online Exam Cheating Increase During COVID?

The samples were classified as having been collected pre-COVID, or during COVID (no samples were identified as having been collected ‘post-COVID’). One study (Jenkins et al., 2022 ) asked the same students about their behaviour before, and during, COVID. For the purposes of this specific analysis, these were included as separate samples, thus there were 26 samples, 17 pre-COVID and 9 during COVID. Pre-COVID, 29.9% (629/2107) of participants reported cheating in online exams. During COVID this figure was 54.7% (1519/2779).

To estimate the variance in these data, and to test whether the difference was statistically significant, the percentages of students who reported cheating in each study were grouped into pre- and during-COVID and the average calculated for each group. The average pre-COVID was 28.03% ± 4.89 (N = 17), whereas during COVID the average was 65.06% ± 9.585 (N = 9). An unpaired t-test was used to compare the groups, and returned a statistically significant difference (t(24) = 3.897, P = 0.0007). The effect size (Hedges’ g) was 1.61, indicating that the COVID effect was substantial (Fig. 1).

Fig. 1: Increased self-report of cheating in online exams during the COVID-19 pandemic. Data represent the mean ± SEM of the percentages of students who self-report cheating in online exams pre- and during COVID. *** = P < 0.005, unpaired t-test.
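For readers who want to reproduce this kind of effect-size calculation, the sketch below shows one common formulation of Hedges’ g (Cohen’s d computed with a pooled standard deviation and a small-sample correction), alongside the unpaired t-test. The per-sample percentages are hypothetical placeholders, not the review's data.

```python
# Sketch of the pre- vs during-COVID comparison: unpaired t-test plus Hedges' g.
# The per-sample percentages below are hypothetical placeholders.
import math
from scipy import stats

pre_covid = [5.0, 12.0, 20.0, 28.0, 33.0, 41.0]    # % cheating per pre-COVID sample
during_covid = [45.0, 55.0, 62.0, 71.0, 80.0]       # % cheating per during-COVID sample

t, p = stats.ttest_ind(pre_covid, during_covid)
print(f"t = {t:.3f}, p = {p:.4f}")

def hedges_g(a, b):
    na, nb = len(a), len(b)
    ma, mb = sum(a) / na, sum(b) / nb
    va = sum((x - ma) ** 2 for x in a) / (na - 1)
    vb = sum((x - mb) ** 2 for x in b) / (nb - 1)
    pooled_sd = math.sqrt(((na - 1) * va + (nb - 1) * vb) / (na + nb - 2))
    d = (mb - ma) / pooled_sd                 # Cohen's d
    j = 1 - 3 / (4 * (na + nb) - 9)           # small-sample correction factor
    return d * j

print(f"Hedges' g = {hedges_g(pre_covid, during_covid):.2f}")
```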

To test the reliability of this result, we conducted a split sample test as in other systematic reviews of the prevalence of academic misconduct (Newton, 2018), wherein the data for each group were ordered by size and then every other sample was extracted into a separate group. So, the sample with the lowest frequency of cheating was allocated into Group A, the next smallest into Group B, the next into Group A, and so on. This was conducted separately for the pre-COVID and during-COVID samples. Each half-group was then subject to an unpaired t-test to determine whether cheating increased during COVID in that group. Each group returned a significant difference (t(10) = 2.889, P = 0.0161 for odd-numbered samples; t(12) = 2.48, P = 0.029 for even-numbered samples). This analysis gives confidence that the observed increase in self-reported online exam cheating during the pandemic is statistically robust, although there may be other variables which contribute to this (see discussion).
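A minimal sketch of that split-sample check, again with hypothetical percentages rather than the review's data, might look like this:

```python
# Sketch of the split-sample robustness check: order each group's per-sample
# percentages, allocate alternate samples to half-groups A and B, and repeat the
# pre- vs during-COVID comparison within each half. Values are hypothetical.
from scipy import stats

def split_alternating(values):
    ordered = sorted(values)
    return ordered[0::2], ordered[1::2]   # half-group A, half-group B

pre_covid = [5.0, 12.0, 20.0, 28.0, 33.0, 41.0]       # hypothetical
during_covid = [45.0, 55.0, 62.0, 71.0, 80.0, 88.0]   # hypothetical

pre_a, pre_b = split_alternating(pre_covid)
dur_a, dur_b = split_alternating(during_covid)

for label, pre, dur in (("A", pre_a, dur_a), ("B", pre_b, dur_b)):
    t, p = stats.ttest_ind(pre, dur)
    print(f"half-group {label}: t = {t:.3f}, p = {p:.4f}")
```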

Comparison of Group vs. Individual Online Exam Cheating in Higher Education

In order to consider how best to address cheating in online exams, it is important to understand the specific behaviours of students. Many studies asked multiple questions about different types of cheating, and these were coded according to the typology developed by Noorbehbahani, which has high-level codes of ‘individual’ and ‘group’ (Noorbehbahani et al., 2022). More fine-grained coding was not possible due to the variance in the types of questions asked of participants (see S2). ‘Individual’ cheating meant that, whatever the type of cheating, it could be achieved without the direct help of another person. This could be looking at notes or textbooks, or searching for materials online. ‘Group’ cheating meant that another person was directly involved, for example by sharing answers, or having them sit the exam on behalf of the participant (contract cheating). Seven studies asked their participants whether they had engaged in different forms of cheating where both formats (Group and Individual) were represented. For each study we ranked all the different forms of cheating by the frequency with which participants reported engaging in them. For all seven of the studies which asked about both Group and Individual cheating, the most frequently reported cheating behaviour was an Individual cheating behaviour. For each study we calculated the difference between the two by subtracting the frequency of the most commonly reported Group cheating behaviour from the frequency of the most commonly reported Individual cheating behaviour. The average difference was 23.32 ± 8.0 percentage points. These two analyses indicate that individual forms of cheating are more common than cheating which involves other people.
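The per-study comparison described above amounts to a simple "most common Individual minus most common Group" calculation. The sketch below illustrates it; the study names and frequencies are hypothetical placeholders, not data from the included studies.

```python
# Sketch of the Group vs. Individual comparison: for each study, take the most
# frequently reported behaviour within each high-level code and compute the
# difference (Individual minus Group). All values are hypothetical.
studies = {
    "Study 1": {"individual": [35.0, 22.0, 18.0], "group": [15.0, 9.0]},
    "Study 2": {"individual": [50.0, 41.0],        "group": [20.0, 12.0, 5.0]},
    "Study 3": {"individual": [28.0, 19.0],        "group": [25.0]},
}

diffs = []
for name, freqs in studies.items():
    diff = max(freqs["individual"]) - max(freqs["group"])
    diffs.append(diff)
    print(f"{name}: most common Individual exceeds most common Group by {diff:.1f} points")

print(f"Average difference: {sum(diffs) / len(diffs):.1f} percentage points")
```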

Effect of Proctoring/Lockdown Browsers

The majority of studies did not make clear whether their online exams were proctored or unproctored, or whether they involved the use of related software such as lockdown browsers. Thus it was difficult to conduct definitive analyses to address the question of whether these systems reduce online exam cheating. Two studies did specifically address this issue; in both cases there was a substantially lower rate of self-reported cheating where proctoring systems were used. Jenkins et al., in a study conducted during COVID, asked participants whether their instructors used ‘anti cheating software (e.g., Lockdown Browser)’ and, if so, whether they had tried to circumvent it. 16.5% admitted to doing this, compared to the overall rate of cheating of 58.4%. Owens asked about an extensive range of different forms of misconduct in two groups of students whose online exams were either proctored or unproctored. The total rates of cheating in each group did not appear to be reported. The most common form of cheating was the same in both groups (‘web search during an exam’) and was reported by 39.8% of students in the unproctored group but by only 8.5% in the proctored group (Owens, 2015).

Reasons Given for Online Exam Cheating

Ten of the studies asked students why they cheated in online exams. These reasons were initially coded by both authors according to the typology provided in (Noorbehbahani et al., 2022 ). Following discussion between the authors, the typology was revised slightly to that shown in Table  1 , to better reflect the reasons given in the reviewed studies.

Descriptive statistics (the percentages of students reporting the different reasons as motivations for cheating) are shown in Table  2 . Direct comparison between the reasons is not fully valid since different studies asked for different options, and some studies offered multiple options whereas some only identified one. However in the four studies that offered multiple options to students, three of them ranked ‘opportunities to cheat’ as the most common reason (and the fourth study did not have this as an option). Thus students appear to be most likely to cheat in online exams when there is an opportunity to do so.

We reviewed data from 19 studies, including 25 samples totaling 4672 participants. We found that a substantial proportion of students, 44.7%, were willing to admit to cheating in online summative exams. This total number masks a finding that cheating in online exams appeared to increase considerably during the COVID-19 pandemic, from 29.9 to 54.7%. These are concerning findings. However, there are a number of methodological considerations which influence the interpretation of these data. These considerations all lead to uncertainty regarding the accuracy of the findings, although a common theme is that, unfortunately, the issues highlighted seem likely to result in an under-reporting of the rate of cheating in online exams.

There are numerous potential sources of error in survey-based research, and these may be amplified where the research is asking participants to report on sensitive or undesirable behaviours. One of these sources of error comes from non-respondents, i.e. how confident can we be that those who did not respond to the survey would have given a similar pattern of responses to those that did (Goyder et al., 2002 ; Halbesleben & Whitman, 2013 ; Sax et al., 2003 ). Two ways to minimize non-respondent error are to increase the sample size as a percentage of the population, and then simply to maximise the percentage of the invited sample who responds to the survey. However only nine of the samples reported sufficient information to even allow the calculation of a response rate, and only two reported the total population size. Thus for the majority of samples reported here, we cannot even begin to estimate the extent of the non-response error. For those that did report sufficient information, the response rate varied considerably, from 12.2% to 100, with an average of 55.6%. Thus a substantial number of the possible participants did not respond.

Most of the surveys reviewed here were conducted using convenience sampling, i.e. participation was voluntary and there was no attempt to ensure that the sample was representative, or that the non-respondents were followed up in a targeted way to increase the representativeness of the sample. People who voluntarily respond to survey research are, compared to the general population, older, wealthier, more likely to be female and educated (Curtin et al., 2000 ). In contrast, individuals who engage in academic misconduct are more likely to be male, younger, from a lower socioeconomic background and less academically able (reviewed in Newton, 2018 ). Thus the features of the survey research here would suggest that the rates of online exam cheating are under-reported.

A second source of error is measurement error – for example, how likely is it that those participants who do respond are telling the truth? Cheating in online exams is clearly a sensitive subject for potential survey participants. Students who are caught cheating in exams can face severe penalties. Measurement error can be substantial when asking participants about sensitive topics, particularly when they have no incentive to respond truthfully. Curtis et al. conducted an elegant study to investigate rates of different types of contract cheating and found that rates were substantially higher when participants were incentivized to tell the truth, compared to traditional self-report (Curtis et al., 2022 ). Another method to increase truthfulness is to use a Randomised Response Technique, which increases participants confidence that their data will be truly anonymous when self-reporting cheating (Mortaz Hejri et al., 2013 ) and so leads to increased estimates of the prevalence of cheating behaviours when measured via self-report (Kerkvliet, 1994 ; Scheers & Dayton, 1987 ). No studies reviewed here reported any incentivization or use of a randomized response technique, and many did not report IRB (ethical) approval or that participants were guaranteed anonymity in their responses. Absence of evidence is not evidence of absence, but it again seems reasonable to conclude that the majority of the measurement error reported here will also lead to an under-reporting of the extent of online exam cheating.
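To make the Randomised Response Technique concrete, here is a minimal simulation sketch of one classic variant, a forced-response design. This is an illustration of the general idea only, not a method used by any of the reviewed studies, and the prevalence value is hypothetical.

```python
# Sketch of how a Randomised Response Technique protects anonymity while still
# allowing prevalence to be estimated. Forced-response design: each respondent
# privately rolls a die; on a 1 they must answer "yes", on a 6 "no", otherwise
# they answer truthfully. No individual "yes" can be attributed to real cheating,
# but the aggregate still recovers the true rate.
import random

def simulate_responses(true_prevalence, n):
    responses = []
    for _ in range(n):
        roll = random.randint(1, 6)
        if roll == 1:
            responses.append(True)    # forced "yes"
        elif roll == 6:
            responses.append(False)   # forced "no"
        else:
            responses.append(random.random() < true_prevalence)  # truthful answer
    return responses

n = 10_000
observed_yes = sum(simulate_responses(true_prevalence=0.30, n=n)) / n
# P(yes) = 1/6 + (4/6) * prevalence  =>  prevalence = (P(yes) - 1/6) / (4/6)
estimated = (observed_yes - 1 / 6) / (4 / 6)
print(f"Observed 'yes' rate: {observed_yes:.3f}; estimated prevalence: {estimated:.3f}")
```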

However, there are very many variables associated with likelihood of committing academic misconduct (also reviewed in Newton, 2018 ). For example, in addition to the aforementioned variables, cheating is also associated with individual differences such as personality traits (Giluk & Postlethwaite, 2015 ; Williams & Williams, 2012 ), motivation (Park et al., 2013 ), age and gender (Newstead et al., 1996 ) and studying in a second language (Bretag et al., 2019 ) as well as situational variables such as discipline studied (Newstead et al., 1996 ). None of the studies reviewed here can account for these individual variables, and this perhaps explains, partly, the wide variance in the studies reported, where the percentage of students willing to admit to cheating in online exams ranges from essentially none, to all students, in different studies. However, almost all of the variables associated with differences in likelihood of committing academic misconduct were themselves determined using convenience sampling. In order to begin to understand the true nature, scale and scope of academic misconduct, there is a clear need for studies using large, representative samples, with appropriate methodology to account for non-respondents, and rigorous analyses which attempt to identify those variables associated with an increased likelihood of cheating.

There are some specific issues which must be considered when determining the accuracy of the data showing an increase in cheating during COVID. In general, the pre-COVID group appears to be a more homogenous set of samples, for example, 11 of the 16 samples are from students studying business, and 15 of the 16 pre-COVID samples are from the USA. The during-COVID samples are from a much more diverse range of disciplines and countries. However the increase in self-reported cheating was replicated in the one study which directly asked students about their behaviour before, and during, the pandemic; Jenkins and co-workers found that 28.4% of respondents were cheating pre-COVID, nearly doubling to 58.4% during the pandemic (Jenkins et al., 2022 ), very closely mirroring the aggregate results.

There are some other variables which may be different between the studies and so affect the overall interpretation of the findings. For example, the specific questions asked of participants, as shown in the supplemental online material ( S2 ) reveal that most studies do not report on the specific type of exam (e.g. multiple choice vs. essay based), or the exam duration, weighting, or educational level. This is likely because the studies survey groups of students, across programmes. Having a more detailed understanding of these factors would also inform strategies to address cheating in online exams.

It is difficult to quantify the potential impact of these issues on the accuracy of the data analysed here, since objective measures of cheating in online exams are difficult to obtain in higher education settings. One way to achieve this is to set up traps for students taking closed-book exams. One study tested this using a 2.5 h online exam administered for participants to obtain credit from a MOOC. The exam was set up so that participants would “likely not benefit from having access to third-party reference materials during the exam” . Students were instructed not to access any additional materials or to communicate with others during the exam. The authors built a ‘honeypot’ website which had all of the exam questions on, with a button ‘click to show answer’. If exam participants went online and clicked that button then the site collected information which allowed the researchers to identify the unique i.d. of the test-taker. This approach was combined with a more traditional analysis of the originality of the free-text portions of the exam. Using these methods, the researchers estimated that ~ 30% of students were cheating (Corrigan-Gibbs et al., 2015b ). This study was conducted in 2014-15, and the data align reasonably well with the pre-COVID estimates of cheating found here, giving some confidence that the self-report measures reported here are in the same ball park as objective measures, albeit from only one study.

The challenges of interpreting data from small convenience samples will also affect the analysis of the other measures made here; that students are more likely to commit misconduct on their own, because they can. The overall pattern of findings though does align somewhat, suggesting that concerns may be with the accuracy of the numbers rather than a fundamental qualitative problem (i.e. it seems reasonable to conclude that students are more likely to cheat individually, but it is challenging to put a precise number to that finding). For example, the apparent increase in cheating during COVID is associated with a rapid and near-total transition to online exams. Pre-covid, the use of online exams would have been a choice made by education providers, presumably with some efforts to ensure the security and integrity of that assessment. During COVID lockdown, the scale and speed of the transition to online exams made it much more challenging to put security measures in place, and this would therefore almost certainly have increased the opportunities to cheat.

It was challenging to gather more detail about the specific types of cheating behaviour, due to the considerable heterogeneity between the studies regarding this question. The sector would benefit from future large-scale research using a recognized typology, for example those proposed by Dawson (Dawson, 2020 , p. 112) or Noorbehbahani (Noorbehbahani et al., 2022 ).

Another important recommendation that will help the sector in addressing the problem is for future survey-based research of student dishonesty to make use of the abundant methodological research undertaken to increase the accuracy of such surveys. In particular the use of representative sampling, or analysis methods which account for the challenges posed by unrepresentative samples. Data quality could also be improved by the use of question formats and survey structures which motivate or incentivize truth-telling, for example by the use of methods such as the Randomised Response Technique which increase participant confidence that their responses will be truly anonymous. It would also be helpful to report on key methodological features of survey design; pilot testing, scaling, reliability and validity, although these are commonly underreported in survey based research generally (Bennett et al., 2011 ).

Thus an aggregate portrayal of the findings here is that students are committing misconduct in significant numbers, and that this increased considerably during COVID. Students appear to be more likely to cheat on their own, rather than in groups, and are most commonly motivated by the simple fact that they can cheat. Do these findings and the underlying data give us any information that might be helpful in addressing the problem?

One technique deployed by many universities to address multiple forms of online exam cheating is to increase the use of remote proctoring, wherein student behaviour during online exams is monitored, for example, through a webcam, and/or their online activity is monitored or restricted. We were unable to draw definitive conclusions about the effectiveness of remote proctoring or other software such as lockdown browsers to reduce cheating in online exams, since very few studies stated definitively that the exams were, or were not, proctored. The two studies that examined this question did appear to show a substantial reduction in the frequency of cheating when proctoring was used. Confidence in these results is bolstered by the fact that these studies both directly compared unproctored vs. proctored/lockdown browser conditions. Other studies have used proxy measures for cheating, such as time engaged with the exam, and changes in exam scores, and these studies have also found evidence for a reduction in misconduct when proctoring is used (e.g. Dendir & Maxwell, 2020).

The effectiveness (or not) of remote proctoring to reduce academic misconduct seems like an important area for future research. However there is considerable controversy about the use of remote proctoring, including legal challenges to its use and considerable objections from students, who report a net negative experience, fuelled by concerns about privacy, fairness and technological challenges (Marano et al., 2023 ), and so it remains an open question whether this is a viable option for widespread general use.

Honour codes are a commonly cited approach to promoting academic integrity, and so (in theory) reducing academic misconduct. However, empirical tests of honour codes show that they do not appear to be effective at reducing cheating in online exams (Corrigan-Gibbs et al., 2015a , b ). In these studies the authors likened them to ‘terms and conditions’ for online sites, which are largely disregarded by users in online environments. However in those same studies the authors found that replacing an honour code with a more sternly worded ‘warning’, which specifies the consequences of being caught, was effective at reducing cheating. Thus a warning may be a simple, low-cost intervention to reduce cheating in online exams, whose effectiveness could be studied using appropriately conducted surveys of the type reviewed here.

Another option to reduce cheating in online exams is to use open-book exams. This is often suggested as a way of simultaneously increasing the cognitive level of the exam (i.e. it assesses higher order learning) (e.g. Varble, 2014), and was suggested as a way of reducing the perceived, or potential, increase in academic misconduct during COVID (e.g. Nguyen et al., 2020; Whisenhunt et al., 2022). This approach has an obvious appeal in that it eliminates the possibility of some common forms of misconduct, such as the use of notes or unauthorized web access (Noorbehbahani et al., 2022; Whisenhunt et al., 2022), and can even make this a positive feature, i.e. encouraging the use of additional resources in a way that reflects the fact that, for many future careers, students will have access to unlimited information at their fingertips, and the challenge is to ensure that students have learned what information they need and how to use it. This approach certainly fits with our data, wherein the most frequently reported types of misconduct involved students acting alone, and cheating ‘because they could’. Some form of proctoring or other measure may still be needed in order to reduce the threat of collaborative misconduct. Perhaps most importantly though, it is unclear whether open-book exams truly reduce the opportunity for, and the incidence of, academic misconduct, and if so, how we might advise educators to design their exams, and exam questions, in a way that delivers this as well as the promise of ‘higher order’ learning. These questions are the subject of ongoing research.

In summary then, there appears to be significant levels of misconduct in online examinations in Higher Education. Students appear to be more likely to cheat on their own, motivated by an examination design and delivery which makes it easy for them to do so. Future research in academic integrity would benefit from large, representative samples using clear and unambiguous survey questions and guarantees of anonymity. This will allow us to get a much better picture of the size and nature of the problem, and so design strategies to mitigate the threat that cheating poses to exam validity.

Alvarez, H. T., Dayrit, R. S., Dela Cruz, M. C. A., Jocson, C. C., Mendoza, R. T., Reyes, A. V., & Salas, J. N. N. (2022). Academic dishonesty cheating in synchronous and asynchronous classes: A proctored examination intervention. International Research Journal of Science, Technology, Education, and Management, 2(1), 1–1.

Armstrong-Mensah, E., Ramsey-White, K., Yankey, B., & Self-Brown, S. (2020). COVID-19 and Distance Learning: Effects on Georgia State University School of Public Health Students. Frontiers in Public Health, 8. https://doi.org/10.3389/fpubh.2020.576227

Barber, M., Bird, L., Fleming, J., Titterington-Giles, E., Edwards, E., & Leyland, C. (2021). Gravity assist: Propelling higher education towards a brighter future - Office for Students (Worldwide). Office for Students. https://www.officeforstudents.org.uk/publications/gravity-assist-propelling-higher-education-towards-a-brighter-future/ .

Bennett, C., Khangura, S., Brehaut, J. C., Graham, I. D., Moher, D., Potter, B. K., & Grimshaw, J. M. (2011). Reporting guidelines for Survey Research: An analysis of published Guidance and Reporting Practices. PLOS Medicine , 8 (8), e1001069. https://doi.org/10.1371/journal.pmed.1001069 .


Björk, B. C., & Solomon, D. (2013). The publishing delay in scholarly peer-reviewed journals. Journal of Informetrics , 7 (4), 914–923. https://doi.org/10.1016/j.joi.2013.09.001 .

Blinova, O. (2022). What COVID taught us about assessment: Students’ perceptions of academic integrity in distance learning. INTED2022 Proceedings, 6214–6218. https://doi.org/10.21125/inted.2022.1576

Boeker, M., Vach, W., & Motschall, E. (2013). Google Scholar as replacement for systematic literature searches: Good relative recall and precision are not enough. BMC Medical Research Methodology , 13 , 131. https://doi.org/10.1186/1471-2288-13-131 .

Bowman, E. (2022, August 26). Scanning students’ rooms during remote tests is unconstitutional, judge rules. NPR . https://www.npr.org/2022/08/25/1119337956/test-proctoring-room-scans-unconstitutional-cleveland-state-university .

Bretag, T., Harper, R., Burton, M., Ellis, C., Newton, P., Rozenberg, P., Saddiqui, S., & van Haeringen, K. (2019). Contract cheating: A survey of australian university students. Studies in Higher Education , 44 (11), 1837–1856. https://doi.org/10.1080/03075079.2018.1462788 .

Brown, M., Hoon, A., Edwards, M., Shabu, S., Okoronkwo, I., & Newton, P. M. (2022). A pragmatic evaluation of university student experience of remote digital learning during the COVID-19 pandemic, focusing on lessons learned for future practice. EdArXiv . https://doi.org/10.35542/osf.io/62hz5 .

Burgason, K. A., Sefiha, O., & Briggs, L. (2019). Cheating is in the Eye of the beholder: An evolving understanding of academic misconduct. Innovative Higher Education , 44 (3), 203–218. https://doi.org/10.1007/s10755-019-9457-3 .

Butler-Henderson, K., & Crawford, J. (2020). A systematic review of online examinations: A pedagogical innovation for scalable authentication and integrity. Computers & Education , 159 , 104024. https://doi.org/10.1016/j.compedu.2020.104024 .

Case, C. J., King, D. L., & Case, J. A. (2019). E-Cheating and Undergraduate Business Students: Trends and Role of Gender. Journal of Business and Behavioral Sciences , 31 (1). https://www.proquest.com/openview/9fcc44254e8d6d202086fc58818fab5d/1?pq-origsite=gscholar&cbl=2030637 .

Chiang, F. K., Zhu, D., & Yu, W. (2022). A systematic review of academic dishonesty in online learning environments. Journal of Computer Assisted Learning , 38 (4), 907–928. https://doi.org/10.1111/jcal.12656 .

Corbyn, Z. (2022, August 26). ‘I’m afraid’: Critics of anti-cheating technology for students hit by lawsuits. The Guardian . https://www.theguardian.com/us-news/2022/aug/26/anti-cheating-technology-students-tests-proctorio .

Corrigan-Gibbs, H., Gupta, N., Northcutt, C., Cutrell, E., & Thies, W. (2015a). Measuring and maximizing the effectiveness of Honor Codes in Online Courses. Proceedings of the Second (2015) ACM Conference on Learning @ Scale , 223–228. https://doi.org/10.1145/2724660.2728663 .

Corrigan-Gibbs, H., Gupta, N., Northcutt, C., Cutrell, E., & Thies, W. (2015b). Deterring cheating in online environments. ACM Transactions on Computer-Human Interaction, 22(6), 28:1–28:23. https://doi.org/10.1145/2810239

Costley, J. (2019). Student perceptions of academic dishonesty at a Cyber-University in South Korea. Journal of Academic Ethics , 17 (2), 205–217. https://doi.org/10.1007/s10805-018-9318-1 .

Curtin, R., Presser, S., & Singer, E. (2000). The Effects of Response Rate Changes on the index of consumer sentiment. Public Opinion Quarterly , 64 (4), 413–428. https://doi.org/10.1086/318638 .

Curtis, G. J., McNeill, M., Slade, C., Tremayne, K., Harper, R., Rundle, K., & Greenaway, R. (2022). Moving beyond self-reports to estimate the prevalence of commercial contract cheating: An australian study. Studies in Higher Education , 47 (9), 1844–1856. https://doi.org/10.1080/03075079.2021.1972093 .

Dawson, R. J. (2020). Defending Assessment Security in a Digital World: Preventing E-Cheating and Supporting Academic Integrity in Higher Education (1st ed.). Routledge. https://www.routledge.com/Defending-Assessment-Security-in-a-Digital-World-Preventing-E-Cheating/Dawson/p/book/9780367341527 .

Dendir, S., & Maxwell, R. S. (2020). Cheating in online courses: Evidence from online proctoring. Computers in Human Behavior Reports , 2 , 100033. https://doi.org/10.1016/j.chbr.2020.100033 .

Dumulescu, D., & Muţiu, A. I. (2021). Academic Leadership in the Time of COVID-19—Experiences and Perspectives. Frontiers in Psychology, 12. https://doi.org/10.3389/fpsyg.2021.648344.

Ebaid, I. E. S. (2021). Cheating among Accounting Students in Online Exams during Covid-19 pandemic: Exploratory evidence from Saudi Arabia. Asian Journal of Economics Finance and Management , 9–19.

Elsalem, L., Al-Azzam, N., Jum’ah, A. A., & Obeidat, N. (2021). Remote E-exams during Covid-19 pandemic: A cross-sectional study of students’ preferences and academic dishonesty in faculties of medical sciences. Annals of Medicine and Surgery , 62 , 326–333. https://doi.org/10.1016/j.amsu.2021.01.054 .

Fanelli, D. (2009). How many scientists fabricate and falsify Research? A systematic review and Meta-analysis of Survey Data. PLOS ONE , 4 (5), e5738. https://doi.org/10.1371/journal.pone.0005738 .

Gardner, W. M., Roper, J. T., Gonzalez, C. C., & Simpson, R. G. (1988). Analysis of cheating on academic assignments. The Psychological Record , 38 (4), 543–555. https://doi.org/10.1007/BF03395046 .

Garg, M., & Goel, A. (2022). A systematic literature review on online assessment security: Current challenges and integrity strategies. Computers & Security , 113 , 102544. https://doi.org/10.1016/j.cose.2021.102544 .

Gaskill, M. (2014). Cheating in Business Online Learning: Exploring Students’ Motivation, Current Practices and Possible Solutions. Theses, Student Research, and Creative Activity: Department of Teaching, Learning and Teacher Education . https://digitalcommons.unl.edu/teachlearnstudent/35 .

Ghias, K., Lakho, G. R., Asim, H., Azam, I. S., & Saeed, S. A. (2014). Self-reported attitudes and behaviours of medical students in Pakistan regarding academic misconduct: A cross-sectional study. BMC Medical Ethics , 15 (1), 43. https://doi.org/10.1186/1472-6939-15-43 .

Giluk, T. L., & Postlethwaite, B. E. (2015). Big five personality and academic dishonesty: A meta-analytic review. Personality and Individual Differences , 72 , 59–67. https://doi.org/10.1016/j.paid.2014.08.027 .

Goff, D., Johnston, J., & Bouboulis, B. (2020). Maintaining academic Standards and Integrity in Online Business Courses. International Journal of Higher Education , 9 (2). https://econpapers.repec.org/article/jfrijhe11/v_3a9_3ay_3a2020_3ai_3a2_3ap_3a248.htm .

Goyder, J., Warriner, K., & Miller, S. (2002). Evaluating Socio-economic status (SES) Bias in Survey Nonresponse. Journal of Official Statistics , 18 (1), 1–11.

Grudniewicz, A., Moher, D., Cobey, K. D., Bryson, G. L., Cukier, S., Allen, K., Ardern, C., Balcom, L., Barros, T., Berger, M., Ciro, J. B., Cugusi, L., Donaldson, M. R., Egger, M., Graham, I. D., Hodgkinson, M., Khan, K. M., Mabizela, M., Manca, A., & Lalu, M. M. (2019). Predatory journals: No definition, no defence. Nature , 576 (7786), 210–212. https://doi.org/10.1038/d41586-019-03759-y .

Haddaway, N. R., Collins, A. M., Coughlin, D., & Kirk, S. (2015). The role of Google Scholar in evidence reviews and its applicability to Grey Literature Searching. Plos One , 10 (9), https://doi.org/10.1371/journal.pone.0138237 .

Halbesleben, J. R. B., & Whitman, M. V. (2013). Evaluating Survey Quality in Health Services Research: A decision Framework for assessing Nonresponse Bias. Health Services Research , 48 (3), 913–930. https://doi.org/10.1111/1475-6773.12002 .

Henry, J. (2022, July 17). Universities “turn blind eye to online exam cheats” as fraud rises. Mail Online . https://www.dailymail.co.uk/news/article-11021269/Universities-turning-blind-eye-online-exam-cheats-studies-rates-fraud-risen.html .

Holden, O. L., Norris, M. E., & Kuhlmeier, V. A. (2021). Academic Integrity in Online Assessment: A Research Review. Frontiers in Education, 6. https://doi.org/10.3389/feduc.2021.639814.

Jamali, H. R., & Nabavi, M. (2015). Open access and sources of full-text articles in Google Scholar in different subject fields. Scientometrics , 105 (3), 1635–1651. https://doi.org/10.1007/s11192-015-1642-2 .

Janke, S., Rudert, S. C., Petersen, Ä., Fritz, T. M., & Daumiller, M. (2021). Cheating in the wake of COVID-19: How dangerous is ad-hoc online testing for academic integrity? Computers and Education Open , 2 , 100055. https://doi.org/10.1016/j.caeo.2021.100055 .

Jantos, A. (2021). Motives for cheating in summative e-assessment in higher education - A quantitative analysis. EDULEARN21 Proceedings, 8766–8776. https://doi.org/10.21125/edulearn.2021.1764.

Jenkins, B. D., Golding, J. M., Le Grand, A. M., Levi, M. M., & Pals, A. M. (2022). When opportunity knocks: College students’ cheating amid the COVID-19 pandemic. Teaching of Psychology, 00986283211059067. https://doi.org/10.1177/00986283211059067.

Jones, I. S., Blankenship, D., & Hollier, G. (2013). Am I cheating? An analysis of online student perceptions of their behaviors and attitudes. Proceedings of ASBBS, 59–69. http://asbbs.org/files/ASBBS2013V1/PDF/J/Jones_Blankenship_Hollier(P59-69).pdf.

Kerkvliet, J. (1994). Cheating by Economics students: A comparison of Survey results. The Journal of Economic Education , 25 (2), 121–133. https://doi.org/10.1080/00220485.1994.10844821 .

King, D. L., & Case, C. J. (2014). E-CHEATING: INCIDENCE AND TRENDS AMONG COLLEGE STUDENTS. Issues in Information Systems , 15 (1), 20–27. https://doi.org/10.48009/1_iis_2014_20-27 .

Knox, P. (2021). Students “taking it in turns to answer exam questions” during home tests. The Sun . https://www.thesun.co.uk/news/15413811/students-taking-turns-exam-questions-cheating-lockdown/ .

Larkin, C., & Mintu-Wimsatt, A. (2015). Comparing cheating behaviors among graduate and undergraduate Online Business Students—ProQuest. Journal of Higher Education Theory and Practice , 15 (7), 54–62.

Marano, E., Newton, P. M., Birch, Z., Croombs, M., Gilbert, C., & Draper, M. J. (2023). What is the Student Experience of Remote Proctoring? A Pragmatic Scoping Review . EdArXiv. https://doi.org/10.35542/osf.io/jrgw9 .

Moher, D., Liberati, A., Tetzlaff, J., Altman, D. G., & The PRISMA Group. (2009). Preferred reporting items for systematic reviews and meta-analyses: The PRISMA Statement. PLOS Medicine, 6(7), e1000097. https://doi.org/10.1371/journal.pmed.1000097.

Morales-Martinez, G. E., Lopez-Ramirez, E. O., & Mezquita-Hoyos, Y. N. (2019). Cognitive mechanisms underlying the engineering students’ desire to cheat during online and onsite statistics exams, 8(4), 1145–1158.

Mortaz Hejri, S., Zendehdel, K., Asghari, F., Fotouhi, A., & Rashidian, A. (2013). Academic disintegrity among medical students: A randomised response technique study. Medical Education , 47 (2), 144–153. https://doi.org/10.1111/medu.12085 .

Newstead, S. E., Franklyn-Stokes, A., & Armstead, P. (1996). Individual differences in student cheating. Journal of Educational Psychology , 88 , 229–241. https://doi.org/10.1037/0022-0663.88.2.229 .

Newton, P. M. (2018). How Common Is Commercial Contract Cheating in Higher Education and Is It Increasing? A Systematic Review. Frontiers in Education, 3. https://doi.org/10.3389/feduc.2018.00067.

Newton, P. M., & Salvi, A. (2020). How Common Is Belief in the Learning Styles Neuromyth, and Does It Matter? A Pragmatic Systematic Review. Frontiers in Education , 5 . https://doi.org/10.3389/feduc.2020.602451 .

Nguyen, J. G., Keuseman, K. J., & Humston, J. J. (2020). Minimize Online cheating for online assessments during COVID-19 pandemic. Journal of Chemical Education , 97 (9), 3429–3435. https://doi.org/10.1021/acs.jchemed.0c00790 .

Noorbehbahani, F., Mohammadi, A., & Aminazadeh, M. (2022). A systematic review of research on cheating in online exams from 2010 to 2021. Education and Information Technologies . https://doi.org/10.1007/s10639-022-10927-7 .

Owens, H. (2015). Cheating within Online Assessments: A Comparison of Cheating Behaviors in Proctored and Unproctored Environment. Theses and Dissertations . https://scholarsjunction.msstate.edu/td/1049 .

Page, M. J., McKenzie, J. E., Bossuyt, P. M., Boutron, I., Hoffmann, T. C., Mulrow, C. D., Shamseer, L., Tetzlaff, J. M., Akl, E. A., Brennan, S. E., Chou, R., Glanville, J., Grimshaw, J. M., Hróbjartsson, A., Lalu, M. M., Li, T., Loder, E. W., Mayo-Wilson, E., McDonald, S., & Moher, D. (2021). The PRISMA 2020 statement: An updated guideline for reporting systematic reviews. BMJ, 372, n71. https://doi.org/10.1136/bmj.n71.

Park, E. J., Park, S., & Jang, I. S. (2013). Academic cheating among nursing students. Nurse Education Today , 33 (4), 346–352. https://doi.org/10.1016/j.nedt.2012.12.015 .

Pokhrel, S., & Chhetri, R. (2021). A Literature Review on Impact of COVID-19 pandemic on teaching and learning. Higher Education for the Future , 8 (1), 133–141. https://doi.org/10.1177/2347631120983481 .

Rice, D. B., Skidmore, B., & Cobey, K. D. (2021). Dealing with predatory journal articles captured in systematic reviews. Systematic Reviews , 10 , 175. https://doi.org/10.1186/s13643-021-01733-2 .

Romaniuk, M. W., & Łukasiewicz-Wieleba, J. (2022). Remote and stationary examinations in the opinion of students. International Journal of Electronics and Telecommunications , 68 (1), 69.

Sax, L. J., Gilmartin, S. K., & Bryant, A. N. (2003). Assessing response Rates and Nonresponse Bias in web and paper surveys. Research in Higher Education , 44 (4), 409–432. https://doi.org/10.1023/A:1024232915870 .

Scheers, N. J., & Dayton, C. M. (1987). Improved estimation of academic cheating behavior using the randomized response technique. Research in Higher Education , 26 (1), 61–69. https://doi.org/10.1007/BF00991933 .

Shute, V. J., & Kim, Y. J. (2014). Formative and Stealth Assessment. In J. M. Spector, M. D. Merrill, J. Elen, & M. J. Bishop (Eds.), Handbook of Research on Educational Communications and Technology (pp. 311–321). Springer. https://doi.org/10.1007/978-1-4614-3185-5_25 .

Subotic, D., & Poscic, P. (2014). Academic dishonesty in a partially online environment: A survey. Proceedings of the 15th International Conference on Computer Systems and Technologies , 401–408. https://doi.org/10.1145/2659532.2659601 .

Surahman, E., & Wang, T. H. (2022). Academic dishonesty and trustworthy assessment in online learning: A systematic literature review. Journal of Computer Assisted Learning , n/a (n/a). https://doi.org/10.1111/jcal.12708 .

Tahsin, M. U., Abeer, I. A., & Ahmed, N. (2022). Note: Cheating and Morality Problems in the Tertiary Education Level: A COVID-19 Perspective in Bangladesh. ACM SIGCAS/SIGCHI Conference on Computing and Sustainable Societies (COMPASS) , 589–595. https://doi.org/10.1145/3530190.3534834 .

Valizadeh, M. (2022). CHEATING IN ONLINE LEARNING PROGRAMS: LEARNERS’ PERCEPTIONS AND SOLUTIONS. Turkish Online Journal of Distance Education , 23 (1), https://doi.org/10.17718/tojde.1050394 .

Varble, D. (2014). Reducing Cheating Opportunities in Online Test. Atlantic Marketing Journal , 3 (3). https://digitalcommons.kennesaw.edu/amj/vol3/iss3/9 .

Whisenhunt, B. L., Cathey, C. L., Hudson, D. L., & Needy, L. M. (2022). Maximizing learning while minimizing cheating: New evidence and advice for online multiple-choice exams. Scholarship of Teaching and Learning in Psychology , 8 (2), 140–153. https://doi.org/10.1037/stl0000242 .

Williams, M. W. M., & Williams, M. N. (2012). Academic dishonesty, Self-Control, and General Criminality: A prospective and retrospective study of academic dishonesty in a New Zealand University. Ethics & Behavior , 22 (2), 89–112. https://doi.org/10.1080/10508422.2011.653291 .

Witley, S. (2023). Virtual Exam Case Primes Privacy Fight on College Room Scans (1). https://news.bloomberglaw.com/privacy-and-data-security/virtual-exam-case-primes-privacy-fight-over-college-room-scans?context=search&index=1.

Zarzycka, E., Krasodomska, J., Mazurczak-Mąka, A., & Turek-Radwan, M. (2021). Distance learning during the COVID-19 pandemic: Students’ communication and collaboration and the role of social media. Cogent Arts & Humanities , 8 (1), 1953228. https://doi.org/10.1080/23311983.2021.1953228 .

Acknowledgements

We would like to acknowledge the efforts of all the researchers whose work was reviewed as part of this study, and their participants who gave up their time to generate the data reviewed here. We are especially grateful to Professor Carl Case at St Bonaventure University, NY, USA for his assistance clarifying the numbers of students who undertook online exams in King and Case ( 2014 ) and Case et al. ( 2019 ).

No funds, grants, or other support was received.

Author information

Authors and affiliations

Swansea University Medical School, Swansea, SA2 8PP, Wales, UK

Philip M. Newton & Keioni Essex

Contributions

PMN designed the study. PMN + KE independently searched for studies and extracted data. PMN analysed data and wrote the results. KE checked analysis. PMN + KE drafted the introduction and methods. PMN wrote the discussion and finalised the manuscript.

Corresponding author

Correspondence to Philip M. Newton.

Ethics declarations

Conflict of interest

The authors have no relevant financial or non-financial interests to disclose.

Ethics approval and consent to participate

This paper involved secondary analysis of data already in the public domain, and so ethical approval was not necessary.

Additional information

Publisher’s note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.

Electronic Supplementary Material

Below is the link to the electronic supplementary material.

Supplementary Material 1

Supplementary Material 2

Rights and permissions

Open Access This article is licensed under a Creative Commons Attribution 4.0 International License, which permits use, sharing, adaptation, distribution and reproduction in any medium or format, as long as you give appropriate credit to the original author(s) and the source, provide a link to the Creative Commons licence, and indicate if changes were made. The images or other third party material in this article are included in the article’s Creative Commons licence, unless indicated otherwise in a credit line to the material. If material is not included in the article’s Creative Commons licence and your intended use is not permitted by statutory regulation or exceeds the permitted use, you will need to obtain permission directly from the copyright holder. To view a copy of this licence, visit http://creativecommons.org/licenses/by/4.0/ .

About this article

Newton, P.M., Essex, K. How Common is Cheating in Online Exams and did it Increase During the COVID-19 Pandemic? A Systematic Review. J Acad Ethics (2023). https://doi.org/10.1007/s10805-023-09485-5

Accepted: 17 June 2023

Published: 04 August 2023

DOI: https://doi.org/10.1007/s10805-023-09485-5


Keywords: Academic Integrity, Distance Learning, Digital Education

Essay on Cheating

Students are often asked to write an essay on cheating in their schools and colleges. If you’re looking for the same, we have created 100-word, 250-word, and 500-word essays on the topic.

Let’s take a look…

100-Word Essay on Cheating

What Is Cheating?

Cheating is acting dishonestly to gain an unfair advantage. In school, this might mean looking at someone else’s test answers or using a hidden note during an exam. It’s not just about breaking rules; it’s about not being true to yourself or others.

Why People Cheat

Some students cheat because they feel pressure to get good grades. Others might think they won’t get caught. Sometimes, they don’t understand the work and are afraid to ask for help. But cheating doesn’t solve these problems; it only hides them.

Effects of Cheating

Cheating can lead to trouble in school, like failing a test or even being kicked out. It also means a person isn’t learning what they should. Over time, if they keep cheating, they might find it hard to trust others or feel good about themselves.

Being Honest

It’s better to be honest and do your own work. If you’re having trouble, it’s okay to ask for help. Learning from mistakes is part of growing up. When you’re honest, you can be proud of your hard work and the grades you earn.

Also check:

  • Paragraph on Cheating

250-Word Essay on Cheating

Cheating is when someone acts dishonestly to gain an unfair advantage. It can happen in school, sports, and even in relationships. In school, it usually means copying someone else’s work or using secret notes during a test.

Why Do People Cheat?

People cheat for many reasons. Some might feel pressure to get good grades or win a game. Others might think they won’t get caught or it’s the only way to succeed. But even if it seems like a quick solution, it’s not fair to others and can lead to trouble.

Cheating can make things worse. If you cheat in school, you might not learn what you’re supposed to. This can make future classes really hard. If you get caught, you could get a zero on your test, fail the class, or even get kicked out of school.

It’s much better to be honest and do your own work. This way, you really learn and can feel proud of what you’ve done. If something is hard, it’s okay to ask for help instead of cheating.

Cheating might seem like an easy way out, but it’s not worth it. It’s not fair to others, and it doesn’t help you learn. Being honest is the best choice, even if it’s not the easiest one.

500-Word Essay on Cheating

Cheating is when someone acts dishonestly or unfairly to gain an advantage. It can happen in many places, like schools, sports, and games. In school, it often means breaking the rules to do better on a test or homework. For example, a student might look at someone else’s paper during a test or use a secret note when they’re not supposed to.

People cheat for different reasons. Some might feel a lot of pressure to get good grades or to win, so they think cheating is the only way to succeed. Others might not have prepared well enough and cheat as a last-minute way to avoid failing. There are also those who cheat because they see others doing it and think it’s okay or because they don’t think they’ll get caught.

Cheating can hurt everyone involved. The person who cheats misses out on learning and doesn’t get to really show what they know. It can also make other people feel it’s unfair, especially if they worked hard and didn’t cheat. Over time, if a person keeps cheating, they might find it hard to trust others or to be trusted themselves.

Consequences of Cheating

When someone is caught cheating, there are usually consequences. In school, this might mean a zero on the test, a note to parents, or even being suspended. The consequences depend on how serious the cheating was and if the person has cheated before. The idea behind these consequences is to teach a lesson so that the person doesn’t cheat again.

Preventing Cheating

To stop cheating, schools and teachers can help by making clear rules about what is and isn’t allowed. They can also create a place where cheating is hard to do and where students feel they can do well without having to cheat. Parents can help by teaching their kids about honesty and by encouraging them to do their best, even if they don’t always win or get the highest grades.

Learning from Mistakes

If someone has cheated, it’s important for them to learn from their mistake. They should understand why it was wrong and how it affected others. It’s also important for them to work on being honest and to rebuild trust with their teachers and friends.

The Value of Honesty

Being honest is always the best choice. It leads to true success and helps build a good character. When people are honest, they can be proud of their work and achievements, knowing they did it all on their own. This kind of success feels much better than any that comes from cheating.

In conclusion, cheating is a choice that can have many negative effects. It’s important for everyone to understand why honesty is valuable and to work together to create a world where cheating is not needed or wanted. By doing this, we can all enjoy the rewards of our own hard work and be proud of what we achieve.

That’s it! I hope the essay helped you.

If you’re looking for more, here are essays on other interesting topics:

  • Essay on Challenges Of Prefectship
  • Essay on Charitable Trusts
  • Essay on Challenging Behaviour

Apart from these, you can look at all the essays by clicking here.

Happy studying!


The Lure of Divorce

Seven years into my marriage, I hit a breaking point — and had to decide whether life would be better without my husband in it.

By Emily Gould

In the summer of 2022, I lost my mind. At first, it seemed I was simply overwhelmed because life had become very difficult, and I needed to — had every right to — blow off some steam. Our family was losing its apartment and had to find another one, fast, in a rental market gone so wild that people were offering over the asking price on rent. My husband, Keith, was preparing to publish a book, Raising Raffi, about our son, a book he’d written with my support and permission but that, as publication loomed, I began to have mixed feelings about. To cope with the stress, I asked my psychiatrist to increase the dosage of the antidepressant I’d been on for years. Sometime around then, I started talking too fast and drinking a lot.

I felt invincibly alive, powerful, and self-assured, troubled only by impatience with how slowly everyone around me was moving and thinking. Drinking felt necessary because it slightly calmed my racing brain. Some days, I’d have drinks with breakfast, lunch, and dinner, which I ate at restaurants so the drink order didn’t seem too unusual. Who doesn’t have an Aperol spritz on the way home from the gym in the morning? The restaurant meals cost money, as did the gym, as did all the other random things I bought, spending money we didn’t really have on ill-fitting lingerie from Instagram and workout clothes and lots of planters from Etsy. I grew distant and impatient with Keith as the book’s publication approached, even as I planned a giant party to celebrate its launch. At the party, everyone got COVID. I handed out cigarettes from a giant salad bowl — I had gone from smoking once or twice a day to chain-smoking whenever I could get away with it. When well-meaning friends tried to point out what was going on, I screamed at them and pointed out everything that was wrong in their lives. And most crucially, I became convinced that my marriage was over and had been over for years.


I built a case against my husband in my mind. This book of his was simply the culmination of a pattern: He had always put his career before mine; while I had tended to our children during the pandemic, he had written a book about parenting. I tried to balance writing my own novel with drop-offs, pickups, sick days, and planning meals and shopping and cooking, most of which had always been my primary responsibility since I was a freelancer and Keith had a full-time job teaching journalism. We were incompatible in every way, except that we could talk to each other as we could to no one else, but that seemed beside the point. More relevant: I spent money like it was water, never budgeting, leaving Keith to make sure we made rent every month. Every few months, we’d have a fight about this and I’d vow to change; some system would be put in place, but it never stuck. We were headed for disaster, and finally it came.

Our last fight happened after a long day spent at a wedding upstate. I’d been drinking, first spiked lemonade at lunch alone and then boxed wine during the wedding reception, where I couldn’t eat any of the food — it all contained wheat, and I have celiac disease. When we got back, late, to the house where we were staying, I ordered takeout and demanded he go pick it up for me. Calling from the restaurant, he was incensed. Did I know how much my takeout order had cost? I hadn’t paid attention as I checked boxes in the app, nor had I realized that our bank account was perilously low — I never looked at receipts or opened statements. Not knowing this, I felt like he was actually denying me food, basic sustenance. It was the last straw. I packed a bag as the kids played happily with their cousins downstairs, then waited by the side of the road for a friend who lived nearby to come pick me up, even as Keith stood there begging me to stay. But his words washed over me; I was made of stone. I said it was over — really over. This was it, the definitive moment I’d been waiting for. I had a concrete reason to leave.

A few days later, still upstate at my friend’s house, I had a Zoom call with my therapist and my psychiatrist, who both urged me in no uncertain terms to check myself into a psychiatric hospital. Even I couldn’t ignore a message that clear. My friend drove me to the city, stopping for burgers along the way — I should have relished the burger more, as it was some of the last noninstitutional food I would eat for a long time — and helped me check into NYU Langone. My bags were searched, and anything that could be used as a weapon was removed, including my mascara. I spent my first night there in a gown in a cold holding room with no phone, nothing but my thoughts. Eventually, a bed upstairs became free and I was brought to the psych ward, where I was introduced to a roommate, had blood drawn, and was given the first of many pills that would help me stop feeling so irrepressibly energetic and angry. They started me on lithium right away. In a meeting with a team of psychiatrists, they broke the news: I had been diagnosed with bipolar disorder; they weren’t sure which kind yet. They gave me a nicotine patch every few hours plus Klonopin and Seroquel and lithium.

I wasn’t being held involuntarily, which meant I could write letters on an official form explaining why I ought to be released, which the psychiatrists then had three days to consider. I attached extra notebook pages to the letters explaining that I was divorcing my husband and was terrified I would never be able to see my kids again if I was declared unfit because I was insane. These letters did not result in my release; if anything, they prolonged my stay. I got my phone back — it would soon be revoked again, wisely — but in that brief interim, I sent out a newsletter to my hundreds of subscribers declaring that I was getting a divorce and asking them to Venmo me money for the custody battle I foresaw. In this newsletter, I also referenced Shakespeare. The drugs clearly had not kicked in yet. I cycled through three different roommates, all of whom were lovely, though I preferred the depressed one to the borderline ones. We amused ourselves during the day by going to art therapy, music therapy, and meetings with our psychiatrists. I made a lot of beaded bracelets.

In the meetings with the shrinks, I steadfastly maintained that I was sane and that my main problem was the ending of my marriage. I put Keith, and my mother, on a list of people who weren’t allowed to visit me. Undaunted, Keith brought me gluten-free egg sandwiches in the morning, which I grudgingly ate — anything for a break from the hospital food. My parents came up from D.C. and helped Keith take care of our children. I was in the hospital for a little more than three weeks, almost the entire month of October, longer than I’d ever been away from my kids before in their lives. I celebrated my 41st birthday in the hospital and received a lot of very creative cards that my fellow crazies had decorated during art therapy. Eventually, the drugs began to work: I could tell they were working because instead of feeling energetic, I suddenly couldn’t stop crying. The tears came involuntarily, like vomit. I cried continuously for hours and had to be given gabapentin in order to sleep.

On the day I was released, I didn’t let anyone pick me up. I expected the superhuman strength I’d felt for months to carry me, but it was gone, lithiumed away. Instead, I felt almost paralyzed as I carried my bags to a cab. When I arrived at my apartment, I couldn’t figure out where I should sleep. It didn’t feel like my home anymore. We couldn’t afford to live separately, even temporarily, but the one thing that our somewhat decrepit, inconveniently located new apartment had in its favor was two small attic bedrooms and one larger bedroom downstairs. I claimed this downstairs room for myself and began to live there alone, coming into contact with Keith only when we had to be together with our children.

You might assume that my fixation on divorce would have subsided now that my mental health had stabilized and I was on strong antipsychotic medication. But I still did not want to stay in my marriage. If anything, I felt a newfound clarity: Keith and I had fundamentally incompatible selves. Our marriage had been built on a flaw. My husband was older, more established and successful in his career. These were the facts, so it had to be my job to do more of the work at home. Unless, of course, I decided to take myself and my work as seriously as he took his. But that was unappealing; I had managed to publish three books before turning 40, but I didn’t want to work all the time, like he does.

I wondered if my marriage would always feel like a competition and if the only way to call the competition a draw would be to end it.

We picked the kids up from school and dropped them off, or really mostly Keith did. I appeared at meals and tried to act normal. I was at a loss for what to do much of the time. I attended AA meetings and the DBT meetings required by the hospital outpatient program, and I read. I read books about insanity: Darkness Visible, The Bell Jar, An Unquiet Mind, Postcards From the Edge. I tried to understand what was happening to me, but nothing seemed to resonate until I began to read books about divorce. I felt I was preparing myself for what was coming. The first book I read was Rachel Cusk’s Aftermath, which has become the go-to literary divorce bible since its 2012 publication. In it, Cusk describes the way her life shattered and recomposed after the dissolution of her marriage, when her daughters were still very young. She makes the case for the untenability of her relationship by explaining that men and women are fundamentally unequal. She posits that men and women who marry and have children are perpetually fighting separate battles, lost to each other: “The baby can seem like something her husband has given her as a substitute for himself, a kind of transitional object, like a doll, for her to hold so that he can return to the world. And he does, he leaves her, returning to work, setting sail for Troy. He is free, for in the baby the romance of man and woman has been concluded: each can now do without the other.”

At our relationship’s lowest moments, this metaphor had barely been a metaphor. I remembered, the previous winter, Keith going off on a reporting trip to Ukraine at the very beginning of the war, leaving me and the kids with very little assurance of his safety. I had felt okay for the first couple of days until I heard on the news of bombing very close to where he was staying. After that, I went and bummed a cigarette from a neighbor, leaving the kids sleeping in their beds in order to do so. It was my first cigarette in 15 years. Though that had been the winter before my mania began, I believe the first seeds of it were sown then: leaving the children, smoking the cigarette, resenting Keith for putting himself in harm’s way and going out into the greater world while I tended to lunches, homework, and laundry as though everything were normal.

In Nora Ephron’s Heartburn, as in Aftermath, I found an airtight case for divorce. The husband was the villain and the wife the wronged party, and the inevitable result was splitting up. I felt an echo of this later on when I read Lyz Lenz’s polemic This American Ex-Wife, out this month, marketed as “a deeply validating manifesto on the gender politics of marriage (bad) and divorce (actually pretty good!).” The book begins by detailing how Lenz’s husband rarely did household chores and hid belongings of hers that he didn’t like — e.g., a mug that said WRITE LIKE A MOTHERFUCKER — in a box in the basement. “I didn’t want to waste my one wild and precious life telling a grown man where to find the ketchup,” Lenz writes. “What was compelling about my marriage wasn’t its evils or its villains, but its commonplace horror.”

This was not quite the way I felt. Even though I could not stand to see my husband’s face or hear his voice, even though I still felt the same simmering resentment I had since I entered the hospital, I also found myself feeling pangs of sympathy for him. After all, he was going through this too. When we were inevitably together, at mealtimes that were silent unless the children spoke, I could see how wounded he was, how he was barely keeping it together. His clothes hung off his gaunt frame. And at night, when we passed in the kitchen making cups of tea that we would take to our respective rooms, he sometimes asked me for a hug, just a hug. One time I gave in and felt his ribs through his T-shirt. He must have lost at least 15 pounds.

It began to seem like I only ever talked to friends who had been through divorces or were contemplating them. One friend who didn’t know whether to split up with her husband thought opening their marriage might be the answer. Another friend described the ease of sharing custody of his young daughter, then admitted that he and his ex-wife still had sex most weekends. In my chronically undecided state, I admired both of these friends who had found, or might have found, a way to split the difference. Maybe it was possible to break up and remain friends with an ex, something that had never happened to me before in my entire life. Maybe it was possible to be married and not married at the same time. Then I went a little further in my imagination, and the idea of someone else having sex with my husband made me want to gag with jealousy. Maybe that meant something. I was so confused, and the confusion seemed to have no end.

I read more books about divorce. I received an early copy of Sarah Manguso’s Liars, marketed as “a searing novel about being a wife, a mother, and an artist, and how marriage makes liars out of us all.” In it, John, a creative dilettante, and Jane, a writer, meet and soon decide to marry. Liars describes their marriage from beginning to end, a span of almost 15 years, and is narrated by Jane. The beginning of their relationship is delirious: “I tried to explain that first ferocious hunger and couldn’t. It came from somewhere beyond reason.” But the opening of that book also contains a warning. “Then I married a man, as women do. My life became archetypal, a drag show of nuclear familyhood. I got enmeshed in a story that had already been told ten billion times.” I felt perversely reassured that I was merely adding another story to the 10 billion. It made it seem less like it was my fault.

The beginning of my relationship with my husband wasn’t that dramatic or definitive. I thought I was getting into something casual with someone I didn’t even know if I particularly liked, much less loved, but was still oddly fascinated by. I wanted to see the way he lived, to see if I could emulate it and become more like him. He lived with roommates in his 30s — well, that was the price you paid if you wanted to do nothing but write. I wanted what he had, his seriousness about his work. We went on dates where we both sat with our laptops in a café, writing, and this was somehow the most romantic thing I’d ever experienced. On our third date, we went to his father’s home on Cape Cod to dog-sit for a weekend, and it was awkward in the car until we realized we were both thinking about the same Mary Gaitskill story, “A Romantic Weekend,” in which a couple with dramatically mismatched needs learn the truth about each other through painful trial and error. Our weekend was awkward, too, but not nearly as awkward as the one in the story. On the way home, I remember admiring Keith’s driving, effortless yet masterful. I trusted him in the car completely. A whisper of a thought: He would make a good father.

In Liars, cracks begin to form almost immediately, even before John and Jane get engaged; she is accepted to a prestigious fellowship and he isn’t, and he is forthright about his fear that she will become more successful than he is: “A moment later he said he didn’t want to be the unsuccessful partner of the successful person. Then he apologized and said that he’d just wanted to be honest. I said, It was brave and considerate to tell me.”

Through the next few years, so gradually that it’s almost imperceptible, John makes it impossible for Jane to succeed. He launches tech companies that require cross-country moves, forcing Jane to bounce between adjunct-teaching gigs. And then, of course, they have a baby. The problem with the baby is that Jane wants everything to be perfect for him and throws herself into creating a tidy home and an ideal child-development scenario, whereas John works more and more, moving the family again as one start-up fails and another flourishes. Jane begins to wonder whether she has created a prison for herself but pacifies herself with the thought that her situation is normal: “No married woman I knew was better off, so I determined to carry on. After all, I was a control freak, a neat freak, a crazy person.” The story John tells her about herself becomes her own story for a while. For a while, it’s impossible to know whose story is the truth.

I thought about Keith’s side of the story when I read Liars. Maybe it was the lack of alcohol’s blur that enabled me to see this clearly for the first time — I began to see how burdened he had been, had always been, with a partner who refused to plan for the future and who took on, without being asked, household chores that could just as easily have been distributed evenly. Our situation had never been as clear-cut as it was for Lyz Lenz; Keith had never refused to take out the trash or hidden my favorite mug. But he worked more and later hours, and my intermittent book advances and freelance income could not be counted on to pay our rent. As soon as we’d had a child, he had been shunted into the role of breadwinner without choosing it or claiming it. At first, I did all the cooking because I liked cooking and then, when I stopped liking cooking, I did it anyway out of habit. For our marriage to change, we would have needed to consciously decide to change it, insofar as our essential natures and our financial situation would allow. But when were we supposed to have found the time to do that? It was maddening that the root of our fracture was so commonplace and clichéd — and that even though the problem was ordinary, I still couldn’t think my way out of it.

Splinters: Another Kind of Love Story, by Leslie Jamison, is in some ways the successor to Aftermath — the latest divorce book by a literary superstar. It is mostly an account of Jamison’s passionate marriage to a fellow writer, C., and the way that marriage fell apart after her career accelerated and they had a child together. It then details her first months of life as a single mother and her forays into dating. In it, she is strenuously fair to C., taking much of the blame for the dissolution of their marriage. But she can’t avoid describing his anger that her book merits an extensive tour, while his novel — based on his relationship with his first wife, who had died of leukemia — fails commercially. “It didn’t get the reception he had hoped for,” Jamison writes, and now, “I could feel him struggling. He wanted to support me, but there was a thorn in every interview.” C. grows distant, refusing to publicly perform the charming self that Jamison fell in love with. “I wished there was a way to say, Your work matters, that didn’t involve muting my own,” Jamison writes.

For all my marriage’s faults, we never fought in public. Friends encouraged us to reconcile, saying, “You always seemed so good together.” (As if there were another way to seem! Standing next to each other at a party, it had always been easy to relax because we couldn’t fight.) And we never did anything but praise each other’s work. Until this last book of my husband’s, that is. I had read Raising Raffi for the first time six months before it was published, while I was out of town for the weekend. I had, at that time, enjoyed reading it — it was refreshing, in a way, to see someone else’s perspective on a part of my own life. I even felt a certain relief that my child’s early years, in all their specificity and cuteness, had been recorded. This work had been accomplished, and I hadn’t had to do it! There had been only a slight pang in the background of that feeling that I hadn’t been the one to do it. But as publication drew nearer, the pang turned into outright anger. The opening chapter described my giving birth to our first son, and I didn’t realize how violated I felt by that until it was vetted by The New Yorker’s fact-checker after that section was selected as an excerpt for its website. Had a geyser of blood shot out of my vagina? I didn’t actually know. I had been busy at the time. I hung up on the fact-checker who called me, asking her to please call my husband instead. (In case you’re wondering, Keith has read this essay and suggested minimal changes.)

I related to the writers in Splinters trying to love each other despite the underlying thrum of competing ambitions. But most of all, Jamison’s book made me even more terrified about sharing custody. “There was only one time I got on my knees and begged. It happened in our living room, where I knelt beside the wooden coffee table and pleaded not to be away from her for two nights each week,” she writes. Envisioning a future in which we shared custody of our children made me cringe with horror. It seemed like absolute hell. At the time we separated, our younger son was only 4 years old and required stories and cuddles to get to bed. Missing a night of those stories seemed like a punishment neither of us deserved, and yet we would have to sacrifice time with our kids if we were going to escape each other, which seemed like the only possible solution to our problem. Thanksgiving rolled around, and I cooked a festive meal that we ate without looking at each other. Whenever I looked at Keith, I started to cry.

We decided to enter divorce mediation at the beginning of December. On Sixth Avenue, heading to the therapist’s office, we passed the hospital where I’d once been rushed for an emergency fetal EKG when I was pregnant with our first son. His heart had turned out to be fine. But as we passed that spot, I sensed correctly that we were both thinking of that moment, of a time when we had felt so connected in our panic and desperate hope, and now the invisible cord that had bound us had been, if not severed, shredded and torn. For a moment on the sidewalk there, we allowed ourselves to hold hands, remembering.

The therapist was a small older woman with short curly reddish hair. She seemed wise, like she’d seen it all and seen worse. I was the one who talked the most in that session, blaming Keith for making me go crazy, even though I knew this wasn’t technically true or possible: I had gone crazy from a combination of sky-high stress and a too-high SSRI prescription and a latent crazy that had been in me, part of me, since long before Keith married me, since I was born. Still, I blamed his job, his book, his ambition and workaholism, which always surpassed my own efforts. I cried throughout the session; I think we both did. I confessed that I was not the primary wronged person in these negotiations, and to be fair I have to talk about why. Sometime post–Last Fight and pre-hospitalization, I had managed to cheat on my husband. I had been so sure we were basically already divorced that I justified the act to myself; I couldn’t have done it any other way. I had thought I might panic at the last minute or even throw up or faint, but I had gone through with it thanks to the delusional state I was in. There aren’t many more details anyone needs to know. It was just one time, and it was like a drug I used to keep myself from feeling sad about what was really happening. Anyway, there’s a yoga retreat center I’ll never be able to go to again in my life.

At the end of the session, we decided to continue with the therapist but in couples therapy instead of divorce mediation. It was a service she also provided, and as a bonus, it was $100 cheaper per session. She didn’t say why she made this recommendation, but maybe it was our palpable shared grief that convinced her that our marriage was salvageable. Or maybe it was that, despite everything I had told her in that session, she could see that, even in my profound sadness and anger, I looked toward Keith to complete my sentences when I was searching for the right word and that he did the same thing with me. As broken as we were, we were still pieces of one once-whole thing.

My husband would have to forgive me for cheating and wasting our money. I would have to forgive him for treading on my literary territory: our family’s life, my own life. My husband would have to forgive me for having a mental breakdown, leaving him to take care of our family on his own for a month, costing us thousands of uninsured dollars in hospital bills. I would have to forgive him for taking for granted, for years, that I would be available on a sick day or to do an early pickup or to watch the baby while he wrote about our elder son. I would have to forgive him for taking for granted that there would always be dinner on the table without his having to think about how it got there. He would have to forgive me for never taking out the recycling and never learning how to drive so that I could move the car during alternate-side parking. I would have to forgive him for usurping the time and energy and brain space with which I might have written a better book than his. Could the therapist help us overcome what I knew to be true: that we’d gone into marriage already aware that we were destined for constant conflict just because of who we are? The therapist couldn’t help me ask him to do more if I didn’t feel like I deserved it, if I couldn’t bring myself to ask him myself. I had to learn how to ask.

No one asked anything or forgave anything that day in the couples therapist’s office. After what felt like months but was probably only a few days, I was watching Ramy on my laptop in my downstairs-bedroom cave after the kids’ bedtime when some moment struck me as something Keith would love. Acting purely on impulse, I left my room and found him sitting on the couch, drinking tea. I told him I’d been watching this show I thought was funny and that he would really like it. Soon, we were sitting side by side on the couch, watching Ramy together. We went back to our respective rooms afterward, but still, we’d made progress.

After a few more weeks and a season’s worth of shared episodes of Ramy, I ventured for the first time upstairs to Keith’s attic room. It smelled alien to me, and I recognized that this was the pure smell of Keith, not the shared smell of the bedrooms in every apartment we’d lived in together. I lay down next to him in the mess of his bed. He made room for me. We didn’t touch, not yet. But we slept, that night, together. The next night, we went back to sleeping alone.

Pickups and drop-offs became evenly divided among me and Keith and a sitter. Keith learned to make spaghetti with meat sauce. He could even improvise other dishes, with somewhat less success, but he was improving. I made a conscious effort not to tidy the house after the children left for school. I made myself focus on my work even when there was chaos around me. Slowly, I began to be able to make eye contact with Keith again. At couples therapy, we still clutched tissue boxes in our hands, but we used them less. Our separate chairs inched closer together in the room.

That Christmas, we rented a tiny Airbnb near his dad’s house in Falmouth. It had only two bedrooms, one with bunk beds for the kids and one with a king-size bed that took up almost the entirety of the small room. We would have to share a bed for the duration of the trip. The decision I made to reach across the giant bed toward Keith on one of the last nights of the trip felt, again, impulsive. But there were years of information and habit guiding my impulse. Sex felt, paradoxically, completely comfortable and completely new, like losing my virginity. It felt like sleeping with a different person and also like sleeping with the same person, which made sense, in a way. We had become different people while somehow staying the same people we’d always been.

Slowly, over the course of the next months, I moved most of my things upstairs to his room, now our room. We still see the therapist twice a month. We talk about how to make things more equal in our marriage, how not to revert to old patterns. I have, for instance, mostly given up on making dinner, doing it only when it makes more sense in the schedule of our shared day or when I actually want to cook. It turns out that pretty much anyone can throw some spaghetti sauce on some pasta; it also turns out that the kids won’t eat dinner no matter who cooks it, and now we get to experience that frustration equally. Keith’s work is still more stable and prestigious than mine, but we conspire to pretend that this isn’t the case, making sure to leave space for my potential and my leisure. We check in to make sure we’re not bowing to the overwhelming pressure to cede our whole lives to the physical and financial demands, not to mention the fervently expressed wants, of our children. It’s the work that we’d never found time to do before, and it is work. The difference is that we now understand what can happen when we don’t do it. I’m always surprised by how much I initially don’t want to go to therapy and then by how much lighter I feel afterward. For now, those sessions are a convenient container for our marriage’s intractable defects so that we get to spend the rest of our time together focusing on what’s not wrong with us.

The downstairs bedroom is now dormant, a place for occasional guests to stay or for our elder son to lie in bed as he plays video games. Some of my clothes from a year earlier still fill the drawers, but none of it seems like mine. I never go into that room if I can help it. It was the room of my exile from my marriage, from my family. If I could magically disappear it from our apartment, I would do it in a heartbeat. And in the attic bedroom, we are together, not as we were before but as we are now.


Once a Cheater, Always a Cheater? Here’s What Couples Therapists Think

By Jenna Ryu

If you’re dating someone who’s cheated before, that doesn’t automatically mean the relationship is destined to fail. Still, knowing that your partner got a little too cozy with a coworker in the past, perhaps, or had a months-long physical affair can understandably make you paranoid about your future together.

Maybe the classic saying “Once a cheater, always a cheater” is lingering in your head. Contrary to popular belief, though, it’s not a universal truth, Kayla Knopp, PhD, co-founder and clinical psychologist at Enamory in San Diego, tells SELF. For starters, absolutes like “always” usually aren’t factual: A ton of jerks might have a track record of being unfaithful, sure, but many others probably learned from their mistakes, Dr. Knopp says.

Case in point: Her research at the University of Denver found that not all former cheaters repeated their behaviors in their next relationships. “Lots of people have a fear of being cheated on though, so believing in black-and-white rules [like ‘once a cheater, always a cheater’] can make them feel safer,” Dr. Knopp says.

Whether or not your significant other gets their shit together this time around can depend on a lot of different factors. Here, we asked couples counselors for some hopeful signs that they won’t repeat the same pattern with you.

1. They willingly open up about their past—because they want to, not because they have to.

“We’re not entitled to know everything about our significant others’ pasts,” Dr. Knopp says. Technically speaking, your partner isn’t obligated to disclose a previous emotional affair, for example—and it makes sense why they might be hesitant to own up to infidelity, especially if you just started dating. (After all, who wants to jeopardize a new relationship with old drama?)

That’s why, according to Dr. Knopp, “when someone voluntarily shares their mistakes, they’re much more likely to be a trustworthy partner in the future.” Getting ahead of things doesn’t guarantee they won’t do you dirty down the line, of course, but it does show that they’re confessing their indiscretion for the sake of being honest—and not out of pressure or obligation, Dr. Knopp says. (Plus, it’s an effort to make sure your relationship isn’t built on lies and secrets from the get-go, she adds.)

2. They know why they cheated in the first place.

Their explanation may not justify what they did, but it can tell you a whole lot about their ability to self-reflect and take responsibility for their actions—which are both essential steps toward addressing (and correcting) harmful behaviors, Dr. Knopp says.

“If [your partner] acts like it wasn’t within their control or claims they don’t know why they cheated, those are red flags,” she explains. On the flip side, someone who’s learned from their mistakes should be able to acknowledge their shortcomings by admitting that they liked the attention of the pretty bartender, for instance, instead of solely blaming tequila shots. Or, they might reveal that their low self-esteem—not their “unappreciative” or “distant” ex—caused them to seek validation from someone else.

3. They understand if you’re critical or skeptical of them.

If your partner gets defensive when you bring up their past, say, or guilt-trips you for needing space, this usually shows they’re not really taking responsibility, K’Hara McKinney, LMFT , a couples therapist in Los Angeles, tells SELF. That’s because true remorse involves owning up to your bad behavior —even if forgiveness isn’t guaranteed, McKinney says.

To figure out how sorry they truly are , she suggests paying extra attention to the way they talk about their infidelity: “Do they feel remorseful and regretful, or are they boastful or indifferent? Are they being secretive about the details or blaming anyone but themselves?” Someone who turned over a new leaf will be receptive to whatever reaction you have—disappointment, sadness, even straight-up rage —instead of arguing that you’re “blowing things out of proportion.” And ideally, they’ll be patient if you’re (understandably) reconsidering whether dating them is worth the risk, McKinney adds.

4. They only did it once.

Sometimes, the circumstances of an affair—like whether it was emotional or physical—don’t matter. (These forms of betrayal can be equally painful.) But certain details, like whether it was a one-time slip-up or a more persistent habit, can help you determine if their old ways are truly behind them, both McKinney and Dr. Knopp say.

“Someone with a longer pattern of infidelity is definitely more likely to cheat again than someone who’s only done it once—unless they’ve done serious work on themselves,” Dr. Knopp points out. That’s because repetitive behaviors usually indicate a deeper-rooted compulsion (like needing external validation or fearing commitment) compared to something situational that caused a momentary lapse in judgment, she says. Addressing serial infidelity usually requires a more long-term intervention, like therapy, Dr. Knopp adds, which brings us to our next and final pointer…

5. They’ve taken actions to repair their mistakes, like going to therapy.

Solemnly swearing that they’ll never cheat again or calling their infidelity the “biggest regret” of their life can be reassuring. However, these grand statements aren’t enough to prove they’ve changed their ways. Anyone—especially someone with a track record of deception—can talk the talk, but how do you know if they can walk the walk?

It’s less about what they say, and more about the steps they’ve taken to learn from (and own) their mistakes, according to McKinney and Dr. Knopp. Going to therapy is one green flag: This usually means they’ve done (or are doing) the work to address their behavior, McKinney says. Even if they haven’t seen a professional, other efforts that show they’re serious about making things right include apologizing to their former partner, reading books to better understand why they cheated, and consistently encouraging open communication or check-ins about your thoughts, feelings, and boundaries, Dr. Knopp adds.

None of these signs are foolproof, and the occasional outlier armed with charming lies and fake empathy might still fool you. But here’s the takeaway: As disheartening as a cheating past can be, your relationship isn’t necessarily doomed to repeat history—especially if your partner has shown they’re genuinely committed to doing better.



English test scandal: students wrongly accused of cheating launch legal action

Overseas students seek compensation from Home Office for unlawful detention and loss of earnings after it cancelled visas

A group of overseas students who were wrongly accused of cheating in the English language tests they were required to take to renew their study visas have launched legal proceedings against the Home Office , seeking compensation for unlawful detention and loss of earnings.

The government has made payments in at least two cases, but lawyers have expressed frustration at the department’s refusal to agree a standard settlement scheme for wrongly accused students, which they believe would speed up the process of securing justice.

The law firm Bindmans is representing 23 students who have already won immigration appeals, overturning the Home Office’s decision to cancel their visas amid cheating allegations, and it is pushing for the department to treat this as a group action.


Clients are seeking compensation for wrongful arrest, false imprisonment, loss of earnings (during the period their contested immigration status meant they were prohibited from working) and damage to their mental health.

Attempts to secure compensation come 10 years after the Home Office took steps to cancel the visas of about 35,000 international students, after a BBC documentary revealed evidence of cheating in some English language test centres. Although cheating clearly happened in some Home Office-approved test centres, thousands of students have spent years protesting that they were wrongly affected by the department’s decision to classify 97% of those who took the test as possible cheats.

Some of the firm’s clients were not initially told that they had been accused of cheating but were detained by immigration enforcement officers during dawn raids, with no clear indication of what had prompted the arrest, Alice Hardy, a partner at Bindmans, said.

“Our clients have been through hell. The Home Office deliberately concealed from them the fact that they had been accused of cheating, denying them the opportunity to defend themselves, and instead removed their immigration status with no in-country right of appeal. They lost everything as a result; homes, livelihoods, the right to work, study and pay rent. They suffered the shame and rejection of their families, relationship breakdowns, destitution and the torment of seeing everything they had worked for taken away from them,” Hardy said.

“These situations persisted for up to 10 years and caused untold suffering. It is now apparent that the allegations were based on thin evidence.”

The firm issued the 23 claims between October 2020 and March 2022 but only one case has settled. Lawyers had hoped to persuade the department to develop a scheme, based on the model of the Windrush compensation scheme, that would set out clear categories under which damages could be paid, which they hoped would speed up the process. The Home Office has rejected the lawyers’ proposal, Bindmans said.

“It is open to the Home Office to repair some of the damage done by apologising to our clients and agreeing a sensible, efficient settlement scheme to enable them to move on with their lives. It is deeply disappointing that they are declining to do that,” Hardy said.

At least one student represented by a different law firm has received compensation after an allegation of cheating in the test. Mohammad Bhuiyan received about £13,500 from the Home Office in 2021 for wrongful detention, having been held in immigration detention for 47 days over an accusation of cheating in an English language test.

A Home Office spokesperson said: “The 2014 investigation into the abuse of English language testing revealed systemic cheating which was indicative of significant organised fraud. Courts have consistently found the evidence was sufficient to take the action we did.”
