University News

Computer science tops in academic violations last year

Department uses software to detect cheating

By
Sports Editor
Monday, November 1, 2010

Forty-two students were cited with potential violations of the Academic Code last academic year — with nearly 70 percent of the cases coming from the Department of Computer Science, according to a faculty committee’s report released this month.

Of the cases referred to the committee, whose possible outcomes range from “no action taken” to “directed no credit with transcript notation,” 29 came from computer science classes. Half of all the cases heard by the committee last year ended in the latter outcome, the most severe punishment given.

Thomas Doeppner, co-chair of the Standing Committee on the Academic Code and associate professor of computer science, said computer science’s high number of cases is partly due to the department’s systematic oversight procedures. Foremost among those is the Measure of Software Similarity, or MOSS, which is a free tool developed by a computer science professor currently at Stanford University.

“It’s a resource that’s available to anyone, and it is used by many computer science departments across the country,” Doeppner said.
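The fingerprinting idea behind similarity detectors like MOSS can be illustrated with a toy sketch. The real tool's internals differ, and every name and parameter below is illustrative, but the basic approach is to normalize each submission so that renamed variables and reshuffled whitespace do not hide copying, hash overlapping character k-grams, keep a thinned-out ("winnowed") subset of those hashes as a fingerprint, and compare fingerprints across submissions.

```python
import re

def normalize(source: str) -> str:
    """Strip comments and whitespace, and collapse every identifier
    to the same token so renaming variables changes nothing."""
    source = re.sub(r"#.*", "", source)                 # drop comments
    source = re.sub(r"\b[A-Za-z_]\w*\b", "V", source)   # collapse names
    return re.sub(r"\s+", "", source)                   # drop whitespace

def fingerprints(source: str, k: int = 5, window: int = 4) -> set:
    """Hash every k-gram of the normalized text, then keep only the
    minimum hash in each sliding window (the winnowing step)."""
    text = normalize(source)
    hashes = [hash(text[i:i + k]) for i in range(len(text) - k + 1)]
    if not hashes:
        return set()
    picked = set()
    for i in range(max(len(hashes) - window + 1, 1)):
        picked.add(min(hashes[i:i + window]))
    return picked

def similarity(a: str, b: str) -> float:
    """Jaccard overlap of two fingerprint sets, from 0.0 to 1.0."""
    fa, fb = fingerprints(a), fingerprints(b)
    if not fa or not fb:
        return 0.0
    return len(fa & fb) / len(fa | fb)
```

Because the normalization step erases names and spacing, a submission copied with every variable renamed still produces an identical fingerprint, which is exactly the kind of disguise this family of tools is designed to defeat.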

But without a clear-cut mechanism like MOSS, some officials say it is difficult to catch plagiarizers in other academic departments.  

“It’s not that there is more cheating in computer science,” said Deputy Dean of the College Stephen Lassonde. “From what I know, the computer science department is the only one that checks systematically — they check if students are copying code. In all other cases, professors are reliant on seeing the cheating during exams, or, in terms of essays, they see phrases that seem to not be the student’s.”

The departmental breakdown of cases heard by the standing committee skews heavily toward the sciences. After the 29 cases in computer science, the next-highest totals were four from the Department of Chemistry and three from the Department of Psychology.

But Lassonde said he believes the most prominent form of cheating on campus is actually seen in humanities classes.  

“From what I’ve seen, it’s plagiarism performed on the Internet,” either intentionally or unintentionally, he said. “Students make the mistake these days of copying directly from an article into an essay.”  

One possible prevention measure against plagiarized essays is the use of a paid service such as Turnitin, a program that checks essays students submit online. Like MOSS, Turnitin checks student assignments for similarities, though Turnitin compares essays, not code.
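The essay-comparison idea can be sketched in a few lines. Turnitin's actual matching pipeline is proprietary, so this is only a minimal illustration of the underlying technique: split each essay into overlapping word n-grams ("shingles") and report what share of one essay's shingles also appear in a suspected source.

```python
def shingles(text: str, n: int = 3) -> set:
    """Overlapping word n-grams of the text, lowercased."""
    words = text.lower().split()
    return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

def overlap(essay: str, source: str, n: int = 3) -> float:
    """Fraction of the essay's n-grams that also appear in the source."""
    a, b = shingles(essay, n), shingles(source, n)
    return len(a & b) / len(a) if a else 0.0
```

A passage copied word-for-word from a source scores near 1.0, an original passage scores near 0.0, and anything in between flags the essay for a human reader's judgment.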

Doeppner said he sees the value in using Turnitin.

“If Brown was to use something like this, I think we would see collaboration cases in the rest of the University at the same level we see in computer science,” he said. “But without a tool like that, it’s really hard to find such cases.”

About 800,000 instructors at 9,500 schools worldwide use Turnitin, according to its website, but Lassonde said Brown faculty are not among the consumers.

“The program isn’t used by anyone on campus,” Lassonde said. “The University would have to pay thousands of dollars in order to use it.”

Just four miles away from campus, at Johnson & Wales University, administrators encounter a majority of their plagiarism cases in a different form.

“It’s been cheating on quizzes and tests,” said JWU Assistant Director of Student Conduct Briana Sevigny. “Students are sharing answers during exams, or they’re bringing in cheat sheets.”

Sevigny said the university, which uses Turnitin, has seen students move away from cheating on essays.

“I think what a lot of students are finding is that if they purchased a paper, that paper was plagiarized by the person who wrote it,” she said. “So it was a double whammy — not only did they pay for it, but they also paid for a plagiarized paper.”  

While improvements in technology have led to the creation of plagiarism-detection tools like MOSS and Turnitin, both Sevigny and Lassonde noted the role of the Internet in making it simpler for students to cheat.

“It’s required less creativity to cheat in the past 10 years or so,” Sevigny said.

“It’s easier to cheat,” Lassonde said. “A lot of people are sloppy in terms of how they compose papers.”  

At Brown, the lack of a computerized detection service outside of computer science places the brunt of plagiarism detection on professors in other departments.

“We rely on the expertise of their department members to discern when students are cheating,” Lassonde said.

But Doeppner noted that this is an imperfect way of identifying plagiarizers.

“Unless it’s something that the grader or professor is really familiar with, it’s really difficult to catch,” he said. “Collaboration that is caught, it’s usually because it’s a dramatically different style than the student has been using in the past. It’s just something that catches the professor’s eye.”  

Sevigny said she sees the same problems among instructors at JWU detecting cheating despite the tools they can access.

“Right now, most professors either don’t have the time or the wherewithal to use all the resources available to them,” she said.

Before Brown’s computer science professors and teaching assistants began using MOSS, Doeppner said, his department encountered the same detection problems that the rest of the University faces.

“The TAs, or whoever was grading the assignment, would notice similarities, but this probably would only happen if the same person was grading both papers,” Doeppner said. “It was more hit-and-miss, and it was tougher to really put together convincing evidence for the academic code committee.”

Variation in instructors' policies can lead to confusion about when it is appropriate to work together. Joint efforts on homework may be encouraged in one class but absolutely forbidden in another. Lassonde said the Department of Computer Science excels in specifying its rules.

“Professors encourage students to collaborate to a certain degree, but they’re extremely clear when you cannot collaborate,” Lassonde said. “That helps students recognize the difference between shared work and collaborative work.”

A student currently enrolled in CSCI 0150: “Introduction to Object-Oriented Programming and Computer Science” said the department does a good job in making the distinction understandable.  

“In CS 15, you’re supposed to do everything yourself,” said Cyril Gary ’12, a chemistry concentrator. “They made it clear that they were very strict about it.”

Gary added that he has not encountered much academic dishonesty at Brown.

“I haven’t really seen that much cheating, definitely much less than my old school,” said Gary, who transferred from the University of California at San Diego. “When you have a collaboration policy that encourages you to study with other people, there’s obviously going to be exploitation with people just copying the homework. But I haven’t seen any cheating on exams or anything like that.”

A Herald poll conducted Nov. 2–4, 2009, reported that 12.4 percent of students had copied homework answers, 4.2 percent had not properly cited outside resources and 2.3 percent had copied on an exam that semester.
