
Enhance, but not replace: How two student groups are responding to AI use

The Herald spoke with the Brown Undergraduate Law Review and Collegiate Consulting Group about their approach to AI use.


Artificial intelligence is a growing concern in many academic spaces at Brown. But discussions about AI usage are not only happening in the classroom — as of late, many clubs have grappled with AI usage among their members as well.

“AI is pretty advanced, but at this moment in time, it's not advanced to the point where you can fully get away with using it to just write something completely,” said Daniella Goldrich ’27, an editor-in-chief for the Brown Undergraduate Law Review. According to Goldrich, the student-led legal scholarship publication has taken steps to promote original work and mitigate AI use.

“It’s something that we’ve dealt with this year,” she said, adding that members of the BULR were surprised the issue hadn’t arisen sooner. “We’re starting to formalize our AI policy for the organization.”

The BULR features works by both Brown student-authors and individuals outside the University, Goldrich said, adding that enforcing an AI policy for external authors is more difficult than for Brown-student contributors.


“Our organization really prioritizes original legal scholarship,” she said. According to Goldrich, the BULR’s teams currently have their own AI policies, but the “baseline” rule is that if AI is used at any point, writers have to “disclose it to the team.” When AI use is significant enough to make the work “not original scholarship,” it becomes a larger issue, she added.

According to Justin Khan ’29, the BULR’s podcast director, his team is “currently grappling with how AI should be used.” The podcast team permits AI use to identify sources but not to write scripts or questions, he wrote in a message to The Herald.

“Our work is public-facing and any lapse in quality as a result of AI is a negative reflection of our entire team, especially if we are conducting interviews with professors and scholars,” Khan wrote. “For this reason, we have been very strict that AI usage should be completely limited.”

The BULR has seen few instances of AI use, Goldrich said. In those cases, she said, AI seemed to be used to cut down the time required to complete a task rather than to “supplement” members’ work when they were burnt out.

Goldrich noted AI usage is typically identified by editors, who flag content that “does not sound right.” Editors then bring it to leadership, who communicate with the author before making a decision about a piece.

Nik Greborunis ’28, who writes for the BULR, believes overcommitment and stress contribute to AI use.

“Everything seems to be career preparation instead of activities done out of intrinsic interest,” he wrote in a message to The Herald. “When this is the dominant mode of choosing classes and activities, people will look for shortcuts because there is no joy or satisfaction in completing a task that is mundane in their mind.” This situation, he said, lends itself to AI.

The Collegiate Consulting Group, a pre-professional club that advises real-world clients, has also adjusted to students’ use of AI.

CCG President Tanay Subramanian ’26 said the group’s leaders have spent a lot of time “discussing how to plan and appropriately address” AI usage in the group.

“We do have a pretty comprehensive AI policy document that we send to all of our members,” Subramanian said, noting that CCG takes “the same approach that the companies we’re going to start working at after college enforce as well.” That approach, he said, is to use AI to “enhance our work, but not replace it.”


“There’s tremendous benefits using AI when it comes to expediting research, getting diverse perspectives (and) finding sources,” he said.

“(In) the world of consulting, what companies really value is the human judgment and the human expertise that AI, at least yet, cannot compete with,” Subramanian said. 

When AI policy violations arise, CCG takes an approach similar to the BULR’s, centered on communicating with member consultants.

Emmitt Rattey ’29, a consultant for CCG, doesn’t believe AI usage is an “issue” in CCG, but notes that a big “contributor” to AI use is just “knowing that it’s available.”


“When you’ve seen what AI is capable of and you’re working through a task, it’s easy to rely on AI” when it can complete the task better or more quickly, he wrote. But Rattey, who is passionate about the club, doesn’t “struggle to find motivation to participate in the activities.”

“I think most of us within the club take pride in the work that they provide to the clients and focus on building their own skillset by actively engaging and critically thinking without the help of AI,” Rattey wrote.


Lucia Santiago

Lucia Santiago is a senior staff writer covering undergraduate student life.



All Content © 2026 The Brown Daily Herald, Inc.