
Rahman ’26: Has ChatGPT turned the BA into BS?

In just one year, generative artificial intelligence models like ChatGPT have transformed the way we work and learn. In the Herald’s Fall 2023 poll, only 30.3% of surveyed students said that they do not use ChatGPT or similar tools in their academic work, meaning more than two-thirds do. Given the prevalence and utility of this technology, AI seems to be here to stay, and universities must adapt to ensure that this new reality works for students and not against them.

I first heard about ChatGPT in November 2022 when my first-year roommate came to me gushing: “Tas, you gotta try this new thing called ChatGDP.” Deeply skeptical, I tried it, and the results shocked me. Not only could it answer complex questions, but it was also interactive, able to integrate the feedback I provided. I was concerned then, as I am now, about the impact this new technology could have on the future of education. Indeed, I have seen many peers use ChatGPT to write whole essays or complete coding assignments, an endeavor that perverts the pedagogical purpose of those assignments. On the other hand, I have used this technology to teach myself concepts in linear algebra, brainstorm arguments for columns and debug faulty code for my research. The paradox of ChatGPT is evident: When used properly, it can be a powerful tool for learning, but when used as a crutch, it diminishes our own intelligence.


Responding to the rapid advance of AI, some school systems and universities have made knee-jerk decisions to ban the technology. Though concern is warranted, this approach is deeply problematic. The potential for AI to transform education is not unlike that of previous innovations, such as Google or Wikipedia. A blanket ban that relies on AI-detecting software such as GPTZero would be impractical and nearly impossible to adjudicate.

The advent of AI raises questions about what it even means to be educated. The class of 2027 is the first in which AI models like ChatGPT were widely available during the application process. While ChatGPT is likely not yet sophisticated enough to write application essays strong enough to earn admission to Brown, the technology is rapidly advancing and challenges our already fraught system of meritocracy. The availability of paid subscriptions to more advanced models also favors the wealthy. Furthermore, while ChatGPT can get you through an intro coding class, it is inadequate for more advanced courses, and students who lean on it are setting themselves up for failure.

Despite these challenges, ChatGPT also has the potential to advance education. Khan Academy, one of the world’s leading free online education organizations, has created an AI tutor to help students improve their writing. Professors at Harvard, Princeton and Yale have created chatbot TAs to provide personalized feedback in certain CS classes. AI can also democratize access to the kind of college admissions coaching that is more often available to wealthier students.

The solution to the problems posed by AI is not to ban it but to expect higher standards from students and professors. If an assignment can be easily completed by a machine, then perhaps it is not a good assignment in the first place. And if ChatGPT can pass your class, perhaps the course is not rigorous enough. Shifts made to accommodate, rather than eliminate, the presence of AI will not only prepare students for the real world, where this technology is readily available, but also raise the quality of work that employers can expect from Brown students.

The University must update its academic code to account for the rise of AI and establish a working group to provide comprehensive guidance to faculty and students on the acceptable use of AI in school. The Center for Career Exploration ought to provide students with guidance on how AI will transform the professional world and how to get ahead of the curve. Recognizing AI as both a transformative technology and one that poses real threats to humanity, Brown should aspire to be a leader in researching AI ethics and safety.

At the end of the day, the responsibility for our education falls on each of us. In 1969, during the movement for the Open Curriculum, the faculty issued a statement that reads: “The student, ultimately responsible for his or her own development … must be an active participant in framing his or her own education.” AI, while a useful tool, does not have emotions. It cannot weigh perspectives or feel pride when mastering a difficult concept or writing a profound essay. As Brunonians, students and lifelong learners, it is up to us to determine what we hope to learn from our four years here, and how we choose to use AI to support, or hinder, that learning.


Tas Rahman ’26 can be reached at tasawwar_rahman@brown.edu. Please send responses to this opinion to letters@browndailyherald.com and other op-eds to opinions@browndailyherald.com.


Tasawwar Rahman

Tas Rahman is a staff columnist at the Brown Daily Herald writing about issues in higher education. When he's not coding or studying biochemistry, you can find him hiking and enjoying the great outdoors.

