U. community discusses integration of AI into academics, points to opportunities for innovation

Provost Francis J. Doyle III addresses AI in letter to Brown community, offers resources

Provost Francis J. Doyle III identified the intersection of artificial intelligence and higher education as a University priority in an Aug. 31 letter to the community titled “Potential impact of AI on our academic mission.” Doyle’s address comes at a time of uncertainty as educational institutions struggle to make sense of the roles and regulations of artificial intelligence tools in academia.  

Doyle’s letter begins by zooming in on generative AI tools such as ChatGPT, which soared in popularity after its debut in late November of last year. The program, an open-access online chatbot, amassed over 100 million monthly users within the first two months of its launch, according to data from Pew Research Center.

“There is no shortage of public analysis regarding the ways in which the use of generative artificial intelligence tools — which are open-access tools that can generate realistic text, computer code and other content in response to prompts from the user — provide both challenges and opportunities in higher education,” Doyle wrote in the letter. 

“Exploring the use of AI in ways that align with Brown’s values has been a topic of discussion among our senior academic leaders for several months,” he continued. 

Doyle did not prescribe University-wide AI policies in the letter but encouraged instructors to offer “clear, unambiguous” guidelines about AI usage in their courses. He also provided a variety of resources for students seeking guidelines on citing AI-generated content, as well as how to use AI as a research tool.

“As we identify the ways in which AI can enhance academic activities, … we must also ensure these tools are understood and used appropriately and ethically,” Doyle wrote.

The contention presented by Doyle is one mirrored by educators and administrators nationwide: How can academic institutions strike a balance between using AI as a learning tool and regulating it enough to avoid misuse?

“The upsides to AI tools such as ChatGPT that are often touted include improved student success, the ability to tailor lessons to individual needs, immediate feedback for students and better student engagement,” Doyle wrote in a message to The Herald. But “it is important for students to understand the inherent risks associated with any open-access technology, in terms of privacy, intellectual property ownership and more.”

Doyle told The Herald that he anticipates prolonged discussions with academic leadership, faculty and students as the University continues to monitor the evolution of AI tools and discovers “innovative applications to improve learning outcomes and inform research directions.”

Michael Vorenberg, associate professor of history, is finding creative ways to bring AI into the classroom. On the first day of his weekly seminar, “HIST 1972A: American Legal History, 1760-1920,” Vorenberg spoke candidly with his students about general attitudes regarding AI in education and the opportunities for exploration these developments afford.

“Most of what educators are hearing about are the negative sides of generative AI programs,” Vorenberg wrote in a message to The Herald. “I am also interested in how generative AI might be used as a teaching tool.” 

Vorenberg outlined two broad potential uses for AI in his class: examining sources generated by ChatGPT — allowing students to probe the “appropriateness” of the retrieved documents from a historian’s perspective — and intentionally critiquing those generated sources to understand how a historian’s perspective could have produced a stronger source.

“The underlying assumption behind the exercise is that even a moderately skilled historian can do better at this sort of task than a generative AI program,” Vorenberg explained. “Until (this) situation changes, we who teach history have an opportunity to use generative AI to give concrete examples of the ways that well-trained human historians can do history better than AI historians.”

Given the University’s large pool of students interested in pursuing computer science — The Herald’s recent first-year poll shows computer science as the top indicated concentration for the class of 2027 — Brown has the potential to shape the future of AI. 

Doyle told The Herald that the University is well-situated to contribute “our creativity (and) our entrepreneurial spirit” to making an impact as researchers continue to strengthen these tools.

Jerry Lu ’25, who is concentrating in both computer science and economics, “obsessively followed the growing momentum behind OpenAI, ChatGPT and developments in automation.”

Lu believes there are two ways the University can best support its students in navigating artificial intelligence — “one from an educational perspective, and another from a more career-oriented view.”

In terms of education, Lu said he hopes that the University would approach AI not just through computer science classes, but “from a sociology approach or humanities lens as well” to equip all students with the necessary skills to address “how AI will undoubtedly affect society.”

Lu also pointed to the restructured Center for Career Exploration as a potential resource for preparing students to enter a workforce heavily influenced by AI.

“The new Career LAB should be cognizant of how these new technologies are going to impact careers,” Lu said. “Offering guidance on how students should think about AI and how they can navigate (it) or use (it) to their advantage, I think that that would be really key.”

When asked how universities should engage with AI, ChatGPT focused on the pursuit of a common good.

“Universities have a critical role to play in the responsible development and application of artificial intelligence,” it replied. “They should focus on research, education, ethics, collaboration and societal impact to ensure that AI technologies benefit humanity as a whole while minimizing potential harms.”


Sofia Barnett

Sofia Barnett is a University News editor overseeing the faculty and higher education beat. She is a junior from Texas studying history and English nonfiction and enjoys freelancing in her free time.
