Who Are the Biggest Early Beneficiaries of ChatGPT? International Students

Study of writing quality analyzed more than 1 million student submissions at a large public university

Apr 14, 2025 - 13:35

The public release of ChatGPT in November 2022 changed the world. A chatbot could instantly write paragraphs and papers, a task once thought to be uniquely human. Though it may take many years to understand the full consequences, a team of data scientists wanted to study how college writing might already be affected.

The researchers were able to gain access to all the online discussion board comments submitted by college students at an unidentified large public university before and after ChatGPT to compare how student writing quality changed. These are typically low-stakes homework assignments where a professor might ask students to post their thoughts on a reading assignment in, say, psychology or biology. The posts could be as short as a sentence or as long as a few paragraphs, but not full essays or papers. These short homework assignments are often ungraded or loosely factored into a student’s class participation.

The scientists didn’t actually read all 1,140,328 discussion-board submissions written by 16,791 students between the fall term of 2021 and the winter term of 2024. As specialists in analyzing big data sets, the researchers fed the posts into seven different computer models that analyze writing quality, from vocabulary to syntax to readability. Ultimately, they created a single composite index of writing quality in which all the submissions were ranked on this single yardstick.
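The paper's exact models aren't described here, but the general technique of standardizing several writing metrics onto one composite yardstick can be sketched in plain Python. All metric names and numbers below are illustrative assumptions, not figures from the study:

```python
from statistics import mean, pstdev

# Hypothetical per-submission raw scores from three quality models
# (e.g., vocabulary richness, syntactic complexity, readability).
# Four submissions, one score per submission per metric.
scores = {
    "vocabulary":  [0.62, 0.55, 0.71, 0.48],
    "syntax":      [3.10, 2.80, 3.60, 2.50],
    "readability": [55.0, 60.2, 48.9, 63.1],
}

def zscores(values):
    """Standardize raw scores to mean 0, standard deviation 1."""
    mu, sigma = mean(values), pstdev(values)
    return [(v - mu) / sigma for v in values]

# Composite index: standardize each metric, then average across
# metrics so every submission sits on a single common scale.
standardized = {name: zscores(vals) for name, vals in scores.items()}
composite = [mean(per_metric) for per_metric in zip(*standardized.values())]
```

How the metrics are weighted, and whether any (such as readability) need their direction flipped before averaging, are modeling choices the sketch leaves out.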

The results? Overall student writing quality improved. Gains were modest in the early months of 2023, then accelerated from October 2023 until the study period ended in March 2024.

“I think we can infer this is due to the availability of AI because what other things would produce these significant changes?” said Renzhe Yu, an assistant professor of educational data mining at Teachers College, Columbia University, who led the research. Yu’s paper has not yet been published in a peer-reviewed journal, but a draft has been publicly posted on a website at Cornell University that hosts pre-publication drafts of scholarly work. (The Hechinger Report is an independent news organization at Teachers College, Columbia University.)

Yu and his research colleagues didn’t interview any of the students and cannot say for certain that the students were using ChatGPT or any of its competitors, such as Claude or Gemini, to help them with their assignments. But the improvement in student writing following the introduction of ChatGPT does seem to be more than just a random coincidence.

Big upswings for international students 

The unidentified university is a minority-serving institution with a large number of Hispanic students who were raised speaking Spanish at home and a large number of international students who are non-native English speakers. And it was these students, whom the researchers classified as “linguistically disadvantaged,” who saw the biggest upswings in writing quality after the advent of ChatGPT. Students who entered college with weak writing skills, a metric that the university tracks, also saw outsized gains in their writing quality after ChatGPT. Meanwhile, stronger English speakers and those who entered college with stronger writing abilities saw smaller improvements. It’s unclear whether they used ChatGPT less, or whether the bot offers less dramatic improvement for a student who is already writing fairly well.

The gains for “linguistically disadvantaged” students were so strong after the fall of 2023 that the gap in writing quality between these students and stronger English speakers completely evaporated and sometimes reversed. In other words, the writing quality for students who didn’t speak English at home and those who entered college with weak writing skills was sometimes even stronger than that of students who were raised speaking English at home and those who entered college with stronger writing abilities.

Gains concentrated among high-income students

However, these gains in writing quality among the “linguistically disadvantaged” were concentrated among higher income students. The researchers were able to match students’ writing submissions with administrative data, including family income, and they noticed that the writing of low-income students whose parents did not attend college didn’t improve nearly as much. By contrast, the writing of high-income international students with college-educated parents transformed markedly.

That’s a sign that low-income students weren’t using ChatGPT as much, or weren’t using it as effectively. Socioeconomic differences in how students benefit from technology aren’t uncommon. Previous studies of word processing software, for example, have found that higher income students tend to be more adept at taking advantage of editing features and see greater writing benefits from the ability to cut and paste and move text around.

Mark Warschauer is a professor of education at the University of California, Irvine, and director of its Digital Learning Lab, where he studies the use of technology in education. Warschauer was not involved with this study, and he said he suspects that the lopsided benefits for higher income students will be fleeting as low-income students become more acclimated to AI over time. “We often see with new technologies that high-income people get access first, but then it balances out. I believe that low-income people use cell phones and social media as much as high-income people in the U.S.,” he said.

But he predicts that the substantial and larger improvements in writing for international students, far greater than for domestic students, will be “more important and durable.”

Of course, this improved writing quality doesn’t mean that these international students are actually learning to write better, but it does indicate that they’re adept at using technology to present ideas in well-written English.

The study’s researchers didn’t analyze the ideas, the quality of analysis or whether the student submissions made any sense. And it’s unclear whether the students fed the reading into the chatbot along with the professor’s question and simply copied and pasted the chatbot’s answer into the discussion board, or whether students actually did the reading themselves, typed out some preliminary ideas and just asked the chatbot to polish their writing.

In Yu’s own classes at Teachers College, he said he encourages students to use ChatGPT in their writing assignments as long as they acknowledge it and also submit transcripts of their conversations with the AI chatbot. In practice, he said, only a few students admit to using it.

He had noticed student writing in his classes improving, until this year. “This year has actually been horrible,” he said. More and more of his students have been submitting typical AI output that “seems reasonable but doesn’t make a lot of sense,” he said.

“It all comes down to motivation,” said Yu. “If they’re not motivated to learn, then students will only make poor use of whatever the technology is.”

Contact staff writer Jill Barshay at 212-678-3595, jillbarshay.35 on Signal, or barshay@hechingerreport.org.

This story about ChatGPT and student writing was written by Jill Barshay and produced by The Hechinger Report, a nonprofit, independent news organization focused on inequality and innovation in education. Sign up for Proof Points and other Hechinger newsletters.