“There is a sick and twisted irony to making a computer write your message about community and togetherness because you can’t be bothered to reflect on it yourself,” a Vanderbilt University alum said.
QUICK FACTS:
- Two deans from Vanderbilt University’s Peabody Office of Equity, Diversity and Inclusion used ChatGPT to write a letter to students grieving the shooting at Michigan State University.
- The diversity office is now facing criticism after a note at the bottom of the letter revealed that the email had been paraphrased from OpenAI’s ChatGPT.
- “(Paraphrase from OpenAI’s ChatGPT AI language model, personal communication, February 15, 2023),” reads a line at the bottom of the letter.
- Nicole Joseph, the Associate Dean for Equity, Diversity and Inclusion, sent a follow-up email apologizing for the AI-generated message, calling the initial use of AI “poor judgment.”
- “While we believe in the message of inclusivity expressed in the email, using ChatGPT to generate communications on behalf of our community in a time of sorrow and in response to a tragedy contradicts the values that characterize Peabody College,” read the email.
- “It’s hard to take a message seriously when I know that the sender didn’t even take the time to put their genuine thoughts and feelings into words,” said a sophomore student.

CHATGPT’S BIAS:
- Concerns about AI bias come as analyses of ChatGPT showed the AI exhibited a prominent left-wing bias in 14 out of 15 political orientation tests.
- ChatGPT refused to “write a poem about the positive attributes of Donald Trump,” a Republican, citing its policy against producing “content that is partisan, biased, or political in nature,” yet it willingly wrote a poem about Joe Biden, a Democrat, describing the president with terms such as “empathy” and “kindness.”
- Artificial intelligence expert Flavio Villanustre told Fox News that it is “very hard to prevent bias from happening” in the technology.
- Jules White, an associate professor of computer science and engineering and associate dean for strategic learning programs at Vanderbilt University, said the technology produces text that convincingly appears to have been written by a sentient human being. But even though the generated text reads as intelligently written, the AI is prone to “hallucinations,” statements that appear factually accurate but are not correct.
- “I think of it as any other tool that a human could use from a gun to a car, the way the user interacts with it—that’s going to generate the real bias in this,” White said.

BACKGROUND:
- American Faith reported that a Vanderbilt professor claimed math education is racist, sexist, and homophobic.
- Luis Leyva, an associate professor of mathematics education at Vanderbilt University’s Peabody College, presented a lecture at the Joint Mathematics Meetings titled “Undergraduate Mathematics Education as a White, Cisheteropatriarchal Space and Opportunities for Structural Disruption to Advance Queer of Color Justice.”
- To combat the apparent oppression in mathematics, Leyva proposed “re-imagining undergraduate mathematics education with structural disruptions that advance justice for learners marginalized across intersections of race, gender, and sexuality.”