Over at Vanderbilt University, bureaucrats in the Office for Equity, Diversity, and Inclusion (EDI) are in trouble. On February 16, they sent an email to the student body urging inclusivity and compassion in the wake of a tragic mass shooting at Michigan State University, where a gunman killed three students and left others in critical condition. The only problem: They used ChatGPT, an artificial intelligence (A.I.) text generator, to write the boilerplate statement.

The next day, anti-A.I. outcry prompted Assistant Dean for EDI Nicole Joseph to send a follow-up apology to the student body. (She refrained from using ChatGPT that time.)

“While we believe in the message of inclusivity expressed in the email, using ChatGPT to generate communications on behalf of our community in a time of sorrow and in response to a tragedy contradicts the values that characterize Peabody College,” she wrote.

“There is a sick and twisted irony to making a computer write your message about community and togetherness because you can’t be bothered to reflect on it yourself,” student Bethanie Stauffer told The Vanderbilt Hustler, the school’s newspaper.

Stauffer and other critics, reacting to administrators’ perceived insincerity, miss that these types of messages were probably always insincere: they’re canned and formulaic, perhaps with some good intentions buried beneath the mollifying prose but no real original thought or valuable insight. If the email hadn’t included a line crediting authorship to ChatGPT, students probably wouldn’t have noticed that the message was crafted by A.I.—it reads just like any other statement of its genre.

Pronouncements in the wake of tragedies urging inclusion and sensitivity to those with mental health issues probably don’t do much at all, at least beyond temporarily soothing some subset of the population. They’re a form of “do something”-ism that college students have grown to expect, but they’re not actually useful or important. Emails like these have no bearing on whether evil people commit evil acts in the future. They probably also don’t have any bearing on whether gun control measures get passed, or whether such measures would even do anything to prevent disturbed shooters from carrying out acts of violence.

“Deans, provosts, and the chancellor: Do more. Do anything. And lead us into a better future with genuine, human empathy, not a robot,” student Laith Kayat told the student paper. “[Administrators] only care about perception and their institutional politics of saving face.”

Yes. This has always been true. That’s the point of administrators, by and large. These are political jobs, and a lot of what DEI professionals do could probably be replaced by A.I. (or not done at all), saving universities—like the University of Virginia, which has 94 DEI employees, and the University of Michigan, which has 163, per a 2021 report—a chunk of change in the process. Why pay bureaucrats to remove the word “field” from all curricula, as the University of Southern California’s school of social work did, when you could devote that money to, you know, actual research?

In this instance, ChatGPT has just revealed to us more about ourselves, making us squirm a bit in the process.
