Commentary: Think ChatGPT can write as well as a human? My students disagree

Others have argued that AI-powered writing tools can help users in several ways. They can check grammar and tackle writer’s block by generating an outline for an article. Outsourcing these tasks to AI allows writers to focus on the substance and content of their arguments or stories.

QUESTIONS OF HONESTY AND ACCOUNTABILITY

In addition to the Fort Siloso article that my students critiqued as impersonal, I asked ChatGPT to write another version from a first-person perspective. It was similarly logical and decently written, but more engaging.

However, the article it generated was dishonest. In one paragraph, it wrote: “Upon entering the fort, I was greeted by a knowledgeable guide who gave me a brief overview of the fort’s history.” How can an AI programme write about a first-hand experience it never had, describing sights it never saw?

A few of my students also pointed out factual mistakes. Indeed, when my colleagues tested ChatGPT by asking it to write short bios of faculty members, many of the details it included were wrong.

When I prompted it to write a literature review with a reference list, it listed references I could not find, and cited articles for information they did not actually discuss.

If I had used ChatGPT to generate this commentary and it included factual errors, who should be held accountable for any mistakes?

Bylines are not just about acknowledging the efforts and unique voices of authors – they are also about accountability. They identify the writer responsible for what an article communicates.