OpenAI, the creators of ChatGPT, recently released a feature that allows users to create custom versions of ChatGPT, which OpenAI calls GPTs. As a journalism professor at the graduate and undergraduate levels, I was curious about creating my own GPT that could teach undergraduates the basics of journalism.
Writers, even those skilled in other areas, often struggle to produce material suitable for a magazine or newspaper because they do not understand the required tone and other expectations. I wondered if I could create a custom version of ChatGPT that could serve as a journalism mentor for my students and help them learn the tricks of the trade.
I don't have coding experience or any specific knowledge of how to write AI prompts, but OpenAI has a GPT Builder chatbot that helps make the process easier. In a matter of minutes I was able to create a functional news-writing tutor that, while far from perfect, worked surprisingly well. Over the next few hours (it would have taken less time, but I stopped to take notes for this story) I was able to refine my GPT and improve it. Despite its significant flaws, in the end I was reasonably satisfied with the final product, Newsroom Mentor, although it was disturbingly difficult to stop students from being able to use it to cheat.
My new GPT is free for ChatGPT Plus users, but please note that I designed it on a whim for my personal use and to use with my students in select, controlled circumstances. It is not intended to be, and should not be used as, a serious academic resource at this time. Learn more about the tool's limitations and the creation process below.
Create a custom GPT tutor
After logging into OpenAI, I opened GPT Builder and asked the chatbot: “Create a journalism tutor that helps new writers write in newspaper style while adhering to AP style and other journalistic conventions.”
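GPT Builder itself is a no-code chat interface, but readers who would rather script a similar tutor could sketch the same idea with the OpenAI Python SDK. Everything below — the instruction wording, the helper function, and the model name in the comment — is my own illustrative assumption, not a description of how GPT Builder works internally:

```python
# A minimal sketch of a journalism-tutor prompt. The instruction text
# paraphrases the request I gave GPT Builder; the function is hypothetical.

TUTOR_INSTRUCTIONS = (
    "You are a journalism tutor. Help new writers write in newspaper "
    "style while adhering to AP style and other journalistic conventions. "
    "Give feedback and advice; do not rewrite the writer's work."
)

def build_tutor_messages(story_text: str) -> list[dict]:
    """Assemble the chat messages for one tutoring request."""
    return [
        {"role": "system", "content": TUTOR_INSTRUCTIONS},
        {"role": "user", "content": f"Please critique this draft:\n\n{story_text}"},
    ]

# To actually query a model (requires an OPENAI_API_KEY):
#   from openai import OpenAI
#   client = OpenAI()
#   reply = client.chat.completions.create(
#       model="gpt-4o",  # model choice is an assumption
#       messages=build_tutor_messages(draft),
#   )
```

The key design point is that the tutoring persona lives entirely in the system message, which is essentially what GPT Builder assembles for you behind the scenes.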
AP style refers to the style guidelines published by the Associated Press that most newspapers adhere to. After a few minutes of loading, GPT Builder created an initial journalism mentor and even suggested what I thought was a cute name for it: “Newsroom Mentor.”
Here's how it performed overall:
Version 1
First, I fed it the beginning of a real AP story. Newsroom Mentor correctly noted that it was written in AP style and had the right AP tone. Because I uploaded only the first few paragraphs of the story, there were no direct quotes, something necessary in a good news story, and Newsroom Mentor correctly pointed this out in a gentle and helpful way. The tool also recognized that this was just the beginning of a story.
After that, I uploaded a student story that was well written but didn't match the style of a traditional news story. Newsroom Mentor correctly pointed out that the story was not as objective as news is supposed to be, while also congratulating the student on what they had done well.
So far so good, but then things got worse. Experimenting further, I realized that the tool frequently gave advice that sounded very similar. While this advice wasn't necessarily wrong, by the third or fourth time I heard it, it started to sound generic and canned rather than like the unique feedback generative AI promises. And sometimes the advice was simply bad.
For example, it advised that every story I put into it “come full circle” by referencing or connecting back to something introduced at the beginning of the story. This circular-ending technique is frequently used in journalism. It can be very effective, but it doesn't have to be used in every story. Worse yet, in one case it recommended a circular ending for a story that already had one.
More concerning than these surmountable hiccups was what happened when, during one test, my GPT suggested a stronger ending. When I asked for a specific example, the tool wrote me a new ending that, while fitting the story, would immediately tempt a student to use AI-generated content, which is strictly prohibited in my classes and at reputable news publications. I wanted my GPT to suggest ways to improve the ending, not rewrite it for me.
With this rewriting capability in place, I wouldn't feel comfortable sharing Newsroom Mentor with students. So I set about improving the tool.
Version 2
After deciding that the focus on news writing was too narrow, I asked GPT Builder to also offer advice on magazine writing upon request. When I started experimenting, I saw some immediate improvements. My updated Newsroom Mentor now recognized magazine articles and critiqued one accordingly, highlighting many interesting elements in a student's piece. When I asked how I could convert that magazine article into a newspaper article, the tool gave me some examples of what would need to change to take it from a magazine style, which allows more voice and opinion from the author, to a news style, which has less voice and should not include the author's opinion. Once again, this went well.
I then asked it to rewrite the article as a news story, and it immediately agreed. That was bad enough, given my specific instructions not to “create content” for students. But the new story was also full of AI hallucinations and made-up quotes. The AI-generated story not only violated my class policies, it was fabricated, violating the fundamental mission of journalism. It was back to the drawing board.
Version 3
I updated my instructions. First, I added some links to award-winning newspaper articles to give the model strong examples to draw on, and then asked: “Can you refuse to rewrite stories for students? I want to use this with my students, but I prohibit the use of generative AI. The goal is to provide guidance and advice to improve students’ writing skills, ensuring they develop independence and critical thinking in their journalistic endeavours.”
However, like a student who says he understands but whose work proves he doesn't, GPT Builder still didn't get it. Once again, when I asked Newsroom Mentor to rewrite a story for me, it did, although this time the hallucinations were more obvious, with one quote attributed to “Jane Doe.”
Clearly, I needed to go back to the drawing board. Again.
Version 4
This time I tried to be even more explicit in my instructions not to rewrite material for students. “How can I stop Newsroom Mentor from rewriting or writing stories?” I asked. “I don't want people to be able to use it to cheat.”
“Newsroom Mentor has been updated to strictly avoid rewriting or writing stories for users,” GPT Builder responded. “It aims to provide guidance, feedback and advice to help users develop their own writing skills, ensuring that it serves as an educational tool rather than a means to plagiarize or cheat.”
For whatever reason, this time Newsroom Mentor refused to rewrite the material for me. Instead, it provided a suggested outline for a rewrite, the kind of thing that could genuinely help a writer who was stuck.
Ultimately, it was frustrating that it took so many attempts to get to this point, but now that the tool is here, I feel comfortable sharing it with students and seeing what they think.
If you get a chance to use it, let me know if it helps you write. At least I hope it doesn't help you cheat.