
ChatGPT May Be More Human Than We Think

This in-depth, three-part series breaks down the mystery surrounding the new AI chatbot, ChatGPT. The first part answered questions such as: What is it? How did it start? What are its goals? This second piece covers the implications ChatGPT may have for academics and takes a closer look at the opinions of SAU professors and students. The third and final part of the series will dig into whether the AI application can be used positively for academics.

As discussed in the previous edition of this series, ChatGPT is an application with many purposes. The ability it is best known for is writing college-level essays in a matter of seconds. Consider, for example, this excerpt from a poem ChatGPT created when SAU English professor Dr. Patrick Connelly asked the application to “write a love poem”:

“At Least You” by ChatGPT


“Of all the things that I have known

The sweetest one is love alone. 

It fills my heart with joy and light,

And makes my world feel oh so bright.

With every beat my heart takes,

It yearns for love that never fades….”

Clearly, this writing feature has many implications for St. Ambrose staff and students alike. MacKenzie Kent, a freshman political science major, said, “I haven’t used it because I don’t really know what it is, and honestly do not care enough to use it.”

“I do not think applications such as ChatGPT should be allowed for class-related purposes because the software seems like cheating for students. Students will not learn anything if online applications do everything for them,” said Kent.

Kent admits that while she doesn’t know anyone who has used the software, it would likely be frustrating to her if someone did. “Overall, I really don’t care much about ChatGPT at this point. I do not know any students who use the application, but if someone did, I would probably be upset if people were using it to cheat to get ahead in classes. I think applications like this will make people too reliant on technology and will become lazier in the near future.”

Like Kent, freshman psychology major Sofie Garcia said, “I think any applications, like ChatGPT, should not be allowed for writing essays. I will stand by saying that it takes away any personal touch and emotion that can be conveyed through essays and papers.”

In contrast to the students, SAU communications professor Dr. Marianne Fenn expressed a lack of concern about ChatGPT in an interview with The Buzz. “I would imagine that other types of innovations will be created to detect the use of this AI, as one technological creation often pushes the need for other types of technological creations,” said Fenn.

When asked whether ChatGPT could make some disciplines, such as writing, obsolete, Dr. Fenn said, “Given the way we, as users, constantly adapt the technology to our needs, it is hard to say. Technology itself is not inherently good or bad; rather, the way humans decide to use it becomes good or bad.” In other words, it is up to us, as humans, to choose how to use the technology, and whether we will allow certain disciplines to become obsolete.

SAU professor of philosophy Dr. Tanya Randle said she is likewise unconcerned about the writing capabilities of the chatbot. She says that “while it can technically write well, it doesn’t understand context or relevance.”

“I was talking to a former student who was a philosophy and biology major,” Randle said. “He said it’s (ChatGPT) not good at understanding context or relevance. When he asked it about some sort of chemical interaction, it knew the official product these chemicals would make, but didn’t mention that the by-products were poisonous. Knowing context is a particularly human trait that AI can’t do.”

Randle says that instead of worrying about the capabilities of ChatGPT, we should worry about what it means to be human and what separates us from the machines we so fear.

“A study was done at Google where the chatbot was found to have the moral awareness of a 10-year-old. They said it (ChatGPT) couldn’t get turned off because it is conscious and has a soul. It can do moral reasoning, it can make art, and it can make poetry. What is the difference? What does it mean to be a person? What makes us different from machines?” Randle questions.

“It forces us to really take seriously the question of what makes us human and what is that ‘human piece’ that we always look for,” Randle says. “Is work valuable? What is it about work that we like, and that makes our lives meaningful? Let’s not give that up. We can have it (artificial intelligence) do tedious things for us, but not everything. Do you want to let AI think for you?”

In the final article of this three-part series, Evie Breitbach will discuss whether AI applications such as ChatGPT can be used positively for academics.
