AI technologies are increasingly ubiquitous, and academia is no exception. CSUDH faculty hosted a panel discussion Oct. 30 to discuss the potential and pitfalls of artificial intelligence. Credit: Illustration by Jayden Hart

Symposium marks the university's first campus-wide town hall on the topic of artificial intelligence in higher education.

By Taya Bohenko, Staff Reporter

As artificial intelligence becomes part of daily life and learning, CSUDH students and faculty gathered Oct. 30 to question its growing influence. The panel discussion, titled “Is Artificial Intelligence Dangerous?,” raised concerns about how AI could shape—and possibly undermine—human judgment and critical thinking.

The symposium marked the university’s first campus-wide AI town hall, bringing together students and professors to discuss how technology is transforming education and society. Panelists raised ethical concerns over widespread AI use in classrooms, including plagiarism, misinformation, and the erosion of essential academic and social skills.

“The cheating is just rampant,” said panelist Lissa McCullough, a CSUDH philosophy professor. “Every paper I look at, my first question is, ‘Is this AI, or is it written by the student?’ But I can’t accuse them, there’s no way to prove it.”

Earlier this year, California State University announced plans to become the nation’s “first and largest AI-powered” university. Through a multimillion-dollar partnership with several prominent tech companies—including ChatGPT maker OpenAI—Cal State rolled out an AI suite to more than 460,000 students and 63,000 faculty and staff systemwide.

The move, said CSU Chancellor Mildred García in a Feb. 2 statement, was to establish “a highly collaborative public-private initiative” that would position Cal State as “a global leader” in AI adoption and “help provide the highly educated workforce that will drive California’s future AI-driven economy.”

Some CSUDH faculty and staff received news of the deal with skepticism, citing systemwide budget cuts that have limited funding, forced layoffs, and prompted the university to reconsider the viability of some programs. 

McCullough and other panelists acknowledged that AI tools can be helpful when used appropriately—creating quick summaries, identifying areas of improvement, or assisting with preliminary research, for example. However, McCullough warned that their influence extends far beyond academics.

“It’s disturbing, way beyond the impact on education,” she said. “It’s going to have a huge impact on society and the internet. The internet is such an important resource, and we are making it more toxic, swampy, and far more dangerous than it already was.”

AI has become commonplace but increasingly less noticeable in news feeds, product recommendations, and search results, according to Brian Gregor, chair of the Philosophy Department.

“When we’re using it, it’s also using us,” Gregor said. “It’s shaping our perception of what the world is like, shaping our expectations of how to interact with people, and how to view our ability to master and possess the world around us.”

Philosophy professor Dana Belu echoed Gregor’s concerns, believing that AI tools like ChatGPT change how students learn and think. 

“The overreliance on ready-made information via ChatGPT obstructs the building of social skills, academic skills, and critical thinking,” Belu said. “Unlike ordinary tools whose repeated use builds skills, ChatGPT is paradoxical. The more one uses it, the more deskilled the user becomes.”

Belu added that AI should not be mistaken for human intelligence, saying that it cannot replicate experience.

“AI lacks location and a body. It lacks a point of view, sensations, feelings, desires, or motivations for action,” Belu said. “All of which are integral parts of individual human experiences.” 

Panelists cautioned that as AI continues to evolve, students must learn to distinguish between what is real and what is generated—whether in photos, videos, voices, text, or other forms of media.

“As we think about how we could use AI wisely and responsibly, we need to be sober and self-aware of what a difficult task this is,” Gregor said. “Because it’s easier said than done.”
