Artificial intelligence alters academia

Admin CG | August 31, 2023

In 2023, students, teachers and everyday routines are at a crossroads. Advanced artificial intelligence has migrated from the pages of science fiction and is now an inescapable fixture in daily life.

In a relatively short span of time, artificial intelligence has been adapted to complete a multitude of tasks, from bots that can hold human-like conversations to machines capable of winning art competitions.

Despite the current surge of public interest, artificial intelligence has existed in some form for decades. From TurboTax to Word’s autocorrect feature, software has exercised significant autonomy from its human programmers since the 2000s. That changed with recent breakthroughs in generative AI.

John Slattery, director of the Carl G. Grefenstette Center for Ethics in Science, Technology and Law, believes that universities that have already established AI policies might be jumping the gun as AI continues to change.

“AI presents a number of big challenges. Obviously, the biggest one for academia right now is the question of pedagogy,” Slattery said. Pedagogy, the method and practice of teaching, has been studied since ancient Greece and, until recently, had always been practiced by humans.

These predicted and current changes will be discussed in depth at a symposium scheduled for Nov. 10.

“I’m talking to a lot of people about how they’re using generative AI in the classroom,” Slattery said.

In addition to academic administrators, governments are also struggling to regulate AI, in part because of its rapid evolution. Congress has yet to pass any legislative framework for AI, leaving the judiciary and executive actions to fill the gap. One noteworthy case working its way through the federal courts centers on whether artificial intelligence can legally hold copyright.

Last week, U.S. District Judge Beryl A. Howell ruled that AI-generated artwork cannot legally hold copyright. Whether that decision stands is up to the appellate courts and, ultimately, the U.S. Supreme Court. Regardless, we can expect a series of decisions moving forward that will significantly shape generative AI and 21st-century jurisprudence.

While it remains difficult to pin down exactly where artificial intelligence is heading, one fact is crystal clear: artificial intelligence will only further entrench itself in every crevice of our daily lives.

From personal to professional settings, the adaptability of AI will make it ubiquitous in the coming decade. Just as the ability to read or use word-processing software is a prerequisite for participating in society, so too will be an understanding of artificial intelligence.

Unlike other forms of artificial intelligence, generative AI can create content, from sentences to movies, independent of human guidance. Its use in academic and professional settings raises a new set of 21st-century questions as the technology is integrated across the board.

Programs such as OpenAI’s ChatGPT are capable of passing both the U.S. medical licensing exam and the bar exam. It is the versatility and depth of knowledge of programs such as ChatGPT that make them both useful and detrimental to academia.

David Dausey, provost of Duquesne, said that due to the rapid advancement of artificial intelligence, Duquesne is struggling to create a fair and flexible AI policy.

“We are encouraging thoughtful experimentation with the teaching and research possibilities that AI affords us as a tool. To aid faculty, the university is developing template guidance to use in syllabi to address AI use and issues in course contexts or in research activity,” Dausey said. “During this academic year, we will discuss ideas for this policy with deans, faculty, and students that address teaching and research alongside academic integrity, privacy, attribution and ethical guidance, among other concerns,” he said.

Until a centralized policy is drafted and implemented, the responsibility of managing AI in the classroom largely falls to deans and professors.

Wesley Oliver, director of the criminal justice program and professor of law, takes a different approach to artificial intelligence in his classroom. Rather than punishing its use, he encourages students to use the tools they will have in the real world.

“I want you to use ChatGPT and/or some version of Bard or one of these, and I want you to come up with the best possible query. I want to see your queries, and I want to see what comes out of them,” Oliver said.

He also stresses the importance of not over-relying on artificial intelligence, as its outputs may not be factually accurate or up-to-date. Instead, he opts to use AI as a tool rather than the primary driver of content generation.

“I just personally think that you have to train students to use the things they will have available to them when they graduate,” Oliver said. “After all, NASA is no longer using slide rules and doing long division by hand.”

