ČâČâ´ŤĂ˝ State embraces AI as valuable tool for students, researchers

KENNESAW, Ga. | Aug 21, 2024

Jeanne Law
On Dec. 1, 2022, a group of students gathered around Jeanne Law’s computer and stared in wonder. It was the day after OpenAI released an early demo of ChatGPT, and Law, a professor of English at ČâČâ´ŤĂ˝ State, was curious to see what the chatbot could do.

“In that moment, I knew it could have a profound impact on all of us,” Law said.

At the time, graduate student James Blakely was working in the campus writing center and saw an immediate change in student writing processes.

“I realized this technology would radically affect the academic landscape,” he said. “As I began using this technology in basic writing tasks like editing emails and summarizing class notes, I realized my own composing practices were changing in a way that, quite frankly, concerned me. The capabilities of this technology were so immediate and daunting that I knew I had to continue exploring it and see how it would develop.”

ChatGPT experienced a rapid increase in users soon after its launch. Data from analytics firm Similarweb shows that within five days it attracted over one million users, and by its first anniversary the chatbot was drawing more than 1.7 billion visits a month, making it the fastest-growing consumer application in history. Since its release, Microsoft and Google have introduced their own versions, called Copilot and Bard, respectively.

There is ongoing debate, as AI becomes increasingly ubiquitous, about its ethical implications, impact on the labor market, and guidelines needed for education institutions. It’s a buzzing topic among state and national politicians who recognize the need to understand and regulate AI. According to the National Conference of State Legislatures, in the 2023 legislative session at least 25 states, including Georgia, introduced AI bills, adopted AI resolutions, or enacted AI legislation.

Versions of artificial intelligence (AI), like chatbots, have been around for years, but the recent introduction of large language models (LLMs) like ChatGPT and Microsoft Copilot has changed higher education and life beyond academia.

The ever-evolving technology raises questions about how it will impact the lives of its users. Will AI take our jobs? Will students using AI on assignments be breaking academic codes of conduct? Can we trust AI to be ethical and unbiased?

KSU faculty, staff, and students are not waiting on the sidelines. They are actively engaging with AI, discussing its implications, incorporating it into their work and contributing to AI research. Researchers have shown a commitment to preparing the next generation of leaders in this transformative field, while also making AI accessible to the public.

“We have to embrace this technology,” said Sumanth Yenduri, dean of the College of Computing and Software Engineering. “We also have to make sure it is used in a responsible manner. With any new technology there will be disruptions, but people should be excited about the fact that AI will enhance and elevate the work you do.”

WHAT IS ARTIFICIAL INTELLIGENCE?

Ask Microsoft Copilot for the definition of artificial intelligence and it spits out the following response: “Computer systems capable of performing tasks that historically required human intelligence. These tasks include recognizing speech, making decisions and identifying patterns. In essence, AI simulates human cognitive abilities using technology.”

The response goes on to address different technologies that fall under AI, like machine learning, deep learning, and natural language processing.

AI is not new. Online customer service chatbots, Apple’s digital assistant Siri, self-driving cars, and facial recognition were all created before ChatGPT launched to the public. However, the current buzz and widespread fascination with this technology stem from the public availability of AI in the form of generative AI.

Generative AI is a type of machine-learning model that generates new content. A user simply types a prompt into an LLM and it can write text as well as create images, videos, computer code or a song. It interacts in a conversational way and can answer follow-up questions, admit its mistakes, challenge incorrect premises and reject inappropriate requests.

These LLMs have seemingly endless possibilities. They can write an email asking for a raise, paraphrase a lengthy document, alter a favorite cake recipe to be low-fat, or create a lesson plan based on learning objectives. (If prompted, they could also write this entire article, but we can assure you, this one was mostly done the old-fashioned way.)
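For readers curious what that conversational back-and-forth looks like in code, the sketch below shows the basic pattern in Python using the OpenAI SDK. It is an illustrative assumption, not code used by anyone quoted in this story; the model name and prompts are placeholders.

```python
# A minimal sketch of the conversational loop described above, using the OpenAI
# Python SDK. The model name and prompts are illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # reads the OPENAI_API_KEY environment variable

# The conversation is kept as a running list of messages, which is what lets
# the model answer follow-up questions in context.
messages = [
    {"role": "user", "content": "Rewrite this cake recipe to be lower in fat."},
]

reply = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(reply.choices[0].message.content)

# A follow-up request simply extends the same message history.
messages.append({"role": "assistant", "content": reply.choices[0].message.content})
messages.append({"role": "user", "content": "Now halve the sugar as well."})
followup = client.chat.completions.create(model="gpt-4o-mini", messages=messages)
print(followup.choices[0].message.content)
```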

AI IN THE CLASSROOM

Conversations at KSU surrounding the use of this technology happened quickly after the release of ChatGPT, according to Anissa Vega, associate vice provost and professor of instructional technology. A few faculty members voiced concerns about academic integrity, and some even asked her to block ChatGPT and Copilot completely from university web servers.

Instead, her office saw the importance of the technology and worked with the Center for Excellence in Teaching and Learning to host panel discussions and create committees to provide guidance on next steps for its use in higher education.

“We never pursued the idea of not allowing it across the institution,” Vega said. “We recognized early on that AI literacy is likely a workforce skill that our students need as they graduate. We have been encouraging faculty to find meaningful ways to incorporate it into their curriculum so our students are competitive when they graduate.”

KSU’s Digital Learning Innovations department, which provides an array of services and resources related to distance and technology-enhanced learning, has been at the forefront of educating faculty and staff on best practices for the use of AI. Vega said faculty members are encouraged to include an AI policy in their syllabi, but the University gives them the academic freedom to decide how, or if, they will allow use of AI.

“We’ve embraced AI technology at KSU,” Vega said. “That doesn’t mean there aren’t a few pockets of hesitation in places we would expect, like the arts and in spaces with a lot of writing. Interestingly, the English department, which initially had some of the greatest concerns, has embraced AI more than many other departments.”

Law, in addition to being a professor of English, is the director of KSU’s First Year Composition program and a driving force behind her department’s acceptance of using generative AI. Her graduate and undergraduate students use generative AI in every assignment.

“I teach my students the basics of what I call rhetorical prompt engineering and how to ethically engage with AI assistants the first week of class,” Law said. “I believe my role as a teacher-scholar is to prepare students to thrive and lead in AI-infused workplaces of the now.”

Prompt engineering refers to the crafting of requests users put into generative AI systems to produce the desired output.
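As a hypothetical illustration of the idea (not Law’s actual course material), a rhetorically framed prompt spells out audience, purpose, genre, and constraints rather than posing a bare question. A minimal sketch in Python, with invented field names:

```python
# A hypothetical example of a rhetorically framed prompt: audience, purpose,
# genre, and constraints are stated explicitly so the output matches the
# writer's intent. The fields and wording are illustrative, not a published rubric.
def build_prompt(audience: str, purpose: str, genre: str, constraints: str, topic: str) -> str:
    return (
        f"You are helping a student writer. Audience: {audience}. "
        f"Purpose: {purpose}. Genre: {genre}. Constraints: {constraints}. "
        f"Task: draft an outline on the topic: {topic}."
    )

print(build_prompt(
    audience="first-year college students",
    purpose="persuade readers to support a campus recycling program",
    genre="op-ed of about 600 words",
    constraints="plain language, cite two sources, no jargon",
    topic="campus sustainability",
))
```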

Law and Blakely teamed up in 2023 to research students’ perspectives on AI by surveying thousands of students who move through the First Year Composition program. At the time, almost all students surveyed said they were aware of AI and about half reported using generative AI in their personal or workplace writing, but fewer than 35 percent admitted to using it in their academic writing. Law believes those numbers are low because of lingering negative perceptions about using AI in academic writing and its ethical implications.

“I think the question to ask from an ethical standpoint is not, ‘Am I using too much AI?’ The question to ask is, ‘Am I taking responsibility for the usefulness, the relevance, the accuracy, and the harmlessness of that output?’” Law said.

Blakely, who earned his undergraduate degree at KSU, used data they collected for his master’s capstone project, which involves piloting AI-infused composition courses.

“The essence of my project is that there exists an educational imperative to integrate AI technologies in ways that enrich our students’ learning experiences,” said Blakely, who will start pursuing a Ph.D. in Rhetoric and Composition at Purdue University in the fall. “Our data suggests that this integration can help mitigate the various ethical and practical concerns that currently surround AI.”

IS IT CHEATING?

One of the biggest concerns voiced in academic circles is the question of whether students who use generative AI are breaking academic codes of conduct. The answer, according to Vega, is, “It depends.”

Faculty members are encouraged to decide their stance on AI use in each course and explicitly state that in the syllabus. Students’ work can be uploaded to a platform called Turnitin, which has long been used to detect plagiarism, and while the platform now includes AI-detection software, that feature is not reliable.

“Research done by Stanford, UCLA, Vanderbilt, and other universities on AI-detection software has shown high incidence rates of false positives, especially when it checks the work written by non-native English students,” Law said. “We simply can’t afford to disadvantage students by labeling them as cheaters. Instead, we need to meet them where they are.”

Vega says that because the reliability of the detection technology is still in question, faculty members who use it should weigh its results alongside other evidence when assessing potential academic dishonesty.

NEW AI PROGRAMS

Perhaps one of the prime examples of KSU embracing AI is the implementation of a new Master of Science in Artificial Intelligence degree program beginning in the Fall 2024 semester.

Housed within the College of Computing and Software Engineering, it builds upon the existing artificial intelligence concentration in the Master of Science in Computer Science degree program. It will be offered online and in person to increase accessibility and decrease barriers. The college is also planning a minor in AI at the undergraduate level.

“We don’t want this to be a program solely for computing students,” Yenduri said. “There are various industries where AI can be used. Students interested in defense, healthcare, e-commerce, and the humanities could all benefit from this program. It is not just for lifelong learners, but also career changers or people who want to add more skills to their portfolio.”

In addition, the English department is proposing a potential “AI and Writing Technologies” graduate certificate in which students would learn about ethics of prompt engineering, AI-infused writing processes, and how to use prompt engineering in everyday life.

“We hope students pursuing the master’s degree in AI and learning all the technical skills would then want to come to the humanities side and earn a certificate that could align with and complement the content they’re getting from a more technical perspective,” Law said.

In the meantime, Law is teaching KSU’s first graduate course in prompt engineering for writers this summer.

AI BEYOND KSU

While faculty and staff navigate best practices for ever-evolving AI technology at KSU, many of them are actively championing the benefits of AI to the public.

“When I started diving deep into researching AI, I talked with stakeholders in the community who really wanted to know how to use it,” Law said.

She created a series of classes through the online platform Coursera, designed to help educators and the public grow their prompt engineering skills. She has also done a series of interviews with an Atlanta-based television station discussing common AI phrases to know and how the average person can use AI to make their lives easier.

Yenduri also recently appeared on an Atlanta news radio station discussing the new master’s degree program and why it’s so important that students learn about, and use, AI.

“Our job is to educate people, not just students,” he said. “We owe it to our community, to our industry partners, we owe it to everybody. There will always be concerns about new technology and it’s important to discuss those concerns and focus on how to improve the technology.”

Yenduri also addressed the major question about AI taking people’s jobs.

“There are certain kinds of jobs that may be lost, particularly with things that include repetitive tasks,” he said. “But it’s in the interest of better business to create new jobs like prompt engineers or ethical researchers. It’s imperative our students are ready to perform those jobs.”

In early 2024, Georgia Chamber President and CEO Chris Clark presented data on the current state of Georgia’s economy and future-focused insights that largely revolved around AI. Clark explained that over the next few years, millions of people around the world will shift careers as automation and AI technologies take over routine tasks. He emphasized the need for educators in Georgia to empower students to thrive in an AI-driven economy by focusing on skill development and fostering adaptability.

The Bureau of Labor Statistics projects that employment of computer and information research scientists, including AI professionals, will grow 21 percent between 2021 and 2031 – a much faster rate than the 5 percent average for all occupations. Additionally, the artificial intelligence field is expected to contribute up to $15.7 trillion to the global economy by 2030, according to PricewaterhouseCoopers’ Global Artificial Intelligence Study.

“Since I started my career, I've seen computing change in a significant way, and artificial intelligence is one area that will revolutionize everything we are doing,” Yenduri said. “We have a great sense of pride in the programs we have at KSU. We truly feel that the reason why we are offering this new master’s program is to make sure we are meeting the workforce needs of Georgia.”

While the university faces a delicate balance between the promise of AI advancements and the potential perils, faculty and staff members are committed to responsible AI deployment and will continue to shape not only the minds of students but also the future of artificial intelligence itself.

“I’m excited about what AI will bring to KSU and what KSU will bring to AI,” said Vega.

 

FACULTY RESEARCH

KSU is committed to advancing AI technologies, fostering interdisciplinary collaborations, and nurturing a vibrant ecosystem for AI-related projects. Notably, there is at least one faculty member in every college who is engaged in research either directly related to AI technology or using AI methodologies.

College of Computing and Software Engineering

Minjae Woo: Assistant Professor of Statistics and Data Science and AI Ethics Lab Director

Through its ongoing partnership with Atlanta-based global data, analytics, and technology company Equifax, KSU launched the AI Ethics Lab to study the use of artificial intelligence in the U.S. financial services industry and its ethical implications. According to Woo, the AI Ethics Lab director, it is important that credit models used to make financial decisions are transparent and explainable, so consumers can understand the outcome of decisions. The research team is working to establish methods that will help identify how an AI-powered process may create different outcomes than traditional models and the potential impact of these differences.
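As a toy sketch of that kind of comparison (invented for illustration, not the lab’s actual methodology), the code below scores the same synthetic applicants with a stand-in “traditional” model and a stand-in “AI” model, then reports how often their approval decisions diverge by group.

```python
# A hypothetical outcome-comparison audit: the data, thresholds, and scoring
# functions are all invented for illustration.
import numpy as np

rng = np.random.default_rng(0)
n = 1_000
group = rng.choice(["A", "B"], size=n)          # a protected attribute, used only for auditing
income = rng.normal(50_000, 15_000, size=n)

# Two stand-in scoring functions; in practice these would be the real models.
traditional_score = (income - 30_000) / 40_000
ai_score = traditional_score + rng.normal(0, 0.15, size=n)

traditional_approved = traditional_score > 0.5
ai_approved = ai_score > 0.5

for g in ("A", "B"):
    mask = group == g
    print(
        f"Group {g}: traditional approval {traditional_approved[mask].mean():.1%}, "
        f"AI approval {ai_approved[mask].mean():.1%}, "
        f"decisions that flipped {np.mean(traditional_approved[mask] != ai_approved[mask]):.1%}"
    )
```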

College of the Arts

Andrea Knowlton: Associate Professor of Dance

While the idea of fusing AI with the arts might initially raise eyebrows, Andrea Knowlton and her research team have found a way to push the boundaries of expression in new ways. Knowlton is working with researchers from Georgia Tech to study dance applications in teaching movement to AI through improvisational dance. Their collaboration resulted in a first-of-its-kind performance this spring where artificial intelligence—in the form of interactive avatar projections—improvised movement with human dance partners. The project is expected to lay the groundwork for the development of other applications that could benefit from such improvisational abilities like physical therapy, artistic brainstorming, and dance as an expressive form of recreation.

Radow College of Humanities and Social Sciences

Dylan Goldblatt: Senior Lecturer of German

For those unfamiliar with AI, terms like large language models and prompt engineering may seem like a foreign language. However, Dylan Goldblatt’s research explores the potential applications of AI and LLMs when teaching and learning a second language. In 2023, the senior lecturer of German created a research lab called Saga, which exists to facilitate student success in language education at KSU and beyond. By integrating AI into language learning, Goldblatt and his research team aim to make language learning accessible and equitable for people across the globe. In addition, his lab partners with nonprofit organizations to tackle issues that can be effectively addressed using AI. 

###

How was generative AI used to write this story?

Our writer felt it was important to be transparent about how AI was, and was not, used to write this story. After receiving the assignment, she asked Microsoft Copilot to suggest some topics to consider when writing about such a broad subject. It gave our writer a list of themes to consider, though some Copilot suggested were not relevant, and she sought out faculty, staff, and students to interview about the most pertinent information. As with most writing projects, Copilot was a tool to get started, but our team of humans conducted the interviews, wrote the article, edited the story, and created the graphics.

– Story by Abbey O’Brien Barrows

Photos by Darnell Wilburn


A leader in innovative teaching and learning, ČâČâ´ŤĂ˝ State University offers undergraduate, graduate and doctoral degrees to its more than 47,000 students. ČâČâ´ŤĂ˝ State is a member of the University System of Georgia with 11 academic colleges. The university’s vibrant campus culture, diverse population, strong global ties and entrepreneurial spirit draw students from throughout the country and the world. ČâČâ´ŤĂ˝ State is a Carnegie-designated doctoral research institution (R2), placing it among an elite group of only 7 percent of U.S. colleges and universities with an R1 or R2 status. For more information, visit kennesaw.edu.