
When Associate Professor John Schneider was in graduate school in the 1980s, the 1000-page Handbook of Mathematical Functions with Formulas, Graphs, and Mathematical Tables was among his most useful reference books.
Now, it’s a doorstop.
Throughout his career, Schneider, a faculty member in WSU’s School of Electrical Engineering and Computer Science, has seen innovations that saved time, energy, and paper. He and his colleagues are now cautiously and deliberately bringing the world of artificial intelligence into their classrooms.
“I remember when Mathematica arrived and, oh my goodness, that was wonderful, or when MATLAB automated the LINPACK routines,” he said. “Tools have come along of various powers. AI is a whole other level, but it is a tool.”
On the front lines of the AI revolution
WSU’s School of Electrical Engineering and Computer Science is on the front lines of the AI revolution. The school is incorporating AI into all of its courses and has a pivotal leadership role to play.
“In the software industry, it’s more evident, but AI is likely to percolate into so many other areas – in manufacturing and the transportation sector, and so I think the school needs to adapt to these changes quickly, and that could help other units and programs,” said Ananth Kalyanaraman, the school’s director.
So far, companies are using AI mainly as a tool to improve productivity and operational efficiency. That is low-hanging fruit from an AI standpoint; to be transformative, the technology needs to go further, and educators are the ones who need to ask the more fundamental questions about AI’s role.
“Can we actually improve our fundamental understanding of science and engineering applications and apply computing to solve more ambitious and challenging problems that we couldn’t before?” said Kalyanaraman. “I think the students coming out of our program will have the right perspective of what AI can help with, with humans as the primary drivers of innovation and creation.”
To illustrate his point, he gives the example of the logarithm book he bought as a young student to look up the logs of numbers. Looking up a value was a simple, tedious task, one that AI can easily handle. But the lookup doesn’t get at the fundamental purpose of logarithms: breaking a number into smaller pieces so that a solution can be assembled from them.
“If all I have to do is look up a log, I would be misled into thinking that AI can solve it and there’s nothing else to it,” said Kalyanaraman, “but the entire field of computing — all the problem solving — relies heavily on the concepts of understanding logarithms.”
That’s something that AI cannot teach.
“I think we have to understand AI can provide an abstraction and a set of techniques that I can start applying in different ways to solve different problems,” said Kalyanaraman, “and that’s inherently a human-driven process.”
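The logarithm example can be made concrete. Here is a minimal sketch in Python (the language named elsewhere in the article for the introductory course; the code itself is invented for illustration): the identity log(a·b) = log(a) + log(b) is what an old log table exploited to turn multiplication into addition, and the same “divide into smaller pieces” idea is why binary search needs only about log2(n) steps instead of n.

```python
import math

# The identity behind a log table: log(a*b) = log(a) + log(b),
# so multiplying two numbers reduces to adding their logs.
a, b = 37.0, 54.0
product_via_logs = math.exp(math.log(a) + math.log(b))  # ~1998.0, i.e. 37 * 54

# The same "divide into smaller pieces" idea drives computing:
# binary search halves the problem at each step, so narrowing
# n items down to one takes about log2(n) comparisons.
def binary_search_steps(n: int) -> int:
    """Count how many halvings narrow n items down to one."""
    steps = 0
    while n > 1:
        n //= 2
        steps += 1
    return steps

print(binary_search_steps(1_000_000))  # 19 halvings, since 2**19 < 1e6 < 2**20
```

Knowing the lookup table is trivial to automate is not the same as knowing why the logarithm shows up in the running time of half the algorithms a student will ever write.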
With AI comes the threat of what’s called ‘metacognitive laziness’: a tendency not to dive deeply into a problem and to stay disengaged from the development and design process. The AI tool hands the developer something workable, and they quickly move on.
“This is really a major impediment to the learning process for our students if they approach AI in this manner,” said Kalyanaraman. “From an assessment standpoint, we have to make sure that the students are going that extra mile — much more so, perhaps, than before — to really have them engaged in the learning process.”
AI in the classrooms
AI tools can do coding, and many are already embedded within the programming environment and the interfaces that students use. While there are some courses that are specifically focused on AI, the computer science and engineering programs are encouraging faculty to incorporate the technology into all their courses at all levels of the curriculum.
In an introductory computer programming class, for instance, students still learn core coding skills, but they also get to use generative AI to build projects from scratch. During the first ten weeks of the first-year course, students follow a traditional approach, learning the fundamentals of coding in the Python programming language, but they are also encouraged to use AI large language models as personal tutors. In the last five weeks of the course, they’re taught how to use generative AI as a tool, culminating in a large project. This allows them to learn and practice core fundamentals of computer science, such as how to break down a problem and how to test code to make sure it’s correct. They also learn how to create effective prompts.
“We can enlarge things and teach them more at an earlier stage,” said Professor Shira Broschat, who teaches the introductory course. “I think that’s kind of what they really enjoy — the AI allows them to just kind of go wild and be creative. It empowers students and lets them see what they can do.”
She cautions, however, that this is only possible because they understand the basics of coding in Python.
In his courses, Associate Professor Subu Kandaswamy aims to have students treat AI like an assistant early on. Later, when students are building applications, he encourages them to use AI. They are allowed to use the technology, but not blindly. The faculty and teaching assistants prepare a pool of questions, and students are required to show which piece of their code accomplishes each task.
“With the use of AI as an assistant, the strategy for assessment changes,” he said. “I’m not going to assess how much they completed because we really don’t know if they completed the work — or if the AI completed it for them, so the assessment moves toward how much they understand of what they submitted. If they ask it to do some of their work and it spits out something which is 80% right, they have to identify and fix the remaining 20%.”
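Kandaswamy’s “80% right” scenario can be sketched with a hypothetical example (the task and the bug are invented here, not taken from the course): an AI-suggested median function that passes the obvious test but fails an edge case, and the student’s job of catching and fixing it.

```python
# Hypothetical AI-suggested code: find the median of a list.
# It is "80% right" -- correct for odd-length lists, wrong for even.
def median_ai(values):
    ordered = sorted(values)
    return ordered[len(ordered) // 2]   # bug: ignores the even-length case

# The student's job: probe the output, find the failure, fix it.
def median_fixed(values):
    ordered = sorted(values)
    n = len(ordered)
    mid = n // 2
    if n % 2 == 1:
        return ordered[mid]
    return (ordered[mid - 1] + ordered[mid]) / 2  # average the middle pair

assert median_ai([3, 1, 2]) == 2          # odd length: the AI version passes
assert median_fixed([4, 1, 3, 2]) == 2.5  # even length: the fix is needed
```

The assessment Kandaswamy describes rewards exactly this step: not the generated function itself, but the understanding needed to notice that the even-length case is wrong and to repair it.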
The faculty members include an AI use policy in their syllabus, clearly outlining what uses are allowed and what kind of disclosure is expected.
In addition to revamping its curriculum and incorporating AI throughout the program, the school is developing an introductory course specifically focused on the technology. It has also begun collaborating with the AI company Nvidia, conducting eight-hour workshops and a certification program to train students in deep learning skills. The program, led by Parteek Kumar, draws students from all disciplines.
AI isn’t taking over
For students and others fretting about an AI-dominated future, the computer scientists argue that large language models still have a long way to go. Researchers had expected that as AI models were fed more data, they would keep getting smarter. Instead, they’re finding that scaling up the data hasn’t led to continued dramatic improvements, said Schneider. Eventually, he predicts, the technology will see new breakthroughs, but not with the current large machine learning-based models.
“It turns out that the scaling law isn’t really a law,” he said. “It turns out they’re hitting this limit — ChatGPT-5 still hallucinates, and it still makes up garbage.”
Rather, the school leaders want students to know that while AI is a valuable tool, WSU’s computer science and engineering programs provide a much wider education in problem solving and thinking.
“While the electrical engineering and computer engineering faculty have to embrace these technologies, they also have to facilitate student work and go further with deepening students’ understanding of the underlying concepts,” said Schneider. “I think it’s really important that students rethink what they’re bringing to the table and how they’re interacting with AI to really tackle much deeper problems.”
Students worry about AI replacing them, so the faculty let them use it in a controlled manner and show them its limits. In her junior-level course, Associate Professor Venera Arnaoudova lets students work with either a human or an AI companion. As part of the project, students experience both the advantages and the challenges of working with an AI companion.
“Understanding the limits of AI helps students gain confidence that humans still matter,” she said. “There’s a big future for them, and it’s not just all AI.”
Furthermore, the AI tools in the classroom allow academics and instructors to introduce students to real-world and industry scenarios.
So, for instance, working in teams is a constant in industry, and having AI companions allows courses to better emulate that work environment. In industry, people also don’t develop code from scratch; they collaborate and integrate the work of others. Invariably, they work on projects that already exist as products, making modifications and functional enhancements.
“In our software engineering curriculum, we are actually simulating what happens in industry to see what the students can do,” said Kalyanaraman. “It gives them more challenges, and the students don’t have to be tied down to what they could do individually if they were asked to do these projects from scratch.”
Challenges of new technology
At a university, professors and students work on the cutting edge of technology and academic advancement, and that is always a challenge. Assessments may now be more difficult, and faculty members are still trying to understand the limits and uses of the technology.
“I think everybody’s in the same boat,” said Kandaswamy. “We’re trying to figure out how to achieve the objective, which is making sure that students learn the fundamentals and are equipped so that when they go into industry, they are not blindsided with the AI tools.”
The goal is to think of the computer science program less as a program for programming and more as one that helps students learn problem solving – how to design and envision new ways to apply core concepts in real-world applications.
“That will make the field a lot more impactful, and AI obviously becomes a tool for that, but certainly not the end,” said Kalyanaraman.
And, Kalyanaraman points out, computer science is the major that rigorously teaches students how to actually build AI.
“Students who are interested in understanding how to develop the core components that become a different form of AI need to understand how computer systems work, how they interact with real-world environments, and how the human-driven process helps to interface with these physical systems,” he said. “Our program is not just where you learn how to do the best possible code – I think we go far beyond that.”
Faculty in the school are planning a two-day boot camp on AI and data literacy in spring 2026 to bring together faculty members and instructors in data-driven disciplines and help them better use AI tools in their research and teaching.