“I know this is a thing I want, but this, I think, would ruin my interactions with people,” was computer science professor Sam Ogden’s takeaway after a 30-minute interview about the role of AI in college and teaching.
Ogden is a computer science professor currently teaching advanced machine learning; Mridula Mascarenhas is a Humanities and Communications professor teaching the capstone class Being Human in an AI-Driven World.
For both teachers, AI is already playing a role in the classroom.
Where can AI go wrong in the learning process?
Mascarenhas: If students don’t use [AI] intelligently enough, then what they might be using it for is answering a prompt in a very general way. If I get submissions that don’t look like they’ve gone in depth, particularly with respect to course material that we’ve talked about or discussions we’ve had in class, then that tells me that this was written by somebody who wasn’t necessarily present in class and thinking about our coursework particularly.
Ogden: [Computer Science] is kind of taking the stuff that you’ve learned previously, and every year we kind of roll back another layer. AI can add a level of confusion to it. It’ll give the way-too-deep answer, or it’ll give the way-too-shallow answer.
How do you feel about the use of AI in assisting research?
Mascarenhas: I don’t know the architecture underlying [AI searches], and that’s what makes me hesitant. But you know, the computer science professor, for example, might say ‘well, those five articles are coming up precisely because it’s trawling the internet really well.’ What I will say about old school research, or at least the joy of it, was when you are the person going down the rabbit hole. There’s also something about rolling yourself around in the literature, as opposed to getting it thrown up.
Ogden: Retrieval-augmented generation, where it’s going to go off and look in a database for specific things. It’s great, but getting the systems to actually go off and query the database is a lot harder than you think it would be. For our school-specific information, say we had a bunch of information on CSUMB, getting it to pull from just that, it’s not going to often do that.
What difference have you noticed in students before and after AI?
Ogden: I don’t see individual students code all that much. But what I do tend to see is on exams, I have a lot of open-ended short answer questions. And I will definitely see that there is a difference between how students who seem to have really grappled with the material will write the answers to questions and the ones who are just kind of like ‘I know this stuff’ and then the ones where there’s a total disconnect.
Mascarenhas: What I’m noticing across the board is something that makes me wonder if students are relying on this more than they have in the past. Because I used to, as I mentioned in class, hear students’ voices in their submissions, because everyone’s got a unique way of speaking, a unique way of writing, particularly with students who are still formulating their academic voice. There used to be much more diversity. And when I see that uniqueness erased and very generic language, then that’s usually a tip-off.
What can AI not do?
Mascarenhas: AI cannot teach you in that sense how to be a better thinker personally, or how to develop a life philosophy that is more profound than what you have right now, right? That’s what universities do, that’s what classrooms have done for a very long time.
Ogden: Sure, they can do very technically proficient things, but there’s a difference between technically proficient and actually creating something artistic. And that’s where, I don’t know where that line is, because everyone else in my family is good at that. It’s not me. But it seems to be something of that introspection, reflection and thinking about how to push back on the world a bit.
