Oregon State University|blogs.oregonstate.edu

To AI or Not To AI? That Is [Not Necessarily] the [Right] Question  May 18th, 2023

My journey learning to program began a little over a decade ago, when I used resources like Coursera and Codecademy to learn Python for the first time. I really liked learning from tools like these because they broke Python down into manageable pieces and walked me through how to use everything. ChatGPT was far from existing back then, and tools that would auto-generate solutions weren't accessible to me. Looking back, I'm really glad that ChatGPT wasn't an option, because I don't view it as a very good learning tool for beginners.

I frequently have friends in computer science programs at other schools ask me for help with their homework because they're struggling to understand the concepts, and one commonality I've noticed is that they began learning to code with a reliance on ChatGPT. What ends up happening is that they come across a problem they don't know how to solve, and they've never been taught how to locate the resources that would teach them how to solve it. Instead, they ask ChatGPT for help, and ChatGPT returns an answer that doesn't fully work because ChatGPT doesn't actually know how to program. The biggest issue with this strategy is that people learning to code for the first time can't yet identify why ChatGPT's solution isn't working: they haven't mastered the basics, and without those foundations they can't diagnose why ChatGPT's attempt fails. Having ChatGPT so easily accessible teaches new programmers to try to "shortcut" their learning by leaning on it, which causes them more problems in the long run because they're not actually absorbing what they're supposed to be learning.

Having said all that, I don't think AI should never be used as a learning tool; rather, the circumstances under which it is used need to be more specific. For example, in my work Slack we have an AI bot (his name is Claude) who can help us answer questions, and one of my colleagues asked Claude how to implement a specific technology with TypeScript. Claude returned an answer walking through the implementation, and it was a useful tutorial for getting started with that technology. This works because my colleague already knows how to program in TypeScript and isn't struggling with the foundations of the language. He has enough knowledge and experience to try what Claude suggested, then read through the code and understand why it isn't working, if it isn't. A brand new programmer doesn't know how to do that yet, and also hasn't yet learned just how important that skill is. Programming isn't necessarily about coming up with the right answer; it's about understanding the process of getting to a correct answer.

Because of all this, I do tend to stay away from AI when I am learning technologies that are brand new to me. Once I have a solid understanding of what I am doing, asking AI can be both less detrimental and more useful; in the meantime, I prefer walkthrough tutorials, in text or video form, depending on what I am trying to learn. Resources actually written for beginners are what beginners should be using; AI is best saved for after that beginner phase has passed.
