I've been teaching programming for 8 years. The students who use AI from day one are learning something, but it's not programming.
Posted by Ambitious-Garbage-73@reddit | learnprogramming | 11 comments
This isn't an "AI bad" post. I use AI constantly. But I need to talk about what I'm seeing in students who start learning with AI as a crutch versus those who don't.
The AI-first students can ship. They can take a problem description and produce something that works faster than anyone I've ever taught. Genuinely impressive output speed.
What they can't do: debug without AI. Reason about why their code is slow. Explain what a variable actually holds at runtime. Read an error message and know where to look. Understand what happens when something fails.
I had a student last month who built a working web app in their second week. Legitimately functional. Then I asked them to add a console.log to see what a variable held at a specific point in execution. They didn't know where to put it. They didn't know what "at a specific point in execution" meant. They'd built the whole thing by describing features to AI and accepting outputs.
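For readers who are where that student was: "at a specific point in execution" just means a particular line the interpreter reaches while running your code. A minimal sketch of what was being asked (the function and variable names here are hypothetical, not from the student's app):

```javascript
// A hypothetical price calculation from a small web app.
function applyDiscount(total, rate) {
  const discounted = total - total * rate;
  // This is "a specific point in execution": the moment after
  // `discounted` has been computed but before it is returned.
  // A console.log here shows what the variable holds right now.
  console.log("discounted =", discounted);
  return discounted;
}

applyDiscount(100, 0.2); // logs: discounted = 80
```

The skill being tested isn't knowing the `console.log` syntax; it's knowing that code runs top to bottom and that a variable has a concrete value at each step, so there is a meaningful "here" to put the log.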
The mental model of "code as a sequence of instructions the computer executes" never formed. They skipped straight to "code as a thing that does stuff when you describe it right."
That mental model works until it doesn't. When the AI gives you something wrong and you can't tell it's wrong. When you need to optimize something and don't know where the time is going. When you're in a job interview and there's no AI.
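"Not knowing where the time is going" is itself a checkable skill, even without AI. A sketch of the most basic version, timing two hypothetical implementations of the same task with `console.time` (the accidental-quadratic string-building pattern below is a classic example, not something from the post):

```javascript
// Sum the integers 0..n-1 two ways: once via an accidental
// string-building detour (a common beginner performance bug),
// once directly.
function slowSum(n) {
  let s = "";
  for (let i = 0; i < n; i++) s += i + ","; // repeatedly grows a string
  return s
    .split(",")
    .slice(0, -1) // drop the trailing empty entry
    .reduce((acc, x) => acc + Number(x), 0);
}

function fastSum(n) {
  let s = 0;
  for (let i = 0; i < n; i++) s += i;
  return s;
}

console.time("slowSum");
slowSum(100000);
console.timeEnd("slowSum"); // prints elapsed time for the slow version

console.time("fastSum");
fastSum(100000);
console.timeEnd("fastSum"); // typically far smaller
```

Bracketing a suspect region with `console.time`/`console.timeEnd` (or a real profiler) is exactly the kind of "where is the time going" question the post is describing; a student who never formed the execution model doesn't know there's a region to bracket.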
The students who learned the hard way first — who struggled with loops, who debugged their own pointer errors, who had to actually understand execution flow — those students use AI well. They know what they're asking for. They can verify the output. They use it as a tool.
The others are building on a foundation that isn't there yet.
Not sure what the right answer is. Curious if others who learned recently feel like they skipped something important, or if I'm just being an old man yelling at clouds.
mpersico@reddit
I think instead of just programmers, we now have both programmers and "vibists". I think they need to be cross-trained.
Astronaut6735@reddit
I think you're on to something. We need a new word to describe people who can create code, but can't write, understand, or modify code.
desrtfx@reddit
Please, stop reposting this. There are already three identical posts of yours. One would have been more than enough.
Immediate-Paint-3825@reddit
AI can delay learning the foundations because it gives you really cool stuff to start with, and it doesn't break until you reach a certain level of complexity. But once you hit that level, it can feel daunting to go back and learn foundational things. It's like looking down a mountain and realizing how far up you went without any experience.
I like to think of it as two mountain climbers. One actually practiced on a small cliff first, then a medium mountain, then went for Everest. The other bought a jetpack that takes them up the mountain. They get too ambitious, decide to try Mt. Everest, and halfway up the jetpack runs out of fuel. They look down and realize they've never actually climbed a day in their life. The first climber can use the jetpack just the same, but if it fails they can still fall back on their skill. And because they're well versed in climbing, they can climb the parts that are easy and use the jetpack when they need it. It's selective and used with intention. Not a crutch.
People often think devs are just being bitter or coping because AI will replace them. But honestly, our brains aren't changing any time soon. There's no substitute for repetition, reading, adapting, memorizing, and doing thousands of problems and debugging sessions. We don't know everything about the brain, but we do know you need to pick difficult problems, practice, make mistakes, fix them, and rinse and repeat for years to get good at anything. This applies to every field. Imagine people skipping that step entirely.

Funny thing is, we've already seen this before. Remember when people would get stuck in an endless loop of video tutorials ("tutorial hell") because the mental load of opening a video and following along isn't as heavy or as scary as opening the terminal and trying to create a project from scratch? AI is the new form of hand-holding. AI increases access to knowledge, but no one except you can make you consume that knowledge. We have libraries, but people still refuse to learn. We have professors who are experts in every field, but people still refuse to learn. Now we have AI and, guess what, people will still refuse to learn.

It's more comforting to give a prompt to AI, get an output, and say "I made this" than to actually learn. Learning is hard; it makes you feel insecure, it challenges you, it makes you feel stupid and tired. But that's fine. AI will create a bunch of people permanently stuck in that beginner phase, just like tutorial hell did. And they won't be able to solve bigger and better problems with or without AI, because they can't even comprehend those problems, or know they exist to begin with. Who will prompt an AI better: the engineer with 10 years of experience, or the beginner with vague prompts who barely even knows what they want? Give a baby AI and see what it does. You can produce better input if you have more experience. But people think we're just coping when we say that.
By the same logic, a doctor can use AI better for medicine, and a dev can use AI better for code.
yopla@reddit
Can confirm that most of the interns we get nowadays are completely useless when they can't prompt their way out of the paper bag they trapped themselves in.
I don't expect much from interns in terms of skills. I just expect them to try to learn something, but last year's batch had their brains shut off, just copy-pasting Slack messages into Claude.
When asked to explain a piece of code, the conversation usually went like this:
/Gave up
silliputti0907@reddit
I tried coding with Claude, and while it's impressive, its limitations are clear. I had issues with Claude reading APIs wrong: it made an assumption about what a key was. It also kept trying to fix the Python script when it knew there was a connection issue on the server side.
Unlikely-Ad9850@reddit
I struggle to understand the point of this post. Most new AI models can and do tell you how to debug, and can easily explain how to improve performance.
bystanderInnen@reddit
Debug without ai? Lol
Designer-Fix-2861@reddit
Chinese room experiment