How are you balancing actually learning vs letting AI write your code?
Posted by TeaOk8063@reddit | learnprogramming | 25 comments
I started using AI tools for coding and it’s insanely helpful. Debugging, boilerplate, even learning new concepts faster. But lately I’ve been wondering if I’m actually learning or just getting good at prompting.
Like, I can solve problems way faster now, but if you took the AI away, would I still be able to do it myself? Not always sure.
So I’m trying to find a balance.
Curious how others are handling this.
Do you treat AI as a teacher, a tool, or just a shortcut?
Are you worried it might slow down real learning, or do you think it actually speeds things up long-term?
Would love to hear how you approach it.
ffrkAnonymous@reddit
That's good for me.
New-Koala-9706@reddit
Hey brother, do you have any spare game keys from Humble that you don't need? I recently built a PC and was wondering if I could use your spares if you have no use for them. I completely understand if that's not possible, but I thought I'd ask. Thanks!
Opening_Pea7537@reddit
When learning I only use AI after actively having tried to solve the problem myself. When I feel like I'm hitting a dead end I use AI to give me clues (kinda like if you asked a professor, teacher or other knowledgeable person). I only let it generate the answer if absolutely necessary. From my experience I personally understand the topics better if I learned them (almost) without AI. If I let AI do most of the work I think I understand the topic, but when I try to solve it without help I'm lost, aka I don't really understand it. It might be different for other people but I really do need to do things myself first to properly understand them.
TeaOk8063@reddit (OP)
That’s pretty much how I use it too.
If I go to AI too early, it feels like I “understand it” in the moment but can’t actually reproduce it later. But if I struggle first and only use it for hints, it tends to stick way better.
I don’t think it’s one-size-fits-all, but the “try first, then ask for clues, not answers” approach seems to keep the learning real.
Aromatic_File_5256@reddit
Don't ask AI for direct solutions, ask for hints.
If you receive code from AI, evaluate it and improve it, even if just a tiny bit.
TeaOk8063@reddit (OP)
i like that
Aromatic_File_5256@reddit
Also, allow yourself to struggle. If you are not being challenged then you are not learning
TeaOk8063@reddit (OP)
That can be applied to life ngl. Thank you
No_Cook_2493@reddit
as a learner, AI should be writing code that you already picture in your head. As it's writing it, you should be following along going "yup, that's what I was thinking!" to nearly every line. If some lines aren't like this, make sure you understand the pattern and why the AI chose it.
If you're not understanding most of the lines, then I encourage you to try and write more functions yourself, as it's a sign you're not familiar with patterns or the language.
(Just my opinion obviously)
TeaOk8063@reddit (OP)
I like this framing a lot.
That “yup, that’s what I was thinking” check is actually a good signal for whether I’m using it correctly or just copying blindly.
Sometimes, though, I don’t have the full picture yet and AI kind of fills in the gaps, but I try to go back and re-write it myself after so it sticks.
Temporary_Pie2733@reddit
Learn first, then use AI to automate what you’ve already learned. (So that you can immediately recognize when the AI-generated code is wrong.)
TeaOk8063@reddit (OP)
Thank you
token-tensor@reddit
treat it like a more capable StackOverflow — great for looking things up, bad for doing the actual thinking. the rule that works: use AI for things you've already understood and could write yourself if you had time. avoid it for concepts you're still building intuition for. first time you're learning how loops work or how recursion feels? write it yourself even if it's slow. the danger zone is using AI before you have a mental model of the concept — then you're just memorizing prompts instead of patterns, and one job interview without internet will expose it.
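To make the "write it yourself" point concrete, a hypothetical illustration (not from the comment): a first recursive function is exactly the kind of thing worth writing and tracing by hand rather than prompting for, because following the calls as they unwind is what builds the mental model.

```python
def sum_list(nums):
    """Sum a list of numbers recursively."""
    # Base case: an empty list sums to 0.
    if not nums:
        return 0
    # Recursive case: first element plus the sum of the rest.
    # Tracing this by hand for [1, 2, 3, 4] is the real exercise.
    return nums[0] + sum_list(nums[1:])

print(sum_list([1, 2, 3, 4]))  # → 10
```

Once you can predict every step of something like this without running it, that's a reasonable signal the concept has moved from "memorized prompt" to "pattern you own."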
TeaOk8063@reddit (OP)
I mostly agree with this framing.
The “danger zone” part is real. If I use AI before I even have a mental model, I can feel myself skipping the struggle that actually builds intuition.
Where I slightly differ is that I don’t think it has to be strictly “already understood vs not at all.” Sometimes I’ll use AI during the learning phase, but in a constrained way (e.g., “explain why this works” or “give hints, not full solution”), then I go implement it myself.
So I guess my rule is less “no AI until mastery” and more “don’t let AI replace the moment where I actually wrestle with the concept.”
azac24@reddit
I don't use AI period. It's never the most efficient and just stunts your growth. The fact you have to ask tells me you also realize you don't learn much when you use AI and you are not at the level to be able to use it effectively.
TeaOk8063@reddit (OP)
I get where you’re coming from. If someone relies on AI too early or too heavily, it can definitely slow down their fundamentals.
For me it’s less “AI vs no AI” and more “where in the process it fits.” I still try to solve things myself first, then use AI to unblock or compare approaches.
I don’t think the tool itself is the issue, it’s how it’s used and whether you’re still actively thinking through the problem.
incompletelucidity@reddit
let's be real, the best engineers came from a time when AI didn't exist yet. having a shortcut for something means the brain can afford to be lazy and depend on it instead of actually putting in the work
first, being able to google made memorizing syntax worthless. now, with AI being able to sort of solve everything for you (albeit in a bad way), we offload cognitive tasks to it and then wonder why we can't solve anything without it
i personally can't learn with AI and don't get how people say they learn stuff with it. maybe the best way to use it is, if you're stuck on a problem for more than X time, ask AI to solve it, because it starts being counterproductive to keep wasting time on it. the same way you would check a leetcode solution after not figuring it out for 30 minutes
TeaOk8063@reddit (OP)
the google analogy is actually perfect. nobody's out here memorizing every CSS property and that's fine. but there's a difference between offloading recall and offloading reasoning and that's where AI gets dangerous.
the X minutes rule is underrated advice. the struggle before the answer is where the learning actually happens. AI just makes it too easy to skip that part.
i think the trap is using AI to get answers vs using it to check your thinking. one makes you dependent, the other actually builds something.
Mortomes@reddit
You could start by not having it write your reddit posts.
narnru@reddit
It seems you also use AI to write engagement posts.
On the point - learning and AI are like north and magenta. Learning is about changing yourself. AI is about solving tasks.
If you remove the process of changing yourself by delegating the task to AI, you hinder your growth. If you delegate to AI the task of finding the concept that allows you to solve the problem, and then learn that concept yourself, then you're using AI to improve your growth.
Jwhodis@reddit
Just don't use it.
Hamza_yassen@reddit
I personally don't use AI in my learning phase
TeaOk8063@reddit (OP)
Fair, but it's too tempting, you know.
Hamza_yassen@reddit
Yeah I get lazy whenever I use it
d4m4s74@reddit
The only AI I use in programming is autocomplete. I don't request full code, I just let it do the repetitive stuff.