Anyone else burning through tokens trying to get AI to refactor large files?

Posted by morph_lupindo@reddit | ExperiencedDevs

I've been hitting the same wall over and over. I'll feed a large file to an AI agent and ask it to break it into modules. Three problems keep coming up:

- Tokens get eaten fast, especially with recent pricing changes.
- Things get truncated silently. You don't even know what got cut until something breaks at runtime.
- Sometimes the agent just gets stuck in a loop rewriting the same section.

I get it, I'm sure most of you know how to refactor without AI. I used to code everything by hand back in the day too. But I'm lazy now, and the idea that an AI can spin out 1500 lines of working code is hard to walk away from. Problem is, projects creep up to 5k, 10k, 20k lines, and that's where it falls apart.

I ended up building something that tries to find a middle ground. The AI just decides where to split — which functions belong together, what the dependency graph looks like. Then a dumb mechanical engine copies the code by line range. No tokens burned on the boring part. It handles maybe 80% of a typical file mechanically, the AI still does the tricky 20%.
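To make the split concrete, here's a minimal sketch of what I mean by a "dumb mechanical engine": the AI emits a small plan (module name → line ranges), and plain code copies those ranges verbatim, so no model tokens are spent reproducing the code itself. The plan format and function name here are hypothetical illustrations, not the actual refactory API.

```python
# Hypothetical sketch: the AI produces only a split plan; this function
# does the mechanical copying. Line ranges are 1-indexed and inclusive.
def split_by_plan(source: str, plan: dict[str, list[tuple[int, int]]]) -> dict[str, str]:
    """Copy the listed line ranges of `source` into per-module bodies."""
    lines = source.splitlines()
    modules: dict[str, str] = {}
    for module, ranges in plan.items():
        # Join each range verbatim; blank line between non-contiguous chunks.
        chunks = ["\n".join(lines[start - 1:end]) for start, end in ranges]
        modules[module] = "\n\n".join(chunks)
    return modules

# Toy usage: a 10-line "file" split into two modules by a plan.
big_file = "\n".join(f"line {i}" for i in range(1, 11))
plan = {"util.py": [(1, 3), (8, 10)], "core.py": [(4, 7)]}
out = split_by_plan(big_file, plan)
# out["core.py"] holds lines 4-7 verbatim; nothing was paraphrased by a model.
```

Since the engine copies bytes rather than regenerating them, silent truncation can't happen on the mechanical 80% — the only failure surface left is the AI's plan itself.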

Open source if anyone wants to kick the tires: github.com/codedrop-codes/refactory

Curious what other approaches people have tried for large-file refactoring with AI?