Why do you think Microsoft forces employees to use AI?
Posted by Affectionate-Mail612@reddit | ExperiencedDevs | View on Reddit | 50 comments
I get why they try to push it into VS or GitHub - to get revenue from us. Whether it has value or not is a secondary question to them.
But their employees? If AI were this good, there would be no point in making it mandatory.
My personal theory is that they want developers (theirs or not) to get dependent on AI, which would mean dependence on them (Microsoft). Basically to make them dumber, less mobile, and less independent.
t0rt0ff@reddit
Controversial take in this subreddit: because AI can actually improve engineering throughput, but a lot of the engineers feel threatened / don't want to learn / don't want to adjust / tried it once and it didn't work. I do agree though that for recent layoffs they used AI just as an excuse. AI is not there yet to replace experienced engineers, but if used well it is there to increase their productivity.
Venthe@reddit
It is controversial because it's been proven untrue.
https://arxiv.org/abs/2507.09089
t0rt0ff@reddit
What they actually proved: copilot-style development likely doesn’t work for experienced engineers, especially if they are new to AI-assisted coding. I encourage you to read the article carefully and entirely. Especially check the graphs.
The benefits of AI come when you invoke parallel execution with proper preplanning. The METR research completely misses that. So the research did prove something, but definitely not that AI-assisted coding doesn't work. I won't even go deep into the discussion that Cursor, which is what they used in the experiment, is (or at least was as of a couple of months ago when I last tried it) a pile of crap and should not have been used in the experiment at all.
writesCommentsHigh@reddit
1000%. It's the next tool/thing that will become part of our daily workflow. Any mention of AI being helpful is often met with downvotes in this sub.
BigFanOfGayMarineBmw@reddit
MS is massively invested in AI. If they can't even demonstrate that their own devs get x% boost out of it, it's a failure. They don't really have a choice.
annoying_cyclist@reddit
I've started to see this in job ads, too, as a bullet point in the job description. A recent one even quantified it: "Uses AI tools in day to day work to work faster (target ~30% usage)" or something like that. What the hell "30% usage" means is left as an exercise for the reader (or, more likely, ignored by line managers with more sense than upper management innovating on meaningless dictates).
originalchronoguy@reddit
It is quantifiable.
I know a few 20-30 YOE Staff engineers who tell me the same thing.
"I built something over the weekend that normally would have taken me 5 months"
I hear it all the time.
annoying_cyclist@reddit
Sure. I've had those experiences myself. I have a couple of hobby projects where north of 80% of the code was generated by Claude Code, because they're greenfield and relatively simple and LLMs tend to be good at that. I've also gone weeks in paid employment working on things where an LLM would be an unacceptable risk or slower/worse than I would be: older code, a large/mature codebase, a high-risk project that depends on a lot of niche context, etc. I'm a professional, familiar with the pros/cons of LLMs, and can use my judgement about where they do and don't make sense, just like I do with my other engineering knowledge. Giving me a reductive or meaningless KPI to hit ("% usage", acceptance rate, % of code generated by AI) interferes with my ability to do that.
If executives want their employees to use AI, they can commit to paying for state of the art tools, give employees time/space to experiment with them, and trust them as skilled professionals to figure the other parts out.
AvailableFalconn@reddit
There’s no 4D chess. They think their employees are lazy and not bothering to learn new tech, rather than accepting that at best the tools are still a work in progress. Plus all their friends at the golf club are doing it to their employees.
Itallian-Scallion@reddit
Totally - they think that the devs are lazy and need to be pushed. We're all using it already but they really want to see more.
Affectionate-Mail612@reddit (OP)
I'm an AI sceptic and still think LLMs are great for learning new stuff. One of the few legitimate uses for me.
ampersand355@reddit
Completely disagree. When using AI, I don't think you should ever let it lead you - you should lead it. It can be completely hallucinating and you wouldn't know until you've learned enough to realize it.
Affectionate-Mail612@reddit (OP)
Let's say it's 80% legit useful information and 20% wrong.
It's still 80% useful information.
PetroarZed@reddit
“It ain't what you don't know that gets you into trouble. It's what you know for sure that just ain't so.”
ampersand355@reddit
You have no idea what is useful or useless. It hallucinates methods, functionality, purpose... If you find that helpful, I'm not here to convince you otherwise. I am forced to use it and I use it as a sounding board, and it's awful. I've turned off its LSP-like suggestions because they are always worthless.
Mammoth-Clock-8173@reddit
I must say, it seems to have infinite patience when I ask it the same question 6 different ways. This definitely sets it apart from my architect.
chmod777@reddit
https://en.m.wikipedia.org/wiki/Eating_your_own_dog_food
writesCommentsHigh@reddit
Also cuz the industry currently believes this is the direction we are all heading.
ampersand355@reddit
No one who works in the industry believes this. The only believers are the MBAs.
writesCommentsHigh@reddit
Do you have supporting evidence besides reddit's own echo chamber?
In the past year it has become superior to googling for answers and understanding issues.
Vector-Zero@reddit
Okay, but Google has become useless as a search engine. Now I'm wondering if that's intentional to funnel users into using their AI tools.
writesCommentsHigh@reddit
Possibly. Could also be dead internet theory or optimization for ad revenue (likely always true).
Company want money and be in lead. Company do what company do 🦍
I’ve done my namesake and am now Zooming out to a wider timeline… we’ve (humanity) barely had the internet. 30 years since windows 95 which is around the time the net started to proliferate.
The internet and technology have never stopped evolving, at a pace we’ve never seen before. Googling and human based information creation has likely peaked (in terms of content on the internet, ignoring research and science). Sure new websites and ideas emerge but we probably have 80% of knowledge at a minimum available online.
As I see it right now: AI is the repository of the internet that can also do its own searching.
Idk where I'm going with this, I finished the joint.
Oh yeah, we’re in the middle of some change, whether it’s AI or the next thing, it’s gonna keep changing.
In the last 15 years since the iPhone and android, we’ve basically given every human a computer.
It’s gonna keep changing. We just gotta wait and see which sci fi writer got it right.
ampersand355@reddit
Only for technology that has a bunch of software already written to handle it or public configurations. I've found it worthless for enterprise software.
writesCommentsHigh@reddit
Interesting but also makes sense if none of the tech you're using is publicly available.
Also curious to know: what tools and LLMs have you tried?
ampersand355@reddit
Only Gemini and Copilot as I have KPIs to hit for usage of each.
writesCommentsHigh@reddit
That’s annoying. I can see why you’d be frustrated. I’ve had a lot of success with Gemini 2.5 pro. But i work with consumer facing iOS projects. Plenty of learning material for an LLM.
My experience with copilot on Xcode was shit. I’ve had to use a separate app that indexes our codebase and submits it (privately) so the LLM is not handicapped.
ampersand355@reddit
I didn't mean to hate on it too badly, I just see where this is all headed and it's made me cynical and a bit of a hater. I do think AI is a great sounding board for ideas and for seeing examples of things.
The problem is that we have performance metrics to hit for usage and query evaluation due to how expensive the AI contracts are. It's just annoying at this point and I ran into a bunch of issues where I was having to correct the AI all the time. Right now it's just a big annoyance to me.
You're giving me a reason to explore with an AI to write my first iOS app :D I am kind of just a generalist, I used to work on a web of microservices at a credit card company and now I just write internal tooling for another financial company.
writesCommentsHigh@reddit
Perhaps it’s time to search for opportunities without execs counting AI queries? I’ve personally avoided big corpo cuz I would struggle with the bs.
For Xcode I use AlexCodes app however I (mostly) use it in place of google or when I’ve run out of steam (vibe coding often ends at a dead end).
For example, Monday night I barely slept, so by the time Tuesday afternoon rolled around I was vibe coding a coach mark tutorial. I got it working, but then the AI got stuck in a hallucination loop trying to fix an issue (I was attempting to guide it while doing other non-work stuff).
The following morning I fixed the issue in minutes myself.
I’ve also used Replit to build a small web app.
AvailableFalconn@reddit
To executives, the MBAs are the industry, not the boots on the ground. We are resources, headcount, cost centers.
Dave-Alvarado@reddit
It's this. Probably because they know that if they make 50k engineers use it, some of them are going to have some good ideas how to use it in a useful and ultimately profitable way. I would bet good money that Microsoft gets to use OpenAI (and now Anthropic) products effectively for free. Writing a policy that everybody uses AI is also effectively free (ignoring opportunity cost of slowing down your engineers). So, anything the engineers invent internally that you can sell to customers is pure profit.
riggiddyrektson@reddit
AI will be THE tool of the future, in one way or another. Having all the devs be on top of that, or at least familiar with it, makes total sense to me.
snorktacular@reddit
People are downvoting you because they're reading this as a prediction instead of a description of what's happening. We're basically witnessing Chinatown or the GM streetcar conspiracy in real time.
Affectionate-Mail612@reddit (OP)
What if people are downvoting him because they think autocomplete on steroids is overhyped?
snorktacular@reddit
You missed my point. I'm saying that the people in power are declaring how the future is going to look because they're the ones who will make it that way, by fiat.
LLMs don't have to be great to be forced on us. Same with open offices, RTO, pointless meetings. If leaders decide that our jobs require shoveling shit then they can justify (to themselves) firing us for not shoveling shit and there's not much we can do about it.
I don't like it any more than you. There are ways to push back and make people see sense. The chatbot wrapper SaaS startup bubble will pop, and vibe code in prod will cause some majorly embarrassing incidents. But many of these leaders are using reality-distortion fields to justify their bad decisions. Even if they don't, they're just going to fly away on their golden parachutes and leave the mess behind for someone else to clean up.
writesCommentsHigh@reddit
You're going to get downvoted to shreds here because most devs here don't want to admit the truth.
People hate change.
Affectionate-Mail612@reddit (OP)
Very questionable statement. It has uses, but the hype is too much. Besides, it's not even AI, it's an LLM.
pl487@reddit
Or they actually think the thing they have built has value and want that value to be realized. They're just people.
iduzinternet@reddit
I'll admit I'm a bit on the bandwagon, but as a mid-career dev I find a lot of my coworkers are hesitant to adapt. I think just like any training, you legitimately don't want your team to fall behind, so you tell the ones that won't go that they must. I'll admit a lot of top levels have a lot more hype than reality, but there isn't much dev work that can't be aided at all by super autocomplete, having your tickets auto-update, and an automatic rough draft of your documentation.
I treat it like an intern: all of the above needs to be reviewed, but unless your intern is terrible, it's easier to review the work and recommend changes than not have them do it at all.
spacechimp@reddit
Aside from the obvious productivity/financial reasons: Copilot is not allowed to train on customer code, so I wouldn't be surprised if they want as much internal usage as possible to continue training it.
Comprehensive_Top927@reddit
All companies that invest in AI want their employees to use it to justify the heavy spending on AI and sell it to other companies.
I think there will be a point in the future, where the heads of these companies realize that AI is not the hyped panacea. There was a recent MIT article about this.
honey1337@reddit
They probably just view it as a boost to productivity. So if you aren’t using it you are deliberately trying to work slower (I don’t necessarily agree, but it is pretty good at a lot of tasks).
ampersand355@reddit
It’s just a form of self-dealing. They have massive investments in the technology, they won’t allow it to fail, they need users to ensure it continues being trained.
DancingSouls@reddit
easy. so then they can attribute any employee improvements and effort to the increased internal usage of AI whilst getting free dogfooding.
whatever makes stock go up lol
boneytooth_thompkins@reddit
It's this.
Mo-42@reddit
Affectionate-Mail612@reddit (OP)
I remembered Ryan from The Office when he was an executive when I read this.
CompetitiveSubset@reddit
They are running around like headless chickens, chasing trends as their biggest fear is FOMO.
solar_powered_wind@reddit
These companies aren't ruled by the best of humanity, they're ruled by sycophants that care about their own wealth.
Companies make stupid decisions all the time; it's perfectly healthy and normal. The idea of Microsoft still existing in the year 2200 fills me with existential dread.
drew_eckhardt2@reddit
Executives believe AI will make developers more efficient.
suitupyo@reddit
So that they can outsource their jobs to India, tell investors it was because of AI efficiencies, and sell AI products.