I ditched all LLM frameworks and use only the OpenAI SDK for everything, and I've started to love building AI applications this way.
Posted by dheetoo@reddit | LocalLLaMA | View on Reddit | 29 comments
I've tried several LLM frameworks and libraries, each with its own direction (Haystack, LangChain, etc.). I've also tried several agent frameworks like AutoGen, SmolAgent, and Strands. All I can say about these frameworks is that they're "exhausting."
I feel like every application built with these tools takes me twice as long. I have to go back and forth between the documentation and other people's examples just to implement some simple control flow.
With just the OpenAI SDK (or plain API calls), you can connect to almost any model that supports the OpenAI API spec, and everything is just structured output. You treat the LLM like a function that reliably returns predefined values you can expect. I love building AI applications this way - it's so lean and easy, and you get full visibility into how each API call went.
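A minimal sketch of the "LLM as a function" idea, using the OpenAI SDK's structured-output helper against an OpenAI-compatible endpoint. The base URL, model name, and schema below are placeholders, and not every local server supports the json_schema response format:

```python
from pydantic import BaseModel
from openai import OpenAI

class Sentiment(BaseModel):
    label: str          # e.g. "positive" / "negative" / "neutral"
    confidence: float

# Point the same SDK at any OpenAI-compatible server; base_url and model are placeholders.
client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")

completion = client.beta.chat.completions.parse(
    model="my-local-model",
    messages=[{"role": "user", "content": "Classify: 'This library is a joy to use.'"}],
    response_format=Sentiment,  # the SDK enforces and parses this schema
)

result = completion.choices[0].message.parsed
print(result.label, result.confidence)
```

The call behaves like a typed function: you pass text in and get a validated `Sentiment` object back, which is the "reliably returns predefined values" workflow described above.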
kmouratidis@reddit
Have you tried sglang's "language"? It's not a direct alternative, but imo it's at a good level between full and zero abstraction.
No_Afternoon_4260@reddit
Care to elaborate?
kmouratidis@reddit
"SGLang Frontend Language" (https://docs.sglang.ai/frontend/frontend.html) provides a relatively minimal interface that I find acceptable.
e.g. tool use and images (rough sketch below):
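A rough sketch of what the frontend language looks like, based on the linked docs; the endpoint URL, prompts, and tool names are placeholders:

```python
import sglang as sgl

@sgl.function
def tool_use(s, question):
    # Constrained generation: the model must pick one of the listed tools
    s += "To answer this question: " + question + ", I need to use a "
    s += sgl.gen("tool", choices=["calculator", "search engine"]) + ". "

@sgl.function
def describe_image(s, image_path, question):
    # Multimodal turn: attach an image alongside the question
    s += sgl.user(sgl.image(image_path) + question)
    s += sgl.assistant(sgl.gen("answer", max_tokens=256))

sgl.set_default_backend(sgl.RuntimeEndpoint("http://localhost:30000"))
state = tool_use.run(question="What is 2^10?")
print(state["tool"])
```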
And the server has some additional endpoints too (https://docs.sglang.ai/backend/native_api.html) that may make certain tasks easier (e.g. /classify, /v1/rerank).
No_Afternoon_4260@reddit
Thanks a lot, that's some useful documentation. I need to put it on my to-do list.
__JockY__@reddit
Yup. The OpenAI stuff just works, I’m using it locally with vLLM for agentic workflows with MCP (FastMCP) and it’s been quick, easy, and effective.
I have my issues with Sam Altman and the company as a whole, but the OpenAI frameworks are great for those of us who ignore the cloud crap and work locally.
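For reference, a tool server with FastMCP is only a few lines. A minimal sketch (the server name and tool here are made up):

```python
from fastmcp import FastMCP

mcp = FastMCP("local-tools")  # hypothetical server name

@mcp.tool()
def word_count(text: str) -> int:
    """Count the words in a piece of text."""
    return len(text.split())

if __name__ == "__main__":
    mcp.run()  # exposes the tool over MCP for an agent loop to call
```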
SaltResident9310@reddit
Is it paid?
__JockY__@reddit
No, it’s an open source Python library.
SaltResident9310@reddit
Can it be used as a coding assistant by any chance? I've been looking for an offline local method for setting up an agent on VS Code for a while.
__JockY__@reddit
No, it’s a Python library. Google it, I’m not here to answer rudimentary questions based on misunderstanding.
ApprehensiveBat3074@reddit
What a dick you are.
Fox-Lopsided@reddit
You can use Cline or Roo Code and connect it to an LLM you host with LM Studio or Ollama (they both provide an OpenAI-compatible endpoint).
Fox-Lopsided@reddit
Another alternative you could look into is Opencode, if you prefer CLI tools.
Charming_Support726@reddit
I feel you.
Last year I did a few things with LangChain. "Exhausting" describes it very well. LangChain and LangGraph made me cry. The docs and toolchain were somewhere between hard to understand and unreliable. And they pretend it might get better if I pay. Aha.
Then I switched to SmolAgents. Nice experience, and introducing the CodeAgent concept is a great idea. But then I did a proof of concept with it (for a pitch), and I found that I had rewritten 50% of the code because my agents needed a different "spin".
So, all in all, I agree that these frameworks are not of great use.
__SlimeQ__@reddit
Anyone telling you they know the right way to integrate LLMs is full of shit, and that is the base thesis for ALL of these frameworks.
The field is moving too fast. There's no point in learning other people's magic spells when you can just learn the science and do it yourself, with the appropriate discretion.
No_Afternoon_4260@reddit
In french we have a saying:
"ce qui se conçoit bien, s'énonce clairement, et les mots pour le dire viennent aisément".
Which could be translated to:
"What is well conceived is clearly stated, and the words to say it come easily"
It isn't that hard, and once you know what you want to do, the path to get there is easy. Plus it gives you the freedom to experiment at will.
I spent so much time trying to learn these frameworks, then trying to build my own. I think stupid scripting is often the answer.
RandomUserRU123@reddit
Yeah, I also feel like there are very few abstractions that actually make sense to learn. So far I've only found vLLM, SGLang, and Verl to be useful.
Phptower@reddit
KISS 💋. But LangChain has built-in sessions, which is nice.
__Maximum__@reddit
Aren't those tools also built around the OpenAI API? Are you saying writing your own functions is better than using these libraries?
Tenzu9@reddit
That's the beauty of JSON-wrangling apps/libraries. As long as you have an OpenAI-API-compatible endpoint, you're set!
KoboldCpp, LM Studio, llama.cpp, Ollama, and many others support this API.
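For example (a sketch; the port assumes Ollama's default and the model name is whatever your server exposes, so adjust for your setup):

```python
from openai import OpenAI

# Any OpenAI-compatible server works: just swap base_url (local servers usually ignore the key).
client = OpenAI(base_url="http://localhost:11434/v1", api_key="not-needed")

resp = client.chat.completions.create(
    model="llama3",  # whatever model name your local server exposes
    messages=[{"role": "user", "content": "Say hello in one sentence."}],
)
print(resp.choices[0].message.content)
```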
searchblox_searchai@reddit
You can use SearchAI locally and avoid all the "agent frameworks" https://www.searchblox.com/downloads
Longjumping_Try4676@reddit
OP is having their AI r/minimalism moment
tvmaly@reddit
I gave up on LangChain early on, when I tried the course the creator made with Andrew Ng and the code they provide in the notebook didn't work.
I listened to the interview about PydanticAI on the Latent Space podcast and I'm a bit excited about that. We're using it for a team project and it's working great.
Asleep-Ratio7535@reddit
I agree with you on the easy part. But a lot of providers actually don't support some of the calls, like tools. How did you handle that? A switch that turns them off when they're not supported?
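One way to read the "switch" idea is a per-model capability flag that drops the unsupported parameters before the call. A sketch (the endpoint, model names, and capability table are all hypothetical):

```python
from openai import OpenAI

client = OpenAI(base_url="http://localhost:1234/v1", api_key="not-needed")  # hypothetical endpoint

# Hypothetical capability table; maintain it yourself or probe each provider once at startup.
SUPPORTS_TOOLS = {"my-local-model": False, "gpt-4o": True}

def chat(model, messages, tools=None):
    kwargs = {"model": model, "messages": messages}
    # Only pass tools when the target endpoint is known to accept them.
    if tools and SUPPORTS_TOOLS.get(model, False):
        kwargs["tools"] = tools
    return client.chat.completions.create(**kwargs)
```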
05032-MendicantBias@reddit
It's what I'm using as well. I use LM Studio to host the model and the API to query it.
Minute_Attempt3063@reddit
/r/openai
dheetoo@reddit (OP)
I use the OpenAI SDK with LM Studio hosting my local models and it works like a charm!
Relevant-Savings-458@reddit
Reasonable viewpoint.
BidWestern1056@reddit
I had similarly positive experiences with the OpenAI SDK and models when I was still working for someone who paid the API costs. I've tried to reproduce that reliability for local models as well, while providing a seamless experience between local and enterprise. I know you don't like frameworks because of how much they obfuscate, but npcpy may be one you should try out if you ever want to integrate other non-OpenAI providers and have AIs with personas without as much handwringing. Everything in npcpy is built with litellm, and the way tools are set up follows the OpenAI spec too, so you shouldn't need to change too much.
https://github.com/npc-worldwide/npcpy
I'm trying to have this be a simple library for integrating common data sources and producing common desired data outputs (text, image gen, video gen, image editing).
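Not npcpy's own API, but a minimal litellm sketch of the provider-swap idea described above; the model names assume an OpenAI key and a local Ollama server:

```python
from litellm import completion

messages = [{"role": "user", "content": "Summarize why minimal abstractions are nice."}]

# Same OpenAI-style call shape across providers; only the model string changes.
openai_resp = completion(model="gpt-4o-mini", messages=messages)
local_resp = completion(model="ollama/llama3", messages=messages)  # assumes a local Ollama server

print(openai_resp.choices[0].message.content)
print(local_resp.choices[0].message.content)
```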