Keep API work local: Why offline-first beats cloud-based tools
Posted by kiselitza@reddit | programming | View on Reddit | 64 comments
The gist of the article is that cloud-based API tools like Postman can expose your data and leave you stuck when servers fail or docs lag (both have actually happened multiple times recently).
Offline-first API workflows, on the other hand, offer better security, greater efficiency, and more developer control.
This isn’t about swearing off the cloud. You’ll still hit live endpoints for real requests. You'll host a bunch of things, as you should. But secrets and API keys? You'd really let a third-party cloud take care of those? I sure don't want to.
TadpoleNo1549@reddit
Cloud tools are super convenient, but trusting them with secrets and API keys is where it starts getting risky. Offline-first just gives you more control and fewer surprises when things go down. I feel like this is also where something like runable fits nicely, keeping workflows controlled while not fully depending on external systems.
SignificantPound8853@reddit
We are concerned about entrusting private keys and API keys to a third-party cloud. In the event of an emergency, it is impossible to predict the risks that may arise.
Trang0ul@reddit
This article about "Service as a Software Substitute" by Richard Stallman is also worth reading.
rar_m@reddit
What? Am I just old or do people not write their code and test it locally before deploying to staging, verifying on staging then deploy to production?
Who develops APIs against a remote service lol, that's so stupid.
Also, postman is just a tool to make HTTP requests, you can use it against a local server too. How is it a 'cloud-based' tool?
I feel so out of the loop.
TomWithTime@reddit
It has been updated so that you need to log on to use features. If you are logged out then you lose access to your collections. I don't like postman (or graphql, or distributed monoliths, or ORMs, etc) but I have no control over the tools and stack work chooses.
You wouldn't think that would be the case for something like Postman, but we have to use it so that cloud bullshit is available to the rest of the team when we make changes. It's also how we share and receive API collections with third parties.
JackSpyder@reddit
The fact that it isn't saved locally and only lets you have cloud backups is monumentally fucking stupid.
TomWithTime@reddit
Agreed. It's nowhere near good enough to attempt this hostile change, but a lot of businesses sharing collections of API calls with each other apparently feel trapped by it. I got off Postman personally very early into using it because, at the time, it didn't support custom HTTP methods like REPORT, which I used for my "GET request with a POST body".
I still don't really use Postman at work; it's just a really inefficient FTP between us and a few third parties.
menckenjr@reddit
You're not old. Also, depending on your platform of choice there are all kinds of free HTTP request tools that don't require you to have an account at all.
_twelvemoons@reddit
plain old curl.
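Worth spelling out for anyone newer to this: the whole session needs no account or cloud. The port and path below are placeholders, and the stdlib server is only there so the request has something local to hit:

```shell
# stand up a throwaway local server (any locally running service works here)
python3 -m http.server 8123 --bind 127.0.0.1 &
SERVER_PID=$!
sleep 1

# plain old curl against localhost -- no account, no cloud, works offline
curl -s -o /dev/null -w 'status: %{http_code}\n' http://127.0.0.1:8123/

kill $SERVER_PID
```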
EveryQuantityEver@reddit
Postman used to be one of those tools, too.
menckenjr@reddit
https://httptoolkit.com although I've used CocoaRestClient and others. Since I work at a place that uses GraphQL, I'd also include Altair
hasen-judi@reddit
I'm 40
The weird things I've seen 20 something devs do these days boggle my mind
rar_m@reddit
I'm pushing 40 myself but haven't worked on a large team with younger devs in about a decade. Feels like the tech world/culture is evolving on without me.
kiselitza@reddit (OP)
Unless you stopped updating Postman a decade or so ago, you've got to have an account to use it. The account stores your collections, not on your filesystem, but in their cloud. Which had an outage a few days ago; people couldn't do anything at all.
Aside from dumb stuff like not being able to hit a localhost endpoint without internet, the data breaches and cloud dependency alone should be sufficient to steer you in the opposite direction.
rar_m@reddit
Ahh I see, yeah, close to a decade is roughly when I last used it, mid-2010s.
So the tool itself is cloud based, rip.
BrainiacV@reddit
enshittification strikes again
MechanicalHorse@reddit
Which is why I switched away from it. Fuck these stupid tools that require me to register an account just to test shit locally.
BrainiacV@reddit
tons of people over-complicating things bro, it's the new trend when people stop focusing on fundamentals haha
larso0@reddit
Sounds like the "we have to make it scalable" people. They have a hammer and everything looks like a nail.
TinderVeteran@reddit
Not defending the practice, but sometimes teams work with such an intricate network of microservices that developing locally becomes impossible beyond unit/integration testing. So the only way to see your API do actual work is by deploying it to a dev environment.
Downtown_Category163@reddit
Yes, we do that, and it is shit. Good luck having teams work on more than one feature at once without "deployment wars" over their respective branches, or just giving up and chucking everything into a shared branch.
ILikeBumblebees@reddit
Sure, but you're still using local tools to make test calls against your fully-deployed dev instance, right? Not using SaaS apps hosted on the public internet to test your internal dev environment. Right?
Randolpho@reddit
Generally I handle this by making the dev environment accessible to devs via firewall rules, so if they're working on a microservice locally, it can still address/talk to any other microservices in the dev environment without having to run them locally as well.
That said, it should always be possible (albeit with expanding difficulty as additional microservices are added) to run everything locally.
kairos@reddit
If you use k8s, have a look at telepresence
rar_m@reddit
Yea I mean I can imagine why it's done. You would hope that you could just pull a repo for whatever microservice you need, build and run it on an instance locally but I've worked at places where this is difficult for "reasons". Usually for me, it's not having production data that takes up so much time for testing.
My impression from the blog post was that they intentionally wanted to debug against the cloud, realized it was a bad idea (duh) and then decided to share their wisdom with everyone else, like we didn't know this already hehe.
CpnStumpy@reddit
So many don't know this already; there's such a huge cottage industry around devops explicitly selling products to do this as if it's even a good idea... It's baffling, but I've heard numerous engineers declare it's the only way (basically they didn't build software, they just used software, and it can't run locally because deployment is a nightmare that some DevOps guy completed once and everybody just deals with).
HoratioWobble@reddit
In my experience it's the older engineering teams that work from fixed remote infrastructure.
And honestly it's depressingly common.
DigThatData@reddit
I haven't even read the article yet and I'm reasonably confident the author is the kind of early career dev who doesn't write tests.
RiftHunter4@reddit
You'd be surprised. Or perhaps not lol.
BiteFancy9628@reddit
I use Bruno. It's free and open source and can use plain-text templates instead of binaries, so your API examples are git-versionable and .env can go in .gitignore.
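For context, Bruno stores each request as a plain-text `.bru` file in your repo; a request looks roughly like this (reconstructed from memory, so treat the exact fields as approximate), with `{{baseUrl}}` and `{{apiToken}}` resolved from an environment file you keep out of git:

```
meta {
  name: Get user
  type: http
}

get {
  url: {{baseUrl}}/users/1
}

headers {
  Authorization: Bearer {{apiToken}}
}
```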
PsychohistorySeldon@reddit
I'm confused. For starters, the article is just a self-promotional piece in the blog of a startup.
But even if we ignore that, are people relying on tools like Postman to build products? Are people not building proper clients, with tests, off of a formal spec, with CI/CD, etc? Is this a problem that only hobbyists are gonna have? I've never worked in a team that relies on an external (cloud based or local) API client to integrate with other services.
ILikeBumblebees@reddit
Am I the only one who finds it much more efficient to just use curl on the shell, and occasionally wrap it with some bash scripts?
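Same here, and the wrapper doesn't need to be much. A sketch of the pattern (the function name, `BASE_URL`, and `API_TOKEN` are my own invention, not any standard):

```shell
# api.sh -- thin curl wrapper; source it, then: api GET /users
# Assumes BASE_URL and API_TOKEN are exported (e.g. from a git-ignored .env).
api() {
  local method=$1 path=$2
  shift 2
  curl -sS -X "$method" "${BASE_URL}${path}" \
    -H "Authorization: Bearer ${API_TOKEN}" \
    -H "Content-Type: application/json" \
    "$@"
}
```

POSTs then become `api POST /users -d '{"name":"test"}'`, and the whole thing lives in version control next to the code.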
DotRevolutionary7803@reddit
Local dev environment is very difficult to beat unless a lot of engineering is put into fixing all the small annoyances of it being remote. The exception is when your laptop is lacking in RAM in which case the cloud environment will keep your system running smooth. I hated the latency of writing code on a remote machine, but in another implementation, they synced local code to the cloud environment using mutagen. This made writing code better, but resolving conflicts wasn't fun. Making it painless to connect to the cloud environment is also a challenge
anengineerandacat@reddit
Biased as all hell article; it's saying Postman is crap while pitching its own HTTP client, essentially. While I can agree that having only one offline workspace in Postman sucks, there are others like Bruno that are quite good and offline as well (and fully open source).
Best of luck out there though, more competitors is a good thing.
kiselitza@reddit (OP)
So the article makes the case for Bruno as well, and for the other offline (or no-account) folks. Being published on a tool's website and having a CTA backlink to its GH Discussions page shouldn't be so controversial imho, but I respect the way you put the remark.
Vectorial1024@reddit
Obligatory shoutout to Bruno
kiselitza@reddit (OP)
Bruno is a good contender here. I'm not crazy about the recent pricing changes; I don't believe in a pay-per-seat setup in devtools. Just don't. Voiden's approach resonates better with me. But with the clear villains in the niche, I still see the Bruno folks as the good guys.
dreadcain@reddit
Bruno is free though? What's the difference between using free Bruno vs Voiden?
savagegrif@reddit
well the difference for this guy is he’s paid to promote Voiden
dr_wtf@reddit
Well Voiden is closed-source masquerading as open source. They have a Github link but it's an empty repo used for issue tracking. No source code. Last commit was to delete an accidentally-committed MIT licence.
Big red flags for me. I'm out.
nemec@reddit
nowhere do they claim to be open source. Just having a Github presence is not pretending to be OSS. Github won the war on developer mindshare and provides a simple public issue tracker, which they are using.
dr_wtf@reddit
Prominently displaying the Github logo in the header while not actually having a page anywhere on the site with the licence or terms of use strongly implies open source.
Unintentional perhaps, but definitely misleading.
Also the fact OP is astroturfing and the majority of their posts are spamming this project all over random subreddits is another red flag. I'm still out.
nemec@reddit
I don't know where this misconception comes from. Even if there was a link to the code, it is never open source unless it's clearly licensed. You can't just assume you can freely use code (though there will rarely be consequences for doing it incorrectly)
dr_wtf@reddit
Note the word "implied".
But you're replying to the person that clicked on the link to check the licence. I am also well aware that most people don't, which is why I pointed out that it's misleading.
kiselitza@reddit (OP)
Dr is being delusional here. Masquerading where exactly? Other than CEO making a public commitment to OSS it in the coming months, what delulu sources can you provide for your “red flag” claims?! 😅
Akeshi@reddit
It's always fun when somebody behind a new application - but who isn't outright saying they're behind it - tries to shill their new (yes, new - the domain was registered less than six months ago, we don't care that you've backfilled the changelog) clone of another product, gets put into a corner, and starts insulting people.
You are terrible at communicating. If this is the real company you're pretending it is ("CEO") then hire someone to do marketing.
kiselitza@reddit (OP)
Can be* free :)
https://www.usebruno.com/pricing
dreadcain@reddit
Ok, so what's the difference between free bruno and voiden?
As far as I'm aware what you're paying for with Bruno is mainly a support contract. There are a few niche features they enable, but nothing particularly important
kiselitza@reddit (OP)
Different approaches to the same issue. Bruno is a Postman lookalike that did a great job of removing all the Postman bloat. Very similar to Yaak, a tad less so (but still a lot) to all the other alternatives in the space (about a dozen contenders, I'd say, depending on the slight differences in their focus).
Voiden is markdown, hotkeys, DRY-principle reusable building blocks; you can build any extension you need and/or install it without anyone locking you into their workflows. I could go all night debating the differences and how they affect flows, but I've got two toddlers to put to sleep.
dreadcain@reddit
How are you going to call Bruno a postman lookalike as if that was a bad thing and as if Voiden doesn't use an identical UI?
I don't think anything you listed there is unique to Voiden or missing from Bruno
Akeshi@reddit
One of them is respected, open source, and well established. The other is Voiden.
AyrA_ch@reddit
or milkman, which does other things too, like direct SQL queries
gynnihanssen@reddit
or httpie (which has a nice ui now, too)
radarsat1@reddit
I've never felt the need for these types of tools for testing HTTP endpoints. I usually just write a Python script, because I always end up needing some logic in there, and anyway I often want to build some (thin) client lib around the API too, so I just use it for testing. Am I doing it wrong?
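For what it's worth, that kind of script can stay stdlib-only. A rough sketch of the pattern (the endpoint and field names in the usage comment are invented for illustration):

```python
import json
import urllib.request

def call_api(base_url, path, payload=None):
    """Minimal JSON-over-HTTP helper: GET when payload is None, otherwise POST it as JSON."""
    url = base_url.rstrip("/") + path
    data = json.dumps(payload).encode() if payload is not None else None
    req = urllib.request.Request(
        url, data=data, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        # empty bodies decode to None rather than raising
        return resp.status, json.loads(resp.read() or "null")

# usage, assuming something is listening locally:
#   status, user = call_api("http://localhost:8000", "/users/1")
#   then add whatever ad-hoc logic/assertions you'd otherwise click through in a GUI
```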
taelor@reddit
I do the exact same thing. I never use Postman; I just start writing the code to query whatever it is, because, as you said, you are gonna use it anyway when you start building your system. I use Elixir, so like Python we have a nice REPL to mess around with.
I’ve been very adamant about doing this the last two years.
Maybe we are both wrong?
TwiliZant@reddit
Dumb question: how is it more secure if every developer has to have the secrets on their local machine?
kiselitza@reddit (OP)
Unless you’re in a really messed-up company (or a freelancer who skips the fine print of the contract)… your local machine will always be better protected than a heavily used cloud. Not because it inherently is, but because you almost have to mess up on purpose to leak from it. There are password rotations, disk encryption, password managers, a whole bunch of apps and protocols put in place to secure local hardware. Which, again, is an uninteresting target for attackers and usually only leaks through direct human error, e.g. committing a prod API key to a bloody repo or even into the docs.
TwiliZant@reddit
My experience is 99% of credential leaks are caused by devs, not by cloud providers.
This isn't even a pro-cloud argument, this is just a numbers argument. The more people have access to sensitive data, the higher the chance of a data breach becomes.
Tbh, I trust Vault or AWS Secrets Manager waaaaay more with OpSec than the 300+ devs in my org.
rar_m@reddit
You use dev/staging keys to hit dev/staging servers while you test locally.
Ideally, a dev (or anyone not responsible for deployments) shouldn't have access to production keys.
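That split is easy to enforce mechanically, too. A minimal sketch of per-environment key loading (the env-var names `APP_ENV`, `API_KEY_*`, and `CI_DEPLOY` are invented for illustration):

```python
import os

# environments whose keys are allowed on a developer machine
ALLOWED_LOCAL_ENVS = {"dev", "staging"}

def load_api_key():
    """Pick up the key for the current environment; refuse prod outside a deploy context."""
    env = os.environ.get("APP_ENV", "dev")
    if env not in ALLOWED_LOCAL_ENVS and not os.environ.get("CI_DEPLOY"):
        raise RuntimeError(f"refusing to load {env!r} credentials outside deployment")
    key = os.environ.get(f"API_KEY_{env.upper()}")
    if not key:
        raise RuntimeError(f"API_KEY_{env.upper()} not set")
    return key
```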
TwiliZant@reddit
I get that. But then I don't understand their argument about not trusting 3rd parties if the secrets themselves can be distributed on dev machines.
genitor@reddit
This whole 'article' and the person who posted it are obviously just shills, and the crappy AI-generated picture isn't helping anything. This is an ad, plain and simple.
surrendertoblizzard@reddit
How are these tools used such that they produce this issue? Are they incorporated into a CI/CD pipeline, which makes you depend on them? Does anyone have examples of what is actually meant here?
kiselitza@reddit (OP)
Which of the mentioned issues: data breaches, outages? Both happened in the past couple of months. One is a recurring concern, related not to usage but to the tool's architecture; the other was simply an incident. Outages happen for various reasons. There are cures for those, but people usually only implement them after losing a lot of money first.
surrendertoblizzard@reddit
Yes, outages and the like, but what is an online API workflow that completely prevents you from doing your development work or loses you money? I guess I can't imagine, or don't yet grasp, the idea of an 'online' API workflow?