treehuggerino@reddit
Writing good APIs is hard, especially on a small, fast-paced team; during PRs you SHOULD verify that the API endpoint docs are still correct and valid. When I was on the API team, a lot of the work was verifying that the documentation for the endpoints was simple and easy to follow. Now I'm on an integration team having to use other companies' APIs, where everything is a GET with the "body" as base64-encoded JSON in the query string.
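For the curious, that anti-pattern looks something like the following minimal Python sketch. The endpoint and field names are made up for illustration:

```python
import base64
import json
from urllib.parse import urlencode, unquote

def build_request_url(base_url: str, payload: dict) -> str:
    """Pack a JSON 'body' into a base64 query parameter, the way the
    third-party API described above requires."""
    blob = base64.urlsafe_b64encode(json.dumps(payload).encode()).decode()
    return f"{base_url}?{urlencode({'body': blob})}"

def decode_body_param(blob: str) -> dict:
    """What the server (or a confused client) has to do on the other end."""
    return json.loads(base64.urlsafe_b64decode(blob.encode()).decode())

# Hypothetical endpoint and fields, purely to show the shape of the scheme.
url = build_request_url("https://api.example.com/employees",
                        {"employee_id": 42, "fields": ["name", "team"]})
```

Note that the "body" is invisible to anyone reading the URL, uncopyable into a browser by hand, and ballooned by base64 overhead, which is why it reads as a usability smell even when it technically works.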
Urtehnoes@reddit
Uhhh I can't stand that!
Or when you literally pass in the same arguments as an example in the documentation and you get back an error.
Tf?? I was querying a single employee with a set id.
Then you find out: oh, when they say employee ID, what they really mean is that you call getAllEmployees and iterate through every element until you find the employee you need by name and star sign.
Then you take the_internal-unused_id and use THAT for the employee ID, completely ignoring the employee_id property. That's a decoy ID to sabotage pen testers.
rowantwig@reddit
The worst API I've ever seen was a Java class in which every single method took List<Object>
The worst part was that the documentation for every method was something vague like "This method usually takes these arguments and returns..." Usually? What am I supposed to do if the method does something different? Crash and give the user an equally vague "something went wrong" error message?
bwmat@reddit
Reminds me of a plugin API some people in my company came up with (for C++)
Every parameter was either a primitive, or an array of strings, and all return values were strings (C strings)
If an error occurred, they could return NULL (if they remembered to check, otherwise they'd throw an exception and probably mangle the process)
evaned@reddit
On an unrelated thought... I wonder if thedailywtf.com is still a thing?
Oh hey, it is ;-)
Snarwin@reddit
Guy who's only programmed in Lisp writing Java for the first time:
donalmacc@reddit
This is object-ively worse.
cake-day-on-feb-29@reddit
Java developers discover stringly typed APIs, yet worse somehow.
loup-vaillant@reddit
Do note that the blog post was about APIs in general, not just web APIs — even though the author himself seemed to ignore that. In that more general context, the idea of an "API team" makes me chuckle.
Of course, we do need some unified vision of where the boundaries of a program lie. That can be done by one person, or a small team, at the beginning of the project, but then this "API team" has to disband and actually program¹. Sure the APIs will still need ongoing work, but it will mostly be maintenance. That is, assuming the first version wasn’t too badly botched.
[1]: I have seen "architects" that stopped programming a while ago, and not having to suffer the consequences of their decisions directly tend to detach them from reality.
code_architect@reddit
What matters, though, is that the schema for that b64-encoded JSON is stable. Having base64-encoded JSON does not intrinsically make the API worse; not having it well spec'ed out does.
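To make that concrete, here is a minimal stdlib-only sketch of pinning the embedded JSON down with an explicit contract. The field names are invented; a real service would publish a JSON Schema rather than a Python dict:

```python
import base64
import json

# An explicit, written-down contract for the embedded JSON: field -> type.
# (Invented fields, for illustration only.)
SCHEMA = {"employee_id": int, "fields": list}

def decode_and_validate(blob: str) -> dict:
    """Decode the base64 query blob, rejecting unknown keys and wrong types,
    so the ugly transport at least carries a well-defined document."""
    doc = json.loads(base64.urlsafe_b64decode(blob).decode())
    for key, value in doc.items():
        expected = SCHEMA.get(key)
        if expected is None:
            raise ValueError(f"unknown field: {key}")
        if not isinstance(value, expected):
            raise ValueError(f"field {key} must be {expected.__name__}")
    return doc
```

With a check like this on both sides, the base64 wrapper is merely inconvenient; without it, every consumer is reverse-engineering the payload from examples.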
AyeMatey@reddit
It also matters that the API is sane and usable. And a b64 encoded JSON blob in a query string is a massive usability and maintainability stench.
treehuggerino@reddit
The 3rd-party API JSON has no schema. The swagger is sadly "/api/authorize/{Json}", and only 10% of the endpoints have a description, which is something like "{ 'prop':1, 'prop2': 'two'}". It is absurdly bad; it does not tell you what gets returned.
LOOKITSADAM@reddit
Schema is not limited to URL encoding. That {Json} can have a well-defined and enforced schema in itself.
EC36339@reddit
APIs are the one thing I'd never leave to AI sloppification.
You can replace implementations.
Interfaces are contracts you cannot break, and you shouldn't want to break them because they were poorly designed.
And if the interfaces last, then so do the tests.
phillipcarter2@reddit
An interesting thought exercise that concurs with your statement: https://aicoding.leaflet.pub/3md5ftetaes2e
Miserygut@reddit
It's not as profound as it's trying to be. "If you delete the implementation but have all the inputs necessary to recreate it, what are you losing?" Well, nothing except the time to reimplement it. Just in a lot more words.
phillipcarter2@reddit
The point is that most software today doesn’t work like that.
PrydwenParkingOnly@reddit
Why not?
Aozi@reddit
Because we don't live in a perfect world.
It would be great if everything was always perfectly tested, specified and documented. But it's not, because humans are lazy, humans have schedules to keep and humans make mistakes, and doing everything perfectly is a ton of work.
Like I've worked in codebases where someone forgot to rename a test after changing functionality, so reading that test is very confusing. I've worked on tests that are buggy and aren't actually testing what they're supposed to be testing but they pass so everyone thinks it's fine.
I've also worked with codebases where not every single thing in existence is properly tested. Or where not everything has been properly documented and written down: specifications communicated through emails, meetings, and calls, never properly put down in documentation, and only existing as a comment in the source somewhere.
Etc etc.
I could keep going, but the thing is, what we do isn't perfect and there's a ton of things not written down or put in their proper place, especially when dealing with large and long term enterprise solutions.
phillipcarter2@reddit
Have you worked on software of a size too large for a single person to hold in their head? It is entirely the norm today that teams are deathly afraid of even touching some existing code, let alone removing it.
PrydwenParkingOnly@reddit
I think it’s the newer software frameworks that enforce or help with modularizing a lot better than the older stuff.
Usually with newer software everyone knows more or less what the entire application does, and smaller groups have their focus on individual modules, where they know the ins and outs.
phillipcarter2@reddit
Yeah that’s uhhh, absolutely not true in several dimensions. The most important of which of course being that the most sophisticated and valuable systems in the world are heaps of so-called legacy code whose authors have long moved on.
Full-Spectral@reddit
He's probably a cloud guy. So many people work in cloud world these days (and have never worked anywhere else) that they have little concept of anything outside of that world.
phillipcarter2@reddit
I work in cloud too! The property holds there too. In the valuable software that runs the world, new features and legacy behaviors are intertwined and you can’t Martin Fowler your way out of it.
In my case we are regularly adding a lot of functionality to a workflow engine that runs customer-created workflows about 5 trillion times a month. It's been around for 15 years and undergone several UI overhauls and re-architectures through an endlessly rotating cast of developers. The system it integrates with generates 50 billion dollars a year and has thousands of engineers working on it. There is no such thing as any one person or team knowing how all of this actually works. We have a sophisticated sociotechnical system that sustains knowledge and value over the years.
EC36339@reddit
I don't know why I'm getting upvotes and you get downvotes for linking to a blog post that explains (almost?) the same idea better than I did, but thanks for posting it.
Key_Ferret7942@reddit
Probably because the linked post is AI slop
phillipcarter2@reddit
A big crowd of programmers here likes to stick their head in the sand about anything involving AI assisted or AI driven development. Doesn’t really matter if the topic isn’t even about AI.
EC36339@reddit
I see the connection, but this idea is independent of the entire topic of AI.
eganwall@reddit
I'm just speaking for myself, but their article's first paragraph immediately gives "copied from an LLM's output" and that's extremely off-putting
EC36339@reddit
I must have overlooked that part. Not that it matters, though, as the rest of the article is solid.
Plank_With_A_Nail_In@reddit
Just AI waffle slop.
Most APIs are very simple; it's really not this hard.
Full-Spectral@reddit
Whether or not the post is AI slop, APIs may be 'easy' to create the first time. The issue is what happens over the next 20 years when there is a lot of other code using them, but you have to move them forward to deal with new needs. If you have to live with the consequences of your API design, and don't want to end up drowning in evolutionary baggage over time, it's not so easy for more complex subsystems.
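One everyday version of "moving an API forward" while living with existing callers: new capability has to hide behind an optional input whose default preserves the old behaviour. A toy sketch (hypothetical function, invented for illustration):

```python
def get_employee(employee_id: int, include_inactive: bool = False) -> dict:
    """Hypothetical endpoint handler. The include_inactive flag was added
    years after the original release; defaulting it to the old behaviour
    keeps every existing caller working, while new callers can opt in."""
    record = {"id": employee_id, "active": employee_id % 2 == 0}
    if not record["active"] and not include_inactive:
        raise LookupError("employee not found")
    return record

# Old call sites stay untouched; new ones pass the flag explicitly.
```

Each such accretion is individually harmless; twenty years of them is the "evolutionary baggage" the comment is talking about.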
EC36339@reddit
Stay away from designing APIs.
Infamous_Guard5295@reddit
honestly the worst apis are the ones where someone thought they were being clever... like why would you base64 encode json in a query param when POST exists lol. i've seen apis where every endpoint returns 200 but the actual error is buried in the response body, makes debugging a nightmare. unpopular opinion but swagger/openapi docs are only as good as the team that maintains them, which is usually nobody after month 2
theAndrewWiggins@reddit
There's a reason for that: generally it's not safe to automatically cache POST requests. That's also why they're adding a new QUERY method to HTTP.
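A toy model of the caching argument: an HTTP cache can key a safe GET on its full URL, which is exactly what a POST body hides from intermediaries (the proposed HTTP QUERY method aims to allow a request body with safe, cacheable semantics). The fetch function below is a stand-in for a real network call:

```python
def fetch(url: str) -> str:
    """Stand-in for a real network call; counts how often it runs."""
    fetch.calls += 1
    return f"response for {url}"
fetch.calls = 0

cache: dict[str, str] = {}

def cached_get(url: str) -> str:
    """A cache can key a safe GET on its full URL, because the URL
    carries the entire request. A POST body is opaque to intermediaries,
    so the same trick doesn't apply."""
    if url not in cache:
        cache[url] = fetch(url)
    return cache[url]
```

This is the (narrow) engineering case for stuffing the query into the URL, even if base64-encoded JSON is a grim way to do it.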
zjm555@reddit
Google APIs usually get shitcanned after 6 months
alternatex0@reddit
Because their goal is to get someone promoted. Afterwards they're redundant.
Personal_Offer1551@reddit
stripe is basically the gold standard for this. their versioning is magic.
Silhouette@reddit
They used to be. Not any more sadly.
tatloani@reddit
That title reminds me of Tom Scott's video whose title, as of this time, is "This video has 75M views".
VictoryMotel@reddit
That has nothing to do with this
Total_Literature_809@reddit
I'm in the minority, but I think good code does what it was meant to do. I like clean code, but I don't make my team write clean code from the start. Life finds a way for us to understand messy code in the future
Status-Artichoke-755@reddit
Are you high? What a ridiculous thing to say
throwaway34564536@reddit
It's not that crazy. He's prioritizing the code being functional over being clean, which makes sense. If you look at the source code for Next.js, some of it looks terrifyingly bad (at least when I read it). But it does the job. Clean code may miss edge cases or be so abstracted that it's hard to reason about.
LittleLordFuckleroy1@reddit
It’s a false dichotomy.
Saint_Nitouche@reddit
It is, but it's also false to say you can have your cake and eat it too. Clean code costs resources (time, effort, validation). And sometimes the benefits of clean code just aren't worth it.
Total_Literature_809@reddit
That’s it
StardustGogeta@reddit
"You know the one thing better than clean code? Code that works."
Or so the saying goes.
Not personally familiar with the source for Next.js, but I can believe it, and I agree with your practical take on things. It's nice to have code that is both clean and functional, but sometimes it's not possible or would just take too long compared to the alternative.
Robertgdel@reddit
Tell us where you work so we know to stay away from it
Total_Literature_809@reddit
Big financial services company
Jumpy-Baseball-4983@reddit
The counterpoint is also true: bad APIs age terribly and create massive tech debt. REST APIs with inconsistent naming, undocumented side effects, and no versioning become archaeological layers that everyone is afraid to touch. The best sign of a good API is when you can read the method signatures and understand the intent without looking at the implementation — it communicates its contract clearly. The worst sign is when you have to read the source code AND find a Stack Overflow thread from 2015 just to understand what a single method actually does.
sp3ng@reddit
I think the one caveat around APIs focusing on frontend needs are Backend For Frontend APIs, where the important distinction is that the team building the frontend system also owns, builds, and operates the BFF API. In that situation a tighter coupling is ok, and probably even somewhat desirable. Though the team should still aim to make parts of the API flexible for reuse where it makes sense.
Since they own both sides though, the challenge of versioning, retiring older versions once no longer in use, and all of that is at least the responsibility of a single team and doesn't involve cross-team coordination.
LittleLordFuckleroy1@reddit
Popped this open fully expecting it to be AI slop. Pleasantly surprised to see that not to be the case. Interesting insights and well-reasoned. Nice write up.
Downtown_Mark_6390@reddit
https://apyhub.com/ - This is a pretty good collection of APIs imo.
Status-Artichoke-755@reddit
Garbage, bloated site