Pardon my naivete: are the benefits of GraphQL's flexibility worth the unpredictability costs of that same flexibility (or the cost of customizing it to add limits), compared to writing a tailored server-side endpoint that makes those calls and returns only the limited data needed?
Yeah, I just don't get it - even if you use GraphQL, you're going to have to do server-side code to make the calls to the different backends.
Maybe it's just that the backend devs on my last project weren't very good, but the backend GraphQL code was ridiculously complex and impossible to reason about.
It's easy to write an incomprehensible GraphQL server the first time you try it, but that's by no means an innate trait of the technology. Assuming your downstream APIs are well designed, it should be possible to write a GraphQL server that is easy to reason about and has predictable performance (largely on par with what you'd get if you built the endpoints by hand).
Yes, this is the main reason Facebook created GraphQL: it is expensive for mobile phones to fetch that much data. Another interesting point: once you start writing that tailored server-side call and optimizing it, you end up with something that looks like GraphQL anyway.
Server-to-server GraphQL seems much more reasonable as both sides are controlled. It's the client-to-server I have less of a justification for compared to targeted calls w/ individual, limited contracts that are quantifiable and optimizable. I have witnessed the costs of allowing highly flexible data fetching from the client once it grows large.
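To make the cost concrete, here's a sketch of the kind of client-driven query that grows expensive; the schema is hypothetical, and `user`, `friends`, and `name` are illustrative field names rather than anything from this thread:

```graphql
# Each nesting level can multiply the rows the backend must fetch.
query FriendsOfFriends {
  user(id: "1") {
    friends {        # ~N rows
      friends {      # ~N^2 rows
        friends {    # ~N^3 rows
          name
        }
      }
    }
  }
}
```

Nothing in the base spec stops a client from nesting one level deeper, which is exactly the "customizing to add limits" cost mentioned above.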
> seems much more reasonable as both sides are controlled
GraphQL has been around for years and people keep making this argument, but where are all the horror stories of unbounded queries being made and systems being hobbled? The argument is beginning to sound anemic.
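For what it's worth, the usual guard rail against unbounded queries is a depth (or cost) limit applied before execution. A minimal sketch, assuming the incoming query has already been parsed into nested dicts; a real server would walk the GraphQL AST via its library's validation hooks instead:

```python
# Illustrative only: selection sets are modeled as {field: sub_selections or None},
# not as a real GraphQL document. The point is the shape of the check, not the parser.

def query_depth(selection_set: dict) -> int:
    """Return the nesting depth of a selection set; leaves contribute 1 level."""
    if not selection_set:
        return 0
    return 1 + max(query_depth(sub or {}) for sub in selection_set.values())

def validate_depth(selection_set: dict, max_depth: int = 5) -> None:
    """Reject the query before execution if it nests deeper than max_depth."""
    depth = query_depth(selection_set)
    if depth > max_depth:
        raise ValueError(f"query depth {depth} exceeds limit {max_depth}")

# A friends-of-friends query nested four field levels deep.
query = {"user": {"friends": {"friends": {"name": None}}}}
assert query_depth(query) == 4
validate_depth(query, max_depth=5)  # within the limit, so no exception
```

Most GraphQL server libraries expose a validation-rule hook where this kind of check (or a weighted cost analysis) plugs in, which is presumably why the horror stories are rarer than the argument predicts.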