If you're just _hacking_ a few simple calls, curl is the way to go.
But if you're working in a team, with multiple environments, with complex payloads and authentication, doing dozens of API calls every day...
Having software that can manage libraries of endpoints, parameters, simple environment switching, built-in auth, sharing between team members... is a big time saver.
I personally prefer IntelliJ's HTTP Client[0] since I always have my IDE open, the files are not obfuscated in a gibberish format and can be versioned/shared through Git.
But when I start working on an existing project, having a Postman collection to rely on is a huge time-saver, instead of having to dig through nonexistent API docs or trying to infer the API from the code itself.
Also, when you newly join a team it is more productive to start with the tooling they are already using, and move to your preferred tooling once you're familiar with the API endpoints.
You may like Hurl [1]: it's an open-source CLI based on curl (libcurl to be exact) to run and test HTTP requests defined in plain text (I'm one of the maintainers). We recently added a --curl option to go back to plain curl commands. Give it a shot!
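For a flavor of the format (the URLs, fields, and assertions here are made up for illustration), a Hurl file is just requests written in plain text, each followed by the expected response and asserts:

```
# Log in, capture the token, then use it in a second request.
POST https://example.org/api/login
{"user": "alice", "password": "secret"}
HTTP 200
[Captures]
token: jsonpath "$.token"

GET https://example.org/api/items
Authorization: Bearer {{token}}
HTTP 200
[Asserts]
jsonpath "$.items" count >= 1
```

Since it's plain text, a file like this versions and diffs cleanly in Git, which addresses the "gibberish format" complaint about some GUI collections.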
I've recently used Hurl to create a test suite for migrating a complex Nginx configuration to Caddy and it was a great choice!
I ran Caddy with the upstreams replaced by a mockbin-like echo service (I don't remember which one I used) so it would respond with information about the request made by the proxy, letting Hurl make assertions on that.
Recently tried out hurl for a project to show how abstract tests can be run in a specific environment. Great tool, it will definitely stay as part of my standard toolset.
I cannot express in words how much, as an engineer, I am sick of every app people suggest to me needing an entire quasi-virtualized OS running behind it, written in the shittiest language ever to grace our cursed machines, just to render text and perform web requests.
This is one application where I simply prefer a GUI for most use cases. All the various components of an HTTP call are better represented visually, as far as I am concerned. Often enough it is specifically the payload on both sides I am interested in, and viewing them side by side and making quick adjustments is just my personal preference.
In the year 2024, with 8 cores, 32 GB of RAM, and 4 TB of storage available, the Electron overhead generally doesn't matter that much to me.
Though this doesn't seem to be an Electron app either, but rather a PWA. Which makes me wonder how well it works with all the CORS limitations you face in the browser.
Not that I have anything against curl either; when I am working in a CLI (for example on a server) it is the perfect tool.
I use a jupyter notebook and Python with requests.
The problem with all these tools is you pretty quickly end up basically wanting full programmability...at which point, Python is just easier and more flexible.
Combine that with uv for on-demand script dependencies and it's also completely shareable and manageable.
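As a sketch of what that looks like (the endpoint, token, and payload below are hypothetical; the inline-metadata header is the PEP 723 format that `uv run` understands for on-demand dependencies):

```python
# /// script
# dependencies = ["requests"]
# ///
# The comment block above lets `uv run script.py` fetch requests on demand.
import requests

BASE = "https://api.example.com"  # hypothetical; swap per environment

session = requests.Session()
session.headers["Authorization"] = "Bearer <token>"

# Build the request and inspect it before sending -- handy in a notebook
# for checking exactly what will go over the wire.
req = requests.Request("POST", f"{BASE}/items", json={"name": "widget"})
prepared = session.prepare_request(req)
print(prepared.method, prepared.url)
print(prepared.headers["Content-Type"])
# resp = session.send(prepared)  # actually fire it when ready
```

Because it's just Python, the "pre/post response processing" that GUI tools bolt on is ordinary code here, and the script carries its own dependency list.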
Curl is nice but its syntax leaves a lot to be desired. It's also hard to distribute to others a "collection" of them, as you're going to end up with troubleshooting questions from less-technical users, especially when it comes to multi-request items that require pre/post response processing.
No, you are not the only professional among the amateurs/hobbyists.
I know that sounds offensive, but let's be real: there's nothing wrong with using AI to code or using a GUI to learn about the underlying protocol. But it is what it is.
I always use curl, jq, etc. It's so simple and straightforward. But I would be lying if I claimed it to be easy to convince my teammates to use the same toolset I use.
I don't want a 300 MB Electron client doing GET/POST work, especially one with hidden telemetry on.