Definitely get that. Being hammered by scrapers is a massive PITA (especially with the latest wave of aggressive AI crawlers). We focus primarily on letting people automate their existing workflows. All hosted workflows are rate-limited to prevent mass scraping or any meaningful impact on server load. In fact, because we don't load JS/HTML and hit endpoints directly, I'd guess we end up consuming fewer server resources overall.
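For what it's worth, a token bucket is one common way to implement per-workflow rate limits like this. This is just an illustrative sketch, not Kampala's actual implementation (the class name and parameters are made up):

```python
import time

class TokenBucket:
    """Token-bucket limiter: allows `rate` requests/sec with bursts up to `capacity`."""

    def __init__(self, rate: float, capacity: int):
        self.rate = rate              # tokens refilled per second
        self.capacity = capacity      # maximum burst size
        self.tokens = float(capacity)
        self.last = time.monotonic()

    def allow(self) -> bool:
        now = time.monotonic()
        # Refill proportionally to elapsed time, capped at capacity.
        self.tokens = min(self.capacity, self.tokens + (now - self.last) * self.rate)
        self.last = now
        if self.tokens >= 1:
            self.tokens -= 1
            return True
        return False

# A workflow bursting 15 requests against a 5 req/s, burst-10 limit:
bucket = TokenBucket(rate=5, capacity=10)
results = [bucket.allow() for _ in range(15)]
```

The burst is served immediately and the excess gets throttled until tokens refill, which keeps a runaway workflow from hammering the upstream site.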
The requests still route through your servers, and the data still lives with you. Kampala is a powerful tool, but I don't see people replacing the actual apps with it. Most of our customers use it to automate repetitive actions in legacy dashboards.