Yeah, but what hiring process intends to hire average devs? Yes, most probably do end up with average devs... but most also ask and test for X years in a language, not for people who are curious, want to learn things, and want to take responsibility and own things.
It's not a penalty or a step size. It's loss as in the amount of information lost (not encoded in your network) compared to a network that perfectly encodes the ground truth.
Learning rate, as in the maximum delta you're allowed to apply to your parameters to minimise your information loss, analogous to how quickly you can possibly learn in one experiment.
I'm in two minds about this (deeper integration with a particular vendor - i.e. "serverless")
Reduced time to market is incredibly valuable. The current client base is well into the millions. The ability to test with a few and roll out to many instantly is invaluable. You no longer have to hire software developers competent in all the patterns and practices needed to build scalable code and infrastructure; you just need them to work on a particular unit or function.
The thing which scares me is that some of these companies are decades old, some hundreds of years. How long have the AWS/GCP/Azure abstractions been around? How quick are we to graveyard some of these platforms? Quite. A lot quicker than you can lift, shift and rewrite your solution elsewhere.