> there's a lot of rules that are written by exactly those people
I don't know if that is true. I do think we get rigid safety protocols or specifications that can make novel designs impossible. This happens in all bureaucracies.
There's a bit of that sometimes. (I think the general safety protocols in the submersible rating standards they ignored were kinda like that, though following them wasn't impossible, it was just "If you're doing something nonstandard, you need to test the shit out of it." That meant building multiple full-scale prototypes for long and destructive testing, which is hella expensive, but also realistically the kind of thing I'd really like to see for something I'm going to put between me and the depths of the ocean.) These rules were for the most part written by experts who knew what kinds of things can go wrong and how to build as much confidence in a new design as possible.
But on the other hand, you also get people making such rules who really don't know their stuff, and who have a reflexive tendency to add rules 'just in case', without allowing for judgement or thought in the process. Because they don't know what's going on, they often produce rulesets that are both overconstraining and insufficient, sometimes in ways that are counterproductive to their goals. Areas where I've seen this: medical device regulation (I've seen how the sausage is made, and it's shockingly easy to get absolute crap approved, while a good design can get caught up in endless box-ticking headaches), and IT security processes (endless checklists of 'best practices'; very little sensible risk assessment, threat modeling, or red-teaming; and huge incentives for people to work around the system as designed, creating more security holes). UL safety regs entered a bit of a spiral of this at one point as well: they started just playing whack-a-mole with rules and wound up with something that was really difficult to implement but didn't really improve safety. EU standards actually tend to be pretty sensible in this regard (at least outside medical areas). Company health and safety can easily fall into this trap as well, if run incompetently.
What you are describing is "bureaucratic collapse". Those extra rules are not meant to advance the ostensible goal, but to produce a better outcome for the bureaucracy itself. Bureaucracy is not in and of itself bad, it is a very powerful tool, but when it is administered by non-aligned agents and not subject to the proper feedback loops, it will always shift its goals toward protecting itself over its original mission.
Stockton Rush had the perfect storm of a personality: one that saw everything it hated in the "rules" meant to steer submersible engineering toward safe designs.
I work globally installing high voltage switchgear.
In certain countries, safety culture is exactly how parent described. Layer upon layer upon layer of red tape to do the simplest thing, making everything needlessly hard to do. Quality suffers and I suspect safety as well.
Instead of training the personnel and putting faith in them, they are treated like children.
> Instead of training the personnel and putting faith in them, they are treated like children.
Exactly this, so much time and effort gets expended on paranoid bureaucracy.
Ideally the decisions on these things should be taken by those with a good amount of practical experience, who've been held accountable in the past, and who know the kinds of risks low-skill employees bring and the pressures vested interests bring to the table. The ivory tower types don't know any of that.