I was just thinking the same thing. Usually programming is deterministic: you tell the computer exactly what to do, and it does exactly what you asked, whether that's right or wrong. These system prompts feel like we humans are trying really hard to influence how the LLM behaves, with no real way to know whether it will actually work.