I wonder if that was a case of being literal: as in, technically speaking, GPT-5 isn't being _trained_ yet, while omitting that they are working on collecting and preparing the data.
I previously did this. I wrote a naive integer factorization program in C (roughly along the lines of the sketch below), compiled it, extracted the disassembly, and intentionally broke it.
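
For context, this is not my exact original source, just a minimal sketch of what I mean by a naive trial-division factorizer; the disassembly then comes from compiling it and running something like objdump -d on the binary:

    /* Minimal sketch of a naive trial-division factorizer (assumed, not the
       original source). Compile with e.g. `gcc -O0 factor.c -o factor` and
       disassemble with `objdump -d factor`. */
    #include <stdio.h>

    /* Print the prime factors of n by trial division. */
    void factor(unsigned long long n)
    {
        for (unsigned long long d = 2; d * d <= n; d++) {
            while (n % d == 0) {        /* divide out each factor fully */
                printf("%llu ", d);
                n /= d;
            }
        }
        if (n > 1)
            printf("%llu", n);          /* remaining prime factor, if any */
        printf("\n");
    }

    int main(void)
    {
        factor(600851475143ULL);        /* example input */
        return 0;
    }
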
It generated a working C function that was almost correct given the broken assembly. I then "talked" with it to improve the code and even suggested that the original disassembly contained an error. It was surprisingly good.
Note: I broke the disassembly intentionally because when I presented the original, unmodified disassembly, it immediately output the (or at least a) C program to factorize integers.