Hacker News

Just out of curiosity, in what sense is Codex better trained than CodeGen?


OpenAI hasn't said exactly how they trained code-davinci-002, so this is speculative, but I'm reasonably sure it was trained on more data, more languages, and for longer than CodeGen. It was also trained with fill-in-the-middle [1].

[1] https://arxiv.org/abs/2207.14255
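To make the fill-in-the-middle idea concrete, here's a minimal sketch of the FIM data transformation described in that paper: a training document is split at two random points into (prefix, middle, suffix) and re-emitted in prefix-suffix-middle order, so an ordinary left-to-right model learns to infill. The `<PRE>`/`<SUF>`/`<MID>` sentinel strings here are placeholders; the paper adds dedicated special tokens to the vocabulary rather than literal strings.

```python
import random

# Placeholder sentinels; the FIM paper uses dedicated special tokens
# appended to the tokenizer vocabulary, not raw strings like these.
PRE, SUF, MID = "<PRE>", "<SUF>", "<MID>"

def fim_transform(doc: str, rng: random.Random) -> str:
    """Split a document at two random character positions into
    (prefix, middle, suffix) and re-emit it in PSM order:
    <PRE>prefix<SUF>suffix<MID>middle.
    Training left-to-right on this string teaches the model to
    generate the middle conditioned on both prefix and suffix."""
    i, j = sorted(rng.sample(range(len(doc) + 1), 2))
    prefix, middle, suffix = doc[:i], doc[i:j], doc[j:]
    return f"{PRE}{prefix}{SUF}{suffix}{MID}{middle}"

# Example: transform one small "document"
rng = random.Random(0)
print(fim_transform("def add(a, b):\n    return a + b\n", rng))
```

At inference time the same format is used in reverse: you feed the model `<PRE>prefix<SUF>suffix<MID>` and it completes the missing middle, which is why a FIM-trained model can do editor-style infilling and not just left-to-right completion.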



