
A Mixture of Experts model is still one behemoth neural network; believing otherwise is a common misconception about the term.

MoE is an attempt at sparsity, only activating a set number of neurons/weights at a time. The experts are not separate models stitched together, and it's not an ensemble. I blame the name at this point. (Rough sketch below.)
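To make that concrete, here's a minimal sketch of a sparsely-gated MoE layer (PyTorch assumed; the class name `SparseMoE` and the `top_k` parameter are just illustrative, not from any particular library). The experts and the router are all parameters of one network; the router just decides which expert weights get activated for each token.

    import torch
    import torch.nn as nn
    import torch.nn.functional as F

    class SparseMoE(nn.Module):
        def __init__(self, dim, num_experts=8, top_k=2):
            super().__init__()
            self.top_k = top_k
            # All experts live inside the same model; they are not separate models.
            self.experts = nn.ModuleList(
                [nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
                 for _ in range(num_experts)]
            )
            self.router = nn.Linear(dim, num_experts)  # learned gating network

        def forward(self, x):  # x: (tokens, dim)
            gate_logits = self.router(x)
            # Pick the top-k experts per token and renormalize their gate weights.
            weights, indices = gate_logits.topk(self.top_k, dim=-1)
            weights = F.softmax(weights, dim=-1)
            out = torch.zeros_like(x)
            # Only the selected experts' weights are activated for each token:
            # that's the sparsity, not an ensemble of independent models.
            for i, expert in enumerate(self.experts):
                token_idx, slot = (indices == i).nonzero(as_tuple=True)
                if token_idx.numel() == 0:
                    continue
                out[token_idx] += weights[token_idx, slot].unsqueeze(-1) * expert(x[token_idx])
            return out

Every token still flows through the same single forward pass; the only difference from a dense FFN is that most expert weights sit idle for any given token.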


