whimsicalism on May 13, 2024 | on: GPT-4o
They have not. You probably read "MoE" and some pop article about what that means without having any clue.
matsemann on May 13, 2024
If you know better, it would be nice of you to provide the correct information rather than just refute things.
whimsicalism on May 13, 2024
GPT-4 is a sparse MoE model with ~1.2T params. This is all public knowledge and immediately precludes the two previous commenters' assertions.
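For readers unfamiliar with the term: "sparse MoE" means only a few of the model's experts run per token, so total parameter count (e.g. a rumored ~1.2T) far exceeds the parameters active on any single forward pass. Below is a minimal toy sketch of top-k MoE routing; the expert count, dimensions, and all names are illustrative assumptions, not GPT-4's actual configuration.

```python
import numpy as np

# Toy sparse MoE: 8 experts, top-2 routing (illustrative values only).
rng = np.random.default_rng(0)
n_experts, d_model, d_hidden, top_k = 8, 16, 32, 2

# Each "expert" is an independent 2-layer MLP.
W1 = rng.standard_normal((n_experts, d_model, d_hidden)) * 0.1
W2 = rng.standard_normal((n_experts, d_hidden, d_model)) * 0.1

# Router: a linear layer scoring every expert for each token.
W_router = rng.standard_normal((d_model, n_experts)) * 0.1

def moe_forward(x):
    """Route each token to its top-k experts; only those experts run."""
    logits = x @ W_router                              # (n_tokens, n_experts)
    top = np.argsort(logits, axis=-1)[:, -top_k:]      # top-k expert indices
    sel = np.take_along_axis(logits, top, axis=-1)
    gates = np.exp(sel - sel.max(-1, keepdims=True))   # softmax over the
    gates /= gates.sum(-1, keepdims=True)              # selected logits only
    out = np.zeros_like(x)
    for i, tok in enumerate(x):
        for k in range(top_k):
            e = top[i, k]
            h = np.maximum(tok @ W1[e], 0.0)           # expert MLP, ReLU
            out[i] += gates[i, k] * (h @ W2[e])
    return out

tokens = rng.standard_normal((4, d_model))
y = moe_forward(tokens)
print(y.shape)  # (4, 16)
```

Only `top_k` of the `n_experts` MLPs execute per token, which is exactly why a sparse model's total parameter count can be far larger than its per-token compute would suggest.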