
The HuggingFace link is published, but not working yet: https://huggingface.co/MiniMaxAI/MiniMax-M2.1

Looks like this is 10 billion activated parameters / 230 billion in total.

So, this is the biggest open model that can be run on your own host / own hardware at a somewhat decent speed. I'm getting 16 t/s on my Intel Xeon W5-3425 / DDR5-4800 / RTX4090D-48GB.
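For context, here's a rough back-of-envelope memory estimate for a 230B-total / 10B-active MoE model at a few quantization levels. The bits-per-weight figures below are typical llama.cpp-style values, not taken from the model card, so treat the output as illustrative only:

```python
# Rough memory estimate for a 230B-parameter MoE model.
# Bits-per-weight values are illustrative (typical llama.cpp quants);
# actual GGUF sizes vary with the exact quant format.

def model_size_gb(params: float, bits_per_weight: float) -> float:
    """Approximate weight storage in GB for a given quantization."""
    return params * bits_per_weight / 8 / 1e9

TOTAL = 230e9   # total parameters
ACTIVE = 10e9   # parameters activated per token (MoE routing)

for name, bits in [("FP16", 16), ("Q8_0", 8.5), ("Q4_K_M", 4.85)]:
    print(f"{name:7s} ~{model_size_gb(TOTAL, bits):6.1f} GB weights, "
          f"~{model_size_gb(ACTIVE, bits):5.2f} GB active per token")
```

This is why it's feasible on a single GPU plus lots of system RAM: the full weights (~140 GB at ~4.85 bpw) sit mostly in DDR5, but only the ~6 GB of activated-expert weights need to be touched per token, which is what keeps the t/s tolerable.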

And looking at the benchmark scores, it's not that far from SOTA - it matches or exceeds the performance of Claude Sonnet 4.5.


