
They have been cooking on it for decades too - Larrabee was 2009, after all, and Xeon Phi survived that collapse and continued for another decade.

Honestly the problem is the same today as it was then - having good hardware isn't enough. You need software penetration and organic usage/ecosystem. And the world can support maybe 2-3 of those right now - Nvidia, Apple, maybe one other. Everyone else gets stuck in the bullshit "we're the cross-platform 'adapter' ecosystem you need" position, with effectively no real corporate moat, etc.

(AMD, Intel, the door to this room is locked… one of you will get an ecosystem and the other goes home in a box… the game begins now.)

Like even just in gaming, AMD's continued refusal to do devrel has hurt them; it's cost them revenue. It's not even just "the drivers" or the constant deficit in features and functionality - getting people out into the studios to assist with development and tuning is how it has to be done. Nobody is going to tune for your hardware if you don't do it yourself.


