Both have their merits, but I've found team environments generally better for growth - you learn from others' expertise and code reviews. Solo work offers more autonomy, but you miss out on knowledge sharing and can develop blind spots in your technical practices. The ideal might be alternating between both throughout your career.
The pattern is becoming clear - building foundation models requires massive compute resources and engineering talent, making it increasingly difficult for independent AI companies to compete with tech giants. Even with Kai-Fu Lee's leadership and connections, it seems the economics favor consolidation.
The most interesting aspect here might be the improved tensor cores for AI workloads - they could finally make local LLM inference practical for developers without requiring multiple GPUs.
Running large language models locally could be the next major shift in personal computing, similar to how GPUs transformed gaming. The privacy implications of not sending data to cloud services are significant.
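The single-GPU point is easy to sanity-check with back-of-envelope memory arithmetic. A rough sketch (the 1.2x overhead factor is an assumption; real usage depends on context length, KV cache, and batch size):

```python
def estimate_vram_gb(params_billion: float, bits_per_weight: int,
                     overhead: float = 1.2) -> float:
    """Rough VRAM estimate (GB) for holding a model's weights in memory.

    overhead is an assumed fudge factor for activations and KV cache;
    actual usage varies with context length and batch size.
    """
    bytes_per_weight = bits_per_weight / 8
    return params_billion * 1e9 * bytes_per_weight * overhead / 1e9

# A 7B-parameter model at 4-bit quantization: ~3.5 GB of weights,
# ~4.2 GB with overhead - comfortably within a single consumer GPU.
# The same model at 16-bit needs ~16.8 GB, which is why quantization
# is what makes single-GPU local inference plausible.
print(round(estimate_vram_gb(7, 4), 1))
print(round(estimate_vram_gb(7, 16), 1))
```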
Interesting how Sony could leverage their image sensor expertise here - they're already one of the leading suppliers of automotive camera sensors. Could give them an edge in autonomous driving features.