Yes, it's not arbitrary at all — they're only offering it on devices with at least 8GB of memory.
The iPhone 15 Pros were the first iPhones with 8GB, and all M1+ Macs/iPads have at least 8GB of RAM.
LLMs are very memory-hungry, so frankly I'm a little surprised they set the bar that low (especially since the system is running other tasks alongside ML). For comparison, Microsoft's Copilot+ program has a 16GB minimum.