That makes sense, though in an ideal world there would be an abstracted header that said "hey, I'm not gonna render JS the way a regular browser will, so send me something prerendered". Then you could write something that would actually be future-proof and work with other search engines.
The approach Google suggests there actually seems a little nefarious, as it's hard-coded to Google instead of working for any search engine.
You do realize that the link you gave is from 2006? More recent recommendations do not include that.
[EDIT] OK, as I was downvoted I will clarify my point: https://developers.google.com/webmasters/ajax-crawling/docs/... This is the recommended practice for crawling JavaScript-generated pages; there is no need to look up spiders' IP addresses, as someone mentioned.
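For anyone unfamiliar with that scheme: the crawler rewrites a `#!` (hashbang) URL into a plain URL with an `_escaped_fragment_` query parameter, and the server is expected to map it back and serve a prerendered snapshot. A minimal sketch of that reverse mapping (function name and URLs are my own illustration, not from Google's docs):

```python
from urllib.parse import urlsplit, parse_qs, unquote

def escaped_fragment_to_hashbang(url):
    """Map a crawler URL carrying Google's _escaped_fragment_
    parameter back to the original #! URL. A real server would
    serve a prerendered HTML snapshot for such requests instead."""
    parts = urlsplit(url)
    params = parse_qs(parts.query, keep_blank_values=True)
    fragment = params.get("_escaped_fragment_")
    if fragment is None:
        return url  # ordinary request, nothing to rewrite
    base = parts.scheme + "://" + parts.netloc + parts.path
    # The crawler percent-encodes the fragment, so decode it here.
    return base + "#!" + unquote(fragment[0])

print(escaped_fragment_to_hashbang(
    "http://example.com/page?_escaped_fragment_=state=1"))
```

So a request for `/page?_escaped_fragment_=state=1` corresponds to the app URL `/page#!state=1`, and the server decides from that parameter alone that a crawler is asking, with no IP lookups involved.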
http://googlewebmastercentral.blogspot.co.uk/2006/09/how-to-...