Any idea whether this affects (randomized) A/B testing? I think that in the past, Google has simply ignored the dynamic test changes to the site's content. Now I'm not quite sure anymore.
For duplicate elimination, it's important to have deterministic execution of JavaScript. Getting several identical pages (with different URLs) in your search results is a really bad user experience.
As of when I left Google in 2010, the JavaScript random number generator always returned 0.5 (and some SEO figured it out and blogged about it, no secrets here). However, I was trying to convince my manager to let me instead seed a random number generator with an HMAC of all of the currently loaded HTML and JavaScript, to keep execution deterministic while making it hard for a site to show something good to users one time in a million but 100% of the time to Google's indexing system.
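The idea can be sketched roughly like this (a hypothetical illustration in Python, not Google's actual implementation; the function names and key are made up). The same page content always produces the same random sequence, but without the secret key a site can't predict what the crawler's RNG will emit:

```python
import hmac
import hashlib
import random

def content_seeded_rng(page_html: str, page_js: str, secret_key: bytes) -> random.Random:
    """Derive a deterministic RNG from the currently loaded HTML and JS.

    Identical content yields an identical random sequence, so duplicate
    pages under different URLs render the same way. Because the seed is
    an HMAC under a key the site doesn't know, the site can't predict
    the sequence and serve special content only to the indexer.
    """
    mac = hmac.new(secret_key, digestmod=hashlib.sha256)
    mac.update(page_html.encode("utf-8"))
    mac.update(page_js.encode("utf-8"))
    seed = int.from_bytes(mac.digest(), "big")
    return random.Random(seed)

# Same content, same key -> same sequence; any change -> a different one.
key = b"crawler-secret-key"  # hypothetical
a = content_seeded_rng("<html>hi</html>", "var x = 1;", key)
b = content_seeded_rng("<html>hi</html>", "var x = 1;", key)
assert a.random() == b.random()
```

Returning a constant 0.5 gives determinism too, of course, but it's trivially detectable; the HMAC seed keeps the output looking random while staying reproducible per page.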