This may or may not be the proper place for this question, but it seems the best place to ask.
Because of the limitations of GAE (specifically, no writing of new files except at deploy time), how can one structure an application so that Google and other search engines can crawl it?
The main pre-determined content is obviously not an issue; however, in most implementations of, say, a blog, the posts are stored in the datastore and served dynamically rather than as static files.
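For what it's worth, dynamically served datastore content is crawlable as long as each post gets a stable, linkable URL. A minimal sketch (names and the `/posts/<slug>` URL scheme are my own assumptions, not from any particular framework) of deriving a crawler-friendly permalink slug from a post title:

```python
import re

def slugify(title):
    """Derive a stable, crawler-friendly URL slug from a post title.

    This is an illustrative helper, not part of the GAE SDK: lowercase
    the title, collapse runs of non-alphanumeric characters into single
    hyphens, and trim any leading/trailing hyphens.
    """
    slug = title.lower()
    slug = re.sub(r"[^a-z0-9]+", "-", slug).strip("-")
    return slug

# A hypothetical route table could then map each datastore entity to
# a permanent URL like /posts/<slug>, which a crawler can follow from
# an index page or sitemap.
print(slugify("Hello, World! My First Post"))  # hello-world-my-first-post
```

Each post's entity would store its slug so the URL never changes even if the title is later edited.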
If I were writing a competitor to Blogspot (I'm not), how could I give users the opportunity to receive organic search traffic?
Has anyone tackled this issue with any success? Or does using GAE require subscribing to the "organic traffic matters less every day" school of thought?