Elasticsearch is great for storing large amounts of searchable data. I used it in a previous application where we stored millions of items a week.
For data retention I had different indexes with different TTLs, depending on the type of queries that hit them (queries that only dealt with frequent items were sent to an index with a very short TTL).
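If you don't rely on per-document TTLs, a common alternative is rolling time-based indices and deleting the old ones on a schedule. Here's a minimal sketch of the selection logic, assuming a hypothetical `events-YYYY.MM.DD` naming convention (the actual delete would be an HTTP `DELETE` against each stale index):

```python
from datetime import datetime, timedelta

def indices_to_delete(index_names, retention_days, today):
    """Return indices whose date suffix falls outside the retention window."""
    cutoff = today - timedelta(days=retention_days)
    stale = []
    for name in index_names:
        try:
            # hypothetical naming convention: events-YYYY.MM.DD
            day = datetime.strptime(name.split("-", 1)[1], "%Y.%m.%d")
        except (IndexError, ValueError):
            continue  # skip indices that don't match the pattern
        if day < cutoff:
            stale.append(name)
    return stale
```

Running this from a daily cron job keeps each index family at its own retention, which plays the same role as the different TTLs described above.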
For graphing I also used Graphite, with metrics (http://metrics.codahale.com/) for sending data from Java programs and scales (https://github.com/Cue/scales) for sending data from Python applications.
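Under the hood, libraries like scales end up speaking Graphite's plaintext protocol, which is simple enough to use directly: one `path value timestamp` line per metric, sent to the carbon listener (port 2003 by default). A minimal sketch, with the host and metric names as placeholder assumptions:

```python
import socket
import time

def graphite_line(path, value, timestamp=None):
    """Format one metric in Graphite's plaintext protocol: 'path value timestamp\n'."""
    if timestamp is None:
        timestamp = int(time.time())
    return "%s %s %d\n" % (path, value, timestamp)

def send_metric(host, port, path, value):
    # host/port are assumptions; carbon's plaintext listener defaults to 2003
    line = graphite_line(path, value)
    sock = socket.create_connection((host, port), timeout=5)
    try:
        sock.sendall(line.encode("ascii"))
    finally:
        sock.close()

# e.g. send_metric("graphite.example.com", 2003, "app.requests.count", 42)
```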
The only problem I had was tuning it for faceting, which consumed lots of RAM.
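The RAM pressure comes from the field data that faceting loads on the heap. One knob worth knowing is capping the field data cache in `elasticsearch.yml` (the exact setting name depends on your version; this is the 0.90+/1.x form):

```yaml
# cap field data so heavy faceting evicts instead of filling the heap
indices.fielddata.cache.size: 40%
```

Capping the cache trades some facet latency (re-loading evicted fields) for protection against out-of-memory errors.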