
graphite-dev team mailing list archive

Re: [Question #219215]: Large install recommendations

 

Question #219215 on Graphite changed:
https://answers.launchpad.net/graphite/+question/219215

Yee-Ting Li posted a new comment:
I have read that someone out there is handling a million metrics/minute
without much trouble, and I believe that was on Whisper.

Having said that, in my personal experience the limiting factor is not
so much the number of metrics as the number of queues (i.e. the number
of unique time series in your dataset). From personal experience,
version 0.9.10 has some serious issues once a single carbon-cache goes
beyond around 40k queues. The only way to solve this (without hacking
the code) is to run consistent hashing, with multiple caches on the
same physical box or distributed across many. It works well; however,
the carbon-relay can become CPU-bound. (My solution was to do the
consistent hashing at the client end, when sending the data, rather
than using a relay.)

I also tried Ceres for a while, but discovered that it simply exhausted
the number of inodes on my disk; there were some serious issues with
Ceres creating new files rather than appending to existing ones. I'm
not sure how much of this was a real bug and how much was my system
being too heavily taxed. This was with the megacarbon branch a few
months back.

-- 
You received this question notification because you are a member of
graphite-dev, which is an answer contact for Graphite.