graphite-dev team mailing list archive, Message #01315
Re: [Question #170295]: how to handle high-volume data?
Question #170295 on Graphite changed:
https://answers.launchpad.net/graphite/+question/170295
Status: Needs information => Open
Daniel Lawrence gave more information on the question:
I think the answer is going to be around caching for bulk updates, or
how I am sending the data to carbon, as I haven't come across anything
that clearly says which connection type to use.
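For reference, carbon also has a pickle listener (port 2004 by default) that accepts a whole list of datapoints in a single length-prefixed pickle payload, which sounds like it is meant for bulk updates. A rough sketch of what using it might look like, with placeholder metric names and values:

import pickle
import socket
import struct
import time

now = int(time.time())
# Placeholder datapoints: (metric path, (unix timestamp, value))
batch = [
    ('appliance.host1.cpu.iowait', (now, 80.0)),
    ('appliance.host1.disk.writes', (now, 1234)),
]

payload = pickle.dumps(batch, protocol=2)   # pickle the whole batch in one go
header = struct.pack('!L', len(payload))    # 4-byte length prefix that carbon expects

sock = socket.socket()
sock.connect(('carbon.example.com', 2004))  # 2004 is carbon-cache's default pickle port
sock.sendall(header + payload)
sock.close()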
1) The poor performance presents itself as high iowait on the system
(about 80% in some cases), which leads to a host of other performance
issues because of the I/O blocking.
2) I don't really have a performance target at the moment. If I can get
more of the appliance data into Graphite I'll be happy, say 200,000 -
250,000 datapoints over a 10-minute window (roughly 330-420 datapoints
per second).
3) I am using the bulk-update code from
https://github.com/daniellawrence/carbonclient
To summarize:
I am using a basic Python socket connection to talk to the plain-text listener on port 2003.
I cut the 10,000 datapoints into 500-line batches and send them over one socket connection:

import socket
import time

sock = socket.socket()
sock.connect(('carbon.example.com', 2003))
# batch_1 / batch_2 stand in for 500 newline-terminated "metric value timestamp" lines each
sock.sendall(batch_1)
time.sleep(0.1)
sock.sendall(batch_2)
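The batch strings above are placeholders; roughly, they could be assembled like this (the datapoints list and metric names are made up for illustration, one "metric value timestamp" line per point):

import socket
import time

now = int(time.time())
# Placeholder datapoints: (metric path, value, unix timestamp)
datapoints = [
    ('appliance.host1.cpu.iowait', 80.0, now),
    ('appliance.host1.disk.writes', 1234, now),
]

# One plain-text line per datapoint: "metric value timestamp"
lines = ['%s %s %d' % (path, value, ts) for path, value, ts in datapoints]

sock = socket.socket()
sock.connect(('carbon.example.com', 2003))
for i in range(0, len(lines), 500):
    batch = '\n'.join(lines[i:i + 500]) + '\n'
    sock.sendall(batch.encode('utf-8'))     # send one 500-line batch at a time
    time.sleep(0.1)
sock.close()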
--
You received this question notification because you are a member of
graphite-dev, which is an answer contact for Graphite.