openstack team mailing list archive
Message #24158
Re: Swift performance issues with requests
Hi Klaus,
How's the disk space usage now?
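If it is easier to check all nodes in one go, here is a rough sketch that pulls per-device usage from each object-server's recon endpoint. It assumes the recon middleware is enabled on the object-servers and that they listen on the default object port 6000; the node addresses are placeholders for your storage nodes.

# Rough sketch: report per-device disk usage via the object-servers'
# recon endpoint (/recon/diskusage). Assumes the recon middleware is
# enabled and the default object-server port 6000; the addresses below
# are placeholders.
import json
import urllib.request

STORAGE_NODES = ["10.4.100.11", "10.4.100.12"]  # placeholder addresses

for node in STORAGE_NODES:
    url = f"http://{node}:6000/recon/diskusage"
    try:
        with urllib.request.urlopen(url, timeout=5) as resp:
            devices = json.load(resp)
    except OSError as exc:
        print(f"{node}: recon request failed ({exc})")
        continue
    for dev in devices:
        if not dev.get("mounted"):
            print(f"{node} {dev.get('device')}: not mounted")
            continue
        used_pct = 100.0 * dev["used"] / dev["size"]
        print(f"{node} {dev['device']}: {used_pct:.1f}% used")

This is the same data swift-recon -d reports, just without needing the tool installed on the proxy.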
Cheers
Hugo Kuo
hugo@xxxxxxxxxxxxxx
tonytkdk@xxxxxxxxx
+886 935004793
2013/5/31 Klaus Schürmann <klaus.schuermann@xxxxxxxxxxxxx>
> Hi,
>
> When I test my new Swift cluster I see strange behavior with GET and PUT
> requests. Most of the time it is really fast, but sometimes it takes a long
> time to get the data. Here is an example of the same request repeated; one
> of the runs took 17 seconds:
>
> May 31 10:33:08 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/08 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx2804381fef91455dabf6c9fd0edf4206 - 0.0546 -
> May 31 10:33:08 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/08 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx90025e3259d74b9faa8f17efaf85b104 - 0.0516 -
> May 31 10:33:08 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/08 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx942d79f78ee345138df6cd87bac0f860 - 0.0942 -
> May 31 10:33:08 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/08 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx73f053e15ed345caad38a6191fe7f196 - 0.0584 -
> May 31 10:33:08 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/08 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> txd4a3a4bf3f384936a0bc14dbffddd275 - 0.1020 -
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> txd8c6b34b8e41460bb2c5f3f4b6def0ef - 17.7330 - <<<<<<<<<<<<<<
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx21aaa822f8294d9592fe04b3de27c98e - 0.0226 -
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> txcabe6adf73f740efb2b82d479a1e6b20 - 0.0385 -
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> txc1247a1bb6c04bd3b496b3b986373170 - 0.0247 -
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> txdf295a88e513443393992f37785f8aed - 0.0144 -
> May 31 10:33:26 swift-proxy1 proxy-logging 10.4.2.99 10.4.2.99
> 31/May/2013/08/33/26 GET /v1/AUTH_provider1/129450/829188397.31 HTTP/1.0
> 200 - Wget/1.12%20%28linux-gnu%29
> provider1%2CAUTH_tke6408efec4b2439091fb6f4e75911602 - 283354 -
> tx62bb33e8c20d43b7a4c3512232de6fe4 - 0.0125 -
>
> All requests on the storage nodes take less than 0.01 sec.
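A minimal sketch for pulling the slow requests out of the proxy access log, assuming the request time in seconds is the second-to-last field of each proxy-logging line (as in the excerpt above); the log path is a placeholder:

# Minimal sketch: print proxy access-log lines whose request time exceeds
# a threshold. Assumes the duration in seconds is the second-to-last
# whitespace-separated field, as in the log excerpt above; LOG_PATH is a
# placeholder.
THRESHOLD_SECS = 1.0
LOG_PATH = "/var/log/swift/proxy-access.log"  # placeholder path

with open(LOG_PATH) as log:
    for line in log:
        fields = line.split()
        if len(fields) < 2:
            continue
        try:
            duration = float(fields[-2])
        except ValueError:
            continue  # skip lines that are not access entries
        if duration >= THRESHOLD_SECS:
            print(f"{duration:8.3f}s  {line.rstrip()}")

Run against the proxy log, it should list only the outliers, such as the 17.7 s request above, which makes it easier to see whether they cluster in time.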
>
> The test cluster contains one proxy (Dell R420, 16 GB RAM, 2 CPUs) and 5
> storage nodes (Dell R720xd, 16 GB RAM, 2 CPUs, 2 HDDs). The proxy-server
> configuration:
>
> [DEFAULT]
> log_name = proxy-server
> log_facility = LOG_LOCAL1
> log_level = INFO
> log_address = /dev/log
> bind_port = 80
> user = swift
> workers = 32
> log_statsd_host = 10.4.100.10
> log_statsd_port = 8125
> log_statsd_default_sample_rate = 1
> log_statsd_metric_prefix = Proxy01
> #set log_level = DEBUG
>
> [pipeline:main]
> pipeline = healthcheck cache proxy-logging tempauth proxy-server
>
> [app:proxy-server]
> use = egg:swift#proxy
> allow_account_management = true
> account_autocreate = true
>
> [filter:tempauth]
> use = egg:swift#tempauth
> user_provider1_xxxx = xxxx .xxx http://10.4.100.1/v1/AUTH_provider1
> log_name = tempauth
> log_facility = LOG_LOCAL2
> log_level = INFO
> log_address = /dev/log
>
> [filter:cache]
> use = egg:swift#memcache
> memcache_servers = 10.12.0.2:11211,10.12.0.3:11211
> set log_name = cache
>
> [filter:catch_errors]
> use = egg:swift#catch_errors
>
> [filter:healthcheck]
> use = egg:swift#healthcheck
>
> [filter:proxy-logging]
> use = egg:swift#proxy_logging
> access_log_name = proxy-logging
> access_log_facility = LOG_LOCAL3
> access_log_level = DEBUG
> access_log_address = /dev/log
>
>
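For reproducing this outside of wget, here is a quick sketch that fetches the same object repeatedly through the proxy and prints the time per request. The object URL is taken from the log excerpt; the token is a placeholder and has to be obtained from tempauth first (GET /auth/v1.0 with X-Auth-User and X-Auth-Key headers).

# Quick sketch: repeatedly GET the same object through the proxy and print
# the wall-clock time per request, to catch the occasional slow one.
import time
import urllib.request

OBJECT_URL = "http://10.4.100.1/v1/AUTH_provider1/129450/829188397.31"
AUTH_TOKEN = "AUTH_tk..."  # placeholder token obtained from tempauth

for i in range(50):
    req = urllib.request.Request(OBJECT_URL, headers={"X-Auth-Token": AUTH_TOKEN})
    start = time.monotonic()
    with urllib.request.urlopen(req, timeout=60) as resp:
        status = resp.status
        resp.read()
    elapsed = time.monotonic() - start
    print(f"request {i}: HTTP {status} in {elapsed:.3f} s")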
> Can someone explain this behavior?
>
> Thanks
> Klaus
>