[Bug 1352548] [NEW] test_volume_upload times out with internal server errors from glance

Public bug reported:

Saw this on stable/icehouse; the test reports a timeout failure:

http://logs.openstack.org/03/111803/1/check/check-tempest-dsvm-postgres-full/a17f798/console.html#_2014-08-04_20_20_55_504

2014-08-04 20:20:55.504 | tempest.api.volume.test_volumes_actions.VolumesV2ActionsTestXML.test_volume_upload[gate,image]
2014-08-04 20:20:55.504 | ----------------------------------------------------------------------------------------------
2014-08-04 20:20:55.504 | 
2014-08-04 20:20:55.504 | Captured traceback:
2014-08-04 20:20:55.504 | ~~~~~~~~~~~~~~~~~~~
2014-08-04 20:20:55.504 |     Traceback (most recent call last):
2014-08-04 20:20:55.504 |       File "tempest/test.py", line 128, in wrapper
2014-08-04 20:20:55.504 |         return f(self, *func_args, **func_kwargs)
2014-08-04 20:20:55.504 |       File "tempest/api/volume/test_volumes_actions.py", line 106, in test_volume_upload
2014-08-04 20:20:55.505 |         self.image_client.wait_for_image_status(image_id, 'active')
2014-08-04 20:20:55.505 |       File "tempest/services/image/v1/json/image_client.py", line 304, in wait_for_image_status
2014-08-04 20:20:55.505 |         raise exceptions.TimeoutException(message)
2014-08-04 20:20:55.505 |     TimeoutException: Request timed out
2014-08-04 20:20:55.505 |     Details: (VolumesV2ActionsTestXML:test_volume_upload) Time Limit Exceeded! (196s)while waiting for active, but we got saving.
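
For context, tempest's wait_for_image_status here is essentially a poll-until-timeout loop. A minimal sketch of that pattern (illustrative only, not the actual tempest code; wait_for_status and get_status are made-up names):

    import time

    class TimeoutException(Exception):
        pass

    def wait_for_status(get_status, target, build_timeout=196, interval=1):
        # Poll until get_status() returns the target state or the
        # timeout budget is exhausted.
        start = time.time()
        status = get_status()
        while status != target:
            if time.time() - start > build_timeout:
                # The failure above is this branch: the image sat in
                # 'saving' and never became 'active'.
                raise TimeoutException(
                    'Time Limit Exceeded! (%ds) while waiting for %s, '
                    'but we got %s.' % (build_timeout, target, status))
            time.sleep(interval)
            status = get_status()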

There are HTTP 500 errors in the cinder volume logs.

Looks like the image id in the failure is a1159899-0f66-47e4-92fb-dd13f92db283.

You can see the copy_volume_to_image failure here:

http://logs.openstack.org/03/111803/1/check/check-tempest-dsvm-postgres-full/a17f798/logs/screen-c-vol.txt.gz?level=TRACE#_2014-08-04_20_05_11_540

2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher Traceback (most recent call last):
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 133, in _dispatch_and_reply
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     incoming.message))
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 176, in _dispatch
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     return self._do_dispatch(endpoint, method, ctxt, args)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/usr/local/lib/python2.7/dist-packages/oslo/messaging/rpc/dispatcher.py", line 122, in _do_dispatch
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     result = getattr(endpoint, method)(ctxt, **new_args)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/volume/manager.py", line 719, in copy_volume_to_image
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     payload['message'] = unicode(error)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/openstack/common/excutils.py", line 68, in __exit__
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     six.reraise(self.type_, self.value, self.tb)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/volume/manager.py", line 713, in copy_volume_to_image
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     image_meta)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/volume/drivers/lvm.py", line 276, in copy_volume_to_image
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     self.local_path(volume))
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/image/image_utils.py", line 242, in upload_volume
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     image_service.update(context, image_id, {}, image_file)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/image/glance.py", line 311, in update
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     _reraise_translated_image_exception(image_id)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/image/glance.py", line 309, in update
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     **image_meta)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/cinder/cinder/image/glance.py", line 158, in call
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     return getattr(client.images, method)(*args, **kwargs)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/python-glanceclient/glanceclient/v1/images.py", line 329, in update
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     resp, body = self.client.put(url, headers=hdrs, data=image_data)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/python-glanceclient/glanceclient/common/http.py", line 237, in put
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     return self._request('PUT', url, **kwargs)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher   File "/opt/stack/new/python-glanceclient/glanceclient/common/http.py", line 199, in _request
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher     raise exc.from_response(resp, resp.content)
2014-08-04 20:05:11.540 14177 TRACE oslo.messaging.rpc.dispatcher HTTPInternalServerError: 
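
Note the shape of this traceback: the HTTPInternalServerError at the bottom is the original failure from glance, and the manager.py frames at lines 713 and 719 show cinder catching it, building an error-notification payload, and re-raising it through excutils. Roughly like this (a simplified sketch of oslo's save-and-reraise pattern, with upload and notify as stand-in names, not the exact cinder code):

    import sys
    import six

    class save_and_reraise_exception(object):
        # Minimal stand-in for cinder's excutils helper: capture the
        # active exception on entry, re-raise it on exit.
        def __enter__(self):
            self.type_, self.value, self.tb = sys.exc_info()
            return self

        def __exit__(self, exc_type, exc_val, exc_tb):
            six.reraise(self.type_, self.value, self.tb)

    def copy_volume_to_image(upload, notify):
        try:
            upload()  # the glance PUT that returned the 500
        except Exception as error:
            with save_and_reraise_exception():
                # Corresponds to manager.py line 719 in the trace; the
                # real code uses unicode(error) since this is Python 2.
                payload = {'message': six.text_type(error)}
                notify('error', payload)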

And then there are failures in the glance-api logs:

http://logs.openstack.org/03/111803/1/check/check-tempest-dsvm-postgres-full/a17f798/logs/screen-g-api.txt.gz?level=TRACE#_2014-08-04_20_05_11_141

2014-08-04 20:05:11.140 13322 ERROR glance.store.swift [eee51f13-e043-47b7-90e3-be5fd744c911 bac82baa06084e2293fcdd7393c4cc40 94e85506bb0745e98d9eb0c484066970 - - -] Failed to add object to Swift.
Got error from Swift: put_object('glance', 'a1159899-0f66-47e4-92fb-dd13f92db283', ...) failure and no ability to reset contents for reupload.
2014-08-04 20:05:11.141 13322 ERROR glance.api.v1.upload_utils [eee51f13-e043-47b7-90e3-be5fd744c911 bac82baa06084e2293fcdd7393c4cc40 94e85506bb0745e98d9eb0c484066970 - - -] Failed to upload image a1159899-0f66-47e4-92fb-dd13f92db283
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils Traceback (most recent call last):
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils   File "/opt/stack/new/glance/glance/api/v1/upload_utils.py", line 99, in upload_data_to_store
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils     store)
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils   File "/opt/stack/new/glance/glance/store/__init__.py", line 382, in store_add_to_backend
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils     (location, size, checksum, metadata) = store.add(image_id, data, size)
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils   File "/opt/stack/new/glance/glance/store/swift.py", line 503, in add
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils     raise glance.store.BackendException(msg)
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils BackendException: Failed to add object to Swift.
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils Got error from Swift: put_object('glance', 'a1159899-0f66-47e4-92fb-dd13f92db283', ...) failure and no ability to reset contents for reupload.
2014-08-04 20:05:11.141 13322 TRACE glance.api.v1.upload_utils 
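
The "no ability to reset contents for reupload" part is the key detail: glance streams the image body through to Swift, so when put_object fails mid-upload there is no seekable buffer to rewind and retry from. Something like this (an illustrative sketch with made-up names, not glance's actual code):

    class OneShotStream(object):
        # Stand-in for the image body glance forwards to Swift: a
        # file-like object that can only be consumed once.
        def __init__(self, chunks):
            self._iter = iter(chunks)

        def read(self, size=-1):
            return next(self._iter, b'')

    def add_to_swift(put_object, data):
        try:
            put_object(data)
        except IOError as exc:
            # A retry would need data.seek(0), but the one-shot stream
            # has no seek() -- hence "no ability to reset contents for
            # reupload" -- so the error surfaces as a BackendException.
            raise RuntimeError('Failed to add object to Swift: %s' % exc)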

Looking in the swift object logs, there is a 408 (request timeout):

http://logs.openstack.org/03/111803/1/check/check-tempest-dsvm-postgres-full/a17f798/logs/screen-s-object.txt.gz

object-server: 127.0.0.1 - - [04/Aug/2014:20:04:59 +0000] "PUT /sdb1/366/AUTH_215b70ae49734fd482be91769c7fbcb0/glance/a1159899-0f66-47e4-92fb-dd13f92db283" 408 - "PUT http://127.0.0.1:8080/v1/AUTH_215b70ae49734fd482be91769c7fbcb0/glance/a1159899-0f66-47e4-92fb-dd13f92db283" "txc679bcdcf40e4fb4b1593-0053dfe65a" "proxy-server 13313" 272.9061

** Affects: cinder
     Importance: Undecided
         Status: New

** Affects: glance
     Importance: Undecided
         Status: New

** Affects: swift
     Importance: Undecided
         Status: New

** Also affects: swift
   Importance: Undecided
       Status: New

** No longer affects: cinder

** Also affects: cinder
   Importance: Undecided
       Status: New

** Also affects: glance
   Importance: Undecided
       Status: New

https://bugs.launchpad.net/bugs/1352548