
yahoo-eng-team team mailing list archive

[Bug 1655558] [NEW] VMware: creating a VM instance from a snapshot fails

 

Public bug reported:

My environment is:
vCenter 6.0, vSAN 6.0, with Neutron backed by VMware NSX.
OpenStack was deployed with Fuel 9.0, so the OpenStack release is Mitaka.
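
For reference, a Fuel/vCenter deployment like this drives Nova through the
vmwareapi driver; the relevant nova.conf section looks roughly like the sketch
below (placeholder values, not copied from this environment):

[DEFAULT]
compute_driver = vmwareapi.VMwareVCDriver

[vmware]
host_ip = <vcenter-ip>
host_username = <vcenter-user>
host_password = <vcenter-password>
cluster_name = <vcenter-cluster>
datastore_regex = <datastore-name-or-regex>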

When I create a snapshot of a VM instance and then deploy a new VM
instance from that snapshot, I get the following error:

2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images Traceback (most recent call last):
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images   File "/usr/lib/python2.7/dist-packages/nova/virt/vmwareapi/images.py", line 186, in image_transfer
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images     data = read_handle.read(CHUNK_SIZE)
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images   File "/usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py", line 645, in read
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images     data = next(self._iter)
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images   File "/usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py", line 653, in get_next
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images     for data in self._glance_read_iter:
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images   File "/usr/lib/python2.7/dist-packages/glanceclient/common/utils.py", line 477, in __iter__
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images     for chunk in self.iterable:
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images   File "/usr/lib/python2.7/dist-packages/glanceclient/common/utils.py", line 435, in integrity_iter
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images     (md5sum, checksum))
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images IOError: [Errno 32] Corrupt image download. Checksum was 4aeaba2179866eb9d9b801d679964dfa expected f28c8836cf65c087e8b43f4c798477f6
2016-12-22 07:11:59.884 4011 ERROR nova.virt.vmwareapi.images 
<183>Dec 22 07:11:59 node-10 nova-compute: 2016-12-22 07:11:59.899 4011 DEBUG oslo_vmware.rw_handles [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Getting lease state for https://192.168.103.98/nfc/5201b6e8-5cc7-e7bf-dc30-1300dda05ae0/disk-0.vmdk. close /usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py:455
<183>Dec 22 07:11:59 node-10 nova-compute: 2016-12-22 07:11:59.900 4011 DEBUG oslo_vmware.api [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Waiting for function oslo_vmware.api._invoke_api to return. func /usr/lib/python2.7/dist-packages/oslo_vmware/api.py:122
<183>Dec 22 07:11:59 node-10 nova-compute: 2016-12-22 07:11:59.914 4011 DEBUG oslo_vmware.rw_handles [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Lease for https://192.168.103.98/nfc/5201b6e8-5cc7-e7bf-dc30-1300dda05ae0/disk-0.vmdk is in state: ready. close /usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py:464
<183>Dec 22 07:11:59 node-10 nova-compute: 2016-12-22 07:11:59.915 4011 DEBUG oslo_vmware.rw_handles [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Releasing lease for https://192.168.103.98/nfc/5201b6e8-5cc7-e7bf-dc30-1300dda05ae0/disk-0.vmdk. close /usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py:466
<183>Dec 22 07:11:59 node-10 nova-compute: 2016-12-22 07:11:59.916 4011 DEBUG oslo_vmware.api [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Waiting for function oslo_vmware.api._invoke_api to return. func /usr/lib/python2.7/dist-packages/oslo_vmware/api.py:122
<183>Dec 22 07:12:03 node-10 nova-compute: 2016-12-22 07:12:03.842 4011 DEBUG oslo_vmware.rw_handles [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Closed VMDK write handle for https://192.168.103.98/nfc/5201b6e8-5cc7-e7bf-dc30-1300dda05ae0/disk-0.vmdk. close /usr/lib/python2.7/dist-packages/oslo_vmware/rw_handles.py:481
<183>Dec 22 07:12:03 node-10 nova-compute: 2016-12-22 07:12:03.843 4011 DEBUG oslo_concurrency.lockutils [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] Releasing semaphore "[WYYVSANDATASTOR] vcenter-WyyCluster_base/121e0150-88a8-440f-9ce4-2f5a1bceae2d/121e0150-88a8-440f-9ce4-2f5a1bceae2d.vmdk" lock /usr/lib/python2.7/dist-packages/oslo_concurrency/lockutils.py:225
<179>Dec 22 07:12:03 node-10 nova-compute: 2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [req-985a3617-9d0b-4ee7-926d-49e11578a811 a0f3b1d5ef8842ac88e3ebd976685d88 9f7fc3501e654b0eb0166bf8be207044 - - -] [instance: c6cea137-8a09-4960-bc01-8612a60fe250] Instance failed to spawn
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250] Traceback (most recent call last):
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]   File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 2218, in _build_resources
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]     yield resources
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]   File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 2064, in _build_and_run_instance
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]     block_device_info=block_device_info)
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]   File "/usr/lib/python2.7/dist-packages/nova/virt/vmwareapi/driver.py", line 381, in spawn
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]     admin_password, network_info, block_device_info)
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]   File "/usr/lib/python2.7/dist-packages/nova/virt/vmwareapi/vmops.py", line 751, in spawn
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]     self._fetch_image_if_missing(context, vi)
2016-12-22 07:12:03.844 4011 ERROR nova.compute.manager [instance: c6cea137-8a09-4960-bc01-8612a60fe250]   File "/usr/lib/python2.7/dist-packages/nova/virt/vmwareapi/vmops.py", line 614, in _fetch_image_if_missing
2016-12-22 07:12:03.844 4011
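
The traceback shows where the failure actually occurs: while nova-compute
streams the snapshot image back out of Glance, glanceclient's integrity_iter
hashes every chunk it hands to the VMware write handle and raises IOError
errno 32 when the final MD5 digest does not match the checksum Glance has
recorded for the image. A minimal sketch of that kind of check (my own
illustration, not the glanceclient source):

import hashlib

def verify_stream(chunks, expected_checksum):
    """Yield chunks unchanged while accumulating an MD5 digest.

    Roughly mirrors the integrity check done during the image download:
    if the digest of everything read does not equal the checksum stored
    in Glance, the download is reported as corrupt.
    """
    md5 = hashlib.md5()
    for chunk in chunks:
        md5.update(chunk)
        yield chunk
    actual = md5.hexdigest()
    if actual != expected_checksum:
        raise IOError(32, 'Corrupt image download. Checksum was %s expected %s'
                          % (actual, expected_checksum))

So the data Glance serves for the snapshot image and the checksum it recorded
for that image disagree. To narrow it down I can download the image by hand
(glance image-download <image-id> --file /tmp/snap.vmdk), run md5sum on the
result, and compare it with the checksum field from glance image-show
<image-id>; if those already differ, the snapshot upload produced inconsistent
data rather than the download corrupting it.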

** Affects: nova
     Importance: Undecided
         Status: New

** Attachment added: "this is the full log"
   https://bugs.launchpad.net/bugs/1655558/+attachment/4802756/+files/nova-log.log

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1655558

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1655558/+subscriptions