
yahoo-eng-team team mailing list archive

[Bug 1773102] [NEW] Abnormal request id in logs

 

Public bug reported:

Description
===========
After VM creation, the request id attached to periodic-task entries in nova-compute.log changes to the same value as the request id of the VM creation task.


Steps to reproduce
==================

* nova boot xxx

* Check nova-compute.log on the compute node hosting the VM.

* Search for the request id of the VM creation task (a small parsing sketch follows this list).
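
For the last step, a minimal helper, offered only as a sketch: the log path on the command line is an example, the request id is passed in by hand, and it assumes the line format shown in the excerpts under Logs & Configs below.

#!/usr/bin/env python
# Sketch: print periodic-task log lines that carry a given (build) request id.
# Assumes the oslo.log line format quoted in this report; the log path given
# on the command line is deployment specific.
import re
import sys

LINE_RE = re.compile(
    r'^(?P<timestamp>\S+ \S+) \d+ (?P<level>\w+) (?P<module>\S+) '
    r'\[(?P<request_id>req-[0-9a-f-]+|-) '
)

def find_shared_lines(path, build_req_id):
    """Yield periodic-task lines whose request id equals build_req_id."""
    with open(path) as log:
        for line in log:
            m = LINE_RE.match(line)
            if m is None:
                continue
            if ('periodic_task' in m.group('module')
                    and m.group('request_id') == build_req_id):
                yield line.rstrip()

if __name__ == '__main__':
    # e.g.: python find_shared_req_id.py /var/log/nova/nova-compute.log req-...
    for hit in find_shared_lines(sys.argv[1], sys.argv[2]):
        print(hit)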


Expected result
===============
The request id attached to periodic-task entries should be different from the request id of the VM creation task.

Actual result
=============
After the VM creation task has been handled, the request id attached to periodic-task entries changes to the same value as the request id of the VM creation task.
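
One way the same request id can show up on unrelated log lines, offered here as background and an assumption rather than a confirmed diagnosis: nova's request context is built on oslo.context, and constructing a RequestContext overwrites the library's stored "current" context by default, so anything that logs without an explicit context can pick up whichever context was stored last. A minimal sketch of just that store behaviour, independent of nova:

# Sketch, assuming only that oslo.context is installed. It demonstrates the
# context store; it does not prove this is what happens inside nova-compute.
from oslo_context import context

first = context.RequestContext()    # gets its own fresh req-<uuid> id
second = context.RequestContext()   # gets another fresh req-<uuid> id

# Every new context receives its own request id.
print(first.request_id != second.request_id)                   # True

# The constructor's default overwrite=True replaces the stored "current"
# context, so code that falls back to get_current() now reports the most
# recently created context's request id.
print(context.get_current().request_id == second.request_id)   # True

Whether the periodic-task loop and the build request actually share that store in this process is exactly what this bug needs to establish; the sketch only shows why a stale context would surface as a reused request id in the log.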

Environment
===========
1. OpenStack version
   OS: CentOS
   nova version: openstack-nova-compute-17.0.2-1.el7.noarch

2. hypervisor
   Libvirt + QEMU

3. storage type
   LVM

4. Which networking type did you use?
   Neutron with Linuxbridge


Logs & Configs
==============
1. Before nova-compute handles the VM creation task:
2018-05-24 03:08:15.264 27469 DEBUG oslo_service.periodic_task [req-c63d0555-7bf1-42da-abb7-556cc9eede8c 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] Running periodic task ComputeManager._reclaim_queued_deletes run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215
2018-05-24 03:08:15.265 27469 DEBUG nova.compute.manager [req-c63d0555-7bf1-42da-abb7-556cc9eede8c 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] CONF.reclaim_instance_interval <= 0, skipping... _reclaim_queued_deletes /usr/lib/python2.7/site-packages/nova/compute/manager.py:7238
2018-05-24 03:08:18.269 27469 DEBUG oslo_service.periodic_task [req-c63d0555-7bf1-42da-abb7-556cc9eede8c 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] Running periodic task ComputeManager._sync_scheduler_instance_info run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215

2. nova-compute begins handling the VM creation task:
2018-05-24 03:08:26.244 27469 DEBUG oslo_concurrency.lockutils [req-2d5b3957-9749-40ba-9b94-e8260c7145bf 9de813eb53ba4ac982a37df462783d5d 3ce4f026aed1411baa6e8013b13f9257 - default default] Lock "a0ded3b0-0e60-4d82-a516-588871c4917c" acquired by "nova.compute.manager._locked_do_build_and_run_instance" :: waited 0.000s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:273
2018-05-24 03:08:26.312 27469 DEBUG oslo_service.periodic_task [req-2d5b3957-9749-40ba-9b94-e8260c7145bf 9de813eb53ba4ac982a37df462783d5d 3ce4f026aed1411baa6e8013b13f9257 - default default] Running periodic task ComputeManager._heal_instance_info_cache run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215
2018-05-24 03:08:26.312 27469 DEBUG nova.compute.manager [req-2d5b3957-9749-40ba-9b94-e8260c7145bf 9de813eb53ba4ac982a37df462783d5d 3ce4f026aed1411baa6e8013b13f9257 - default default] Starting heal instance info cache _heal_instance_info_cache /usr/lib/python2.7/site-packages/nova/compute/manager.py:6572
2018-05-24 03:08:26.312 27469 DEBUG nova.compute.manager [req-2d5b3957-9749-40ba-9b94-e8260c7145bf 9de813eb53ba4ac982a37df462783d5d 3ce4f026aed1411baa6e8013b13f9257 - default default] Rebuilding the list of instances to heal _heal_instance_info_cache /usr/lib/python2.7/site-packages/nova/compute/manager.py:6576
2018-05-24 03:08:26.334 27469 DEBUG nova.compute.manager [req-2d5b3957-9749-40ba-9b94-e8260c7145bf 9de813eb53ba4ac982a37df462783d5d 3ce4f026aed1411baa6e8013b13f9257 - default default] Didn't find any instances for network info cache update. _heal_instance_info_cache /usr/lib/python2.7/site-packages/nova/compute/manager.py:6648


3. After the VM creation task has been handled (the request id attached to the VM creation task's own log lines has also changed):
2018-05-24 03:08:40.896 27469 INFO nova.compute.manager [req-18b1870e-239a-4adc-9962-40e63fffcda7 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] [instance: a0ded3b0-0e60-4d82-a516-588871c4917c] Took 14.34 seconds to build instance.
2018-05-24 03:08:41.278 27469 DEBUG oslo_concurrency.lockutils [req-18b1870e-239a-4adc-9962-40e63fffcda7 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] Lock "a0ded3b0-0e60-4d82-a516-588871c4917c" released by "nova.compute.manager._locked_do_build_and_run_instance" :: held 15.033s inner /usr/lib/python2.7/site-packages/oslo_concurrency/lockutils.py:285
2018-05-24 03:08:42.267 27469 DEBUG oslo_service.periodic_task [req-18b1870e-239a-4adc-9962-40e63fffcda7 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] Running periodic task ComputeManager._instance_usage_audit run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215
2018-05-24 03:09:04.264 27469 DEBUG oslo_service.periodic_task [req-18b1870e-239a-4adc-9962-40e63fffcda7 809cb6c22acc445c843db1a806d4e817 68b078adaf13420391fdb0fde1608816 - default default] Running periodic task ComputeManager.update_available_resource run_periodic_tasks /usr/lib/python2.7/site-packages/oslo_service/periodic_task.py:215
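
To compare excerpts like the three above more compactly, a companion sketch to the parser shown after the reproduce steps (same assumptions about the log format; the path is an example) that reports the time window each request id covers:

# Sketch: group nova-compute.log lines by request id and report, per id, the
# first and last timestamp and how many periodic-task lines it labels.
# Assumes the oslo.log line format quoted above; the path is an example.
import re
from collections import defaultdict

LINE_RE = re.compile(
    r'^(?P<timestamp>\S+ \S+) \d+ \w+ (?P<module>\S+) '
    r'\[(?P<request_id>req-[0-9a-f-]+|-) '
)

def summarize(path):
    spans = defaultdict(lambda: {'first': None, 'last': None, 'periodic': 0})
    with open(path) as log:
        for line in log:
            m = LINE_RE.match(line)
            if m is None:
                continue
            entry = spans[m.group('request_id')]
            entry['first'] = entry['first'] or m.group('timestamp')
            entry['last'] = m.group('timestamp')
            if 'periodic_task' in m.group('module'):
                entry['periodic'] += 1
    return spans

if __name__ == '__main__':
    for req_id, entry in sorted(summarize('/var/log/nova/nova-compute.log').items()):
        print('%s  %s .. %s  periodic-task lines: %d'
              % (req_id, entry['first'], entry['last'], entry['periodic']))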

** Affects: nova
     Importance: Undecided
         Status: New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1773102

Title:
  Abnormal request id in logs

Status in OpenStack Compute (nova):
  New


To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1773102/+subscriptions

