yahoo-eng-team team mailing list archive

[Bug 1816086] Re: Resource Tracker performance with Ironic driver

 

** Also affects: nova/rocky
   Importance: Undecided
       Status: New

** Also affects: nova/stein
   Importance: Undecided
       Status: New

** Changed in: nova/rocky
       Status: New => Confirmed

** Changed in: nova/stein
       Status: New => Confirmed

** Changed in: nova/rocky
   Importance: Undecided => Medium

** Changed in: nova/stein
   Importance: Undecided => Medium

** Changed in: nova/rocky
   Importance: Medium => High

** Changed in: nova/stein
   Importance: Medium => High

** Tags added: ironic performance resource-tracker

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1816086

Title:
  Resource Tracker performance with Ironic driver

Status in OpenStack Compute (nova):
  Fix Released
Status in OpenStack Compute (nova) rocky series:
  Confirmed
Status in OpenStack Compute (nova) stein series:
  Confirmed

Bug description:
  The problem is in rocky.

  The resource tracker builds the resource provider tree, and the tree is flushed to placement 2 times in "_update_available_resource":
  once from "_init_compute_node" and once in "_update_available_resource" itself.
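
  The double-flush pattern can be sketched as follows. This is a simplified
  stand-in, not actual nova code: the function names echo the ResourceTracker's,
  but the bodies just count how often the full tree would be pushed to placement.

  ```python
  # Simplified sketch of the control flow described above (hypothetical bodies).
  flush_count = 0

  def _flush_provider_tree_to_placement():
      # In the real code this syncs *every* ironic resource provider in the
      # tree, not only the node currently being processed.
      global flush_count
      flush_count += 1

  def _init_compute_node(node):
      _flush_provider_tree_to_placement()   # flush #1

  def _update_available_resource(node):
      _init_compute_node(node)
      _flush_provider_tree_to_placement()   # flush #2

  # The periodic task iterates over every ironic node it manages.
  for node in ["node-%d" % i for i in range(3)]:
      _update_available_resource(node)

  print(flush_count)  # 2 flushes per node -> 6
  ```

  With 1700 nodes, each iteration pays the full-tree sync cost twice, which is
  what makes the periodic task scale so badly.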

  The problem is that the RP tree contains all of the ironic RPs, and the
  whole tree is flushed to placement (2 times, as described above) each time
  the periodic task iterates over an ironic RP.

  In our case, with 1700 ironic nodes, the periodic task takes:
  1700 x (2 x 7s) = ~6h
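
  Spelling out the arithmetic above (the 7s per flush is the observed figure
  from this report; the other numbers follow from the double-flush behaviour):

  ```python
  # Rough cost model for one pass of the periodic task.
  nodes = 1700           # ironic nodes managed by the single nova-compute
  flushes_per_node = 2   # tree is flushed twice per _update_available_resource
  seconds_per_flush = 7  # observed time to sync the full RP tree to placement

  total_seconds = nodes * flushes_per_node * seconds_per_flush
  print(total_seconds)           # 23800
  print(total_seconds / 3600)    # ~6.6 hours
  ```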

  +++

  mitigations:
  - Shard nova-compute: run several nova-computes dedicated to ironic.
  Most current deployments use only one nova-compute to avoid resources being shuffled/recreated between nova-computes.
  Several nova-computes will be needed to accommodate the load.

  - Why do we need to flush the full resource provider tree to placement,
  rather than only the RP that is being considered?
  As a workaround we are doing this now!
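
  The idea behind that workaround can be sketched like this. The helper name
  and data shapes are hypothetical, not nova's actual API; the point is simply
  to push one provider per iteration instead of the whole tree.

  ```python
  # Hypothetical helper: sync only the provider under consideration.
  def sync_changed_provider(tree, node_uuid, push):
      """Push a single resource provider to placement instead of the
      entire tree (cost O(1) per periodic-task iteration, not O(nodes))."""
      push(tree[node_uuid])

  pushed = []
  tree = {"uuid-a": {"name": "node-a"}, "uuid-b": {"name": "node-b"}}
  sync_changed_provider(tree, "uuid-a", pushed.append)
  print(len(pushed))  # 1 -- not len(tree)
  ```

  Under this approach each periodic-task iteration does a constant amount of
  placement traffic, so the total pass time grows linearly with the node count
  instead of quadratically.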

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1816086/+subscriptions
