yahoo-eng-team team mailing list archive

[Bug 2009701] [NEW] 202 Accepted response on a 'resize' operation that led to an InstanceFaultRollback error.

 

Public bug reported:

Description
===========
I have been having problems with the tempest test "test_list_migrations_in_flavor_resize_situation": it tries to resize a server, and the resize fails because a timeout is hit while waiting for the server to reach the "VERIFY_RESIZE" status.
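
For context, the relevant part of that test is essentially the following (paraphrased, not the exact tempest code; the helper names are tempest's):

# Paraphrase of the failing portion of the test: resize_server() issues
# the resize action and then polls the server until it reaches
# VERIFY_RESIZE, and it is this polling that eventually times out.
server = self.create_test_server(wait_until='ACTIVE')
self.resize_server(server['id'], self.flavor_ref_alt)  # <-- timeout happens here
migrations = self.client.list_migrations()['migrations']
self.assertIn(server['id'], [m['instance_uuid'] for m in migrations])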

While debugging the test, I saw that the API being called is "POST /servers/{server_id}/action" and that the payload is "{"resize": {"flavorRef": "84"}}", which matches the schema for that action. The request is answered with the following:
"{'date': 'Wed, 08 Mar 2023 11:20:53 GMT', 'server': 'Apache/2.4.52 (Ubuntu)', 'content-length': '0', 'content-type': 'application/json', 'openstack-api-version': 'compute 2.1', 'x-openstack-nova-api-version': '2.1', 'vary': 'OpenStack-API-Version,X-OpenStack-Nova-API-Version', 'x-openstack-request-id': 'req-3110f8f5-8f2c-446e-9af3-a22915e99fe9', 'x-compute-request-id': 'req-3110f8f5-8f2c-446e-9af3-a22915e99fe9', 'connection': 'close', 'status': '202', 'content-location': 'http://127.0.0.1/compute/v2.1/servers/b3069cb0-1554-4ab0-bde5-cfa64affc241/action'}", which suggests that the action was accepted successfully. However, looking at the compute node's log, I see that the following error occurred:
"
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 10710, in _error_out_instance_on_exception
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     yield
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 5980, in _resize_instance
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     disk_info = self.driver.migrate_disk_and_power_off(
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/virt/libvirt/driver.py", line 11504, in migrate_disk_and_power_off
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise exception.InstanceFaultRollback(
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server nova.exception.InstanceFaultRollback: Instance rollback performed due to: Migration pre-check error: Migration is not supported for LVM backed instances
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server 
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server 
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     res = self.dispatcher.dispatch(message)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     return self._do_dispatch(endpoint, method, ctxt, args)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     result = func(ctxt, **new_args)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 65, in wrapped
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self.force_reraise()
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise self.value
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/exception_wrapper.py", line 63, in wrapped
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     return f(self, context, *args, **kw)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 166, in decorated_function
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self.force_reraise()
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise self.value
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 157, in decorated_function
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/utils.py", line 1439, in decorated_function
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 213, in decorated_function
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self.force_reraise()
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise self.value
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 203, in decorated_function
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     return function(self, context, *args, **kwargs)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 5946, in resize_instance
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     with excutils.save_and_reraise_exception():
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 227, in __exit__
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self.force_reraise()
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/local/lib/python3.10/dist-packages/oslo_utils/excutils.py", line 200, in force_reraise
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise self.value
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 5943, in resize_instance
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self._resize_instance(context, instance, image, migration,
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 5957, in _resize_instance
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     with self._error_out_instance_on_exception(
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/usr/lib/python3.10/contextlib.py", line 153, in __exit__
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     self.gen.throw(typ, value, traceback)
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server   File "/opt/stack/nova/nova/compute/manager.py", line 10723, in _error_out_instance_on_exception
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server     raise error.inner_exception
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server nova.exception.MigrationPreCheckError: Migration pre-check error: Migration is not supported for LVM backed instances
Mar 08 11:48:25 devstack nova-compute[127002]: ERROR oslo_messaging.rpc.server 
"
Now that makes sense, as I am running a DevStack that uses LVM as its Nova image backend. Because of this, the operation is rolled back and the server never reaches the "VERIFY_RESIZE" status, which makes the test fail.
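
For reference, the pre-check that trips here is in the libvirt driver's migrate_disk_and_power_off() (the frame at nova/virt/libvirt/driver.py line 11504 in the traceback above). Roughly, and paraphrased rather than quoted, it does:

# Paraphrased from the libvirt driver (see the traceback frame above; the
# exact code on master may differ): resize/cold migration is refused up
# front for LVM-backed instances that are not booted from a volume, and the
# error is wrapped in InstanceFaultRollback so the instance is rolled back
# instead of being left in ERROR.
if CONF.libvirt.images_type == 'lvm' and not booted_from_volume:
    reason = _("Migration is not supported for LVM backed instances")
    raise exception.InstanceFaultRollback(
        exception.MigrationPreCheckError(reason=reason))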

The main problem is that this situation is answered with a 202 Accepted:
there is no way for the user to know that an error happened and that
their request was rolled back.
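
To make the problem concrete, here is a minimal client-side sketch (my own illustration, not tempest code; COMPUTE_URL, TOKEN, SERVER_ID and FLAVOR_ID are placeholders you would have to fill in): the resize action comes back as 202, and the failure can only be noticed afterwards by digging into the instance actions or by watching the status never reach VERIFY_RESIZE.

import time
import requests

# Placeholders for this sketch, not values from the bug environment.
COMPUTE_URL = "http://127.0.0.1/compute/v2.1"
TOKEN = "<keystone token>"
SERVER_ID = "<server uuid>"
FLAVOR_ID = "84"

headers = {"X-Auth-Token": TOKEN, "Content-Type": "application/json"}

# The resize action is accepted with 202 even though it will be rolled back.
resp = requests.post(f"{COMPUTE_URL}/servers/{SERVER_ID}/action",
                     json={"resize": {"flavorRef": FLAVOR_ID}},
                     headers=headers)
print(resp.status_code)  # 202

# The failure only shows up out of band, e.g. in the instance actions
# (event details may need admin rights or a recent microversion).
time.sleep(30)
actions = requests.get(f"{COMPUTE_URL}/servers/{SERVER_ID}/os-instance-actions",
                       headers=headers).json()
print(actions["instanceActions"][0])  # the 'resize' action that failed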

Steps to reproduce
==================
1.- Run "tempest.api.compute.admin.test_migrations.MigrationsAdminTest.test_list_migrations_in_flavor_resize_situation" in the environment described in this bug (an equivalent reproduction outside tempest is sketched after this list).
2.- Wait for the TimeOutError to be raised while the test waits for "VERIFY_RESIZE".
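
As referenced above, the same behaviour can be reproduced outside tempest with a few SDK calls; this is only a sketch, assuming a clouds.yaml entry called "devstack" and an existing ACTIVE server and target flavor (the names below are placeholders):

import openstack

conn = openstack.connect(cloud="devstack")

server = conn.compute.find_server("my-test-server")   # placeholder name
flavor = conn.compute.find_flavor("m1.small")         # placeholder name

# Nova answers this with 202 Accepted...
conn.compute.resize_server(server, flavor)

# ...but on an LVM backend the instance never reaches VERIFY_RESIZE,
# so this wait simply times out, mirroring the tempest failure.
conn.compute.wait_for_server(server, status="VERIFY_RESIZE", wait=300)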

Expected result
===============
An error in the 4xx range is returned.

Actual result
=============
202 Accepted is returned instead.

Environment
===========
Ubuntu 22.04
DevStack (local.conf):
 NOVA_BACKEND=LVM

Nova on master branch.

** Affects: nova
     Importance: Undecided
         Status: New
