[Bug 2028851] Re: Console output was empty in test_get_console_output_server_id_in_shutoff_status

 

The test test_get_console_output_server_id_in_shutoff_status has been
wrong since it was introduced: it fetched the console of the server
created in the setup method, which is in ACTIVE state. The test never
actually requested the console of the shutoff server.

It is still unclear how the test refactoring in
https://review.opendev.org/c/openstack/tempest/+/889109 caused this
long-broken test to start failing and thereby exposed the issue.

The Tempest patch corrects the test, which would otherwise always fail
because it expects console output for a *shutoff* guest as well, and
that is not something Nova returns: Nova does not provide the server
console for a shutoff server.

There are open questions about Nova's behaviour:

1. Should Nova return the console output of a shutoff guest?
2. What status code should Nova return when the console is not available? Currently, it returns 404.
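The second question can be illustrated with a small sketch. This is not Nova source code; it only models the behaviour described above (a console request for a shutoff guest, or one with no console available, currently yields 404), with the function name and parameters being hypothetical:

```python
def console_output_status(server_status: str, output_available: bool) -> int:
    """Model of the HTTP status Nova currently gives for a console request.

    Illustrative only: SHUTOFF guests and unavailable consoles both map
    to 404 today, which is part of what the open question is about.
    """
    if server_status == "SHUTOFF" or not output_available:
        return 404  # "console is not available"
    return 200
```

An alternative being debated is whether a different code (or empty output with 200) would be more appropriate for the shutoff case.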

There is some discussion of this topic in IRC:
- https://meetings.opendev.org/irclogs/%23openstack-nova/%23openstack-nova.2023-07-27.log.html#t2023-07-27T18:50:03


** Changed in: nova
       Status: Invalid => New

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/2028851

Title:
   Console output was empty in
  test_get_console_output_server_id_in_shutoff_status

Status in OpenStack Compute (nova):
  New
Status in tempest:
  Confirmed

Bug description:
  test_get_console_output_server_id_in_shutoff_status

  https://github.com/openstack/tempest/blob/04cb0adc822ffea6c7bfccce8fa08b03739894b7/tempest/api/compute/servers/test_server_actions.py#L713

  is failing consistently in the nova-lvm job starting on July 24 with
  132 failures in the last 3 days. https://tinyurl.com/kvcc9289

  
  Traceback (most recent call last):
    File "/opt/stack/tempest/tempest/api/compute/servers/test_server_actions.py", line 728, in test_get_console_output_server_id_in_shutoff_status
      self.wait_for(self._get_output)
    File "/opt/stack/tempest/tempest/api/compute/base.py", line 340, in wait_for
      condition()
    File "/opt/stack/tempest/tempest/api/compute/servers/test_server_actions.py", line 213, in _get_output
      self.assertTrue(output, "Console output was empty.")
    File "/usr/lib/python3.10/unittest/case.py", line 687, in assertTrue
      raise self.failureException(msg)
  AssertionError: '' is not true : Console output was empty.

  It's not clear why this has started failing; it may be a regression,
  or a latent race in the test that we are now hitting.

      def test_get_console_output_server_id_in_shutoff_status(self):
          """Test getting console output for a server in SHUTOFF status

          Should be able to GET the console output for a given server_id
          in SHUTOFF status.
          """

          # NOTE: SHUTOFF is irregular status. To avoid test instability,
          #       one server is created only for this test without using
          #       the server that was created in setUpClass.
          server = self.create_test_server(wait_until='ACTIVE')
          temp_server_id = server['id']

          self.client.stop_server(temp_server_id)
          waiters.wait_for_server_status(self.client, temp_server_id, 'SHUTOFF')
          self.wait_for(self._get_output)

  The test does not wait for the VM to be SSHable, so it is possible
  that we are shutting off the VM before it has fully booted and before
  any output has been written to the console.
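  One possible test-side mitigation is to poll until the guest has actually written something to its console before stopping the server. The sketch below is not the tempest fix itself, just a minimal standalone illustration of that polling pattern; `get_output` is a hypothetical callable returning the current console text:

```python
import time

def wait_for_console_output(get_output, timeout=60, interval=1):
    """Poll until the guest console has non-empty output.

    Sketch of a possible mitigation: before stopping the server, wait
    for console output so slow-booting guests (e.g. on a slow storage
    backend) have had time to produce some. Raises TimeoutError if the
    console stays empty for the whole timeout.
    """
    deadline = time.time() + timeout
    while time.time() < deadline:
        output = get_output()
        if output:
            return output
        time.sleep(interval)
    raise TimeoutError("Console output was empty within %ss" % timeout)
```

  In tempest terms this corresponds to waiting on `self._get_output` (or using a stricter `wait_until` condition for the server) before calling `stop_server`.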

  This failure has happened on multiple providers, but only in the
  nova-lvm job. The console behaviour is unrelated to the storage
  backend, but the lvm job, I believe, uses LVM on a loopback file, so
  storage performance is likely slower than raw/qcow.

  So perhaps the boot is taking longer and no output is being written.

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/2028851/+subscriptions
