
yahoo-eng-team team mailing list archive

[Bug 1478630] Re: nova-compute was forced down due to "[Errno 24] too many open files"


Assuming you haven't seen any more of these since changing your ceph
config, let's close this one as invalid.

** Changed in: nova
       Status: Incomplete => Invalid

-- 
You received this bug notification because you are a member of Yahoo!
Engineering Team, which is subscribed to OpenStack Compute (nova).
https://bugs.launchpad.net/bugs/1478630

Title:
  nova-compute was forced down due to "[Errno 24] too many open files"

Status in OpenStack Compute (nova):
  Invalid

Bug description:
  vi /var/log/nova-all.log
  <180>Jul 27 10:15:33 node-1 nova-compute Auditing locally available compute resources
  <179>Jul 27 10:15:33 node-1 nova-compute Error during ComputeManager.update_available_resource: [Errno 24] Too many open files
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task Traceback (most recent call last):
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/openstack/common/periodic_task.py", line 198, in run_periodic_tasks
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     task(self, context)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/compute/manager.py", line 5963, in update_available_resource
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     rt.update_available_resource(context)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/compute/resource_tracker.py", line 313, in update_available_resource
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     resources = self.driver.get_available_resource(self.nodename)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 4939, in get_available_resource
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     stats = self.get_host_stats(refresh=True)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 5809, in get_host_stats
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     return self.host_state.get_host_stats(refresh=refresh)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 6383, in get_host_stats
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     self.update_status()
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 6406, in update_status
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     disk_info_dict = self.driver._get_local_gb_info()
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/driver.py", line 4552, in _get_local_gb_info
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     info = LibvirtDriver._get_rbd_driver().get_pool_info()
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/rbd_utils.py", line 273, in get_pool_info
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     with RADOSClient(self) as client:
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/rbd_utils.py", line 86, in __init__
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     self.cluster, self.ioctx = driver._connect_to_rados(pool)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/nova/virt/libvirt/rbd_utils.py", line 108, in _connect_to_rados
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     conffile=self.ceph_conf)
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/dist-packages/rados.py", line 198, in __init__
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     librados_path = find_library('rados')
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/ctypes/util.py", line 224, in find_library
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     return _findSoname_ldconfig(name) or _get_soname(_findLib_gcc(name))
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task   File "/usr/lib/python2.7/ctypes/util.py", line 213, in _findSoname_ldconfig
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task     f = os.popen('/sbin/ldconfig -p 2>/dev/null')
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task OSError: [Errno 24] Too many open files
  2015-07-27 10:15:33.401 12422 TRACE nova.openstack.common.periodic_task
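  The traceback ends in ctypes' find_library, which shells out via
  os.popen and therefore needs a free file descriptor of its own, so the
  EMFILE surfaces there rather than at the real leak. A diagnostic
  sketch for gauging how close a process is to its limit (the PID here
  is the current shell; in practice you would substitute the
  nova-compute PID, e.g. 12422 from the log above):

```shell
# Count open file descriptors for a process via /proc and compare the
# count against the process's soft nofile limit. Uses the current shell
# ($$) as a stand-in; substitute the nova-compute PID in practice.
pid=$$
fd_count=$(ls "/proc/$pid/fd" | wc -l)
soft_limit=$(ulimit -Sn)
echo "pid $pid: $fd_count open fds (soft limit: $soft_limit)"
```

  If fd_count keeps climbing toward the soft limit over time, the
  process is leaking descriptors (here, most likely unreleased RADOS
  connections) rather than simply being under-provisioned.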

  
  Current limit setting:
  root@node-1:/tmp# ulimit -a
  core file size          (blocks, -c) 0
  data seg size           (kbytes, -d) unlimited
  scheduling priority             (-e) 0
  file size               (blocks, -f) unlimited
  pending signals                 (-i) 386140
  max locked memory       (kbytes, -l) 64
  max memory size         (kbytes, -m) unlimited
  open files                      (-n) 1024
  pipe size            (512 bytes, -p) 8
  POSIX message queues     (bytes, -q) 819200
  real-time priority              (-r) 0
  stack size              (kbytes, -s) 8192
  cpu time               (seconds, -t) unlimited
  max user processes              (-u) 386140
  virtual memory          (kbytes, -v) unlimited
  file locks                      (-x) unlimited
  root@node-1:/tmp# ulimit -n
  1024
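
  With the default soft limit of 1024 shown above, a long-running
  compute service that opens many RADOS connections can plausibly
  exhaust its descriptors. A hedged sketch of checking and raising the
  limit (the limits.conf entries and the "nova" user are illustrative
  assumptions, not a confirmed fix for this bug):

```shell
# Inspect the current soft and hard nofile limits for this shell.
soft=$(ulimit -Sn)
hard=$(ulimit -Hn)
echo "nofile soft=$soft hard=$hard"

# To raise the limit persistently for the service user, one common
# approach (assuming pam_limits is in use) is adding entries to
# /etc/security/limits.conf, for example:
#   nova  soft  nofile  4096
#   nova  hard  nofile  8192
# A running process can also raise its own soft limit at runtime, but
# only up to the hard limit.
```

  Note that raising the limit only buys headroom; if the descriptors
  are being leaked (as the ceph config change here suggests), the leak
  itself still needs fixing.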

To manage notifications about this bug go to:
https://bugs.launchpad.net/nova/+bug/1478630/+subscriptions

