openstack team mailing list archive

Re: [Cinder] New volume status stuck at "Creating" after creation in Horizon

 

Hi Razique,

Following is the info you requested:

root@novato:~# pvdisplay

  --- Physical volume ---
  PV Name               /dev/loop2
  VG Name               cinder-volumes
  PV Size               10.00 GiB / not usable 4.00 MiB
  Allocatable           yes
  PE Size               4.00 MiB
  Total PE              2559
  Free PE               2559
  Allocated PE          0
  PV UUID               fYtaeo-MAg8-inx0-vqut-GUw6-behR-bKI3Q7

root@novato:~# vgdisplay
  --- Volume group ---
  VG Name               cinder-volumes
  System ID
  Format                lvm2
  Metadata Areas        1
  Metadata Sequence No  1
  VG Access             read/write
  VG Status             resizable
  MAX LV                0
  Cur LV                0
  Open LV               0
  Max PV                0
  Cur PV                1
  Act PV                1
  VG Size               10.00 GiB
  PE Size               4.00 MiB
  Total PE              2559
  Alloc PE / Size       0 / 0
  Free  PE / Size       2559 / 10.00 GiB
  VG UUID               kDlol2-KqAx-4E26-ebXR-4ppS-na5M-9vBeqd
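
(Side note, not part of the original output: a quick manual sanity check that the cinder-volumes VG itself accepts logical volume creation could look like the following; the LV name is illustrative.)

root@novato:~# lvcreate -L 1G -n testvol cinder-volumes
root@novato:~# lvs cinder-volumes
root@novato:~# lvremove -f cinder-volumes/testvol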

root@novato:~# cat /etc/cinder/cinder.conf
[DEFAULT]
rootwrap_config=/etc/cinder/rootwrap.conf
sql_connection = mysql://cinderUser:cinderPass@10.176.20.102/cinder
api_paste_confg = /etc/cinder/api-paste.ini
iscsi_helper=ietadm
volume_name_template = volume-%s
volume_group = cinder-volumes
verbose = True
auth_strategy = keystone
#osapi_volume_listen_port=5900
root@novato:~#
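
(In case it helps with the VolumeNotFound error below, a minimal check, with illustrative commands, that the volume row actually reached the database named in sql_connection above and that all three Cinder services are running against that same config; credentials and host are taken from the config and may need adjusting.)

root@novato:~# mysql -h 10.176.20.102 -u cinderUser -pcinderPass cinder \
    -e "SELECT id, status, host FROM volumes ORDER BY created_at DESC LIMIT 5;"
root@novato:~# grep sql_connection /etc/cinder/cinder.conf
root@novato:~# service cinder-api status
root@novato:~# service cinder-scheduler status
root@novato:~# service cinder-volume status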


Regards,
Ahmed.



On Wed, Dec 5, 2012 at 12:57 AM, Razique Mahroua
<razique.mahroua@xxxxxxxxx>wrote:

> Hi Ahmed,
> can you run
> $ pvdisplay
> and
> $ vgdisplay
>
> can we see /etc/cinder/cinder.conf ?
>
> thanks,
> Razique Mahroua - Nuage & Co
> razique.mahroua@xxxxxxxxx
> Tel : +33 9 72 37 94 15
>
>
> On 5 Dec 2012, at 09:54, Ahmed Al-Mehdi <ahmedalmehdi@xxxxxxxxx> wrote:
>
> I posted the cinder-scheduler log in my first post, but here it is again.
> It was generated right around the time I created the volume.  I am trying
> to understand the error message "VolumeNotFound: Volume
> 9dd360bf-9ef2-499f-ac6e-893abf5dc5ce could not be found".  Is this error
> message related to the volume_group "cinder-volumes" or to the new volume
> I just created?
>
>
> 2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.
> amqp [-] received {u'_context_roles': [u'Member', u'admin'],
> u'_context_request_id': u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367',
> u'_context
> _quota_class': None, u'args': {u'topic': u'cinder-volume', u'image_id':
> None, u'snapshot_id': None, u'volume_id':
> u'9dd360bf-9ef2-499f-ac6e-893abf5dc5ce'}, u'_context_auth_token':
> '<SANITIZED>', u'_co
> ntext_is_admin': False, u'_context_project_id':
> u'70e5c14a28a14666a86e85b62ca6ae18', u'_context_timestamp':
> u'2012-12-04T17:05:02.375789', u'_context_read_deleted': u'no',
> u'_context_user_id': u'386d0
> f02d6d045e7ba49d8edac7bb43f', u'method': u'create_volume',
> u'_context_remote_address': u'10.176.20.102'} _safe_log
> /usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
> 2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.amqp [-]
> unpacked context: {'user_id': u'386d0f02d6d045e7ba49d8edac7bb43f', 'roles':
> [u'Member', u'admin'], 'timestamp': u'2012-12-04T17:05:
> 02.375789', 'auth_token': '<SANITIZED>', 'remote_address':
> u'10.176.20.102', 'quota_class': None, 'is_admin': False, 'request_id':
> u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367', 'project_id': u'70e5c14a
> 28a14666a86e85b62ca6ae18', 'read_deleted': u'no'} _safe_log
> /usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
> 2012-12-04 09:05:02 23552 ERROR cinder.openstack.common.rpc.amqp [-]
> Exception during message handling
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp Traceback
> (most recent call last):
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/amqp.py",
> line 276, in _process_data
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     rval
> = self.proxy.dispatch(ctxt, version, method, **args)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/dispatcher.py",
> line 145, in dispatch
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> return getattr(proxyobj, method)(ctxt, **kwargs)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/scheduler/manager.py", line 98, in
> _schedule
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> db.volume_update(context, volume_id, {'status': 'error'})
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/db/api.py", line 256, in
> volume_update
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> return IMPL.volume_update(context, volume_id, values)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 124,
> in wrapper
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> return f(*args, **kwargs)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 1071,
> in volume_update
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> volume_ref = volume_get(context, volume_id, session=session)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 124,
> in wrapper
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> return f(*args, **kwargs)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
> "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line 1014,
> in volume_get
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp     raise
> exception.VolumeNotFound(volume_id=volume_id)
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
> VolumeNotFound: Volume 9dd360bf-9ef2-499f-ac6e-893abf5dc5ce could not be
> found.
> 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>
> Thank you,
> Ahmed.
>
>
>
> On Tue, Dec 4, 2012 at 11:10 PM, Huang Zhiteng <winston.d@xxxxxxxxx>wrote:
>
>> Can you check the cinder scheduler log?
>>
>> On Wed, Dec 5, 2012 at 1:44 AM, Ahmed Al-Mehdi <ahmedalmehdi@xxxxxxxxx>
>> wrote:
>> > Hello,
>> >
>> > I set up a two-node OpenStack deployment, one controller node and one
>> > compute node.  I am using Quantum, the Cinder services, and KVM for
>> > virtualization.  I am running into an issue creating a volume through
>> > Horizon which I will attach to a VM later on.  The status of the volume
>> > in Horizon is stuck at "Creating".  The output of "cinder list" shows
>> > nothing.
>> >
>> > The iSCSI service is set up properly, as far as I can tell.  I feel there
>> > is a communication issue between the OpenStack services.
>> >
>> > There are no log entries in cinder-volume.log.
>> >
>> > However, cinder-scheduler.log has the following entries:
>> >
>> > 2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.amqp [-]
>> > received {u'_context_roles': [u'Member', u'admin'],
>> u'_context_request_id':
>> > u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367', u'_context
>> > _quota_class': None, u'args': {u'topic': u'cinder-volume', u'image_id':
>> > None, u'snapshot_id': None, u'volume_id':
>> > u'9dd360bf-9ef2-499f-ac6e-893abf5dc5ce'}, u'_context_auth_token':
>> > '<SANITIZED>', u'_co
>> > ntext_is_admin': False, u'_context_project_id':
>> > u'70e5c14a28a14666a86e85b62ca6ae18', u'_context_timestamp':
>> > u'2012-12-04T17:05:02.375789', u'_context_read_deleted': u'no',
>> > u'_context_user_id': u'386d0
>> > f02d6d045e7ba49d8edac7bb43f', u'method': u'create_volume',
>> > u'_context_remote_address': u'10.176.20.102'} _safe_log
>> >
>> /usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
>> > 2012-12-04 09:05:02 23552 DEBUG cinder.openstack.common.rpc.amqp [-]
>> > unpacked context: {'user_id': u'386d0f02d6d045e7ba49d8edac7bb43f',
>> 'roles':
>> > [u'Member', u'admin'], 'timestamp': u'2012-12-04T17:05:
>> > 02.375789', 'auth_token': '<SANITIZED>', 'remote_address':
>> u'10.176.20.102',
>> > 'quota_class': None, 'is_admin': False, 'request_id':
>> > u'req-1b122042-c3e4-4c1e-8285-ad148c8c2367', 'project_id': u'70e5c14a
>> > 28a14666a86e85b62ca6ae18', 'read_deleted': u'no'} _safe_log
>> >
>> /usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/common.py:195
>> > 2012-12-04 09:05:02 23552 ERROR cinder.openstack.common.rpc.amqp [-]
>> > Exception during message handling
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> Traceback
>> > (most recent call last):
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/amqp.py",
>> line
>> > 276, in _process_data
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> rval =
>> > self.proxy.dispatch(ctxt, version, method, **args)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> >
>> "/usr/lib/python2.7/dist-packages/cinder/openstack/common/rpc/dispatcher.py",
>> > line 145, in dispatch
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> return
>> > getattr(proxyobj, method)(ctxt, **kwargs)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/scheduler/manager.py", line
>> 98, in
>> > _schedule
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> > db.volume_update(context, volume_id, {'status': 'error'})
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/db/api.py", line 256, in
>> > volume_update
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> return
>> > IMPL.volume_update(context, volume_id, values)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line
>> 124, in
>> > wrapper
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> return
>> > f(*args, **kwargs)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line
>> 1071,
>> > in volume_update
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> > volume_ref = volume_get(context, volume_id, session=session)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line
>> 124, in
>> > wrapper
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> return
>> > f(*args, **kwargs)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp   File
>> > "/usr/lib/python2.7/dist-packages/cinder/db/sqlalchemy/api.py", line
>> 1014,
>> > in volume_get
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> raise
>> > exception.VolumeNotFound(volume_id=volume_id)
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> > VolumeNotFound: Volume 9dd360bf-9ef2-499f-ac6e-893abf5dc5ce could not be
>> > found.
>> > 2012-12-04 09:05:02 23552 TRACE cinder.openstack.common.rpc.amqp
>> >
>> >
>> > Has anyone run into this issue?  Can I issue some cinder-* CLI command
>> > to get more info about the issue?
>> > Any help would be much appreciated.
>> >
>> > Thank you,
>> > Ahmed.
>> >
>> >
>> >
>>
>>
>>
>> --
>> Regards
>> Huang Zhiteng
>>
>
> _______________________________________________
> Mailing list: https://launchpad.net/~openstack
> Post to     : openstack@xxxxxxxxxxxxxxxxxxx
> Unsubscribe : https://launchpad.net/~openstack
> More help   : https://help.launchpad.net/ListHelp
>
>
>
