maas-builds team mailing list archive
Message #00546
Jenkins Failure - precise-adt-maas-daily 128
See http://d-jenkins.ubuntu-ci:8080/job/precise-adt-maas-daily/128/
[...truncated 3548 lines...]
Setting up maas-cluster-controller (1.4+bzr1701+dfsg-0+1718+212~ppa0~ubuntu12.04.1) ...
Enabling module version.
To activate the new configuration, you need to run:
service apache2 restart
 * Restarting web server apache2 ... waiting [ OK ]
maas-pserv start/running, process 11836
maas-cluster-celery start/running, process 11867
Setting up python-txlongpoll (0.3.1+bzr86-0ubuntu2~ctools0) ...
Setting up python-django-maas (1.4+bzr1701+dfsg-0+1718+212~ppa0~ubuntu12.04.1) ...
Setting up maas-region-controller (1.4+bzr1701+dfsg-0+1718+212~ppa0~ubuntu12.04.1) ...
Considering dependency proxy for proxy_http:
Enabling module proxy.
Enabling module proxy_http.
To activate the new configuration, you need to run:
service apache2 restart
Enabling module expires.
To activate the new configuration, you need to run:
service apache2 restart
Module wsgi already enabled
rsyslog stop/waiting
rsyslog start/running, process 12015
squid-deb-proxy stop/waiting
squid-deb-proxy start/running, process 12064
Restarting rabbitmq-server: SUCCESS
rabbitmq-server.
Creating user "maas_longpoll" ...
...done.
Creating vhost "/maas_longpoll" ...
...done.
Setting permissions for user "maas_longpoll" in vhost "/maas_longpoll" ...
...done.
Creating user "maas_workers" ...
...done.
Creating vhost "/maas_workers" ...
...done.
Setting permissions for user "maas_workers" in vhost "/maas_workers" ...
...done.
 * Restarting PostgreSQL 9.1 database server [ OK ]
dbconfig-common: writing config to /etc/dbconfig-common/maas-region-controller.conf
Creating config file /etc/dbconfig-common/maas-region-controller.conf with new version
creating postgres user maas: success.
verifying creation of user: success.
creating database maasdb: success.
verifying database maasdb exists: success.
dbconfig-common: flushing administrative password
Syncing...
Creating tables ...
Creating table auth_permission
Creating table auth_group_permissions
Creating table auth_group
Creating table auth_user_groups
Creating table auth_user_user_permissions
Creating table auth_user
Creating table django_content_type
Creating table django_session
Creating table django_site
Creating table piston_nonce
Creating table piston_consumer
Creating table piston_token
Creating table south_migrationhistory
Installing custom SQL ...
Installing indexes ...
Installed 0 object(s) from 0 fixture(s)
Synced:
> django.contrib.auth
> django.contrib.contenttypes
> django.contrib.sessions
> django.contrib.sites
> django.contrib.messages
> django.contrib.staticfiles
> piston
> south
Not synced (use migrations):
- maasserver
- metadataserver
(use ./manage.py migrate to migrate these)
Running migrations for maasserver:
- Migrating forwards to 0058_add_agent_name_to_node.
> maasserver:0001_initial
> maasserver:0002_add_token_to_node
> maasserver:0002_macaddress_unique
> maasserver:0003_rename_sshkeys
> maasserver:0004_add_node_error
> maasserver:0005_sshkey_user_and_key_unique_together
> maasserver:0006_increase_filestorage_filename_length
> maasserver:0007_common_info_created_add_time
> maasserver:0008_node_power_address
> maasserver:0009_add_nodegroup
> maasserver:0010_add_node_netboot
> maasserver:0011_add_dns_zone_serial_sequence
> maasserver:0012_DHCPLease
> maasserver:0013_connect_node_node_group
> maasserver:0014_nodegroup_dhcp_settings_are_optional
> maasserver:0016_node_nodegroup_not_null
> maasserver:0017_add_dhcp_key_to_nodegroup
> maasserver:0018_activate_worker_user
> maasserver:0019_add_nodegroup_dhcp_interface
> maasserver:0020_nodegroup_dhcp_interfaces_is_plural
> maasserver:0021_add_uuid_to_nodegroup
> maasserver:0022_add_status_to_nodegroup
> maasserver:0023_add_bootimage_model
> maasserver:0024_add_nodegroupinterface
> maasserver:0025_remove_unused_fields_in_nodegroup
> maasserver:0026_add_node_distro_series
> maasserver:0027_add_tag_table
> maasserver:0028_add_node_hardware_details
> maasserver:0029_zone_sharing
> maasserver:0030_ip_address_to_generic_ip_address
> maasserver:0031_node_architecture_field_size
> maasserver:0032_node_subarch
> maasserver:0033_component_error
> maasserver:0034_timestamp_component_error
> maasserver:0035_add_nodegroup_cluster_name
> maasserver:0036_populate_nodegroup_cluster_name
> maasserver:0037_nodegroup_cluster_name_unique
> maasserver:0038_nodegroupinterface_ip_range_fix
> maasserver:0039_add_filestorage_content
> maasserver:0039_add_nodegroup_to_bootimage
> maasserver:0040_make_filestorage_data_not_null
> maasserver:0041_remove_filestorage_data
> maasserver:0042_fix_039_conflict
> maasserver:0043_unique_hostname_preparation
> maasserver:0044_node_hostname_unique
> maasserver:0045_add_tag_kernel_opts
> maasserver:0046_add_nodegroup_maas_url
> maasserver:0047_add_owner_to_filestorage
> maasserver:0048_add_key_to_filestorage
> maasserver:0049_filestorage_key_unique
> maasserver:0050_shared_to_per_tenant_storage
> maasserver:0051_bigger_distro_series_name
> maasserver:0052_add_node_storage
> maasserver:0053_node_routers
> maasserver:0054_download_progress
> maasserver:0055_nullable_bytes_downloaded
> maasserver:0056_netboot_off_for_allocated_nodes
> maasserver:0057_remove_hardware_details
> maasserver:0058_add_agent_name_to_node
- Loading initial data for maasserver.
Installed 0 object(s) from 0 fixture(s)
Running migrations for metadataserver:
- Migrating forwards to 0014_commission_result_rename_data_bin_col.
> metadataserver:0001_initial
> metadataserver:0002_add_nodecommissionresult
> metadataserver:0003_populate_hardware_details
> metadataserver:0004_add_commissioningscript
> metadataserver:0005_nodecommissionresult_add_timestamp
> metadataserver:0006_nodecommissionresult_add_status
> metadataserver:0007_nodecommissionresult_change_name_size
> metadataserver:0008_rename_lshw_commissioning_output
> metadataserver:0009_delete_status
> metadataserver:0010_add_script_result
> metadataserver:0011_commission_result_binary_data_col
> metadataserver:0012_commission_result_binary_data_recode
> metadataserver:0013_commission_result_drop_old_data_col
> metadataserver:0014_commission_result_rename_data_bin_col
- Loading initial data for metadataserver.
Installed 1 object(s) from 1 fixture(s)
 * Restarting web server apache2 ... waiting [ OK ]
squid-deb-proxy stop/waiting
squid-deb-proxy start/running, process 13151
maas-txlongpoll start/running, process 13217
maas-region-celery start/running, process 13278
Setting up maas (1.4+bzr1701+dfsg-0+1718+212~ppa0~ubuntu12.04.1) ...
Setting up maas-dns (1.4+bzr1701+dfsg-0+1718+212~ppa0~ubuntu12.04.1) ...
 * Stopping domain name service... bind9 waiting for pid 9348 to die [ OK ]
 * Starting domain name service... bind9 [ OK ]
Processing triggers for libc-bin ...
ldconfig deferred processing now taking place
-> Finished parsing the build-deps
adt-run: trace: & ubtree0t-maas-package-test: [----------------------------------------
Build timed out (after 90 minutes). Marking the build as failed.
qemu: terminating on signal 15 from pid 3194
Connection to localhost closed by remote host.
Connection to localhost closed.
+ RET=255
+ [ 0 -eq 1 ]
+ [ 255 -gt 0 ]
+ log_failure_msg adt-run exited with status 255.
+ log_msg Failure: adt-run exited with status 255.\n
+ date +%F %X
+ printf 2014-04-01 21:14:40: Failure: adt-run exited with status 255.\n
2014-04-01 21:14:40: Failure: adt-run exited with status 255.
+ [ 0 -eq 0 ]
+ mkdir -p /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results
+ ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o CheckHostIP=no -i /var/tmp/adt/disks/adtkey -p 54323 -tt -o BatchMode=yes -l ubuntu localhost sudo chown -R ubuntu /root/adt-log; find /root/adt-log -type f -empty | xargs rm 2>/dev/null
ssh: connect to host localhost port 54323: Connection refused
+ true
+ scp -r -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o CheckHostIP=no -i /var/tmp/adt/disks/adtkey -P 54323 ubuntu@localhost:/root/adt-log/* /var/crash/*crash /var/log/syslog /var/tmp/testresults /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results
+ true
+ log_info_msg Test artifacts copied to /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results
+ log_msg Info: Test artifacts copied to /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results\n
+ date +%F %X
+ printf 2014-04-01 21:14:41: Info: Test artifacts copied to /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results\n
2014-04-01 21:14:41: Info: Test artifacts copied to /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results
+ [ -f /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results/summary.log ]
+ [ -n 2014-04-01_23-44-40 ]
+ ls -tr /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results/*.result
+ tail -1
+ resfile=
+ [ -n ]
+ log_failure_msg Test didn't end normally. Generating error file
+ log_msg Failure: Test didn't end normally. Generating error file\n
+ date +%F %X
+ printf 2014-04-01 21:14:41: Failure: Test didn't end normally. Generating error file\n
2014-04-01 21:14:41: Failure: Test didn't end normally. Generating error file
+ date +%Y%m%d-%H%M%S
+ errfile=/home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results/precise_amd64_maas_20140401-211441.error
+ echo precise amd64 maas
+ rsync -a /home/ubuntu/jenkins-jobs/workspace/precise-adt-maas-daily/results/precise_amd64_maas_20140401-211441.error /precise/tmp/
rsync: mkdir "/precise/tmp" failed: No such file or directory (2)
rsync error: error in file IO (code 11) at main.c(674) [Receiver=3.1.0]
+ true
+ [ 0 -eq 0 ]
+ ssh -o UserKnownHostsFile=/dev/null -o StrictHostKeyChecking=no -o CheckHostIP=no -i /var/tmp/adt/disks/adtkey -p 54323 -tt -o BatchMode=yes -l ubuntu localhost sudo poweroff
ssh: connect to host localhost port 54323: Connection refused
+ on_exit
+ log_info_msg Cleaning up
+ log_msg Info: Cleaning up\n
+ date +%F %X
+ printf 2014-04-01 21:14:41: Info: Cleaning up\n
2014-04-01 21:14:41: Info: Cleaning up
+ [ -f /var/tmp/adt/disks/run/precise-amd64-maas-20140401_194521.Q5xw3V.img.pid ]
+ cat /var/tmp/adt/disks/run/precise-amd64-maas-20140401_194521.Q5xw3V.img.pid
+ kill -9 11832
+ rm -f /var/tmp/adt/disks/run/precise-amd64-maas-20140401_194521.Q5xw3V.img.pid
+ on_exit
+ log_info_msg Cleaning up
+ log_msg Info: Cleaning up\n
+ date +%F %X
+ printf 2014-04-01 21:14:41: Info: Cleaning up\n
2014-04-01 21:14:41: Info: Cleaning up
+ [ -f /var/tmp/adt/disks/run/precise-amd64-maas-20140401_194521.Q5xw3V.img.pid ]
+ rm -f /var/lock/adt/ssh.54323.lock
+ rm -f /var/lock/adt/vnc.5911.lock
+ [ -d /tmp/adt-amd64.pPrECR ]
+ rm -Rf /tmp/adt-amd64.pPrECR
+ rm -f /var/tmp/adt/disks/run/precise-amd64-maas-20140401_092547.FqTsZO.img /var/tmp/adt/disks/run/precise-amd64-maas-20140401_092547.FqTsZO.img.pid /var/tmp/adt/disks/run/precise-amd64-maas-20140401_194521.Q5xw3V.img
Archiving artifacts
Email was triggered for: Failure
Sending email for trigger: Failure