Deployment fails at the rabbitmq.pp step with the alert "task on node is valid, but failed"

Bug #1433583 reported by Tatyana Dubyk
Affects: Fuel for OpenStack
Status: New
Importance: Undecided
Assigned to: Fuel Partner Integration Team
Milestone: 6.1

Bug Description

Deployment fails at the rabbitmq.pp step with the alert "task on node is valid, but failed".

Steps to reproduce:
Set up a lab from the 6.1 ISO (build #202) on 5 slaves.
1. Create a cluster on Ubuntu with vCenter+QEMU as the hypervisors,
    all other settings left at their defaults
2. Add 4 nodes with the roles:
                 3 controllers
                 1 compute
3. Deploy the cluster
(A rough CLI equivalent of this setup is sketched after the interface list below.)

Settings for vCenter in the VMware tab:
Availability zone: vcenter
vCenter host: 172.16.0.254
vCenter username: <email address hidden>
vCenter password: Qwer!1234
Cluster: ‘Cluster1’
Service name: any
Datastore regex: .*

Network type: nova-network, FlatDHCPManager, with these settings:
public network: 172.16.0.0/24
private (VM) network: tag set to 103
management network: 172.16.1.0/24 (tag 102)
storage network: default (tag 104)

Interfaces on the slaves should be set up this way in the Fuel UI:
eth0 - admin(PXE)
eth1 - public
eth2 - management
eth3 - VM(Fixed)
eth4 - storage
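
For reference, roughly the same environment can also be put together from the Fuel master with the fuel CLI; this is only a rough sketch (command and flag names are from memory of the 6.x python-fuelclient and may differ slightly on this build, and the VMware tab values above still have to be entered through the UI):

    fuel release                                   # find the id of the Ubuntu 2014.2-6.1 release
    fuel env create --name vcenter-test --rel <release_id> --mode ha --net nova   # nova-network; flag name assumed
    fuel node                                      # list discovered slaves and their ids
    fuel node set --node <id1>,<id2>,<id3> --role controller --env <env_id>
    fuel node set --node <id4> --role compute --env <env_id>
    fuel network --env <env_id> --download         # edit CIDRs/VLAN tags as above, then push back with --upload
    fuel deploy-changes --env <env_id>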

Reason of the failure in the Astute logs:
--------------------------------------------------------------------
2015-03-17 18:01:09 ERR
[418] No more tasks will be executed on the node 4

2015-03-17 18:01:09 ERR
[418] Task '{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}' on node 4 valid, but failed

Full errors from the Astute logs:
--------------------------------------------------------------------
2015-03-17T18:01:07 debug: [418] Data received by DeploymentProxyReporter to report it up: {"nodes"=>[{"uid"=>"4", "progress"=>55, "status"=>"deploying", "role"=>"primary-controller", "task"=>{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}}]}
2015-03-17T18:01:07 warning: [418] Attempt to assign lesser progress detected: Progress was: deploying, attempted to assign: 55. Skipping this node (id=4)
2015-03-17T18:01:09 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'puppetd', method 'last_run_summary', results: {:sender=>"4", :statuscode=>0, :statusmsg=>"OK", :data=>{:time=>{"anchor"=>0.000254544, "config_retrieval"=>1.812490243, "cs_resource"=>4.861098403, "exec"=>0.000234807, "file"=>0.192907183, "filebucket"=>0.000159984, "package"=>0.000524482, "rabbitmq_plugin"=>1.138060749, "schedule"=>0.000862692, "service"=>650.661323062, "total"=>658.6679161489999, "last_run"=>1426615268}, :resources=>{"failed"=>1, "changed"=>1, "total"=>30, "restarted"=>0, "out_of_sync"=>2, "failed_to_restart"=>0, "scheduled"=>0, "skipped"=>7}, :changes=>{"total"=>2}, :events=>{"failure"=>1, "success"=>2, "total"=>3}, :version=>{"config"=>1426614598, "puppet"=>"3.4.3"}, :status=>"stopped", :running=>0, :enabled=>1, :idling=>0, :stopped=>1, :lastrun=>1426615268, :runtime=>1, :output=>"Currently stopped; last completed run 1 seconds ago"}}
2015-03-17T18:01:09 debug: [418] Node 4(primary-controller) status: error
2015-03-17T18:01:09 debug: [418] Node 4 has failed to deploy. There is no more retries for puppet run.
2015-03-17T18:01:09 debug: [418] {"nodes"=>[{"status"=>"error", "error_type"=>"deploy", "uid"=>"4", "role"=>"primary-controller"}]}
2015-03-17T18:01:09 err: [418] Task '{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}' on node 4 valid, but failed
2015-03-17T18:01:09 err: [418] No more tasks will be executed on the node 4
2015-03-17T18:01:09 debug: [418] Data received by DeploymentProxyReporter to report it up: {"nodes"=>[{"uid"=>"4", "status"=>"error", "error_type"=>"deploy", "role"=>"primary-controller", "task"=>{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}}]}
2015-03-17T18:01:09 debug: [418] Data send by DeploymentProxyReporter to report it up: {"nodes"=>[{"uid"=>"4", "status"=>"error", "error_type"=>"deploy", "role"=>"primary-controller", "task"=>{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}}]}
2015-03-17T18:01:09 info: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: Finished deployment of nodes => roles: {"4"=>"primary-controller"}
2015-03-17T18:01:09 info: [418] Casting message to Nailgun: {"method"=>"deploy_resp", "args"=>{"task_uuid"=>"22c06470-c5f9-43bc-acb7-46fbeba7d795", "nodes"=>[{"uid"=>"4", "status"=>"error", "error_type"=>"deploy", "role"=>"primary-controller", "task"=>{"priority"=>1800, "type"=>"puppet", "uids"=>["4"], "parameters"=>{"puppet_modules"=>"/etc/puppet/modules", "puppet_manifest"=>"/etc/puppet/modules/osnailyfacter/modular/rabbitmq/rabbitmq.pp", "timeout"=>3600, "cwd"=>"/"}}}]}}
2015-03-17T18:01:09 warning: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: Critical nodes with uids: 4 fail to deploy. Stop deployment
2015-03-17T18:01:09 warning: [418] This nodes: [{"uid"=>"2", "role"=>"controller"}, {"uid"=>"5", "role"=>"controller"}] will not deploy because at least one critical node deployment fail
2015-03-17T18:01:09 warning: [418] This nodes: [{"uid"=>"3", "role"=>"compute"}] will not deploy because at least one critical node deployment fail
2015-03-17T18:01:09 info: [418] Run hook ---
priority: 100
type: upload_file
uids:
- '2'
- '3'
- '4'
- '5'
parameters:
 path: "/etc/hiera/nodes.yaml"
  data: |
    nodes:
    - {fqdn: node-2.test.domain.local, internal_address: 172.16.1.4, internal_netmask: 255.255.255.0,
      name: node-2, public_address: 172.16.0.4, public_netmask: 255.255.255.0, role: controller,
      storage_address: 192.168.1.2, storage_netmask: 255.255.255.0, swift_zone: '2', uid: '2',
      user_node_name: 'Untitled (ea:df)'}
    - {fqdn: node-3.test.domain.local, internal_address: 172.16.1.5, internal_netmask: 255.255.255.0,
      name: node-3, public_address: 172.16.0.5, public_netmask: 255.255.255.0, role: compute,
      storage_address: 192.168.1.3, storage_netmask: 255.255.255.0, swift_zone: '3', uid: '3',
      user_node_name: 'Untitled (15:22)'}
    - {fqdn: node-4.test.domain.local, internal_address: 172.16.1.9, internal_netmask: 255.255.255.0,
      name: node-4, public_address: 172.16.0.9, public_netmask: 255.255.255.0, role: primary-controller,
      storage_address: 192.168.1.5, storage_netmask: 255.255.255.0, swift_zone: '4', uid: '4',
      user_node_name: 'Untitled (b7:9b)'}
    - {fqdn: node-5.test.domain.local, internal_address: 172.16.1.10, internal_netmask: 255.255.255.0,
      name: node-5, public_address: 172.16.0.10, public_netmask: 255.255.255.0, role: controller,
      storage_address: 192.168.1.6, storage_netmask: 255.255.255.0, swift_zone: '5', uid: '5',
      user_node_name: 'Untitled (7e:55)'}

2015-03-17T18:01:09 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'uploadfile', method 'upload', results: {:sender=>"3", :statuscode=>0, :statusmsg=>"OK", :data=>{:msg=>"File was uploaded!"}}
2015-03-17T18:01:09 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'uploadfile', method 'upload', results: {:sender=>"2", :statuscode=>0, :statusmsg=>"OK", :data=>{:msg=>"File was uploaded!"}}
2015-03-17T18:01:09 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'uploadfile', method 'upload', results: {:sender=>"4", :statuscode=>0, :statusmsg=>"OK", :data=>{:msg=>"File was uploaded!"}}
2015-03-17T18:01:09 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'uploadfile', method 'upload', results: {:sender=>"5", :statuscode=>0, :statusmsg=>"OK", :data=>{:msg=>"File was uploaded!"}}
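
Since rabbitmq.pp, like the other modular tasks, takes its input from hiera on the node, it may also be worth confirming on node-4 that the data Astute uploads is actually resolvable there; a small check, assuming the default hiera configuration Fuel puts on the slaves:

    hiera -c /etc/hiera.yaml nodes    # should print the node list uploaded above
    hiera -c /etc/hiera.yaml role     # expected: primary-controller on node-4
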
2015-03-17T18:01:09 info: [418] Run hook ---
priority: 200
type: puppet
uids:
- '2'
- '3'
- '4'
- '5'
parameters:
  puppet_modules: "/etc/puppet/modules"
  puppet_manifest: "/etc/puppet/modules/osnailyfacter/modular/hosts/hosts.pp"
  timeout: 3600
  cwd: "/"

2015-03-17T18:01:09 debug: [418] Waiting for puppet to finish deployment on all nodes (timeout = 5400 sec)...

2015-03-17T18:01:10 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'puppetd', method 'runonce', results: {:sender=>"3", :statuscode=>0, :statusmsg=>"OK", :data=>{:output=>"Called /usr/sbin/daemonize -a -l /tmp/puppetd.lock -p /tmp/puppetd.lock -c / /usr/bin/puppet apply /etc/puppet/modules/osnailyfacter/modular/hosts/hosts.pp --modulepath=/etc/puppet/modules --logdest syslog --trace --no-report --debug --evaltrace --logdest /var/log/puppet.log, Currently stopped; last completed run 3181 seconds ago", :status=>"stopped", :running=>0, :enabled=>1, :idling=>0, :stopped=>1, :lastrun=>1426612089, :runtime=>3181}}
2015-03-17T18:01:10 debug: [418] Waiting for puppet to finish deployment on node 4 (timeout = 5400 sec)...

2015-03-17T18:01:10 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'puppetd', method 'runonce', results: {:sender=>"4", :statuscode=>0, :statusmsg=>"OK", :data=>{:output=>"Called /usr/sbin/daemonize -a -l /tmp/puppetd.lock -p /tmp/puppetd.lock -c / /usr/bin/puppet apply /etc/puppet/modules/osnailyfacter/modular/hosts/hosts.pp --modulepath=/etc/puppet/modules --logdest syslog --trace --no-report --debug --evaltrace --logdest /var/log/puppet.log, Currently stopped; last completed run 2 seconds ago", :status=>"stopped", :running=>0, :enabled=>1, :idling=>0, :stopped=>1, :lastrun=>1426615268, :runtime=>2}}

2015-03-17T18:01:50 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: MC agent 'execute_shell_command', method 'execute', results: {:sender=>"4", :statuscode=>0, :statusmsg=>"OK", :data=>{:stdout=>". /root/openrc && /usr/bin/glance image-create --name 'TestVM-VMDK' --is-public 'true' --container-format='bare' --disk-format='vmdk' --min-ram='64' --property vmware_disktype=sparse --property vmware_adaptertype=lsilogic --property hypervisor_type=vmware --file '/usr/share/cirros-testvm/cirros-i386-disk.vmdk'\nImage 'TestVM-VMDK' was uploaded from '/usr/share/cirros-testvm/cirros-i386-disk.vmdk'\n+-------------------------------+--------------------------------------+\n| Property | Value |\n+-------------------------------+--------------------------------------+\n| Property 'hypervisor_type' | vmware |\n| Property 'vmware_adaptertype' | lsilogic |\n| Property 'vmware_disktype' | sparse |\n| checksum | 55cb970e727c382e06cdd0976e9c7a73 |\n| container_format | bare |\n| created_at | 2015-03-17T18:01:46 |\n| deleted | False |\n| deleted_at | None |\n| disk_format | vmdk |\n| id | fb66bd60-7d1f-4f2e-84e5-419bc1f6c1c9 |\n| is_public | True |\n| min_disk | 0 |\n| min_ram | 64 |\n| name | TestVM-VMDK |\n| owner | 94ed1dc8513140d583e2dbec37302a0c |\n| protected | False |\n| size | 19791872 |\n| status | active |\n| updated_at | 2015-03-17T18:01:48 |\n| virtual_size | None |\n+-------------------------------+--------------------------------------+\n. /root/openrc && /usr/bin/glance image-create --name 'TestVM' --is-public 'true' --container-format='bare' --disk-format='qcow2' --min-ram='64' --property murano_image_info='{\"title\": \"Murano Demo\", \"type\": \"cirros.demo\"}' --file '/usr/share/cirros-testvm/cirros-x86_64-disk.img'\nImage 'TestVM' upload from '/usr/share/cirros-testvm/cirros-x86_64-disk.img' have FAILED!\n\n", :stderr=>"Error finding address for http://172.16.0.7:9292/v1/images: [Errno 104] Connection reset by peer\n", :exit_code=>0}}
2015-03-17T18:01:50 debug: [418] 22c06470-c5f9-43bc-acb7-46fbeba7d795: cmd: cd / && ruby /etc/puppet/modules/osnailyfacter/modular/astute/upload_cirros.rb
cwd: /
stdout: . /root/openrc && /usr/bin/glance image-create --name 'TestVM-VMDK' --is-public 'true' --container-format='bare' --disk-format='vmdk' --min-ram='64' --property vmware_disktype=sparse --property vmware_adaptertype=lsilogic --property hypervisor_type=vmware --file '/usr/share/cirros-testvm/cirros-i386-disk.vmdk'
Image 'TestVM-VMDK' was uploaded from '/usr/share/cirros-testvm/cirros-i386-disk.vmdk'
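
Judging from the last_run_summary above, almost all of the failed task's time went into the 'service' and 'cs_resource' resources, and the later upload_cirros step could not reach glance at http://172.16.0.7:9292 ("Connection reset by peer"), so the first things to check on node-4 are probably the Pacemaker-managed RabbitMQ and the public VIP / glance endpoint; a sketch (the p_rabbitmq-server resource name is the one Fuel usually uses and is assumed here, as is the log path):

    crm status                                  # overall Pacemaker view, including the RabbitMQ resource
    crm resource status p_rabbitmq-server       # resource name assumed
    rabbitmqctl cluster_status
    tail -n 100 /var/log/rabbitmq/*.log         # assumed log location
    . /root/openrc && glance image-list         # same endpoint the upload_cirros hook failed against
    curl -v http://172.16.0.7:9292/             # raw check of the glance API address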

-------------------------------fuel-version-------------------
api: '1.0'
astute_sha: 93e427ac49109fa3fd8b0e1d0bb3d14092be2e8c
auth_required: true
build_id: 2015-03-16_22-54-44
build_number: '202'
feature_groups:
- mirantis
fuellib_sha: 924d73ae4766646e1c3a44d7b59c4120985e45f0
fuelmain_sha: 608b72a6f79a719cf01c35a19d0091fe20c8288a
nailgun_sha: 874df0d06e32f14db77746cfeb2dd74d4a6e528c
ostf_sha: e86c961ceacfa5a8398b6cbda7b70a5f06afb476
production: docker
python-fuelclient_sha: 2509c9b72cdcdbe46c141685a99b03cd934803be
release: '6.1'
release_versions:
  2014.2-6.1:
    VERSION:
      api: '1.0'
      astute_sha: 93e427ac49109fa3fd8b0e1d0bb3d14092be2e8c
      build_id: 2015-03-16_22-54-44
      build_number: '202'
      feature_groups:
      - mirantis
      fuellib_sha: 924d73ae4766646e1c3a44d7b59c4120985e45f0
      fuelmain_sha: 608b72a6f79a719cf01c35a19d0091fe20c8288a
      nailgun_sha: 874df0d06e32f14db77746cfeb2dd74d4a6e528c
      ostf_sha: e86c961ceacfa5a8398b6cbda7b70a5f06afb476
      production: docker
      python-fuelclient_sha: 2509c9b72cdcdbe46c141685a99b03cd934803be
      release: '6.1'

Tags: vcenter
Tatyana Dubyk (tdubyk)
tags: added: vcenter
Tatyana Dubyk (tdubyk)
description: updated
Tatyana Dubyk (tdubyk)
Changed in fuel:
assignee: nobody → Fuel Partner Integration Team (fuel-partner)
milestone: none → 6.1