Node Discovery Not Working

Bug #1466993 reported by Jason
This bug affects 1 person
Affects              Status    Importance  Assigned to                Milestone
Fuel for OpenStack   Invalid   High        Fuel Python (Deprecated)
6.1.x                Invalid   High        Fuel Python (Deprecated)
7.0.x                Invalid   High        Fuel Python (Deprecated)

Bug Description

Seen on:
Cisco B200 blades in a Cisco 5108 UCS chassis. I allocated 3 new blades for a Mirantis OpenStack test deployment in the lab. The blades have:
32 x Intel(R) Xeon(R) CPU E5-2658 0 @ 2.10GHz
392 GB RAM
2 x 300GB HDD with no RAID
4 x NIC allocated via UCS Manager (one on OAM network, 3 on bearer network)

Reproduce:
1) Download and install MirantisOpenStack-6.0.iso
2) Install on blade #1
       - Note: The DHCP Pool I allocated during the install was 10 addresses wide on a /25 network
3) PXE Boot the two other blades
4) Choose the Ubuntu install option, answer the questions; the install succeeds and the blades reboot

Result:
The Fuel dashboard never detects any total or unallocated nodes; the total is always 0. Blades 2 and 3 respond to ping, and I installed openssh-server on one of them via the console and it appears to work fine. The fact that they obtained IPs and completed a successful OS install from the Fuel server seems to indicate that there are no network connectivity issues.

I looked through the installation guide but did not see a way to provoke blades 2/3 to contact the fuel server manually, so thought I'd send through a report to see if you'd seen this. I'll leave it up a few more days in case you want some logs or other info.

Changed in fuel:
milestone: none → 7.0
Revision history for this message
Vladimir Kozhukalov (kozhukalov) wrote :

Jason, we definitely need a diagnostic snapshot from your environment. It looks like our discovery agent cannot detect this hardware and raises an exception. We need the logs to find out where the problem is.

Revision history for this message
Vladimir Kozhukalov (kozhukalov) wrote :

You can also look at the logs and see whether there are any errors in /var/log/docker-logs/remote/node-X.domain.tld/bootstrap/agent.log
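
For instance, something like this on the master node (just a sketch; node-X.domain.tld stands for the slave node's log directory, and the exact layout may differ on your setup):

grep -iE 'error|exception|traceback' /var/log/docker-logs/remote/node-X.domain.tld/bootstrap/agent.log

Or scan every node's bootstrap agent log at once:

grep -iE 'error|exception|traceback' /var/log/docker-logs/remote/*/bootstrap/agent.log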

Revision history for this message
Jason (js7558) wrote :

I can provide that (diagnostic snapshot) if you tell me how to do it.

Also, I had a problem with the log at /var/log/docker-logs/remote/node-X.domain.tld/bootstrap/agent.log. There is no log at this location. Looks like the bootstrap directory does not exist:

[root@fuel 3f9f55993340]# hostname && pwd && ls -l
fuel.mog.tld
/var/log/docker-logs/remote/3f9f55993340
total 12
-rw-r----- 1 root adm 3300 Jun 19 20:28 kernel.log
-rw-r----- 1 root adm 604 Jun 19 20:28 rsyslogd-2182.log
-rw-r----- 1 root adm 272 Jun 19 20:28 rsyslogd.log
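
In case it's useful, I can also run a broader search for any agent logs under the docker-logs tree, e.g. (paths guessed from the layout above):

find /var/log/docker-logs/remote -maxdepth 2 -type d
find /var/log/docker-logs -name 'agent*.log' 2>/dev/null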

Let me know what else I can grab for you off of here.

Thanks,
Jason

Revision history for this message
Alexander Kislitsky (akislitsky) wrote :

Jason, to create a diagnostic snapshot, go to the Support tab in the Web UI and press the 'Generate Diagnostic Snapshot' button. Once the snapshot has been generated, please download it and attach it to this bug.
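
If the Web UI is not an option, the Fuel CLI on the master node may also be able to create one; whether this command is available depends on your release, so treat it as a hint rather than a guarantee:

fuel snapshot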

Revision history for this message
Jason (js7558) wrote :

Here you go..

Revision history for this message
Jason (js7558) wrote :

Anybody care?

Revision history for this message
Vladimir Kozhukalov (kozhukalov) wrote :

Jason,

Very sorry for the delay. The diagnostic snapshot attached to this bug was not helpful because it does not contain any logs from the slave nodes. Perhaps some early exception prevented the slave nodes from sending log messages to the master node. To figure out what is going on we would need access to this lab; otherwise I am afraid we will not be able to solve this.

Revision history for this message
Jason (js7558) wrote :

This is not a public lab that I can provide you access to and I've already torn this installation down. I guess there's nothing to do here, so probably best to close this out.

Revision history for this message
Aleksandr Didenko (adidenko) wrote :

Marking as invalid.
