DBConnectionError while running OSTF tests

Bug #1611422 reported by Ann Taraday
Affects               Status       Importance   Assigned to        Milestone
Fuel for OpenStack    Incomplete   High         Fuel Sustaining
  Mitaka              New          High         Fuel Sustaining

Bug Description

The job https://ci.fuel-infra.org/job/10.0-community.main.ubuntu.bvt_2/455/console failed with AssertionError: Failed 1 OSTF tests; should fail 0 tests. Names of failed tests:
  - Check network connectivity from instance via floating IP (failure) Floating IP can not be assigned. Please refer to OpenStack logs for more details.
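For context, the failing OSTF step allocates a floating IP on the external network and associates it with the instance's port. Below is a minimal sketch of that operation with python-neutronclient; it is not the OSTF code itself, and the credentials, endpoint, and IDs are placeholders, not values from this environment.

    from neutronclient.v2_0 import client

    # Placeholder credentials and endpoint; the real values come from the
    # test environment under deployment.
    neutron = client.Client(username='admin', password='admin',
                            tenant_name='admin',
                            auth_url='http://192.168.0.2:5000/v2.0')

    # Allocate a floating IP on the external network and attach it to the
    # instance port; this is the kind of call that fails when neutron-server
    # cannot reach its database.
    fip = neutron.create_floatingip({'floatingip': {
        'floating_network_id': 'EXT-NET-ID',      # placeholder
        'port_id': 'INSTANCE-PORT-ID',            # placeholder
    }})
    print(fip['floatingip']['floating_ip_address'])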

The OpenStack logs contain many DBConnectionError errors from nova and neutron at the same time the test failure occurred.

node-6/var/log/neutron/neutron-openvswitch-agent.log:20347:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent [-] Failed reporting state!
node-6/var/log/neutron/neutron-openvswitch-agent.log:20348:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent Traceback (most recent call last):
node-6/var/log/neutron/neutron-openvswitch-agent.log:20349:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/neutron/plugins/ml2/drivers/openvswitch/agent/ovs_neutron_agent.py", line 322, in _report_state
node-6/var/log/neutron/neutron-openvswitch-agent.log:20350:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent True)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20351:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/neutron/agent/rpc.py", line 87, in report_state
node-6/var/log/neutron/neutron-openvswitch-agent.log:20352:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return method(context, 'report_state', **kwargs)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20353:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/neutron/common/rpc.py", line 138, in call
node-6/var/log/neutron/neutron-openvswitch-agent.log:20354:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent return self._original_context.call(ctxt, method, **kwargs)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20355:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/rpc/client.py", line 169, in call
node-6/var/log/neutron/neutron-openvswitch-agent.log:20356:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent retry=self.retry)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20357:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/transport.py", line 96, in _send
node-6/var/log/neutron/neutron-openvswitch-agent.log:20358:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent timeout=timeout, retry=retry)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20359:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 464, in send
node-6/var/log/neutron/neutron-openvswitch-agent.log:20360:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent retry=retry)
node-6/var/log/neutron/neutron-openvswitch-agent.log:20361:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent File "/usr/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py", line 455, in _send
node-6/var/log/neutron/neutron-openvswitch-agent.log:20362:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent raise result
node-6/var/log/neutron/neutron-openvswitch-agent.log:20363:2016-08-07 20:20:54.150 9875 ERROR neutron.plugins.ml2.drivers.openvswitch.agent.ovs_neutron_agent RemoteError: Remote error: DBConnectionError (_mysql_exceptions.OperationalError) (2013, "Lost connection to MySQL server at 'reading initial communication packet', system error: 0") [SQL: u'SELECT 1']
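The "SELECT 1" in the RemoteError appears to be the connection liveness probe that oslo.db/SQLAlchemy issue on a pooled connection; when the probe itself fails, the error surfaces as DBConnectionError on neutron-server and is propagated back to the agent over RPC. A minimal sketch of the same probe is below; it is not the actual neutron-server code, the DSN is a placeholder, and it assumes the MySQLdb driver (the _mysql_exceptions in the traceback) is installed.

    from sqlalchemy import create_engine, text

    # Placeholder DSN; the real connection string points at the Galera VIP
    # on the controllers.
    engine = create_engine('mysql://neutron:password@192.168.0.2/neutron')

    try:
        with engine.connect() as conn:
            conn.execute(text('SELECT 1'))  # same probe query as in the traceback
        print('database reachable')
    except Exception as exc:
        # With the MySQL server unreachable this raises OperationalError
        # (2013, "Lost connection to MySQL server ..."), which oslo.db wraps
        # as DBConnectionError.
        print('database unreachable: %s' % exc)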

Revision history for this message
Ann Taraday (akamyshnikova) wrote :
tags: added: bvt-failure
Changed in mos:
assignee: nobody → MOS Neutron (mos-neutron)
milestone: none → 10.0
importance: Undecided → High
status: New → Confirmed
tags: added: area-neutron
Changed in mos:
assignee: MOS Neutron (mos-neutron) → Ann Taraday (akamyshnikova)
Revision history for this message
Ann Taraday (akamyshnikova) wrote :

In the nova logs at the time of the failed test (http://paste.openstack.org/show/559007/) there is an error from the Neutron server: "InternalServerError: Request Failed: internal server error while processing your request". In the Neutron server logs (http://paste.openstack.org/show/559010/) there is the error "DBConnectionError: (_mysql_exceptions.OperationalError) (2013, "Lost connection to MySQL server at 'reading initial communication packet', system error: 0")".

In mysql/error.log there are no errors, but there are mysql-related errors in the pacemaker logs (http://paste.openstack.org/show/559016/): "p_mysqld (ocf::fuel:mysql-wss): FAILED node-2.test.domain.local".
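Since mysqld itself logged nothing, a useful cross-check is the cluster's own view of its state via the Galera wsrep status variables (the same signals the ocf mysql-wss agent evaluates). A hedged sketch, with a placeholder DSN and not taken from this report, run against each controller in turn:

    from sqlalchemy import create_engine, text

    # Placeholder DSN for one controller; repeat per node to compare views.
    engine = create_engine('mysql://root:password@node-2.test.domain.local/mysql')

    with engine.connect() as conn:
        rows = conn.execute(text(
            "SHOW STATUS WHERE Variable_name IN "
            "('wsrep_cluster_status', 'wsrep_cluster_size', 'wsrep_local_state_comment')"
        ))
        for name, value in rows:
            # 'Primary', the expected cluster size, and 'Synced' indicate a
            # healthy member; a non-Primary component on some node points at
            # a split-brain.
            print('%s = %s' % (name, value))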

affects: mos → fuel
Changed in fuel:
milestone: 10.0 → none
Changed in fuel:
assignee: Ann Taraday (akamyshnikova) → Fuel Sustaining (fuel-sustaining-team)
milestone: none → 10.0
tags: added: area-library
removed: area-neutron
Revision history for this message
Maksim Malchuk (mmalchuk) wrote :

The issue has not been reproduced on CI for a long time.
It looks like it has already been fixed.
Moving to the Incomplete status; feel free to reopen it if it appears again.

Changed in fuel:
status: Confirmed → Incomplete
Revision history for this message
Dmitriy Kruglov (dkruglov) wrote :
Revision history for this message
Maksim Malchuk (mmalchuk) wrote :

$ grep -irn split-brain node-*
node-1.test.domain.local/ocf-mysql-wss.log:2251:2016-09-14T04:14:51.853688+00:00 err: ERROR: p_mysqld: check_if_galera_pc(): But I'm running a new cluster, PID:21774, this is a split-brain!

$ grep mysql-server-wsrep-5.6 node-1.test.domain.local/dpkg.log
2016-09-14T00:57:34.502992+00:00 info: 2016-09-14 00:57:29 install mysql-server-wsrep-5.6:amd64 <none> 5.6.30-0~u14.04+mos1

We are waiting for the other mysql package.
Marked as a duplicate.
