Intermittent "Error starting thread.: ModuleNotFoundError: No module named 'etcd3gw'" in grenade-py3 jobs since March 14

Bug #1820892 reported by Matt Riedemann
This bug affects 2 people
Affects: devstack
Status: Fix Released
Importance: High
Assigned to: Dr. Jens Harbott

Bug Description

http://logs.openstack.org/96/644596/1/check/grenade-py3/c5059ab/logs/screen-c-bak.txt.gz

Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service [-] Error starting thread.: ModuleNotFoundError: No module named 'etcd3gw'
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service Traceback (most recent call last):
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/oslo_service/service.py", line 796, in run_service
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service service.start()
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/opt/stack/old/cinder/cinder/service.py", line 219, in start
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service coordination.COORDINATOR.start()
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/opt/stack/old/cinder/cinder/coordination.py", line 66, in start
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service cfg.CONF.coordination.backend_url, member_id)
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/tooz/coordination.py", line 800, in get_coordinator
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service invoke_args=(member_id, parsed_url, options)).driver
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/driver.py", line 61, in __init__
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service warn_on_missing_entrypoint=warn_on_missing_entrypoint
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/named.py", line 81, in __init__
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service verify_requirements)
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/extension.py", line 203, in _load_plugins
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service self._on_load_failure_callback(self, ep, err)
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/extension.py", line 195, in _load_plugins
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service verify_requirements,
Mar 19 16:14:03.940298 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/named.py", line 158, in _load_one_plugin
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service verify_requirements,
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/stevedore/extension.py", line 223, in _load_one_plugin
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service plugin = ep.resolve()
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/pkg_resources/__init__.py", line 2417, in resolve
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service module = __import__(self.module_name, fromlist=['__name__'], level=0)
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service File "/usr/local/lib/python3.6/dist-packages/tooz/drivers/etcd3gw.py", line 20, in <module>
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service import etcd3gw
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service ModuleNotFoundError: No module named 'etcd3gw'
Mar 19 16:14:03.943132 ubuntu-bionic-rax-dfw-0003973610 cinder-backup[28520]: ERROR oslo_service.service

http://logstash.openstack.org/#/dashboard/file/logstash.json?query=message:%5C%22Error%20starting%20thread.:%20ModuleNotFoundError:%20No%20module%20named%20'etcd3gw'%5C%22%20AND%20tags:%5C%22screen%5C%22&from=10d

This only shows up in the grenade-py3 job, so I wonder if https://github.com/openstack-dev/grenade/commit/d5dc033a2e6df988b0576fc0417be46cc8d157ed is somehow involved.

or maybe this https://github.com/openstack-dev/devstack/commit/676957ffcff78e790134776f71035a3b14974896

Revision history for this message
Matt Riedemann (mriedem) wrote :

From the grenade logs:

2019-03-19 15:58:12.552 | Check python version for : etcd3gw
2019-03-19 15:58:13.601 | + inc/python:pip_install:349 : local install_test_reqs=
2019-03-19 15:58:13.604 | + inc/python:pip_install:350 : local test_req=etcd3gw/test-requirements.txt
2019-03-19 15:58:13.606 | + inc/python:pip_install:351 : [[ -e etcd3gw/test-requirements.txt ]]
2019-03-19 15:58:13.609 | + inc/python:pip_install:359 : sudo -H http_proxy= https_proxy= no_proxy= PIP_FIND_LINKS= SETUPTOOLS_SYS_PATH_TECHNIQUE=rewrite /usr/local/bin/pip2.7 install -c /opt/stack/old/requirements/upper-constraints.txt etcd3gw

no longer affects: grenade
Changed in devstack:
status: New → Triaged
importance: Undecided → High
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to devstack (master)

Fix proposed to branch: master
Review: https://review.openstack.org/644638

Changed in devstack:
assignee: nobody → Matt Riedemann (mriedem)
status: Triaged → In Progress
Revision history for this message
Matt Riedemann (mriedem) wrote :

This only happens on "rax-dfw" nodes so etcd3gw must be pre-installed in the images used on those nodes.

Revision history for this message
Ghanshyam Mann (ghanshyammann) wrote :

Yeah, other images, like those on ovh-gra1 [1] and vexxhost-sjc1 nodes, attempt a remote package installation using the 3.5 classifier.

- http://logs.openstack.org/21/641921/1/check/grenade-py3/e894fe1/logs/pip3-freeze.txt.gz

Revision history for this message
Matt Riedemann (mriedem) wrote :

Just hit this with the memcache import for keystone:

http://logs.openstack.org/76/644576/1/gate/grenade-py3/e8d5a3b/logs/screen-keystone.txt.gz?level=TRACE#_Mar_22_01_10_29_138043

Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: CRITICAL keystone [-] Unhandled error: ModuleNotFoundError: No module named 'memcache'
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone Traceback (most recent call last):
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/usr/local/bin/keystone-wsgi-public", line 54, in <module>
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone application = initialize_public_application()
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/opt/stack/old/keystone/keystone/server/wsgi.py", line 24, in initialize_public_application
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone name='public', config_files=flask_core._get_config_files())
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/opt/stack/old/keystone/keystone/server/flask/core.py", line 164, in initialize_application
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone startup_application_fn=loadapp)
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/opt/stack/old/keystone/keystone/server/__init__.py", line 46, in setup_backends
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone drivers = backends.load_backends()
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/opt/stack/old/keystone/keystone/server/backends.py", line 41, in load_backends
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone cache.configure_cache()
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/opt/stack/old/keystone/keystone/common/cache/core.py", line 124, in configure_cache
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone cache.configure_cache_region(CONF, region)
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/usr/local/lib/python3.6/dist-packages/oslo_cache/core.py", line 235, in configure_cache_region
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone '%s.' % conf.cache.config_prefix)
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERROR keystone File "/usr/local/lib/python3.6/dist-packages/dogpile/cache/region.py", line 591, in configure_from_config
Mar 22 01:10:29.138043 ubuntu-bionic-rax-dfw-0004128909 <email address hidden>[1887]: ERR...


Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to devstack (master)

Reviewed: https://review.openstack.org/644638
Committed: https://git.openstack.org/cgit/openstack-dev/devstack/commit/?id=ddb6179b0479ea9478cf2a146fe9b0d7592acaec
Submitter: Zuul
Branch: master

commit ddb6179b0479ea9478cf2a146fe9b0d7592acaec
Author: Matt Riedemann <email address hidden>
Date: Tue Mar 19 15:04:12 2019 -0400

    Ease python 3 classifier check in check_python3_support_for_package_local

    This makes the grep match in check_python3_support_for_package_local
    the same as check_python3_support_for_package_remote.

    Change I0349de2026c49279ba7f262d5e86d37018d66326 in grenade started
    setting the PYTHON3_VERSION variable, and then we recently started
    using bionic nodes everywhere which means we're running python 3.6.

    The etcd3gw package has a python 3 and 3.5 classifier, but not 3.6:

    https://pypi.org/project/etcd3gw/

    The pip_install function code that is dealing with installing py3
    packages is hitting a problem installing etcd3gw if the package is
    local because of the more restrictive grep in the
    check_python3_support_for_package_local function, and since
    PYTHON3_VERSION=3.6 now, we don't install from py3 and install
    etcd3gw on python 2.7 which makes services like cinder-volume and
    cinder-backup, which use etcd3gw, fail when they are running under
    python 3 (they get module import errors).

    This simply removes the $ restriction on the grep. Looking at the
    change that added those local/remote functions:

      I243ea4b76f0d5ef57a03b5b0798a05468ee6de9b

    There is no explanation for the difference, it just said:

      Also, since not many packages are classified correctly, fallback
      to looking for just "Programming Language :: Python :: 3" and
      log a message for the package to highlight the problem.

    So that's what this change does.

    Note that alternatives would be:

    1. Update the etcd3gw package to add the 3.6 classifier and do
       a release (this should probably happen anyway).

    2. Add etcd3gw to ENABLED_PYTHON3_PACKAGES but that would be a
       short-term hack workaround.

    Change-Id: Icd3768870ba0f1659bb2e6f002043d975047b73e
    Closes-Bug: #1820892
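
The effect of dropping the `$` anchor described in the commit above can be sketched like this (a simplified illustration only, not devstack's actual function bodies): with a trailing `$`, a version-specific classifier such as `:: 3.5` does not match, so the package looks py2-only.

```shell
# Simplified illustration (assumption: not devstack's exact grep) of why
# anchoring the classifier grep with "$" misclassifies a package that
# only advertises version-specific python 3 classifiers.
classifiers="Programming Language :: Python :: 3.5"

if echo "$classifiers" | grep -q 'Programming Language :: Python :: 3$'; then
    anchored="py3"
else
    anchored="no py3"
fi

if echo "$classifiers" | grep -q 'Programming Language :: Python :: 3'; then
    unanchored="py3"
else
    unanchored="no py3"
fi

echo "anchored grep:   $anchored"
echo "unanchored grep: $unanchored"
```

Running this prints "anchored grep:   no py3" and "unanchored grep: py3", which mirrors the local/remote check discrepancy the fix removes.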

Changed in devstack:
status: In Progress → Fix Released
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to devstack (stable/rocky)

Fix proposed to branch: stable/rocky
Review: https://review.openstack.org/645592

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Change abandoned on devstack (stable/rocky)

Change abandoned by Matt Riedemann (<email address hidden>) on branch: stable/rocky
Review: https://review.openstack.org/645592
Reason: Nevermind this is likely not an issue on stable/rocky because we aren't running grenade-py3 there.

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to devstack (stable/rocky)

Reviewed: https://review.openstack.org/645592
Committed: https://git.openstack.org/cgit/openstack-dev/devstack/commit/?id=9c1b7a5e6ac842d99f11b9be3042ed3499eb156f
Submitter: Zuul
Branch: stable/rocky

commit 9c1b7a5e6ac842d99f11b9be3042ed3499eb156f
Author: Matt Riedemann <email address hidden>
Date: Tue Mar 19 15:04:12 2019 -0400

    Ease python 3 classifier check in check_python3_support_for_package_local

    This makes the grep match in check_python3_support_for_package_local
    the same as check_python3_support_for_package_remote.

    Change I0349de2026c49279ba7f262d5e86d37018d66326 in grenade started
    setting the PYTHON3_VERSION variable, and then we recently started
    using bionic nodes everywhere which means we're running python 3.6.

    The etcd3gw package has a python 3 and 3.5 classifier, but not 3.6:

    https://pypi.org/project/etcd3gw/

    The pip_install function code that is dealing with installing py3
    packages is hitting a problem installing etcd3gw if the package is
    local because of the more restrictive grep in the
    check_python3_support_for_package_local function, and since
    PYTHON3_VERSION=3.6 now, we don't install from py3 and install
    etcd3gw on python 2.7 which makes services like cinder-volume and
    cinder-backup, which use etcd3gw, fail when they are running under
    python 3 (they get module import errors).

    This simply removes the $ restriction on the grep. Looking at the
    change that added those local/remote functions:

      I243ea4b76f0d5ef57a03b5b0798a05468ee6de9b

    There is no explanation for the difference, it just said:

      Also, since not many packages are classified correctly, fallback
      to looking for just "Programming Language :: Python :: 3" and
      log a message for the package to highlight the problem.

    So that's what this change does.

    Note that alternatives would be:

    1. Update the etcd3gw package to add the 3.6 classifier and do
       a release (this should probably happen anyway).

    2. Add etcd3gw to ENABLED_PYTHON3_PACKAGES but that would be a
       short-term hack workaround.

    Change-Id: Icd3768870ba0f1659bb2e6f002043d975047b73e
    Closes-Bug: #1820892
    (cherry picked from commit ddb6179b0479ea9478cf2a146fe9b0d7592acaec)

tags: added: in-stable-rocky
Revision history for this message
Matt Riedemann (mriedem) wrote :

This is also failing in the neutron-grenade-(dvr-)multinode jobs, and seems to be an issue for libvirt-python.

Revision history for this message
Matt Riedemann (mriedem) wrote :

Looks like the python 3 parsing from pypi is failing:

http://logs.openstack.org/79/636079/9/gate/grenade-py3/153ff2c/logs/grenade.sh.txt.gz#_2019-03-29_23_30_47_850

2019-03-29 23:30:47.850 | + inc/python:pip_install_gr:73 : pip_install 'libvirt-python!=4.1.0,!=4.2.0'
2019-03-29 23:30:47.872 | Check python version for : libvirt-python!=4.1.0,!=4.2.0
2019-03-29 23:30:49.354 | WARNING: Did not find python 3 classifier for remote package libvirt-python!=4.1.0,!=4.2.0
2019-03-29 23:30:49.815 | + inc/python:pip_install:353 : local install_test_reqs=
2019-03-29 23:30:49.818 | + inc/python:pip_install:354 : local 'test_req=libvirt-python!=4.1.0,!=4.2.0/test-requirements.txt'
2019-03-29 23:30:49.820 | + inc/python:pip_install:355 : [[ -e libvirt-python!=4.1.0,!=4.2.0/test-requirements.txt ]]
2019-03-29 23:30:49.822 | + inc/python:pip_install:363 : sudo -H http_proxy= https_proxy= no_proxy= PIP_FIND_LINKS= SETUPTOOLS_SYS_PATH_TECHNIQUE=rewrite /usr/local/bin/pip2.7 install -c /opt/stack/old/requirements/upper-constraints.txt 'libvirt-python!=4.1.0,!=4.2.0'

We should be looking up the package name correctly by splitting off the blacklisted versions:

package=$(echo $package_dir | grep -o '^[.a-zA-Z0-9_-]*')
python3_classifier=$(check_python3_support_for_package_remote $package)

$ echo libvirt-python!=4.1.0,!=4.2.0 | grep -o '^[.a-zA-Z0-9_-]*'
libvirt-python

Revision history for this message
Matt Riedemann (mriedem) wrote :

I wrote a little test script for that devstack function and it does seem to work for libvirt-python:

osboxes@osboxes:~/git/devstack$ cat test.sh
#!/bin/bash
source ./inc/python
package=libvirt-python
python3_classifier=$(check_python3_support_for_package_remote $package)
if [[ ! -z "$python3_classifier" ]]; then
    echo "Found package in pypi supports py3"
else
    echo "Package in pypi does not support py3"
fi
osboxes@osboxes:~/git/devstack$ ./test.sh
Found package in pypi supports py3

So I'm not sure why we're hitting a case on the rax-dfw nodes where the check says the package doesn't support py3 - do they have a pypi mirror that's intercepting https://pypi.python.org?

Revision history for this message
Matt Riedemann (mriedem) wrote :

(8:46:52 AM) fungi: mriedem: running that script locally on the mirror.dfw.rax.o.o server (out of convenience) it consistently reports "Found package in pypi supports py3"

(8:48:08 AM) fungi: mriedem: so what would be the behavior of the code in devstack if https://pypi.python.org/pypi/libvirt-python/json occasionally came back truncated?
(8:48:18 AM) fungi: (or empty)
(8:48:25 AM) mriedem: we'd fallback to py2
(8:49:16 AM) fungi: mriedem: so just to confirm, if we got incomplete data from that pypi json metadata api query on occasion, that could explain the behavior exhibited in that bug?
(8:49:26 AM) mriedem: fungi: i think so yeah
(8:49:30 AM) mriedem: b/c we're just doing a grep
(8:49:52 AM) fungi: mriedem: rather than actually trying to parse the json
(8:50:09 AM) mriedem: fungi: correct
(8:50:18 AM) mriedem: https://github.com/openstack-dev/devstack/blob/master/inc/python#L103
(8:52:03 AM) mriedem: fungi: so you're thinking maybe curl is failing on some network error?
(8:52:13 AM) mriedem: it is using -s so any error would be suppressed
(8:52:40 AM) fungi: mriedem: or one of the nearby fastly cdn endpoints is responding with an incomplete payload

Revision history for this message
Jeremy Stanley (fungi) wrote :

I don't think Rackspace is necessarily directly operating any sort of transparent proxy for https://pypi.python.org/, but that site does rely on a vast network of proxies (a CDN), so it's possible that the PyPI CDN endpoints nearest the nodes in rax-dfw are occasionally returning bad responses (those CDN endpoints might even be hosted inside the provider's network; that's not an uncommon choice for larger CDN operators). Since the code in DevStack accessing the PyPI JSON metadata API isn't parsing the JSON itself to make sure it's complete, and is also silencing errors which curl might otherwise emit, truncated or failed queries could explain the observed behavior, so I would start by ruling that out.
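
The failure mode fungi describes - a truncated JSON body silently greps as "no py3 support" - goes away if the response is actually parsed. A hedged sketch (illustrative only; devstack itself is bash and just greps the raw body, and these function names are made up for the example):

```python
import json


def classifiers_support_py3(classifiers):
    """True if any trove classifier declares Python 3 support."""
    return any(c.startswith("Programming Language :: Python :: 3")
               for c in classifiers)


def supports_py3(raw_json):
    """Check a PyPI /pypi/<name>/json response body for py3 classifiers.

    Unlike a grep over the raw body, json.loads() raises ValueError on a
    truncated or empty response instead of quietly reporting "no py3
    support" and triggering a fallback to python 2.
    """
    data = json.loads(raw_json)
    return classifiers_support_py3(data["info"]["classifiers"])
```

With this shape, a flaky CDN response surfaces as an explicit error rather than a wrong answer, which is exactly the distinction the IRC discussion above is circling.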

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Related fix proposed to devstack (master)

Related fix proposed to branch: master
Review: https://review.openstack.org/649091

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to devstack (master)

Fix proposed to branch: master
Review: https://review.openstack.org/649096

Matt Riedemann (mriedem)
Changed in devstack:
status: Fix Released → In Progress
Changed in devstack:
assignee: Matt Riedemann (mriedem) → Dr. Jens Harbott (j-harbott)
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to devstack (master)

Reviewed: https://review.openstack.org/649096
Committed: https://git.openstack.org/cgit/openstack-dev/devstack/commit/?id=e03bcb2c8b8f1ee1cbef579454a30776e43175b3
Submitter: Zuul
Branch: master

commit e03bcb2c8b8f1ee1cbef579454a30776e43175b3
Author: Matt Riedemann <email address hidden>
Date: Mon Apr 1 12:19:45 2019 -0400

    Remove crusty old python 3 package version logic

    If we are running with python3, just assume that any
    package that is not blacklisted is available for py3
    and just attempt to install it and let pip sort it out
    whether it gets installed from a local or remote package.

    Change-Id: Ic05d183e489320f6dfc721575d47e7e4d661f87c
    Closes-Bug: #1820892
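
The simplified behavior this commit describes can be sketched roughly as follows (a hedged illustration; the variable and package names here are made up, not devstack's actual ones): under python 3, install with pip3 unless the package is explicitly blacklisted, with no classifier probing at all.

```shell
# Illustrative sketch of "assume py3 unless blacklisted" (assumption:
# names below are invented for the example, not devstack's real code).
blacklist="legacy-only-pkg another-old-pkg"
package="etcd3gw"

case " $blacklist " in
    *" $package "*) pip_cmd="pip2.7" ;;  # explicitly py2-only
    *)              pip_cmd="pip3"   ;;  # default: let pip3 sort it out
esac

echo "would install $package with $pip_cmd"
```

This sidesteps both failure modes seen earlier in the thread: the over-strict classifier grep and the unparsed, error-silenced PyPI metadata fetch.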

Changed in devstack:
status: In Progress → Fix Released
Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Change abandoned on devstack (master)

Change abandoned by Matt Riedemann (<email address hidden>) on branch: master
Review: https://review.openstack.org/649091

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix proposed to devstack (stable/stein)

Fix proposed to branch: stable/stein
Review: https://review.openstack.org/650522

Revision history for this message
OpenStack Infra (hudson-openstack) wrote : Fix merged to devstack (stable/stein)

Reviewed: https://review.opendev.org/650522
Committed: https://git.openstack.org/cgit/openstack/devstack/commit/?id=779072886c36bf14722986772d468fbaf95ff7a9
Submitter: Zuul
Branch: stable/stein

commit 779072886c36bf14722986772d468fbaf95ff7a9
Author: Matt Riedemann <email address hidden>
Date: Mon Apr 1 12:19:45 2019 -0400

    Remove crusty old python 3 package version logic

    If we are running with python3, just assume that any
    package that is not blacklisted is available for py3
    and just attempt to install it and let pip sort it out
    whether it gets installed from a local or remote package.

    Change-Id: Ic05d183e489320f6dfc721575d47e7e4d661f87c
    Closes-Bug: #1820892
    (cherry picked from commit e03bcb2c8b8f1ee1cbef579454a30776e43175b3)

tags: added: in-stable-stein