openstack object save - MemoryError

Bug #1663771 reported by Dmitry Sutyagin
Affects              Status        Importance   Assigned to        Milestone
Mirantis OpenStack   In Progress   Medium       Nikita Konovalov
10.0.x               Won't Fix     Medium       Nikita Konovalov
9.x                  Won't Fix     Medium       Nikita Konovalov

Bug Description

Reproduced on MOS 9.2

Issue summary: openstack object client silently fails when downloading an object.

Steps to reproduce:
- Deploy controllers with 2GB RAM (virtual lab), then:
root@node-6:/mnt/backup/swift-glance# free -m
                   total       used       free     shared    buffers     cached
Mem:                2000       1353        647          7         10         93
-/+ buffers/cache:             1249        751
root@node-6:/mnt/backup/swift-glance# openstack object show glance 18c0fae2-4e31-464d-9776-b3033945224e
+----------------+---------------------------------------+
| Field | Value |
+----------------+---------------------------------------+
| account | AUTH_231d9fe5c973475581cac208e6de4534 |
| container | glance |
| content-length | 341311488 |
| content-type | application/octet-stream |
| etag | 3209808b650928a389f4c94a67f5f3a9 |
| last-modified | Wed, 08 Feb 2017 00:38:16 GMT |
| object | 18c0fae2-4e31-464d-9776-b3033945224e |
+----------------+---------------------------------------+
root@node-6:/mnt/backup/swift-glance# openstack --debug object save glance 18c0fae2-4e31-464d-9776-b3033945224e
...
Traceback (most recent call last):
  File "/usr/lib/python2.7/dist-packages/cliff/app.py", line 374, in run_subcommand
    result = cmd.run(parsed_args)
  File "/usr/lib/python2.7/dist-packages/openstackclient/common/command.py", line 38, in run
    return super(Command, self).run(parsed_args)
  File "/usr/lib/python2.7/dist-packages/cliff/command.py", line 54, in run
    self.take_action(parsed_args)
  File "/usr/lib/python2.7/dist-packages/openstackclient/object/v1/object.py", line 204, in take_action
    file=parsed_args.file,
  File "/usr/lib/python2.7/dist-packages/openstackclient/api/object_store_v1.py", line 369, in object_save
    stream=True,
  File "/usr/lib/python2.7/dist-packages/openstackclient/api/api.py", line 83, in _request
    return session.request(url, method, **kwargs)
  File "/usr/lib/python2.7/dist-packages/openstackclient/common/session.py", line 40, in request
    resp = super(TimingSession, self).request(url, method, **kwargs)
  File "/usr/lib/python2.7/dist-packages/positional/__init__.py", line 94, in inner
    return func(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/keystoneauth1/session.py", line 452, in request
    resp = send(**kwargs)
  File "/usr/lib/python2.7/dist-packages/keystoneauth1/session.py", line 517, in _send_request
    self._http_log_response(response=resp, logger=logger)
  File "/usr/lib/python2.7/dist-packages/positional/__init__.py", line 94, in inner
    return func(*args, **kwargs)
  File "/usr/lib/python2.7/dist-packages/keystoneauth1/session.py", line 263, in _http_log_response
    text = self._remove_service_catalog(response.text)
  File "/usr/lib/python2.7/dist-packages/requests/models.py", line 769, in text
    encoding = self.apparent_encoding
  File "/usr/lib/python2.7/dist-packages/requests/models.py", line 643, in apparent_encoding
    return chardet.detect(self.content)['encoding']
  File "/usr/lib/python2.7/dist-packages/chardet/__init__.py", line 24, in detect
    u.feed(aBuf)
  File "/usr/lib/python2.7/dist-packages/chardet/universaldetector.py", line 115, in feed
    if prober.feed(aBuf) == constants.eFoundIt:
  File "/usr/lib/python2.7/dist-packages/chardet/charsetgroupprober.py", line 59, in feed
    st = prober.feed(aBuf)
  File "/usr/lib/python2.7/dist-packages/chardet/sjisprober.py", line 53, in feed
    for i in range(0, aLen):
MemoryError

Expected result: the openstack client successfully downloads the ~340 MB object from Swift.

Actual result: silent failure.

Reproducibility: 100% on this env.

Impact: cannot use openstack object CLI.
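
Note on the traceback: the MemoryError is raised before the object body is ever written to disk. With debug logging enabled, keystoneauth reads response.text, and requests then materializes the entire body in memory and runs chardet over it to guess an encoding; for a ~340 MB object on a 2 GB node that exhausts memory. As a rough illustration only (not the client's actual code; the endpoint and token below are hypothetical placeholders), a streamed download written out in fixed-size chunks avoids ever holding the whole body in memory:

import requests

# Hypothetical proxy endpoint and token -- placeholders for illustration only.
url = ("http://controller:8080/v1/AUTH_231d9fe5c973475581cac208e6de4534"
       "/glance/18c0fae2-4e31-464d-9776-b3033945224e")
headers = {"X-Auth-Token": "<token>"}

# stream=True defers reading the body, so nothing large is buffered yet.
resp = requests.get(url, headers=headers, stream=True)
resp.raise_for_status()

with open("18c0fae2-4e31-464d-9776-b3033945224e", "wb") as f:
    # Fixed-size chunks keep memory use bounded regardless of object size.
    for chunk in resp.iter_content(chunk_size=64 * 1024):
        f.write(chunk)

# Accessing resp.text here would defeat the streaming: requests would load
# the entire body and run chardet over it to guess an encoding, which is
# exactly the path the debug logger takes in the traceback above.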

Changed in mos:
milestone: none → 9.x-updates
Dmitry Sutyagin (dsutyagin) wrote :
tags: added: area-mos
Nikita Konovalov (nkonovalov) wrote :

Does the issue reproduce on an environment with a more powerful controller? I would suggest setting up a similar environment with 16+ GB of RAM and checking the same operations.

I've also noticed that you don't have any swap in your "free -m" output. Using swap may resolve the issue, but the operations will be very slow.

Fuel Devops McRobotson (fuel-devops-robot) wrote : Related fix proposed to openstack/python-openstackclient (9.0/mitaka)

Related fix proposed to branch: 9.0/mitaka
Change author: Jordan Pittier <email address hidden>
Review: https://review.fuel-infra.org/30996

Vitalii Gridnev (vgridnev) wrote :

There are two issues around this bug:

 1. Keystoneauth tries to read response.text for the received response when debug logging is enabled. This consumes all available memory (14 GB for a 400 MB file in my case). The issue is already fixed upstream in the stable/mitaka branch; change https://review.fuel-infra.org/#/c/30981/ brings all keystoneauth changes from stable/mitaka (a conceptual sketch of that kind of guard follows after this list).
 2. The 'openstack object save' command is extremely slow (see https://bugs.launchpad.net/python-openstackclient/+bug/1654645). A fix is proposed at https://review.fuel-infra.org/30996
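
For issue 1, the shape of the upstream keystoneauth fix is to stop the debug logger from decoding non-text bodies. A conceptual sketch of that kind of guard inside Session._http_log_response (simplified for illustration, not the literal upstream patch):

def _http_log_response(self, response, logger):
    # Status line and headers would still be logged; only body handling
    # is guarded here.
    content_type = response.headers.get('content-type', '')

    # Only decode and log text-like bodies. For a large binary payload
    # (e.g. a ~340 MB Swift object) reading response.text loads the whole
    # body into memory and runs charset detection over it.
    if 'application/json' in content_type or content_type.startswith('text/'):
        text = self._remove_service_catalog(response.text)
        logger.debug('RESP BODY: %s', text)
    else:
        logger.debug('RESP BODY: Omitted, Content-Type is %s', content_type)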

Vitalii Gridnev (vgridnev) wrote :

Additional note about issue 1: in my case, --debug is actually not required to reproduce the issue.

Fuel Devops McRobotson (fuel-devops-robot) wrote : Related fix proposed to openstack/python-openstackclient (10.0/newton)

Related fix proposed to branch: 10.0/newton
Change author: Jordan Pittier <email address hidden>
Review: https://review.fuel-infra.org/30999

Vitalii Gridnev (vgridnev) wrote :

Moving to won't fix due to the Medium priority.
