Config drive file throws "Request is too large"

Bug #1853635 reported by bhreddy
This bug affects 3 people
Affects: OpenStack Compute (nova)
Status: Invalid
Importance: Undecided
Assigned to: Unassigned

Bug Description

OpenStack version: Queens
I am trying to boot a VM with the config-drive option and two injected files, config1.xml (200 KB) and config2.xml (100 KB):

openstack server create --config-drive=true --file config1.xml=config1.xml --file config2.xml=config2.xml --flavor m1.tiny --image cirros

Instance creation failed with the error "OverLimit: Request is too large".

My quota is already set to 500000.

Output of the nova quota-show command:

| injected_file_content_bytes | 500000 |

I have also updated the following in nova.conf:
injected_file_content_bytes = 500000
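
For reference, a per-project quota like this can also be raised via the client; a sketch, with <project-id> as a placeholder:

    nova quota-update --injected-file-content-bytes 500000 <project-id>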

Any suggestions are highly appreciated.

Revision history for this message
Matt Riedemann (mriedem) wrote :

I believe there is a hard-coded restriction of 64 KiB on the per-file size:

https://docs.openstack.org/nova/latest/user/metadata.html#metadata-userdata

I'm still trying to find where that is enforced, though. But I don't think the quota matters in that case (though the config option should cap its value so operators don't think they can raise the max above that limit).

Revision history for this message
Matt Riedemann (mriedem) wrote :

Never mind, user_data != file injection. I don't see any per-file size restriction on personality files in the nova code. I wonder if there is something in the client code you're using? I don't see anything obvious in the python-novaclient code, though.

Can you recreate the failure using the --debug option on the command line and paste that output so we can see where the request fails?
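
For example (a sketch reusing the file names from the original report; the server name test-vm is a placeholder, and the openstack CLI's --debug flag sends its verbose logging to stderr):

    openstack --debug server create --config-drive=true \
      --file config1.xml=config1.xml --file config2.xml=config2.xml \
      --flavor m1.tiny --image cirros test-vm 2> debug.log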

Revision history for this message
Dr. Jens Harbott (j-harbott) wrote :

Reproduced against Rocky. The same error occurs with two files of 64 KiB each, so the limit seems to apply to the total request, not to a single file.

https://dev:8774 "POST /v2.1/2708f8eb31ef4ac89b47f17ccd05d643/servers HTTP/1.1" 413 131
RESP: [413] Content-Length: 131 Content-Type: application/json Date: Mon, 25 Nov 2019 15:11:30 GMT Server: Apache Via: 1.1 controller-node12.dev:8774 X-Content-Type-Options: nosniff x-compute-request-id: req-b245cf71-225a-47b4-b94a-3c9b5dc86588 x-openstack-request-id: req-b245cf71-225a-47b4-b94a-3c9b5dc86588
RESP BODY: {"message": "Request is too large.<br /><br />\n\n\n", "code": "413 Request Entity Too Large", "title": "Request Entity Too Large"}
POST call to compute for https://dev:8774/v2.1/2708f8eb31ef4ac89b47f17ccd05d643/servers used request id req-b245cf71-225a-47b4-b94a-3c9b5dc86588
Request is too large.<br /><br />

 (HTTP 413) (Request-ID: req-b245cf71-225a-47b4-b94a-3c9b5dc86588)
Traceback (most recent call last):
  File "/usr/lib64/python3.6/site-packages/cliff/app.py", line 399, in run_subcommand
    result = cmd.run(parsed_args)
  File "/usr/lib64/python3.6/site-packages/osc_lib/command/command.py", line 41, in run
    return super(Command, self).run(parsed_args)
  File "/usr/lib64/python3.6/site-packages/cliff/display.py", line 116, in run
    column_names, data = self.take_action(parsed_args)
  File "/usr/lib64/python3.6/site-packages/openstackclient/compute/v2/server.py", line 917, in take_action
    server = compute_client.servers.create(*boot_args, **boot_kwargs)
  File "/usr/lib64/python3.6/site-packages/novaclient/v2/servers.py", line 1373, in create
    return self._boot(response_key, *boot_args, **boot_kwargs)
  File "/usr/lib64/python3.6/site-packages/novaclient/v2/servers.py", line 810, in _boot
    return_raw=return_raw, **kwargs)
  File "/usr/lib64/python3.6/site-packages/novaclient/base.py", line 364, in _create
    resp, body = self.api.client.p...


Revision history for this message
bhreddy (hanumanthareddy-b) wrote :

Yes j-harbott, the limit is not per file, it is on the total request. I am trying to understand how to increase it.

Revision history for this message
Matt Riedemann (mriedem) wrote :

I think it must be something in the web server then (or maybe middleware?) because I don't see anything in the nova code that checks for this.

Revision history for this message
Allison Walters (allisonw) wrote :

Having this issue too. Confirmed that the same command without the config-drive works fine. I checked through the httpd config and don't see an applicable size limit set, and there shouldn't be any middleware - I'm running this directly at the command line on my only node (packstack install). Anywhere else to check?

Revision history for this message
Artom Lifshitz (notartom) wrote :

I believe this is the oslo sizelimit middleware [1]. I'm by no means an expert on it, but based on [2], if you set [oslo_middleware]/max_request_body_size to a large enough number, it *should* fix this. Can you try that and report back? Thanks!

[1] https://github.com/openstack/oslo.middleware/blob/master/oslo_middleware/sizelimit.py
[2] https://docs.openstack.org/oslo.middleware/latest/configuration/index.html#configuration-from-the-application
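
For context, here is a minimal Python sketch of how a WSGI size-limit middleware of this kind works; it is a simplified illustration modeled on [1], not the actual oslo.middleware source:

    # Simplified sketch of a request-body size limiter, modeled on
    # oslo.middleware's sizelimit; not the actual source.
    import webob.dec
    import webob.exc

    class RequestBodySizeLimiter(object):
        """Reject requests whose body exceeds max_body_size with HTTP 413."""

        def __init__(self, application, max_body_size=114688):
            # 114688 bytes = 112 KiB, matching the default mentioned below.
            self.application = application
            self.max_body_size = max_body_size

        @webob.dec.wsgify
        def __call__(self, req):
            # A declared Content-Length above the cap is rejected up front;
            # a chunked body would have to be read and counted instead
            # (elided in this sketch).
            if (req.content_length is not None
                    and req.content_length > self.max_body_size):
                raise webob.exc.HTTPRequestEntityTooLarge()
            return req.get_response(self.application)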

Revision history for this message
Allison Walters (allisonw) wrote :

Ooh, that seems promising. I ended up using Cinder and volumes for this, so I don't have a setup to reproduce with my original files right this second, but I might be able to soon. I did have another script that was using the config drive successfully, since its file was much smaller.

To test, I changed the max size in the oslo sizelimit.py file to something smaller than that file and got the same error message as in the original issue; moving it back made it work again. The default size in there was 112k, which is consistent with what I was seeing. I'll try to report back if/when I can repro with the original files, but this definitely looks very likely to be the solution. Thanks!

Revision history for this message
Balazs Gibizer (balazs-gibizer) wrote :

Based on the last comment from Allison, the solution is to configure oslo_middleware to allow bigger HTTP request bodies:

[oslo_middleware]/max_request_body_size
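
For example, in nova.conf (the value 500000 here simply mirrors the quota from the original report; choose a limit appropriate for your deployment):

    [oslo_middleware]
    # Maximum accepted HTTP request body size in bytes (default: 114688)
    max_request_body_size = 500000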

So I'm closing this bug as Invalid.

Changed in nova:
status: New → Invalid