s3: cinder backup consistently fails because of 400 error

Bug #2047052 reported by Takashi Kajinami
Affects: Cinder
Status: New
Importance: Undecided
Assigned to: Unassigned

Bug Description

Problem description:
Tempest tests that exercise the backup functionality consistently fail when the s3 backend is used.
Example logs can be found at https://zuul.opendev.org/t/openstack/build/df411785b7e54a52b1336662b7f6cbd4 .

Looking into cinder-backup.log, I found that a 400 response from the s3 server appears to be the root cause.

```
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server [None req-a68585df-d1e3-4623-9365-62e91b8054eb tempest-VolumesBackupsTest-1726122002 None] Exception during message handling: cinder.backup.drivers.s3.S3ClientError: S3 client error: An error occurred (400) when calling the HeadBucket operation: Bad Request
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/drivers/s3.py", line 158, in func_wrapper
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server return func(*args, **kwargs)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/drivers/s3.py", line 219, in put_container
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server self.conn.head_bucket(Bucket=bucket)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/botocore/client.py", line 535, in _api_call
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server return self._make_api_call(operation_name, kwargs)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/botocore/client.py", line 980, in _make_api_call
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server raise error_class(parsed_response, operation_name)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server botocore.exceptions.ClientError: An error occurred (400) when calling the HeadBucket operation: Bad Request
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server During handling of the above exception, another exception occurred:
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server Traceback (most recent call last):
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/server.py", line 165, in _process_incoming
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server res = self.dispatcher.dispatch(message)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 309, in dispatch
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server return self._do_dispatch(endpoint, method, ctxt, args)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_messaging/rpc/dispatcher.py", line 229, in _do_dispatch
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server result = func(ctxt, **new_args)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/manager.py", line 522, in continue_backup
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server self.force_reraise()
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server raise self.value
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/manager.py", line 500, in continue_backup
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server with excutils.save_and_reraise_exception():
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 227, in __exit__
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server self.force_reraise()
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/data/venv/lib/python3.10/site-packages/oslo_utils/excutils.py", line 200, in force_reraise
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server raise self.value
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/manager.py", line 493, in continue_backup
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server updates = backup_service.backup(
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/chunkeddriver.py", line 570, in backup
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server volume_size_bytes) = self._prepare_backup(backup)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/chunkeddriver.py", line 355, in _prepare_backup
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server container = self._create_container(backup)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/chunkeddriver.py", line 258, in _create_container
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server self.put_container(backup.container)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server File "/opt/stack/cinder/cinder/backup/drivers/s3.py", line 160, in func_wrapper
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server raise S3ClientError(reason=err)
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server cinder.backup.drivers.s3.S3ClientError: S3 client error: An error occurred (400) when calling the HeadBucket operation: Bad Request
Dec 20 13:19:17.930678 np0036212256 cinder-backup[82002]: ERROR oslo_messaging.rpc.server
Dec 20 13:19:18.008651 np0036212256 cinder-backup[82002]: INFO cinder.backup.manager [None req-92cbd4aa-0c00-4a8f-a899-9251f94d53c9 tempest-VolumesBackupsTest-1726122002 None] Delete backup started, backup: 639d13a9-4a90-40f9-9781-ca20c4fcb463.
```
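
Based on the traceback, the driver's put_container path boils down to a HeadBucket probe on the backup container, with the bucket presumably created when the probe reports 404. The following boto3 sketch only illustrates that flow; the endpoint, credentials and container name are placeholders, and the create-on-404 branch is an assumption inferred from the traceback, not a copy of the driver code.

```python
import boto3
import botocore.exceptions

# Placeholder endpoint and credentials; in a real deployment these come
# from the [s3] backup driver options in cinder.conf.
conn = boto3.client(
    's3',
    endpoint_url='http://s3.example.test:8080',
    aws_access_key_id='ACCESS',
    aws_secret_access_key='SECRET',
)

def put_container(bucket):
    """Roughly the flow the traceback shows: probe the bucket, create it if missing."""
    try:
        conn.head_bucket(Bucket=bucket)        # the 400 in the log is raised here
    except botocore.exceptions.ClientError as err:
        status = err.response['ResponseMetadata']['HTTPStatusCode']
        if status == 404:
            conn.create_bucket(Bucket=bucket)  # assumed create-on-miss behaviour
        else:
            raise                              # any other status, e.g. 400, propagates

# Hypothetical container name; a mixed-case name is something an S3 endpoint
# may reject outright (see the naming discussion in the comments below).
put_container('VolumesBackupsTest-Container')
```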

Revision history for this message
Christian Rohmann (christian-rohmann) wrote :

Is there any way to get the complete request and code 400 response from S3?

"An error occurred (400) when calling the HeadBucket operation: Bad Request" is really generic and could just be about wrong access key / credentials / region. Where do those credentials come from in this case? Are they set and valid?

Mind you, boto3 carries all the usual AWS configuration magic, so environment variables such as AWS_ACCESS_KEY_ID take the highest precedence, e.g. over an AWS_PROFILE - see https://docs.aws.amazon.com/cli/latest/topic/config-vars.html.
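
To actually see the full exchange, one generic option (a botocore debugging sketch, not something the cinder driver exposes itself) is to enable botocore's debug logging before the client is built; at DEBUG level botocore logs the request parameters, the resolved endpoint and the error response it got back:

```python
import logging

import boto3

# Send botocore's internal logging, including its HTTP request/response
# handling, to stderr at DEBUG level.
boto3.set_stream_logger('botocore', logging.DEBUG)

# Placeholder endpoint; credentials are resolved by the usual boto3 chain
# (env vars, profile, config files), which is exactly the precedence question above.
s3 = boto3.client('s3', endpoint_url='http://s3.example.test:8080')
s3.head_bucket(Bucket='some-bucket')
```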

Revision history for this message
Takashi Kajinami (kajinamit) wrote :

It seems the issue was "fixed" by https://review.opendev.org/c/openstack/tempest/+/904707, but I'm wondering whether we should consider a better way to avoid the problem.

Revision history for this message
Christian Rohmann (christian-rohmann) wrote :

Well, S3 allows only lowercase letters in bucket names, see [1].

This is because an S3 bucket can be accessed via a subdomain formed from the bucket name. This holds for AWS and also for third-party implementations such as Ceph [2].

I am wondering whether Swift would actually allow "CoNtAiNeR" to exist alongside "container" and "CONTAINER" [3]? Otherwise I would expect the Swift library to normalize / lowercase the name, or even throw an exception.

[1] https://docs.aws.amazon.com/prescriptive-guidance/latest/defining-bucket-names-data-lakes/faq.html#q5
[2] https://docs.ceph.com/en/reef/radosgw/s3/commons/#bucket-and-host-name
[3] https://docs.openstack.org/api-ref/object-store/index.html#containers
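
One possible way to avoid the opaque 400, sketched here purely as an illustration (hypothetical code, not part of the current driver): validate the backup container name against the S3 bucket naming rules up front, so an invalid name fails fast with a clear message instead of surfacing as a Bad Request from HeadBucket.

```python
import re

# Core S3 bucket naming rules: 3-63 characters, lowercase letters, digits,
# dots and hyphens, starting and ending with a letter or digit. (Further
# rules such as "no consecutive dots" or "not IP-shaped" are omitted here.)
_BUCKET_NAME_RE = re.compile(r'^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$')

def validate_container_name(name):
    """Hypothetical pre-check the s3 backup driver could run before HeadBucket."""
    if not _BUCKET_NAME_RE.match(name):
        raise ValueError(
            "Backup container %r is not a valid S3 bucket name: S3 only "
            "accepts lowercase letters, digits, '.' and '-'." % name)

validate_container_name('volumesbackupstest-container')  # passes
validate_container_name('CoNtAiNeR')                      # raises ValueError
```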
