intermittent probe test failure: test_reconciler_move_object_twice

Bug #2028175 reported by Alistair Coles
This bug affects 1 person
Affects: OpenStack Object Storage (swift)
Status: New
Importance: Medium
Assigned to: Unassigned

Bug Description

Intermittent failures are seen with test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice:

e.g. https://review.opendev.org/c/openstack/swift/+/888575#:~:text=2%20%7C%201-,day,-19%20hr%20ago

```
> self.assertEqual(obj['name'], expected)
E AssertionError: '1:/AUTH_test/\x00container\x002e0e537e-c1[72 chars]b6de' != '0:/AUTH_test/\x00container\x002e0e537e-c1[72 chars]b6de'
E - 1:/AUTH_test/container2e0e537e-c12a-4506-8fc4-45471d702848/objectdf4d3593-776b-4e51-a592-a5948c2eb6de
E ? ^
E + 0:/AUTH_test/container2e0e537e-c12a-4506-8fc4-45471d702848/objectdf4d3593-776b-4e51-a592-a5948c2eb6de
E ? ^

test/probe/test_container_merge_policy_index.py:536: AssertionError
```

and also

```
======================================================================
FAIL: test_reconciler_move_object_twice (test.probe.test_container_merge_policy_index.TestReservedNamespaceMergePolicyIndex)
----------------------------------------------------------------------
Traceback (most recent call last):
  File "/vagrant/swift/test/probe/test_container_merge_policy_index.py", line 554, in test_reconciler_move_object_twice
    self.fail('Found unexpected object %r in the queue' % obj)
AssertionError: Found unexpected object {'bytes': 0, 'hash': '1689720266.64622_3', 'name': '0:/AUTH_test/\x00container\x00b23803d0-e29f-4e63-8542-ad816237f052/\x00object\x00d8d592d8-8087-46fc-86cf-0bc47767bb97', 'content_type': 'application/x-delete', 'last_modified': '2023-07-18T22:44:26.646220'} in the queue
```
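
For readers skimming the failures: the leading '0:' / '1:' on each queued name is the storage policy index the reconciler row was recorded under, followed by the account/container/object path (with the reserved-namespace \x00 markers this test exercises). Below is a simplified sketch of that encoding; the real encoding/parsing lives in swift.container.reconciler, and the names used here are invented.

```python
# Simplified sketch of the misplaced-objects queue entry naming (the real
# encoding/parsing lives in swift.container.reconciler); names are invented.
def make_queue_entry_name(policy_index, account, container, obj):
    # "<policy index>:/<account>/<container>/<object>"
    return '%d:/%s/%s/%s' % (policy_index, account, container, obj)

def parse_queue_entry_name(name):
    policy_index, path = name.split(':', 1)
    account, container, obj = path.lstrip('/').split('/', 2)
    return int(policy_index), account, container, obj

name = make_queue_entry_name(
    1, 'AUTH_test', '\x00container\x00example', '\x00object\x00example')
# The assertions above fail on exactly this first field: the test expects a
# row recorded under policy 0 but finds one naming policy 1 (or vice versa).
print(parse_queue_entry_name(name))
```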

Tags: flakey-test
summary: - internittent probe test failure: test_reconciler_move_object_twice
+ intermittent probe test failure: test_reconciler_move_object_twice
Revision history for this message
clayg (clay-gerrard) wrote :

I ran this a bunch in a loop on my laptop and couldn't reproduce it (even while my machine was busy/slow doing other things). I was hoping to examine the failed state as context while reasoning about the assert - like maybe just running the reconciler again fixed it or something. But I guess I'll just have to stare at the code; maybe it's picking the wrong device/config sometimes.

Revision history for this message
clayg (clay-gerrard) wrote :

LOL, so i was going to explore the random space and see if there was a specific seed that would reliably trigger the failure:

```
diff --git a/test/probe/test_container_merge_policy_index.py b/test/probe/test_container_merge_policy_index.py
index 441380590..26477ef18 100644
--- a/test/probe/test_container_merge_policy_index.py
+++ b/test/probe/test_container_merge_policy_index.py
@@ -16,6 +16,7 @@ import time
 import uuid
 import random
 import unittest
+import os

 from swift.common.manager import Manager
 from swift.common.internal_client import InternalClient
@@ -33,6 +34,8 @@ from swiftclient import ClientException

 TIMEOUT = 60

+if 'SWIFT_TEST_RANDOM_SEED' in os.environ:
+    random.seed(os.environ['SWIFT_TEST_RANDOM_SEED'])

 class TestContainerMergePolicyIndex(ReplProbeTest):

```

And then the FIRST one failed!
```
FAILED swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice - AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698853396.75432_3', 'name': '0:/AUTH_test/\x00container\x002d56235d-f039-483f-a741-271fbe878c53/\x00object\x009bdd4...
```

I assumed it would fail reliably with SWIFT_TEST_RANDOM_SEED=1 but it wasn't repeatable.

Revision history for this message
clayg (clay-gerrard) wrote :

OK, it's NOT random.seed ordering:

vagrant@saio:~$ for i in {1..100}; do echo "testing $i"; SWIFT_TEST_RANDOM_SEED=$i pytest swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice; if [ $? -ne 0 ]; then break; fi; done
testing 1
================================================================================= test session starts =================================================================================
platform linux -- Python 3.8.10, pytest-7.4.2, pluggy-1.3.0 -- /usr/bin/python
cachedir: .pytest_cache
rootdir: /home/vagrant/swift
configfile: tox.ini
plugins: cov-4.1.0
collected 1 item

swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice PASSED [100%]

....

====================================================================== 1 passed, 6 warnings in 162.46s (0:02:42) ======================================================================
testing 31
================================================================================= test session starts =================================================================================
...

        # make sure the queue is settled
        self.get_to_final_state()
        for container in int_client.iter_containers(MISPLACED_OBJECTS_ACCOUNT):
            for obj in int_client.iter_objects(MISPLACED_OBJECTS_ACCOUNT,
                                               container['name']):
> self.fail('Found unexpected object %r in the queue' % obj)
E AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698858066.47763_3', 'name': '1:/AUTH_test/\x00container\x006edb077d-9e35-420b-bdef-2dc266314325/\x00object\x004e0ca00f-6716-47d7-8bca-8283c85a6b3c', 'content_type': 'application/x-delete', 'last_modified': '2023-11-01T17:01:06.477630'} in the queue

swift/test/probe/test_container_merge_policy_index.py:557: AssertionError
...
=============================================================================== short test summary info ===============================================================================
FAILED swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice - AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698858066.47763_3', 'name': '1:/AUTH_test/\x00container\x006edb077d-9e35-420b-bdef-2dc266314325/\x00object\x004e0ca...
====================================================================== 1 failed, 6 warnings in 170.33s (0:02:50) ======================================================================
vagrant@saio:~$
vagrant@saio:~$ SWIFT_TEST_RANDOM_SEED=31 pytest swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice
================================================================================= test session starts =======================...


Revision history for this message
clayg (clay-gerrard) wrote :

gah, not PYTHONHASHSEED either?

vagrant@saio:~$ for i in {1..100}; do echo "testing $i"; PYTHONHASHSEED=$i pytest swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice; if [ $? -ne 0 ]; then break; fi; done
testing 1
================================================================================= test session starts =================================================================================
platform linux -- Python 3.8.10, pytest-7.4.2, pluggy-1.3.0 -- /usr/bin/python
cachedir: .pytest_cache
rootdir: /home/vagrant/swift
configfile: tox.ini
plugins: cov-4.1.0
collected 1 item

swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice PASSED [100%]
...
====================================================================== 1 passed, 6 warnings in 104.79s (0:01:44) ======================================================================
testing 9
================================================================================= test session starts =================================================================================
platform linux -- Python 3.8.10, pytest-7.4.2, pluggy-1.3.0 -- /usr/bin/python
cachedir: .pytest_cache
rootdir: /home/vagrant/swift
configfile: tox.ini
plugins: cov-4.1.0
collected 1 item

swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice FAILED [100%]
...
        # make sure the queue is settled
        self.get_to_final_state()
        for container in int_client.iter_containers(MISPLACED_OBJECTS_ACCOUNT):
            for obj in int_client.iter_objects(MISPLACED_OBJECTS_ACCOUNT,
                                               container['name']):
> self.fail('Found unexpected object %r in the queue' % obj)
E AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698864057.03644_3', 'name': '1:/AUTH_test/\x00container\x0076faa3ba-0cf0-427a-89d7-7e25d321c78a/\x00object\x006bcf6250-4922-43e2-8f2b-a26ae795c7ef', 'content_type': 'application/x-delete', 'last_modified': '2023-11-01T18:40:57.036440'} in the queue

swift/test/probe/test_container_merge_policy_index.py:554: AssertionError
...
=============================================================================== short test summary info ===============================================================================
FAILED swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice - AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698864057.03644_3', 'name': '1:/AUTH_test/\x00container\x0076faa3ba-0cf0-427a-89d7-7e25d321c78a/\x00...


Revision history for this message
clayg (clay-gerrard) wrote :

maybe "get_to_final_state" isn't running the object-updater - i have some asyncs... and maybe no rows in the .misplaced_objects container?

FAILED swift/test/probe/test_container_merge_policy_index.py::TestReservedNamespaceMergePolicyIndex::test_reconciler_move_object_twice - AssertionError: Found unexpected object {'bytes': 0, 'hash': '1698865367.58697_3', 'name': '1:/AUTH_test/\x00container\x00ade7d7b2-8cc0-40e0-b345-0beca938d4cc/\x00object\x000aa82...
====================================================================== 1 failed, 6 warnings in 125.37s (0:02:05) ======================================================================
vagrant@saio:~$ curl http://localhost:8090/v1/.misplaced_objects
1698865200
vagrant@saio:~$ for i in {1..10}; do curl "http://localhost:8090/v1/.misplaced_objects/1698865200?format=json"; done
[][][][][][][][][][]vagrant@saio:~$

vagrant@saio:~$ for f in $(find /srv/node*/sdb*/container* -name \*.pending); do echo $f; cat $f; done
/srv/node1/sdb1/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db.pending
/srv/node3/sdb3/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db.pending
/srv/node3/sdb3/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db.pending
/srv/node4/sdb4/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db.pending
/srv/node4/sdb4/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db.pending

vagrant@saio:~$ for db in $(find /srv/node*/sdb*/container* -name \*.db); do echo $db; sqlite3 $db "select * from object"; done
/srv/node1/sdb1/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db
2||1698865367.58697|6|application/octet-stream|0b4c12d7e0a73840c1c4f148fda3b037|0|0
/srv/node2/sdb2/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db
/srv/node3/sdb3/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db
4||1698865367.58697_4|6|application/octet-stream|0b4c12d7e0a73840c1c4f148fda3b037|0|0
5||1698865367.58697_3|0|application/deleted|noetag|1|1
/srv/node3/sdb3/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
2|0:/AUTH_test/|1698865367.58697_1|0|application/deleted|noetag|1|0
4|1:/AUTH_test/|1698865367.58697_3|0|application/x-delete|1698865367.58697_3|0|0
/srv/node4/sdb4/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db
4||1698865367.58697_4|6|application/octet-stream|0b4c12d7e0a73840c1c4f148fda3b037|0|0
5||1698865367.58697_3|0|application/deleted|noetag|1|1
/srv/node4/sdb4/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
2|0:/AUTH_test/|1698865367.58697_1|0|application/deleted|noetag|1|0
4|1:/AUTH_test/|1698865367.58697_3|0|application/deleted|noetag|1|0

vagrant@saio:~$ for async in $(find /srv/node*/sdb*/async* -type f); do echo $async; cat $async | python -c "import sys, json, pickle; print(json.dumps(pickle.load(sys.stdin.buffer), indent=2))"; done
/srv/node2/sdb2/async_pending/b57/eef0ca1efbca663b97c2f4d5eac4fb57-...


Revision history for this message
clayg (clay-gerrard) wrote :

I was wrong about there not being rows in the reconciler queue! I misread the sqlite CLI output because it truncates at \x00 - the test was right:
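
(Aside: that truncation is easy to reproduce outside the cluster; the throwaway in-memory database and names below are invented purely for illustration.)

```python
# Why the earlier `sqlite3 $db "select * from object"` dumps looked like the
# names were blank or cut short: the sqlite3 CLI stops printing a text value
# at the first NUL byte. In-memory throwaway database, made-up names.
import sqlite3

conn = sqlite3.connect(':memory:')
conn.execute('CREATE TABLE object (name TEXT)')
conn.execute('INSERT INTO object VALUES (?)',
             ('1:/AUTH_test/\x00container\x00demo/\x00object\x00demo',))

# The Python API returns the full reserved-namespace name...
full_name = conn.execute('SELECT name FROM object').fetchone()[0]
print(full_name.replace('\x00', '<NUL>'))
# ...whereas the sqlite3 command-line shell would print only "1:/AUTH_test/"
# for this row, because it truncates at the first \x00.
```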

vagrant@saio:~$ curl -s "http://127.0.0.3:6031/sdb3/56/.misplaced_objects/1698865200?format=json" | python -m json.tool
[
    {
        "bytes": 0,
        "hash": "1698865367.58697_3",
        "name": "1:/AUTH_test/\u0000container\u0000ade7d7b2-8cc0-40e0-b345-0beca938d4cc/\u0000object\u00000aa82695-2677-43f0-80a7-f8f05e7ba617",
        "content_type": "application/x-delete",
        "last_modified": "2023-11-01T19:02:47.586970"
    }
]
vagrant@saio:~$ swift-init object-updater once -nv
...
object-6030: Update sent for /AUTH_test/containerade7d7b2-8cc0-40e0-b345-0beca938d4cc/object0aa82695-2677-43f0-80a7-f8f05e7ba617 /srv/node3/sdb3/async_pending-1/b57/eef0ca1efbca663b97c2f4d5eac4fb57-1698865367.58697_0000000000000003
...
object-6030: Update sent for /AUTH_test/containerade7d7b2-8cc0-40e0-b345-0beca938d4cc/object0aa82695-2677-43f0-80a7-f8f05e7ba617 /srv/node3/sdb7/async_pending-1/b57/eef0ca1efbca663b97c2f4d5eac4fb57-1698865367.58697_0000000000000002
...
object-6020: Update sent for /AUTH_test/containerade7d7b2-8cc0-40e0-b345-0beca938d4cc/object0aa82695-2677-43f0-80a7-f8f05e7ba617 /srv/node2/sdb2/async_pending/b57/eef0ca1efbca663b97c2f4d5eac4fb57-1698865367.58697_0000000000000004
...
object-6040: Update sent for /AUTH_test/containerade7d7b2-8cc0-40e0-b345-0beca938d4cc/object0aa82695-2677-43f0-80a7-f8f05e7ba617 /srv/node4/sdb4/async_pending/b57/eef0ca1efbca663b97c2f4d5eac4fb57-1698865367.58697_0000000000000001
vagrant@saio:~$ for async in $(find /srv/node*/sdb*/async* -type f); do echo $async; cat $async | python -c "import sys, json, pickle; print(json.dumps(pickle.load(sys.stdin.buffer), indent=2))"; done
vagrant@saio:~$ curl -s "http://127.0.0.3:6031/sdb3/56/.misplaced_objects/1698865200?format=json" | python -m json.tool
[
    {
        "bytes": 0,
        "hash": "1698865367.58697_3",
        "name": "1:/AUTH_test/\u0000container\u0000ade7d7b2-8cc0-40e0-b345-0beca938d4cc/\u0000object\u00000aa82695-2677-43f0-80a7-f8f05e7ba617",
        "content_type": "application/x-delete",
        "last_modified": "2023-11-01T19:02:47.586970"
    }
]
vagrant@saio:~$ swift-init container-replicator once -nv
...
container-6011: Adding 1 objects to the reconciler at /srv/node1/sdb1/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
container-6011: 2 successes, 0 failures
container-6011: Cleaning up 0 reconciler containers
...
container-6021: /srv/node2/sdb2/containers/56/930/e1bf38d2ae3a333f5396d47ed4bf2930/e1bf38d2ae3a333f5396d47ed4bf2930.db in sync with 127.0.0.1:6011/sdb1, nothing to do
container-6021: 3 successes, 0 failures
...
container-6031: Adding 1 objects to the reconciler at /srv/node3/sdb3/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
container-6031: 4 successes, 0 failures
container-6031: Replicating 1 reconciler containers
container-6031: Cleaning up 0 reconciler containers
...
container-6041: Adding 1 objects to the reconciler at /srv/node4/sdb4/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9...


Revision history for this message
clayg (clay-gerrard) wrote :

I'm really annoyed that replication doesn't make the reconciler queue consistent!

vagrant@saio:~$ swift-init container-replicator once -nv
container-6011: 4 successes, 0 failures
container-6021: 0 successes, 0 failures
container-6031: 4 successes, 0 failures
container-6041: 4 successes, 0 failures
vagrant@saio:~$ for i in {1..4}; do echo server $i; curl -s "http://127.0.0.${i}:60${i}1/sdb${i}/${MISPLACED_CONTAINER_PART}/.misplaced_objects/${MISPLACED_CONTAINER}?format=json" | python -m json.tool; done # query backend queue
server 1
[
    {
        "bytes": 0,
        "hash": "1698868068.21431_3",
        "name": "1:/AUTH_test/\u0000container\u0000753a0681-cd2f-4eec-869f-be3b9e27150c/\u0000object\u000079b75334-cb71-42ff-94ce-ffd6521e5f5f",
        "content_type": "application/x-delete",
        "last_modified": "2023-11-01T19:47:48.214310"
    }
]
server 2
Expecting value: line 1 column 1 (char 0)
server 3
[
    {
        "bytes": 0,
        "hash": "1698868068.21431_3",
        "name": "1:/AUTH_test/\u0000container\u0000753a0681-cd2f-4eec-869f-be3b9e27150c/\u0000object\u000079b75334-cb71-42ff-94ce-ffd6521e5f5f",
        "content_type": "application/x-delete",
        "last_modified": "2023-11-01T19:47:48.214310"
    }
]
server 4
[]
vagrant@saio:~$ for db in $(find /srv/node*/sdb*/container* -name \*.db); do echo $db; sqlite3 $db -line "select * from object"; done
/srv/node1/sdb1/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
               ROWID = 2
                name = 0:/AUTH_test/
          created_at = 1698868068.21431_1
                size = 0
        content_type = application/deleted
                etag = noetag
             deleted = 1
storage_policy_index = 0

               ROWID = 4
                name = 1:/AUTH_test/
          created_at = 1698868068.21431_3
                size = 0
        content_type = application/x-delete
                etag = 1698868068.21431_3
             deleted = 0
storage_policy_index = 0
/srv/node3/sdb3/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
               ROWID = 2
                name = 0:/AUTH_test/
          created_at = 1698868068.21431_1
                size = 0
        content_type = application/deleted
                etag = noetag
             deleted = 1
storage_policy_index = 0

               ROWID = 4
                name = 1:/AUTH_test/
          created_at = 1698868068.21431_3
                size = 0
        content_type = application/x-delete
                etag = 1698868068.21431_3
             deleted = 0
storage_policy_index = 0
/srv/node4/sdb4/containers/56/cba/e33d8a3403f400d5ec3521d4cda9acba/e33d8a3403f400d5ec3521d4cda9acba.db
               ROWID = 2
                name = 0:/AUTH_test/
          created_at = 1698868068.21431_1
                size = 0
        content_type = application/deleted
                etag = noetag
             deleted = 1
storage_policy_index = 0

               ROWID = 4
                name = 1:/AUTH_test/
          created_at = 1698868068.21431_3
                size = 0
        content_type = application/deleted
                etag = noetag
            ...


Revision history for this message
clayg (clay-gerrard) wrote :

seems like when the reconciler-ic randomly discovers the node with the live entry it'll overwrite it; but it took me like 3 tries:

vagrant@saio:~$ swift-init container-reconciler once -nv
WARNING: Unable to modify max process limit. Running as non-root?
Removing stale pid file /var/run/swift/container-reconciler.pid.d
Running container-reconciler once...(/etc/swift/container-reconciler.conf.d)
container-reconciler: Starting 152380
container-reconciler-ic: Loaded override config for (default): ProxyOverrideOptions({}, {'sorting_method': 'shuffle', 'read_affinity': '', 'write_affinity': '', 'write_affinity_node_count': '2 * replicas', 'write_affinity_handoff_delete_count': None, 'rebalance_missing_suppression_count': 1, 'concurrent_gets': False, 'concurrency_timeout': 0.5, 'concurrent_ec_extra_requests': 0}, app)
container-reconciler: pulling items from the queue
container-reconciler: checking container 1698868800
container-reconciler: looking for objects in 1698868800
container-reconciler-ic: - - 01/Nov/2023/20/08/54 GET /v1/.misplaced_objects/1698868800%3Fformat%3Djson%26marker%3D%26end_marker%3D%26prefix%3D%26states%3Dlisting HTTP/1.0 404 - Swift%20Container%20Reconciler - - 70 - txde8dd968bf514c1097a4e-006542b056 - 0.0253 - - 1698869334.574397087 1698869334.599655628 0
container-reconciler: looking for containers in .misplaced_objects
container-reconciler-ic: - - 01/Nov/2023/20/08/54 GET /v1/.misplaced_objects%3Fformat%3Djson%26marker%3D%26end_marker%3D%26prefix%3D HTTP/1.0 200 - Swift%20Container%20Reconciler - - 95 - txf47471e7c26f4baab2396-006542b056 - 0.0114 - - 1698869334.604813814 1698869334.616210461 -
container-reconciler-ic: - - 01/Nov/2023/20/08/54 GET /v1/.misplaced_objects%3Fformat%3Djson%26marker%3D1698865200%26end_marker%3D%26prefix%3D HTTP/1.0 200 - Swift%20Container%20Reconciler - - 2 - txf9932f0095a14fa59f95a-006542b056 - 0.0102 - - 1698869334.619512558 1698869334.629701376 -
container-reconciler: checking container 1698865200
container-reconciler: looking for objects in 1698865200
container-reconciler-ic: - - 01/Nov/2023/20/08/54 GET /v1/.misplaced_objects/1698865200%3Fformat%3Djson%26marker%3D%26end_marker%3D%26prefix%3D%26states%3Dlisting HTTP/1.0 200 - Swift%20Container%20Reconciler - - 2 - txb5b854b8072044b090aa3-006542b056 - 0.0113 - - 1698869334.636687517 1698869334.647985458 0
container-reconciler: Reconciler Stats: {}
container-reconciler: Exited 152380
vagrant@saio:~$ for i in {1..4}; do echo server $i; curl -s "http://127.0.0.${i}:60${i}1/sdb${i}/${MISPLACED_CONTAINER_PART}/.misplaced_objects/${MISPLACED_CONTAINER}?format=json" | python -m json.tool; done # query backend queue
server 1
[
    {
        "bytes": 0,
        "hash": "1698868068.21431_3",
        "name": "1:/AUTH_test/\u0000container\u0000753a0681-cd2f-4eec-869f-be3b9e27150c/\u0000object\u000079b75334-cb71-42ff-94ce-ffd6521e5f5f",
        "content_type": "application/x-delete",
        "last_modified": "2023-11-01T19:47:48.214310"
    }
]
server 2
Expecting value: line 1 column 1 (char 0)
server 3
[
    {
        "bytes": 0,
        "hash": "1698868068.21431_3",
        "name": "1:/AUTH_test/\u0000container\u0000753a0681-cd2f-4eec-869f-be3b9e27150c...

Revision history for this message
clayg (clay-gerrard) wrote :

re-reading my comments, it seems like the main idea of the problem is:

it looks like we might have one database with a deleted=1 application/deleted row at 1698868068.21431_3 and the other two have deleted=0 application/x-delete rows for the same timestamp.

maybe one suggested workaround would be to just run the reconciler more (or somehow target the queue container replica that has the "live" entry)

seems like when the reconciler-ic randomly discovers the node with the live entry it'll overwrite it; but it took me like 3 tries
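
One way to read why replication doesn't converge here (a guess at the merge behavior, not a verified trace of merge_items in swift.container.backend): object rows are merged per name, and an existing row is only replaced by one with a strictly newer created_at, so replicas that already hold different rows at the identical timestamp 1698868068.21431_3 never converge through replication alone. A toy illustration of that kind of rule, with made-up row dicts:

```python
# Toy "same name, newest created_at wins" merge rule -- a guess at the
# relevant behavior, NOT the actual merge_items() code in
# swift.container.backend. Row dicts are made up for illustration.
def merge_row(existing, incoming):
    if existing is None or incoming['created_at'] > existing['created_at']:
        return incoming
    # Equal (or older) timestamp: keep the row we already have.
    return existing

x_delete = {'created_at': '1698868068.21431_3',
            'content_type': 'application/x-delete', 'deleted': 0}
tombstone = {'created_at': '1698868068.21431_3',
             'content_type': 'application/deleted', 'deleted': 1}

# Each replica keeps whatever row it already has, so the disagreement between
# the queue databases above is stable under replication:
print(merge_row(x_delete, tombstone)['content_type'])  # application/x-delete
print(merge_row(tombstone, x_delete)['content_type'])  # application/deleted
```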
