many retries can overflow the results column in atomdetails
| Affects | Status | Importance | Assigned to | Milestone |
|---|---|---|---|---|
| taskflow | Fix Released | High | Greg Hill | |
Bug Description
If you have a task that retries many times (up to 80 in this particular case, which I know is excessive, but makes sense in context), it can eventually write a JSON string to the database that is longer than the column can hold. With the default MySQL backend, MySQL silently truncates that JSON string, making it no longer valid. Any further retries then fail, because they can't load the history due to it being invalid JSON.
For example (elided for sanity):
Original exception being dropped: ['Traceback (most recent call last):\n', ..File "/usr/lib/
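The failure mode is easy to reproduce outside of taskflow. The sketch below (hypothetical data and a made-up column limit, not taskflow's actual schema) shows why silent truncation is so damaging: cutting a serialized JSON document at an arbitrary byte almost always leaves unbalanced quotes or brackets, so the stored value can never be deserialized again.

```python
import json

# Hypothetical column limit for illustration; the real limit depends on the
# column type used by the sqlalchemy backend (e.g. TEXT is 65535 bytes in MySQL).
COLUMN_LIMIT = 64

# Simulate a retry history that grows with every failed attempt.
history = [
    {"attempt": i, "failure": "Traceback (most recent call last): ..."}
    for i in range(5)
]
payload = json.dumps(history)

# In non-strict mode, MySQL silently truncates the value to fit the column
# instead of raising an error -- this is the equivalent.
stored = payload[:COLUMN_LIMIT]

try:
    json.loads(stored)
except ValueError as exc:
    # The truncated string is no longer valid JSON, so loading the
    # retry history fails on every subsequent attempt.
    print("history is no longer valid JSON:", exc)
```

Running MySQL in strict SQL mode would at least turn the silent truncation into a hard error at write time, which is easier to diagnose than a corrupted history discovered on the next retry.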
description: updated
Changed in taskflow:
importance: Undecided → High
status: New → Confirmed
Changed in taskflow:
assignee: nobody → Greg Hill (greg-hill)
Reviewed: https://review.openstack.org/153443
Committed: https://git.openstack.org/cgit/openstack/taskflow/commit/?id=1fc1a7e4713c27f537a3f6192bea259a0d63ca64
Submitter: Jenkins
Branch: master

commit 1fc1a7e4713c27f537a3f6192bea259a0d63ca64
Author: Joshua Harlow <email address hidden>
Date: Thu Feb 5 17:46:58 2015 -0800
Add warning to sqlalchemy backend size limit docs
Partial-Bug: #1416088
Change-Id: I21bde00445892a4c734592cbcc143f23085e5660