Comment 5 for bug 541637

Leonard Richardson (leonardr) wrote :

Yes, Martin has the right idea. When you iterate over a lazr.restfulclient collection, it makes a series of HTTP requests, each grabbing a different chunk of the data. So you're effectively re-running the same SELECT statement with LIMIT 50 OFFSET 0, then LIMIT 50 OFFSET 50, and so on. (I don't remember the exact SQL syntax, but you get the idea.) If the database changes between requests, you will silently miss or repeat entries. On the server side I could see a single database connection keeping open a view of the database as it was when the transaction started, and iterating over that view. But here every SELECT statement runs in a totally different context.
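
Here is a minimal sketch, in plain Python with no Launchpad involved, of why that loses entries. The page size and bug names are made up; the point is that removing items from the underlying result set between page fetches (which is exactly what changing a bug's status out of Triaged does) makes offset-based paging skip rows.

    bugs = ["bug-%d" % n for n in range(10)]   # what the server-side query returns
    PAGE = 4

    def fetch_page(offset):
        # Stands in for re-running "SELECT ... LIMIT 4 OFFSET <offset>" on each request.
        return bugs[offset:offset + PAGE]

    seen, offset = [], 0
    while True:
        page = fetch_page(offset)
        if not page:
            break
        for bug in page:
            seen.append(bug)
            bugs.remove(bug)   # our change takes the bug out of the query's result set
        offset += PAGE         # but the next request still skips a full page of rows

    print(seen)   # ['bug-0' .. 'bug-3', 'bug-8', 'bug-9'] -- bugs 4 through 7 are never visited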

I think we can *detect* this problem fairly easily by adding ETags to collections and requesting subsequent pages with If-Match: [collection ETag]. But the 412 you get in that situation will look the same as when you call lp_save() and discover someone else just saved the same object. You won't have any option other than to restart your operation from the beginning. And when you're talking about hundreds of bugs, that sucks.
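
To make that concrete, here's a rough sketch of what the detection loop might look like, written against the requests library rather than lazr.restfulclient's real internals, and assuming the collection JSON exposes 'entries' and 'next_collection_link' keys the way Launchpad's collections do. The first page establishes the ETag; every later page sends it back as If-Match, and a 412 means the collection changed underneath you and you have to start over.

    import requests

    def iter_collection(url):
        etag = None
        while url:
            headers = {"If-Match": etag} if etag else {}
            resp = requests.get(url, headers=headers)
            if resp.status_code == 412:
                # The collection's ETag no longer matches: it changed mid-iteration.
                raise RuntimeError("collection changed underneath us; restart from the beginning")
            resp.raise_for_status()
            etag = etag or resp.headers.get("ETag")
            body = resp.json()
            for entry in body["entries"]:
                yield entry
            url = body.get("next_collection_link")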

I can think of ways to avoid the iterator (e.g. create a server-side 'set of bugs' object representing the result of your query, and then modify that set with a single request, so the iteration runs on the server side), but they're significantly more complex.

As a stupid workaround you can set a huge page size and fetch all the Triaged bugs at once. And since your script is moving bugs out of the Triaged state, you could also repeatedly fetch just the first page of Triaged bugs and update that page, until the query comes back empty.
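
For that second variant, a hedged sketch with launchpadlib might look like the following; the application name, project name, target status, and slice size are all made up here. Because each pass moves bugs out of the Triaged state, the query keeps shrinking until nothing is left.

    from launchpadlib.launchpad import Launchpad

    lp = Launchpad.login_with("bulk-retriage", "production")
    project = lp.projects["myproject"]

    while True:
        # Re-run the query each pass and only touch the first slice of results,
        # so it never matters that the later pages have shifted.
        tasks = list(project.searchTasks(status="Triaged")[:50])
        if not tasks:
            break
        for task in tasks:
            task.status = "Confirmed"   # takes the task out of the Triaged result set
            task.lp_save()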

Sorry that I don't have more to offer.