gwibber should follow some of twitter's rate limiting best practices

Bug #366117 reported by Dominic Evans
Affects: Gwibber
Status: Confirmed
Importance: Undecided
Assigned to: Unassigned
Milestone: (none)

Bug Description

Twitter now offers some guidance on rate-limiting best practices on its API wiki: http://apiwiki.twitter.com/Rate-limiting

e.g.,

"Your application should recognize it is being rate-limited by the REST API if it receives begins to receive HTTP 400 response codes. It is best practice for applications to monitor their current rate limit status and dynamically throttle requests if necessary."

We should monitor the returned HTTP headers and throttle back the number of requests appropriately, and/or put up a warning telling the user they need to scale things back. We could do this within twitter.py, making it internally cache results and throttle requests, so we don't force users to scale back their polling of other services like identi.ca, Facebook, etc.
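A minimal sketch of what that could look like inside twitter.py (the class is hypothetical, not existing Gwibber code; it assumes the X-RateLimit-Remaining / X-RateLimit-Reset headers Twitter documents are available as a dict after each request):

import time

class RateLimitGuard(object):
    """Tracks the X-RateLimit-* headers and decides when to skip a poll."""

    def __init__(self, reserve=5):
        self.remaining = None   # requests left in the current window, unknown at start
        self.reset_at = 0       # unix time when the window resets
        self.reserve = reserve  # keep a few requests in hand for user actions

    def update(self, headers):
        # Call after every REST request with the response headers (as a dict).
        if "X-RateLimit-Remaining" in headers:
            self.remaining = int(headers["X-RateLimit-Remaining"])
        if "X-RateLimit-Reset" in headers:
            self.reset_at = int(headers["X-RateLimit-Reset"])

    def should_poll(self):
        # Throttle background polling when the budget is nearly spent,
        # and resume once the rate-limit window has reset.
        if self.remaining is None or self.remaining > self.reserve:
            return True
        return time.time() >= self.reset_at

twitter.py could consult should_poll() before each scheduled refresh and raise a user-visible warning when it starts returning False, without touching the polling intervals for identi.ca, Facebook, etc.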

"Search back-offs: If your application monitors a high volume of search terms, query less often for searches that have no results than for those that do. By using a back-off you can keep up to date on queries that are hot but not waste cycles requesting queries that very rarely change."

This would also be a good one to implement. Even though the Search API has a much higher rate limit, automatically tuning the polling interval for each search term based on how frequently it actually updates would allow plenty of hashtags/search terms to be monitored at once without risk. This could wait until we have implemented parallelised loading (as mentioned on the roadmap, http://live.gnome.org/Gwibber/Roadmap ), but it would be achievable today with a simple if check.
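Something like the following would do as a first cut (the constants and function name are made up for illustration; the real tuning values would need testing):

BASE_INTERVAL = 60       # seconds between polls for a search term that has results
MAX_INTERVAL = 15 * 60   # cap, so quiet terms are still checked occasionally

def next_interval(current_interval, got_results):
    # The "simple if check": active terms poll at the base rate,
    # quiet terms back off exponentially up to the cap.
    if got_results:
        return BASE_INTERVAL
    return min(current_interval * 2, MAX_INTERVAL)

Each search term would carry its own current_interval, so one hot hashtag doesn't keep a dozen quiet ones polling at full speed.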

Revision history for this message
Alan Bell (alanbell) wrote :

I can confirm this; I came across it while modifying Gwibber for the use case of a full-screen projected "Twitter wall" at conferences. The application should back off if it gets the 420 response from the Twitter API:
"An application that exceeds the rate limitations of the Search API will receive HTTP 420 response codes to requests. It is a best practice to watch for this error condition and honor the Retry-After header that instructs the application when it is safe to continue. The Retry-After header's value is the number of seconds your application should wait before submitting another query (for example: Retry-After: 67)."

Changed in gwibber:
status: New → Confirmed
Revision history for this message
Adam Hepton (adam-hepton) wrote :

This definitely needs to happen. I regularly run out of API requests when my Twitter feed is set to refresh every two minutes, and then have to wait 30 minutes before I get a new batch of requests. That shouldn't happen with a 150 requests/hour limit and Gwibber being the only way I am accessing Twitter's feeds.
