Comment 33 for bug 676370

Eric Angell (some-other-guy) wrote:

die.net does indeed appear to block both the curl and wget user agents, and with good reason. But it isn't blocking Coral (http://coralcdn.org), and I can't see why it ever would. This is exactly the sort of application Coral is useful for, so I recommend changing the source URLs in the scripts to look like "http://static.die.net.nyud.net/earth/peters/1600.jpg" (appending ".nyud.net" to the host name).
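
For what it's worth, here's a minimal Python sketch of that rewrite (coralize is just a name I made up, and it assumes the URL has no port or userinfo in it):

    from urllib.parse import urlsplit, urlunsplit

    def coralize(url):
        # Append ".nyud.net" to the host so the fetch goes through Coral.
        # Assumes the netloc is a bare host name (no port, no userinfo).
        parts = urlsplit(url)
        return urlunsplit(parts._replace(netloc=parts.netloc + ".nyud.net"))

    print(coralize("http://static.die.net/earth/peters/1600.jpg"))
    # http://static.die.net.nyud.net/earth/peters/1600.jpg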

However, despite Coral's FAQ claiming that it respects cache-control headers such as max-age, that doesn't appear to be working in this case. die.net very conveniently sends "Cache-Control: public, max-age=1800" on responses that haven't been caught by the curl/wget user-agent blackhole, but when the file is delivered through Coral it picks up Coral's default cache timeout of 12 hours instead of the half hour the origin server specifies. Subsequent Coral requests sometimes return the originally cached file and sometimes return newer versions of it. Even if I specify "Cache-Control: max-age=1800, must-revalidate" in my curl request, Coral's behavior doesn't change.
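
If anyone wants to reproduce the comparison, something like this Python sketch shows what the origin and Coral each send back (the User-Agent string is arbitrary, chosen only to avoid the curl/wget blackhole, and Coral may or may not set all of these headers):

    import urllib.request

    def show_cache_headers(url):
        # Any non-curl/wget User-Agent avoids die.net's blackhole.
        req = urllib.request.Request(
            url, headers={"User-Agent": "Mozilla/5.0 (cache check)"})
        with urllib.request.urlopen(req) as resp:
            print(url)
            for name in ("Cache-Control", "Age", "Expires"):
                value = resp.headers.get(name)
                if value is not None:
                    print("  %s: %s" % (name, value))

    show_cache_headers("http://static.die.net/earth/peters/1600.jpg")
    show_cache_headers("http://static.die.net.nyud.net/earth/peters/1600.jpg")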

So I recommend that someone put further effort into figuring out why Coral's cache control is broken, but that the project's scripts be updated regardless: better to get an image up to 12 hours out of date than to get nothing at all.