Caching for arbitrary images on the web/remote hosts
Affects | Status | Importance | Assigned to | Milestone
---|---|---|---|---
thumbnailer (Ubuntu) | New | Wishlist | Unassigned |
Bug Description
I frequently hit the use case where I query some host (might be in my local network, might be on the internet) for a list of things. The result set comes with images, so I present a scrollable list of all those things to the user.
The problem is that while the images are being fetched, list scrolling is somewhat stuttery, and the list initially shows empty images that are only filled with content a bit later. Scrolling the list back up, the same thing happens again: the in-QML caches we have are RAM-based and therefore can't hold tons of images. They are discarded, and scrolling back up causes them to be reloaded.
It would be great if the Thumbnailer could take care of this and fetch and cache them for the developer.
One point to think about would be timeouts. Some use cases will always present the same image again. For example, in kodimote I fetch the music artwork given by kodi. The artwork won't ever change, so the cache can be persistent. At the same time, things fetched from an online server *might* change over time. Ideally this would be configurable, so that the developer can specify some cache timeout for entries.
Doing this would not be hard. The underlying persistent-cache-cpp library already handles TTL eviction. The time to live can be set on a per-thumbnail basis. So, you can just ask for a thumbnail and provide an expiry time, and the thumbnail will automatically be re-fetched once it expires. Or you can leave the timeout off, and then the thumbnail will hang around in the cache and, under LRU, eventually drop out once it isn't accessed often enough (at which point it would be automatically re-fetched as needed).
The key you would provide for this would be the URL to the image file.
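To make the semantics above concrete, here is a minimal sketch of a cache with those two eviction behaviors: entries keyed by URL, an optional per-entry expiry time (TTL), and LRU eviction for entries without one. This is an illustration only, not the thumbnailer's actual implementation; the real cache is persistent and lives in persistent-cache-cpp, and the class and method names here are made up for the sketch.

```cpp
#include <chrono>
#include <list>
#include <optional>
#include <string>
#include <unordered_map>

using Clock = std::chrono::steady_clock;

// Hypothetical in-memory stand-in for the persistent thumbnail cache.
class ThumbnailCache {
public:
    explicit ThumbnailCache(std::size_t max_entries) : max_entries_(max_entries) {}

    // Store a thumbnail under its URL, optionally with an expiry time.
    void put(std::string const& url, std::string const& data,
             std::optional<Clock::time_point> expiry = std::nullopt) {
        auto it = map_.find(url);
        if (it != map_.end()) {          // replace an existing entry
            lru_.erase(it->second.lru_pos);
            map_.erase(it);
        }
        lru_.push_front(url);
        map_[url] = Entry{data, expiry, lru_.begin()};
        if (map_.size() > max_entries_) {  // LRU eviction when over capacity
            map_.erase(lru_.back());
            lru_.pop_back();
        }
    }

    // Return the cached thumbnail, or nullopt if it is missing or expired.
    // On nullopt, the caller would re-fetch the image and put() it again.
    std::optional<std::string> get(std::string const& url) {
        auto it = map_.find(url);
        if (it == map_.end()) return std::nullopt;
        if (it->second.expiry && *it->second.expiry <= Clock::now()) {
            lru_.erase(it->second.lru_pos);  // TTL eviction
            map_.erase(it);
            return std::nullopt;
        }
        // Mark as most recently used.
        lru_.splice(lru_.begin(), lru_, it->second.lru_pos);
        return it->second.data;
    }

private:
    struct Entry {
        std::string data;
        std::optional<Clock::time_point> expiry;
        std::list<std::string>::iterator lru_pos;
    };
    std::size_t max_entries_;
    std::list<std::string> lru_;                   // front = most recently used
    std::unordered_map<std::string, Entry> map_;
};
```

A developer-facing API would only need the `get`/`put` pair: ask for the URL, and on a miss fetch the image and store it with whatever expiry fits the use case (none for kodi artwork, a short TTL for volatile online content).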
If you want to thumbnail images that you have extracted yourself (where the images do not have a URL that directly points at them), you can do that today. Doing so requires you to write them into the file system and then ask for a thumbnail with a URL that points at the local file. (That's how photos are thumbnailed, for example.) The cost of this approach is increased disk space. Note also that the thumbnailer will not provide a thumbnail for a local image unless you can prove that you are entitled to actually read that image. In practice, this means that, as soon as the original image from which a thumbnail was generated is deleted, the thumbnailer will no longer hand out a thumbnail for that image.
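The local-file workflow described above can be sketched as: persist the extracted bytes, then hand the thumbnailer a `file://` URL for them. The helper name and path below are invented for the example; the actual thumbnailer request call depends on which client API (QML or C++) you use.

```cpp
#include <fstream>
#include <string>
#include <vector>

// Write raw image bytes (e.g. artwork extracted from a media file) to disk
// and return a file:// URL that a thumbnail request can point at.
std::string store_for_thumbnailing(std::vector<char> const& image_bytes,
                                   std::string const& path) {
    std::ofstream out(path, std::ios::binary);
    out.write(image_bytes.data(),
              static_cast<std::streamsize>(image_bytes.size()));
    return "file://" + path;
}
```

Keep in mind the caveat above: the thumbnailer checks that the caller can read the file, so the thumbnail stops being available once the file is deleted.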
All this would work for any file type that is currently supported. (Basically all image file formats in the known universe, plus most audio and video formats, provided the necessary gstreamer codecs are installed.)