Limiting cache size? #185
It appears that neither `DictCache` nor `FileCache` provides any way to evict entries that aren't being accessed, neither by age nor by a maximum cache size. It's only possible with Redis, by configuring the database expiration externally.

Comments
When using the `FileCache`, one workaround is to periodically delete files from the cache directory based on their age. Note that this does not check whether the files are actually expired and also doesn't limit the actual disk usage, so your mileage may vary.
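For illustration, a minimal sketch of that kind of age-based sweep (the directory path and the seven-day cutoff are assumptions, not values from the thread):

```python
import os
import time

CACHE_DIR = ".web_cache"        # wherever FileCache(...) was pointed
MAX_AGE = 7 * 24 * 60 * 60      # seconds; an arbitrary cutoff

cutoff = time.time() - MAX_AGE
for root, _dirs, files in os.walk(CACHE_DIR):
    for name in files:
        path = os.path.join(root, name)
        # mtime is only a proxy: it says nothing about HTTP expiry,
        # and disk usage is still unbounded between sweeps.
        if os.path.getmtime(path) < cutoff:
            os.remove(path)
```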
My expectation is that when someone runs into this sort of problem, it is time to take better ownership of that cached data and consider using something like an external store. It would be nice to support many different types of caches along with their usage, but as that gets really complicated, I've avoided it in CacheControl proper. That said, I think it could be a good idea to have a separate caches package that includes different implementations and shares common code, such as a worker that can examine a cache implementation for stale entries, allowing folks to focus on support for specific storage systems.
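To make "external store" concrete: CacheControl's backend surface is small, so wrapping your own store is straightforward. A minimal sketch, assuming the `BaseCache` interface of `get`/`set`/`delete` (the `store` client here is hypothetical; newer CacheControl versions also pass an `expires` keyword to `set()`):

```python
from cachecontrol.cache import BaseCache

class ExternalStoreCache(BaseCache):
    """Hypothetical adapter around any key/value store you operate."""

    def __init__(self, store):
        self.store = store  # e.g. a client exposing get/set/delete

    def get(self, key):
        return self.store.get(key)

    def set(self, key, value):
        # Size limits and expiry sweeps become the store's job, which is
        # the "take better ownership of the cached data" point above.
        self.store.set(key, value)

    def delete(self, key):
        self.store.delete(key)
```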
I did some testing; it seems that python-diskcache can be used as a drop-in replacement for `FileCache`, since it exposes the same `get`/`set`/`delete` methods. Then you could periodically call `cull()` to keep the cache under its size limit. However, it's not possible to remove expired items, because the cache itself is not aware of the expiry date of the response.
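A minimal sketch of that swap (assumptions: your CacheControl version calls `cache.set(key, value)` with no extra keywords, and the directory and 100 MB limit are arbitrary):

```python
import requests
import diskcache
from cachecontrol import CacheControl

# diskcache evicts on write once size_limit (bytes) is exceeded.
cache = diskcache.Cache(".web_cache", size_limit=100 * 1024 * 1024)
session = CacheControl(requests.Session(), cache=cache)

session.get("https://example.com/")

# Periodic maintenance: trim back under the size limit. This cannot
# honor HTTP expiry, since diskcache was never told about it.
cache.cull()
```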
I got the `FileCache` working, but decided to try the diskcache `FanoutCache` because I wanted the …
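For reference, a minimal sketch of `FanoutCache` (the shard count and size limit are assumptions). It shards the cache across several SQLite files to reduce write contention, while exposing the same `get`/`set`/`delete` surface as `Cache`:

```python
import diskcache

# Same drop-in caveats as the Cache example above apply.
cache = diskcache.FanoutCache(".web_cache", shards=8,
                              size_limit=100 * 1024 * 1024)
# Hand it to CacheControl(session, cache=cache) exactly as before.
```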
@tedivm You are right; I checked, and I've created pull requests for both projects to correct this (grantjenks/python-diskcache#77, #195). In the meantime, if you're still interested in trying it out:
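A minimal sketch of one stop-gap (the class name and the fixed one-week TTL are assumptions, not the thread's own snippet): tag every stored response with a TTL at `set()` time so diskcache's own expiration machinery can evict it.

```python
import diskcache

class TTLCache(diskcache.Cache):
    """Hypothetical wrapper: give every entry a fixed TTL, since
    CacheControl (at the time of this thread) passes no expiry info."""

    def set(self, key, value):
        # One week, an arbitrary choice; diskcache removes expired rows
        # lazily and during cull()/expire().
        return super().set(key, value, expire=7 * 24 * 60 * 60)

# Usage: CacheControl(requests.Session(), cache=TTLCache(".web_cache"))
```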