Hi ServiceStack team and users. We use the [CacheResponse(Duration = 72000, MaxAge = 30)] attribute to cache responses. I want a background job that runs at intervals much smaller than 72000 seconds so that users always hit the cache, because our endpoint is slow on a cache miss. Is there a way to accomplish this? Thanks
Update:
We have been considering custom caching, but we're wondering if we can accomplish the same thing using the built-in CacheResponse feature.
If you want to access the Service at a lower cache duration than what you’ve specified for the Service, I would recommend maintaining different Services, where your cached API uses the Gateway to invoke an internal non-cached Service, as done in our https://northwind.netcore.io example, e.g.:
[CacheResponse(Duration = 72000, MaxAge = 30)]
public class CachedServices : Service
{
    public object Get(CachedMyRequest request) =>
        Gateway.Send(request.ConvertTo<MyRequest>());
}
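For context, here is a minimal sketch of what the internal, non-cached side of that split might look like. The MyRequest, MyResponse, and MyServices definitions below are illustrative assumptions for this thread, not code taken from the Northwind example:

// Illustrative sketch: assumed DTO and Service names, not from the Northwind example
using ServiceStack;

// Internal request/response DTOs, served without response caching
public class MyRequest : IReturn<MyResponse> { }

public class MyResponse
{
    public string Result { get; set; }
}

// Public request DTO that users call, handled by the cached Service above
public class CachedMyRequest : IReturn<MyResponse> { }

// Internal non-cached Service that does the actual (slow) work
public class MyServices : Service
{
    public object Get(MyRequest request)
    {
        // ... expensive work here ...
        return new MyResponse { Result = "..." };
    }
}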
Thanks for the response.
But I don’t think we are on the same page.
I don’t want to call a cached API at a smaller interval.
I want to ask all my users to call GET CachedMyRequest instead of some API that does exactly the same thing but with a smaller cache expiration value:
[CacheResponse(Duration = 72000, MaxAge = 30)]
public class CachedServices : Service
{
    public object Get(CachedMyRequest request) =>
        Gateway.Send(request.ConvertTo<MyRequest>());
}
but since GET CachedMyRequest is slow, I want the users to always hit the cache.
To do that, I want a background job that refreshes the cache entries every 3600 seconds (e.g. res:/cachedmyrequest.json, res:/cachedmyrequest.json.deflate, res:/cachedmyrequest.html, res:/cachedmyrequest.html.deflate), so that before the cached items expire, the cached values have already been updated with newer values and the cache item TTLs have already been reset to 72000.
It doesn’t sound like you’ll be able to handle your use case with [CacheResponse], which uses the classic lazy-loading caching pattern and a dynamic cache key based on the end user’s request.
If you want to pre-populate the cache, you’re going to need a constant and deterministic cache key that you control yourself, which your background job populates and your Service implementation returns.
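For example, here is a rough sketch of that pattern, reusing the assumed DTOs from the earlier sketch and a simple timer-based job. MyCacheKeys, CacheRefreshJob, and RefreshCache are illustrative names, not existing ServiceStack APIs, and this version replaces the [CacheResponse] attribute entirely:

// Illustrative sketch: constant cache key populated by a background job
using System;
using System.Threading;
using ServiceStack;
using ServiceStack.Caching;

public static class MyCacheKeys
{
    // Constant, deterministic cache key you control yourself
    public const string CachedMyRequest = "cache:MyRequest";
}

public class CachedServices : Service
{
    // Return whatever the background job last populated; fall back to
    // computing it once if the job hasn't run yet
    public object Get(CachedMyRequest request) =>
        Cache.Get<MyResponse>(MyCacheKeys.CachedMyRequest)
            ?? RefreshCache(Cache, Gateway);

    public static MyResponse RefreshCache(ICacheClient cache, IServiceGateway gateway)
    {
        var response = gateway.Send(new MyRequest()); // the slow, non-cached call
        cache.Set(MyCacheKeys.CachedMyRequest, response, TimeSpan.FromSeconds(72000));
        return response;
    }
}

// Background job that refreshes the entry every 3600s, well before the 72000s TTL
public class CacheRefreshJob : IDisposable
{
    readonly Timer timer;

    public CacheRefreshJob(ICacheClient cache, IServiceGateway gateway)
    {
        timer = new Timer(_ => CachedServices.RefreshCache(cache, gateway),
            null, TimeSpan.Zero, TimeSpan.FromSeconds(3600));
    }

    public void Dispose() => timer.Dispose();
}

Because the key is fixed and the job refreshes it every 3600 seconds against a 72000 second TTL, requests should always find a warm entry, which is the behaviour you were trying to get out of [CacheResponse].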