OutOfMemoryException in RedisTypedClient.DeleteAllAsync()

So I have a Redis typed collection that has grown quite large (Count=608251). I have code to periodically clean up the collection, but it has grown so large that I now get an error when I try to remove it.

Is there a better way to achieve a DeleteAll that will handle larger data sets? Or is this something you can fix on your side?


await using var redisClient      = await redisClientsManager.GetClientAsync().ConfigureAwait(false);
var             redisTypedClient = redisClient.As<T>();
await redisTypedClient.DeleteAllAsync().ConfigureAwait(false);


System.OutOfMemoryException: Exception of type 'System.OutOfMemoryException' was thrown.
at System.String.Ctor(ReadOnlySpan`1 value)
at System.Span`1.ToString()
at System.Text.ValueStringBuilder.ToString()
at System.String.FormatHelper(IFormatProvider provider, String format, ParamsArray args)
at ServiceStack.IdUtils.CreateUrn[T](Object id) in 
C:\BuildAgent\work\3481147c480f4a2f\src\ServiceStack.Common\IdUtils.cs:line 175
at ServiceStack.Redis.RedisClient.UrnKey[T](Object id) in 
C:\BuildAgent\work\b2a0bfe2b1c9a118\src\ServiceStack.Redis\RedisClient.cs:line 870
at ServiceStack.Redis.Generic.RedisTypedClient`1.ServiceStack.Data.IEntityStoreAsync<T>.DeleteAllAsync(CancellationToken token) in C:\BuildAgent\work\b2a0bfe2b1c9a118\src\ServiceStack.Redis\Generic\RedisTypedClient.Async.cs:line 141

The Redis typed client uses a Redis set to track the keys of individually stored objects, each of which is stored under its own key. The current implementation uses SMEMBERS to return all members of the set and pulls them into memory at once. At 600k items this will likely approach 100MB+ (depending on your ID/key sizes), all of which also goes over the wire.
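To make that key layout concrete, here is a minimal sketch. The Customer type and the id value are hypothetical; the urn:{type}:{id} and ids:{Type} key patterns follow ServiceStack's default conventions.

```csharp
// Hypothetical POCO, purely for illustration.
public class Customer { public long Id { get; set; } }

// redisClient.As<Customer>().Store(new Customer { Id = 1 }) writes:
//   urn:customer:1   - the serialized object, under its own key
//   ids:Customer     - a Redis SET of every stored id; DeleteAll reads
//                      this set (via SMEMBERS) to find all the urn keys
```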

If your type's ID keys are very long, that will also contribute to the size, although shortening them doesn't really address the underlying problem.

A better way to handle this is to use SSCAN and page through the keys while deleting. Here is a possible implementation as an extension method on RedisTypedClient<T>; usage would be e.g. myRedisTypedClient.DeleteAll(0, 1000);.

using System.Linq;
using System.Text;
using ServiceStack;
using ServiceStack.Redis;
using ServiceStack.Redis.Generic;

public static class RedisCustomExtensions
{
    public static void DeleteAll<T>(this RedisTypedClient<T> typedClient, ulong cursor, int pageSize)
    {
        // Read one page of ids from the type's id set
        var scanResult = typedClient.NativeClient.SScan(typedClient.TypeIdsSetKey, cursor, pageSize);
        var resultCursor = scanResult.Cursor;
        var ids = scanResult.Results.Select(x => Encoding.UTF8.GetString(x)).ToList();
        var urnKeys = ids.Map(t => typedClient.RedisClient.UrnKey<T>(t));
        if (urnKeys.Count > 0)
            typedClient.RedisClient.RemoveAll(urnKeys);    // delete this page of objects
        if (resultCursor != 0)
            typedClient.DeleteAll(resultCursor, pageSize); // continue until the cursor wraps to 0
        else
            typedClient.RedisClient.Remove(typedClient.TypeIdsSetKey); // finally remove the id set itself
    }
}

It recursively uses SSCAN to read pageSize keys at a time from the set, deletes the objects those keys point to, and once the cursor returns to 0 it removes the id set key itself. This uses far less memory at any one time, at the cost of being chattier (more commands to Redis), and the full set of keys still crosses the network over the course of the scan.

As your data grows, it might be worth looking at how the data is used and grouping items into sets of your own to reduce the number of keys. This is usually only worthwhile if you expire data often and Redis CPU utilization is high, since managing a single set/hash yourself reduces the overhead of Redis key expiration.

Hope that helps.


That does help. Thank you!

Maybe to flesh the response out, an async version?

@Charles this change was made the default in the latest version on MyGet, v5.11.1.

The DeleteAll methods for both RedisClient and RedisTypedClient (and async equivalents) use SSCAN with a configurable CommandKeysBatchSize under the RedisConfig type, defaulting to 10000.
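As a sketch of that configuration point (assuming the v5.11.1+ RedisConfig described above), the batch size can be tuned before creating clients if 10000-key pages are still too large for your environment:

```csharp
// RedisConfig.CommandKeysBatchSize controls the SSCAN page size used
// by DeleteAll in v5.11.1+; 1000 is an illustrative value, not a recommendation.
RedisConfig.CommandKeysBatchSize = 1000;
```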

This should avoid memory issues related to large sets when using the RedisTypedClient.

Let us know how you go. :+1:
