AutoQuery slow on 200-column query (150 rows)

I have hit an issue with AutoQuery causing an HTTP timeout, because it takes over 100 seconds to process a query request.

The SQL query takes roughly 50 seconds to return 150 rows of 200 columns, but it then takes an additional 100 seconds to pass the data to the client, which combined exceeds the 100-second HTTP timeout enforced on the Blazor client. (I see the same loading times using curl, so it is not client-related.)

Unfortunately, it does not seem possible to increase the HTTP timeout for the JsonApiClient client.

Without changing the DB query, I can comment out enough properties in the response DTO class to get the data to the client under the 100-second threshold. This leads me to believe the issue is in AutoQuery's mapping of the DB response to the response DTO.

    public QueryResponse<ResponseDto> Get(QueryRequest request)
    {
        // standard custom AutoQuery implementation: resolve the db connection,
        // build the query from the request DTO, then execute it
        using var db = AutoQuery!.GetDb<ResponseDto>(Request);
        var q = AutoQuery.CreateQuery(request, Request, db);
        var response = AutoQuery.Execute(request, q, Request, db);
        return response;
    }
    public class QueryRequest : QueryDb<ResponseDto>, IGet
    {
        public bool? P_Closed { get; set; }
        public bool? S_Closed { get; set; }
    }

    [Schema("Report")]
    [DataContract]
    public class ResponseDto
    {
        // annotations
        [DataMember] public Guid? A_Id { get; set; }
        [DataMember] public string? A_FinalRemark { get; set; }
        [DataMember] public bool? A_Final { get; set; }

        // ... total of 200 properties defined (types: string, DateOnly, decimal and bool)
    }
Any help or pointers appreciated:

  • Are there any known issues handling large column sets?

  • Any suggestions to resolve the issue?

I am using ServiceStack 6.10 on Windows, with MSSQL 2019, .NET 7 and a Blazor WASM client.

Without specifically measuring your use case it is hard to tell exactly what is causing the slowdown, beyond that it happens after the DB query has finished. One thing you could try is to remove AutoQuery from the equation and see if your performance improves with the rest of the stack staying the same, e.g. just returning a List from a matching SQL query using plain OrmLite. It is possible it is a WASM deserialization layer issue, but I'm not 100% sure how the fetch implementation interacts with Blazor itself, so again, that is something you would have to test.

If it is just Blazor/C# serialization causing the issue, you might need to step outside of Blazor/C# to get the control you need to handle the client timeout in the browser (options { timeout = x }). Obviously this is far from ideal, but it might be required if you are returning such a large number of columns; you could then use JS interop to bind the data back into your C# context.

Does that page require all 200 columns? Could you create a new AutoQuery endpoint catering specifically to just the columns you require for the page? Have you checked how the built-in UIs like Locode perform with the same AutoQuery endpoint? You could also try requesting just the columns you want using the Fields option, as sketched below.
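
For example, something like this, using the Fields property that QueryDb request DTOs inherit to limit the selected columns (the base URL and field list here are just placeholders from your example):

    var client = new JsonApiClient("https://example.org"); // placeholder base URL
    var response = await client.GetAsync(new QueryRequest {
        // only these columns get selected and serialized
        Fields = "A_Id,A_FinalRemark,A_Final",
    });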

If you want to set up a minimal reproduction and share it as a public repository on GitHub, I could take a further look.

Thank you for the response. Your suggestion to replace AutoQuery with OrmLite solved my issues.

Using OrmLite reduced the total response time from 150s to 40s:

    public QueryResponse<ResponseDto> Get(QueryRequest request)
    {
        // raise the SQL command timeout, as the query alone takes ~50s
        Db.SetCommandTimeout(300);

        // plain OrmLite select, bypassing AutoQuery entirely
        var data = Db.Select<ResponseDto>();

        return new QueryResponse<ResponseDto>
        {
            Offset = 0,
            Total = 0, // totals are not populated here
            Results = data,
            Meta = null,
            ResponseStatus = null
        };
    }
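
For completeness, the request's filter properties could still be applied without AutoQuery, e.g. with OrmLite's parameterized sql-filter Select overload (a sketch only, assuming P_Closed maps to a column on the Report table):

    // sketch: apply the optional request filter without AutoQuery,
    // assuming P_Closed is a column on the Report table
    var data = request.P_Closed != null
        ? Db.Select<ResponseDto>("P_Closed = @closed", new { closed = request.P_Closed })
        : Db.Select<ResponseDto>();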

To clarify:

  • The issue is not related to the Blazor WASM client; the same errors occurred using Swagger and curl.
  • All 200 columns are required and the query cannot be broken down (unfortunately).

To me it seems that AutoQuery struggles with 200 columns, but it is great that dropping down to OrmLite can resolve the issues.

For now, I consider the issue solved, so thank you again.

Does your table include any complex type properties with [Reference] attributes, by any chance?
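
That is, anything along these lines (hypothetical types), which OrmLite loads via additional queries:

    public class ReportRow
    {
        public Guid Id { get; set; }

        [Reference] // complex type loaded with a separate query
        public List<ReportDetail> Details { get; set; }
    }

    public class ReportDetail
    {
        public Guid Id { get; set; }
        public Guid ReportRowId { get; set; } // FK back to ReportRow
    }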

Simple data types only: string, DateOnly, decimal, bool and Guid.

No [Reference] attributes.

OK, that's interesting; the issue must be elsewhere. If you can profile it or provide a repro, we can look to optimize it, as 50s to return 150 rows seems abnormally slow.
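
For example, a minimal sketch to split the time between the DB read + POCO mapping and the JSON serialization (using ServiceStack.Text's ToJson() purely for timing):

    // requires: using System.Diagnostics; (ToJson is from ServiceStack.Text)
    var sw = Stopwatch.StartNew();
    var rows = Db.Select<ResponseDto>();  // DB read + POCO mapping
    var dbMs = sw.ElapsedMilliseconds;

    sw.Restart();
    var json = rows.ToJson();             // JSON serialization
    var serMs = sw.ElapsedMilliseconds;

    Console.WriteLine($"db+map: {dbMs}ms, serialize: {serMs}ms, {json.Length} chars");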

The only thing I can think of is that DateTimes are notoriously slow in conversions/serialization. How many DateTime / DateOnly / DateTimeOffset columns does it have?

There are 187 columns comprising:

  • Guid: 1
  • bool: 11
  • decimal: 77
  • DateOnly: 32
  • int: 5
  • string: 61

Previously, I did some testing by commenting out properties of different types in the ResponseDto (e.g. all DateOnly), but I saw no significant performance differences amongst the types.

Simply reducing the number of properties improved the AutoQuery “mapping” of the database data to the C# data.
