Generalized request batching
Hi,
Even though ServiceStack has recently implemented request batching, it only works for multiple requests of the same type.
We need to handle the case of batching several different requests together. The idea is to reduce the number of HTTP requests, not so much for performance reasons as to help us read the HTTP trace more easily (fewer requests to ‘follow’).
I’ve implemented this on the server side with a relatively simple extension method, but it boils down to splitting the ‘container’ batch request into constituents, then executing them synchronously in a simple foreach loop. Since most of our service calls are IO bound, I’d definitely want to run them in parallel.
I could run all of the constituent requests in parallel by calling Task.Run for each of them and then WaitAll, but that would ignore the many threads already created for request dispatching (by ASP.NET when hosted in IIS, or by ServiceStack when self-hosting) and create additional ones for the Task library.
I would like to simulate the behavior of the normal request dispatcher, but I’m not sure where or how to do so. I’m guessing this should be doable with ServiceRunner, but what is the best way to get hold of and use it from inside a request handler? Put differently, it’s almost as if I want to take the batch request, split it into the actual requests, put them back into the execution pipeline so that they get executed, and then, when done, re-assemble the results into a batch response (roughly as in the sketch below). I also don’t want to repeat this code; it should be reusable across all services.
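For concreteness, here is a rough sketch of the shape being described. The BatchRequest/BatchResponse DTOs and the Dispatch placeholder are purely illustrative (they are not existing ServiceStack types), and the placeholder is exactly the part the question is about:

```csharp
using System.Collections.Generic;
using ServiceStack;

// Illustrative 'container' DTOs, not real ServiceStack types.
public class BatchRequest : IReturn<BatchResponse>
{
    public List<object> Requests { get; set; }   // heterogeneous constituent DTOs
}

public class BatchResponse
{
    public List<object> Responses { get; set; }
}

public class BatchService : Service
{
    public object Any(BatchRequest batch)
    {
        var responses = new List<object>();

        // Current approach: split the container and execute each constituent
        // synchronously, one after the other.
        foreach (var dto in batch.Requests)
            responses.Add(Dispatch(dto));

        return new BatchResponse { Responses = responses };
    }

    // Placeholder: how should a constituent DTO be put back into the normal
    // execution pipeline (ideally in parallel) without creating extra threads
    // on top of the ones the host already uses for dispatching?
    object Dispatch(object dto)
    {
        throw new System.NotImplementedException();
    }
}
```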
I hope I’m making sense, if not please let me know and I’ll clarify.
Right, Auto Batched Requests do require every request to be of the same type.
If you want to create your own custom batched requests you should be able to execute each Request DTO individually with base.ExecuteRequest(dto), which is just a wrapper around HostContext.ServiceController.Execute(dto, Request).
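As a sketch (assuming a custom batch Service along the lines of the earlier one, where Request is the current IRequest), the Dispatch placeholder could then be filled in like this:

```csharp
// Inside the custom batch Service: run a constituent DTO through the normal
// ServiceStack pipeline, reusing the current request context.
object Dispatch(object dto)
{
    return HostContext.ServiceController.Execute(dto, Request);
    // or, where accessible: return base.ExecuteRequest(dto);
}
```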
Drazen Dotlic:
+Demis Bellot great info, thanks. If I want to execute individual requests ‘in parallel’ and then wait for completion, should I just call ExecuteAsync for each of them (then WaitAll)? And will this introduce new threads when hosted in IIS?
Drazen Dotlic:
When I say ExecuteAsync I mean, of course, the method on the ServiceController.
+Drazen Dotlic Nope, don’t use ExecuteAsync; it’s an unused internal API I’m experimenting with, and currently it just wraps a sync service result in a Task<object> response.
Drazen Dotlic:
Good to know. I noticed ExecuteAsync isn’t in IServiceController, which hinted a bit at its intended usability.
If I did want to execute requests in parallel without introducing new threads, especially in hosting environments which already handle thread pooling on their own, what would be the cleanest way?
Btw, in version 4.0.23 base.ExecuteRequest isn’t accessible, but I can get hold of HostContext so I’m fine with that.
I don’t see how you’re going to avoid using new threads; for ASP.NET hosts you won’t have access to the ASP.NET worker thread pool, so you’ll either need to use the ThreadPool or manage your own threads. The https://github.com/ServiceStack/ServiceStack/wiki/Concurrency-model wiki describes the different concurrency models of the different self-hosting AppHosts, which use either the same I/O thread (AppHostHttpListenerBase), the .NET ThreadPool (AppHostHttpListenerPoolBase) or SmartThreadPool (AppHostHttpListenerSmartPoolBase), which manages its own threads.
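For self-hosting, the dispatch model is decided by which AppHost base class you derive from. A minimal sketch, with illustrative names (the exact constructor overloads may differ):

```csharp
using Funq;
using ServiceStack;

// AppHostHttpListenerBase handles requests on the HttpListener I/O callback
// thread, AppHostHttpListenerPoolBase dispatches them on the .NET ThreadPool,
// and AppHostHttpListenerSmartPoolBase uses SmartThreadPool.
public class BatchAppHost : AppHostHttpListenerPoolBase
{
    public BatchAppHost()
        : base("Batch demo", typeof(BatchAppHost).Assembly) {}

    public override void Configure(Container container) {}
}

// Usage (sketch):
//   var appHost = new BatchAppHost();
//   appHost.Init();
//   appHost.Start("http://localhost:8088/");
```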
Otherwise, if you execute non-blocking async Services, then ExecuteRequest() returns a Task which you can use Task.WhenAll() on.
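A sketch of what that could look like, reusing the illustrative BatchRequest/BatchResponse DTOs from above and assuming every constituent Service is non-blocking (returns a Task); result unwrapping is deliberately simplified:

```csharp
using System.Collections.Generic;
using System.Linq;
using System.Threading.Tasks;
using ServiceStack;

public class AsyncBatchService : Service
{
    public async Task<BatchResponse> Any(BatchRequest batch)
    {
        // Dispatch each constituent DTO through the pipeline. For non-blocking
        // async Services the result is a Task, so start them all and await them
        // together instead of executing one at a time.
        var pending = batch.Requests
            .Select(dto => HostContext.ServiceController.Execute(dto, Request))
            .Cast<Task>()   // assumes every constituent Service is async
            .ToList();

        await Task.WhenAll(pending);

        return new BatchResponse
        {
            // Simplification: assumes each Task is a Task<T> whose Result is
            // the constituent's response DTO.
            Responses = pending.Select(t => (object)((dynamic)t).Result).ToList()
        };
    }
}
```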
Drazen Dotlic:
I might have been imprecise: I want to avoid accidentally creating additional threads which would not have been created had the batched request actually been split on the client side (which I don’t want to do, for the reasons listed in the post).
If I make 3 different calls for 3 requests, they will be dispatched by the hosting environment to 3 threads (assuming no congestion and a reasonably multi-core host machine). If I batch manually (as I am doing now), my batch request will be executed on its own thread (like any other request DTO), but then my foreach loop will synchronously execute the actual requests in sequence. Since our requests are (on the server) heavily IO bound, it might actually be faster to issue separate requests from the client, which somewhat defeats the purpose of batching.
IOW, I’m just looking at dispatching the actual requests in the most efficient way, trying to reuse the existing mechanism (whether it’s SmartThreadPool or ASP.NET’s thread pool, I don’t really want to have to know; ServiceStack takes care of this just fine at the moment). FYI, the code at the moment, even though it executes synchronously, is fast enough not to cause us issues, but I have a nagging feeling that it can be improved.
Drazen Dotlic:
Just looked at the wiki about the concurrency model. I actually don’t see a way to do this better except to simply make our requests async and then do WaitAll, as you’ve suggested. There’s no need to run async code (rather, IO-bound code which can be made async) on multiple threads; it gains me nothing. I focused too much on the old way of doing things, where we needed the thread pool because most of our IO was blocking and doing async wasn’t as easy as it is today with async/await.
Thanks for your help!