I just tried to POST a gzipped, protobuf-encoded payload to our ServiceStack server, and it responded with: RequestBindingException: Unable to bind request at ServiceStack.Host.RestHandler.CreateRequest(IRequest httpReq, IRestPath restPath).
The request headers included:
Accept-Encoding: gzip
Content-Encoding: gzip
Content-Type: application/x-protobuf
Uncompressed web requests work flawlessly with protobuf.
Does ServiceStack support gzip-encoded request payloads in general?
If so, it may be that the server did not understand my compression; I used DotNetZip, which compresses with the gzip default compression level (6).
What else could be wrong?
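For reference, the request was built roughly like this. This is a hypothetical sketch using the BCL's System.IO.Compression.GZipStream rather than DotNetZip; the URL and the SerializeDto() helper (the already-serialized protobuf bytes) are placeholders:

```csharp
using System.IO.Compression;
using System.Net;

byte[] protobufPayload = SerializeDto(); // placeholder: protobuf-serialized DTO bytes

var request = (HttpWebRequest)WebRequest.Create("http://example.org/myservice");
request.Method = "POST";
request.ContentType = "application/x-protobuf";
// Content-Encoding is a plain header; Accept-Encoding is restricted on
// HttpWebRequest and is driven by AutomaticDecompression instead.
request.Headers["Content-Encoding"] = "gzip";
request.AutomaticDecompression = DecompressionMethods.GZip;

using (var body = request.GetRequestStream())
using (var gzip = new GZipStream(body, CompressionMode.Compress))
{
    // Compress the protobuf bytes directly into the request body.
    gzip.Write(protobufPayload, 0, protobufPayload.Length);
}
```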
No, request decompression does not happen inside ServiceStack; the recommended approach is to decompress the HTTP request before it reaches ServiceStack, using an ASP.NET HttpModule.
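A minimal sketch of such a module, assuming the classic ASP.NET System.Web pipeline (the Web.config registration under <modules> is omitted):

```csharp
using System.IO.Compression;
using System.Web;

public class GZipRequestDecompressingModule : IHttpModule
{
    public void Init(HttpApplication application)
    {
        application.BeginRequest += (sender, args) =>
        {
            var request = ((HttpApplication)sender).Request;
            var contentEncoding = request.Headers["Content-Encoding"];

            if (contentEncoding != null && contentEncoding.Contains("gzip"))
            {
                // Wrap the incoming entity body so everything downstream
                // (including ServiceStack) reads the decompressed bytes.
                request.Filter = new GZipStream(request.Filter,
                                                CompressionMode.Decompress);
            }
        };
    }

    public void Dispose() { }
}
```

The key point is that the filter is installed in BeginRequest, before any handler touches the input stream; ServiceStack then only ever sees plain protobuf bytes.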
Short status update before I leave for lunch: it works when I leave out the Content-Type header. As soon as I send a Content-Type header together with Content-Encoding: gzip, with the GZipRequestDecompressingModule : IHttpModule configured, the server returns HTTP 400 / Unable to bind request.
Could this be an ordering problem between the filters?
An IHttpModule is registered at the ASP.NET level and runs before the request reaches ServiceStack, so it should be transparent: ServiceStack should just see an uncompressed stream. It sounds like that's not happening.
Additionally, I had to remove the line request.Headers.Remove("Content-Encoding");, because it threw "HTTP Error 500: System.NotSupportedException - Collection is read-only."