Compression doesn't work

I’m trying to set up gzip compression, but no file gets compressed at all.

I’ve overridden ShouldCompressFile as `public override bool ShouldCompressFile(IVirtualFile file) => true;`, but still no file gets compressed.

Can you investigate the issue? I get the same behaviour on both 5.1.0 and 5.1.1.

Please, ask if you need more details.
Thank you.

Please provide as much context as possible that someone would reasonably need to at least identify the issue.

At a minimum we need the raw HTTP request/response headers. Are you referring to a static file?

Yes, we are referring to a static file. Here is the request as cURL:

curl --request GET \
  --url http://localhost/html/it/main/main.b08918397b7ea52c6e81.js \
  --header 'accept: */*' \
  --header 'accept-encoding: gzip' \
  --header 'accept-language: it-IT,it;q=0.8,en-US;q=0.5,en;q=0.3' \
  --header 'cache-control: no-cache' \
  --header 'dnt: 1' \
  --header 'pragma: no-cache' \
  --header 'referer: http://localhost/html/it/main/index.html' \
  --header 'user-agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0' \
  --cookie 'ss-pid=N1C6ikjhkJhHl7ZHhYDt; ss-id=bBUHbCpfCC4UPHd5fMIY; ss-opt=temp; X-UAId='

We can see that the response changes a bit when we don’t set the Accept-Encoding header.
If we set it, the response headers are:

**Transfer-Encoding: chunked**
Content-Type: text/javascript
Last-Modified: Thu, 19 Jul 2018 13:51:30 GMT
Vary: Accept
Server: Microsoft-HTTPAPI/2.0
X-Powered-By: ServiceStack/5.11 NET45 Win32NT/.NET
Access-Control-Allow-Methods: GET, POST, PUT, DELETE, OPTIONS
Access-Control-Allow-Headers: Origin, Accept, Content-Type, Authorization, Set-Cookie
Access-Control-Allow-Credentials: true
Date: Fri, 20 Jul 2018 09:52:27 GMT

Otherwise they are set as:

**Content-Length: 293847**
Content-Type: text/javascript
Last-Modified: Thu, 19 Jul 2018 13:51:30 GMT
Vary: Accept
Server: Microsoft-HTTPAPI/2.0
X-Powered-By: ServiceStack/5.11 NET45 Win32NT/.NET
Access-Control-Allow-Methods: GET, POST, PUT, DELETE, OPTIONS
Access-Control-Allow-Headers: Origin, Accept, Content-Type, Authorization, Set-Cookie
Access-Control-Allow-Credentials: true
Date: Fri, 20 Jul 2018 09:52:27 GMT

Compression for static files is working. Have you enabled Static File Compression?

SetConfig(new HostConfig {
    CompressFilesWithExtensions = { "js", "css" },
    // (optional), only compress .js or .css files > 10k
    CompressFilesLargerThanBytes = 10 * 1024 
});
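For reference, the gating these two options apply can be sketched as follows (a simplified illustration in Python, not ServiceStack’s actual implementation; the extension list and size threshold mirror the config above):

```python
# Simplified sketch of static-file compression gating, mirroring the
# CompressFilesWithExtensions / CompressFilesLargerThanBytes settings above.
# Illustration only, not ServiceStack's actual code.

COMPRESS_EXTENSIONS = {"js", "css"}   # CompressFilesWithExtensions
COMPRESS_LARGER_THAN = 10 * 1024      # CompressFilesLargerThanBytes

def should_compress(filename: str, size_bytes: int) -> bool:
    """Return True if a static file qualifies for compression."""
    ext = filename.rsplit(".", 1)[-1].lower() if "." in filename else ""
    return ext in COMPRESS_EXTENSIONS and size_bytes > COMPRESS_LARGER_THAN

print(should_compress("main.js", 5 * 1024 * 1024))  # large .js  -> True
print(should_compress("main.js", 1024))             # too small  -> False
print(should_compress("index.html", 500_000))       # .html not listed -> False
```

Both conditions must hold: a matching extension and a size above the threshold.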

Yes, absolutely, in that exact same way. In fact, enabling and disabling compression changes the two headers I pointed out in the previous post.
If it helps, I’m using the self-host configuration of ServiceStack.

Compression works the same for self-host AppSelfHostBase:

Try downloading the file in the browser to see if it returns a compressed response (i.e. includes Content-Encoding: gzip/deflate)

Firefox returns this, with the file size equal to the transfer size:

My HostConfig:

var hostConfig = new HostConfig
{
   DebugMode = false,
   UseCamelCase = true,
   CompressFilesWithExtensions = { "js", "css", "html" },
   CompressFilesLargerThanBytes = 10 * 1024,
   AllowPartialResponses = false,
};

The responses are being compressed for Firefox as well:

So I can’t tell why it’s not working for you. I’m assuming the file size is larger than 10k? I’d need a repro I can run locally to be able to investigate.

Yes, the file is actually a 5 MB JS file. What do you need for a repro? I want to send you enough to reproduce the issue without sending you the entire code base.

A Minimal, Complete, and Verifiable example, i.e. the smallest stand-alone example that I can run locally to repro the issue.

Here is the example, built with the 5.1.1 MyGet package, updated as of today:

using ServiceStack;
using ServiceStack.IO;
using System;
using System.Reflection;
using System.Threading;

namespace SSCompression
{
    /// AppHost Implementation
    public sealed class AppHost : AppSelfHostBase
    {
        public AppHost() : base("Compression Bug", Assembly.GetExecutingAssembly()) { }

        /// Overriding Configure to enable compression
        public override void Configure(Funq.Container container)
        {
            var hostConfig = new HostConfig
            {
                DebugMode = false,
                CompressFilesWithExtensions = { "js", "css", "html" },
                CompressFilesLargerThanBytes = 10 * 1024,
                WebHostPhysicalPath = @"..\..\".MapServerPath(),
            };

            base.SetConfig(hostConfig);
        }

        /// Just compress everything. Commenting out this override does not change the behaviour
        public override bool ShouldCompressFile(IVirtualFile file)
        {
            return true;
        }
    }


    /// Entry Point
    class Program
    {
        /// Application entry point
        static void Main(string[] args)
        {
            try
            {
                new AppHost().Init().Start("http://localhost/");
            }
            catch (Exception ex)
            {
                Console.Write(ex);
                throw;
            }

            // Handle CTRL+C
            var exitEvent = new ManualResetEvent(false);

            Console.CancelKeyPress += (sender, eventArgs) =>
            {
                Console.WriteLine("Stopping the server...");
                eventArgs.Cancel = true;
                exitEvent.Set();
            };

            Console.WriteLine("Use CTRL+C to close this server...");
            exitEvent.WaitOne();
        }
    }
}

Just put any big file (mine is 5.14 MB) in the WebHostPhysicalPath, and the file is not compressed, as I showed in the previous posts. This repro was made by stripping my code base down piece by piece and testing it each time.

EDIT: the same happens with 5.1.0

I am again unable to reproduce the issue with what you’ve provided:

This is now the 4th time I’ve tried to repro this issue in this thread without being able to. If you want me to spend any more time investigating this, upload a Minimal, Complete, and Verifiable example to GitHub which I can just clone, restore and run to reproduce the issue. Make sure you test it on a different computer to make sure it’s not your environment that’s the issue.

Here is the GitHub repository with the bug reproduction: https://github.com/samusaran/SSCompressionBug

Tested on:

  • My Windows 10 machine
  • Clean install Windows 10 VM
  • Clean install Windows 8.1 VM

In every case, the transferred size equals the file size, the Content-Encoding header is not present, and Transfer-Encoding: chunked is present instead.

This example does not reproduce it either:

It’s still returning a compressed deflate response.

Transfer-Encoding is not a replacement for Content-Encoding; all the HttpListener examples I’ve shown above do both.

I’ve found something more. When I don’t set the User-Agent header, it actually returns compressed content, while with any of the following user agents it does not.

Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Mobile Safari/537.36
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.75 Safari/537.36

I don’t understand why this should happen. With this, I can get compressed content returned to a tool like Insomnia, but it doesn’t seem to work with any browser.

For a specific example, this works:

curl --request GET \
  --url http://localhost/main.js \
  --header 'accept: application/json' \
  --header 'accept-encoding: gzip, deflate, br' \
  --cookie 'ss-pid=N1C6ikjhkJhHl7ZHhYDt; ss-id=bBUHbCpfCC4UPHd5fMIY; ss-opt=temp; X-UAId='

Response:
Transfer-Encoding: chunked
Content-Type: text/javascript
Content-Encoding: deflate
Last-Modified: Tue, 31 Jul 2018 06:58:15 GMT
Accept-Ranges: bytes
Vary: Accept
Server: Microsoft-HTTPAPI/2.0
X-Powered-By: ServiceStack/5,10 NET45 Win32NT/.NET
Date: Tue, 31 Jul 2018 07:11:43 GMT

This does not:

curl --request GET \
  --url http://localhost/main.js \
  --header 'accept: application/json' \
  --header 'accept-encoding: gzip, deflate, br' \
  --header 'user-agent: Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Mobile Safari/537.36' \
  --cookie 'ss-pid=N1C6ikjhkJhHl7ZHhYDt; ss-id=bBUHbCpfCC4UPHd5fMIY; ss-opt=temp; X-UAId='

Response:
Transfer-Encoding: chunked
Content-Type: text/javascript
Last-Modified: Tue, 31 Jul 2018 06:58:15 GMT
Accept-Ranges: bytes
Vary: Accept
Server: Microsoft-HTTPAPI/2.0
X-Powered-By: ServiceStack/5,10 NET45 Win32NT/.NET
Date: Tue, 31 Jul 2018 07:13:05 GMT

Solved: it is https://github.com/restify/node-restify/issues/1515. It was exactly the same issue with exactly the same antivirus.

Makes sense that it was something in your environment, as it was unreproducible here.

Going to copy the post’s comment here so it’s more easily discoverable for anyone else hitting this issue:

I have just looked at it a bit more and tried it on another computer, and it is working fine there.
The only difference between the two machines is that one has an anti-virus installed and one does not.

The antivirus is stripping the header.

If anyone else has this issue: my antivirus is ESET Endpoint Security, and you will need to disable “application protocol content filtering” to get gzip encoding to work.

Closing due to it being an antivirus issue.
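For anyone debugging a similar problem: a quick way to check whether something between the client and the server (e.g. an antivirus web filter) is stripping request headers is to run a minimal local server that records the headers it actually received and compare them with the headers you sent. A sketch in Python (illustrative only; the echo server and header names are for demonstration):

```python
# Minimal local server that records the request headers it actually receives.
# If a filtering proxy/antivirus sits in between, sent headers such as
# Accept-Encoding may be missing from what arrives at the server.
import http.server
import threading
import urllib.request

received = {}

class EchoHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        # Record what actually arrived, with lowercased names for comparison
        received.update({k.lower(): v for k, v in self.headers.items()})
        self.send_response(200)
        self.end_headers()
    def log_message(self, *args):  # silence per-request logging
        pass

server = http.server.HTTPServer(("127.0.0.1", 0), EchoHandler)  # 0 = any free port
port = server.server_address[1]
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(f"http://127.0.0.1:{port}/", headers={
    "Accept-Encoding": "gzip, deflate",
    "User-Agent": "Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) "
                  "Gecko/20100101 Firefox/61.0",
})
urllib.request.urlopen(req).close()
server.shutdown()

# On a clean machine the header arrives intact; a content filter may drop it:
print("accept-encoding" in received)  # True
```

If the header you sent never shows up server-side, the interference is happening outside your application, exactly as in this thread.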