Yes absolutely, in that exact same way. In fact, enabling and disabling compression changes those 2 headers I've pointed out in the previous post.
If it helps, I'm using the Self Host configuration of ServiceStack.
So I can't tell why it's not working for you. I'm assuming the file size is larger than 10KB? I'd need a repro I can run locally to be able to investigate.
Yes the file is actually a 5MB JS file. What do you need for a repro? I want to send you enough to reproduce the issue without sending you the entire code base.
Here is the example, built against the 5.1.1 MyGet packages, updated as of today:
using ServiceStack;
using ServiceStack.IO;
using System;
using System.Reflection;
using System.Threading;

namespace SSCompression
{
    /// AppHost Implementation
    public sealed class AppHost : AppSelfHostBase
    {
        public AppHost() : base("Compression Bug", Assembly.GetExecutingAssembly()) { }

        /// Overriding Configure to enable compression
        public override void Configure(Funq.Container container)
        {
            var hostConfig = new HostConfig
            {
                DebugMode = false,
                CompressFilesWithExtensions = { "js", "css", "html" },
                CompressFilesLargerThanBytes = 10 * 1024,
                WebHostPhysicalPath = @"..\..\".MapServerPath(),
            };
            base.SetConfig(hostConfig);
        }

        /// Just compress everything. Commenting this override does not change behaviour
        public override bool ShouldCompressFile(IVirtualFile file)
        {
            return true;
        }
    }

    /// Entry Point
    class Program
    {
        /// Application entry point
        static void Main(string[] args)
        {
            try
            {
                new AppHost().Init().Start("http://localhost/");
            }
            catch (Exception ex)
            {
                Console.Write(ex);
                throw;
            }

            // Handle CTRL+C
            var exitEvent = new ManualResetEvent(false);
            Console.CancelKeyPress += (sender, eventArgs) =>
            {
                Console.WriteLine("Stopping the server...");
                eventArgs.Cancel = true;
                exitEvent.Set();
            };

            Console.WriteLine("Use CTRL+C to close this server...");
            exitEvent.WaitOne();
        }
    }
}
Just put any big file (mine is 5.14MB) in the WebHostPhysicalPath, and the file is not compressed, as shown in my previous posts. This file was made by stripping mine down piece by piece and testing it every time.
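For reference, this is roughly how the response can be checked outside a browser; a minimal sketch using a raw HttpClient, where the file name in the URL is just a placeholder for whatever big file you drop into WebHostPhysicalPath:

using System;
using System.Net;
using System.Net.Http;

class CompressionCheck
{
    static void Main()
    {
        // Keep automatic decompression off so the Content-Encoding header,
        // if the server sends one, is preserved on the response object.
        var handler = new HttpClientHandler { AutomaticDecompression = DecompressionMethods.None };
        using (var client = new HttpClient(handler))
        {
            // "big-file.js" is a placeholder for the large file being served by the self host above
            var request = new HttpRequestMessage(HttpMethod.Get, "http://localhost/big-file.js");
            request.Headers.AcceptEncoding.ParseAdd("gzip, deflate");

            var response = client.SendAsync(request).Result;

            // Failing case described above: no Content-Encoding, Transfer-Encoding: chunked,
            // and a body length equal to the original file size.
            Console.WriteLine("Content-Encoding:  " + string.Join(", ", response.Content.Headers.ContentEncoding));
            Console.WriteLine("Transfer-Encoding: " + string.Join(", ", response.Headers.TransferEncoding));
            Console.WriteLine("Body bytes:        " + response.Content.ReadAsByteArrayAsync().Result.Length);
        }
    }
}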
This is now the 4th time I’ve tried to repro this issue in this thread without being able to. If you want me to spend any more time investigating this, upload a Minimal, Complete, and Verifiable example to GitHub which I can just clone, restore and run to reproduce the issue. Make sure you test it on a different computer to make sure it’s not your environment that’s the issue.
In every case, the transferred size is equal to the file size, the Content-Encoding header is not present, and Transfer-Encoding: chunked is present instead.
I've found something more. If I don't set the User-Agent header at all, it actually returns compressed content, while with any of the following it does not:
Mozilla/5.0 (Linux; Android 6.0; Nexus 5 Build/MRA58N) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/66.0.3359.181 Mobile Safari/537.36
Mozilla/5.0 (Windows NT 10.0; Win64; x64; rv:61.0) Gecko/20100101 Firefox/61.0
Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.75 Safari/537.36
I don't understand why this happens. With this I can get compressed content returned to a client like Insomnia, but it doesn't seem to work with any browser.
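Using the same kind of raw HttpClient check as above, the difference can be reproduced by only toggling the User-Agent header; a rough sketch (again, the URL and file name are placeholders):

using System;
using System.Net;
using System.Net.Http;

class UserAgentCheck
{
    // Placeholder URL: the self host above plus whatever large .js file is being served.
    const string Url = "http://localhost/big-file.js";

    static void Main()
    {
        // No User-Agent header at all
        PrintEncoding(null);
        // One of the browser User-Agent strings listed above
        PrintEncoding("Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/68.0.3440.75 Safari/537.36");
    }

    static void PrintEncoding(string userAgent)
    {
        var handler = new HttpClientHandler { AutomaticDecompression = DecompressionMethods.None };
        using (var client = new HttpClient(handler))
        {
            var request = new HttpRequestMessage(HttpMethod.Get, Url);
            request.Headers.AcceptEncoding.ParseAdd("gzip, deflate");
            if (userAgent != null)
                request.Headers.TryAddWithoutValidation("User-Agent", userAgent);

            var response = client.SendAsync(request).Result;
            Console.WriteLine("User-Agent: {0}", userAgent ?? "(none)");
            Console.WriteLine("  Content-Encoding: {0}",
                string.Join(", ", response.Content.Headers.ContentEncoding));
        }
    }
}

With this kind of check I see the same behaviour as above: the request without a User-Agent comes back with Content-Encoding: gzip, while the browser User-Agent request comes back with no Content-Encoding at all.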
Makes sense that it was something in your environment, as it was unreproducible here.
Going to copy the post comment here so it’s more clearly discoverable for anyone else hitting this issue:
I have just looked at it a bit more and tried it on another computer, and it works fine on that one.
The only difference between the two machines is that one has an anti-virus installed and one does not.
The antivirus is stripping the header.
If anyone else has this issue: my antivirus is ESET Endpoint Security, and you will need to disable "application protocol content filtering" to get gzip encoding to work.