How to manage huge file transfers

Hi
I would like to use the Virtual File System to upload / download files from my AppHost. I already added this functionality in a few lines, but when I try to download a huge file (300 MB) I see the network usage of the AppHost reach 100%, and I'm wondering if there is a better approach to handling files, maybe splitting them into smaller parts.

The method on the service is this

        public object Get(OrderProcessProgramFileGet request)
        {
            string path = $@"Programs\{request.FileName}";

            var file = HostContext.VirtualFileSources.GetFile(path);
            if (file == null)
                throw HttpError.NotFound($"File {path} does not exist");

            return new HttpResult(file, true)
            {
                ContentType = MimeTypes.GetMimeType(file.Extension)
            };
        }

and the method of the client is

            using (var client = new JsonServiceClient(@"http://localhost:50000"))
            {
                var request = new OrderProcessProgramFileGet()
                {
                    FileName = "f2013c46-ed5a-48a3-a92f-002e9cd8bd76.program"
                };
                var response = client.Get<object>(request);

                System.IO.File.WriteAllText($@"C:\Test\{request.FileName}", response.ToString());

            }

Any suggestions would be appreciated
Best regards
Enrico

The service implementation is fine as it will stream back the response, but you'll also want to stream the response on the client. See the C# client docs for different examples of reading from a stream, e.g:

using var stream = await client.GetAsync<Stream>(new GetFile { 
    Path = "/path/to/file.png" 
});
using var fs = File.Create(Path.Combine(uploadsDir, "file.png"));
await stream.CopyToAsync(fs);
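
Adapted to the request DTO from the question above, a sketch of the same streaming approach looks like this. The key change from the original client code is requesting the response as a `Stream` and copying it to a `FileStream`, so the body is never buffered in memory as a string:

```csharp
using System.IO;
using ServiceStack;

// Sketch: stream the download to disk instead of buffering it as a string.
// Assumes the OrderProcessProgramFileGet DTO from the question above.
using (var client = new JsonServiceClient("http://localhost:50000"))
{
    var request = new OrderProcessProgramFileGet
    {
        FileName = "f2013c46-ed5a-48a3-a92f-002e9cd8bd76.program"
    };

    // Requesting Stream tells the client not to deserialize the body
    using (var stream = client.Get<Stream>(request))
    using (var fs = File.Create($@"C:\Test\{request.FileName}"))
    {
        // CopyTo copies in small chunks, so memory use stays flat
        stream.CopyTo(fs);
    }
}
```

This replaces the `client.Get<object>(request)` + `File.WriteAllText(...)` pattern in the original client, which buffers the whole response and would also corrupt binary files by round-tripping them through a string.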

Note: If you're using .NET Core you should use JsonHttpClient which avoids the abstraction penalty.
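
The JsonHttpClient variant is nearly identical; a sketch (assuming the same DTO, and that the `ServiceStack.HttpClient` package is referenced):

```csharp
using System.IO;
using ServiceStack;

// Sketch: same streaming download with the HttpClient-based JsonHttpClient
using var client = new JsonHttpClient("http://localhost:50000");
using var stream = await client.GetAsync<Stream>(new OrderProcessProgramFileGet
{
    FileName = "f2013c46-ed5a-48a3-a92f-002e9cd8bd76.program"
});
using var fs = File.Create(@"C:\Test\f2013c46-ed5a-48a3-a92f-002e9cd8bd76.program");
await stream.CopyToAsync(fs);
```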