Out of memory exception with file upload

Good day,

I have created a client and self-hosted service based on the Hello World sample. On the client I call the service as follows:

try
{
      var client = new JsonServiceClient("http://127.0.0.1:1337/");
      var request = new Hello
      {
           Name = @"vs2012I.iso"
      };

      Console.WriteLine("Connected.");

      var response = client.PostFileWithRequest<HelloResponse>(new FileInfo(@"C:\Temp\vs2012I.iso"), request);

      Console.WriteLine(response.Result);
}
catch (Exception e)
{
      Console.WriteLine(e);
}

The service is implemented as follows:

public class HelloService : ServiceStack.Service
{
    public HelloResponse Post(Hello request)
    {
        Console.WriteLine("Received request.");

        if (Request.Files == null || Request.Files.Length != 1)
            return new HelloResponse
            {
                Result = "No file was uploaded."
            };

        var httpFile = Request.Files[0];
        var documentFile = Path.Combine(@"C:\Temp\SS Large", request.Name);

        //httpFile.SaveTo(documentFile);

        using (var fileStream = new FileStream(documentFile, FileMode.CreateNew))
        {
            httpFile.InputStream.CopyTo(fileStream);
        }

        Console.WriteLine("Request processed.");

        return new HelloResponse { Result = "File uploaded: " + request.Name };
    }
}

When uploading a file of around 1.2GB I get:

Exception of type 'System.OutOfMemoryException' was thrown.
   at System.IO.MemoryStream.set_Capacity(Int32 value)
   at System.IO.MemoryStream.EnsureCapacity(Int32 value)
   at System.IO.MemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
   at System.IO.Stream.InternalCopyTo(Stream destination, Int32 bufferSize)
   at System.IO.Stream.CopyTo(Stream destination)
   at ServiceStack.Host.HttpListener.ListenerRequest.LoadMultiPart()
   at ServiceStack.Host.HttpListener.ListenerRequest.get_Form()
   at ServiceStack.Host.HttpListener.ListenerRequest.get_FormData()
   at ServiceStack.Host.Handlers.ServiceStackHandlerBase.DeserializeHttpRequest(Type operationType, IRequest httpReq, String contentType)
   at ServiceStack.Host.Handlers.GenericHandler.CreateRequest(IRequest req, String operationName)
   at ServiceStack.Host.Handlers.GenericHandler.ProcessRequestAsync(IRequest httpReq, IResponse httpRes, String operationName)

This also happens with a much smaller file (around 400 MB) in my actual application (also self-hosted).

I have tried hard to find the reason and/or solution for this but to no avail.

I’d appreciate some guidance.

Thanks.

The upload exceeded the memory limits of .NET's MemoryStream, where the request is buffered whilst parsing the posted Form Data; you're going to hit resource issues uploading files this large.

One thing you can try, instead of uploading the file using HTTP multipart/form-data, is to send it to a Request DTO that implements IRequiresRequestStream, which lets you read directly from the Request Stream and avoid buffering as much as possible.
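Something along these lines (a rough sketch only; the RawUpload DTO, its route and the target path are placeholders, not part of the Hello World sample):

using System.IO;
using ServiceStack;
using ServiceStack.Web; // IRequiresRequestStream

[Route("/rawupload/{FileName}", "POST")]
public class RawUpload : IRequiresRequestStream
{
    public string FileName { get; set; }

    // Populated by ServiceStack with the raw request body, bypassing Form Data parsing
    public Stream RequestStream { get; set; }
}

public class RawUploadService : ServiceStack.Service
{
    public object Post(RawUpload request)
    {
        var documentFile = Path.Combine(@"C:\Temp\SS Large", request.FileName);

        // Stream straight from the request body to disk, no MemoryStream buffering
        using (var fileStream = new FileStream(documentFile, FileMode.CreateNew))
        {
            request.RequestStream.CopyTo(fileStream);
        }

        return new HelloResponse { Result = "File uploaded: " + request.FileName };
    }
}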

You should also look at streaming the Request on the client to avoid buffering there as well, e.g. something like:

using (var fs = new FileInfo(@"C:\Temp\vs2012I.iso").OpenRead())
{
	var webReq = (HttpWebRequest)WebRequest.Create(url); // url of the IRequiresRequestStream endpoint
	webReq.Method = "POST"; // a request body can only be written to POST/PUT requests
	webReq.Accept = MimeTypes.Json;

	// Stream the file directly into the request body instead of buffering it in memory
	using (var stream = webReq.GetRequestStream())
	{
		await fs.CopyToAsync(stream);
	}

	using (var webRes = webReq.GetResponse())
	using (var stream = webRes.GetResponseStream())
	{
		var bytes = stream.ReadFully();
		var json = bytes.FromUtf8Bytes();
		var dto = json.FromJson<ResponseDto>();
	}
}

You may or may not hit other resource issues but your best chance to avoid them is to avoid buffering anywhere. For files this large I’d be looking at an appropriate native solution that’s designed to handle large files, e.g. using ftp/sftp/ssh/scp then calling the Service with the path to complete the request.

A Web Application isn’t a good choice for this, but if you have to go through one I’d look at splitting and uploading the file into smaller more manageable chunks, that you’d later need to piece back together on the server.

Hi Demis,

Thank you for the quick reply. You are of course correct, a web service is not ideal for this sort of scenario and the intended application would not normally require files of this size to be uploaded. However, you never know what users will do and I need to know where the boundaries are.

I will try your suggestions and also look into using ftp instead.

Regards,
Gerhard