Large file uploads (~1 GB) fail with OutOfMemoryException – support for chunked uploads?

Description:
We are using ServiceStack’s PostFileWithRequest() method to upload files.

public virtual TResponse PostFileWithRequest<TResponse>(string relativeOrAbsoluteUrl, Stream fileToUpload, string fileName, object request, string fieldName = "file")

When attempting to upload large files (greater than 1 GB), the server experiences high memory usage and eventually fails with:

System.OutOfMemoryException
“bad allocation” errors

It appears the file is being fully buffered in memory during upload/deserialization, which leads to memory exhaustion before the file is persisted.

Adjusting maxRequestLength, executionTimeout, and maxAllowedContentLength in web.config allows the request to pass IIS limits, but it does not prevent memory exhaustion and sometimes results in a timeout exception.
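For reference, these are the kinds of settings we adjusted (values illustrative only; the setting names are the standard IIS/ASP.NET ones mentioned above):

```xml
<!-- Illustrative values: these let a ~2 GB request past IIS/ASP.NET request
     limits, but do not stop the request body from being buffered in memory. -->
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB, executionTimeout in seconds -->
    <httpRuntime maxRequestLength="2097152" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes -->
        <requestLimits maxAllowedContentLength="2147483648" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```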

Code used for file upload:

Record record = new Record();
record.RecordType = new RecordTypeRef { Uri = 2 };
record.Title = "test2gb";
FileInfo file = new FileInfo(@"C:\test\test.msi");
var recordresponse = client.PostFileWithRequest(file, record);

Questions:

Does ServiceStack currently support chunked file uploads?

If not, is there a recommended workaround for handling large files?

Any guidance on best practices for handling 1 GB+ files with ServiceStack-based APIs?

ServiceStack isn't imposing any additional limits, nor does it buffer uploaded files — it typically just passes through the underlying host's Request Stream. How is your Service implementation handling the file uploads? What .NET version are you using?
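To illustrate working with the raw Request Stream directly, a Service can have its Request DTO implement ServiceStack's IRequiresRequestStream so the unbuffered request body is handed to the Service, which can then copy it to disk in small buffers. The DTO/Service names and temp-path destination below are hypothetical; this is a sketch, not a drop-in implementation:

```csharp
using System.IO;
using ServiceStack;
using ServiceStack.Web;

[Route("/uploads/{FileName}", "POST")]
public class RawFileUpload : IRequiresRequestStream, IReturn<RawFileUploadResponse>
{
    public string FileName { get; set; }
    public Stream RequestStream { get; set; } // populated with the raw request body
}

public class RawFileUploadResponse
{
    public long BytesWritten { get; set; }
}

public class UploadService : Service
{
    public object Post(RawFileUpload request)
    {
        var path = Path.Combine(Path.GetTempPath(), Path.GetFileName(request.FileName));
        using (var fs = File.Create(path))
        {
            // CopyTo streams in small buffers; the file is never held fully in memory
            request.RequestStream.CopyTo(fs);
            return new RawFileUploadResponse { BytesWritten = fs.Length };
        }
    }
}
```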

No, but it's what you'd need to implement to support uploading large files. IMHO 1GB is too big to upload to an API endpoint. I'd look at implementing manual chunking, or if possible post a URL and have your API download the file, where it'll have better control over chunking and file stream management.
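A minimal client-side sketch of manual chunking, assuming a hypothetical /uploads/chunk endpoint and X-Upload-Id / X-Chunk-Index headers (the server would need to reassemble the chunks in order):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

public static class ChunkedUploader
{
    const int ChunkSize = 8 * 1024 * 1024; // 8 MB per request keeps client memory use bounded

    public static async Task UploadAsync(string baseUrl, string filePath)
    {
        var uploadId = Guid.NewGuid().ToString("N"); // correlates all chunks of one file
        using var http = new HttpClient();
        using var fs = File.OpenRead(filePath);

        var buffer = new byte[ChunkSize];
        int read, index = 0;
        while ((read = await fs.ReadAsync(buffer, 0, buffer.Length)) > 0)
        {
            var content = new ByteArrayContent(buffer, 0, read);
            content.Headers.Add("X-Upload-Id", uploadId);
            content.Headers.Add("X-Chunk-Index", index.ToString());
            var response = await http.PostAsync($"{baseUrl}/uploads/chunk", content);
            response.EnsureSuccessStatusCode(); // a real client would retry failed chunks
            index++;
        }
    }
}
```

A production version would also send a final "complete" request and retry transient failures per chunk, which is the main benefit chunking buys you over a single large POST.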

Otherwise consider alternative solutions like using scp/rsync/ftp to upload large files to a public drop box. Or if it's easier, clients could upload files to a public S3 bucket and post the path of the uploaded file to your API, which downloads it.
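A sketch of the "post a URL, let the API download it" approach; the DTO and service names are hypothetical. HttpClient with ResponseHeadersRead streams the response body straight to disk rather than buffering it:

```csharp
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;
using ServiceStack;

[Route("/imports", "POST")]
public class ImportFromUrl : IReturn<ImportFromUrlResponse>
{
    public string Url { get; set; }
    public string FileName { get; set; }
}

public class ImportFromUrlResponse
{
    public long BytesWritten { get; set; }
}

public class ImportService : Service
{
    static readonly HttpClient Http = new HttpClient();

    public async Task<object> Post(ImportFromUrl request)
    {
        var path = Path.Combine(Path.GetTempPath(), Path.GetFileName(request.FileName));
        // ResponseHeadersRead returns as soon as headers arrive, so the body
        // can be streamed to disk instead of being buffered in memory
        using var response = await Http.GetAsync(request.Url, HttpCompletionOption.ResponseHeadersRead);
        response.EnsureSuccessStatusCode();
        using var body = await response.Content.ReadAsStreamAsync();
        using var fs = File.Create(path);
        await body.CopyToAsync(fs);
        return new ImportFromUrlResponse { BytesWritten = fs.Length };
    }
}
```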

Does ServiceStack have any built-in support for resumable upload protocols such as TUS (for chunked file uploads)? If not, could this be raised as an Enhancement Request for consideration in future releases to better support large file uploads?

It doesn’t, you can submit any feature requests to:

https://servicestack.net/ideas

FYI, looks like someone has already implemented TUS support for ASP.NET Core:

You should be able to use this alongside ServiceStack since they're both just middleware.
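For example, assuming the tusdotnet package (a third-party TUS implementation, not part of ServiceStack), registering its middleware in an ASP.NET Core app might look roughly like this; the /files path and disk-store directory are illustrative:

```csharp
using Microsoft.AspNetCore.Builder;
using tusdotnet;
using tusdotnet.Models;
using tusdotnet.Stores;

public partial class Startup
{
    public void ConfigureTus(IApplicationBuilder app)
    {
        app.UseTus(httpContext => new DefaultTusConfiguration
        {
            UrlPath = "/files",                       // TUS endpoint handled by the middleware
            Store = new TusDiskStore(@"C:\tusfiles\") // resumable chunks persisted to disk
        });
    }
}
```

Requests to the TUS endpoint are handled by the middleware before they reach ServiceStack, so the two shouldn't interfere as long as the paths don't overlap.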