It appears the file is being fully buffered in memory during upload/deserialization, which leads to memory exhaustion before the file is persisted.
Adjusting maxRequestLength, executionTimeout, and maxAllowedContentLength in web.config allows the request to pass IIS limits, but it does not prevent memory exhaustion and sometimes results in a timeout exception.
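For reference, the limits were raised along these lines (values illustrative; note maxRequestLength is in KB while maxAllowedContentLength is in bytes):

```xml
<configuration>
  <system.web>
    <!-- maxRequestLength is in KB (~2 GB here); executionTimeout in seconds -->
    <httpRuntime maxRequestLength="2097151" executionTimeout="3600" />
  </system.web>
  <system.webServer>
    <security>
      <requestFiltering>
        <!-- maxAllowedContentLength is in bytes (~2 GB here) -->
        <requestLimits maxAllowedContentLength="2147483647" />
      </requestFiltering>
    </security>
  </system.webServer>
</configuration>
```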
Code used for file upload:
var record = new Record {
    RecordType = new RecordTypeRef { Uri = 2 },
    Title = "test2gb",
};
// Verbatim string so the backslashes aren't treated as escape sequences
var file = new FileInfo(@"C:\test\test.msi");
var recordResponse = client.PostFileWithRequest(file, record);
Questions:
Does ServiceStack currently support chunked file uploads?
If not, is there a recommended workaround for handling large files?
Any guidance on best practices for handling 1 GB+ files with ServiceStack-based APIs?
ServiceStack isn’t imposing any additional limits, nor does it buffer uploaded files; it typically just passes through the underlying host’s Request Stream. How is your Service implementation handling the file uploads? What .NET version are you using?
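For comparison, a Service implementation that avoids buffering would stream each uploaded file straight to disk, e.g. (a sketch only; the UploadRecordFile DTO, response type, and destination folder are hypothetical):

```csharp
// Sketch: stream uploaded files to disk without holding them in memory.
// UploadRecordFile / UploadRecordFileResponse and the upload path are hypothetical.
public class RecordFileService : Service
{
    public object Post(UploadRecordFile request)
    {
        foreach (var uploadedFile in Request.Files)
        {
            var destPath = Path.Combine(@"C:\uploads", uploadedFile.FileName);
            using (var dest = File.Create(destPath))
            {
                // CopyTo reads/writes in small buffered chunks,
                // so a 1 GB+ file is never fully resident in memory.
                uploadedFile.InputStream.CopyTo(dest);
            }
        }
        return new UploadRecordFileResponse();
    }
}
```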
No, but chunking is what you’d need to implement to support uploading large files. IMHO 1GB is too big to upload to an API endpoint. I’d look at implementing manual chunking, or if possible post a URL and have your API download the file, where it’ll have better control over chunking and file stream management.
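Manual chunking on the client could look something like this (a sketch, not a ServiceStack API; the /chunked-upload endpoint, chunk size, and header names are all hypothetical and the server would need matching reassembly logic):

```csharp
using System;
using System.IO;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch of manual client-side chunking; endpoint and headers are hypothetical.
public static class ChunkedUploader
{
    const int ChunkSize = 8 * 1024 * 1024; // 8 MB per request

    public static async Task UploadAsync(string path, string baseUrl)
    {
        using var http = new HttpClient();
        using var file = File.OpenRead(path);
        var buffer = new byte[ChunkSize];
        long offset = 0;
        int read;
        while ((read = await file.ReadAsync(buffer, 0, ChunkSize)) > 0)
        {
            using var content = new ByteArrayContent(buffer, 0, read);
            // The server uses these headers to reassemble the file in order.
            content.Headers.Add("X-Upload-Offset", offset.ToString());
            content.Headers.Add("X-Upload-Total", file.Length.ToString());
            var response = await http.PostAsync(baseUrl + "/chunked-upload", content);
            response.EnsureSuccessStatusCode();
            offset += read;
        }
    }
}
```

A failed chunk can simply be retried from its recorded offset, which is what makes this approach resumable.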
Otherwise consider alternative solutions like using scp/rsync/ftp to upload large files to a public drop box. Or, if it’s easier, clients could upload files to a public S3 bucket and post the path of the uploaded file to your API, which downloads it.
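The server-side download in that approach is a straightforward streamed copy, e.g. (illustrative names; not ServiceStack-specific):

```csharp
// Sketch: the API downloads a client-supplied URL itself, streaming to disk.
public static async Task DownloadToAsync(string url, string destPath)
{
    using var http = new HttpClient();
    // ResponseHeadersRead avoids buffering the whole body before reading
    using var response = await http.GetAsync(url, HttpCompletionOption.ResponseHeadersRead);
    response.EnsureSuccessStatusCode();
    using var source = await response.Content.ReadAsStreamAsync();
    using var dest = File.Create(destPath);
    await source.CopyToAsync(dest); // streamed copy; no full in-memory buffer
}
```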
Does ServiceStack have any built-in support for resumable upload protocols such as TUS (for chunked file uploads)? If not, could this be raised as an Enhancement Request for consideration in future releases to better support large file uploads?