Angular upload multiple, large files inside associated POST request

I’ve read a few posts on the best way of doing this, but most seem a bit old, and I’m wondering what the latest best practice is for what we’d like to achieve…

We have an Angular 11 app from which we need to be able to POST the creation of a new object (created in a database), which may or may not have some files associated with it (also stored in the database and related to the object).

public class MyUploadFile
{
    public int FileType { get; set; }
    public UploadFile File { get; set; }
    public string MD5Hash { get; set; }
}

public class MyObject
{
    public string description { get; set; }
    public int ObjectType { get; set; }
    public List<MyUploadFile> files { get; set; }
}

I’d like to be able to POST a MyUploadFile as JSON via Angular’s HttpClient, but I suspect that it’s not possible. Is using protobuf a better option?

Can someone please advise on the latest recommended approach for sending multiple files as part of a request object from Angular?

Using Angular is irrelevant; the way to upload multiple files from a web page is the multipart/form-data Content-Type, an HTML standard natively supported by browsers, which your ServiceStack API can access from the Request.Files property in your ServiceStack Service, e.g.:

public object Post(MyFileUpload request)
{
    if (this.Request.Files.Length > 0)
    {
        var uploadedFile = base.Request.Files[0];
        uploadedFile.SaveTo(MyUploadsDirPath.CombineWith(uploadedFile.FileName));
    }
    return HttpResult.Redirect("/");
}

The way to send it with Ajax is to use the FormData object; here’s an example of using it from Angular:
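
A minimal sketch of the client side (the service name, endpoint URL, and form field names below are assumptions for illustration):

    // upload.service.ts
    import { Injectable } from '@angular/core';
    import { HttpClient } from '@angular/common/http';
    import { Observable } from 'rxjs';

    @Injectable({ providedIn: 'root' })
    export class UploadService {
      constructor(private http: HttpClient) {}

      // Append each File to a FormData; the browser sets the
      // multipart/form-data Content-Type and boundary automatically,
      // so don't set a Content-Type header yourself.
      upload(files: File[], description: string): Observable<object> {
        const formData = new FormData();
        formData.append('Description', description);
        for (const file of files) {
          formData.append('files', file, file.name);
        }
        return this.http.post('/myobject', formData);
      }
    }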

Thanks @mythz.

I’ve got a working version of what you’ve suggested, but I want to allow multiple files to be uploaded and to do a checksum comparison before and after upload, so I want to supply an array of objects with the POST.

I’ve come to understand this isn’t possible, so I’m now looking at reading and converting the files into byte arrays to achieve what I want.

Is this a suitable solution or is there a reason why this isn’t recommended?

The files should very rarely be more than 100 MB in size, with tens of KB being the norm. It would be very rare for more than 5 files to be uploaded at once.

You should be able to send checksums via a different property on your Request DTO, e.g. ?checksums=chk1,chk2,...
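
For example, extending the UploadService sketched earlier (the endpoint is an assumption; computing the MD5 checksums client side, e.g. with a library such as spark-md5, is assumed to happen elsewhere):

    // Send the files as multipart/form-data and the checksums via a
    // separate Request DTO property, i.e. ?checksums=chk1,chk2,...
    uploadWithChecksums(files: File[], checksums: string[]): Observable<object> {
      const formData = new FormData();
      for (const file of files) {
        formData.append('files', file, file.name);
      }
      const qs = encodeURIComponent(checksums.join(','));
      return this.http.post(`/myobject?checksums=${qs}`, formData);
    }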

multipart/form-data is recommended for HTTP File Uploads since it’s the standard for file uploads, for which there are multiple interoperable client/server implementations. It’s also going to be more efficient since it doesn’t require de/serialization into a data format, i.e. the raw file contents can be included in the HTTP Request as-is.

Ultimately you can choose whatever you prefer, but this is the standard for sending multiple file uploads, natively supported by HTTP clients and servers, so it’s going to be the recommended option.

I’ll also add that it may be more ergonomic to send files using your API’s standard data format; e.g., the GitHub Gists API uploads files when creating/updating Gists using JSON payloads:
https://docs.github.com/en/rest/reference/gists#create-a-gist

This does mean that binary files would need to be text-encoded, e.g. in Base64.
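
A sketch of what that looks like client side (property names are illustrative; note that Base64 inflates the payload by roughly a third):

    // Read a File and resolve to its raw Base64 content, stripping the
    // "data:<mime>;base64," prefix that readAsDataURL prepends.
    function toBase64(file: File): Promise<string> {
      return new Promise<string>((resolve, reject) => {
        const reader = new FileReader();
        reader.onload = () => {
          const dataUrl = reader.result as string;
          resolve(dataUrl.substring(dataUrl.indexOf(',') + 1));
        };
        reader.onerror = () => reject(reader.error);
        reader.readAsDataURL(file);
      });
    }

    // Build a JSON payload with each file's contents embedded as Base64
    async function buildPayload(files: File[], description: string) {
      return {
        description,
        files: await Promise.all(files.map(async f => ({
          fileName: f.name,
          content: await toBase64(f),
        }))),
      };
    }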

Thanks for taking the time to explain things. I’ve coded up the Request DTO properties as suggested and it’s working OK.

In testing I find that when uploading multiple large files (20 files, 2.6 GB total) I get an IOException from RecyclableMemoryStream.Write(): “Maximum capacity exceeded”.

    public object Post(PostActivity request)
    {
        var receivedFiles = new List<PostDocument>();

        for (int i = 0; i < Request.Files.Length; i++)
        {
            var uploadedFile = Request.Files[i];
            if (uploadedFile.ContentLength > 0)
            {
                // Save each upload under a unique temp file name to avoid collisions
                var saveToPath = Path.GetTempPath().CombineWith(string.Concat(Guid.NewGuid(), '.', uploadedFile.FileName));
                receivedFiles.Add(new PostDocument
                {
                    // Per-file metadata arrives as parallel lists on the Request DTO
                    Title = request.DocTitleList[i],
                    Description = request.DocDescriptionList?[i],
                    TypeId = request.DocTypeIdList[i],
                    CategoryId = request.DocCategoryIdList?[i],
                    Md5 = request.DocMd5List[i],
                    SavedFile = saveToPath
                });
                uploadedFile.SaveTo(saveToPath);
            }
        }

        return new {
            text = "hi"
        };
    }

Is there anything I can do about this?

As .NET is essentially performing the reads/writes from the Request Stream, there’s not much that can be done to make this more efficient other than using async APIs, which I’ve just added in this commit.

This change is available from the latest v5.13.3+ that’s now available on MyGet.

Can you also post the full stack trace, as that may provide some insight into which limit you’ve exceeded.

Thanks, I’ll give that a try

Here’s the stack trace.

at ServiceStack.Text.RecyclableMemoryStream.Write(Byte[] buffer, Int32 offset, Int32 count)
at System.IO.Stream.InternalCopyTo(Stream destination, Int32 bufferSize)
at System.IO.Stream.CopyTo(Stream destination)
at ServiceStack.Host.HttpListener.ListenerRequest.LoadMultiPart()
at ServiceStack.Host.HttpListener.ListenerRequest.get_Form()
at ServiceStack.Host.HttpListener.ListenerRequest.get_FormData()
at ServiceStack.HttpRequestExtensions.GetFlattenedRequestParams(IRequest request)
at ServiceStack.Host.RestHandler.<CreateRequestAsync>d__15.MoveNext()
--- End of stack trace from previous location where exception was thrown ---
at System.Runtime.CompilerServices.TaskAwaiter.ThrowForNonSuccess(Task task)
at System.Runtime.CompilerServices.TaskAwaiter.HandleNonSuccessAndDebuggerNotification(Task task)
at ServiceStack.Host.RestHandler.<ProcessRequestAsync>d__14.MoveNext()

Well, in this case, moving to the more efficient .NET 6 runtime will also improve resource usage.

Understood, thanks for your help