Uploading files is a very common requirement in web applications; however, Blazor does not currently have a native file upload component. Steve Sanderson blogged about this a few weeks ago and even posted a sample project on GitHub. Oqtane needs to upload module and theme packages for installation at run-time, so we had also implemented our own solution some time ago - which, I was happy to discover, closely aligns with the approach Steve suggested.
The main challenge in the Oqtane implementation of the file upload control was that it needed to work equally well in both the client and server hosting models. In addition, it could not be limited in any way by the size of the file being uploaded. The approach we took was to leverage the HTML5 File API ( which is now ubiquitous in modern browsers ) to initiate the upload, break the file into chunks which can be streamed to the server, and then merge the chunks back together.
The technical solution involves multiple aspects including a Razor component, JavaScript, JSInterop, and a C# Controller on the server.
First, let's look at the Razor component:
@using Microsoft.AspNetCore.Components.Web
@namespace Oqtane.Modules.Controls

@if (multiple)
{
    <input type="file" id="@fileid" name="file" accept="@filter" multiple />
}
else
{
    <input type="file" id="@fileid" name="file" accept="@filter" />
}
<span id="@progressinfoid"></span> <progress id="@progressbarid" style="visibility: hidden;"></progress>

@code {
    [Parameter]
    public string Name { get; set; } // optional - can be used for managing multiple file upload controls on a page

    [Parameter]
    public string Filter { get; set; } // optional - for restricting types of files that can be selected

    [Parameter]
    public string Multiple { get; set; } // optional - enable multiple file uploads

    string fileid = "";
    string progressinfoid = "";
    string progressbarid = "";
    string filter = "*";
    bool multiple = false;

    protected override void OnInitialized()
    {
        fileid = Name + "FileInput";
        progressinfoid = Name + "ProgressInfo";
        progressbarid = Name + "ProgressBar";

        if (!string.IsNullOrEmpty(Filter))
        {
            filter = Filter;
        }
        if (!string.IsNullOrEmpty(Multiple))
        {
            multiple = bool.Parse(Multiple);
        }
    }
}
The component accepts a number of configurable parameters, including support for single or multiple file uploads and the ability to restrict the files which can be uploaded based on their file extension. The component renders an HTML5 file input element as well as an HTML5 progress element.
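As a quick illustration ( this declaration is hypothetical and not taken from Oqtane itself ), the optional parameters can be combined to allow several package files to be selected at once:

<FileUpload Name="Package" Filter=".nupkg,.zip" Multiple="true"></FileUpload>

The Name value is also what the JavaScript function shown later uses to locate the corresponding file input and progress elements.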
If we look at the Add Module component, we can see how it is utilized:
<FileUpload Filter=".nupkg"></FileUpload>
<button type="button" class="btn btn-success" @onclick="UploadFile">Upload</button>
...
private async Task UploadFile()
{
    await FileService.UploadFilesAsync("Modules");
}
The file upload component allows a user to select a file, and the Upload button initiates the actual upload to the server. These operations were separated from one another because a file upload is generally not a stand-alone operation; it is usually part of a larger process which needs to be performed as an atomic unit of work.
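The glue between the Razor component and the JavaScript is JSInterop. A minimal sketch of what the file service could look like is shown below ( the class name, the endpoint URL, and the "interop" object name are assumptions for illustration rather than the actual Oqtane implementation ):

using System.Threading.Tasks;
using Microsoft.JSInterop;

public class FileService
{
    private readonly IJSRuntime jsRuntime;

    public FileService(IJSRuntime jsRuntime)
    {
        this.jsRuntime = jsRuntime;
    }

    public Task UploadFilesAsync(string folder, string name = "")
    {
        // invoke the uploadFiles JavaScript function shown below, passing the server
        // endpoint, the destination folder, and the optional control name
        // ( the endpoint and the global object name are assumed for illustration )
        return jsRuntime.InvokeVoidAsync("interop.uploadFiles", "/api/file/upload", folder, name).AsTask();
    }
}

Because the upload itself is performed by the browser, the same service works whether the application is running under the client-side or server-side hosting model.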
Now, let's look at the JavaScript function:
uploadFiles: function (posturl, folder, name) {
    var files = document.getElementById(name + 'FileInput').files;
    var progressinfo = document.getElementById(name + 'ProgressInfo');
    var progressbar = document.getElementById(name + 'ProgressBar');
    var filename = '';
    for (var i = 0; i < files.length; i++) {
        var FileChunk = [];
        var file = files[i];
        var MaxFileSizeMB = 1;
        var BufferChunkSize = MaxFileSizeMB * (1024 * 1024);
        var FileStreamPos = 0;
        var EndPos = BufferChunkSize;
        var Size = file.size;
        progressbar.setAttribute("style", "visibility: visible;");
        if (files.length > 1) {
            filename = file.name;
        }
        // slice the file into chunks of BufferChunkSize bytes
        while (FileStreamPos < Size) {
            FileChunk.push(file.slice(FileStreamPos, EndPos));
            FileStreamPos = EndPos;
            EndPos = FileStreamPos + BufferChunkSize;
        }
        var TotalParts = FileChunk.length;
        var PartCount = 0;
        var Chunk;
        while ((Chunk = FileChunk.shift())) {
            PartCount++;
            // block-scoped copy so that each progress handler reports the correct part number
            let CurrentPart = PartCount;
            // the naming convention filename.ext.part_x_y allows the server to merge the chunks
            var FileName = file.name + ".part_" + CurrentPart + "_" + TotalParts;
            var data = new FormData();
            data.append('folder', folder);
            data.append('file', Chunk, FileName);
            var request = new XMLHttpRequest();
            request.open('POST', posturl, true);
            request.upload.onloadstart = function (e) {
                progressbar.value = 0;
                progressinfo.innerHTML = filename + ' 0%';
            };
            request.upload.onprogress = function (e) {
                var percent = Math.ceil((e.loaded / e.total) * 100);
                progressbar.value = (percent / 100);
                progressinfo.innerHTML = filename + ' [' + CurrentPart + '] ' + percent + '%';
            };
            request.upload.onloadend = function (e) {
                progressbar.value = 1;
                progressinfo.innerHTML = filename + ' 100%';
            };
            request.send(data);
        }
    }
}
This function splits an uploaded file into chunks within the client browser. It uses a naming convention for the chunks ( for example, Module.nupkg.part_2_3 identifies the second of three chunks of Module.nupkg ) which allows the server to determine when an entire file has been uploaded. It sends the chunks to a server endpoint and updates a progress indicator so that the user is aware of the upload status.
The last aspect of the file upload is the server controller:
[HttpPost("upload")]
public async Task UploadFile(string folder, IFormFile file)
{
    if (file.Length > 0)
    {
        if (!folder.Contains(":\\"))
        {
            folder = folder.Replace("/", "\\");
            if (folder.StartsWith("\\")) folder = folder.Substring(1);
            folder = Path.Combine(environment.WebRootPath, folder);
        }
        if (!Directory.Exists(folder))
        {
            Directory.CreateDirectory(folder);
        }
        using (var stream = new FileStream(Path.Combine(folder, file.FileName), FileMode.Create))
        {
            await file.CopyToAsync(stream);
        }
        await MergeFile(folder, file.FileName);
    }
}
The public POST method accepts the uploaded file chunks and saves them to the file system. It then calls MergeFile which deals with combining the chunks back together.
private async Task MergeFile(string folder, string filename)
{
    // parse the filename which is in the format of filename.ext.part_x_y
    string token = ".part_";
    string parts = Path.GetExtension(filename).Replace(token, ""); // returns "x_y"
    int totalparts = int.Parse(parts.Substring(parts.IndexOf("_") + 1));
    filename = filename.Substring(0, filename.IndexOf(token)); // base filename
    string[] fileparts = Directory.GetFiles(folder, filename + token + "*"); // list of all file parts

    // if all of the file parts exist ( note that file parts can arrive out of order )
    if (fileparts.Length == totalparts)
    {
        // order the file parts numerically by part number ( a plain alphabetical ordering would place part_10 before part_2 )
        fileparts = fileparts.OrderBy(part => int.Parse(part.Substring(part.LastIndexOf(token) + token.Length).Split('_')[0])).ToArray();

        // merge file parts
        bool success = true;
        using (var stream = new FileStream(Path.Combine(folder, filename), FileMode.Create))
        {
            foreach (string filepart in fileparts)
            {
                try
                {
                    using (FileStream chunk = new FileStream(filepart, FileMode.Open))
                    {
                        await chunk.CopyToAsync(stream);
                    }
                }
                catch
                {
                    success = false;
                }
            }
        }

        // delete file parts
        if (success)
        {
            foreach (string filepart in fileparts)
            {
                System.IO.File.Delete(filepart);
            }
        }
    }

    // clean up file parts which are more than 2 hours old ( which can happen if a file upload failed )
    fileparts = Directory.GetFiles(folder, "*" + token + "*");
    foreach (string filepart in fileparts)
    {
        DateTime createddate = System.IO.File.GetCreationTime(filepart);
        if (createddate < DateTime.Now.AddHours(-2))
        {
            System.IO.File.Delete(filepart);
        }
    }
}
The above code relies on the file naming convention established in the JavaScript function to determine whether all chunks have been received for a file ( note that it is possible for chunks to be delivered out of order due to network latency ). If the method determines that all chunks have been received, it merges them into a single file and deletes the individual chunks. It also contains logic to clean up any chunks left behind by previous failed upload attempts.
This approach provides a workable solution for uploading large files in both client-side and server-side Blazor.