I am trying to send a file to a server over a REST API. The file could potentially be of any type, though it can be limited in size and type to things that can be sent as email attachments.
I think my approach will be to send the file as a binary stream, and then save that back into a file when it arrives at the server. Is there a built-in way to do this in .NET, or will I need to manually turn the file contents into a data stream and send that?
For clarity, I have control over both the client and server code, so I am not restricted to any particular approach.
I'd recommend you look into RestSharp
http://restsharp.org/
The RestSharp library has methods for posting files to a REST service (RestRequest.AddFile()). I believe on the server side this translates into an encoded string in the request body, with the content type in the header specifying the file type.
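For illustration, a minimal client-side sketch of that call might look like this (assuming an older, v106-style RestSharp API where Method.POST and the synchronous Execute exist; the URL, resource name, and file path are placeholders):

var client = new RestClient("http://example.com/api");
var request = new RestRequest("upload", Method.POST);
// AddFile reads the file from disk and adds it as a multipart/form-data section.
request.AddFile("file", @"C:\temp\report.pdf");
IRestResponse response = client.Execute(request);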
I've also seen it done by converting the stream to a base-64 string and transferring that as one of the properties of the serialized JSON/XML object. This works really well if you can set size limits and want to include file metadata in the request as part of the same object.
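As a rough sketch of that approach (the DTO shape and property names here are only an illustration, not an established contract):

public class FileUploadDto
{
    public string FileName { get; set; }
    public string ContentType { get; set; }
    public string Base64Content { get; set; }
}

var dto = new FileUploadDto
{
    FileName = "report.pdf",
    ContentType = "application/pdf",
    // Convert.ToBase64String turns the raw bytes into a JSON-safe string.
    Base64Content = Convert.ToBase64String(File.ReadAllBytes(@"C:\temp\report.pdf"))
};
// On the server, Convert.FromBase64String(dto.Base64Content) recovers the original bytes.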
It really depends how large your files are, though. If they are very large, you need to consider streaming, the nuances of which are covered pretty thoroughly in this SO post: How do streaming resources fit within the RESTful paradigm?
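If you do go the streaming route with HttpClient, a minimal sketch might look like this (the endpoint URL and file path are placeholders; StreamContent avoids buffering the whole file in memory):

using (var client = new HttpClient())
using (var fileStream = File.OpenRead(@"C:\temp\large-file.bin"))
{
    var content = new StreamContent(fileStream);
    content.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
    HttpResponseMessage response = await client.PostAsync("http://example.com/api/upload", content);
}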
You could send it as a POST request to the server, passing the file as a FormParam.

@POST
@Path("/upload")
//@Consumes(MediaType.MULTIPART_FORM_DATA)
@Consumes("application/x-www-form-urlencoded")
public Response uploadFile(@FormParam("uploadFile") String script, @HeaderParam("X-Auth-Token") String STtoken, @Context HttpHeaders hh) {
    // local variables
    String uploadFilePath = null;
    InputStream fileInputStream = new ByteArrayInputStream(script.getBytes(StandardCharsets.UTF_8));
    //System.out.println(script); //debugging
    try {
        uploadFilePath = writeToFileServer(fileInputStream, SCRIPT_FILENAME);
    }
    catch (IOException ioe) {
        ioe.printStackTrace();
    }
    return Response.ok("File successfully uploaded at " + uploadFilePath + "\n").build();
}

private String writeToFileServer(InputStream inputStream, String fileName) throws IOException {
    OutputStream outputStream = null;
    String qualifiedUploadFilePath = SIMULATION_RESULTS_PATH + fileName;
    try {
        outputStream = new FileOutputStream(new File(qualifiedUploadFilePath));
        int read = 0;
        byte[] bytes = new byte[1024];
        while ((read = inputStream.read(bytes)) != -1) {
            outputStream.write(bytes, 0, read);
        }
        outputStream.flush();
    }
    catch (IOException ioe) {
        ioe.printStackTrace();
    }
    finally {
        // release the resource, if it was opened
        if (outputStream != null) {
            outputStream.close();
        }
    }
    return qualifiedUploadFilePath;
}
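Since the question is about .NET, a client call to an endpoint like this could be sketched with FormUrlEncodedContent. This only works because the file is sent as a text form field here; the URL, file path, and token variable are placeholders:

using (var client = new HttpClient())
{
    client.DefaultRequestHeaders.Add("X-Auth-Token", token);
    var form = new FormUrlEncodedContent(new[]
    {
        // The whole script is sent as the value of the "uploadFile" form field.
        new KeyValuePair<string, string>("uploadFile", File.ReadAllText(@"C:\temp\script.txt"))
    });
    HttpResponseMessage response = await client.PostAsync("http://example.com/rest/upload", form);
}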
Building on @MutantNinjaCodeMonkey's suggestion of RestSharp. My use case was posting webform data from jQuery's $.ajax method into a Web API controller. The RESTful API service required the uploaded file to be added to the request body. The default RestSharp AddFile method mentioned above caused the error "The request was aborted: The request was canceled." The following initialization worked:
// Stream comes from Web API's HttpPostedFile.InputStream
// (HttpContext.Current.Request.Files["fileUploadNameFromAjaxData"].InputStream)
byte[] photoBytes;
using (var ms = new MemoryStream())
{
    fileUploadStream.CopyTo(ms);
    photoBytes = ms.ToArray();
}
var request = new RestRequest(Method.PUT)
{
    AlwaysMultipartFormData = true,
    Files = { FileParameter.Create("file", photoBytes, "file") }
};
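For completeness, the request still needs a client to execute it; something along these lines (the resource URL is a placeholder, and the API shown is the older synchronous one):

var client = new RestClient("http://example.com/api/photos");
IRestResponse response = client.Execute(request);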
1. Detect the file(s) being transported with the request.
2. Decide on a path where the file will be uploaded (and make sure the directory is writable, e.g. chmod 777).
3. Accept the client connection.
4. Use a ready-made library for the actual upload (see the server-side sketch below).
Also review the following discussion:
REST file upload with HttpRequestMessage or Stream?
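For the "ready-made library" step, a server-side sketch in ASP.NET Web API 2 could look like this (the upload folder is an assumption; MultipartFormDataStreamProvider writes each uploaded part to disk for you):

[HttpPost]
public async Task<IHttpActionResult> Upload()
{
    if (!Request.Content.IsMimeMultipartContent())
        return StatusCode(HttpStatusCode.UnsupportedMediaType);

    // Each file part of the multipart request is streamed into this folder.
    var root = HttpContext.Current.Server.MapPath("~/App_Data/uploads");
    var provider = new MultipartFormDataStreamProvider(root);
    await Request.Content.ReadAsMultipartAsync(provider);

    return Ok(provider.FileData.Select(f => f.LocalFileName));
}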
First, you should log in to the server and get an access token.
Next, convert your file to a stream and post the stream:
private void UploadFile(FileStream stream, string fileName)
{
    string apiUrl = "http://example.com/api";
    var formContent = new MultipartFormDataContent
    {
        { new StringContent(fileName), "FileName" },
        { new StreamContent(stream), "formFile", fileName },
    };
    using HttpClient httpClient = new HttpClient();
    // accessToken is the token obtained from the login step
    httpClient.DefaultRequestHeaders.Add("Authorization", accessToken);
    var response = httpClient.PostAsync($"{apiUrl}/FileUpload/save", formContent);
    var result = response.Result.Content.ReadAsStringAsync().Result;
}
In this example, we upload the file to http://example.com/api/FileUpload/save, and the FileUpload controller has the following action:
[HttpPost("Save")]
public ActionResult Save([FromForm] FileContent fileContent)
{
// ...
}
public class FileContent
{
public string FileName { get; set; }
public IFormFile formFile { get; set; }
}
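For illustration, the elided body of Save might simply copy the posted file to disk. A minimal sketch, assuming the file should be written to an "uploads" folder (the path and return value are my own illustration):

[HttpPost("Save")]
public async Task<ActionResult> Save([FromForm] FileContent fileContent)
{
    var path = Path.Combine("uploads", fileContent.FileName);
    using (var fs = new FileStream(path, FileMode.Create))
    {
        // Copies the uploaded form file straight into the destination stream.
        await fileContent.formFile.CopyToAsync(fs);
    }
    return Ok(path);
}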
For the last two days, I have been trying to create an image upload system for my website. When I try to save an uploaded image in the "wwwroot" of my API, everything goes as planned except that I get an empty image in my folder.
At the backend, I receive the filename I send from the frontend, but the bytes of the image itself are not there. For some reason the data of the stream I put in the POST call is missing, yet I do receive the filename in the form file.
Edit:
To clear things up about my application, I'm working with an ASP.NET MVC frontend and an ASP.NET API backend. I know this isn't how you are supposed to use ASP.NET, but this is a school project and I have to do it like this. Normally I would work with Angular or something else, but that is not an option for me right now.
So, I'm sending data from the ASP.NET MVC frontend to the ASP.NET API backend, and I'm trying to do it by sending it as form data. That means there is no real form being submitted.
This is the guide I tried to use:
https://ilclubdellesei.blog/2018/02/14/how-to-upload-images-to-an-asp-net-core-rest-service-with-xamarin-forms/
Backend
ImageController:
[HttpPost("upload")]
public async Task<IActionResult> UploadImage([FromForm(Name = "file")] IFormFile file)
{
if (file.Length == 0)
return BadRequest("Empty file");
string imageName = file.FileName;
using (var fs = new FileStream("wwwroot/" + imageName, FileMode.Create, FileAccess.Write))
{
await file.CopyToAsync(fs);
}
return Ok();
}
Frontend
Method that uploads 1 image as a MemoryStream to the server
private async Task<string> Upload(Stream image, string name, string contentType)
{
    _httpClient = _clientFactory.CreateClient("ProjectApi");

    HttpContent fileStreamContent = new StreamContent(image);
    fileStreamContent.Headers.ContentDisposition = new System.Net.Http.Headers.ContentDispositionHeaderValue("form-data") { Name = "file", FileName = name };
    fileStreamContent.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue(contentType);

    using (var formData = new MultipartFormDataContent())
    {
        formData.Add(fileStreamContent);
        HttpResponseMessage response = await _httpClient.PostAsync("api/images/upload", formData);
        var input = await response.Content.ReadAsStringAsync();
        return input;
    }
}
The content doesn't seem to be empty on the client side.
The filename has been successfully sent to the API, but the bytes of the image have not.
After uploading some images without checking the size of the form file, the saved files in the folder are empty.
I am not 100% sure, but I suspect the reason you get an empty file is that you did not specify what content type your API endpoint consumes, and possibly the form's enctype and method attributes. My suggestion is to decorate the API's upload action as below:
[Consumes("multipart/form-data")]
public async Task<IActionResult> UploadImage([FromForm(Name = "file")] IFormFile file)
And in case you forgot to add the form attributes to your HTML, set them as follows: <form method="post" enctype="multipart/form-data">. Hope this solves your problem.
I'm trying my hand at .NET Core but I'm stuck trying to convert multipart/form-data to an application/octet-stream to send via a PUT request. Anybody have any expertise I could borrow?
[HttpPost("fooBar"), ActionName("FooBar")]
public async Task<IActionResult> PostFooBar() {
HttpResponseMessage putResponse = await _httpClient.PutAsync(url, HttpContext.Request.Body);
}
Update: I think I might have two issues here:
1. My input format is multipart/form-data, so I need to split the file out from the form data.
2. My output format must be application/octet-stream, but PutAsync expects HttpContent.
I had been trying to do something similar and was having issues. I needed to PUT large files (>1.5 GB) to a bucket on Amazon S3 using a pre-signed URL. The Amazon implementation for .NET would fail for large files.
Here was my solution:
static HttpClient client = new HttpClient
{
    Timeout = TimeSpan.FromMinutes(60)
};

static async Task<bool> UploadLargeObjectAsync(string presignedUrl, string file)
{
    Console.WriteLine("Uploading " + file + " to bucket...");
    try
    {
        StreamContent strm = new StreamContent(new FileStream(file, FileMode.Open, FileAccess.Read));
        strm.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");
        HttpResponseMessage putRespMsg = await client.PutAsync(presignedUrl, strm);
    }
    catch (Exception e)
    {
        Console.WriteLine(e.Message);
        return false;
    }
    return true;
}
Turns out Request has a Form property that contains a Files property that has an OpenReadStream() function on it to convert it into a stream. How exactly I was supposed to know that, I'm not sure.
Either way, here's the solution:
StreamContent stream = new StreamContent(HttpContext.Request.Form.Files[0].OpenReadStream());
HttpResponseMessage putResponse = await _httpClient.PutAsync(url, stream);
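If the receiving endpoint insists on application/octet-stream, the content type can also be set explicitly on the StreamContent before the PUT (whether this is required depends on the target API):

stream.Headers.ContentType = new System.Net.Http.Headers.MediaTypeHeaderValue("application/octet-stream");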
I'm looking to use the IRequiresRequestStream interface to enable large file uploads (video files) using ServiceStack (v3) and chunked transfer encoding. The standard file upload can't seem to cope with some of the larger video files our customers are uploading, so we are looking to enable chunked transfer encoding for these files.
I have successfully tested the chunked transfer encoded file upload, but there are a number of parameters that also need to be sent across with the file.
Since IRequiresRequestStream bypasses the ServiceStack request object parser, any other parameters in the request object alongside the Stream are obviously not populated. As a workaround I can see the following options:
1. Query string parameters, accessible via the this.Request.QueryString collection
2. Custom header parameters, accessible via the this.Request.Headers collection
3. Path, accessible via a RequestBinder??
I've already managed to implement options 1 and 2, but somehow neither feels quite RESTful enough. I'd prefer to use the path -> RequestDTO approach, but I'm struggling with the RequestBinder.
Service:
public object Any(AttachmentStreamRequest request)
{
    byte[] fileBytes = null;
    long length = 0;
    using (var stream = new MemoryStream())
    {
        request.RequestStream.WriteTo(stream);
        length = stream.Length;
        fileBytes = stream.ToArray();
    }

    string filePath = @"D:\temp\test.dat";
    File.WriteAllBytes(filePath, fileBytes);

    var hash = CalculateMd5(filePath);
    var requestHash = this.Request.QueryString["Hash"];
    var customerId = this.Request.QueryString["CustomerId"];
    var fileName = this.Request.QueryString["FileName"];

    // nicer would be
    // var requestHash = request.Hash;
    // var customerId = request.CustomerId;

    // save file....
    // return response
    return requestHash == hash
        ? new HttpResult("File Valid", HttpStatusCode.OK)
        : new HttpResult("Invalid Hash", HttpStatusCode.NotAcceptable);
}
Request:
[Route("/upload/{CustomerId}/{Hash}", "POST", Summary = #"POST Upload attachments for a customer", Notes = "Upload customer attachments")]
public class AttachmentStreamRequest : IRequiresRequestStream
{
// body
public Stream RequestStream { get; set; }
// path
public int CustomerId { get; set; }
// query
public string FileName { get; set; }
// query
public string Comment { get; set; }
// query
public Guid? ExternalId { get; set; }
// path
public string Hash { get; set; }
}
WebClient:
private static async Task<string> SendUsingWebClient(byte[] file, string hash, int customerId)
{
    var client = (HttpWebRequest)WebRequest.Create(string.Format("http://localhost.fiddler:58224/upload/{0}/{1}", customerId, hash));
    client.Method = WebRequestMethods.Http.Post;
    client.Headers.Add("Cookie", "ss-pid=XXXXXXXXXXX; ss-id=YYYYYYYYYY");

    // the following 4 rows enable streaming
    client.AllowWriteStreamBuffering = false;
    client.SendChunked = true;
    client.ContentType = "application/json";
    client.Timeout = int.MaxValue;

    using (var fileStream = new MemoryStream(file))
    {
        fileStream.CopyTo(client.GetRequestStream());
    }

    return new StreamReader(client.GetResponse().GetResponseStream()).ReadToEnd();
}
I'm guessing the simple direction to take is something along the following lines, but it seems like a kludge.
RequestBinders.Add(typeof(AttachmentStreamRequest), httpReq =>
{
    var dto = new AttachmentStreamRequest();
    var segments = httpReq.PathInfo.Split(new[] { '/' }, StringSplitOptions.RemoveEmptyEntries);
    dto.CustomerId = int.Parse(segments[1]);
    dto.Hash = segments[2];
    // Stream copy to dto.RequestStream and other params etc....
    return dto;
});
I've done a bit of Googling for examples of RequestBinders in this scenario. I'm sure there must be inbuilt ServiceStack methods for parsing the Path, but I'm struggling with it. Does anyone have an example they would like to share?
Recently I also investigated using chunked transfer with custom headers. Unfortunately, I found that it's not supported out of the box by the HttpWebRequest class, nor by the .NET Framework in general. The only solution that worked for me was to implement chunked-transfer HTTP communication over TCP. It's not as complex as it sounds at first: you just need to open a TCP client connection, format the headers as needed, split your stream into chunks, and send them.
Here is the definition of Chunked Transfer protocol:
https://developer.mozilla.org/en-US/docs/Web/HTTP/Headers/Transfer-Encoding
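To give an idea of what that looks like, here is a rough sketch over a raw TcpClient (plain HTTP only; the host, path, and file are placeholders, and real code would also need response parsing and error handling):

using (var tcp = new TcpClient("example.com", 80))
using (var net = tcp.GetStream())
using (var input = File.OpenRead(@"C:\temp\video.mp4"))
{
    // Request line and headers; Transfer-Encoding: chunked means no Content-Length.
    var header = "POST /upload HTTP/1.1\r\n" +
                 "Host: example.com\r\n" +
                 "Transfer-Encoding: chunked\r\n" +
                 "Content-Type: application/octet-stream\r\n\r\n";
    var headerBytes = Encoding.ASCII.GetBytes(header);
    net.Write(headerBytes, 0, headerBytes.Length);

    var crlf = Encoding.ASCII.GetBytes("\r\n");
    var buffer = new byte[64 * 1024];
    int read;
    while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
    {
        // Each chunk: its size in hex, CRLF, the raw bytes, CRLF.
        var sizeLine = Encoding.ASCII.GetBytes(read.ToString("X") + "\r\n");
        net.Write(sizeLine, 0, sizeLine.Length);
        net.Write(buffer, 0, read);
        net.Write(crlf, 0, crlf.Length);
    }

    // A zero-length chunk terminates the body.
    var terminator = Encoding.ASCII.GetBytes("0\r\n\r\n");
    net.Write(terminator, 0, terminator.Length);
}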
I have an Azure blob container for storing images. I also have a suite of ASP.NET Web API methods for adding / deleting / listing the blobs in this container. This all works if I upload the images as files. But I now want to upload the images as a stream and am getting an error.
public async Task<HttpResponseMessage> AddImageStream(Stream filestream, string filename)
{
    try
    {
        if (string.IsNullOrEmpty(filename))
        {
            throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.BadRequest));
        }

        BlobStorageService service = new BlobStorageService();
        await service.UploadFileStream(filestream, filename, "image/png");

        var response = Request.CreateResponse(HttpStatusCode.OK);
        return response;
    }
    catch (Exception ex)
    {
        base.LogException(ex);
        throw new HttpResponseException(Request.CreateResponse(HttpStatusCode.BadRequest));
    }
}
The code for adding a new image to the blob container as a stream looks like this.
public async Task UploadFileStream(Stream filestream, string filename, string contentType)
{
    CloudBlockBlob blockBlobImage = this._container.GetBlockBlobReference(filename);
    blockBlobImage.Properties.ContentType = contentType;
    blockBlobImage.Metadata.Add("DateCreated", DateTime.UtcNow.ToLongDateString());
    blockBlobImage.Metadata.Add("TimeCreated", DateTime.UtcNow.ToLongTimeString());
    await blockBlobImage.UploadFromStreamAsync(filestream);
}
And finally here's my unit test that is failing.
[TestMethod]
public async Task DeployedImageStreamTests()
{
    string blobname = Guid.NewGuid().ToString();

    //Arrange
    MemoryStream stream = new MemoryStream(Encoding.UTF8.GetBytes($"This is a blob called {blobname}."))
    {
        Position = 0
    };

    string url = $"http://mywebapi/api/imagesstream?filestream={stream}&filename={blobname}";
    Console.WriteLine($"DeployedImagesTests URL {url}");

    HttpContent content = new StringContent(blobname, Encoding.UTF8, "application/json");
    var response = await ImagesControllerPostDeploymentTests.PostData(url, content);

    //Assert
    Assert.IsNotNull(response);
    Assert.IsTrue(response.IsSuccessStatusCode); //fails here!!
    Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
}
The error I am getting is "Value cannot be null. Parameter name: source".
Is this the correct way to upload an image stream to Azure blob storage using Web API? I have it working with image files without a problem, and only getting this problem now that I'm trying to upload using streams.
According to your description and error message, it looks like you are sending your stream data in the URL to the Web API.
According to this article:
Web API uses the following rules to bind parameters:
If the parameter is a "simple" type, Web API tries to get the value from the URI. Simple types include the .NET primitive types (int, bool, double, and so forth), plus TimeSpan, DateTime, Guid, decimal, and string, plus any type with a type converter that can convert from a string. (More about type converters later.)
For complex types, Web API tries to read the value from the message body, using a media-type formatter.
In my opinion, the stream is a complex type, so I suggest you post it in the request body to the Web API.
Besides, I suggest you create a file class and use Newtonsoft.Json to convert it to JSON as the message's content.
For more details, refer to the code below.
File class:
public class file
{
    //Since JsonConvert.SerializeObject couldn't serialize the stream object I used byte[] instead
    public byte[] str { get; set; }
    public string filename { get; set; }
    public string contentType { get; set; }
}
Web Api:
[Route("api/serious/updtTM")]
[HttpPost]
public void updtTM([FromBody]file imagefile)
{
CloudStorageAccount storageAccount = CloudStorageAccount.Parse("aaaaa");
var client = storageAccount.CreateCloudBlobClient();
var container = client.GetContainerReference("images");
CloudBlockBlob blockBlobImage = container.GetBlockBlobReference(imagefile.filename);
blockBlobImage.Properties.ContentType = imagefile.contentType;
blockBlobImage.Metadata.Add("DateCreated", DateTime.UtcNow.ToLongDateString());
blockBlobImage.Metadata.Add("TimeCreated", DateTime.UtcNow.ToLongTimeString());
MemoryStream stream = new MemoryStream(imagefile.str)
{
Position=0
};
blockBlobImage.UploadFromStreamAsync(stream);
}
Test Console:
using (var client = new HttpClient())
{
    string URI = string.Format("http://localhost:14456/api/serious/updtTM");

    file f1 = new file();
    byte[] aa = File.ReadAllBytes(@"D:\Capture2.PNG");
    f1.str = aa;
    f1.filename = "Capture2";
    f1.contentType = "PNG";

    var serializedProduct = JsonConvert.SerializeObject(f1);
    var content = new StringContent(serializedProduct, Encoding.UTF8, "application/json");
    var result = client.PostAsync(URI, content).Result;
}
Hi and thanks for looking!
Background
I am using the Rotativa pdf tool to read a view (html) into a PDF. It works great, but it does not natively offer a way to save the PDF to a file system. Rather, it only returns the file to the user's browser as a result of the action.
Here is what that code looks like:
public ActionResult PrintQuote(FormCollection fc)
{
    int revisionId = Int32.Parse(Request.QueryString["RevisionId"]);

    var pdf = new ActionAsPdf(
        "Quote",
        new { revisionId = revisionId })
    {
        FileName = "Quote--" + revisionId.ToString() + ".pdf",
        PageSize = Rotativa.Options.Size.Letter
    };

    return pdf;
}
This code calls another action result ("Quote"), converts its view to a PDF, and then returns the PDF as a file download to the user.
Question
How do I intercept the file stream and save the PDF to my file system? It is perfect that the PDF is sent to the user, but my client also wants the PDF saved to the file system simultaneously.
Any ideas?
Thanks!
Matt
I had the same problem; here's my solution:
You basically need to make an HTTP request to your own URL and save the output as a binary file. Simple: no overhead, helper classes, or bloated code.
You'll need this method:
// Fetches the requested URL and saves the response body to the given file path.
public static void SaveHttpResponseAsFile(string RequestUrl, string FilePath)
{
    try
    {
        HttpWebRequest httpRequest = (HttpWebRequest)WebRequest.Create(RequestUrl);
        httpRequest.UserAgent = "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; Trident/5.0)";
        httpRequest.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");

        HttpWebResponse response = null;
        try
        {
            response = (HttpWebResponse)httpRequest.GetResponse();
        }
        catch (System.Net.WebException ex)
        {
            if (ex.Status == WebExceptionStatus.ProtocolError)
                response = (HttpWebResponse)ex.Response;
        }

        using (Stream responseStream = response.GetResponseStream())
        {
            // Unwrap compressed responses before writing to disk.
            Stream FinalStream = responseStream;
            if (response.ContentEncoding.ToLower().Contains("gzip"))
                FinalStream = new GZipStream(FinalStream, CompressionMode.Decompress);
            else if (response.ContentEncoding.ToLower().Contains("deflate"))
                FinalStream = new DeflateStream(FinalStream, CompressionMode.Decompress);

            using (var fileStream = System.IO.File.Create(FilePath))
            {
                FinalStream.CopyTo(fileStream);
            }
            response.Close();
            FinalStream.Close();
        }
    }
    catch
    { }
}
Then inside your controller, you call it like this:
SaveHttpResponseAsFile("http://localhost:52515/Management/ViewPDFInvoice/" + ID.ToString(), "C:\\temp\\test.pdf");
And voilà! The file is there on your file system and you can double-click and open the PDF, or email it to your users, or whatever you need.
return new Rotativa.ActionAsPdf("ConvertIntoPdf")
{
FileName = "Test.pdf", PageSize = Rotativa.Options.Size.Letter
};
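If you also need the bytes on disk, the Rotativa result can be rendered to a byte array before it is returned. A sketch, assuming Rotativa's ActionAsPdf exposes BuildFile(ControllerContext) as in the MVC package (the output path below is a placeholder):

public ActionResult PrintQuote(int revisionId)
{
    var pdf = new Rotativa.ActionAsPdf("Quote", new { revisionId })
    {
        FileName = "Quote--" + revisionId + ".pdf"
    };

    // Render once, then both save and return the same bytes.
    byte[] pdfBytes = pdf.BuildFile(ControllerContext);
    System.IO.File.WriteAllBytes(@"C:\temp\Quote--" + revisionId + ".pdf", pdfBytes);

    return File(pdfBytes, "application/pdf", "Quote--" + revisionId + ".pdf");
}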
Take a look at the MVC pipeline diagram here:
http://www.simple-talk.com/content/file.ashx?file=6068
The method OnResultExecuted() is called after the ActionResult is rendered.
You can override this method or use an ActionFilter to apply an OnResultExecuted interceptor via an attribute.
Edit:
At the end of this forum thread you will find a reply which gives an example of an ActionFilter which reads (and changes) the response stream of an action. You can then copy the stream to a file, in addition to returning it to your client.
I successfully used Aaron's 'SaveHttpResponseAsFile' method, but I had to alter it, as the currently logged-in user's credentials weren't applied (and so it was forwarding to MVC4's login URL).
public static void SaveHttpResponseAsFile(System.Web.HttpRequestBase requestBase, string requestUrl, string saveFilePath)
{
    try
    {
        *snip*
        httpRequest.Headers.Add(HttpRequestHeader.AcceptEncoding, "gzip,deflate");
        httpRequest.Headers.Add(HttpRequestHeader.Cookie, requestBase.Headers["Cookie"]);
        *snip*
Then in your calling Controller method, simply add 'Request' into the SaveHttpResponseAsFile call.
You can also do it using Rotativa, which is actually quite easy.
using Rotativa;
...
byte[] pdfByteArray = Rotativa.WkhtmltopdfDriver.ConvertHtml("Rotativa", "-q", stringHtmlResult);
File.WriteAllBytes(outputPath, pdfByteArray);
I'm using this in a winforms app, to generate and save the PDFs from Razor Views we also use in our web apps.
I was able to get Eric Brown-Cal's solution to work, but I needed a small tweak to prevent an error I was getting about the directory not being found.
(Also, looking at the Rotativa code, it looks like the -q switch is already being passed by default, so that might not be necessary, but I didn't change it.)
var bytes = Rotativa.WkhtmltopdfDriver.ConvertHtml(Server.MapPath(@"/Rotativa"), "-q", html);