I have a controller (ApiController) that is ready to receive XML as a parameter. One of its nodes is a byte[] (document). This service is built in .NET, but another company is going to consume it from a Java platform.
We want to avoid the .NET service becoming blocked by receiving many requests from the Java application (this must be asynchronous), so the Java application can send us multiple requests containing the XML with the byte[] node. I don't know how the Java application must manage this call, but on the .NET side we want to remain available at all times.
I've been reading about the new async and await keywords, and I'm not very sure whether they should be applied to the action on my controller.
My controller currently looks like this:
public class MyController : ApiController
{
    [HttpPost, ActionName("myMethod")]
    public void MyMethod(HttpRequestMessage request)
    {
        XmlDocument doc = new XmlDocument();
        doc.Load(request.Content.ReadAsStreamAsync().Result);
        //do something...
        //call method to transform byte[] to physical file and save it
    }
}
and I've been testing it from another (client) application in .NET, like this:
private async void button1_Click(object sender, EventArgs e)
{
    await TestMethod();
}
public async Task TestMethod()
{
    string xml = "<root> .......";
    var content = new StringContent(xml, Encoding.Unicode);
    var client = new HttpClient();
    HttpResponseMessage result = await client.PostAsync("http://localhost:port/api/myController/myMethod", content);
}
It works fine. I can click the same button many times and it creates my file correctly (with a little physical delay because of the file size), but the client application never gets blocked. However, I'm not sure whether the client must run this asynchronously, or whether the service must be prepared to handle it by itself so that the Java client only has to make a plain HTTP request.
Or even: must I do the conversion from byte[] to a physical file asynchronously?
Each request made to your API gets a new instance of your controller and its own thread, so you don't need to worry about the conversion of your byte[] to a file, and the saving of that file, blocking incoming requests.
A reference to this can be found here: http://blogs.msdn.com/b/roncain/archive/2012/07/16/dependency-injection-with-asp-net-web-api-and-autofac.aspx
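If you do want the request thread to be freed up while the file is written, you can also make the action itself asynchronous. A minimal sketch, assuming the byte[] arrives Base64-encoded in a <document> node and using a hypothetical target path (both are illustrative assumptions, not part of the question):
[HttpPost, ActionName("myMethod")]
public async Task MyMethod(HttpRequestMessage request)
{
    XmlDocument doc = new XmlDocument();
    doc.Load(await request.Content.ReadAsStreamAsync());
    // assumption: the byte[] travels Base64-encoded in a <document> node
    byte[] document = Convert.FromBase64String(doc.SelectSingleNode("//document").InnerText);
    // useAsync: true makes WriteAsync perform genuinely asynchronous I/O
    using (var fs = new FileStream(@"C:\temp\document.bin", FileMode.Create, FileAccess.Write,
        FileShare.None, 4096, useAsync: true))
    {
        await fs.WriteAsync(document, 0, document.Length);
    }
}
Either way, the Java client does not need to do anything special; it only makes a plain HTTP request.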
My code looks something like this:
private readonly HttpClient _httpClient;
public SomeConstructor(HttpClient httpClient)
{
    _httpClient = httpClient;
}
public void SomeMethod(string reqUrl, string payload)
{
    var result = GetResponseStringAsync(reqUrl, payload).GetAwaiter().GetResult();
    // do something with result
}
private async Task<string> GetResponseStringAsync(string reqUrl, string payload)
{
    using (var req = new HttpRequestMessage(HttpMethod.Post, reqUrl))
    {
        using (var content = new StringContent(payload))
        {
            // Attach content headers here (specific for each request)
            req.Content = content;
            // Attach request headers here (specific for each request)
            // using req.Headers.TryAddWithoutValidation()
            using (var response = await _httpClient.SendAsync(req))
            {
                return await response.Content.ReadAsStringAsync();
            }
        }
    }
}
I need to send API requests that have different, signed headers per request, otherwise I get back 401 (Unauthorized). When I send a single request, I always get 200, indicating that the authorization headers are sent correctly. However, if I send multiple requests at once (say with the concurrency level set to 10), only 1 request gets a 200 back, whereas the other 9 get 401s. If I hit those 9 URLs individually, however, I get a 200 for every single one of them, as expected.
It seems to me that somehow there's a concurrency issue that results in the proper headers not being attached to their corresponding requests, even though I create a new HttpRequestMessage for each request. HttpClient and HttpRequestMessage are both supposedly thread-safe, but could someone explain why I'm still getting weird results when sending multiple requests at once?
Some additional notes:
I have something like this in my AppHost: Container.Register<ISomeConstructor>(x => new SomeConstructor(new HttpClient())); so I am sure I'm not accidentally modifying the client anywhere else.
Placing a lock around the HttpClient (just before the SendAsync call) makes it work and returns 200s 100% of the time, further convincing me that it's a concurrency issue
I'm deploying and running on Mono 6.8.0.105 -- could this be a Mono issue? I couldn't find any issues/bug reports on this though
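For reference, here is a sketch of the locking workaround described above, written with a SemaphoreSlim rather than lock (you cannot await inside a lock block). This serializes the sends and hides the symptom; it does not explain the underlying Mono behaviour:
private static readonly SemaphoreSlim _sendGate = new SemaphoreSlim(1, 1);

private async Task<string> GetResponseStringAsync(string reqUrl, string payload)
{
    using (var req = new HttpRequestMessage(HttpMethod.Post, reqUrl))
    using (var content = new StringContent(payload))
    {
        req.Content = content;
        // serialize access to the shared HttpClient so concurrent requests
        // cannot interleave while the signed headers are being sent
        await _sendGate.WaitAsync();
        try
        {
            using (var response = await _httpClient.SendAsync(req))
            {
                return await response.Content.ReadAsStringAsync();
            }
        }
        finally
        {
            _sendGate.Release();
        }
    }
}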
I am trying to export some data to Excel and store that Excel file in AWS S3. Our current architecture is like this:
We get data from the database and manipulate it as per our needs. This is done in one API call.
We need to pass that data as a stream to another API (specifically designed to interact with AWS S3).
Store that stream as an Excel file in AWS S3.
So far, what I have achieved is:
I am able to get data from the database and convert it to a memory stream. I have written another API to receive this stream, but I couldn't manage to pass the memory stream from one API to the other.
1st API :
public async Task<ICollection<UserDTO>> ExportUsers(Guid groupId, HttpRequestMessage request)
{
    var ms = // gets a memory stream out of the data received from the database
    var client = new HttpClient
    {
        BaseAddress = new Uri("http://localhost:58025/")
    };
    client.DefaultRequestHeaders.Accept.Clear();
    client.DefaultRequestHeaders.Accept.Add(
        new MediaTypeWithQualityHeaderValue("application/bson"));
    MediaTypeFormatter bsonFormatter = new BsonMediaTypeFormatter();
    // Not sure about BsonMediaTypeFormatter. Just gave it a try
    var response = await client.PostAsync("http://localhost:58025/Resource/Memory", ms, bsonFormatter);
}
2nd API :
[HttpPost]
[Route("Resource/Memory", Name = "UploadMemory")]
public async Task<IHttpActionResult> UploadMemoryFile(MemoryStream memory)
{
    // Not reaching until here
}
Any help is highly appreciated!!
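For what it's worth, one common approach is to post the raw stream as StreamContent and read the request body directly in the second API, instead of model-binding a MemoryStream (the default Web API formatters typically won't bind a MemoryStream parameter from the body, which would explain why the action is never reached). A rough sketch, reusing the route from the question; the content type is an assumption:
// 1st API: send the stream as the raw request body
var content = new StreamContent(ms);
content.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
var response = await client.PostAsync("Resource/Memory", content);

// 2nd API: read the body from the request instead of binding a parameter
[HttpPost]
[Route("Resource/Memory", Name = "UploadMemory")]
public async Task<IHttpActionResult> UploadMemoryFile()
{
    var stream = await Request.Content.ReadAsStreamAsync();
    // hand the stream to the S3 upload code here
    return Ok();
}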
First time posting! I've been breaking my head over this particular case. I've got a web application that needs to upload a file to a Web API and receive an SVG file (as a string) back.
The web-app uploads the file as follows:
using (var client = new WebClient())
{
    var response = client.UploadFile(apiUrl, FileIGotEarlierInMyCode);
    ViewBag.MessageTest = response.ToString();
}
The above works, but then we get to the API part:
How do I access the uploaded file? Pseudocode:
public string Post([FromBody]File f)
{
    File uploadedFile = f;
    String svgString = ConvertDataToSVG(uploadedFile);
    return svgString;
}
In other words: How do I upload/send an XML-file to my Web-api, use/manipulate it there and send other data back?
Thanks in advance!
Nick
PS: I tried this answer:
Accessing the exact data sent using WebClient.UploadData on the server
But my code did not compile on Request.InputStream.
The reason Request.InputStream didn't work for you is that the Request property can refer to different types of Request objects, depending on what kind of ASP.NET solution you are developing. There is:
HttpRequest, as available in Web Forms,
HttpRequestBase, as available in MVC Controllers
HttpRequestMessage, as available in Web API Controllers.
You are using Web API, so HttpRequestMessage it is. Here is how you read the raw request bytes using this class:
var data = Request.Content.ReadAsByteArrayAsync().Result;
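Putting that together with the pseudocode from the question, the action could look something like this sketch (ConvertDataToSVG is the question's own helper, assumed here to take a byte[]; note that WebClient.UploadFile wraps the file in a multipart/form-data body, so the raw bytes include the multipart envelope):
public async Task<string> Post()
{
    // read the raw request body posted by WebClient.UploadFile
    byte[] data = await Request.Content.ReadAsByteArrayAsync();
    // ConvertDataToSVG comes from the question; its signature is an assumption
    string svgString = ConvertDataToSVG(data);
    return svgString;
}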
I'm trying to extend my REST service (built using WCF/webHttpBinding) so that clients can upload gzipped data. I'm not sure of the best way to accomplish this, but I thought it would be fairly easy to add an HTTP module that decompresses the data if Content-Encoding on the incoming request is set to gzip.
So I created a class deriving from IHttpModule with the following implementation:
private void OnBeginRequest(object sender, EventArgs e)
{
    var app = (HttpApplication) sender;
    var context = app.Context;
    var contentEncoding = context.Request.Headers["Content-Encoding"];
    if (contentEncoding == "gzip")
    {
        // some debug code:
        var decompressedStream = new GZipStream(context.Request.InputStream, CompressionMode.Decompress);
        var memoryStream = new MemoryStream();
        decompressedStream.CopyTo(memoryStream);
        memoryStream.Seek(0, SeekOrigin.Begin);
        var streamReader = new StreamReader(memoryStream);
        string msg = streamReader.ReadToEnd();
        context.Request.InputStream.Seek(0, SeekOrigin.Begin);

        // also tried: app.Request.Filter = new TestFilterStream(app.Request.Filter);
        app.Request.Filter = new System.IO.Compression.GZipStream(
            app.Request.Filter, CompressionMode.Decompress);
    }
}
The issue I'm seeing is that the GZipStream decompression is never actually performed. I've confirmed that the incoming data is in fact gzipped (the msg variable contains the proper data). I've also tried creating my own stream class (TestFilterStream, above), assigning it to app.Request.Filter, and confirmed that no members of that stream class are ever called by ASP.NET. So it seems that while it's possible to specify a filter, the filter isn't actually used.
Isn't HttpApplication.Request.Filter actually used?
I tried setting the request filter in two ways:
Using an HttpModule
Setting it at the start of Application_BeginRequest() (Global.asax)
Both with the same results (VS2012 web project + IIS Express):
If there is no input data (a GET request or similar), Filter.Read is not invoked.
In case of a POST with actual data, the filter is executed and the web service gets the filtered data.
Even if I read from Request.InputStream before the filter is set, I still see the filter triggered from my service code.
I have no easy way of testing with gzipped input, so I have not tried whether the actual decompression works. However, I know the filter is getting triggered, since I get an error from GZipStream while it attempts to look at the input.
Perhaps you have other HttpModules or filters that disrupt your input or control flow?
This post proposes a method similar to yours, but also states the following, which may be causing some side effects (my tests were not using WCF):
"It appears that this approach trigger a problem in WCF, as WCF relies on the original Content-Length and not the value obtained after decompressing."
I've just done a couple of tests, and my Request.Filter stream is called into, as long as there is a request body and the request body gets read by the HTTP handler. I'm guessing you use either a PUT or a POST, and certainly read the request body, so that shouldn't be a problem.
I suspect Knaģis' comment is correct. Have you tried without the debug code? If I dig into the HttpRequest source, I see a variable _rawContent being written to exactly once; at that same time the request filters are applied. After that the _rawContent value just gets cached, and is never updated (nor reset when a filter is added).
So by calling Request.InputStream in your debug code you are definitely preventing your filter from being applied later on. Reading the Request.Headers collection is no problem.
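Based on that, here is a sketch of the module with the debug code removed, so the raw content is never read (and cached) before the filter is installed; this assumes nothing else in the pipeline touches InputStream first:
public class GzipRequestModule : IHttpModule
{
    public void Init(HttpApplication app)
    {
        app.BeginRequest += OnBeginRequest;
    }

    private static void OnBeginRequest(object sender, EventArgs e)
    {
        var app = (HttpApplication)sender;
        // reading headers is safe; reading InputStream here would cache
        // the raw content and prevent the filter from being applied
        if (app.Context.Request.Headers["Content-Encoding"] == "gzip")
        {
            app.Request.Filter = new System.IO.Compression.GZipStream(
                app.Request.Filter, CompressionMode.Decompress);
        }
    }

    public void Dispose() { }
}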
Are you sure the application itself should bother?
Usually that's handled in the configuration of the host (IIS). So, basically, you only need to implement custom gzip support when you host the service yourself.
You can take a look here
I'm trying to have a console application send an XML file to a web application developed in ASP.NET MVC 3, and receive another XML file as a response.
The error returned in the console application is:
The remote server returned an error: (500) Internal Server Error.
When I get Fiddler2 running, I see this error:
Object reference not set to an instance of an object.
The code in the console application is:
static void Main(string[] args)
{
    var wc = new WebClient();
    byte[] response = wc.UploadFile("http://mysite.com/Tests/Test", "POST", "teste.xml");
    string s = System.Text.Encoding.ASCII.GetString(response);
    Console.WriteLine(s);
    Console.ReadKey();
}
The code in the MVC Controller is:
[HttpPost]
public ActionResult Test(HttpPostedFileBase file)
{
    XElement xml = XElement.Load(new System.IO.StreamReader(file.InputStream));
    var test = new MyTest();
    return File(test.RunTest(xml), "text/xml", "testresult.xml");
}
RunTest() works well, since the same logic works when I upload the file via a form (in a method with the same name, using the GET method). RunTest() returns the XML with the response.
When I debug the MVC application, I can see the problem: the file parameter is null!
How do I fix that? What do I have to change in my console application so that it actually sends a file? Or do I need to change the MVC method instead?
And, before trying to use WebClient, I tried this code here: http://msdn.microsoft.com/en-us/library/debx8sh9.aspx, and had the same results.
Your problem is that WebClient.UploadFile isn't posting a form with the enctype set to multipart/form-data using an input named "file" for MVC to map to. Try changing your server-side method to this:
[HttpPost]
public ActionResult Test()
{
    var file = Request.Files[0];
    XElement xml = XElement.Load(new System.IO.StreamReader(file.InputStream));
    var test = new MyTest();
    return File(test.RunTest(xml), "text/xml", "testresult.xml");
}
(Note that in an MVC controller, Request.Files returns HttpPostedFileBase instances, not HttpPostedFile, so no cast is needed.)
You need to pass a name for the parameter containing the uploaded file. This is not possible with WebClient.
Check:
How to specify form parameter when using webclient to upload file
Send multipart/form-data content type request
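For illustration, a rough sketch of doing this with HttpClient and MultipartFormDataContent instead (run inside an async method; the part name "file" is chosen to match the action's HttpPostedFileBase file parameter, and the URL and file name are taken from the question):
using (var client = new HttpClient())
using (var form = new MultipartFormDataContent())
{
    var fileContent = new ByteArrayContent(File.ReadAllBytes("teste.xml"));
    fileContent.Headers.ContentType = MediaTypeHeaderValue.Parse("text/xml");
    // the part name must match the action's parameter name ("file")
    form.Add(fileContent, "file", "teste.xml");
    HttpResponseMessage response = await client.PostAsync("http://mysite.com/Tests/Test", form);
    Console.WriteLine(await response.Content.ReadAsStringAsync());
}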
In case someone has a slightly different but related problem:
I also had to do it using UploadData instead of UploadFile. In that case, instead of writing this in the controller:
var file = Request.Files[0];
XElement xml = XElement.Load(new System.IO.StreamReader(file.InputStream));
One can simply write:
XElement xml = XElement.Load(new System.IO.StreamReader(Request.InputStream));
Easier! ;)
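For completeness, a sketch of the matching client-side call (URL and file name taken from the question above):
var wc = new WebClient();
// UploadData sends the bytes as the raw request body, with no multipart envelope,
// which is why the controller can read them straight from Request.InputStream
byte[] fileBytes = File.ReadAllBytes("teste.xml");
byte[] response = wc.UploadData("http://mysite.com/Tests/Test", "POST", fileBytes);
Console.WriteLine(System.Text.Encoding.ASCII.GetString(response));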