Download file by JavaScript postback in C#

EDIT:
I'll be more specific. I want to write a script that downloads a group of files every day.
To do this programmatically, I need to click a JavaScript button.
It's simple when I can just pass the URL to the WebRequest class, but with a JavaScript button I don't have the URL. How can I build this URL?
Request (by Fiddler):
POST /SomeSite?Something.aspx HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Referer: http://www.Site.com/Stackoverflow/SomeSite?Something.aspx
Accept-Language: pt-BR
User-Agent: Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)
Content-Type: application/x-www-form-urlencoded
Accept-Encoding: gzip, deflate
Host: www.Site.com
Content-Length: 10616
Connection: Keep-Alive
Pragma: no-cache
Cookie: idioma=pt-br; WT_FPC=id=187.16.81.13-3324702672.30186643:lv=1320587789589:ss=1320587578749
__EVENTTARGET=ctl00%24contentPlaceHolderConteudo%24lnkDownloadArquivo&__EVENTARGUMENT=&__VIEWSTATE=%BlaBlaBla

Here you can see the __EVENTTARGET, which posts back via a LinkButton named "lnkDownloadArquivo". As far as I understand, you want to simulate the same download request without a button click. If so, you can check out a solution here:
http://ciintelligence.blogspot.com/2011/01/fetching-aspnet-authenticated-page-with.html.
There you can get an idea of how an ASP.NET button postback request works.
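As a rough illustration of that approach, here is a minimal WebClient sketch. The URL and LinkButton name come from the Fiddler capture above; the __VIEWSTATE scraping and the output file name are just placeholders, so treat this as a sketch rather than a working downloader.

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text.RegularExpressions;

class PostbackDownload
{
    static void Main()
    {
        // URL and control name taken from the Fiddler capture; adjust as needed.
        const string url = "http://www.Site.com/Stackoverflow/SomeSite?Something.aspx";

        using (var client = new WebClient())
        {
            // 1. GET the page once to pick up the current __VIEWSTATE value.
            string html = client.DownloadString(url);
            string viewState = Regex.Match(html, "id=\"__VIEWSTATE\" value=\"([^\"]*)\"").Groups[1].Value;

            // 2. POST the hidden fields back, naming the LinkButton as the event target.
            var form = new NameValueCollection
            {
                { "__EVENTTARGET", "ctl00$contentPlaceHolderConteudo$lnkDownloadArquivo" },
                { "__EVENTARGUMENT", "" },
                { "__VIEWSTATE", viewState }
            };
            byte[] file = client.UploadValues(url, form);   // the response body is the downloaded file
            System.IO.File.WriteAllBytes("download.bin", file);
        }
    }
}

Note that WebClient does not keep cookies between the GET and the POST, so if the site relies on the session cookie shown in the capture you would need a CookieContainer (via HttpWebRequest or a small WebClient subclass).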

The built-in class you need is HttpWebRequest (or WebRequest). To create one, call System.Net.WebRequest.Create() and pass your URL, add the appropriate headers using the Headers collection, write to the Stream retrieved from WebRequest.GetRequestStream(), then retrieve the response using WebRequest.GetResponse(). From the retrieved response object, you can get the response Stream using WebResponse.GetResponseStream(). The Stream can then be read from like any other Stream object.
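A bare-bones sketch of those steps (the URL, extra header, and body below are placeholders rather than values from the question):

using System;
using System.IO;
using System.Net;
using System.Text;

class WebRequestPost
{
    static void Main()
    {
        // Create the request and describe the POST.
        WebRequest request = WebRequest.Create("http://www.example.com/SomePage.aspx");
        request.Method = "POST";
        request.ContentType = "application/x-www-form-urlencoded";
        request.Headers.Add("Pragma", "no-cache");   // extra headers go into the Headers collection

        // Write the body to the request stream.
        byte[] body = Encoding.UTF8.GetBytes("__EVENTTARGET=someControl&__EVENTARGUMENT=");
        using (Stream requestStream = request.GetRequestStream())
            requestStream.Write(body, 0, body.Length);

        // Read the response stream like any other Stream.
        using (WebResponse response = request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}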

Related

gzip being added to the Content-Encoding header multiple times

(I found out when it happens; see the bottom of the question.)
I am working with a traditional ASP.NET web application. There is an .aspx page that hosts an Angular 11 application which loads fine 9 times out of 10, but occasionally a bad response is returned with a 200 OK status. When this happens, Firefox loads a page with a "content encoding error", and Chrome and Edge show just a blank screen with the same verbiage in the console.
Using Wireshark, I was able to determine that when the content-encoding error occurs, the response has three comma-separated "gzip" values in the Content-Encoding header, see below:
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store, must-revalidate
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip, gzip, gzip
...
Whereas a normal response from the .aspx page looks like this:
HTTP/1.1 200 OK
Cache-Control: no-cache, no-store, must-revalidate
Content-Type: text/html; charset=utf-8
Content-Encoding: gzip
...
I can duplicate the issue using one of the .aspx page's [WebMethod] calls:
var ctx = HttpContext.Current;
var unused = ctx.Response.Filter; // Because apparently you must access it before you can set it
ctx.Response.Filter = new GZipStream(ctx.Response.OutputStream, CompressionLevel.Optimal);
ctx.Response.AppendHeader("Content-Encoding", "gzip");
ctx.Response.AppendHeader("Content-Encoding", "gzip"); // <-- gzip added twice here
The troubling part is that the multiple "gzip" values are on the response from the .aspx page itself. I have searched the entire code base and all web.config files in an attempt to find where this compression is being applied, but to no avail. So I am thinking a third party could be doing this.
We use DevExtreme and I have been looking at these settings in our config:
<add key="DXEnableCallbackCompression" value="true" />
<add key="DXEnableResourceCompression" value="true" />
<add key="DXEnableResourceMerging" value="true" />
<add key="DXEnableHtmlCompression" value="true" />
I am still having trouble scanning the code for issues. Does anyone know of a trick using Fiddler, Wireshark, or any other tool that could reveal where these headers are sporadically being tripled?
Edit: Here is the GET request header, which returns a response with proper encoding ~90% of the time.
GET http://xxx/xxx.aspx?xxx=4 HTTP/1.1
Host: xxx.com
Connection: keep-alive
Pragma: no-cache
Cache-Control: no-cache
Upgrade-Insecure-Requests: 1
User-Agent: Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/92.0.4515.131 Safari/537.36 Edg/92.0.902.73
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,image/apng,*/*;q=0.8,application/signed-exchange;v=b3;q=0.9
Referer: http://xxx/Home.aspx
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.9
Cookie: ASP.NET_SessionId=x; .ASPXAUTH=x;
Found out when it happens:
I was able to duplicate this issue on a regular basis. If I close all browser sessions and recycle the app pool, the issue occurs on the first request. On subsequent requests, the issue does not happen.
Also, the culprit is a Google script embedded in the HTML page. When this script is removed, the page loads fine on the first request, whether the app pool has just been recycled or not.
Culprit Code:
<script type="text/javascript" src="//maps.googleapis.com/maps/api/js?key=<%=GoogleMapAPIKey%>&channel=<%=GoogleMappingChannel%>"></script>
I am sure it is not the JS file itself. The keys are embedded into the tag via server-side processing. Those two processors call an API to get the keys, and those calls are gzipped. I still don't know why the .aspx page's response header gets three "gzip" values when the JS include statement is present in the page markup.
I may remove this wall of text and add a new question due to the new findings.
The problem seems to occur when gzip encoding was added to outgoing calls that were triggered from markup on the .aspx page. All web methods that are called after page load, asynchronously from the Angular client, have no encoding issue.
There were two that were called via a page property, triggered by the page markup accessing its value. These web methods had gzip applied, and I guess since they were processed earlier in the page lifecycle, something was getting mixed up.
My problem was solved by removing the compression on those two calls.
There were two calls to a function that added gzip encoding prior to page load, and at that time the response was the .aspx page itself.
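An alternative to removing the compression outright (just a sketch, not the fix that was actually applied) would be to guard the helper so the gzip filter and header can only be attached once per response:

using System.IO.Compression;
using System.Web;

static class ResponseCompression
{
    public static void CompressOnce()
    {
        var ctx = HttpContext.Current;

        // If something earlier in the pipeline already wrapped the output in a GZipStream,
        // wrapping it again is what produces "Content-Encoding: gzip, gzip, ..." and an
        // undecodable body, so bail out in that case.
        if (ctx.Response.Filter is GZipStream)
            return;

        // Wrap the existing filter (chaining keeps any earlier filter working).
        ctx.Response.Filter = new GZipStream(ctx.Response.Filter, CompressionLevel.Optimal);
        ctx.Response.AppendHeader("Content-Encoding", "gzip");
    }
}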

POST JSON to webservice using WebClient C#

I am working on a desktop application which calls some web services via WebClient POST requests. The same web services are also used in a web application.
I am facing a strange problem: in my desktop application the request is successful and I get a response, but some of my request parameters are not saved. The same request updates all the parameters when I call it from the web application using jQuery.
In the web application I am calling the web service like this:
$.post("/MyService/Account/Register",accountModel, function (data) {
});
and I stringify my JSON object, accountModel; my request looks like this in console.log:
{"Name":"Lorem","Email":"abc#abc.com","interest":"[\"1\"]","sectors":"[\"1\",\"2\"]","subscribe":false}
Now, when I used the same request string to post data from my desktop application, the properties Name, Email, and subscribe were saved, but interest and sectors were not.
I want to figure out why the same request object works via jQuery but not via the C# WebClient.
Here is the code I used to post data using WebClient:
WebClient client = new WebClient();
string json = string.Format("{{\"Name\":\"{0}\",\"Email\":\"{1}\",\"interest\":[\"{2}\"],\"sectors\":[\"{3}\",\"{4}\"],\"subscribe\":{5}}}", "Lorem", "abc#abc.com", "1", "1", "2", "false");
client.Headers[HttpRequestHeader.ContentType] = "application/json";
string result = client.UploadString("http://Server.com/MyService/Account/Register", json);
Please, someone help me resolve this issue: I am not getting any error, but some of my parameters are not updated.
I want to make clear that I do not have access to the web service's code or documentation.
UPDATE
As per Jasen's comment, here are the requests captured with Fiddler.
jQuery request
POST http://Server.com/MyService/Account/Register HTTP/1.1
Host: server.com
Connection: keep-alive
Content-Length: 463
Accept: */*
Origin: http://server.com
X-Requested-With: XMLHttpRequest
User-Agent: Mozilla/5.0 (Windows NT 10.0; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/46.0.2490.80 Safari/537.36
Content-Type: application/x-www-form-urlencoded; charset=UTF-8
Referer: http://server.com/MyService/Account/Register/
Accept-Encoding: gzip, deflate
Accept-Language: en-US,en;q=0.8
Cookie: .ASPXAUTH=7A14FC68B72078BAE43A623B94A901180C72093CCE222BBD98EE2AE7E2612078D1E3B7D8860905A4F7B2D75FD67E9274A0A5C40760A5AF703F970504380EBAF8B3D09A15F0B70090ACF4882DC58885F7CF12473BF55647840F3080ADD2C19249
Name=Lorem&Email=abc#abc.com&interest=%5B%221%22%5D&sectors=%5B%221%22%2C%222%22%5D&subscribe=false
WebClient Request
POST http://server.com/MyService/Account/Register HTTP/1.1
Content-Type: application/json
Host:server.com
Cookie: .ASPXAUTH=F586C63F64186E13EB6EC19AAB25A531A0EDA5B7B601013550ADD629C1481EC3F080DDB5F06D691CB8F81EE8631EF8859F82CF7DD3F2ED2A597AA971A53E80141EDD6EA549784AD7EAE8E144F0CD3196A44316F29C08E0C5383A7231A1B6C5EF
Content-Length: 536
Expect: 100-continue
{"Name":"Lorem","Email":"abc#abc.com","sectors":["1","2"],"interest":["1"],"subscribe":false}
Should I send my WebClient request as a URL-encoded string, like we can see in the jQuery request?
Finally I got the solution with the help of Fiddler. Thanks to Jasen for suggesting Fiddler to inspect the requests.
Here is my working code:
WebClient client = new WebClient();
byte[] response = client.UploadValues("http://Server.com/MyService/Account/Register", new NameValueCollection()
{
    {"Name", "Lorem"},
    {"Email", "abc#abc.com"},
    {"interest", "[\"1\"]"},
    {"sectors", "[\"1\",\"2\"]"},
    {"subscribe", "false"}
});
string result = Encoding.UTF8.GetString(response);
Yes, I used the UploadValues method instead of UploadData or UploadString, and UploadValues sends the collection as a form-urlencoded body. Also note that I removed the JSON content-type declaration from my code.
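For completeness, a sketch of building that same url-encoded body by hand and sending it with UploadString (same placeholder URL and values as above):

using System;
using System.Net;

class FormPostSketch
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Headers[HttpRequestHeader.ContentType] = "application/x-www-form-urlencoded";

            // Same body shape as the jQuery request captured above.
            string body =
                "Name=" + Uri.EscapeDataString("Lorem") +
                "&Email=" + Uri.EscapeDataString("abc#abc.com") +
                "&interest=" + Uri.EscapeDataString("[\"1\"]") +
                "&sectors=" + Uri.EscapeDataString("[\"1\",\"2\"]") +
                "&subscribe=false";

            string result = client.UploadString("http://Server.com/MyService/Account/Register", body);
            Console.WriteLine(result);
        }
    }
}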

Get html result from web page

I am planning to create a mobile application (for fun) that should use the result from this web page (http://consultawebvehiculos.carabineros.cl/index.php). Is there any way to create an instance of a browser in my .NET code, read the result, and publish it using a web service?
something like:
var IE = new Browser("http://consultawebvehiculos.carabineros.cl/index.php");
var result = IE.FindElementByID("txtIdentityCar").WriteText(YourIdentityCar);
PublishToWebService(result);
Update:
Using Fiddler I can see that the HTTP POST looks something like this:
POST http://consultawebvehiculos.carabineros.cl/index.php HTTP/1.1
Host: consultawebvehiculos.carabineros.cl
Connection: keep-alive
Content-Length: 61
Cache-Control: max-age=0
Accept: text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8
Origin: http://consultawebvehiculos.carabineros.cl
User-Agent: Mozilla/5.0 (Windows NT 6.2; WOW64) AppleWebKit/537.17 (KHTML, like Gecko) Chrome/24.0.1312.57 Safari/537.17
Content-Type: application/x-www-form-urlencoded
Referer: http://consultawebvehiculos.carabineros.cl/index.php
Accept-Encoding: gzip,deflate,sdch
Accept-Language: es-ES,es;q=0.8
Accept-Charset: ISO-8859-1,utf-8;q=0.7,*;q=0.3
accion=buscar&txtLetras=CL&txtNumeros1=sk&txtNumeros2=12&vin=
Maybe I need some .NET class like WebClient in order to connect with the PHP page... not sure.
UPDATE: I finally found the solution, using Fiddler to discover the full set of parameters, and I used the code from http://www.hanselman.com/blog/HTTPPOSTsAndHTTPGETsWithWebClientAndCAndFakingAPostBack.aspx
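Roughly, that fake-postback approach boils down to posting the form fields from the Fiddler capture above. A sketch, with the response encoding guessed from the Accept-Charset header:

using System;
using System.Collections.Specialized;
using System.Net;
using System.Text;

class PlateLookup
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            // Form fields taken from the captured POST body.
            var form = new NameValueCollection
            {
                { "accion", "buscar" },
                { "txtLetras", "CL" },
                { "txtNumeros1", "sk" },
                { "txtNumeros2", "12" },
                { "vin", "" }
            };

            byte[] response = client.UploadValues("http://consultawebvehiculos.carabineros.cl/index.php", form);
            string html = Encoding.GetEncoding("ISO-8859-1").GetString(response);   // encoding is a guess
            Console.WriteLine(html);
        }
    }
}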
If you are just interested in scraping the page, I suggest using the Html Agility Pack.
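For example, a minimal Html Agility Pack sketch; the XPath selector is hypothetical and would need to match the real result markup:

using System;
using HtmlAgilityPack;   // HtmlAgilityPack NuGet package

class Scrape
{
    static void Main()
    {
        var web = new HtmlWeb();
        HtmlDocument doc = web.Load("http://consultawebvehiculos.carabineros.cl/index.php");

        // Hypothetical selector: point it at whatever element holds the result you need.
        HtmlNode node = doc.DocumentNode.SelectSingleNode("//div[@id='resultado']");
        if (node != null)
            Console.WriteLine(node.InnerText.Trim());
    }
}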
If you also want to display the page, then you could use the WebBrowser control.
We've been using http://htmlunit.sourceforge.net/ for similar tasks. It allows you to send requests and inspect the response, status code, etc.
(It's a Java library, so you could either google for a .NET port or use a converter to turn the Java library into a .NET assembly; see http://blog.stevensanderson.com/2010/03/30/using-htmlunit-on-net-for-headless-browser-automation/ for guidance. We used the conversion approach.)

Extract Url using Regex

I've been searching for at least two hours, but I can't find a pattern to extract the following URLs using regex. I tried many of the patterns described in various articles, but I couldn't find anything useful.
For example, URLs like the following:
http://google.com
http://www.google.com
http://www.image.google.com
http://google.com:8080
http://google.com:8080/default.aspx?param=1
http://google.com/default.aspx?param=1&param1=2
Update: Dear friends, it looks like I have to explain my issue in more detail. I'm working on a simple proxy server using TCP components. My server listens on a specific port for incoming connections, and I extract and read all of the client request data.
The data contains headers, content types, etc., like the following:
GET http://www.bing.com/ HTTP/1.1
Accept: text/html, application/xhtml+xml, */*
Accept-Language: en-US,en;q=0.7,fa;q=0.3
User-Agent: Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)
Accept-Encoding: gzip, deflate
Host: www.bing.com
DNT: 1
Proxy-Connection: Keep-Alive
These are plain text, so I need to find and extract the URLs in order to forward the requests.
Any URL pattern you can suggest is welcome.
Please, any advice would be helpful.
https?://[\w\.]+\.\w+(:\d{1,5})?(/[\w?&.=]+)?
Hello. Try this one:
https?://[^\s]+
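Applied in C#, for example (a sketch over request text like the one shown in the question):

using System;
using System.Text.RegularExpressions;

class UrlExtract
{
    static void Main()
    {
        string request = "GET http://www.bing.com/ HTTP/1.1\r\nHost: www.bing.com\r\n";

        // Scheme followed by any run of non-whitespace characters.
        foreach (Match m in Regex.Matches(request, @"https?://[^\s]+"))
            Console.WriteLine(m.Value);   // prints http://www.bing.com/
    }
}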

Removing a line feed from the header of a C# web request

I need to remove the last line feed from an HTTP web request in order to communicate with a JSON-RPC service.
The request which .NET generates looks like this:
POST http://localhost.:8332/ HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 4.0.30319.1)
Authorization: Basic dGlwa2c6dGlwa2c=
Host: localhost.:8332
Content-Length: 42
Expect: 100-continue
Connection: Keep-Alive
{"id":1,"method":"getinfo","params":[]}
What I would need is this (note the missing line feed between the last header value and the beginning of the JSON content):
POST http://localhost.:8332/ HTTP/1.1
User-Agent: Mozilla/4.0 (compatible; MSIE 6.0; MS Web Services Client Protocol 4.0.30319.1)
Authorization: Basic dGlwa2c6dGlwa2c=
Host: localhost.:8332
Content-Length: 42
Expect: 100-continue
Connection: Keep-Alive
{"id":1,"method":"getinfo","params":[]}
I can't find anything that would let me manipulate the headers that are actually sent to the service.
See http://www.bitcoin.org/smf/index.php?topic=2170.0 for more background on the problem...
I finally resolved my (core) issue. The problem with my communication with the RPC service was that I had not set the content type. The service required a content type of "application/json-rpc" to work properly.
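For reference, a sketch of that call with the content type set; the URL and Basic-auth credentials are placeholders standing in for the ones shown in the captured request:

using System;
using System.IO;
using System.Net;
using System.Text;

class JsonRpcCall
{
    static void Main()
    {
        var request = (HttpWebRequest)WebRequest.Create("http://localhost:8332/");
        request.Method = "POST";
        request.ContentType = "application/json-rpc";   // the content type the service required
        request.Headers["Authorization"] = "Basic " +
            Convert.ToBase64String(Encoding.ASCII.GetBytes("user:pass"));   // placeholder credentials

        // Same body as the captured request.
        byte[] body = Encoding.UTF8.GetBytes("{\"id\":1,\"method\":\"getinfo\",\"params\":[]}");
        using (Stream s = request.GetRequestStream())
            s.Write(body, 0, body.Length);

        using (var response = (HttpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            Console.WriteLine(reader.ReadToEnd());
    }
}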
