How to access cookies sent from a URL in a Windows Phone application? - c#

I'm calling a URL in my app, and that URL sends me data via cookies. How should I get that data from the cookies? Something like NSHttpCookie and NSHttpSharedCookie in iOS.

1. If you're using the Windows Phone WebBrowser control, you can use the WebBrowserExtensions.GetCookies method:
You can use the GetCookies method to retrieve the cookies associated with a website if you use the WebBrowser control in your application. Once you have retrieved a CookieCollection, you can use those cookies to make subsequent HTTP requests to the website.
It returns a CookieCollection containing Cookie instances, from which you can get all the information you need.
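A minimal sketch of that approach, assuming a WebBrowser control named webBrowser that has already navigated to the site (the control name is an assumption, not from the original question):

```csharp
// Sketch: reading cookies from a Windows Phone WebBrowser control.
// Assumes a WebBrowser control named "webBrowser" already navigated to the site.
using System.Net;
using Microsoft.Phone.Controls;   // provides the GetCookies extension method

CookieCollection cookies = webBrowser.GetCookies();
foreach (Cookie cookie in cookies)
{
    // Each Cookie exposes Name, Value, Domain, Path, Expires, etc.
    System.Diagnostics.Debug.WriteLine(cookie.Name + " = " + cookie.Value);
}
```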
2. If you're using HttpWebRequest, you'll find a good tutorial on MSDN here.
Basically, you create and populate a CookieContainer instance and assign it to your HttpWebRequest to send cookies; in the other direction, the received cookies are available from the Cookies property of HttpWebResponse (also a CookieCollection).
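A sketch of the HttpWebRequest approach, with a placeholder URL (desktop-style synchronous calls shown for brevity; on Windows Phone you would use BeginGetResponse instead):

```csharp
// Sketch: attach a CookieContainer to the request, then read the
// received cookies from the response. The URL is a placeholder.
using System.Net;

CookieContainer container = new CookieContainer();

HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://example.com/");
request.CookieContainer = container;  // without this, response.Cookies stays empty

using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
{
    foreach (Cookie cookie in response.Cookies)
    {
        System.Diagnostics.Debug.WriteLine(cookie.Name + " = " + cookie.Value);
    }
}
// The container now holds the cookies and can be reused on later requests.
```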

Related

How can I catch UTM cookie values?

I'm trying to log into a website using WebRequest in C#, but the website uses the Google Analytics libraries, which set some UTM cookies.
The question is: how can I catch those cookie values and pass them to the website?
The HttpWebRequest class has a CookieContainer property to which you can add instances of the Cookie class.
The HttpWebResponse class has a Cookies collection property that contains instances of the Cookie class.
Make your request, read the cookies from the response, and add them to any subsequent requests as needed.
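A sketch of that flow, with placeholder URLs; in practice, sharing one CookieContainer across requests is usually enough, but the cookies can also be inspected individually:

```csharp
// Sketch: carry cookies (e.g. Google Analytics __utm* values) from one
// request to the next by reusing a single CookieContainer.
using System;
using System.Net;

CookieContainer jar = new CookieContainer();

HttpWebRequest first = (HttpWebRequest)WebRequest.Create("http://example.com/");
first.CookieContainer = jar;
using (HttpWebResponse firstResponse = (HttpWebResponse)first.GetResponse())
{
    foreach (Cookie c in firstResponse.Cookies)
    {
        if (c.Name.StartsWith("__utm"))
            Console.WriteLine("UTM cookie: " + c.Name + " = " + c.Value);
    }
}

// The same container is attached to the next request, so the cookies
// gathered above are sent automatically.
HttpWebRequest next = (HttpWebRequest)WebRequest.Create("http://example.com/login");
next.CookieContainer = jar;
```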

How to get HTML code from webpage?

I'm trying to get HTML code from a specific webpage, but when I do it using
string htmlCode;
HttpWebRequest request = (HttpWebRequest)WebRequest.Create(pageURL);
using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
using (StreamReader streamReader = new StreamReader(
    response.GetResponseStream(), Encoding.GetEncoding("windows-1251")))
{
    htmlCode = streamReader.ReadToEnd();
}
or using WebClient, I get redirected to a login page and I get its code.
Is there any other way to get HTML code?
I read some information here: How to get HTML from a current request, in a postback, but didn't understand what I should do, or how and where to specify the URL.
P.S.:
I'm logged in in a browser; "right click - view source code" (opened in Notepad++) perfectly gets what I need.
Thanks.
If you get redirected to a login page, then presumably you must be logged in before you can get the content.
So you need to make a request, with suitable credentials, to the login page. Get whatever tokens are sent (usually in the form of cookies) to maintain the login. Then request the page you want (sending the cookies with the request).
Alternatively (and this is the preferred approach), most major sites that expect automated systems to interact with them provide an API (often using OAuth for authentication). Consult their documentation to see how their API works.
If the page you want to get to is behind a login screen, you're going to need to perform the login mechanism through code, and attach a CookieContainer to hold the login cookie that the website will try to drop on your request.
Alternatively, if you have a user who can help the program along, you could try listing the cookies for the site once they've logged in through their browser. Copy that cookie across and add it to the CookieCollection.
Cheers
Simon
If you want to scrape an HTML page that requires authentication, I suggest you use WatiN
to fill the proper fields and navigate to the pages you want to download.
It may seem a little like overkill at first glance, but it will save a lot of trouble later.

WebRequest class to post data to login form

I want to use the WebRequest class to post data to a website. This works fine; however, the website I'm posting to requires cookies/sessions (it's a login form). After logging in I need to retrieve some account information (from a specific page).
How can I make sure the login information is being stored? In AutoIT I did this using a hidden webbrowser, however I want to use a console application for it.
My current code (to login) is too long to post here, so it can be found here.
Take a look at my ASPX sessions scraper on Bitbucket. It does exactly what you are asking for, including some ASP.NET WebForms-specific extensions, such as sending postbacks.
You need to store the cookie that you get after logging in, and then send that cookie when you request pages containing personal information.
Here is an example of using cookies with WebRequest.
It is possible that you can't connect because the session has expired, in which case you need to log in again.
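A console-app sketch of that login flow: POST the form fields with a CookieContainer attached, then reuse the same container for the account page. The URLs and field names (username, password) are assumptions; check the real form with a traffic capture or your browser's dev tools.

```csharp
// Sketch: log in via POST, keep the session cookie, then fetch a protected page.
using System;
using System.IO;
using System.Net;
using System.Text;

CookieContainer session = new CookieContainer();

HttpWebRequest login = (HttpWebRequest)WebRequest.Create("http://example.com/login");
login.Method = "POST";
login.ContentType = "application/x-www-form-urlencoded";
login.CookieContainer = session;

byte[] body = Encoding.UTF8.GetBytes("username=me&password=secret");
login.ContentLength = body.Length;
using (Stream requestStream = login.GetRequestStream())
{
    requestStream.Write(body, 0, body.Length);
}
login.GetResponse().Close();  // the session cookie now lives in "session"

// Authenticated follow-up request: same container, so the cookie is sent.
HttpWebRequest account = (HttpWebRequest)WebRequest.Create("http://example.com/account");
account.CookieContainer = session;
using (var reader = new StreamReader(account.GetResponse().GetResponseStream()))
{
    string html = reader.ReadToEnd();
}
```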

Can I read the contents of an Ajax request using the WebBrowser control?

I'm trying to read the contents of an AJAX response in the WebBrowser control in C#/WinForms. The Navigating/Navigated/etc. events seem to fire, but they don't give any access to the data being returned.
Is there any way to intercept the requests and read the data?
Note: If I send the request directly (using webBrowser.Navigate(ajaxUrl)), the WebBrowser control pops up asking the user to Open/Save the page (as it has a Content-Disposition header), so that isn't an option. I tried doing it manually with a WebClient/WebRequest, but I can't get the cookies to work correctly (the cookies I read from document.cookie do not seem to match the cookies actually sent with the AJAX request!).
No, you cannot capture XMLHttpRequests from the WebBrowser control using the control's own methods. You might want to have a look at FiddlerCore instead: http://www.fiddler2.com/core/

Download a file over HTTPS C# - Cookie and Header Prob?

I am trying to download a file over HTTPS and I just keep running into a brick wall with correctly setting Cookies and Headers.
Does anyone have/know of any code that I can review for doing this correctly ? i.e. download a file over https and set cookies/headers ?
Thanks!
I did this the other day. In summary, you need to create an HttpWebRequest and HttpWebResponse to submit/receive data. Since you need to maintain cookies across multiple requests, you need to create a cookie container to hold your cookies. You can set header properties on the request/response if needed as well.
Basic Concept:
using System.Net;

// Cookie container (place to store cookies across multiple requests)
CookieContainer cookies = new CookieContainer();

// Request page
HttpWebRequest req = (HttpWebRequest)WebRequest.Create("https://www.amazon.com");
req.CookieContainer = cookies;

// Response output (could be a page, PDF, CSV, etc.)
HttpWebResponse resp = (HttpWebResponse)req.GetResponse();

// Add response cookies to the cookie container
// (I only had to do this for the first "login" request)
cookies.Add(resp.Cookies);
The key to figuring this out is capturing the traffic for a real request. I did this using Fiddler, and over the course of a few captures (almost 10) I figured out what I needed to do to reproduce the login to a site where I had to run some reports based on different selection criteria (date range, parts, etc.) and download the results into CSV files. It's working perfectly, but Fiddler was the key to figuring it out.
http://www.fiddler2.com/fiddler2/
Good Luck.
Zach
This fellow wrote an application to download files using HTTP:
http://www.codeproject.com/KB/IP/DownloadDemo.aspx
Not quite sure what you mean by setting cookies and headers. Is that required by the site you are downloading from? If it is, what cookies and headers need to be set?
I've had good luck with the WebClient class. It's a wrapper for HttpWebRequest that can save a few lines of code: http://msdn.microsoft.com/en-us/library/system.net.webclient.aspx
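WebClient has no cookie support of its own, but overriding its GetWebRequest method lets one CookieContainer ride along on every call. A sketch of that pattern (the class name is illustrative):

```csharp
// Sketch: a WebClient subclass that carries cookies across requests.
using System;
using System.Net;

public class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; private set; }

    public CookieAwareWebClient()
    {
        Cookies = new CookieContainer();
    }

    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest request = base.GetWebRequest(address);
        var httpRequest = request as HttpWebRequest;
        if (httpRequest != null)
            httpRequest.CookieContainer = Cookies;  // shared across all calls
        return request;
    }
}

// Usage: cookies (and any headers you set) persist across calls.
var client = new CookieAwareWebClient();
client.Headers[HttpRequestHeader.UserAgent] = "Mozilla/5.0";
string page = client.DownloadString("https://example.com/");
```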
