How to check if WebClient got a correct response in C#?

I have a typical API request for sending SMS through an SMS gateway.
Is there any way I can check the connection to the device?
string readUrl = "http://192.168.253.160/index.php/http_api/read_sms";
string result;
using (WebClient client = new WebClient())
{
    // Query-string parameters expected by the gateway's HTTP API.
    client.QueryString.Add("login", "admin");
    client.QueryString.Add("pass", "password");
    client.QueryString.Add("folder", "sentitems");
    client.QueryString.Add("idfrom", "100");
    client.QueryString.Add("idto", "101");
    client.QueryString.Add("limit", "1");
    client.Headers.Add("user-agent", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:16.0) Gecko/20100101 Firefox/16.0");
    // Read the entire response body into a string.
    using (Stream receivedStream = client.OpenRead(readUrl))
    using (StreamReader reader = new StreamReader(receivedStream))
    {
        result = reader.ReadToEnd();
    }
}
MessageBox.Show(result);
I would like to make a condition like this:
if (/* response indicates success */)
{
    // turn the button green
}
else
{
    // turn the button red
}
I know that if I query the sentitems folder with a correct ID, I get a valid response that I could use in this condition. But that response is very long, so I am looking for another, simpler option.
Thanks
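One way to avoid parsing the long body is to rely on the HTTP status instead: WebClient throws a WebException whenever the request fails at the HTTP level. A minimal sketch, assuming the gateway returns a non-2xx status (or refuses the connection) when it is unreachable; button1 is a hypothetical WinForms button:
bool reachable;
try
{
    using (WebClient client = new WebClient())
    using (Stream check = client.OpenRead(readUrl)) // throws WebException on connection/HTTP errors
    {
        reachable = true;
    }
}
catch (WebException)
{
    reachable = false;
}
button1.BackColor = reachable ? Color.Green : Color.Red;
If the gateway always answers 200 and signals errors only in the body, checking the response for a short known marker (for example a hypothetical result.Contains("ERROR")) is still simpler than comparing the whole string.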

Related

Windows.Web.Http.HttpClient.ReadAsStringAsync() results in COMException HRESULT E_FAIL

I have the following C# code:
Uri url = new Uri("http://lu32kap.typo3.lrz.de/mensaapp/exportDB.php?mensa_id=all");
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.UserAgent.TryParseAdd("Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)");
HttpResponseMessage response = await client.GetAsync(url);
response.EnsureSuccessStatusCode();
var content = response.Content;
if(content != null)
{
string result = await content.ReadAsStringAsync();
if (result != null)
{
tblock.Text = result;
}
}
Every time I run it, I get a COMException "HRESULT E_FAIL".
I was able to track it down partially: it's caused by the website I'm trying to get my data from, because if I change the URL to "https://www.google.de/" it works.
It's crashing at:
string result = await content.ReadAsStringAsync();
Nevertheless, I need to get it to work with this website because it returns a PHP-generated JSON object.
Is there a way to fix this?
I ran this code locally and I ended up getting this exception:
The character set provided in ContentType is invalid. Cannot read
content as string using an invalid character set.
And it looks like the server declares the charset as "utf8" (without the hyphen), which is not a supported encoding name:
'utf8' is not a supported encoding name. For information on defining a
custom encoding, see the documentation for the
Encoding.RegisterProvider method.
Can you ensure that the output on the server is in the correct format? Perhaps try this answer:
Parsing UTF8 JSON response from server
Solution:
It was a problem with the UTF-8 encoding: the body is valid UTF-8, but ReadAsStringAsync rejects the invalid charset name, so reading the raw buffer and decoding it manually sidesteps that check. I was able to build a small workaround.
Thanks to Glitch100!
Uri url = new Uri("http://lu32kap.typo3.lrz.de/mensaapp/exportDB.php?mensa_id=all");
HttpClient client = new HttpClient();
client.DefaultRequestHeaders.UserAgent.TryParseAdd("Mozilla/5.0 (compatible; MSIE 10.0; Windows NT 6.2; WOW64; Trident/6.0)");
HttpResponseMessage response = await client.GetAsync(url);
response.EnsureSuccessStatusCode();
IHttpContent content = response.Content;
if(content != null)
{
IBuffer buffer = await content.ReadAsBufferAsync(); // raw bytes, no charset validation
using (DataReader dataReader = DataReader.FromBuffer(buffer))
{
string result = dataReader.ReadString(buffer.Length); // DataReader decodes as UTF-8 by default
if (result != null)
{
tblock.Text = result;
}
}
}
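An equivalent workaround, a sketch assuming the body really is UTF-8 despite the bad charset label, is to copy the buffer to a byte array and decode it yourself; the ToArray() extension for IBuffer lives in System.Runtime.InteropServices.WindowsRuntime:
using System.Runtime.InteropServices.WindowsRuntime; // for IBuffer.ToArray()
using System.Text;

IBuffer buffer = await content.ReadAsBufferAsync();
byte[] bytes = buffer.ToArray();                                 // raw response bytes
string result = Encoding.UTF8.GetString(bytes, 0, bytes.Length); // manual decode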

Not able to access a page from US as a country

I want to access this page as if browsing from the US as a country:
url = http://www.tillys.com/product/Say-What/Short-Dresses/SAY-WHAT--Ribbed-Tank-Midi-Dress/Heather-Grey/285111595
I've tried with cookies and so on, but the URL still redirects to the site's home page.
I want to see if there is any way I can access this page. Below is the function I am trying to do this with:
public static string getUrlContent (string url)
{
    var myHttpWebRequest = (HttpWebRequest)WebRequest.Create(url);
    myHttpWebRequest.Method = "GET";
    myHttpWebRequest.AllowAutoRedirect = true;
    myHttpWebRequest.ContentLength = 0;
    myHttpWebRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,image/webp,*/*;q=0.8";
    // Note: adding "Cookie" twice via Headers.Add merges the values with a comma,
    // which most servers will not parse as separate cookies.
    myHttpWebRequest.Headers.Add("Cookie", "=en%5FUS;");
    myHttpWebRequest.UserAgent = "Mozilla/5.0 (X11; Linux x86_64) AppleWebKit/537.36 (KHTML, like Gecko) Ubuntu Chromium/49.0.2623.108 Chrome/49.0.2623.108 Safari/537.36";
    //myHttpWebRequest.Headers.Add("Accept-Encoding", "gzip, deflate, sdch");
    myHttpWebRequest.Headers.Add("Accept-Language", "en-US,en;q=0.8");
    myHttpWebRequest.Headers.Add("Cookie", "wlcme=true");
    //myHttpWebRequest.CookieContainer = new CookieContainer();
    //myHttpWebRequest.Headers.Add("X-Macys-ClientId", "NavApp");
    var response = (HttpWebResponse)myHttpWebRequest.GetResponse();
    Console.WriteLine("Content length is {0}", response.ContentLength);
    Console.WriteLine("Content type is {0}", response.ContentType);
    // Pipe the response stream to a reader and read it exactly once;
    // the original code called ReadToEnd() twice, so the returned string was empty.
    Stream receiveStream = response.GetResponseStream();
    StreamReader readStream = new StreamReader(receiveStream, Encoding.UTF8);
    var jsonStr = readStream.ReadToEnd();
    Console.WriteLine(jsonStr);
    return jsonStr;
}
If the site www.tillys.com is using geo-fencing, it will show you different content based on a lookup of the requesting IP address. In that case there's nothing C# or any other language can do about it from your current location.
You'll need to either proxy your request through a VPN or proxy server in the US (see How to send WebRequest via proxy?) or deploy your code to a data center in the US. For example, if you use Azure you can deploy to several different data centers throughout the world, including several in the US. Once your code is running in the US, it should be able to access the US version of the page.
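For the proxy route, a minimal sketch (the proxy address below is a placeholder; any HTTP proxy with a US exit IP would do):
var request = (HttpWebRequest)WebRequest.Create(url);
// Hypothetical US-based proxy; substitute a real address and port.
request.Proxy = new WebProxy("http://us-proxy.example.com:8080");
using (var response = (HttpWebResponse)request.GetResponse())
using (var reader = new StreamReader(response.GetResponseStream()))
{
    string html = reader.ReadToEnd();
}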

Can't get content (HTML) of "Visual Studio Team Services" via WebRequest

Somehow I am not able to download the HTML content of https://{YourName}.visualstudio.com/Defaultcollection/ via HttpWebRequest/WebRequest or WebClient.
It always returns an HTML page with the following error message:
Microsoft Internet Explorer's Enhanced Security Configuration is currently enabled on your environment. This enhanced level of security prevents our web integration experiences from displaying or performing correctly. To continue with your operation please disable this configuration or contact your administrator.
I have tried a lot of ways to get to my needed result. I tried using OAuth2 and also set up alternate authentication credentials. I even disabled Internet Explorer's Enhanced Security Configuration.
Here are two of my methods, neither of which works. Both give the same result (see the error message above):
private static void Test()
{
WebClient client = new WebClient();
client.UseDefaultCredentials = true;
client.Credentials = new NetworkCredential(UserName,Password);
//Pretend to be a browser
client.Headers.Add("user-agent", "Mozilla/5.0 (Windows; U; Windows NT 6.1; en-US; rv:1.9.1.3) Gecko/20090824 Firefox/3.5.3 (.NET CLR 4.0.20506)");
var HTML = client.DownloadString("https://<YourName>.visualstudio.com/Defaultcollection/");
Console.WriteLine(HTML);
}
private static void Test2()
{
CookieContainer cookies = new CookieContainer();
HttpWebRequest authRequest = (HttpWebRequest)HttpWebRequest.Create("https://<YourName>.visualstudio.com/Defaultcollection/");
//Set Header
authRequest.UserAgent = "Mozilla/5.0 (Windows NT 5.1; rv:2.0b8) Gecko/20100101 Firefox/4.0b8";
authRequest.Accept = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
authRequest.Headers.Add("Accept-Encoding", "gzip, deflate");
authRequest.Headers.Add("Accept-Language", "de,en;q=0.5");
authRequest.Headers.Add("Accept-Charset", "ISO-8859-1,utf-8;q=0.7,*;q=0.7");
//authRequest.Headers.Add("Keep-Alive", "30000");
authRequest.Headers.Add(HttpRequestHeader.Authorization, SetAuthHeaderValue());
//Something
authRequest.ContentLength = 0;
authRequest.ContentType = "application/soap+xml; charset=utf-8";
authRequest.Host = "<YourName>.visualstudio.com";
//Set Cookies
authRequest.CookieContainer = cookies;
HttpWebResponse response = (HttpWebResponse)authRequest.GetResponse();
StreamReader readStream = new StreamReader(response.GetResponseStream());
string HTML = readStream.ReadToEnd();
Console.WriteLine(HTML);
readStream.Close();
}
private static string SetAuthHeaderValue()
{
    //string _auth = string.Format("{0}:{1}", UserName, Password);
    //string _enc = Convert.ToBase64String(Encoding.ASCII.GetBytes(_auth));
    String encoded = System.Convert.ToBase64String(System.Text.Encoding.GetEncoding("ISO-8859-1").GetBytes(UserName + ":" + Password));
    // The original format string "{1}" dropped the "Basic " scheme prefix,
    // leaving a malformed Authorization header.
    string _cred = string.Format("{0} {1}", "Basic", encoded);
    return _cred;
}
I picked the header values you see here by tracing the connection with Fiddler.
Is somebody able to authenticate, connect, and download the HTML content from https://{YourName}.visualstudio.com/Defaultcollection/?
Would be awesome, thanks :)!

Checking URL availability

In my program I check whether a site is available or not; I use this code:
HttpWebRequest request;
HttpWebResponse response;
Message = string.Empty;
string result="";
request = (HttpWebRequest)WebRequest.Create(url);
request.Timeout = 300000; // 5 minutes, in milliseconds
request.AllowAutoRedirect = true;
try
{
response = (HttpWebResponse)request.GetResponse();
result = response.StatusCode.ToString();
response.Close();
}
catch (Exception ex)
{
result = ex.Message;
}
I set the timeout to 5 minutes. When the program runs, for some sites (URLs) the result is "unable to connect to remote server" even though the site is available. How can I solve this problem?
Some sites throttle or block web requests from robots, or from clients with an invalid or unrecognized user-agent string.
Therefore I suggest that you set the user agent to that of a known browser.
For example:
WebClient client = new WebClient();
// Pick one of these; calling Headers.Add("user-agent", ...) twice merges both values into a single header.
// Firefox (Gecko rv:2.2)
client.Headers.Add("user-agent", "Mozilla/5.0 (Windows; U; Windows NT 6.1; rv:2.2) Gecko/20110201");
// Safari 7.0.3
client.Headers.Add("user-agent", "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.75.14 (KHTML, like Gecko) Version/7.0.3 Safari/7046A194A");
For a complete list of user-agent strings, check out: http://www.useragentstring.com/pages/useragentstring.php
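Since the question's code uses HttpWebRequest rather than WebClient, the equivalent there is the UserAgent property (a sketch against the request object from the question):
request = (HttpWebRequest)WebRequest.Create(url);
request.Timeout = 300000;
request.AllowAutoRedirect = true;
// Present a known browser's user-agent string, as above.
request.UserAgent = "Mozilla/5.0 (Macintosh; Intel Mac OS X 10_9_3) AppleWebKit/537.75.14 (KHTML, like Gecko) Version/7.0.3 Safari/7046A194A";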

I am downloading a file from the nseindia site using my program, but now I can't: there is a 403 error

I am downloading a file from the nseindia site using my program, but now there is a 403 (Forbidden) error. I also extract a value from the same site using this code:
WebClient client = new WebClient();
client.Headers.Add("user-agent", "Mozilla/5.0 (compatible; MSIE 9.0; Windows NT 6.1; WOW64; Trident/5.0)");
Stream data;
try
{
    data = client.OpenRead("http://www.nseindia.com/");
}
catch (Exception e)
{
    MessageBox.Show("Error: " + e.Message + e.Data + e.HelpLink);
    return "";
}
StreamReader reader = new StreamReader(data);
string line;
string value = null;
int count = 0;
// Read line by line. The original loop called reader.Read() as its
// condition, which silently consumed one character before every ReadLine().
while ((line = reader.ReadLine()) != null)
{
    if (line.Contains("<td class=\"t1\">"))
    {
        MessageBox.Show("Line: " + line);
        // Strip the opening tag (18 characters) and the closing "</td>" (5 characters).
        value = line.Remove(0, 18);
        value = value.Remove(value.Length - 5);
        count++;
        if (count == 5)
            break;
    }
}
reader.Close();
data.Close();
return value;
It seems that this site requires an Accept HTTP request header:
client.Headers[HttpRequestHeader.Accept] = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
One of the problems with what you are currently doing is that you are totally dependent on how the site you are trying to scrape works, not to mention the fragility of your HTML parsing code. What is worse, this could change at any time, and you have no control over it unless you own the site. Tomorrow the site might start requiring some other HTTP header, and your code will stop working once again. Just saying this so that you are prepared.
Maybe you could contact the site owners and see if they are offering an official API to consume their content.
What you are getting is, technically, an HTTP 403.
In plain language, it says that you are not authorized to access this resource.
Check whether your domain or network is blocking requests to this site. Just try to open it in your browser and see if it opens fine.
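To see exactly what the server says when it refuses the request, you can read the body of the 403 response out of the WebException (a sketch against the WebClient code above):
try
{
    string html = client.DownloadString("http://www.nseindia.com/");
}
catch (WebException ex)
{
    var errorResponse = ex.Response as HttpWebResponse;
    if (errorResponse != null)
    {
        Console.WriteLine(errorResponse.StatusCode); // Forbidden (403)
        using (var reader = new StreamReader(errorResponse.GetResponseStream()))
        {
            Console.WriteLine(reader.ReadToEnd());   // the server's error page, if any
        }
    }
}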
Looks like NSE made some changes; now you need to use these two headers:
client.Headers[HttpRequestHeader.Accept] = "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8";
client.Headers.Add("user-agent", "Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.31 (KHTML, like Gecko) Chrome/26.0.1410.64 Safari/537.31");
