HttpWebRequest calling a page twice - C#

Problem: This console app calls a long-running web page hosted on Azure twice; I want it to call the page only once.
The console app fails with a caught exception: "The underlying connection was closed: An unexpected error occurred on a receive."
If I call the page from Chrome, it runs only once (as expected).
public class ExtendedWebClient : WebClient
{
    public int Timeout { get; set; }

    protected override WebRequest GetWebRequest(Uri address)
    {
        HttpWebRequest request = (HttpWebRequest)base.GetWebRequest(address);
        if (request != null)
        {
            request.Timeout = Timeout;
            request.KeepAlive = false;
            request.ProtocolVersion = HttpVersion.Version10;
        }
        return request;
    }

    public ExtendedWebClient()
    {
        Timeout = 1000000; // in ms; the default is 100,000
    }
}
class Program
{
    static void Main(string[] args)
    {
        var taskUrl = "http://secret.net/SendWeeklyEmails.aspx";
        // Create a web client and issue an HTTP GET to our URL.
        try
        {
            using (ExtendedWebClient httpRequest = new ExtendedWebClient())
            {
                var output = httpRequest.DownloadString(taskUrl);
            }
        }
        catch (Exception ex)
        {
            Console.WriteLine("Exception was: " + ex.Message);
        }
    }
}

Simple answer - I don't believe this client calls the page twice!
If your call is long-running and Azure doesn't allow long-polling, then you will need to rearchitect this app so that there are separate calls for starting the "SendWeeklyEmails" task and for monitoring its progress. You could even do this from your command-line client code instead of using the web app, as sketched below.
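As a rough illustration of that split, a minimal console-side sketch might look like the following. The /StartWeeklyEmails.aspx and /WeeklyEmailsStatus.aspx endpoints and the "Done" status string are hypothetical names for a start endpoint and a status endpoint you would have to add to the site; only ExtendedWebClient comes from the code above.
// Minimal sketch of a start-then-poll client; endpoint names and the
// "Done" status string are assumptions, not part of the original site.
static void RunWeeklyEmails()
{
    using (var client = new ExtendedWebClient())
    {
        // Kick off the task; this call should return quickly.
        client.DownloadString("http://secret.net/StartWeeklyEmails.aspx");

        // Poll a lightweight status page until the task reports completion.
        while (true)
        {
            var status = client.DownloadString("http://secret.net/WeeklyEmailsStatus.aspx");
            if (status == "Done")
                break;
            System.Threading.Thread.Sleep(5000); // wait 5 seconds between polls
        }
    }
}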

Related

HTTP GET response from an Azure website through a console application in C#

I have a requirement where all I have to do is check if I am able to get a valid response from an Azure website through a console application in C#. I have written the code below:
class ServerDBAvailability
{
    static void Main(string[] args)
    {
        WebSiteIsAvailable("https://somehostname.azurewebsites.net/");
    }

    public static bool WebSiteIsAvailable(string Url)
    {
        string Message = string.Empty;
        HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(Url);

        // Set the credentials to the current user account
        request.Credentials = System.Net.CredentialCache.DefaultCredentials;
        request.Method = "GET";

        try
        {
            using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            {
                // Do nothing; we're only testing to see if we can get the response
                //Console.WriteLine(response);
            }
        }
        catch (WebException ex)
        {
            Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
        }

        return (Message.Length == 0);
    }
}
But when I run this code locally, it gives me a "401 Unauthorised" error on the line HttpWebResponse response = (HttpWebResponse)request.GetResponse(). I am able to access the Azure website through Chrome.
I'm not able to understand why I am getting the unauthorised error even though I have access to this website. Please note that this happens only for the site hosted on Azure; for my other on-premises sites this code works fine.
Can someone help me out here?

Make the application wait after posting a request until the website returns a response, using event handlers

I would like to write an event handler which triggers a method when a website returns a response.
My application fetches responses by posting URLs to a few websites. Unfortunately, at times a website returns its response after some delay (it may take 1 to 5 seconds), and that causes my application to throw an error because the next request executes without waiting for the previous request to get its response.
I could put a sleep after every request the application posts, but that doesn't seem to be the right way: if I set the sleep time to 5 seconds and the website returns its response in 1 second, the process waits 4 more seconds unnecessarily.
To save some processing time I decided to add event handlers which should allow the application to run the next request as soon as we get the response for the previous one.
So I tried something like this; I am able to fire the trigger, but it is not working the way I want.
My intention is to create a trigger which makes the next request run after getting the response to the previous request, waiting at most 5 seconds.
Can someone please help me with this? Thanks in advance.
public delegate void ChangedEventHandler(string response);

public class ListWithChangedEvent : EventArgs
{
    public event ChangedEventHandler Changed;

    protected virtual void OnChanged(string response)
    {
        if (Changed != null)
        {
            Changed(response);
        }
    }

    public void Validate(string response)
    {
        OnChanged(response);
    }
}

public class EventListener
{
    private ListWithChangedEvent _list;

    public EventListener(ListWithChangedEvent list)
    {
        _list = list;
        _list.Changed += new ChangedEventHandler(ListChanged);
    }

    private void ListChanged(string response)
    {
        if (!string.IsNullOrEmpty(response))
        {
            return;
        }
    }
}

// Validating the response after the request has been posted
private void _postRequestAndParseResponse()
{
    _performPostRequest(_returnTailPart(_urls.CommonUrl), argumentsList);
    ListWithChangedEvent list = new ListWithChangedEvent();
    EventListener listener = new EventListener(list);
    list.Validate(_docNode.InnerHtml);
}
HTTP timeouts are a built-in feature of most HTTP clients and are the simplest way of requesting a web resource with a timeout specified.
If you are using WebRequest you can use its Timeout property. Here's an example:
// sw is assumed to be a Stopwatch field; Log prefixes each message with the elapsed milliseconds.
private readonly System.Diagnostics.Stopwatch sw = new System.Diagnostics.Stopwatch();

private void Log(string message)
{
    Console.WriteLine(sw.ElapsedMilliseconds + "ms | " + message);
}

public void Test()
{
    const int timeoutMs = 5000;
    sw.Start();
    RequestWithTimeout("https://google.com", timeoutMs);
    RequestWithTimeout("http://deelay.me/7000/google.com", timeoutMs);
    RequestWithTimeout("http://thisurelydoesnnotexist.com", timeoutMs);
    RequestWithTimeout("http://google.com", timeoutMs);
}

private void RequestWithTimeout(string url, int timeoutMs)
{
    try
    {
        Log("Webrequest at " + url + " starting");
        WebRequest req = WebRequest.Create(url);
        req.Timeout = timeoutMs;
        using (var response = req.GetResponse())
        {
            Log("Webrequest at " + url + " finished");
        }
    }
    catch (WebException webException)
    {
        Log("WebRequest failed: " + webException.Status);
    }
    catch (Exception ex)
    {
        Log(ex.ToString());
    }
}
Output:
0ms | Webrequest at https://google.com starting
169ms | Webrequest at https://google.com finished
170ms | Webrequest at http://deelay.me/7000/google.com starting
5186ms | WebRequest failed: Timeout
5186ms | Webrequest at http://thisurelydoesnnotexist.com starting
5247ms | WebRequest failed: NameResolutionFailure
5247ms | Webrequest at http://google.com starting
5311ms | Webrequest at http://google.com finished
If you are using WebClient you can also easily configure timeouts; check out this answer: https://stackoverflow.com/a/6994391/5056245
If you really need to implement a timeout at the method-call level, check this out:
Implementing a timeout on a function returning a value
If none of those answers work for you, please tell us how you are requesting your web resources.
You have not specified how you are calling the URLs. If you use HttpClient (http://www.asp.net/web-api/overview/advanced/calling-a-web-api-from-a-net-client) it can be done as follows:
using (var client = new HttpClient())
{
    var task = client.PostAsync(url1, ..., ...);
    if (!task.Wait(5000))
    {
        // Task not completed within the specified interval (5 sec); take action accordingly.
    }
    else
    {
        // Task completed within 5 seconds; take action accordingly. You can access the response using task.Result.
    }
    // Continue with other urls as needed
}
There can be many other ways as well. Please post your code for calling the URLs if this doesn't answer your question.
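One of those other ways, sketched here as an assumption rather than part of the original answer, is to let HttpClient enforce the timeout itself via its Timeout property (this assumes you are inside an async method; url1 is the same URL variable used above):
// Minimal sketch: HttpClient enforces the timeout on its own.
// A timed-out request surfaces as a TaskCanceledException.
using (var client = new HttpClient())
{
    client.Timeout = TimeSpan.FromSeconds(5);
    try
    {
        HttpResponseMessage response = await client.GetAsync(url1);
        string body = await response.Content.ReadAsStringAsync();
        // Use the response body here.
    }
    catch (TaskCanceledException)
    {
        // The request did not complete within 5 seconds.
    }
}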

How to know if a website/domain is available before loading a webview with that URL

Hello, I am trying to launch an intent with a WebView from a user-entered URL. I have been looking everywhere online and I can't find a concrete answer as to how to make sure the website will actually connect before allowing the user to proceed to the next activity. I have found many tools to make sure the URL follows the correct format, but none that actually let me confirm it can connect.
You can use WebClient and check if any exception is thrown:
using (var client = new HeadOnlyClient())
{
    try
    {
        client.DownloadString("http://google.com");
    }
    catch (Exception ex)
    {
        // URL is not accessible.
    }
}
You can catch more specific exceptions to make it more elegant, as in the sketch below.
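For instance, a variant of the same check that distinguishes a WebException (DNS failure, timeout, HTTP error) from other errors might look roughly like this:
// Sketch: catch WebException separately so network/HTTP failures
// can be reported differently from unexpected errors.
using (var client = new HeadOnlyClient())
{
    try
    {
        client.DownloadString("http://google.com");
    }
    catch (WebException webEx)
    {
        // URL is not accessible (DNS failure, timeout, or HTTP error).
        Console.WriteLine("Request failed: " + webEx.Status);
    }
    catch (Exception ex)
    {
        // Some other, unexpected problem.
        Console.WriteLine(ex.Message);
    }
}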
You can also apply a custom modification to WebClient so it issues only a HEAD request and downloads less data:
class HeadOnlyClient : WebClient
{
    protected override WebRequest GetWebRequest(Uri address)
    {
        WebRequest req = base.GetWebRequest(address);
        req.Method = "HEAD";
        return req;
    }
}
I would suggest using HttpHead for a simple request with AndroidHttpClient, but it is deprecated now. You could try to implement a HEAD request with sockets instead.
You can try to ping the address first.
See this SO question: How to Ping External IP from Java Android
Another option:
Connectivity Plugin for Xamarin and Windows
Task<bool> IsReachable(string host, int msTimeout = 5000);
But any pre-check that succeeds isn't a guarantee, as the very next request might still fail, so you should still handle that case.
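As a rough usage sketch of the signature quoted above (the CrossConnectivity.Current accessor is an assumption about the plugin's entry point, so treat this as illustrative only, and note the follow-up request is still guarded):
// Sketch: pre-check reachability, then still guard the real request.
bool reachable = await CrossConnectivity.Current.IsReachable("google.com", 5000);
if (reachable)
{
    try
    {
        // Load the WebView / issue the real request here.
    }
    catch (WebException)
    {
        // The pre-check passed, but the request can still fail; handle it anyway.
    }
}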
Here's what I ended up doing to check if a host name is reachable. I was connecting to a site with a self-signed certificate, which is why I add the delegate to ServerCertificateValidationCallback.
private async Task<bool> CheckHostConnectionAsync(string serverName)
{
    string Message = string.Empty;
    HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(serverName);

    ServicePointManager.ServerCertificateValidationCallback += delegate
    {
        return true;
    };

    // Set the credentials to the current user account
    request.Credentials = System.Net.CredentialCache.DefaultCredentials;
    request.Method = "GET";
    request.Timeout = 1000 * 40;

    try
    {
        using (HttpWebResponse response = (HttpWebResponse)await request.GetResponseAsync())
        {
            // Do nothing; we're only testing to see if we can get the response
        }
    }
    catch (WebException ex)
    {
        Message += ((Message.Length > 0) ? "\n" : "") + ex.Message;
        return false;
    }

    if (Message.Length == 0)
    {
        goToMainActivity(serverName);
    }
    return true;
}

HTTPS web request failing

When I run the program below, the first HTTPS request succeeds but the second request fails. Both URLs are valid and both can be accessed successfully in a browser. Any suggestions as to what needs to be done to access the second URL successfully?
using System;
using System.IO;
using System.Net;

public class Program
{
    private static void Main(string[] args)
    {
        var content = "";
        bool status;
        var url1 = "https://mail.google.com";
        var url2 = "https://my.ooma.com";

        status = DoHttpRequest(url1, out content);
        OutputStatus(url1, status, content);

        status = DoHttpRequest(url2, out content);
        OutputStatus(url2, status, content);

        Console.ReadLine();
    }

    private static void OutputStatus(string url, bool status, string content)
    {
        if (status) Console.WriteLine("Url={0}, Status=Success, content length = {1}", url, content.Length);
        else Console.WriteLine("Url={0}, Status=Fail, ErrorMessage={1}", url, content);
    }

    private static bool DoHttpRequest(string url, out string content)
    {
        content = "";
        var request = (HttpWebRequest)WebRequest.Create(url);
        try
        {
            request.Method = "GET";
            request.CookieContainer = null;
            request.Timeout = 25000; // 25 seconds
            var response = (HttpWebResponse)request.GetResponse();
            var streamReader = new StreamReader(response.GetResponseStream());
            content = streamReader.ReadToEnd();
            return true;
        }
        catch (WebException ex)
        {
            content = ex.Message;
            return false;
        }
    }
}
Historically, most problems of this description that I've seen occur when you forget to call .Close() on the object returned from GetResponseStream(). The problem exists because when you forget to close the first request, the second request deadlocks waiting for a free connection.
Typically this hang happens on the 3rd request, not the second.
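For reference, a minimal sketch of the request helper from the question with the response and reader properly disposed (a using block calls Close for you); this only illustrates the advice above, it is not the poster's final code:
// Sketch: dispose the response and reader so the connection is returned
// to the pool and later requests don't wait on it.
private static bool DoHttpRequest(string url, out string content)
{
    content = "";
    var request = (HttpWebRequest)WebRequest.Create(url);
    request.Method = "GET";
    request.Timeout = 25000; // 25 seconds
    try
    {
        using (var response = (HttpWebResponse)request.GetResponse())
        using (var streamReader = new StreamReader(response.GetResponseStream()))
        {
            content = streamReader.ReadToEnd();
            return true;
        }
    }
    catch (WebException ex)
    {
        content = ex.Message;
        return false;
    }
}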
Update: Looking at your repro, this has nothing to do with the order of the requests. You're hitting a problem because this site is sending a TLS Warning at the beginning of the HTTPS handshake, and .NET will timeout when that occurs. See http://blogs.msdn.com/b/fiddler/archive/2012/03/29/https-request-hangs-.net-application-connection-on-tls-server-name-indicator-warning.aspx. The problem only repros on Windows Vista and later, because the warning is related to a TLS extension that doesn't exist in the HTTPS stack on WinXP.
Increase your request timeout.
request.Timeout = 60000; // 60 seconds
Maybe your network connection is a bit slow. I ran it with 25 seconds and it was okay. (Yes, the second URL takes longer to respond than the first one.)

HttpWebRequest Exception Handling

I am making an asynchronous HttpWebRequest and, if that fails, I want to call a backup web service, like so:
public void CallService1()
{
    HttpWebRequest request = (HttpWebRequest)WebRequest.Create("http://MyFirstWebService");
    request.BeginGetResponse(this.CallService1Completed, request);
}

public void CallService1Completed(IAsyncResult result)
{
    HttpWebRequest request = (HttpWebRequest)result.AsyncState;
    try
    {
        using (HttpWebResponse response = (HttpWebResponse)request.EndGetResponse(result))
        {
            using (Stream responseStream = response.GetResponseStream())
            {
                // Use Data
            }
        }
    }
    catch (WebException webException)
    {
        if (?????)
        {
            CallBackupService2();
        }
    }
}
Bear in mind that this is a mobile application, where you may not always have an internet connection. I do not want to call the backup service if there is no internet connection; I only want to call it if the first service is down for some reason. What would I put in the 'if' statement above?
It can be implemented like this:
if (NetworkInterface.GetIsNetworkAvailable())
{
    CallBackupService2();
}
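Dropped into the catch block from the question, that check would look roughly like this; NetworkInterface.GetIsNetworkAvailable comes from System.Net.NetworkInformation, and this is just a sketch of how the two pieces fit together:
catch (WebException webException)
{
    // Only fall back to the backup service when the network is still available,
    // i.e. the failure was the first service itself rather than connectivity.
    if (NetworkInterface.GetIsNetworkAvailable())
    {
        CallBackupService2();
    }
}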
