Differentiate between client app and browser in ASMX Web Service? - c#

This is a follow-up to Choosing a Connection String based on kind of request, for which I got no answer, and what I thought worked doesn't.
I have a web service that needs to choose a specific connection string based on whether the user is calling it from a browser or from a client application.
I tried:
HttpContext.Current != null ? ConnectionStrings["Website"].ConnectionString : ConnectionStrings["Client"].ConnectionString
but realized that at some point, even if I'm using the client application, there is some HttpContext (if someone can explain why, it'd be great), but the Browser field under Request is "Unknown". So, then I tried:
if ( HttpContext.Current != null )
{
    if ( HttpContext.Current.Request.Browser.Browser != "Unknown" )
    {
        // browser connection string here
    }
    else
    {
        // client app connection string here
    }
}
else
{
    // client app connection string here
}
This worked wonders when debugging, but in the testing environment it still picks the browser connection string even when calling from the client app, as if at some point the Browser isn't "Unknown"...
Is there a MUCH easier/simpler way to do this? The way I'm doing it seems really ugly.
I'm quite desperate at the moment, as I have no idea why this is happening.

Rather than detecting and switching on the browser type, consider these two suggestions:
Add Custom Request Headers
In your various callers, define a new custom header in your Http request.
HttpWebRequest webRequest = (HttpWebRequest)WebRequest.Create(url);
webRequest.Headers.Add("CallerType", "ClientApp"); // "Browser", etc.
Then you know exactly and reliably what type of client is calling. This is hard to get wrong, and it won't be misreported the way browser detection can be.
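On the service side you can then branch on that header; a minimal sketch, assuming the "CallerType" header above and the "Website"/"Client" connection string names from the question:

// ConfigurationManager lives in System.Configuration.
string callerType = HttpContext.Current.Request.Headers["CallerType"];
string key = (callerType == "ClientApp") ? "Client" : "Website"; // default to the browser string
string connString = ConfigurationManager.ConnectionStrings[key].ConnectionString;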
Include The Caller Type in the QueryString
myService.asmx?BrowserType=1
Add a simple new querystring parameter to your .asmx webmethod, read as shown below. This will work just the same in a controlled environment, but if other users/developers get it wrong, or malform the expected values, you'd have to take other measures to correct/handle it.
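Inside the web method, reading the flag is a one-liner; a sketch assuming the BrowserType parameter from the URL above:

string browserType = HttpContext.Current.Request.QueryString["BrowserType"];
bool isBrowser = (browserType == "1");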
Both allow you to easily determine the connString from the incoming value. Perhaps in the absence of a modifier/header you could assume a default. Your sample question has 2 basic outcomes, and either suggested solution is easy to extend (browser, client app, iPhone, what have you).

Related

PuppeteerSharp and Page Level Proxies

I know this is possible using Puppeteer in JS, but I'm wondering if anyone has figured out how to proxy on a page level in PuppeteerSharp (different proxies for different tabs).
It seems I can catch the request, but I'm not sure how to adjust the proxy.
page.SetRequestInterceptionAsync(true).Wait();
page.Request += (s, ev) =>
{
    // what to do?
};
Edit
I am aware that I can set the proxy at the browser level, like so:
var browser = await Puppeteer.LaunchAsync(new LaunchOptions
{
    Headless = false,
    Args = new[] { "--proxy-server=host:port" }
});
var page = await browser.NewPageAsync();
await page.AuthenticateAsync(new Credentials() { Username = "username", Password = "password" });
But this is not what I'm trying to do. I'm trying to set the proxy for each page within a single browser instance. I want to test lots of proxies, so spawning a new browser instance just to set the proxy is too much overhead.
You can use a separate browser instance for each logical instance: instead of trying to set a different proxy for each page/tab, just create a new browser instance and set the proxy via the launch args.
If this solution doesn't fit your needs, check this question. There is a library for Node.js that gives you the ability to use a different proxy for each page/tab. You can read that library's source code and implement the same thing in your C# application.
That library uses a very simple method: instead of sending requests through Puppeteer's browser/page machinery, it sends them via Node.js HTTP tools. This is done with the page.setRequestInterception method: the library intercepts each request from the page, gathers its data, and re-sends the request through its own HTTP tooling. I used C# a long time ago, so maybe I am wrong, but you could try HttpWebRequest or something similar. After you get the result, call request.respond and pass the response data there. In this way you can put any kind of proxy inside your application. Check the library's code here.
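In PuppeteerSharp the same idea looks roughly like this; a sketch that assumes the Request event and RespondAsync API of recent PuppeteerSharp versions, only handles GET, and ignores request headers, so treat it as a starting point rather than a complete implementation:

await page.SetRequestInterceptionAsync(true);

// One HttpClient per page, each with its own proxy (placeholder address).
var handler = new HttpClientHandler
{
    Proxy = new WebProxy("host:port"),
    UseProxy = true
};
var client = new HttpClient(handler);

page.Request += async (s, ev) =>
{
    try
    {
        // Re-issue the intercepted request through our own proxied client...
        var result = await client.GetAsync(ev.Request.Url);
        var body = await result.Content.ReadAsByteArrayAsync();

        // ...and hand the response back to the page.
        await ev.Request.RespondAsync(new ResponseData
        {
            Status = result.StatusCode,
            ContentType = result.Content.Headers.ContentType != null
                ? result.Content.Headers.ContentType.ToString()
                : null,
            BodyData = body
        });
    }
    catch
    {
        // If the upstream request fails, cancel the page request.
        await ev.Request.AbortAsync();
    }
};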

Session-aware webclient not working properly

I'm trying to web-crawl a site that uses PHP sessions via cookies. It is a good ol' SquirrelMail webmail server.
I saw a couple of posts like this one, but it's not working for me.
When I reach the point where the cookies are sent by the host, I try to retrieve them using:
HttpWebResponse rs = (HttpWebResponse)rq.GetResponse();
CookieCollection cc = new CookieCollection();
cc.Add(rs.Cookies);
But rs.Cookies comes back empty. However, there are Set-Cookie headers on the response, which I try to use as a guide to build the actual cookies, like this:
for (int i = 0; i < rs.Headers.Count; i++)
{
    if (rs.Headers.Keys[i].ToLower().Contains("cookie"))
    {
        string val = rs.Headers[i];
        string[] vv = val.Split(";=,".ToCharArray());
        Cookie co = new Cookie(vv[0], vv[1]);
        // I know this is not the cleanest way to do it.
        // I've tried to manually set different values for
        // co.Domain, co.Path and co.HttpOnly, just to get a working
        // example. I tried different alternatives, but it doesn't
        // seem to change anything.
        cc.Add(co);
    }
}
Next, I send the cookies to request the next page, which is nothing but a frameset. The fact that I reach the frameset means I've been successfully authenticated and the session cookie is working. However, when I request one of the frames, I get an authentication-error web page. I've done my research, and the cookies do not change in the meantime. What may be going wrong?
Some may wonder why I'm trying to access webmail when there is POP/SMTP to do a cleaner job. The answer is that this is just a first example to learn the basics; I don't really care what the site is as long as I can successfully manage sessions.
I don't think posting all the code is a good idea yet, since it is a bit messy and long: I planned to clean it up once it worked (I'll post it if you think it's worth the confusion). Moreover, I think I may have a conceptual error related to the frames that may be the key to solving the problem.
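For what it's worth, the usual way to keep a session like this alive with HttpWebRequest is to share one CookieContainer across every request, so Set-Cookie headers are parsed and replayed automatically; a minimal sketch (the URLs are placeholders, not the actual SquirrelMail paths):

// One container for the whole session.
CookieContainer cookies = new CookieContainer();

HttpWebRequest rq = (HttpWebRequest)WebRequest.Create("http://example.com/src/login.php");
rq.CookieContainer = cookies; // without this, rs.Cookies stays empty

using (HttpWebResponse rs = (HttpWebResponse)rq.GetResponse())
{
    // Set-Cookie headers are parsed into the container automatically.
}

// Reuse the same container for the frameset and for each frame request.
HttpWebRequest rq2 = (HttpWebRequest)WebRequest.Create("http://example.com/src/right_main.php");
rq2.CookieContainer = cookies;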

Test Proxy to see if it requires Authentication

I have a list of proxies that I want to loop through and test to make sure they are working and also test to make sure that they do not require a username and password.
However, the test does not seem to be working correctly. For example, I have one proxy that I know requires a username and password, but it is somehow getting through the test.
Here is the sample code that I have:
HttpWebRequest webReq = (HttpWebRequest)System.Net.HttpWebRequest.Create("http://www.google.com");
webReq.Proxy = new WebProxy(proxy);
HttpWebResponse webRes = (HttpWebResponse)webReq.GetResponse();
if (webRes.StatusCode != HttpStatusCode.ProxyAuthenticationRequired)
{
    Stream myStream = webRes.GetResponseStream();
    if (myStream != null)
    {
        success = true;
    }
}
For example, the following proxy requires authentication: "66.60.148.11:3128". However, when I run the code, webRes.StatusCode comes back as OK and passes the webRes.StatusCode != HttpStatusCode.ProxyAuthenticationRequired test.
Any ideas or suggestions are appreciated.
Thanks!
I would imagine the proxy that is letting the request through (so to speak) is not following HTTP standards and is not returning a 407 status code. You can verify this by doing a packet sniff with something like Wireshark (http://www.wireshark.org/) or by using browser debug tools like Chrome's Developer Tools (Ctrl+Shift+I). You could also check what status code is actually returned by looking at the value of webRes.StatusCode in this case. Note also that GetResponse() throws a WebException for non-success status codes, so a genuine 407 may never even reach your StatusCode check.
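A sketch of a check that handles both cases, using the proxy variable from the question (not tested against that specific proxy):

bool requiresAuth = false;
bool success = false;
try
{
    HttpWebRequest webReq = (HttpWebRequest)WebRequest.Create("http://www.google.com");
    webReq.Proxy = new WebProxy(proxy);
    using (HttpWebResponse webRes = (HttpWebResponse)webReq.GetResponse())
    {
        success = (webRes.StatusCode == HttpStatusCode.OK);
    }
}
catch (WebException ex)
{
    // Non-success status codes surface here, not as a normal response.
    HttpWebResponse res = ex.Response as HttpWebResponse;
    if (res != null && res.StatusCode == HttpStatusCode.ProxyAuthenticationRequired)
    {
        requiresAuth = true; // the proxy answered 407
    }
}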

Adobe AIR & Web Service Call. Errors with Error #1085: The element type "br" must be terminated by the matching end-tag "</br>"

Here is my simple code, which works fine when called from PHP or any client other than Adobe AIR. The same code also works when called from a SWF; there is FluorineFX code in other parts of the project as well, but that doesn't do anything to break this.
I did find one thing: all POST calls were somehow being changed to GET, which really amazes me. I would be so glad to get an answer for this. Thanks in advance, everyone. Below is almost the same code from my web service, with the AIR code just under it.
[WebMethod(EnableSession = true)]
public bool Authenticate(string UserName, string Password)
{
    try
    {
        if (Membership.ValidateUser(UserName, Password))
        {
            FormsAuthentication.SetAuthCookie(UserName, true);
            return true;
        }
        return false;
    }
    catch (Exception)
    {
        return false;
    }
}
And my call from the Adobe AIR code is below:
var ws:WebService = new WebService();
ws.wsdl = "http://mysite.com/myservice.asmx?WSDL";
ws.useProxy = false;
ws.addEventListener(LoadEvent.LOAD, onWSDLLoad);
ws.loadWSDL();
ws.Authenticate.addEventListener(ResultEvent.RESULT, onLoginResultHandler);
ws.Authenticate.addEventListener(FaultEvent.FAULT, onLoginFaultHandler);
ws.Authenticate("usrname", "password");
protected function onLoginFaultHandler(event:FaultEvent):void
{
    Alert.show('Login failed with message\r\n[ '+event.fault.faultString+' ]');
    /* Error #1085: The element type "br" must be terminated
       by the matching end-tag "</br>". */
    /* checking the content value of the fault event shows the
       same output as http://mysite.com/myservice.asmx */
}
protected function onLoginResultHandler(event:ResultEvent):void
{
    /* on success code */
}
This guy tells us the following at http://verveguy.blogspot.com/2008/07/truth-about-flex-httpservice.html:
1. All HTTP GET requests are stripped of headers. It's not in the Flex stack, so it's probably the underlying Flash Player runtime.
2. All HTTP GET requests that have a content type other than "application/x-www-form-urlencoded" are turned into POST requests.
3. All HTTP POST requests that have no actual posted data are turned into GET requests. See 1 and 2.
4. All HTTP PUT and HTTP DELETE requests are turned into POST requests. This appears to be a browser limitation that the Flash Player is stuck with.
I do see my request above turn into a GET, but then I DO have POST values in it. Or are those somehow not sent or recorded by the web service object?
This is pretty simple... The Flex XML parser uses strict XML checking, so all tags must be closed. If you can change the web service, then change all <br> tags to <br />.
I finally found the answer myself. It turns out I had cookies set to AutoDetect, which meant that when AIR called a URL, the server would redirect in order to keep the cookie/session value inside the URI itself.
Now I've switched that to UseCookies and everything is back to normal. I tested this against a sample web service and realized it was the server side that was doing something wrong; between AIR and the browser, cookies are the only difference.
Somehow NuSOAP for PHP is smart enough to follow the AutoDetect redirect to the new URI of the web service, but AIR couldn't. Anyway, thanks everyone for helping me solve this.
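For reference, the setting in question lives in web.config; a minimal sketch of the change described above (other attributes and the exact element placement will depend on your configuration):

<system.web>
  <sessionState cookieless="UseCookies" />
  <authentication mode="Forms">
    <forms cookieless="UseCookies" />
  </authentication>
</system.web>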

Request.UrlReferrer not working in IE7

I have the following code:
if (Request.UrlReferrer != null)
{
    if (Request.UrlReferrer.PathAndQuery.ToLowerInvariant() == "/test/content.htm")
    {
        postbacklink = Request.UrlReferrer.AbsoluteUri.Replace("/TEST/Content.htm", "/Testing.aspx?") + Request.QueryString;
    }
    else
    {
        postbacklink = Request.UrlReferrer.AbsoluteUri;
    }
}
ExtendedLoanView.PostbackLink = postbacklink;
Now this page can be accessed from two different locations, which means this code:
postbacklink = Request.UrlReferrer.AbsoluteUri.Replace("/TEST/Content.htm", "/Test.aspx?") + Request.QueryString;
can only work with one page (Test.aspx) and is hard-coded. So in IE7, Request.UrlReferrer shows me this:
Request.UrlReferrer = {http://Testing:12345/PPP/Content.htm}
Whereas in IE8+ I am getting this value:
Request.UrlReferrer = {http://Testing:12345/PPP/TestingPage.aspx?Name=Xyz&Address=123 YYY}
How should I solve this issue? It's been bugging me for the past month.
I would definitely advise not basing your logic on request information (any more than on user-entered values). The thing is that it will differ across browsers, and it is really easy to tamper with.
If you still need to pass information from client to server, make sure it is validated. If you need it to stay in sync and hold valid information, do not rely on what the browsers give you; set it yourself and then read it from a place in the request that you set (for example, a hidden input, a control, a variable in the viewstate, or whatever your technology allows).
Most sites handle the situation you're trying to solve by passing the destination URL in the URL itself, in a query parameter. For example:
http://www.example.com/Login.aspx?returnUrl=/TEST/content.htm
EDIT: I do realize that everything you send to the client is very hackable anyway, but if you set it yourself, it's easier for you to validate that it hasn't been tampered with. An example is the ViewState validation methods.
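A minimal sketch of that pattern in ASP.NET (the returnUrl parameter name and the validation rule are illustrative, not from the original page):

// When building the link, pass the destination explicitly instead of relying on UrlReferrer.
string link = "/Testing.aspx?returnUrl=" + HttpUtility.UrlEncode(Request.Url.PathAndQuery);

// On the target page, read it back and validate it before use.
string returnUrl = Request.QueryString["returnUrl"];
if (string.IsNullOrEmpty(returnUrl) || !returnUrl.StartsWith("/"))
{
    returnUrl = "/default.aspx"; // reject absolute/external URLs
}
ExtendedLoanView.PostbackLink = returnUrl;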
