Can you supply credentials through WebClient in C#?

I need to access a PHP page that sits behind a central authentication service (CAS). Whenever I try to access this site by any other means, it redirects to the CAS page and waits for authentication, which essentially has me supplying GET variables to the authentication page (do not want). Unfortunately, we cannot remove the CAS from this page, because that would be a massive security risk.
So my question is: is there any way to supply my credentials to the CAS, store any cookies from that exchange, and then use them to access the PHP page?
My attempt is below.
Since I'm using the GET method, I don't need to wait until I'm on the PHP page to supply the values; I just need to access the page, which we'll call https://site.com/page.php?var1=value1&var2=value2
As WebClient accessing page with credentials suggested, I created a CookieAwareWebClient class:
public class CookieAwareWebClient : WebClient
{
    public CookieAwareWebClient()
    {
        CookieContainer = new CookieContainer();
    }

    public CookieContainer CookieContainer { get; private set; }

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = (HttpWebRequest)base.GetWebRequest(address);
        request.CookieContainer = CookieContainer;
        return request;
    }
}
and then the following to submit:
using (var client = new CookieAwareWebClient())
{
    var values = new NameValueCollection
    {
        { "username", "usr" },
        { "password", "(passw0rd!)" }, // don't worry, not actual information
    };
    // This should bring me to the login page, which will upload the values I have given
    client.UploadValues("https://site.com/page.php?var1=value1&var2=value2", values);
    string result = client.DownloadString("https://site.com/page.php?var1=value1&var2=value2");
}
Unfortunately, this still does not log me into the CAS, and result comes back as the HTML of the CAS login page.
Does anyone have any idea how I can fix this? Thank you for your help.

I don't know whether CAS supports standard HTTP auth out of the box; it is generally forms based.
After being redirected to CAS, you'll want to parse/scrape the page for a few of the hidden form variables. POST those parameters, the username, and the password (along with the cookie that was returned with the GET request).
CAS should then return an HTTP redirect, and an additional cookie, the CASTGC cookie, will be added to the collection. The WebClient should follow the redirect back to the originally requested resource; some apps may need two hops to get back.
If you will be polling the resource within the lifetime of the web application's session, make the subsequent calls using the same CookieCollection and you shouldn't have to authenticate against CAS again.
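For illustration, here's a rough sketch of that flow built on the CookieAwareWebClient above. The CAS login URL, the service parameter, and the hidden field names (lt, execution, _eventId) are assumptions based on common CAS deployments, so inspect the actual login form and adjust; it also requires using System.Text.RegularExpressions;.
string target = "https://site.com/page.php?var1=value1&var2=value2";
// Assumed CAS login URL; the service parameter tells CAS where to send you afterwards.
string casLogin = "https://cas.site.com/login?service=" + Uri.EscapeDataString(target);
using (var client = new CookieAwareWebClient())
{
    // 1. GET the login form; CAS sets its session cookie here.
    string loginPage = client.DownloadString(casLogin);
    // 2. Scrape the hidden form fields (field names assumed; check the real form).
    string lt = Regex.Match(loginPage, "name=\"lt\" value=\"([^\"]*)\"").Groups[1].Value;
    string execution = Regex.Match(loginPage, "name=\"execution\" value=\"([^\"]*)\"").Groups[1].Value;
    // 3. POST credentials plus the hidden fields back to the same login URL.
    var values = new NameValueCollection
    {
        { "username", "usr" },
        { "password", "(passw0rd!)" },
        { "lt", lt },
        { "execution", execution },
        { "_eventId", "submit" },
    };
    client.UploadValues(casLogin, values);
    // 4. The CASTGC cookie is now in the container, so this should return the real page.
    string result = client.DownloadString(target);
}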

Related

JWT auth with ASP.NET Core to create a token and store it in an HTTP-only cookie, and Angular to call a method with a header

I am new to JWT, with a basic idea of how it works. I have set the JWT token inside a cookie from my web API:
Response.Cookies.Append("X-Access-Token", foundUser.Token,
    new CookieOptions { HttpOnly = true });
Now I am trying to call a web API GET request, which is marked as authorized, from my Angular application.
But inside Angular I don't have a way to send the cookie. From a few documents I learned that HTTP-only cookies are sent automatically, without our interference, but I am not able to fetch the data and get an unauthorized error, which means the token is not being used. I have not included it in the GET method either; see below.
[Authorize(AuthenticationSchemes = JwtBearerDefaults.AuthenticationScheme)]
[HttpGet]
public async Task<ActionResult<IEnumerable<Invoice>>> GetInvoices([FromQuery] QueryParameters queryParameters)
{
    return await _uOW.InvoiceRepository.GetInvoices(queryParameters);
}
Do I have to add some logic to read the cookie in the backend? I didn't find any answer for this.
Please let me know if there is something missing.
Also, inside Angular I have not written any special code for calling this GET. Is that required?
var headerForToken = new HttpHeaders({'Authorization':`Bearer ${sessionStorage.getItem("token")}`});
return this.http.get<any>(this.APIUrl+'Invoices/GetInvoices',{headers:headerForToken });
This is my current approach, in which I am using local storage, but I really need to use the HTTP-only cookie.
If there is a link to a solution or anything else, that would be really helpful.
Update 1: So we need to add a domain for this. I tried adding a domain, but the cookie is still not visible when I inspect the cookies.
Response.Cookies.Append("X-Access-Token", foundUser.Token,
    new CookieOptions { HttpOnly = true, Domain = ".localhost:4200" });
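(A side note on the update: the Domain attribute of a cookie cannot contain a port, so ".localhost:4200" will never match; use a host name only.) One common way to make the HTTP-only cookie work, sketched below under the assumption that the API uses the standard AddJwtBearer setup: have the JwtBearer middleware read the token from the cookie when no Authorization header is present. The cookie name must match the one set above, and CORS must allow credentials.
// In the authentication setup (requires Microsoft.AspNetCore.Authentication.JwtBearer):
services.AddAuthentication(JwtBearerDefaults.AuthenticationScheme)
    .AddJwtBearer(options =>
    {
        options.Events = new JwtBearerEvents
        {
            OnMessageReceived = context =>
            {
                // Fall back to the HTTP-only cookie when no Authorization header was sent.
                if (string.IsNullOrEmpty(context.Token))
                {
                    context.Token = context.Request.Cookies["X-Access-Token"];
                }
                return Task.CompletedTask;
            }
        };
    });
On the Angular side, the request would then pass { withCredentials: true } in the request options instead of building an Authorization header from session storage.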

System.Net URL call not returning the expected HTML response

I am trying to query a website to scrape some information for my organization. This information sits behind a login page, which for now I am bypassing by logging in through the browser with my organization credentials; the website stores the details in cookies, so on any subsequent visit I am still logged in. (I know this is a hit-and-miss solution, but for my purposes it's fine. In the event I am logged out, I will just manually log back in through a browser session.)
Within this site there are two sections I need to access:
- /Memberships, in order to retrieve a list of URLs
- /Organisation?orgid=XXXXXX, the individual organisation pages, whose URLs are retrieved from the /Memberships page
Problem
Now, for some strange reason, during the call to /Memberships the HTML retrieved is perfectly fine, and I am able to get a list of all the child URLs.
string url = "https://www.ACME.com/Memberships";
var response = CallUrl(url).Result;
private static async Task<string> CallUrl(string fullUrl)
{
HttpClient client = new HttpClient();
ServicePointManager.SecurityProtocol = SecurityProtocolType.Tls13;
client.DefaultRequestHeaders.Accept.Clear();
var response = client.GetStringAsync(fullUrl);
return await response;
}
When I proceed to query any of the child URLs, I don't get the HTML response I am expecting, which would be the organisation details. Instead, I am presented with the website's login page (well, the HTML of the login page).
The code used is pretty much the same as above, but with the url variable swapped for:
string url = "https://www.ACME.com/Organisation?orgid=XXXX";
Keep in mind that in order to access both the /Memberships page and the individual /Organisation?orgid=XXXXXX pages, one must be logged in.
So what's stumping me is why I can access /Memberships but not the other pages!?
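One likely explanation: the HttpClient above sends no cookies at all, so the browser login never reaches these requests; /Memberships presumably happens to be served without authentication, while the organisation pages enforce it. A sketch of attaching the browser's session cookie manually (the cookie name .ASPXAUTH and its value are placeholders; copy the real name and value from the browser's dev tools):
private static async Task<string> CallUrl(string fullUrl)
{
    var cookies = new CookieContainer();
    // Placeholder cookie: substitute the real name/value taken from the browser.
    cookies.Add(new Uri("https://www.ACME.com"), new Cookie(".ASPXAUTH", "value-from-browser"));
    using (var handler = new HttpClientHandler { CookieContainer = cookies })
    using (var client = new HttpClient(handler))
    {
        return await client.GetStringAsync(fullUrl);
    }
}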

How to handle recaptcha on third-party site in my client application

I was curious about how people build third-party apps for sites with NO public APIs, but I could not really find any tutorials on this topic. So I decided to just give it a try. I created a simple desktop application, which uses HttpClient to send GET requests to the site I frequently use, and then parses the response and displays the data in my WPF window. This approach worked pretty well (probably because the site is fairly simple).
However, today I tried to run my application from a different place, and I kept getting 403 errors in response to my application's requests. It turned out that the network I was using goes through a VPN server, while the site I was trying to access uses CloudFlare as a protection layer, which apparently forces VPN users to solve a reCaptcha in order to access the target site.
var baseAddress = new Uri("http://www.cloudflare.com");
using (var client = new HttpClient() { BaseAddress = baseAddress })
{
    var message = new HttpRequestMessage(HttpMethod.Get, "/");
    // This line returns the CloudFlare home page if I use a regular network, and the reCaptcha page when I use a VPN.
    var result = await client.SendAsync(message);
    // This line throws if I use a VPN (403 Forbidden).
    result.EnsureSuccessStatusCode();
}
Now the question is: what is the proper way to deal with CloudFlare protection in a client application? Do I have to display the reCaptcha in my application just like a web browser does? Do I have to set any particular headers in order to get a proper response instead of a 403? Any tips are welcome, as this is a completely new area to me.
P.S. I write in C# because this is the language I'm most comfortable with, but I don't mind answers using any other language as long as they answer the question.
I guess one way to go about it is to handle the captcha in a web browser, outside the client application:
1. Parse the response to see if it is a captcha page.
2. If it is, open this page in a browser.
3. Let the user solve the captcha there.
4. Fetch the CloudFlare cookies from the browser's cookie storage. You're going to need __cfduid (user ID) and cf_clearance (proof of solving the captcha).
5. Attach those cookies to requests sent by the client application.
6. Use the application as normal for the next 24 hours (until the CloudFlare cookies expire).
Now the hard part here is step 4. It's easy to manually copy-paste the cookies to make the code snippet in my question work with a VPN:
var baseAddress = new Uri("http://www.cloudflare.com");
var cookieContainer = new CookieContainer();
using (var client = new HttpClient(new HttpClientHandler() { CookieContainer = cookieContainer }, true) { BaseAddress = baseAddress })
{
    var message = new HttpRequestMessage(HttpMethod.Get, "/");
    // I've also copy-pasted all the headers from the browser; some of those might be optional.
    message.Headers.Add("User-Agent", "Mozilla/5.0 (Windows NT 6.1; WOW64; rv:44.0) Gecko/20100101 Firefox/44.0");
    message.Headers.Add("Accept", "text/html,application/xhtml+xml,application/xml;q=0.9,*/*;q=0.8");
    message.Headers.Add("Accept-Encoding", "gzip, deflate");
    message.Headers.Add("Accept-Language", "en-US;q=0.5,en;q=0.3");
    // Adding the CloudFlare cookies.
    cookieContainer.Add(new Cookie("__cfduid", "copy-pasted-cookie-value", "/", "cloudflare.com"));
    cookieContainer.Add(new Cookie("cf_clearance", "copy-pasted-cookie-value", "/", "cloudflare.com"));
    var result = await client.SendAsync(message);
    result.EnsureSuccessStatusCode();
}
But I think it's going to be a tricky task to automate the process of fetching the cookies, due to different browsers storing cookies in different places and/or formats. Not to mention the fact that you need to use an external browser for this approach to work, which is really annoying. Still, it's something to consider.
Answer to "build third-party apps for sites with NO public APIs" is that even though some Software Vendors don't have a public api's they have partner programs.
Good example is Netflix, they used to have a public api. Some of the Apps developed when the Public Api was enabled allowed to continue api usage.
In your scenario, your client app acts as a web crawler (downloading html content and trying to parse information). What you are trying to do is to Crawl the Cloudfare data which is not meant to be crawled by a third party app (bot). From the cloudfare side, they have done the correct thing to have a Captcha which prevents automated requests.
Further, if you try to send requests at a high frequency (requests/sec), and if the Cloudfare has Threat detection mechanisms, your ip address will be blocked. I assume that they already identified the VPN server IP address you are trying to use and blacklisted that, that's why you are getting a 403.
Basically you solely depend on security holes in Cloudfare pages you try to access via the client app. This is sort of hacking Cloudfare (doing something cloudfare has restricted) which I would not recommend.
If you have a cool idea, better to contact their developer team and discuss about that.
In case you still need it: I had the very same problem and came up with the following solution two years ago.
It opens the CloudFlare-protected web page with the C# WebBrowser class, waits about 6 seconds so that CloudFlare saves the clearance cookie, and then the program saves the cookie to disk.
You need a JavaScript-capable browser like the C# WebBrowser class, because the CloudFlare captcha page needs JavaScript to function and count down in order to save the cookie; any other attempt will fail.
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Threading.Tasks;
using System.Windows.Forms;
using System.Runtime.InteropServices;
using System.Net;
using System.Threading;

namespace kek
{
    public partial class Form1 : Form
    {
        [DllImport("wininet.dll", SetLastError = true)]
        public static extern bool InternetGetCookieEx(string url, string cookieName, StringBuilder cookieData, ref int size, Int32 dwFlags, IntPtr lpReserved);

        private Uri Uri = new Uri("http://www.my-cloudflare-protected-website.com");
        private const Int32 InternetCookieHttponly = 0x2000;
        private const Int32 ERROR_INSUFFICIENT_BUFFER = 0x7A;

        public Form1()
        {
            InitializeComponent();
            webBrowser1.DocumentCompleted += new System.Windows.Forms.WebBrowserDocumentCompletedEventHandler(this.webBrowser1_DocumentCompleted);
            webBrowser1.Navigate(Uri, null, null, "User-Agent: kappaxdkappa\r\n"); // user-agent needs to be set another way if that doesn't work
        }

        private void webBrowser1_DocumentCompleted(object sender, WebBrowserDocumentCompletedEventArgs e)
        {
            int waitTime = 0;
            if (webBrowser1.DocumentTitle.Contains("We are under attack")) // check what string identifies the unique CloudFlare captcha page and put it here
            {
                waitTime = 6000;
            }
            Task.Run(async () =>
            {
                await Task.Delay(waitTime); // the cookie can be saved right away, but the waiting period might not have passed yet
                String cloudflareCookie = GetCookie(Uri, "cf_clearance");
                if (!String.IsNullOrEmpty(cloudflareCookie))
                {
                    System.IO.StreamWriter file = new System.IO.StreamWriter("c:\\CFcookie.blob"); // save to %appdata%\MyProgram\Cookies\clearence.blob
                    file.Write(cloudflareCookie);
                    file.Close();
                }
            });
        }

        String GetCookie(Uri uri, String cookieName)
        {
            // First call with a zero-sized buffer: it fails with ERROR_INSUFFICIENT_BUFFER
            // and reports the required buffer size in datasize.
            int datasize = 0;
            StringBuilder cookieData = new StringBuilder(datasize);
            InternetGetCookieEx(uri.ToString(), cookieName, cookieData, ref datasize, InternetCookieHttponly, IntPtr.Zero);
            if (Marshal.GetLastWin32Error() == ERROR_INSUFFICIENT_BUFFER && datasize > 0)
            {
                cookieData = new StringBuilder(datasize);
                if (InternetGetCookieEx(uri.ToString(), cookieName, cookieData, ref datasize, InternetCookieHttponly, IntPtr.Zero))
                {
                    if (cookieData.Length > 0)
                    {
                        CookieContainer container = new CookieContainer();
                        container.SetCookies(uri, cookieData.ToString());
                        return container.GetCookieHeader(uri);
                    }
                }
            }
            return String.Empty;
        }
    }
}
Some notes:
- Use a better user agent.
- The cookie is saved to disk as well because I needed it for something else. I'm not sure if the built-in browser saves the cookies for next time, but in case it does not, this way you can simply load it again.
- Change the "We are under attack" phrase to the one that identifies the CF captcha page you are trying to bypass.
- The __cfduid cookie is not required, afaik.
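To illustrate that reloading step, here is a sketch (mine, not part of the original answer) of attaching the saved clearance cookie to later HttpClient requests. The file path and URL match the snippet above; note that cf_clearance is tied to the user agent, so the same one must be sent:
static async Task<string> FetchWithSavedCookie()
{
    var uri = new Uri("http://www.my-cloudflare-protected-website.com");
    var cookies = new CookieContainer();
    // The file contains a "cf_clearance=..." header, as saved by GetCookie above.
    cookies.SetCookies(uri, System.IO.File.ReadAllText("c:\\CFcookie.blob"));
    using (var handler = new HttpClientHandler { CookieContainer = cookies })
    using (var client = new HttpClient(handler))
    {
        // Must match the user agent the WebBrowser used when the cookie was issued.
        client.DefaultRequestHeaders.Add("User-Agent", "kappaxdkappa");
        return await client.GetStringAsync(uri);
    }
}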
EDIT: Sorry, I was so focused on CloudFlare itself after reading the other answers here that I didn't notice you need to bypass the reCaptcha that is sometimes found on the CloudFlare page. My code can help you a bit with the browser and cookie part, but you will have a hard time solving reCaptcha, at least for now; a few weeks ago they made it even harder. I can recommend compiling your own version of Firefox and then automatically solving the captcha by hitting the checkbox. If you don't get that simple captcha, then you need to display it to the user. Mind that you also need to randomize the behaviour of your browser and how you click on the checkbox, otherwise it will detect you as a bot.

C# - HttpClient is not sending any cookies

I am developing two websites, named www.web1.com and www.web2.com.
In web1 I am saving an HTTP cookie as below:
HttpCookie AuthCookie = new HttpCookie(AppConstants.Cookie.AUTH_COOKIE);
AuthCookie.Path = "/";
AuthCookie.Value = "value1";
Response.Cookies.Add(AuthCookie);
Now what I want is to read this cookie in the second website, i.e. web2. I am trying to read it using HttpClient as below:
HttpClientHandler handler = new HttpClientHandler();
handler.CookieContainer = new CookieContainer();
HttpClient client = new HttpClient(handler);
response = client.GetAsync("http://www.web1.com").Result;
var cookies = handler.CookieContainer.GetCookies(new Uri("http://www.web1.com"));
This doesn't return any cookies; I checked via Fiddler as well. But if I open www.web1.com directly and check Fiddler, then it sends the cookie.
Please see what I am missing, such that the cookie is not returned from HttpClient.
Thanks,
SB
Not sure if this would work properly in your case, but AuthCookie.Domain = "IP/Domain"; should do the job for you.
Having said that, there are other alternatives, like a query string or a page POST to the other domain, that might interest you.
You can't get or set cookies for another domain. That would be a huge security issue (would you want me reading your site's cookies on my site?).
Some related posts:
Create a asp.net authenicated cookie on 1 site for another
I need to get all the cookies from the browser
Create cookie with cross domain
Cross domain cookies
UPDATE: A bit of clarification: As a server, you can't get or set cookies on a client for another domain, which is what you want to do. As a client, you can modify / delete cookies that a server sets for you.
In your example, your server-side code is making the request to web1.com. You are not going to get a cookie for a random client. The client isn't involved at all in your code above.
If I visit web1.com and you set a cookie called "username" with a value of "bob", I can, as a client, modify this cookie to have a value of "admin" and then potentially have admin rights to your site, depending on how you are handling your cookies.

Can I put an ASP.Net session ID in a hidden form field?

I'm using the Yahoo Uploader, part of the Yahoo UI Library, on my ASP.Net website to allow users to upload files. For those unfamiliar, the uploader works by using a Flash applet to give me more control over the FileOpen dialog. I can specify a filter for file types, allow multiple files to be selected, etc. It's great, but it has the following documented limitation:
Because of a known Flash bug, the Uploader running in Firefox in Windows does not send the correct cookies with the upload; instead of sending Firefox cookies, it sends Internet Explorer’s cookies for the respective domain. As a workaround, we suggest either using a cookieless upload method or appending document.cookie to the upload request.
So, if a user is using Firefox, I can't rely on cookies to persist their session when they upload a file. I need their session because I need to know who they are! As a workaround, I'm using the Application object thusly:
Guid UploadID = Guid.NewGuid();
Application.Add(UploadID.ToString(), User);
So, I'm creating a unique ID and using it as a key to store the Page.User object in the Application scope. I include that ID as a variable in the POST when the file is uploaded. Then, in the handler that accepts the file upload, I grab the User object thusly:
IPrincipal User = (IPrincipal)Application[Request.Form["uploadid"]];
This actually works, but it has two glaring drawbacks:
1. If IIS, the app pool, or even just the application is restarted between the time the user visits the upload page and actually uploads a file, their "uploadid" is deleted from application scope and the upload fails, because I can't authenticate them.
2. If I ever scale to a web farm (possibly even a web garden) scenario, this will completely break. I might not be worried, except I do plan on scaling this app in the future.
Does anyone have a better way? Is there a way for me to pass the actual ASP.Net session ID in a POST variable, then use that ID at the other end to retrieve the session?
I know I can get the session ID through Session.SessionID, and I know how to use YUI to post it to the next page. What I don't know is how to use that SessionID to grab the session from the state server.
Yes, I'm using a state server to store the sessions, so they persist application/IIS restarts, and will work in a web farm scenario.
Here is a post from the maintainer of SWFUpload which explains how to load the session from an ID stored in Request.Form. I imagine the same thing would work for the Yahoo component.
Note the security disclaimers at the bottom of the post.
By including a Global.asax file and the following code you can override the missing Session ID cookie:
using System;
using System.Web;
using System.Web.Security; // for FormsAuthentication

public class Global_asax : System.Web.HttpApplication
{
    private void Application_BeginRequest(object sender, EventArgs e)
    {
        /*
        Fix for the Flash Player cookie bug in non-IE browsers.
        Since Flash Player always sends the IE cookies, even in Firefox,
        we have to bypass the cookies by sending the values as part of the POST or GET
        and overwrite the cookies with the passed-in values.

        The theory is that at this point (BeginRequest) the cookies have not been read by
        the session and authentication logic, and if we update the cookies here we'll get our
        session and authentication restored correctly.
        */
        HttpRequest request = HttpContext.Current.Request;

        try
        {
            string sessionParamName = "ASPSESSID";
            string sessionCookieName = "ASP.NET_SESSIONID";
            string sessionValue = request.Form[sessionParamName] ?? request.QueryString[sessionParamName];
            if (sessionValue != null)
            {
                UpdateCookie(sessionCookieName, sessionValue);
            }
        }
        catch (Exception ex)
        {
            // TODO: Add logging here.
        }

        try
        {
            string authParamName = "AUTHID";
            string authCookieName = FormsAuthentication.FormsCookieName;
            string authValue = request.Form[authParamName] ?? request.QueryString[authParamName];
            if (authValue != null)
            {
                UpdateCookie(authCookieName, authValue);
            }
        }
        catch (Exception ex)
        {
            // TODO: Add logging here.
        }
    }

    private void UpdateCookie(string cookieName, string cookieValue)
    {
        HttpCookie cookie = HttpContext.Current.Request.Cookies.Get(cookieName);
        if (cookie == null)
        {
            HttpCookie newCookie = new HttpCookie(cookieName, cookieValue);
            Response.Cookies.Add(newCookie);
        }
        else
        {
            cookie.Value = cookieValue;
            HttpContext.Current.Request.Cookies.Set(cookie);
        }
    }
}
Security Warning: Don't just copy and paste this code into your ASP.Net application without knowing what you are doing. It introduces security issues and the possibility of cross-site scripting.
Relying on this blog post, here's a function that should get you the session for any user based on the session ID, though it's not pretty:
// Requires: using System.IO; using System.Reflection; using System.Web;
// using System.Web.Hosting; using System.Web.SessionState;
public SessionStateStoreData GetSessionById(string sessionId)
{
    HttpApplication httpApplication = HttpContext.Current.ApplicationInstance;

    // Black magic #1: getting to SessionStateModule
    HttpModuleCollection httpModuleCollection = httpApplication.Modules;
    SessionStateModule sessionHttpModule = httpModuleCollection["Session"] as SessionStateModule;
    if (sessionHttpModule == null)
    {
        // Couldn't find the Session module
        return null;
    }

    // Black magic #2: getting to SessionStateStoreProviderBase through reflection
    FieldInfo fieldInfo = typeof(SessionStateModule).GetField("_store", BindingFlags.NonPublic | BindingFlags.Instance);
    SessionStateStoreProviderBase sessionStateStoreProviderBase = fieldInfo.GetValue(sessionHttpModule) as SessionStateStoreProviderBase;
    if (sessionStateStoreProviderBase == null)
    {
        // Couldn't find the SessionStateStoreProviderBase
        return null;
    }

    // Black magic #3: generating a dummy HttpContext out of thin air;
    // sessionStateStoreProviderBase.GetItem in #4 needs it.
    SimpleWorkerRequest request = new SimpleWorkerRequest("dummy.html", null, new StringWriter());
    HttpContext context = new HttpContext(request);

    // Black magic #4: using sessionStateStoreProviderBase.GetItem to fetch the data from the session with the given ID.
    bool locked;
    TimeSpan lockAge;
    object lockId;
    SessionStateActions actions;
    SessionStateStoreData sessionStateStoreData = sessionStateStoreProviderBase.GetItem(
        context, sessionId, out locked, out lockAge, out lockId, out actions);

    return sessionStateStoreData;
}
You can get your current SessionID from the following code:
string sessionId = HttpContext.Current.Session.SessionID;
Then you can feed that into a hidden field, maybe, and then access that value through YUI.
It's just a GET, so you hopefully won't have any scaling problems. Security problems, though, I don't know about.
The ASP.Net session ID is stored in Session.SessionID, so you could set that in a hidden field and then post it to the next page.
I think, however, that if the application restarts, the session ID will expire if you do not store your sessions in SQL Server.
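For completeness, here is a minimal sketch (my own, not from the answers above) of emitting the IDs as hidden fields so they line up with the Global.asax workaround earlier, which reads ASPSESSID and AUTHID from the POST. The uploader must also be configured to include these form fields in its upload request:
// In Page_Load of the page hosting the uploader (requires using System.Web.Security;):
protected void Page_Load(object sender, EventArgs e)
{
    // Matches the "ASPSESSID" parameter name expected by the Global.asax code above.
    ClientScript.RegisterHiddenField("ASPSESSID", Session.SessionID);

    // Matches the "AUTHID" parameter name; carries the forms-authentication ticket.
    HttpCookie authCookie = Request.Cookies[FormsAuthentication.FormsCookieName];
    if (authCookie != null)
    {
        ClientScript.RegisterHiddenField("AUTHID", authCookie.Value);
    }
}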
