I want to:
Log in to a website
Save the cookies
Give the user a choice to do A, B or C
A, B and C all require being logged in.
Each will open a FirefoxDriver and do its own thing.
What I want to do is log in ONCE, save the cookies from that, and add them to any other FirefoxDriver I want to open.
Right now I'm trying to save the cookies in
public ReadOnlyCollection<Cookie> Cookies { get; set; }
which is the result of
WebDriver.Manage().Cookies.AllCookies;
Assuming login worked and the cookies were saved in the property above, I have this:
WebDriver = new FirefoxDriver();
WebDriver.Navigate().GoToUrl("http://www.example.com");
if (cookies != null)
{
var s = WebDriver.Manage().Cookies; // Logged-out cookies
WebDriver.Manage().Cookies.DeleteAllCookies(); // Delete all of them
var sd = WebDriver.Manage().Cookies; // Make sure they're deleted
foreach (var cookie in cookies)
{
WebDriver.Manage().Cookies.AddCookie(cookie);
}
var ss = WebDriver.Manage().Cookies;
WebDriver.Navigate().GoToUrl("http://example.com/requiresloginpage");
}
The problem is, hovering over "ss" in this case gives this exception:
AllCookies = 'ss.AllCookies' threw an exception of type
'OpenQA.Selenium.WebDriverException'
base {System.Exception} = {"Unexpected problem getting cookies"}
InnerException = {"Cookie name cannot be null or empty string\r\nParameter name: name"}
I'm passing 8 cookies (the total number when you're logged in), and all of them seem set and OK. Not sure what I'm doing wrong.
In order to save cookies, you should tell Selenium to use a specified profile. For some reason I can't get it to use my normal Chrome profile, but this solution will allow you to log in one time, and afterwards Selenium will remember the cookies.
ChromeOptions options = new ChromeOptions();
// Specify the location for profile creation/access
options.AddArguments(@"user-data-dir=C:\Users\YOU\AppData\Local\Google\Chrome\User Data\NAMEYOUCHOOSE");
ChromeDriver driver = new ChromeDriver(options);
Simply put, this code creates a save location for a profile, which does include cookies.
Using this code, it is not necessary to write code that saves or loads cookies; Chrome will handle that.
Please note that the location where Chrome saves your profiles may be different from mine, and I have only successfully used a directory that leads to the same location as my regular Chrome profile. This profile exists in the form of a folder, not a file.
Generally, Selenium does not support cross-session cookies.
The easiest way is to use serialization.
You need to create a wrapper class around Selenium's Cookie and make it serializable, then create a CookiesManager class with two methods: SaveSession() to save to a file and RestoreSession() to restore from it. A minimal sketch is below.
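A minimal sketch of that approach, assuming Json.NET (Newtonsoft.Json) for the serialization; the class and method names follow the description above but are otherwise illustrative:
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;
using Newtonsoft.Json;
using OpenQA.Selenium;

public class SerializableCookie
{
    public string Name { get; set; }
    public string Value { get; set; }
    public string Domain { get; set; }
    public string Path { get; set; }
    public DateTime? Expiry { get; set; }

    public SerializableCookie() { }

    public SerializableCookie(Cookie c)
    {
        Name = c.Name; Value = c.Value; Domain = c.Domain;
        Path = c.Path; Expiry = c.Expiry;
    }

    public Cookie ToSeleniumCookie()
    {
        return new Cookie(Name, Value, Domain, Path, Expiry);
    }
}

public static class CookiesManager
{
    public static void SaveSession(IWebDriver driver, string file)
    {
        var cookies = driver.Manage().Cookies.AllCookies
            .Select(c => new SerializableCookie(c))
            .ToList();
        File.WriteAllText(file, JsonConvert.SerializeObject(cookies));
    }

    public static void RestoreSession(IWebDriver driver, string file)
    {
        // The driver must already be on a page of the cookies' domain,
        // otherwise AddCookie will fail.
        var cookies = JsonConvert.DeserializeObject<List<SerializableCookie>>(
            File.ReadAllText(file));
        foreach (var c in cookies)
        {
            driver.Manage().Cookies.AddCookie(c.ToSeleniumCookie());
        }
    }
}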
Another way is to save some cookie information into a temp file, such as CSV or XML.
A sample of this approach can be seen here: Keep user logged in - save cookies using web driver (C# only).
I have a really frustrating issue, where all I want to do is get user images from O365 and simply display them on my web page, which is hosted on Azure App Service.
As you can see from this SO and this SharePoint.StackExchange question, the images fail to load when simply trying to display the link taken from SharePoint in an <img> tag.
However, after navigating to the image in a new tab and refreshing my page, the images load fine. Can anyone explain this behaviour? It makes no sense to me at all.
Anyway, since that just doesn't work for whatever reason (the logged-in user clearly has the right permissions, as the images do display after navigating to them),
I thought I would try downloading the images using the Graph API.
So I downloaded the quick start project and am trying to download the images with:
public async Task<Stream> TestAsync(GraphServiceClient graphClient)
{
var users = graphClient.Users;
var jk = users["user.name@domain.com"];
return await jk.Photo.Content.Request().GetAsync();
}
But I just get:
Exception of type 'Microsoft.Graph.ServiceException' was thrown.
Yet when I try to view the same image in the Graph API Explorer, I can download the image. Please can someone just help me display SharePoint user images on my web page without the user having to first navigate to the image directly. Why must it be so difficult?
Once you have a valid token, make sure your permission scopes include User.Read.All.
The query:
var user = graphClient.Users["<userPrincipalName>"];
corresponds to the following endpoint:
Url: /users/{userPrincipalName}
Method: GET
which requires the User.Read.All scope; see the permissions section for more details.
In addition, app-only access (without a user token) requires administrative consent before it can be used.
Example
var users = graphClient.Users;
var user = users[accountName];
// GetAsync() returns the photo content as a Stream (cast to MemoryStream here so WriteTo can be used below)
var photo = await user.Photo.Content.Request().GetAsync() as MemoryStream;
using (var file = new FileStream("./user.jpg", FileMode.Create, FileAccess.Write))
{
if (photo != null) photo.WriteTo(file);
}
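If the call still throws, it helps to inspect the error wrapped inside the ServiceException rather than just the exception type. A small sketch (the error codes mentioned in the comment are examples, not an exhaustive list):
try
{
    var photo = await user.Photo.Content.Request().GetAsync();
}
catch (ServiceException ex)
{
    // ex.Error carries the Graph service response; the code is e.g.
    // "ErrorItemNotFound" when the user has no photo set, or an
    // authorization code when the token lacks User.Read.All.
    Console.WriteLine(ex.Error.Code + ": " + ex.Error.Message);
}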
I have a web site (IIS, C#.Net, MVC4) where users are (forms-)authenticated; they upload media files (mostly .mp4) and authorize a set of users to play them back on demand. I store these files on local storage.
I play these files back to the authorized users on demand using jwplayer.
jwplayer expects me to pass the URL directly for it to play, but I didn't want to expose a direct URL.
I really have to restrict unauthorized access to these files as they are private files.
I tried implementing a controller method to handle https://mysite/Video/Watch?VideoId=xyz and return a FileStream of the actual file. It works in a browser directly. (Though I'm not sure how efficient it is for large files.)
But the problem is, jwplayer looks for URLs of the pattern http(s)://domain/path/file.mp4[?parameter1=value1&parameter2=value2 and so on].
When I give a URL like https://mysite/Video/Watch?VideoId=xyz, it says 'No playable sources found' without even sending a HEAD request.
If I expose the URLs directly, the files are available for anybody to download, which would break their privacy.
Worst case, I would at least want to avoid hotlinks that live forever.
I have also looked at www.jwplayer.com/blog/securing-your-content/ but did not find the solutions suitable.
My questions are:
1. Is there a way I can retain the URL pattern http(s)://domain/path/file.mp4 and still control access to the file?
2. If (1) is not possible, how do I leverage the parameters that could be passed on the URL? With parameters, I can think of signed URLs. What should I do on the server if I have to provide and handle/validate signed URLs?
3. So as not to hinder performance: after any validation, can I somehow get IIS to handle the file stream rather than my code?
I implemented an HttpModule to allow/block access to the file. This addresses my questions 1 and 3.
Code snippet below.
void context_PreRequestHandlerExecute(object sender, EventArgs e)
{
HttpApplication app = sender as HttpApplication;
//Get the file extension
string fileExt = Path.GetExtension(app.Request.Url.AbsolutePath);
//Check if the extension is mp4
bool requestForMP4 = fileExt.Equals(".mp4", StringComparison.InvariantCultureIgnoreCase);
//If the request is not for an mp4 file, we have nothing to do here
if (!requestForMP4)
return;
//Initially assume no access to media
bool allowAccessToMedia = false;
//....
// Logic to determine access
// If allowed set allowAccessToMedia = true
// otherwise, just return
//....
if(!allowAccessToMedia)
{
//Terminate the request with HTTP StatusCode 403.2 Forbidden: Read Access Forbidden
app.Response.StatusCode = (int)HttpStatusCode.Forbidden;
app.Response.SubStatusCode = 2;
app.CompleteRequest();
}
}
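For reference, a minimal sketch of how the handler above might be wired into a module; the class name is illustrative, and the module must still be registered under <modules> in web.config (typically with runAllManagedModulesForAllRequests="true" or an equivalent mapping, so that requests for static .mp4 files actually reach managed modules):
public class Mp4AccessModule : IHttpModule
{
    public void Init(HttpApplication context)
    {
        // Run the access check before the request handler executes
        context.PreRequestHandlerExecute += context_PreRequestHandlerExecute;
    }

    public void Dispose() { }

    // context_PreRequestHandlerExecute as shown above
}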
I'm writing an ASP.NET application and I am saving the cookies correctly (I can see them stored in Temporary Internet Files). When I open the file it contains: access_token mylongalphanumerictoken /domainname (no spaces between them).
The problem is that when I check the client for a cookie, I receive null. Can anyone tell me why this is happening and how do I fix it?
public void createCookie(string tokenVal)
{
authCookie = new HttpCookie("access_token",tokenVal);
authCookie.Expires = DateTime.Now.AddDays(60.00); //Token expires in 60 days
authCookie.Domain = ServerDomain.Authority;
}
I check if the client has cookies like this:
if (Request.Cookies["access_token"] != null)
{
currentCookieStore.authCookie = Request.Cookies["access_token"];
}
EDIT: I'm using currPage.Response.Cookies.Add(newTokenCookie.OauthCookie); to add the cookies. ServerDomain is the location of my web server, so it's machinename.domain.
The answer is to add a P3P header to prevent IE from blocking your cookies.
Solution here:
Explanation: Cookie blocked/not saved in IFRAME in Internet Explorer
How to: http://social.msdn.microsoft.com/Forums/windowsazure/en-US/4f74156a-54a0-468b-8496-85913094fc34/issue-while-adding-http-response-header-to-a-site-hosted-in-azure-web-role-running-with-more-than?forum=windowsazuremanagement
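For reference, a minimal sketch of adding the header in ASP.NET, e.g. in Global.asax; the compact-policy value below is only a placeholder, so use one that actually matches your privacy policy:
protected void Application_BeginRequest(object sender, EventArgs e)
{
    // P3P compact policy header so IE accepts the cookie in third-party/iframe contexts
    Response.AddHeader("P3P", "CP=\"CAO PSA OUR\"");
}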
I'm having trouble removing/adding cookies in Selenium. I'm using Windows 7 and FireFox 25.0.1. My code looks like this:
Instance = new FirefoxDriver();
Instance.Manage().Window.Maximize();
var _cookies = Instance.Manage().Cookies.AllCookies;
Instance.Manage().Cookies.DeleteAllCookies();
foreach(Cookie cookie in _cookies)
{
Instance.Manage().Cookies.AddCookie(cookie);
}
var _newCookies = Instance.Manage().Cookies.AllCookies; //boom
On that last line I get the exception "Unexpected problem getting cookies." I've tried several variants of the above code, and the same problem occurs the second time I call AllCookies, even after closing and reopening the browser, calling GoToUrl(mysite) and re-adding the cookies (the browser was on mysite when I saved the cookies).
I've checked the cookies collection before accessing it, and they all have name/value pairs.
Has anyone managed to use the cookie API successfully in Selenium for C# or can say what I'm doing wrong?
You can only add cookies if your browser is displaying a page of the domain you want to drop the cookie on.
You don't appear to have navigated to a URL before dropping the cookies; see the sketch below.
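A minimal sketch of the working order, assuming savedCookies holds the cookies captured in an earlier session on the same site:
Instance = new FirefoxDriver();
Instance.Navigate().GoToUrl("http://www.example.com"); // must be on the cookies' domain first
Instance.Manage().Cookies.DeleteAllCookies();
foreach (Cookie cookie in savedCookies)
{
    Instance.Manage().Cookies.AddCookie(cookie);
}
Instance.Navigate().Refresh(); // reload so the page picks up the restored session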
I'm writing a web application (ASP.NET MVC, C#) that requires the user to provide URLs to RSS or Atom feeds, which I then read with the following code:
var xmlRdr = XmlReader.Create(urlProvidedByUserAsString);
var syndicFeed = SyndicationFeed.Load(xmlRdr);
While debugging my application I accidentally passed /something/like/this as a URL, and I got an exception telling me that C:\something\like\this can't be opened.
It looks like a user could provide a local path and my application would try to read it.
How can I make this code safe? It probably is not sufficient to check for https:// or http:// at the beginning of the URL, since the user could still enter something like http://localhost/blah. Is there any other way, maybe with the Uri class, to check whether a URL points to the web?
Edit: I think I also need to prevent the user from entering addresses that would point to other machines on my network, like these examples: http://192.168.0.6/ or http://AnotherMachineName/
Try:
new Uri(@"http://stackoverflow.com").IsLoopback // false
new Uri(@"http://localhost/").IsLoopback // true
new Uri(@"c:\windows\").IsLoopback // true (file URIs are treated as loopback)
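Building on that, a hedged sketch of a stricter check; the helper name is illustrative, and the private ranges covered are only the RFC 1918 IPv4 blocks:
using System;
using System.Net;
using System.Net.Sockets;

public static class FeedUrlValidator
{
    public static bool IsSafePublicUrl(string candidate)
    {
        Uri uri;
        if (!Uri.TryCreate(candidate, UriKind.Absolute, out uri))
            return false;
        // Only plain web URLs; rejects file paths like c:\windows\
        if (uri.Scheme != Uri.UriSchemeHttp && uri.Scheme != Uri.UriSchemeHttps)
            return false;
        if (uri.IsLoopback)
            return false;

        // Resolve the host and reject loopback/private addresses,
        // e.g. http://192.168.0.6/ or http://AnotherMachineName/.
        foreach (var ip in Dns.GetHostAddresses(uri.DnsSafeHost))
        {
            if (IPAddress.IsLoopback(ip))
                return false;
            if (ip.AddressFamily == AddressFamily.InterNetwork)
            {
                byte[] b = ip.GetAddressBytes();
                bool isPrivate =
                    b[0] == 10 ||
                    (b[0] == 172 && b[1] >= 16 && b[1] <= 31) ||
                    (b[0] == 192 && b[1] == 168);
                if (isPrivate)
                    return false;
            }
        }
        return true;
    }
}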