Can't log in to site under Selenium WebDriver - C#

I am having a strange issue where I am unable to log into a site under test with Selenium WebDriver, but I have no issues logging in when running the project under test in Visual Studio, or in our QA environment.
I have broken the test down to the simplest possible example: it lets me manually enter the username and password and click the login button while it waits for verification that it has moved on to the next screen (waits for an element on that page).
All that happens when running under Selenium is a page refresh.
The test:
// Navigate to the login page, then wait while the credentials are entered manually.
driver.Navigate().GoToUrl(this._baseURL + "Account/Index");
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
// Succeeds once an element unique to the post-login page appears.
wait.Until(drv => drv.FindElement(By.Id("element-on-next-page")));
The login button calls the jQuery $.ajax method to POST data to a service, and correctly reaches the success() callback. The service returns a redirect URL.
This, in turn, attempts to redirect, which works when using the site manually but simply reloads the login page under a Selenium test:
window.location.replace(location.origin + result.RedirectTo);
I have verified that the result.RedirectTo is valid when the test is running (it is the page it should be redirecting to for successful login).
Tested with Chrome 71.0.3578.98, Firefox 64.0.2 and IE 11.472.17134.0. It works fine manually in all three.
I am unsure why this behaves differently under automation.
UPDATE: The page it is attempting to redirect to has an [Authorize()] attribute on the controller. Removing this attribute allows the test to pass. The attribute only causes Selenium tests to fail, not manual testing.
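For context, a minimal sketch of the arrangement described (the controller and action names are hypothetical):
// Hypothetical redirect target: [Authorize] means the request must carry a
// valid authentication cookie, otherwise MVC redirects back to the login page.
[Authorize]
public class DashboardController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}
Given the update above, this suggests the authentication cookie set by the AJAX login is not being sent (or has not been committed yet) when the redirect fires under WebDriver.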

Have you tried performing the login steps manually in the Chrome browser launched by Selenium?
I am not sure, but sometimes Google redirects to an authorization or "verify your identity" page just to ensure you are not using automation scripts to create multiple email signups or scrape website data.
Just try running the same scenario manually in the browser launched by Selenium.

Related

C# Selenium Chrome clicking link from default chrome home page

I'm trying to get Selenium to click on one of the most visited web links from the default Chrome web page.
The problem is that Selenium cannot find the element on the page, and I think it has to do with the fact that a web page technically didn't get loaded: when you open Chrome it has HTML elements there, but the address bar is completely empty. I think possibly this is why Selenium can't find the link? The code is simple and finding the XPath wasn't an issue. I just don't know if this is something Selenium will be able to do or not. I'm trying to do the click because the Navigate() function will not work when I put in the proxy information, due to the fact that Selenium doesn't have a built-in way to handle a proxy with username and password.
At the end of the day I'm trying to get the username/password box to pop up by clicking on the link. When I open the browser with Selenium programmatically and then manually click on the link the username/password box pops up. But I can't get Selenium to find the element to click on programmatically.
var did = driver.FindElement(By.XPath("//*[@id='mv-tiles']/a[1]"));
did.Click();
UPDATE 1:
I was able to find the element when taking the iframe into consideration, but clicking is still an issue.
// Switch into the iframe that hosts the "most visited" tiles.
var frm = driver.SwitchTo().Frame("mv-single");
var did = frm.FindElement(By.XPath("//*[@id='mv-tiles']/a[1]"));
//did.Click(); <-- I can see it go to the element but nothing occurs
IJavaScriptExecutor js2 = (IJavaScriptExecutor)driver;
js2.ExecuteScript("arguments[0].click();", did);
The IJavaScriptExecutor is able to click the element, but Chrome blocks the redirect with the following message:
[21040:24704:1204/150143.743:ERROR:CONSOLE(1)] "Unsafe JavaScript attempt to initiate navigation for frame with URL 'chrome-search://local-ntp/local-ntp.html' from frame with URL 'chrome-search://most-visited/single.html?title=Most%20visited&removeTooltip=Don%27t%20show%20on%20this%20page&enableCustomLinks=1&addLink=Add%20shortcut&addLinkTooltip=Add%20shortcut&editLinkTooltip=Edit%20shortcut'. The frame attempting navigation is targeting its top-level window, but is neither same-origin with its target nor has it received a user gesture. See https://www.chromestatus.com/features/5851021045661696.
", source: (1)
FINAL UPDATE:
I gave up and decided to do the browser extension solution for proxies with passwords: https://stackoverflow.com/a/35293222/5415162
That list of "Most Recent Pages" is actually in an iframe, which is probably why Selenium can't find it. Try updating the selector to account for the iframe, or maybe add a wait clause to allow the iframe to finish loading.
Regardless of that solution, I don't think it will act any differently than just navigating to the target URL. So to fix your root problem have you tried setting the proxy details when creating the ChromeOptions?
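For example, a sketch of setting the proxy through ChromeOptions (the host and port are placeholders; note this still does not handle a username/password on the proxy, which is why the extension workaround linked above ends up being necessary):
var options = new ChromeOptions();
options.Proxy = new Proxy
{
    Kind = ProxyKind.Manual,
    HttpProxy = "proxy.example.com:8080",  // placeholder host:port
    SslProxy = "proxy.example.com:8080"
};
var driver = new ChromeDriver(options);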

Selenium - Windows Auth Box Issue

This is an odd one I need some help with. We have an automation project with a Windows auth box. We were passing the user/pass in the URL string, but we started to notice some issues. I wanted to set up AutoIT and see if this fixed the issue we were seeing, but the URL we go to is an internal ip:port. When I go to the URL (e.g. 123.34.56.78:1111) the browser (Chrome) opens but then fails with:
OpenQA.Selenium.WebDriverException: 'The HTTP request to the remote WebDriver server for URL http://localhost:7233/session/be85ee0483da9772b136488bed19c43b/url timed out after 180 seconds.'
It appears that WebDriver is waiting for something to complete, and I can't get to the next step.
I have tried the below but each one loads the page and then throws the error.
_webDriver.Navigate().GoToUrl(url);
_webDriver.Url = url;
Any ideas?
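(For reference, a sketch of the user/pass-in-URL pattern described above, with placeholder credentials, plus a shortened page-load timeout as one possible way to stop GoToUrl blocking for the full 180 seconds; the timeout property assumes the Selenium 3+ .NET bindings.)
// Shorten the page-load timeout so GoToUrl gives up well before the
// 180-second WebDriver HTTP timeout while the auth dialog is pending.
_webDriver.Manage().Timeouts().PageLoad = TimeSpan.FromSeconds(30);
// Basic-auth credentials embedded in the URL, as described in the question
// (placeholder values); note that recent browsers restrict this pattern.
_webDriver.Navigate().GoToUrl("http://user:password@123.34.56.78:1111");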

Selenium WebDriver in a web application: get user inputs

I have a web application hosted on a web server that uses Selenium WebDriver to do some actions on web sites and save the responses for the user to see at the end.
Right now this works fine: the user uploads the data to search for, and the application runs on the server, opening Chrome windows to navigate the sites.
The issue is that there is a site that needs user input to continue. Is there a way to open the Chrome window on the client's machine instead of on the server? In a way that I could control the flow of this new page, wait for the user to take action, and then continue to perform the automated actions.
Any option would be helpful.
Thanks
You can use driver.Navigate().GoToUrl("http://url.here").
Then you can use something like:
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
// Wait for the element first; calling FindElement before it exists throws.
wait.Until(ExpectedConditions.VisibilityOfAllElementsLocatedBy(By.CssSelector("selector")));
if (driver.FindElement(By.CssSelector("selector")).Displayed)
{
    driver.FindElement(By.CssSelector("selector")).SendKeys("text you want entered");
    driver.FindElement(By.CssSelector("selector")).Click(); // there are other properties and methods you can access
}
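To pause the automated flow until the user has taken their action, one option is a long wait on an element that only appears after the manual step; a sketch, where "#after-user-step" is a hypothetical marker selector:
// Block until the user finishes the manual step, detected by an element
// that only exists afterwards. FindElements (plural) avoids throwing
// while the element is still absent.
var userWait = new WebDriverWait(driver, TimeSpan.FromMinutes(10));
userWait.Until(d => d.FindElements(By.CssSelector("#after-user-step")).Count > 0);
// ...continue the automated actions here.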

C# - Session expired in Selenium IE WebDriver

I have a secured site from which I need to scrape data from some particular pages. The page must be opened strictly in IE. I opened the login page from Selenium and passed the handle to the WebDriver. Then the user surfs various pages and popups of that website. A timer runs, and it checks whether a particular page is opened or not with the following code:
var windowIterator = driver.WindowHandles;
foreach (var windowHandle in windowIterator)
{
    popup = driver.SwitchTo().Window(windowHandle);
    if (popup.Title == PageTitle) // PageTitle is a string value saved in App Config
    {
        doWork = true; // scraping would be started on this page
        break;
    }
}
It works perfectly for other sites in the testing environment. In the live environment the popup page displays a session-expired message and asks for user credentials. Once those are given, it works fine. The architecture of the website being scraped is unknown to me.
Could anybody tell me why this is happening and what the way out is?
Possibly it takes too much time to scrape the data before the page is updated/changed.
I believe the site gives your browser one-session cookies. Check all of the cookies the site gives you. Possibly this can be resolved by editing cookies via Selenium. If not, you can refresh the page at intervals shorter than the cookie lifetime to show the server that the "user is here" =)
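A sketch of both ideas (which cookies matter and the refresh interval are assumptions):
// Inspect the cookies the site sets, looking for short-lived session cookies.
foreach (var cookie in driver.Manage().Cookies.AllCookies)
{
    Console.WriteLine($"{cookie.Name}: expires {cookie.Expiry?.ToString() ?? "session"}");
}
// Keep the session alive by refreshing at intervals shorter than the cookie
// lifetime (five minutes here is an assumed value).
while (!doWork)
{
    System.Threading.Thread.Sleep(TimeSpan.FromMinutes(5));
    driver.Navigate().Refresh();
}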

Selenium Webdriver not respecting cookies or cached images

I am using Selenium (2.24) to generate unit tests (for the Visual Studio unit test framework). While using the C# WebDriver for Firefox, it appears that the browser fired up by the driver is not finding my website cookies via JavaScript (I have a JavaScript file included in the site that looks for cookies and lets me know if they are found). Also, it is not using the browser's image cache, and is always requesting new images from the server. This behavior does not happen when I run my site from a "normal" (not launched by Selenium) Firefox.
The strange thing is that calling the below code in my unit test DOES return my cookie (it just can't be found by my JavaScript)
driver.Manage().Cookies.GetCookieNamed("MyCookie");
How can I configure the driver to respect my cookies and use the browser's image cache? This functionality is key to testing my website.
By default the FirefoxDriver will create a new anonymous profile each time it starts Firefox. If you want it to use an existing profile you need to tell it to.
In Java you do it like so:
ProfilesIni allProfiles = new ProfilesIni();
FirefoxProfile profile = allProfiles.getProfile("MyProfile");
WebDriver driver = new FirefoxDriver(profile);
I'm assuming there's something similar in C#
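In the .NET bindings (as of the Selenium 2.x era this question targets), the equivalent is along these lines:
// Load a named profile from the local Firefox profiles.ini,
// mirroring the Java snippet above.
var allProfiles = new FirefoxProfileManager();
FirefoxProfile profile = allProfiles.GetProfile("MyProfile");
var driver = new FirefoxDriver(profile);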
For cookies: if a cookie is marked as "HTTP Only", JavaScript on the page will not be able to see it. As a result, any code that relies on executing JavaScript on the page will not see this particular cookie.
You can confirm this by using an HTTP debugger (e.g. Fiddler) to see if the cookie is set with the HttpOnly property. You can also check whether a script running on the page (via dev tools, or by typing javascript:alert(...) in the address bar) can see the cookie in document.cookie.
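A quick way to see the difference from a test (the cookie name is a placeholder):
// WebDriver reads the browser's cookie store directly, so it sees the
// cookie even when it is HttpOnly...
var fromDriver = driver.Manage().Cookies.GetCookieNamed("MyCookie");
// ...while page JavaScript only sees cookies without the HttpOnly flag.
var js = (IJavaScriptExecutor)driver;
var fromPage = (string)js.ExecuteScript("return document.cookie;");
// If the cookie shows up in fromDriver but not in fromPage, it is HttpOnly.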
