I'm trying to update our Selenium tests to work with the latest Firefox. This code snippet shows how I initialize the driver; Instance is a class member of type NgWebDriver:
FirefoxOptions ffOptions = new FirefoxOptions();
ffOptions.SetPreference("marionette", true);
IWebDriver NonProtractorInstance = new FirefoxDriver(ffOptions);
Instance = new NgWebDriver(NonProtractorInstance);
Instance.Manage().Timeouts().SetScriptTimeout(TimeSpan.FromSeconds(1000));
Instance.IgnoreSynchronization = false;
However, the following code fails:
Instance.Navigate().GoToUrl(/* URL to angular page */);
With this following error:
Document was unloaded during execution (UnexpectedJavaScriptError)
Note this particular URL does redirect to another page, but both the original and the redirect page are Angular pages.
I've tried every variation of initializing the drivers I could find, and they all failed with similar errors.
Does anyone have anything else I can try to get past this?
Protractor actually does not support Firefox versions newer than 47.
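If staying on the newer Firefox is a requirement, one possible workaround (a sketch only, assuming NgWebDriver's IgnoreSynchronization flag and WaitForAngular() method behave as documented in protractor-net) is to keep NgWebDriver from injecting its synchronization script during the navigation that triggers the redirect, and only synchronize once the final page is in place:
// Sketch: skip Protractor synchronization while the redirect unloads the document,
// then re-enable it and wait for Angular manually on the final page.
Instance.IgnoreSynchronization = true;
Instance.Navigate().GoToUrl("https://example.com/angular-page"); // placeholder URL
Instance.IgnoreSynchronization = false;
Instance.WaitForAngular();
This does not fix the underlying Protractor/Firefox incompatibility; it only avoids running the client-side script while the document is being unloaded.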
I'm getting the following error when trying to run my Selenium WebDriver tests from a user control I have created:
An exception of type 'System.TypeLoadException' occurred in UserCreationFrontEnd.exe but was not handled in user code
Additional information: Could not load type 'OpenQA.Selenium.Support.UI.WebDriverWait' from assembly 'WebDriver, Version=2.48.2.0, Culture=neutral, PublicKeyToken=null'.
Background story: I have created an app containing some automated test suites that allows people on my team who don't have Visual Studio to run them. It is a WinForms app written in C# and uses user controls. I have added a user control to my Selenium solution to run the Selenium tests, and when I run the user control from there the tests run as expected.
The issue appears when I add the user control to my main application and try to run the Selenium tests from there.
The user control is displayed fine, but when I try to start the tests I get the error in my test start-up (OneTimeSetUp).
Code:
public void TestStartUp()
{
    driver = new InternetExplorerDriver(@"O:\Testing\SDET\SeleniumWebDriver");
    wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
    config = new MapsPageObjectModel.EnvironmentConfig(driver);
    navBar = new MapsPageObjectModel.NavigationBar(driver);
    homePage = new MapsPageObjectModel.Homepage(driver);
    createContact = new MapsPageObjectModel.CreateContact(driver);
    securityManager = new MapsPageObjectModel.SecurityManager(driver);
    companyPage = new MapsPageObjectModel.Company(driver);
    createEmployee = new MapsPageObjectModel.CreateEmployee(driver);
    roles = new MapsPageObjectModel.Roles(driver);
    UserCreationResults.Clear();
}
The error happens on this line
wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
Does anyone have any ideas?
Go to the Manage NuGet Packages window and make sure you have installed the same version of Selenium.Support and Selenium.WebDriver across all the projects in your solution.
You need to have both usings in your .cs file:
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;
You also need to have references to WebDriver and WebDriver.Support.
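If the package versions already match and the host application still throws the TypeLoadException, it can help to see which WebDriver assemblies the main app actually loads at runtime. The helper below is only a suggested diagnostic (the class name and where you call it from are assumptions, not part of the original answer):
// Diagnostic sketch: list every WebDriver-related assembly loaded into the host process,
// so a stale WebDriver.dll copied into the WinForms app's output folder is easy to spot.
using System;
using System.Linq;
public static class WebDriverAssemblyCheck
{
    public static void Print()
    {
        var loaded = AppDomain.CurrentDomain.GetAssemblies()
            .Where(a => !a.IsDynamic)
            .Where(a => a.GetName().Name.IndexOf("WebDriver", StringComparison.OrdinalIgnoreCase) >= 0);
        foreach (var asm in loaded)
        {
            Console.WriteLine(asm.FullName + " -> " + asm.Location);
        }
    }
}
A version number other than the one your test project references, or an unexpected file location, usually explains why the type cannot be loaded.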
I am testing data flow for a client's website. It has advertisements that take substantially longer to load than the data elements of each page that I would like to test with Selenium commands.
I don't have control over the ads and can't silence them.
I would like to navigate each page with clicks prior to the complete page loading. I know that this is possible because I can do it manually using the mouse. However, despite my attempts, the stubborn ChromeDriver will not begin automation until the entire page is loaded.
I am using C#, .NET 4.6.1, chrome32_55.0.2883.75, and Selenium version 3.0.1.0.
Further, I am using the recommended Selenium page object model. I implemented WaitForLoad() like this:
public override void WaitForLoad()
{
    _isLoaded = Wait.Until(d =>
    {
        lock (d)
        {
            SwitchToSelf();
            return PageRegex.IsMatch(Session.Driver.PageSource);
        }
    });
}
The PageRegex above works, but only after the full page is loaded, which is frustrating because I can visually see that the text string the PageRegex is designed to match is already on the screen. This leads me to believe there is a setting elsewhere, perhaps while I am configuring ChromeDriver, that would enable me to parse Session.Driver.PageSource before the page is completely loaded.
This is how I am instancing the ChromeDriver:
var options = new ChromeOptions();
options.AddArguments("test-type");
options.AddArgument("incognito"); // works
options.AddArgument("--disable-bundled-ppapi-flash"); // works! this turns off shockwave
options.AddArgument("--disable-extensions"); // works
options.AddArguments("--start-fullscreen");
string workFolder = Environment.GetFolderPath(Environment.SpecialFolder.Desktop) +
"\\SHARED";
options.BinaryLocation = workFolder + @"\Chrome32\chrome32_55.0.2883.75\chrome.exe";
var driver = new ChromeDriver(options);
driver.Manage().Cookies.DeleteAllCookies();
return driver;
To interact with the page before it has finished loading, you can either lower the page load timeout and catch the resulting exception:
IWebDriver driver = new ChromeDriver();
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(5));
driver.Manage().Timeouts().SetPageLoadTimeout(TimeSpan.FromMilliseconds(500));
try {
    driver.Navigate().GoToUrl("http://www.deelay.me/5000/http://www.deelay.me/");
} catch (OpenQA.Selenium.WebDriverTimeoutException) { }
// waits for an element
var body = wait.Until(ExpectedConditions.ElementExists(By.CssSelector("body")));
Or you can disable the waiting entirely by setting the page load strategy to none:
var options = new ChromeOptions();
options.AddAdditionalCapability("pageLoadStrategy", "none", true);
IWebDriver driver = new ChromeDriver(options);
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(5));
driver.Navigate().GoToUrl("http://www.deelay.me/5000/http://www.deelay.me/");
// waits for an element
var body = wait.Until(ExpectedConditions.ElementExists(By.CssSelector("body")));
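As a side note that goes beyond the original answer (and assumes a Selenium .NET release newer than the 3.0.1 mentioned in the question), later bindings expose the same capability as a typed property, which avoids the raw capability string:
// Sketch for newer Selenium .NET bindings: the typed PageLoadStrategy property.
var options = new ChromeOptions();
options.PageLoadStrategy = PageLoadStrategy.None; // OpenQA.Selenium.PageLoadStrategy
IWebDriver driver = new ChromeDriver(options);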
Have you tried the ElementToBeClickable or VisibilityOfElementLocated methods of the ExpectedConditions class? I think they should work in the scenario you mentioned.
WebDriverWait wdw = new WebDriverWait(driver, TimeSpan.FromSeconds(120));
wdw.Until(ExpectedConditions.ElementToBeClickable(By.Id("ElementId")));
I would like to navigate each page with clicks prior to the complete
page loading
Do I understand you right that you want to click on some links while the page is still loading?
If you know which elements you are going to click, maybe you can do the same as in this question: How to click on an element before the page is fully loaded, I'm stuck, it takes too much time until loading completed.
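As a rough illustration of that approach (a sketch only: the URL and element id are placeholders, and it combines the "none" page load strategy from the earlier answer with an explicit clickability wait):
// Sketch: stop waiting for the full page load, then click as soon as the link is clickable.
var options = new ChromeOptions();
options.AddAdditionalCapability("pageLoadStrategy", "none", true);
using (IWebDriver driver = new ChromeDriver(options))
{
    var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
    driver.Navigate().GoToUrl("http://example.com/slow-page"); // placeholder URL
    IWebElement link = wait.Until(ExpectedConditions.ElementToBeClickable(By.Id("someLinkId"))); // placeholder id
    link.Click(); // the ads can keep loading in the background
}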
Currently I have the following code using Selenium and PhantomJS in C#:
public class Driver
{
    static void Main()
    {
        using (var driver = new PhantomJSDriver())
        {
            driver.Navigate().GoToUrl("https://www.website.com/");
            driver.Navigate().GoToUrl("https://www.website.com/productpage/");
            driver.ExecuteScript("document.getElementById('pdp_selectedSize').value = '10.0'"); // FindElementById("pdp_selectedSize").SendKeys("10.0");
            driver.ExecuteScript("document.getElementById('product_form').submit()");
            driver.Navigate().GoToUrl("http://www.website/cart/");
            Screenshot sh = driver.GetScreenshot();
            sh.SaveAsFile(@"C:\temp\test.jpg", ImageFormat.Png);
        }
    }
}
My objective is to be able to add a product to my cart and then check out automatically. The screenshot is just included to test whether the code is working. My first issue is that I often get an error that it cannot find the element with id "pdp_selectedSize". I'm assuming this is because the WebDriver hasn't loaded the page yet, so I'm looking for a way to keep checking until it finds the element without having to set a specific timeout.
I'm also looking for faster alternatives to a headless browser. I used a headless browser instead of HTTP requests because I need certain cookies to be able to check out on the page, and these cookies are set through JavaScript within the page. If anyone has a recommendation for a faster method, it would be greatly appreciated, thanks!
For your first question, it would behoove you to look into ExpectedConditions, which is used together with the WebDriverWait class in Selenium. The following code sample was taken from here and only serves as a reference point.
using (IWebDriver driver = new FirefoxDriver())
{
driver.Url = "http://somedomain/url_that_delays_loading";
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
IWebElement myDynamicElement = wait.Until<IWebElement>(d =>
d.FindElement(By.Id("someDynamicElement")));
}
More on WebDriverWaits here.
As to your second question, that is a really subjective thing, in my opinion. Headless Browsers aren't necessarily any faster or slower than a real browser. See this article.
Hi, I am new to Selenium WebDriver. I can successfully open a webpage and find elements on it.
In one case I have noted that there is a link on a page that becomes clickable after a while. In Firebug, on the Script tab, I can see the code for the JavaScript that does the timer function.
But using Selenium Webdriver if I issue:
driver.PageSource
I cannot see the source code for the Javascript. Delaying for 30 seconds before requesting the source makes no difference. I have tried finding it with various By options using:
driver.FindElement
and so on, but it isn't there.
How does Firebug manage to find and show the JavaScript source code? Is there a way that I can coerce Selenium WebDriver to return all code referenced by the page?
Or is there a better approach?
Thanks for any advice!
EDIT:
I tried the following in Firefox:
Dim Driver2 As IWebDriver = New Chrome.ChromeDriver
Driver2.Url = "http://mypage"
Dim js As IJavaScriptExecutor = TryCast(Driver2, IJavaScriptExecutor)
Dim title As String = DirectCast(js.ExecuteScript("return JSON.stringify(window)"), String)
and I got
Permission denied to access property 'toJSON'
I read that this won't work in Firefox, so I tried in Chrome and got
Blocked a frame with origin "http://mypage" from accessing a
cross-origin frame
and from there no solutions, because according to this it's a security restriction; apparently you can't access an <iframe> from another origin with JavaScript.
I'm starting to think I'm a bit out of my depth here.
PageSource probably doesn't return an exact snapshot of the DOM, etc.
You can instead inspect JavaScript using driver.executeScript(), but the burden of analyzing the returned object may be discouraging.
Regardless - Here's a contrived example:
Object result = driver.executeScript("return JSON.stringify(window)");
System.out.println(result.toString());
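That contrived example is in Java; a rough C# equivalent (a sketch only, which also sidesteps the cross-origin serialization error above by enumerating the page's script elements instead of the whole window object) might look like this:
// Sketch: collect the src or inline text of every <script> element on the page.
var js = (IJavaScriptExecutor)driver;
var scripts = (System.Collections.ObjectModel.ReadOnlyCollection<object>)js.ExecuteScript(
    "return Array.prototype.map.call(document.scripts, function (s) { return s.src || s.text; });");
foreach (var entry in scripts)
{
    Console.WriteLine(entry);
}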
I am trying to load ChromeDriver with AdBlock, and somehow it re-downloads the extension every time it runs and shows this message:
If you see this message every time you start AdBlock, please make sure you are not using a file cleaner that also cleans 'localStorage' files.
var options = new ChromeOptions();
options.AddArgument("--no-experiments");
options.AddArgument("--disable-translate");
options.AddArgument("--disable-plugins");
options.AddArgument("--no-default-browser-check");
options.AddArgument("--clear-token-service");
options.AddArgument("--disable-default-apps");
options.AddArgument("--no-displaying-insecure-content");
options.AddArgument("--disable-bundled-ppapi-flash");
options.AddExtension(@"D:\AdBlock-v2.6.5\adblock.crx");
using (IWebDriver driver = new ChromeDriver(options))
{
    driver.Navigate().GoToUrl(url);
}
Try using the same Chrome profile on every run. This should resolve the issue.
The code to do this is located here: Load Chrome Profile using Selenium WebDriver
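A minimal sketch of what that linked answer suggests (the profile path below is an assumption; point it at any directory that persists between runs so AdBlock's localStorage survives):
// Sketch: reuse a persistent Chrome profile so the extension's localStorage is kept between runs.
var options = new ChromeOptions();
options.AddArgument(@"--user-data-dir=C:\SeleniumProfiles\AdBlock"); // assumed profile path
options.AddExtension(@"D:\AdBlock-v2.6.5\adblock.crx");
using (IWebDriver driver = new ChromeDriver(options))
{
    driver.Navigate().GoToUrl(url);
}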