Selenium Error when using JavaScript or getting elements - C#

Using Selenium 2.25, I've had a lot of issues arise.
I'm trying to use the Selenium remote driver on a remote machine (server) from my computer (local/client). However, when I try to use DesiredCapabilities.HtmlUnit(), it locates the elements but says they are not visible. I'm completely stumped by this; I'm not sure how an element can be found but not be visible.
So then I tried to use some JavaScript to force it. That comes back and throws an error saying the webpage cannot execute JavaScript before the page is loaded. How is this possible when I set an implicit wait and it found the element it was waiting for?
DesiredCapabilities iecapa = DesiredCapabilities.HtmlUnit();
iecapa.IsJavaScriptEnabled = true;
driver = new RemoteWebDriver(new Uri("http://<IP of server>:4444/wd/hub"), iecapa);
IJavaScriptExecutor jQuery = ((IJavaScriptExecutor)(driver));
addressElement = (IWebElement)jQuery.ExecuteScript("return document.GetElementByName('searchAddress')");
So if anyone would like to help me, it would be greatly appreciated! Thank you!
Here is the error (Stack Overflow won't let me post the image here): http://imageshack.us/photo/my-images/163/seleniumhtmluniterror.jpg/

Try this:
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
wait.Until(p => driver.FindElement(By.Name("searchAddress")));
IJavaScriptExecutor jQuery = (IJavaScriptExecutor)driver;
// The DOM method is getElementsByName (case-sensitive) and it returns a NodeList.
addressElement = (IWebElement)jQuery.ExecuteScript("return document.getElementsByName('searchAddress')[0]");
You can also add a readiness check inside the JavaScript:
if (document.readyState.toLowerCase() == "complete")
    return document.getElementsByName('searchAddress')[0];
return 'Error';
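Putting the two together, a minimal C# sketch (assuming the same remote hub URL and the searchAddress element name from the question; requires OpenQA.Selenium, OpenQA.Selenium.Remote and OpenQA.Selenium.Support.UI):
DesiredCapabilities caps = DesiredCapabilities.HtmlUnit();
caps.IsJavaScriptEnabled = true;
IWebDriver driver = new RemoteWebDriver(new Uri("http://<IP of server>:4444/wd/hub"), caps);
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
// Wait until the browser reports the document as fully loaded before running any script.
wait.Until(d => (string)((IJavaScriptExecutor)d).ExecuteScript("return document.readyState") == "complete");
// getElementsByName returns a NodeList, so take the first match.
IWebElement addressElement = (IWebElement)((IJavaScriptExecutor)driver)
    .ExecuteScript("return document.getElementsByName('searchAddress')[0]");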

Related

Selenium ChromeDriver C# how to navigate page before it loads?

I am testing data flow for a client's website. It has advertisements that take substantially longer to load than the data elements of each page that I would like to test with Selenium commands.
I don't have control over the ads and can't silence them.
I would like to navigate each page with clicks prior to the complete page loading. I know that this is possible because I can do it manually using the mouse. However, despite my attempts, the stubborn ChromeDriver will not begin automation until the entire page is loaded.
I am using C# .Net 4.6.1, chrome32_55.0.2883.75, and Selenium version 3.0.1.0.
Further, I am using the recommended Selenium page object model. I implemented WaitForLoad() like this:
public override void WaitForLoad()
{
    _isLoaded = Wait.Until(d =>
    {
        lock (d)
        {
            SwitchToSelf();
            return PageRegex.IsMatch(Session.Driver.PageSource);
        }
    });
}
The PageRegex above works, but only after the full page has loaded, which is frustrating because I can visually see that the text the PageRegex is designed to match is already on the screen. This leads me to believe there is a setting elsewhere, perhaps while configuring ChromeDriver, that would let me parse Session.Driver.PageSource before the page has completely loaded.
This is how I am instantiating the ChromeDriver:
var options = new ChromeOptions();
options.AddArguments("test-type");
options.AddArgument("incognito"); // works
options.AddArgument("--disable-bundled-ppapi-flash"); // works! this turns off Shockwave
options.AddArgument("--disable-extensions"); // works
options.AddArguments("--start-fullscreen");
string workFolder = Environment.GetFolderPath(Environment.SpecialFolder.Desktop) + "\\SHARED";
options.BinaryLocation = workFolder + @"\Chrome32\chrome32_55.0.2883.75\chrome.exe";
var driver = new ChromeDriver(options);
driver.Manage().Cookies.DeleteAllCookies();
return driver;
To interact with the page before it has finished loading, you can either lower the timeout and catch the exception:
IWebDriver driver = new ChromeDriver();
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(5));
driver.Manage().Timeouts().SetPageLoadTimeout(TimeSpan.FromMilliseconds(500));
try {
    driver.Navigate().GoToUrl("http://www.deelay.me/5000/http://www.deelay.me/");
} catch (OpenQA.Selenium.WebDriverTimeoutException) { }
// waits for an element
var body = wait.Until(ExpectedConditions.ElementExists(By.CssSelector("body")));
Or you can disable the waiting altogether by setting the page load strategy to none:
var options = new ChromeOptions();
options.AddAdditionalCapability("pageLoadStrategy", "none", true);
IWebDriver driver = new ChromeDriver(options);
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(5));
driver.Navigate().GoToUrl("http://www.deelay.me/5000/http://www.deelay.me/");
// waits for an element
var body = wait.Until(ExpectedConditions.ElementExists(By.CssSelector("body")));
Have you tried the ElementToBeClickable or VisibilityOfElementLocated methods of the ExpectedConditions class? I think they should work in the scenario you mentioned.
WebDriverWait wdw = new WebDriverWait(driver, TimeSpan.FromSeconds(120));
wdw.Until(ExpectedConditions.ElementToBeClickable(By.Id("ElementId")));
I would like to navigate each page with clicks prior to the complete page loading
Do I understand you right that you want to click on some links while the page is still loading?
If you know which elements you are going to click, maybe you can do the same as in this question: How to click on an element before the page is fully loaded i'm stuck, it takes too much time until loading completed.
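For example, a rough sketch combining the pageLoadStrategy approach above with an explicit wait (the By.LinkText locator is just a placeholder; use whatever identifies the element you actually need to click):
var options = new ChromeOptions();
options.AddAdditionalCapability("pageLoadStrategy", "none", true);
IWebDriver driver = new ChromeDriver(options);
driver.Navigate().GoToUrl("http://www.deelay.me/5000/http://www.deelay.me/");
// Click the target as soon as it is clickable, without waiting for the full page load.
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
IWebElement link = wait.Until(ExpectedConditions.ElementToBeClickable(By.LinkText("Next page")));
link.Click();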

Headless Browser C# and Alternatives

Currently I have the following code using Selenium and PhantomJS in C#:
public class Driver
{
    static void Main()
    {
        using (var driver = new PhantomJSDriver())
        {
            driver.Navigate().GoToUrl("https://www.website.com/");
            driver.Navigate().GoToUrl("https://www.website.com/productpage/");
            driver.ExecuteScript("document.getElementById('pdp_selectedSize').value = '10.0'"); // FindElementById("pdp_selectedSize").SendKeys("10.0");
            driver.ExecuteScript("document.getElementById('product_form').submit()");
            driver.Navigate().GoToUrl("http://www.website/cart/");
            Screenshot sh = driver.GetScreenshot();
            sh.SaveAsFile(@"C:\temp\test.jpg", ImageFormat.Png);
        }
    }
}
My objective is to be able to add a product to my cart and then check out automatically. The screenshot is just included to test whether the code worked. My first issue is that I often get an error that it cannot find the element with id "pdp_selectedSize". I'm assuming this is because the WebDriver hasn't loaded the page yet, so I'm looking for a way to keep checking until it finds it without having to set a specific timeout.
I'm also looking for faster alternatives to a headless browser. I used a headless browser instead of HTTP requests because I need certain cookies to be able to check out on the page, and these cookies are set through JavaScript within the page. If anyone has a recommendation for a faster method, it would be greatly appreciated, thanks!
For your first question, it would behoove you to look into ExpectedConditions, used together with the WebDriverWait class in Selenium. The following code sample was taken from here and only serves as a reference point.
using (IWebDriver driver = new FirefoxDriver())
{
    driver.Url = "http://somedomain/url_that_delays_loading";
    WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
    IWebElement myDynamicElement = wait.Until<IWebElement>(d => d.FindElement(By.Id("someDynamicElement")));
}
More on WebDriverWaits here.
As to your second question, that is a really subjective thing, in my opinion. Headless Browsers aren't necessarily any faster or slower than a real browser. See this article.

Selenium Webdriver not returning Javascript code

Hi, I am new to Selenium WebDriver. I can successfully open a webpage and find elements on it.
In one case I have noted that there is a link on a page that becomes clickable after a while. In Firebug on the Script tab, I can see the code for the javascript that does the timer function.
But using Selenium Webdriver if I issue:
driver.PageSource
I cannot see the source code for the Javascript. Delaying for 30 seconds before requesting the source makes no difference. I have tried finding it with various By options using:
driver.FindElement
and so on, but it isn't there.
How does Firebug manage to find and show the JavaScript source code? Is there a way I can coerce Selenium WebDriver into returning all the code referenced by the page?
Or is there a better approach?
Thanks for any advice!
EDIT:
I tried the following in Firefox:
Dim Driver2 As IWebDriver = New Chrome.ChromeDriver
Driver2.Url = "http://mypage"
Dim js As IJavaScriptExecutor = TryCast(Driver2, IJavaScriptExecutor)
Dim title As String = DirectCast(js.ExecuteScript("return JSON.stringify(window)"), String)
and I got
Permission denied to access property 'toJSON'
I read that this won't work in Firefox, so I tried in Chrome and got
Blocked a frame with origin "http://mypage" from accessing a
cross-origin frame
and from there I found no solutions, because according to this it's a security restriction; apparently you can't access a cross-origin iframe with JavaScript.
I'm starting to think I'm a bit out of my depth here.
PageSource probably doesn't return an exact snapshot of the DOM, etc.
You can instead inspect the page's JavaScript state using driver.executeScript(), but the burden of analyzing the returned object may be discouraging.
Regardless - Here's a contrived example:
Object result = driver.executeScript("return JSON.stringify(window)");
System.out.println(result.toString());
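Since the question is in C#/VB rather than Java, the same idea in C# would look roughly like this (a sketch that returns something simple, because, as you found, serializing the whole window object runs into cross-origin and serialization restrictions):
// Run a script and inspect the result; ExecuteScript marshals primitives, collections and elements back to .NET.
IJavaScriptExecutor js = (IJavaScriptExecutor)driver;
object result = js.ExecuteScript("return document.readyState");
Console.WriteLine(result);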

Selenium - C# - Webdriver - Unable to find element

Using Selenium in C#, I am trying to open a browser, navigate to Google and find the search text field.
I try the below
IWebDriver driver = new InternetExplorerDriver(@"C:\");
driver.Navigate().GoToUrl("www.google.com");
driver.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(5));
IWebElement password = driver.FindElement(By.Id("gbqfq"));
but get the following error -
Unable to find element with id == gbqfq
This looks like a copy of this question that has already been answered.
I can show you what I've done, which seems to work well for me:
public static IWebElement WaitForElementToAppear(IWebDriver driver, int waitTime, By waitingElement)
{
    IWebElement wait = new WebDriverWait(driver, TimeSpan.FromSeconds(waitTime)).Until(ExpectedConditions.ElementExists(waitingElement));
    return wait;
}
This waits up to waitTime seconds for the element to be found. I've run into a lot of issues with dynamic pages not loading the elements I need right away, with the WebDriver trying to find the elements faster than the page can load them, and this is my solution to it. Hope it helps!
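Usage would look something like this (a sketch reusing the driver and locator from the question):
IWebDriver driver = new InternetExplorerDriver(@"C:\");
driver.Navigate().GoToUrl("http://www.google.com");
// Wait up to 10 seconds for the search field to exist before touching it.
IWebElement password = WaitForElementToAppear(driver, 10, By.Id("gbqfq"));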
You can try using a spin wait:
// Poll until the element shows up or roughly 500 ms have passed (Thread.Sleep lives in System.Threading).
int timeout = 0;
while (driver.FindElements(By.Id("gbqfq")).Count == 0 && timeout < 500)
{
    Thread.Sleep(1);
    timeout++;
}
IWebElement password = driver.FindElement(By.Id("gbqfq"));
This should help make sure that the element has actually had time to appear.
Also note that the "gbqfq" id is a bit of a smell; I'd try to match on something more meaningful than that id.
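If you'd rather not hand-roll the loop, a WebDriverWait expresses the same polling more idiomatically (a sketch; the half-second budget mirrors the loop above):
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromMilliseconds(500));
// WebDriverWait ignores NotFoundException while polling, so this retries FindElement until it succeeds or times out.
IWebElement password = wait.Until(d => d.FindElement(By.Id("gbqfq")));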

Selenium C# RemoteWebDriver not finding XPath Elements

I'm using the Selenium 2.25.1 API and trying to find elements through RemoteWebDriver(), but it just fails to find them. I've tried several different combinations with no luck and have been looking into this for a few days now.
WebDriverWait wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
IWebElement WaitForPage = wait.Until<IWebElement>((d) =>
{
    return driver.FindElement(By.XPath((String)data));
});
This is the code where it fails. The data variable is an object grabbed from my database; I convert it, and stepping through the code it comes out perfectly fine. The difference is that when I use just a local browser (i.e. Firefox, IE) it works fine with no errors, but when I use it with RemoteWebDriver() it throws an InvalidOperationException and shows a popup saying it was unable to find the element (the server did not provide any stacktrace information).
This is usually what I use
IWebDriver driver = new RemoteWebDriver(new Uri("http://localhost:4444/wd/hub"), IEcapa);
When that is used, it just fails every time.
Any ideas? I am completely puzzled. Anything is welcome and thanks in advance!
I would suggest using an implicit wait instead of a WebDriverWait statement.
IWebDriver driver = new FirefoxDriver();
driver.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(10));
IWebElement WaitForPage = driver.FindElement(By.XPath((String)data));
And make sure that the XPath you are getting from the data variable is valid. If possible, post some of the XPath expressions you get from the database.
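With the RemoteWebDriver setup from the question, the same pattern would look roughly like this (IEcapa and data being the capabilities object and the XPath string the question already uses):
IWebDriver driver = new RemoteWebDriver(new Uri("http://localhost:4444/wd/hub"), IEcapa);
// The implicit wait applies to every FindElement call made through this driver.
driver.Manage().Timeouts().ImplicitlyWait(TimeSpan.FromSeconds(10));
IWebElement WaitForPage = driver.FindElement(By.XPath((String)data));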
