Selenium _webDriver not finding element through By.LinkText - C#

I am running some acceptance tests (C#) with Selenium WebDriver and have the following code:
var link = _webDriver.FindElement(By.LinkText("Manage Service"));
link.Click();
On the navigated page, this anchor is what I am trying to target:
<a onclick="doEdit(this, 73332)" href="javascript:void(0);">
<span>Manage Service</span>
</a>
But the test fails because Selenium cannot find that anchor tag. I was under the impression that Selenium could handle the scenario above.
I also tried By.PartialLinkText(), but again the element was not found.
Why isn't it finding the link?

The reason it wasn't finding the link was that the previous step was not completing properly.
Here is what I have in my test:
When I navigate to the 'services' page for the 'organisation'
And I log into the site as a OD editor and return to the page I was on
And I click on the 'Manage Service' link
The logging-into-the-site step wasn't being given enough time, so I told it to wait 2 seconds and then move on to the next step.
Thens.WaitForSecond(2);
Now it finds it without any problems.
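For what it's worth, an explicit wait is usually more robust than a fixed sleep, since it polls until the element exists instead of hoping 2 seconds is enough. A minimal sketch using Selenium's WebDriverWait (assuming the _webDriver instance from the question; the 10-second timeout is an arbitrary choice):
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

// Poll for up to 10 seconds until the link appears, instead of sleeping a fixed 2 seconds.
var wait = new WebDriverWait(_webDriver, TimeSpan.FromSeconds(10));
var link = wait.Until(d => d.FindElement(By.LinkText("Manage Service")));
link.Click();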
Thanks

Related

C# Selenium Chrome clicking link from default Chrome home page

I'm trying to get Selenium to click on one of the most visited web links from the default Chrome web page.
The problem is that Selenium cannot find the element on the page, and I think it has to do with the fact that a web page technically didn't get loaded: when you open Chrome it has HTML elements there, but the address bar is completely empty. Possibly this is why Selenium can't find the link? The code is simple, and finding the XPath wasn't an issue; I just don't know whether this is something Selenium is able to do or not. I'm trying to do the click because the navigate() function will not work when I put in the proxy information, since Selenium doesn't have a built-in way to handle a proxy with username and password.
At the end of the day I'm trying to get the username/password box to pop up by clicking on the link. When I open the browser with Selenium and then manually click on the link, the username/password box pops up, but I can't get Selenium to find the element and click it programmatically.
var did = driver.FindElement(By.XPath("//*[@id='mv-tiles']/a[1]"));
did.Click();
UPDATE 1:
I was able to find the element once I took the iframe into consideration, but clicking is still an issue.
var frm = driver.SwitchTo().Frame("mv-single");
var did = frm.FindElement(By.XPath("//*[@id='mv-tiles']/a[1]"));
//did.Click(); <-- I can see it go to the element but nothing occurs
IJavaScriptExecutor js2 = (IJavaScriptExecutor) driver;
js2.ExecuteScript("arguments[0].click();", did);
The IJavaScriptExecutor is able to click the element, but Chrome blocks the redirect with the following message:
[21040:24704:1204/150143.743:ERROR:CONSOLE(1)] "Unsafe JavaScript attempt to initiate navigation for frame with URL 'chrome-search://local-ntp/local-ntp.html' from frame with URL 'chrome-search://most-visited/single.html?title=Most%20visited&removeTooltip=Don%27t%20show%20on%20this%20page&enableCustomLinks=1&addLink=Add%20shortcut&addLinkTooltip=Add%20shortcut&editLinkTooltip=Edit%20shortcut'. The frame attempting navigation is targeting its top-level window, but is neither same-origin with its target nor has it received a user gesture. See https://www.chromestatus.com/features/5851021045661696.
", source: (1)
FINAL UPDATE:
I gave up and decided to do the browser extension solution for proxies with passwords: https://stackoverflow.com/a/35293222/5415162
That list of "Most visited" tiles is actually in an iframe, which is probably why Selenium can't find it. Try updating the selector to account for the iframe, or add a wait clause to allow the iframe to finish loading.
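A rough sketch of that wait-then-switch approach in C# (the "mv-single" frame name comes from the update above; driver is the ChromeDriver instance, and the 10-second timeout is arbitrary):
using System;
using OpenQA.Selenium;
using OpenQA.Selenium.Support.UI;

// Wait until the most-visited iframe is attached, then switch into it before locating the tile.
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(10));
wait.Until(d => d.FindElements(By.CssSelector("iframe[name='mv-single'], iframe#mv-single")).Count > 0);
driver.SwitchTo().Frame("mv-single");
var tile = driver.FindElement(By.XPath("//*[@id='mv-tiles']/a[1]"));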
Regardless of that solution, I don't think it will act any differently from just navigating to the target URL. So, to fix your root problem, have you tried setting the proxy details when creating the ChromeOptions?
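For example, a minimal sketch of wiring an unauthenticated proxy into ChromeOptions (proxy.example.com:8080 is a placeholder; Chrome still won't accept a username/password this way, which is the root limitation mentioned in the question):
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

// Configure the proxy before the browser starts, rather than clicking through the new-tab page.
var proxy = new Proxy
{
    Kind = ProxyKind.Manual,
    HttpProxy = "proxy.example.com:8080",  // placeholder host:port
    SslProxy = "proxy.example.com:8080"
};
var options = new ChromeOptions { Proxy = proxy };
using (var driver = new ChromeDriver(options))
{
    driver.Navigate().GoToUrl("http://example.com/");
}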

Can't log in to site under Selenium WebDriver

I am having a strange issue where I am unable to log into a site under test with the Selenium WebDriver, but am not having any issues logging in when running the project under test in Visual Studio, or in our QA environment.
I have broken the test down to the simplest example: it lets me manually enter the username and password and click the login button, while the test waits for verification that it has moved on to the next screen (it waits for an element on that page).
All that happens when running under Selenium is a page refresh.
The test:
driver.Navigate().GoToUrl(this._baseURL + "Account/Index");
var wait = new WebDriverWait(driver, TimeSpan.FromSeconds(30));
wait.Until(drv => drv.FindElement(By.Id("element-on-next-page")));
The login button calls the jQuery $.ajax method to POST data to a service, and is correctly going into the success() method. The service returns a redirect URL.
This, in turn, attempts a redirect, which works when using the site manually but simply reloads the login page under a Selenium test:
window.location.replace(location.origin + result.RedirectTo);
I have verified that the result.RedirectTo is valid when the test is running (it is the page it should be redirecting to for successful login).
Tested with Chrome 71.0.3578.98, Firefox 64.0.2 and IE 11.472.17134.0. It works fine manually in all three.
I am unsure why this behavior is acting differently under automation.
UPDATE: The page it is attempting to redirect to has an [Authorize()] attribute on the controller. Removing this attribute allows the test to pass. The attribute only causes Selenium tests to fail, not manual testing.
Have you tried performing the login steps manually in the Chrome browser launched by Selenium?
I am not sure, but sometimes Google redirects to an authorization or verify-your-identity page just to ensure you are not using automation scripts to create multiple email signups or scrape website data.
Just try to run the same scenario manually in the browser launched by Selenium.
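One quick way to narrow this down (a sketch, assuming driver is the WebDriver instance from the test): dump the cookies the automated browser holds right after the AJAX login returns. If the authentication cookie is missing, the [Authorize()] redirect back to the login page is exactly what you would expect.
using System;

// List every cookie the automated browser currently holds.
foreach (var cookie in driver.Manage().Cookies.AllCookies)
{
    Console.WriteLine($"{cookie.Name} = {cookie.Value} (domain: {cookie.Domain})");
}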

Click on Captcha via Selenium always raises picture verification

So I am trying to send an SMS using this site via Selenium.
After choosing my country and my phone number I get this captcha:
With Selenium I succeed in clicking this checkbox, but I always get this image verification:
I tried putting several sleeps before the checkbox click, but the image verification still appears.
This behavior does not happen in the manual scenario, so my question is: why is this happening?
This is how I find my elements and click on the captcha checkbox:
IWebElement frame = driver.FindElements(By.XPath("//iframe[contains(@src, 'recaptcha')]"))[0];
driver.SwitchTo().Frame(frame);
IWebElement checkbox = driver.FindElement(By.CssSelector("div.recaptcha-checkbox-checkmark"));
checkbox.Click();
This is the whole point of captchas: to stop bots from getting past a certain point or crawling a webpage. They appear when a website is suspicious of bot activity; the images are shown just to make sure a human is on the other side, so that only a human is allowed further activity.
You don't get these images when you do it manually because you are using a local installation of a browser, which has saved cookies about your activities. When you launch the same screen via Selenium WebDriver, a fresh instance is launched, and to make sure that instance/session is legit, websites can ask for captcha details.
But in order to circumvent this situation, you can try to reuse sessions where you have already answered the questions; you can do this using DesiredCapabilities in Selenium (see the sketch below).
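For example, one way to approximate that in C# is to start Chrome with an existing user profile so its saved cookies come along (a sketch; the profile path is a placeholder and must point at a profile that has already passed the check):
using OpenQA.Selenium.Chrome;

// Reuse a real Chrome profile so the session carries its saved cookies.
var options = new ChromeOptions();
options.AddArgument(@"--user-data-dir=C:\path\to\chrome\profile");  // placeholder path
var driver = new ChromeDriver(options);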
Step outside Selenium and use Sikuli (http://www.sikuli.org/) to click on the button and do your stuff. Use the Java API of Sikuli, not the IDE, and use a relative region.
Although, as mentioned above, if you have access to the dev team, speak with the devs and turn the captcha off in the test flow.
Try logging into an old Gmail account before attempting to click the reCAPTCHA automatically.
Import the Sikuli JAR into your project and use the code below:
package com.test;

import org.sikuli.script.FindFailed;
import org.sikuli.script.Screen;

public class CaptchaClick {
    public static void main(String[] args) throws FindFailed, InterruptedException {
        // Locate the captcha checkbox on screen by its screenshot, then click it.
        Screen s = new Screen();
        s.find("source.png");
        s.click("source.png");
        System.out.println("Source image found");
    }
}

Execute function from a webpage that is activated onclick

I have to make a console application in C# which retrieves some data from webpages.
I have downloaded the HTML code from the main page of a website.
WebClient client = new WebClient();
string htmlCode = client.DownloadString(linkToWebpage);
I have verified the string and it is good.
After this part, I searched for a specific line in the HTML code which contains a button and a link.
<a rel="nofollow" class="link" onclick="loadornot()" href="http://aaaaa.com/D?WUrtlC1" target="_blank">Click to read more</a>
Now I am trying to download the HTML from the anchored link (the one in the href), but I am redirected to the main page and I am not sure why. Even if I copy the link from the href and paste it into a web browser, I am redirected to the main page.
I believe this happens because the button calls a function, onclick="loadornot()". Is that why it doesn't work the way I have tried? And if so, how could I call that function from my C# application to continue my app?
Thank you.
Edit:
I have found out that I need some cookies, more exactly a session code, to make that link work. How can I do that?
You can't run JavaScript code from a web page without a browser. So, if you really need to execute that function in the downloaded page, use some kind of headless browser, like webkitdotnet or awesomium.
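If, per the edit, the redirect is only about a missing session cookie rather than the JavaScript itself, a cookie-aware WebClient may be enough. A rough sketch (CookieAwareWebClient is a hypothetical helper name; the URLs are the ones from the question):
using System;
using System.Net;

// WebClient drops cookies between requests; this subclass shares one CookieContainer
// so the session cookie picked up on the main page is sent with the follow-up request.
class CookieAwareWebClient : WebClient
{
    public CookieContainer Cookies { get; } = new CookieContainer();

    protected override WebRequest GetWebRequest(Uri address)
    {
        var request = base.GetWebRequest(address);
        if (request is HttpWebRequest http)
            http.CookieContainer = Cookies;
        return request;
    }
}
Usage: load the main page first to pick up the session cookie, then request the anchored link with the same client:
var client = new CookieAwareWebClient();
string mainPage = client.DownloadString(linkToWebpage);
string detail = client.DownloadString("http://aaaaa.com/D?WUrtlC1");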

jQuery: load page in browser and execute jQuery - No Selenium

We have tried using Selenium for testing, but it has numerous setbacks, delays and sudden crashes.
jQuery sounds like a good alternative, but the challenge is how to jQuerify every page load in the browser.
Brandon Martinez here has an example of how to add jQuery to Chrome's console to jQuerify a page:
var element1 = document.createElement("script");
element1.src = "http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js";
element1.type = "text/javascript";
document.getElementsByTagName("head")[0].appendChild(element1);
We want that code to be automatically available on every page in the browser, without the need to manually click a bookmarklet on every page.
If we get around that then we can use C# code to:
Process.Start("chrome", @"target site");
and since jquery is already available for every page it will do the population and submit we want.
How can I automatically include jQuery for every page that gets loaded in the browser? Is it possible to do that via a Chrome plugin, jQuery, or C# code? Is it at all possible?
I've decided to use Fiddler to modify the response body before it is displayed in the browser. Now I can jQuerify every page that comes to the browser. Look at this link for a detailed example.
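For reference, a rough sketch of that idea using the FiddlerCore library (this assumes the FiddlerCore package; the API names follow its classic samples, so treat this as a starting point rather than a drop-in solution):
using System;
using Fiddler;

class JQueryInjector
{
    static void Main()
    {
        // Rewrite every HTML response so jQuery is loaded before the page's own scripts run.
        FiddlerApplication.BeforeResponse += session =>
        {
            if ((session.oResponse.MIMEType ?? "").Contains("html"))
            {
                session.utilDecodeResponse();
                session.utilReplaceInResponse("</head>",
                    "<script src=\"http://ajax.googleapis.com/ajax/libs/jquery/1.4.4/jquery.min.js\"></script></head>");
            }
        };

        // Listen on port 8877 and register as the system proxy so browser traffic flows through us.
        FiddlerApplication.Startup(8877, true, false);
        Console.ReadLine();  // keep the proxy alive until Enter is pressed
        FiddlerApplication.Shutdown();
    }
}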
