I work for a big company on a fairly big ASP.NET project.
Our supervisors decided that our application needs automated tests, and they chose the HP QuickTest Professional tool. I have a bad feeling about it. Has anyone ever used that tool to test ASP.NET pages? Is it a good choice? Do we need any additional tool or add-in? Can we use it without implementing the MVP (or MVC) pattern? I know there are NUnit, xUnit, etc., but most of them are prohibited in our company (don't ask why).
A larger question is what you are hoping to get from the testing. HP QuickTest Pro is usually used to automate regression or UI input testing, while NUnit, xUnit, and the like are used to create more focused unit tests, usually revolving around functionality.
Since you are referring to testing ASP.NET pages, I presume you are thinking of automating regression and/or UI testing. In that case you don't need to implement MVP/MVC, but it helps. HP QuickTest is well suited to this setup, and you can use its UI to set up tests quickly. That said, you are not limited to the recorder: you can switch to the Expert View and write VBScript to do pretty much whatever you need from a testing standpoint.
As for whether or not it's a good choice, that very much depends on your company's situation. At the end of the day, nothing can replace a properly executed regression script run by a warm-blooded human, and if management or the executives at your company don't fully understand that, you may end up in a situation where they attempt to replace or subvert human QA, which will be a disaster for everyone involved (voice of experience). If the higher-ups understand the limitations of the software, it can be pretty helpful. I would not expect it to change your life for the better, but it can be useful for preventing bugs from ever making it out of the dev environment, which can happen a lot if your shop is heavy with junior programmers.
I know this question comes up every now and then, but since it hasn't been revisited in a long time, I would like to ask it again. What would you recommend for creating and running automated tests of a WPF application? Recording capability would be nice too, of course, but it is not mandatory. Can you recommend something?
Aside from the possible automation bots - or better, UI testing through code - I haven't actually been working in that area lately. So I would like to approach this from a slightly different angle: how do you test a WPF application?
In general, your testing capabilities - or rather, how easy testing is - depend on how the program is structured.
Some examples:
Dependency Injection: works very well in combination with unit tests.
The MVVM pattern (well suited to WPF): works very well with Dependency Injection and unit tests, and lets you test the logic in isolation from the UI (see the sketch after this list).
Separation of layers - quite obvious, but often forgotten
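To make the MVVM + Dependency Injection point concrete, here is a minimal sketch (NUnit is assumed as the test framework, and IGreetingService / GreetingViewModel are made up for illustration): a view model that depends only on an injected interface can be exercised without touching any WPF types.

```csharp
using NUnit.Framework;

// Hypothetical service dependency, injected into the view model.
public interface IGreetingService
{
    string GreetingFor(string name);
}

// A stripped-down view model: a real one would also implement
// INotifyPropertyChanged and expose ICommand, but the testing idea is the same.
public class GreetingViewModel
{
    private readonly IGreetingService _service;

    public GreetingViewModel(IGreetingService service)
    {
        _service = service;
    }

    public string Name { get; set; }
    public string Greeting { get; private set; }

    public void Greet()
    {
        Greeting = _service.GreetingFor(Name);
    }
}

[TestFixture]
public class GreetingViewModelTests
{
    private class FakeGreetingService : IGreetingService
    {
        public string GreetingFor(string name) { return "Hello, " + name + "!"; }
    }

    [Test]
    public void Greet_UsesInjectedServiceToBuildGreeting()
    {
        var viewModel = new GreetingViewModel(new FakeGreetingService()) { Name = "Ada" };

        viewModel.Greet();

        Assert.AreEqual("Hello, Ada!", viewModel.Greeting);
    }
}
```

The fake here is hand-rolled to keep the example self-contained; a mocking library would do the same job.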
Of course there are also test concepts to take into account:
Unit tests
Integration tests
Consumer contract tests
UI tests
End-to-end tests
My guess is that you're looking for a tool to do the UI testing. As I said, it has been a while, so I can't give you a recommendation there - but what I can tell you is that good testability depends on a well-defined architecture.
Here's something on UWP; I am pretty sure there was something similar for WPF: https://learn.microsoft.com/en-us/visualstudio/test/test-uwp-app-with-coded-ui-test?view=vs-2019
I hope this helps.
Take a look at Appium and WinAppDriver. It's a platform for testing web apps, mobile apps, and desktop apps using the same API.
I have used SpecFlow together with Appium to do UI testing in a recent project.
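For what it's worth, here is a rough C# sketch of what driving a desktop app through WinAppDriver looks like. It assumes the Appium.WebDriver NuGet package, a WinAppDriver service already running on its default port 4723, and made-up paths and automation IDs; the exact option/class names vary a bit between client versions.

```csharp
using System;
using OpenQA.Selenium.Appium;
using OpenQA.Selenium.Appium.Windows;

class WinAppDriverSketch
{
    static void Main()
    {
        // Point the session at the application under test (path is hypothetical).
        var options = new AppiumOptions();
        options.AddAdditionalCapability("app", @"C:\MyApp\MyApp.exe");

        // WinAppDriver must already be running locally on its default port.
        using (var session = new WindowsDriver<WindowsElement>(
            new Uri("http://127.0.0.1:4723"), options))
        {
            // Automation IDs ("NameBox", "SaveButton", "StatusLabel") are made up.
            session.FindElementByAccessibilityId("NameBox").SendKeys("Alice");
            session.FindElementByAccessibilityId("SaveButton").Click();

            var status = session.FindElementByAccessibilityId("StatusLabel").Text;
            Console.WriteLine(status == "Saved" ? "PASS" : "FAIL");
        }
    }
}
```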
If you want to go for UI testing using recording (no coding required), there are multiple options. I haven't used any of them, but there are Micro Focus Silk, SmartBear TestComplete, and many more.
Some of these tools provide test case management as well.
I was browsing the internet and this site for a while, and instead of finding ways to unit test my existing code, the only advice I found was to separate the logic from the interaction with the user (the MVC approach). Although this is great for new projects, it is time-consuming and therefore too expensive an investment for existing ones. Is there a way to create specific unit tests, ideally automated, for existing GUI projects that unfortunately connect directly to databases or other systems to get data, and manipulate the data before it is shown? Currently we have two projects: one is MFC, the other C# on .NET 2.0. Thanks a lot.
Unit testing won't cut it here, considering you can't change your existing code (not to mention that you don't really unit test the UI). You should look for some kind of GUI test automation/scripting tool, like Sikuli. Quoting literally the first paragraph from their website:
Sikuli is a visual technology to automate and test graphical user interfaces (GUI) using images (screenshots).
It doesn't get any simpler than that. You "tell" the tool which parts of your UI it should observe and interact with, and it records and replays them. Skimming through this presentation will give you an idea of what exactly you can do (you might also check their video). It probably won't solve all your problems, but it might be an alternative worth considering.
Unit testing an existing project is always a challenge. However, I can point you to some open-source tools that will help you automate unit testing:
C++
Boost unit test framework
Google Mock
C#
NUnit
NMock
It is possible to produce some level of automated testing, but not unit tests.
Unit tests, by definition, test small units of logic decoupled from the system as a whole. I'd recommend that new code be written in the way you described (MVC, etc.) so that it is unit testable.
With your existing code, unit testing will obviously require refactoring, which I appreciate is not in your timeframe. You will need to work with what you've got and look at ways to perform more whole-system automated testing, probably driven through the UI. The fact that these are not unit tests is by the by; they are helpful tests to have even if you also have unit tests. It's helpful to know the distinction, though, when you are searching for resources.
You are probably best off searching for automated UI testing. For the .NET apps, you may find something like White useful.
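If White looks interesting, here is roughly what a test with it looks like. This is a sketch assuming the TestStack.White packaging of the library and NUnit; the executable path, window title, and automation IDs are made up.

```csharp
using NUnit.Framework;
using TestStack.White;
using TestStack.White.UIItems;
using TestStack.White.UIItems.Finders;
using TestStack.White.UIItems.WindowItems;

[TestFixture]
public class LegacyAppUiTests
{
    [Test]
    public void SavingAnOrder_ShowsConfirmation()
    {
        // Launch the existing WinForms/WPF executable (path is hypothetical).
        Application app = Application.Launch(@"C:\LegacyApp\LegacyApp.exe");
        try
        {
            Window window = app.GetWindow("Order Entry");

            // Automation IDs below ("CustomerName", "Save", "Status") are made up.
            window.Get<TextBox>(SearchCriteria.ByAutomationId("CustomerName")).Text = "Alice";
            window.Get<Button>(SearchCriteria.ByAutomationId("Save")).Click();

            var status = window.Get<Label>(SearchCriteria.ByAutomationId("Status"));
            Assert.AreEqual("Order saved", status.Text);
        }
        finally
        {
            app.Close();
        }
    }
}
```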
If you are lucky enough to have at least the Premium edition of Visual Studio 2010, then you could consider writing Coded UI Tests.
A UI test is basically an automated sequence of actions (mouse, keyboard...) against a GUI. These are very high-level tests (or functional tests), not unit tests, but they can help with testing a GUI application that already exists.
For instance, you can easily automate CRUD actions (which involve a database) and check (assert) that the actions have produced the expected result in the UI (a newly created item in a list, etc.).
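As a rough illustration, a hand-written Coded UI test (no recorder) looks something like this. It assumes the Microsoft.VisualStudio.TestTools.UITesting assemblies that ship with VS 2010 Premium/Ultimate; the application path, window title, and automation IDs are made up.

```csharp
using Microsoft.VisualStudio.TestTools.UITesting;
using Microsoft.VisualStudio.TestTools.UITesting.WpfControls;
using Microsoft.VisualStudio.TestTools.UnitTesting;

[CodedUITest]
public class CreateItemUiTest
{
    [TestMethod]
    public void AddingAnItem_UpdatesTheStatusLabel()
    {
        // Start the application under test (path is hypothetical).
        ApplicationUnderTest.Launch(@"C:\MyApp\MyApp.exe");

        // Find the main window by its title ("My App" is made up).
        var window = new WpfWindow();
        window.SearchProperties.Add(WpfWindow.PropertyNames.Name, "My App");

        // Type into a text box and click a button (automation IDs are made up).
        var nameBox = new WpfEdit(window);
        nameBox.SearchProperties.Add(WpfEdit.PropertyNames.AutomationId, "NewItemName");
        Keyboard.SendKeys(nameBox, "First item");

        var addButton = new WpfButton(window);
        addButton.SearchProperties.Add(WpfButton.PropertyNames.AutomationId, "AddButton");
        Mouse.Click(addButton);

        // Assert against something visible in the UI.
        var status = new WpfText(window);
        status.SearchProperties.Add(WpfText.PropertyNames.AutomationId, "StatusLabel");
        Assert.AreEqual("1 item", status.DisplayText);
    }
}
```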
Writing UI tests can be very time-consuming because there are many aspects you have to test. Thankfully there are a lot of frameworks to help with that, but you always have to write some code.
I assume you have already done unit testing (Visual Studio itself comes with a decent unit test framework), so what you want to check is not algorithms but UI automation and results. What does that mean? Everything that is code should be tested by code (database operations and algorithms, for example). Even some UI controls can be tested by code to a degree (for example: if I simulate a user click, I should get that event fired when this condition is true). Trust me, UI testing is a black art, and you'll often get failed tests even when everything is OK.
Simple stress scenario
For a simple scenario - for example, to stress your application to reproduce a bug by repeating the same operation many times - you can use a macro recorder (such as WinMacro). You record user inputs and then run that macro in a loop. If there's a subtle bug, you have a good chance of reproducing (and/or finding) it when those actions are repeated 5,000 times in a night. Once that's done, you can pull the data from your logs.
Simple scenario
If your application can be automated somehow (it may be easy for a .NET application using VSA), you can prepare a "known good" macro to automate an operation, write the results to a file, and compare them with a known-good results file.
Simple tip: for an MFC application you can write your own "macro" as a text file where each line is a Windows message with its parameters; read it, parse it, and SendMessage() each entry to your application to simulate user input, menu clicks, and so on. Grab, for example, a text box value and compare it with a known value. WinSpy++ is your friend.
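The same trick can also be sketched from C# via P/Invoke. The window title, control classes, and expected value below are hypothetical; in a real harness the message/parameter tuples would come from the text-file macro described above.

```csharp
using System;
using System.Runtime.InteropServices;
using System.Text;

static class Win32Macro
{
    const uint WM_SETTEXT = 0x000C;
    const uint WM_GETTEXT = 0x000D;
    const uint BM_CLICK   = 0x00F5;

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr FindWindow(string className, string windowName);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr FindWindowEx(IntPtr parent, IntPtr childAfter, string className, string windowName);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, string lParam);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, StringBuilder lParam);

    [DllImport("user32.dll", CharSet = CharSet.Auto)]
    static extern IntPtr SendMessage(IntPtr hWnd, uint msg, IntPtr wParam, IntPtr lParam);

    static void Main()
    {
        // Locate the target application's main window and two of its controls.
        IntPtr mainWindow = FindWindow(null, "My Legacy App");
        IntPtr editBox = FindWindowEx(mainWindow, IntPtr.Zero, "Edit", null);
        IntPtr okButton = FindWindowEx(mainWindow, IntPtr.Zero, "Button", "OK");

        // Simulate user input: type into the edit box, then click the button.
        SendMessage(editBox, WM_SETTEXT, IntPtr.Zero, "42");
        SendMessage(okButton, BM_CLICK, IntPtr.Zero, IntPtr.Zero);

        // Grab the edit box value back and compare it with a known-good result.
        var buffer = new StringBuilder(256);
        SendMessage(editBox, WM_GETTEXT, (IntPtr)buffer.Capacity, buffer);
        Console.WriteLine(buffer.ToString() == "42" ? "PASS" : "FAIL");
    }
}
```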
Complex scenario
For anything else (is my custom control drawing everything the right way? Do the UI colors change when the user clicks that button?) you have to use a more complex tool. There are several tools to automate UI testing; Visual Studio 2010 (not every edition) has what you need built in to create Coded UI tests. What does that mean? You write code to automate your application and then you write more code to check its results (sometimes even comparing bitmaps with known-good results). It can be tedious and a lot of work, but you can test virtually everything, even if the application wasn't designed for UI testing.
Start reading this from MSDN.
There is a plethora of commercial tools too (though I have never used them in any project), so I won't post any links; I'm sure Google will give you plenty of results.
Mocking is usually the best approach for simulating integration points, but unfortunately most mocking frameworks fall short if the code is too interconnected and buries dependencies in private methods, etc.
It might be worth looking into a framework called Moles (made by Microsoft) that has very few limitations on what you can mock. It even handles private methods!
Perhaps you could use it to mock your db calls to test your data manipulation?
There are several tutorials online.
Here's one that might get you started:
http://www.unit-testing.net/CurrentArticle/How-To-Mock-Code-Dependencies-Using-Moles.html
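For a flavour of what a Moles detour looks like in a test, here is a minimal sketch. It assumes the (now long-superseded) Moles add-in is installed and that a moles assembly has been generated for mscorlib, which is what provides the MDateTime type; ReportHeaderBuilder is a made-up class that is hard to test because it calls DateTime.Now.

```csharp
using System;
using Microsoft.VisualStudio.TestTools.UnitTesting;
using System.Moles; // generated moles assembly for mscorlib

// Hypothetical production class that is hard to test because of DateTime.Now.
public class ReportHeaderBuilder
{
    public string Build()
    {
        return "Report generated on " + DateTime.Now.ToString("yyyy-MM-dd");
    }
}

[TestClass]
public class ReportHeaderBuilderTests
{
    [TestMethod]
    [HostType("Moles")] // the test must run under the Moles host to allow detours
    public void Build_UsesCurrentDate()
    {
        // Detour the static DateTime.Now property to a fixed value.
        // Note: depending on the Moles version, detouring mscorlib types may
        // also require an assembly-level attribute declaring the moled type.
        MDateTime.NowGet = () => new DateTime(2010, 1, 1);

        var header = new ReportHeaderBuilder().Build();

        Assert.AreEqual("Report generated on 2010-01-01", header);
    }
}
```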
My question is quite general, but I think the answers will be specific.
All I want to know is:
Is there a process, set of steps, or mechanism for testing an application (a web application) in a professional way?
Many times, when I finish developing and try out my application, testing it with dummy data several times, I think everything is okay and that I have covered all possible scenarios - and then I find I forgot important issues, or others tell me they found problems in my application.
How do I overcome this problem, and save my time?
Good links:
http://www.masukomi.org/talks/unit_testing_talk_2/index.xul?data=slide_data.txt#page215
http://butunclebob.com/ArticleS.UncleBob.TheBowlingGameKata
https://stackoverflow.com/questions/185021/rhino-mocks-good-tutorials
http://daptivate.com/archive/2008/02/12/top-10-best-practices-for-production-asp-net-applications.aspx
http://dotnetslackers.com/IIS/re-60499_What_is_It_is_an_error_to_use_a_section_registered_as_allowDefinition_MachineToApplication_beyond_application_level.aspx
As a professional tester, my suggestion is that you should have a healthy mix of automated and manual testing.
AUTOMATED TESTING
Unit Testing
Use NUnit to test your classes, functions, and the interaction between them.
http://www.nunit.org/index.php
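For example, a minimal NUnit test looks like this (the PriceCalculator class is made up just to have something to test):

```csharp
using NUnit.Framework;

// Hypothetical class under test.
public class PriceCalculator
{
    public decimal ApplyDiscount(decimal price, decimal percent)
    {
        return price - (price * percent / 100m);
    }
}

[TestFixture]
public class PriceCalculatorTests
{
    [Test]
    public void ApplyDiscount_TenPercent_ReducesThePrice()
    {
        var calculator = new PriceCalculator();

        Assert.AreEqual(90m, calculator.ApplyDiscount(100m, 10m));
    }

    [Test]
    public void ApplyDiscount_ZeroPercent_LeavesThePriceUnchanged()
    {
        var calculator = new PriceCalculator();

        Assert.AreEqual(100m, calculator.ApplyDiscount(100m, 0m));
    }
}
```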
Automated Functional Testing
If possible, you should automate a lot of the functional testing. Some frameworks have functional testing built into them; otherwise you have to use a separate tool for it. If you are developing web sites/applications, you might want to look at Selenium.
http://www.peterkrantz.com/2005/selenium-for-aspnet/
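A minimal sketch of a Selenium WebDriver test in C# (the URL and element IDs are made up; this assumes the Selenium.WebDriver NuGet package plus a browser driver such as ChromeDriver on the path):

```csharp
using NUnit.Framework;
using OpenQA.Selenium;
using OpenQA.Selenium.Chrome;

[TestFixture]
public class LoginPageTests
{
    [Test]
    public void Login_WithValidCredentials_ShowsWelcomeMessage()
    {
        using (IWebDriver driver = new ChromeDriver())
        {
            // URL and element IDs below are hypothetical.
            driver.Navigate().GoToUrl("http://localhost:8080/Login.aspx");

            driver.FindElement(By.Id("UserName")).SendKeys("demo");
            driver.FindElement(By.Id("Password")).SendKeys("secret");
            driver.FindElement(By.Id("LoginButton")).Click();

            StringAssert.Contains("Welcome", driver.FindElement(By.Id("StatusMessage")).Text);
        }
    }
}
```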
Continuous Integration
Use CI to make sure all your automated tests run every time someone in your team makes a commit to the project.
http://martinfowler.com/articles/continuousIntegration.html
MANUAL TESTING
As much as I love automated testing, it is, IMHO, not a substitute for manual testing. The main reason is that an automated test can only do what it is told and only verify what it has been instructed to treat as pass/fail. A human can use their intelligence to find faults and raise questions that come up while testing something else.
Exploratory Testing
ET is a very low-cost and effective way to find defects in a project. It takes advantage of the intelligence of a human being and teaches the testers/developers more about the project than any other testing technique I know of. Doing an ET session aimed at every feature deployed in the test environment is not only an effective way to find problems fast, but also a good way to learn - and fun!
http://www.satisfice.com/articles/et-article.pdf
Visual Studio has great testing tools:
http://msdn.microsoft.com/en-us/library/ms182409.aspx
http://channel9.msdn.com/Blogs/briankel/Visual-Studio-Test-Professional-2010-The-Tool-for-the-Modern-Tester
Selenium is a great suite of tools to help test web applications. I'd recommend having a look at that.
This is a very big subject, there are hundreds of books written about software testing. The Wikipedia article should get you started on some concepts, but you really need to learn a lot more.
This SO question should be useful in choosing a book to start with.
I use http://xunit.codeplex.com in combination with http://www.jetbrains.com/resharper/.
Use either the MS Test framework or NUnit.
I recommend reading about unit tests and focused integration tests.
For full system tests use WatiN.
A lot more than a few nice tools goes into "professionally" testing any application.
But sticking with tools for the moment, a good tool for testing .NET sites is WatiN. A good example of using WatiN in a real-world situation is the DotNetNuke Automation Tests project: a continually growing set of automated tests that DotNetNuke Corp. uses to test DotNetNuke on a daily basis, and best of all it's open source.
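A WatiN test is short enough to show whole. This is a rough sketch: the URL, element names, and expected text are made up, and WatiN's IE automation needs the test to run on an STA thread, which is usually arranged through the test runner's configuration.

```csharp
using NUnit.Framework;
using WatiN.Core;

[TestFixture]
public class SearchPageTests
{
    [Test]
    public void Searching_ShowsAResultsHeading()
    {
        // Drives a real Internet Explorer instance against the running site.
        using (var browser = new IE("http://localhost/MySite/Default.aspx"))
        {
            browser.TextField(Find.ByName("SearchBox")).TypeText("watin");
            browser.Button(Find.ByValue("Search")).Click();

            Assert.IsTrue(browser.ContainsText("Search results"));
        }
    }
}
```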
The product I have been working on has been in development for the past six years. It started as a generic data entry portal into an insanely complex part WPF/part legacy application. The system has been developed for all these years without a single Unit test in its fold. Now, the point has been raised for a comprehensive unit testing framework. I have been recruited recently to work on this product and have been tasked to get the 'Testing' in order. Since the team that worked on the product for the last six years adopted 'Agile', the project lacks any documentation of the business rules or any design documents.
I have been trying to write unit tests for some of the modules, but I am not sure what to mock, how to set up my test fixtures, and ultimately what to test for, since a casual glance at the methods does not reveal their intentions. Also, it has come to my attention that the code was not developed with any particular methodology in mind.
Given the situation, I was wondering if the good people of Stack Overflow could give me some advice on how to salvage this situation. I have heard that the book 'Working Effectively with Legacy Code' has something to say about this general situation, but I was hoping to get some pointers from individuals who have encountered similar situations within the same technology stack (C#, VB, C++, .NET 3.5, WCF, SQL Server 2005).
In my opinion, the best way is to start by "stabilizing" the current functionality using integration tests. Try to create tests whose entry points are not likely to change later. With the integration tests in place, you gain confidence that the refactoring needed later for unit testing doesn't break anything.
The next step is to unit test the code. If you're free to refactor, you can start separating logic into classes (e.g. logic buried in the view layer) and add unit tests to them. Through this process you also get to know the product's code better.
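To make that concrete, here is a small made-up example of the "separate the logic, then test it" step: pricing logic that used to live inside a button-click handler, pulled out into a plain class so a unit test can pin its current behaviour before any further refactoring (NUnit assumed).

```csharp
using NUnit.Framework;

// Hypothetical logic extracted from a code-behind/view class.
public class DiscountPolicy
{
    public decimal Apply(decimal amount, bool isPreferredCustomer)
    {
        return isPreferredCustomer ? amount * 0.9m : amount;
    }
}

[TestFixture]
public class DiscountPolicyTests
{
    // Characterization tests: they document what the code does today,
    // so later refactoring cannot silently change the behaviour.
    [Test]
    public void PreferredCustomers_CurrentlyGetTenPercentOff()
    {
        // true = preferred customer
        Assert.AreEqual(90m, new DiscountPolicy().Apply(100m, true));
    }

    [Test]
    public void RegularCustomers_CurrentlyPayFullPrice()
    {
        // false = regular customer
        Assert.AreEqual(100m, new DiscountPolicy().Apply(100m, false));
    }
}
```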
I highly recommend reading Working Effectively with Legacy Code; many of the problems you're going to encounter already have solutions :)
Unit testing legacy code can be a challenge, depending on the existing code and on how much of it you are allowed to change. Some tools can help: for writing integration tests, you can use the White framework to automate the GUI. Another tool for writing unit tests without forcing major changes in the code is Typemock Isolator (disclaimer: I work at Typemock); it allows faking most dependencies without changing the production code. There are many other tools that can ease the process - try to find them and make the best use of them :)
#sc_ray: I know this may sound pretty obvious, but I believe that before you start writing tests against the existing code base, you should focus on making sure you are using an MVVM approach when interacting with your UI. Being a legacy app does not necessarily mean you have code updating the UI directly with if statements, but the older the project, the easier it is for people to bypass more modern software development styles. All I'm saying is that I would make sure I am making the most of binding, commanding, and all the wonderful infrastructure facilities WPF provides. Otherwise, important pieces of your business logic won't be testable, and you could potentially be writing tests against less relevant code...
Here are some typical answers (ranked in ascending order of corniness) I get from managers/bosses whenever I bring up the importance of having unit tests and code coverage as an integral part of the development cycle:
"That is the job of QA, just focus on features and development"
"The application is not mission critical, if there are some bugs it's not the end of the world"
"We can't afford to spend time on unit testing"
"Try not to get too fancy"
In spite of having the best intentions of doing a good job, at the end of the day, when the time comes for the blame game, the burden finally falls on the developer.
All too often I've seen things break in production, some of which could have been avoided by catching the bugs earlier with unit tests.
I just wanted to get a conversation going to see what people's experiences have been and what the best way is to tackle this.
UPDATE: Thanks everyone for a lot of insightful advice. There are several answers that I wish I could select as the right answer.
Introducing unit tests into the development process is like an investment: you have to put some money up front to get a profit later. Management should be more receptive to this analogy if you follow it through: describe what investments are required, and then lay out a plan for the profits.
For example:
Investments:
time spent implementing test infrastructure (no serious product unit tests are possible without test-specific infrastructure code that streamlines product-specific test patterns, test data creation/removal, etc.);
time spent writing the actual tests;
time spent reviewing and maintaining tests;
etc.
Profits:
no bug ever re-appears without a sign;
no major features are released without unit tests passing;
for the majority of bugs, the develop-QA-fix cycle is cut in half, to develop-unit test-fix;
etc.
Most managers won't see the advantages of unit testing until they see it in action where it makes sense, so my advice, based on experience, is to take the following steps:
Apply unit tests to recurring bugs - This is the best use case for proving the value of unit tests. When you have bugs that appear and reappear every other build, a unit test lets developers see which changes caused the bugs, aside from alerting them in advance that a fix is in order (see the sketch after this list). It's quite easy to demonstrate to management as well.
Apply unit tests to regular bugs - With the usefulness of unit tests now clearly demonstrated, several instances of recurring bugs disappearing in the long term should be enough to encourage everyone to use unit tests to evaluate all bugs, to prevent them from becoming recurring bugs.
Apply unit tests to new functionality - With unit tests making sure that old bugs don't recur, and confirming that they were fixed in the first place, the next step is to apply them to new functionality to ensure that bugs are minimized. Make it clear that it is impossible to totally eliminate bugs.
Apply full blown TDD - The final step will be to apply unit testing even before coding, as a design tool that both helps in designing code and minimizing bugs.
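For step 1, the "pin the recurring bug" test can be as small as this sketch (the bug number, class, and expected value are all made up); the point is that the exact failing input lives in the test suite forever.

```csharp
using NUnit.Framework;

// Hypothetical production code that kept regressing.
public static class OrderTotals
{
    public static decimal Total(decimal subtotal, decimal shippingFee)
    {
        return subtotal + shippingFee;
    }
}

[TestFixture]
public class RecurringBugRegressionTests
{
    [Test]
    public void Bug1234_OrderWithFreeShipping_IsNotChargedTwice()
    {
        // The exact input combination from the original bug report;
        // if anyone reintroduces the defect, this test fails immediately.
        Assert.AreEqual(50m, OrderTotals.Total(50m, 0m));
    }
}
```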
Of course I'm not saying that this is easy -- what I stated above is an oversimplification which even I struggle with every day -- it's difficult to convince everyone.
If later you decide to move on to a different company, you may want to explicitly look for a company that practices TDD.
People listen to their wallets. Show how much time you can save by catching bugs early on. Translate that to dollar-savings.
With regard to #3, spending time on unit testing will most likely decrease overall time to market. Great article - http://blog.scottbellware.com/2008/12/does-test-driven-development-speed-up.html
For me, the biggest advantage of adopting unit tests is that they change my coding habits to make the code more testable - in other words, more loosely coupled.
If you cannot practice unit testing in your real project due to management issues, I would practice on a small toy project, just to force yourself to find a way to write testable code even when there are no unit tests.
My own 2 cents.
If unit testing (I'm assuming you're talking about TDD here) is that important to you, you should use your own time to write the tests (assuming you have any). If you do, keep a record of how much time you actually spend writing them, and after they've been in place for a release cycle or two, go to your managers with some data.
If the answers you've posted are really what your managers are saying then you work for idiots and perhaps some hard data can sway them. Given the market, quitting isn't likely an option and playing office politics won't get you anywhere (or improve the quality of your code).
Until your managers understand that TDD IS NOT solely about preventing bugs or "testing" they will NEVER get it. TDD is about design and overall code quality.
You have to show them. If they can't be persuaded then I would start looking.
Quietly ;)
My short and incomplete advice would be to:
Just change jobs. A company whose managers give that sort of answer is going to fail anyway, and soon. Get out before it's too late.
Reverse the blame game. Every time something gets released without unit tests, make an official statement that it was released that way and that you are not guaranteeing it's bug-free.
Write down the time you spend on tasks, separating bug fixing after failed deployments, and total it against a (potential) time allotment for writing unit tests.
Why don't you just write unit tests for your code? Do you know whether other developers are having the same problem? They will probably follow your example and write unit tests too.
I don't think the problem is the technique or the cost of an integration server. The problem is management's attitude towards unit testing. So convince them together with all the developers.
There are lots of hints in this thread (Jon Limjap's answer), try it!
Don't sell management on a particular approach; that's just going to be difficult and isn't really going to buy you much. Whether or not your management chain appreciates unit tested code doesn't matter.
Sure, unit testing your code has a lot of benefits associated with it, but don't rely on management buy-off to write your tests. When people start seeing results, they'll flock towards The Right Thing.
One other thought to add to the other excellent comments on this thread (many of which I've upvoted): make sure that your management knows that Unit testing is very highly automated at this point. I find it very impressive to pop NUnit on the screen, hit the "Run All" button and see dozens of green-lighted tests being passed in seconds. Do that once, saying "this verifies that all of my older work is still correct despite all of my newer changes" and you just may win a few converts. In any event, they'll come to trust you - with your visible proof of quality - more than they trust others. That can only be good for your career.
Well, the classic response is that the earlier you catch a bug, the less expensive it is to fix; I think most managers can relate to that.
As Mark said, showing something concrete is the best way to convince PHBs that something is good, since they are so used to hearing talk and probably don't know the difference between unit testing and other kinds of testing.
Now there is a resource to help. A modern list of use cases, tangible evidence for TDD.
Do you need to convince your boss or teammates that TDD is actually being used? That it's not just theory? That it's not just hearsay?
Now you can: check out WeDoTDD.com, a list of companies that practice TDD and the stories behind those teams.
That's exactly why I created this site, to put to rest the arguments around "TDD proof" and "Does TDD work" and "Who is doing TDD".
You can also learn a lot about the topic itself there by reading the stories behind these companies and teams practicing it.