I'm involved in a project where we use a Continuous Integration server and NUnit for unit testing and integration testing.
A customer asked us the other day whether we write the tests before the code... Well, we don't always do it that way, especially when there are complex technology issues that we want to explore first to understand the problem and the possible solutions.
I would like to know whether we can still consider our development process Agile, and say so to customers without lying.
I think you are mixing things up here.
Test-Driven Development (TDD) does not necessarily mean you are using an agile approach. Certainly, it is a best practice that many of us who work agile use, but TDD can also be used in a waterfall process, replacing or supplementing the specification.
Continuous Integration on its own means having the code your team produces integrated at least daily. This not only forces every member of the team to merge/check in continuously, but also ensures you can actually release every build. The unified build process forces you to overcome the "works on my machine" syndrome. Because you could do a release every day, CI supports an agile process, even though it is not strictly necessary for one.
Using tests and integrating them into the build process is a way to enrich your build process with automated quality assurance and to deepen the level at which integration (integrity) is actually tested.
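To make that concrete, here is a minimal sketch of the kind of NUnit test a CI server can run on every build. The PriceCalculator class and its discount rule are hypothetical, purely for illustration:

using NUnit.Framework;

// Hypothetical class under test: applies a 10% discount above 100.
public class PriceCalculator
{
    public decimal Total(decimal itemPrice, int quantity)
    {
        decimal total = itemPrice * quantity;
        return total > 100m ? total * 0.9m : total;
    }
}

[TestFixture]
public class PriceCalculatorTests
{
    // Fast, deterministic tests like this one can run on every CI
    // build, so a regression fails the build before it is released.
    [Test]
    public void Total_AppliesTenPercentDiscount_ForOrdersOver100()
    {
        var calculator = new PriceCalculator();

        Assert.AreEqual(180m, calculator.Total(itemPrice: 200m, quantity: 1));
    }
}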
As long as you are developing in small iterations, focus on getting a working product rather than on producing extensive documentation, and keep the customer continuously involved in the project, it is agile development. Unit testing, TDD, and integration testing are of course good and very advisable practices, but they do not decide whether your project is agile or not.
In the absence of automated tests, CI only verifies that the code under source control is kept in a compilable state between revisions and that the single-step build works properly. While this is useful, it isn't as useful as automatically verifying that the correctness of the code has been maintained between revisions.
With that said, I'd rather have some verification of code between check-ins than none. I'd rather have partial code coverage or an incomplete set of functional tests than nothing. Or worse.
My question is: where is an easy place in the codebase to begin writing unit tests, so I can get my feet wet and develop some basic skills?
The details:
I am a recently hired junior dev, and I have been tasked with adding unit testing to my company's projects. I've installed NUnit, watched tutorials, worked through the basic samples, and am reading the official docs. I can see the value of automated unit testing, and I think it's a good way to get familiar with my company's source code.
We have two online platforms: a "brochure" site to attract clients and our "database" site. Both are written in JavaScript, jQuery, C#, and .NET, connecting to MSSQL via Entity Framework.
The "brochure" site doesn't have much user functionality built in. Visitors click links to get to pages of information, and there is a contact request form where they submit basic info so we can contact them. A very simple website.
The "database" site is more complex. Registered users can pull any combination of data from different databases, there are access rights tied to various levels of registration, and there is lots of filtering based on users' selections.
There aren't currently any tests written for either, but they are both in production. Where would be a good place to begin?
I think I should begin with the simple "brochure" site to learn on, but I see testing as more important in the long term for the "database" site.
Should I try to test controllers, AJAX GET/POST functions, views?
I think what I'm hoping for is for someone to say, "These 1, 2, 3 are simple tests to write; start there." If my thinking is wrong, please let me know. Thank you.
If the applications currently have no unit tests, then it's likely that they aren't immediately "testable." In other words, they may not be built out of smaller classes and methods that can be unit tested. The two go hand in hand: people who write unit tests try to write testable code.
Chances are that you will need to refactor a little bit in order to be able to write unit tests. One approach is to look for "low-hanging fruit" - areas where that refactoring is easier.
For example, are there large blocks of duplicate code? You could write a class that provides that functionality and replace the duplicated blocks with calls to that class. Now you can unit test that class.
Or does a class have a ton of private methods that do things seemingly unrelated to the class's primary purpose? There's no way to unit test those private methods directly, but you could break them out into their own classes.
For the longest time I thought that refactoring meant redesigning the code to use dependency injection or rewriting classes to depend on abstractions, and that made it seem overwhelming. I recently read something that changed my outlook: an article titled "Invading Legacy Code in the Name of Testability."
The gist is this (or at least what I remember): suppose we have a class that calls a bunch of private methods, and that class is untestable. We can move one of those methods into its own class, refactor it to make it testable, and then write unit tests for it.
Then, in our original class, we change
DoSomethingInAPrivateMethod();
to
var doesSomething = new DoesSomething();
doesSomething.DoSomething();
Now we can write unit tests for our DoesSomething class.
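For illustration, a test for the extracted class might look like the sketch below. The body of DoesSomething is invented here (the snippet above leaves it unspecified), so treat it as a stand-in for whatever logic used to live in the private method:

using NUnit.Framework;

// Hypothetical body for the extracted class.
public class DoesSomething
{
    public string DoSomething(string input)
    {
        return (input ?? string.Empty).Trim().ToUpperInvariant();
    }
}

[TestFixture]
public class DoesSomethingTests
{
    [Test]
    public void DoSomething_TrimsAndUppercases()
    {
        var doesSomething = new DoesSomething();

        Assert.AreEqual("HELLO", doesSomething.DoSomething("  hello  "));
    }

    [Test]
    public void DoSomething_ReturnsEmptyString_ForNull()
    {
        var doesSomething = new DoesSomething();

        Assert.AreEqual(string.Empty, doesSomething.DoSomething(null));
    }
}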
Normally we might try to avoid creating new classes like that, and we're still not applying good OOP principles. But in this case, what we've done is take a small piece of code that couldn't be tested and make it testable. It's not ideal, and it's not the way we'd like to write our code, but it's better than it was. At least now something has unit tests.
You might only get to move 20 lines of code out of one class and into another, but at least that original class is now a tiny bit more manageable. Maybe after doing that, the original class will get to the point where you can unit test it, too.
First of all: there are different types of tests to think about.
Unit tests in general are used to test small units (functions, for example).
So writing tests for every non-trivial function in a class/module (no need to test getters, setters, or anything similar) could be a good place to start. Calls to a database or other external services (e.g. an API) should be mocked.
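A minimal sketch of that idea with NUnit and Moq follows; the ICustomerRepository interface and GreetingService class are hypothetical examples, not from your codebase:

using Moq;
using NUnit.Framework;

// Hypothetical dependency that would normally hit the database.
public interface ICustomerRepository
{
    string GetName(int customerId);
}

// Hypothetical unit under test.
public class GreetingService
{
    private readonly ICustomerRepository _repository;

    public GreetingService(ICustomerRepository repository)
    {
        _repository = repository;
    }

    public string Greet(int customerId)
    {
        return "Hello, " + _repository.GetName(customerId) + "!";
    }
}

[TestFixture]
public class GreetingServiceTests
{
    [Test]
    public void Greet_UsesNameFromRepository()
    {
        // The database call is replaced by a mock, so the test
        // stays fast and deterministic.
        var repository = new Mock<ICustomerRepository>();
        repository.Setup(r => r.GetName(42)).Returns("Ada");

        var service = new GreetingService(repository.Object);

        Assert.AreEqual("Hello, Ada!", service.Greet(42));
    }
}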
You can create a "tests" folder or sub-project in the root folder of your project. The structure of this test folder should mirror your code project's sources, to make it easy to find the test class/file for a specific class in the code project.
Later you can (and should) think about integration tests. This is where you actually connect to your database or other external services.
I'm writing some test code for an ASP.NET MVC web application built with Castle Windsor DI, Domain-Driven Design (application/domain services, repositories, domain model), NHibernate, and (most likely) Moq for mocking. The possibilities of what can be tested are endless, as basically everything can be tested.
Some possibilities are, for example:
Ensuring the Castle Windsor configuration will work (testing some conventions)
Business logic (inside an entity or domain service)
Other things can be tested too, such as the controller actions, etc.
There are quite a few things (so many layers - controllers, services, repositories) that hardly seem worth any effort to test, as they are quite simple in general.
With a smaller application it isn't yet clear what would benefit most from testing, but the application will grow, and the same patterns will be used on more complex applications.
For those with similar applications, what are you unit testing?
Domain models and application services are the first candidates for unit tests if you don't have enough time or are new to writing tests. These tests cover the most important parts (application services for flow control and domain models for business rules). When I started learning to write tests (I didn't know TDD at the time), they were the only parts I tested.
After adopting TDD, everything can be tested. You'll also need integration tests covering persistence, messaging, and other integration points (mostly for testing configurations).
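For example, a business rule on a domain model can be unit tested with plain NUnit and no infrastructure at all. The Order entity below is hypothetical, just to show the shape of such a test:

using System;
using NUnit.Framework;

// Hypothetical domain entity with a single business rule.
public class Order
{
    public decimal Total { get; private set; }
    public bool IsShipped { get; private set; }

    public void AddItem(decimal price, int quantity)
    {
        if (IsShipped)
            throw new InvalidOperationException("Cannot modify a shipped order.");
        Total += price * quantity;
    }

    public void Ship()
    {
        IsShipped = true;
    }
}

[TestFixture]
public class OrderTests
{
    [Test]
    public void AddItem_Throws_WhenOrderAlreadyShipped()
    {
        var order = new Order();
        order.Ship();

        // The rule lives entirely in the entity, so no database,
        // container, or mock is needed to verify it.
        Assert.Throws<InvalidOperationException>(() => order.AddItem(10m, 1));
    }
}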
The best answer to the "what to unit test" question is... everything - according to TDD :)
However, ensuring that properly configured components work well together is rather a part of integration testing or smoke testing.
The suggested scenario in TDD is to write tests in parallel with the code, so both grow simultaneously. Your problem probably lies in the fact that you already have the code but don't have the unit tests. In such a situation there are two possibilities:
1) Your components are well separated. Then you can write unit tests for the public interface of each component, aiming to achieve high coverage (measured with a coverage tool).
2) Your components are tangled together. Then I'd suggest writing as many integration tests as you can, also aiming for high coverage. This will be harder than in case 1), so it's best to test the most typical and crucial scenarios, then refactor the code to loosen the coupling, and then proceed with step 1).
We are looking for methods to improve our internal processes when developing software.
We have already adopted TDD, continuous integration, and Agile. Are there any obscure features of Visual Studio or TFS that could help?
All suggestions welcome.
I don't think there can be a magic bullet. Beauty (or in this case quality) is in the eye of the beholder.
With that said, I can give you some suggestions as to some of the ways we ensure code quality.
One suggestion is to include code coverage in your assessment of software quality. It is one thing to have unit tests written for your code, but code coverage helps you identify what code is actually 'covered' by a test, which can sometimes reveal use cases/scenarios that you may not have considered. I recommend you investigate NCover.
You may also wish to dig deeper and look into using NDepend:
NDepend is a tool that simplifies managing a complex .NET code base. Architects and developers can analyze code structure, specify design rules, plan massive refactoring, do effective code reviews and master evolution by comparing different versions of the code.
I appreciate that these are not TFS features, but you can easily 'integrate' them into your Visual Studio environment using TestDriven.NET.
This is of course not an exhaustive list of things - you need to find what suits you and gives you confidence in your code's quality.
Hope this helps.
Here's a helpful list: http://www.joelonsoftware.com/articles/fog0000000043.html ("The Joel Test: 12 Steps to Better Code" by Joel Spolsky)
If you really don't have anything else to do, and you have the resources, you can try code review. This procedure involves double-checking the code before every (real) commit, which helps catch bugs early in the development process. Google uses this technique widely.
Sadly, the tools supporting this kind of procedure are currently pretty basic and hard to use. If you do some googling you'll find one or two simple code review tools for TFS.
Be careful, though. These techniques alone don't make good software. You still need a good architecture, quality code, etc. (Okay, TDD helps code quality, but architecture is still a gray area.) I'm not aware of any technique that currently helps with that and doesn't hurt the development process too much. You'll have to wait until Visual Studio 2010 comes out with all the bling-bling of model validation, automatic UML diagram generation, etc.
Personally, I believe in code reviews. Some of their advantages are:
Constantly keeping an eye on code quality and coding standards.
It's easy to notice any unusual/buggy/hard-to-understand coding structures (i.e. long if conditions, strange type conversions, etc.).
While reading someone else's code it's easier to notice false assumptions (e.g. "this object is never null") that could introduce bugs.
It keeps you familiar with all changes to the source code, so it's easy to remember which piece of code was recently modified (and could therefore have introduced a bug).
It makes it easy to learn (and teach) good habits.
And I do not believe in any tool that is treated as some special kind of silver bullet - there is no such thing.
Read: Code Complete.
It is good that you have implemented TDD, CI, and Agile. Just having any process at all is far better than many places I have seen. Code reviews are probably the single best way to disseminate knowledge and flush out defects early.
For my money, though, stick to the basics. If you are not doing requirements management, you should consider it. You should know what your customer's requirements are when you start your sprint or development cycle. Conduct a review of those and discover your derived requirements, if any. Lastly, you should come up with a way to verify that 1) you built everything you intended and 2) you tested everything the customer asked for. There are formal processes for this, but if you can find a way that works for you, do it!
Requirements management takes less time than code reviews and catches the "really big" mistakes.
You are probably the happiest guys around if you have already adopted TDD, CI, and Agile and have nothing left to do =) Still, I think you have a very big field of process improvement within TDD and Agile practices themselves.
I just attended a presentation where a testing company stated that V-model testing can be used in an Agile development team working in Java and C#.
I would prefer test-driven development and automated acceptance testing over V-model testing.
I'm not sure V-model testing and Agile testing can be considered the same thing.
I'm looking for your testing experience or opinions about using V-model testing in Agile teams.
If you are using V-model testing in Agile, how do you do it (or does it simply not make sense)?
Update: ThoughtWorks presentation (Agile vs. V-model)
The V-model is widely used at my company. I must add that, IMHO, there are better development models out there, but the V-model can still be used effectively when developing large-scale systems where you are NOT using iterative development.
Still, it's my view that test-driven development can be applied within the V-model, as part of the unit testing phase and even the integration testing phase, as long as you can automate it as part of the development cycle.
The V-model, however, treats system testing as a test that occurs after development of the product is considered complete, so test-driven development doesn't apply there. Sure, you can automate it through tools, scripts, or programs, but you are no longer developing your code. During system tests you no longer care about the code, only about the specifications. That is because your unit tests could be incomplete.
Finally, user acceptance in the V-model shouldn't be completely automated, because it's when the end user looks at the system and decides whether it adheres to the requirements or not. Of course the user will have a script in hand in order to know what he/she should be testing, and in the case of, say, batch systems, there will be a supply of data, but in no way should a script determine the success of this phase.
But let's get back to the question. What I just said is that TDD and automation can be used to implement the testing phases of the V-model. So, if you can use V-model testing with Agile development, as the presentation you saw claimed, then you can also use TDD and automation techniques.
However, I'm not sure you would want to. I don't know how one could apply the V-model to Agile, or whether it would even be coherent, since the V-model is not agile.
Test-driven development is about specification, not testing. That is not antagonistic to a V-model approach.
On the other hand, the V-model implies a single long development cycle, which is antagonistic to an agile approach.
V-model testing doesn't really fit with the ethos of agile development. So in short, while it is feasible that it could be done, it would compromise the nature of the agile process.
One of the important features of agile is the ability to adapt to change, and the V-model doesn't really support that well.
I have been learning about TDD and BDD and using them as much as I can at work, but in an environment that really isn't agile.
I want to try them out properly on a pet project at home. So, going with the agile approach and the principle of having working software, what should I write first? Should I create some forms first (this will be WPF/WinForms), linked to a dummy model? Or should I work from the ground up? Any advice would be great!
EDIT
I know that I will be writing tests first. The question is at a higher level: should I aim to put a working model and business layer together first, or start with a form and a dummy model?
You are looking at it from the wrong perspective.
You are separating the application into horizontal components - a UI component, a backend component, and so on.
Instead, you should look at it vertically: what are the different features? They could be login/logout, displaying a list of users, displaying some other data, and so on. Sort those slices into priority order - which one needs to be done first, which second, and so on.
Then you concentrate on the most important slice until it works, whatever it takes: if it needs UI you add UI, if it needs backend logic you add backend logic, and so on. Only after it is finished and fully working do you go back to your list of slices, re-evaluate them, select the most important one again, and concentrate on that.
Repeat until you are done.
This is basically what "always working software" means.
It allows you to stop at any point and say: this is good enough - ship it.
If you work horizontally you will not have anything working until you finish all the work.
I usually start with the feature and area that is most important. Take a look at this SO answer.
If you want to do TDD/BDD, you do start with tests.
In my humble opinion it is always good to start with model tests, unless they are based entirely on the framework's functionality.
As for checking the controller/view part, I prefer to have "black-box" tests which check whether I get the response I expect from an HTTP request (for web applications). This helps remove brittleness from the tests.
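A minimal sketch of such a black-box test with NUnit and HttpClient is shown below; the URL and route are placeholders for wherever your application under test is running:

using System.Net;
using System.Net.Http;
using System.Threading.Tasks;
using NUnit.Framework;

[TestFixture]
public class UsersEndpointTests
{
    // Placeholder address; point this at a running test instance.
    private const string BaseUrl = "http://localhost:5000";

    [Test]
    public async Task UsersPage_ReturnsOk()
    {
        using (var client = new HttpClient())
        {
            // Only the observable HTTP behavior is asserted, so the
            // controller and view internals can be refactored freely.
            HttpResponseMessage response = await client.GetAsync(BaseUrl + "/users");

            Assert.AreEqual(HttpStatusCode.OK, response.StatusCode);
        }
    }
}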
Even when I do full TDD, I often throw tests away if they are about the nitty-gritty parts of the implementation, because otherwise, when I refactor the implementation, the end-user experience is the same and my application works fine, but I'm spending hours fixing the tests. I don't want to train myself to avoid refactoring just because I know the pain that redoing a large body of testing code would cause.
I would advise testing only the things that really matter to you and ignoring the rest until it bites.
When I do get a bug in something I was ignoring, I write a test to document the bug, even if it is about low-level implementation.