I am going to be creating a web application for internal company use. I created a "General.dll" class library that contains abstract classes such as Person and EmailAddress, and then an "EmployeeManagement.dll" that includes classes such as Employee : Person and EmployeeEmailAddress : EmailAddress.
My EmployeeManagement.dll references and relies on General.dll.
Then my web application will reference EmployeeManagement.dll.
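Roughly, the layering looks like this (the class bodies here are just illustrative):

```csharp
// General.dll: shared abstract base classes
namespace General
{
    public abstract class Person
    {
        public string Name { get; set; }
    }

    public abstract class EmailAddress
    {
        public string Address { get; set; }
    }
}

// EmployeeManagement.dll: references General.dll
namespace EmployeeManagement
{
    using General;

    public class Employee : Person
    {
        public int EmployeeId { get; set; }
    }

    public class EmployeeEmailAddress : EmailAddress
    {
        public bool IsWorkAddress { get; set; }
    }
}
```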
How can I effectively keep track of cascading changes? For example, if I make a change to General.dll, I will need to recompile that class library into a new General.dll, and then remember to reference the new General.dll in every other class library that uses it. Then those libraries will need to be recompiled, and I have to remember to update the references in the web application to those as well. It seems like there must be a tool or a more efficient way to handle this that I just don't know about. Any tips?
For a start, if you add all of your projects to the same solution in Visual Studio then they will automatically be rebuilt as appropriate based on dependencies when you make a change.
Also, during development you probably don't want to reference a specific version of a compiled assembly (which is what you get when you browse to a DLL file via 'Add Reference'); add a project reference to the General project instead. That way, any changes to your General.dll will automatically cascade to any other project that references it on the next build.
Edit after update from OP
You are quite free to reuse projects in different solutions. So you can have exactly one codebase for General.dll and include that project in any solution that needs it. In that case you of course need to be careful when making changes to General.dll to avoid potentially breaking any project that includes it (a continuous integration utility can help here).
I'm working on a C# app that makes use of interop to interact with several of our DLLs that export C-linkage functions. Because our product consists of more than one executable, I moved all the interop code and many shared classes into a C# class library, and reference it in the C# executable projects.
I'm considering breaking this monolithic C# class library up into a few, more granular class libraries for a few reasons, including:
When I initially set this up, it was less clear which executable projects would need which classes. The app has matured somewhat since then, and it's now clear that a significant percentage of the classes in the library are only used by a single executable...
Our C-linkage DLLs have dependencies that very occasionally fail on a user machine[1]. That prevents all our executables from running, even the ones that don't make use of that particular DLL.
Startup time for our primary app is high (> 2 seconds). As soon as we use anything in our .NET class library, (I think) it loads everything (DLLs included).
My question is about the factors to consider before deciding whether to stay with a single class library, or break things up into smaller, more independent libraries. For example, is the per-library overhead in (runtime) resource footprint or loading time significant enough to affect the decision?
[1]: If they have a broken driver install, for example.
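For reference, the interop layer is essentially a set of P/Invoke wrappers along these lines (the DLL and function names here are made up):

```csharp
using System.Runtime.InteropServices;

// Illustrative only: the real DLL and function names differ.
// Note: the native DLL is not loaded until the first call to one of these
// imported functions, which is relevant to the startup-time and
// broken-dependency concerns above.
internal static class ScannerInterop
{
    [DllImport("ScannerDriver.dll", CallingConvention = CallingConvention.Cdecl)]
    internal static extern int OpenScanner(int deviceId);

    [DllImport("ScannerDriver.dll", CallingConvention = CallingConvention.Cdecl)]
    internal static extern void CloseScanner(int handle);
}
```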
I am creating a service that uses a DI container at its entry point.
My solution is getting heavy, which forced me to think about how to do the Unity registration in a good way.
I have a Service project with a couple of other DLLs containing the logic. As of now, Unity is installed only in the Service, where all registration is done.
My Service needs to reference every DLL that is needed for DI registration, plus all the NuGet packages installed and used by those referenced DLLs.
It ends up with a pretty heavy packages.config in the Service, full of entries that are needed only for DI registration, and it is hard to recognize what is used by the Service itself and what is not.
I am wondering if there is another, cleaner way to do the registration without a huge list of references in the main project, while not breaking good-practice rules at the same time.
I came up with two solutions:
1 - Create a separate DLL just for the Unity configuration. It could contain one public static class with a method like UnityConfiguration.RegisterComponents(), which would be called by the Service. With that solution I would still have a huge list of references to all the stuff, but at least it would be separated from the main Service...
2 - A solution which probably nobody (myself included) will support: install Unity in each logic DLL, give each DLL its own static UnityConfiguration.RegisterComponents() class, and have the main Service call all of those registrations without knowing the types (a rough sketch of this follows below)...
Can you please share which way you would go for? I am really interested in your opinion...
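To make option 2 concrete, this is roughly what I have in mind, assuming Unity 5's IUnityContainer API; the Orders namespace and the repository types are placeholders:

```csharp
using Unity;

namespace Orders
{
    // Placeholder types standing in for the real logic in one of the DLLs.
    public interface IOrderRepository { }
    public class SqlOrderRepository : IOrderRepository { }

    // Each logic DLL owns the registration of its own types.
    public static class UnityConfiguration
    {
        public static void RegisterComponents(IUnityContainer container)
        {
            container.RegisterType<IOrderRepository, SqlOrderRepository>();
        }
    }
}

namespace Service
{
    public static class Program
    {
        public static void Main()
        {
            var container = new UnityContainer();

            // One call per logic DLL; the Service never sees the concrete types.
            Orders.UnityConfiguration.RegisterComponents(container);
        }
    }
}
```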
I have a C# project and I want to add a reference to it. I know how to do that, but which approach should I use: a lib folder in the project, or package installation via NuGet? It would be great to have the pros and cons of both, with some explanation of when one is better than the other. My current opinion is that I should use NuGet whenever possible, because I can see directly in VS whether there is an update for the library. But I need more information on the topic...
If the referenced library is written by you but managed by, say, a different team, then I'd create a share on the network or set up my own NuGet server, so that you can update your own application and be independent of how the other team operates.
You can set up a NuGet server very easily: just create an empty MVC application and then add the "NuGet.Server" package directly from NuGet.
A file share on your local network can also serve as a NuGet source.
I'm trying to come up with a good, valid reason for NOT wanting to use NuGet or some other package-manager source, but I simply cannot. Those old-school "hard links" to references just do not resonate with me; maybe someone else can provide guidance on that.
I am trying to create a module that will generate a class library project from given database columns, containing an info class, a data provider class, and a controller class, each in its own .cs file. What I have done so far is create the info, provider, and controller classes.
I have researched how to create a class library project programmatically, but without much success so far. What I want to do is create a class library project programmatically and move those files into the project folder. After that, compile the library project, generate the DLL file, and move it to the bin folder of the website.
Can anyone please point me in the right direction or give some useful resources that I can use to solve my problem?
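The compile-and-deploy step I'm after would, I assume, look something along these lines using CodeDOM (the paths and class names are just examples, and I'm not sure this is the right approach):

```csharp
using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

class LibraryBuilder
{
    static void Main()
    {
        // The source text would really come from the generated info/provider/controller .cs files.
        string source = "namespace Generated { public class CustomerInfo { public int Id { get; set; } } }";

        using (var provider = new CSharpCodeProvider())
        {
            var options = new CompilerParameters
            {
                GenerateExecutable = false,                     // produce a .dll rather than an .exe
                OutputAssembly = @"C:\MySite\bin\Generated.dll" // example path: the website's bin folder
            };
            options.ReferencedAssemblies.Add("System.dll");

            CompilerResults results = provider.CompileAssemblyFromSource(options, source);

            foreach (CompilerError error in results.Errors)
                Console.WriteLine(error);   // report any compile errors
        }
    }
}
```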
I would use a T4 template; then you can programmatically emit a C# class based on your database structure.
https://msdn.microsoft.com/en-us/library/bb126445.aspx
I have used T4 templates and they are very useful for this type of code generation. You can have XML for the configuration (classes and properties) and then automatically create classes from it. They are useful because the same configuration can be used for the database layer, mapping properties to database fields, etc., so your back-end layers can be automated.
Note: all the classes should share a common design. Also, make the generated classes partial so that you can always add more functionality to them.
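For example, keeping the generated half partial leaves room for hand-written members later (the class and file names here are only illustrative):

```csharp
// CustomerInfo.generated.cs: emitted by the T4 template / generator
public partial class CustomerInfo
{
    public int Id { get; set; }
    public string Name { get; set; }
}

// CustomerInfo.cs: hand-written, safe from being overwritten on regeneration
public partial class CustomerInfo
{
    public override string ToString() => $"{Id}: {Name}";
}
```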
I have three projects (C# libraries), namely A, B, and C.
All three have 3-4 XML files (or, more generally, resources) associated with them.
Each of these projects has classes that access these files for settings and information (loading the XMLs whenever they need them).
The problem is that sometimes a class in project C needs to access resources (XML files, images, etc.) of project B, and vice versa.
Also, these files may or may not be part of the project solution; the resource paths can come from app.config, etc.
It's really becoming tedious to work out how to centralise access to these resources so that all three projects can access them uniformly.
Currently all the projects load the files using paths from app.config.
Also, I'm trying to minimise the number of times an XML file is loaded (ideally once), but since the projects are separate, I have to load it again in each one.
I thought of using a singleton class, as that would make more sense for uniform access, but I haven't quite figured out a way.
Has anyone come across similar situations?
Are there any design patterns or best practices for sharing resources across projects?
Create one library containing the class(es) that access your centralized XML settings, and reference that library from the other libraries.
You don't necessarily need a singleton for this, but putting it in one place will let you focus your efforts when you improve it later (adding caching, for example).
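A rough sketch of what that shared library might look like, using Lazy&lt;T&gt; so that each file is parsed at most once per process (the configuration keys and paths are placeholders):

```csharp
using System;
using System.Collections.Concurrent;
using System.Configuration;
using System.Xml.Linq;

// Shared resources library, referenced by A, B and C.
public static class SharedResources
{
    // Each XML file is loaded at most once per process, on first use.
    private static readonly ConcurrentDictionary<string, Lazy<XDocument>> Cache =
        new ConcurrentDictionary<string, Lazy<XDocument>>();

    // 'name' is a key in app.config/web.config whose value is the file path,
    // e.g. <add key="ProjectBSettings" value="C:\Config\ProjectB.xml" />.
    public static XDocument Get(string name)
    {
        return Cache.GetOrAdd(
            name,
            key => new Lazy<XDocument>(() =>
                XDocument.Load(ConfigurationManager.AppSettings[key]))).Value;
    }
}
```

Projects A, B, and C would then call SharedResources.Get with a key instead of loading the XML themselves.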