When the .NET 2.0 Framework first came out, the provider model was all the rage. 2.0 even shipped with a bunch of default providers (Membership, SiteMap, Role). Since the release of 2.0 the hype has died down, and whilst I still use providers day to day, they seem to get far less press.
I was wondering if this is because people are using something other than providers and they've been superseded, or is it simply because the uptake wasn't as big as it was for other IoC methods?
It actually hasn't died down. DI is still big, and there are many DI frameworks out there to choose from. Yes, it's not baked into every part of the framework like it absolutely should be, but it's still a very good practice to follow. For instance, I was using the Patterns & Practices custom application blocks to do DI, until they ditched them in favor of Unity. Now I'm using Unity.
A lightweight DI framework is a good idea for any large extensible application.
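For anyone who hasn't used a container yet, constructor injection with Unity looks roughly like this. It's a minimal sketch, and every type name in it is illustrative rather than from a real project:

```csharp
using Microsoft.Practices.Unity;

public interface IMessageSender { void Send(string message); }

public class EmailSender : IMessageSender
{
    public void Send(string message) { /* send an email */ }
}

public class OrderService
{
    private readonly IMessageSender _sender;

    // The container supplies the dependency through this constructor.
    public OrderService(IMessageSender sender) { _sender = sender; }

    public void Confirm() { _sender.Send("Order confirmed"); }
}

public static class Program
{
    public static void Main()
    {
        var container = new UnityContainer();
        container.RegisterType<IMessageSender, EmailSender>();

        // Unity builds OrderService and injects EmailSender automatically.
        var service = container.Resolve<OrderService>();
        service.Confirm();
    }
}
```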
I think that as these tools become more standard within .NET, the hype around them lessens, but their use does not. Certainly the Membership and Role providers are very important to the new application we are developing and will save us a significant amount of code.
Microsoft Patterns and Practices is the birthplace of tools like the Enterprise Library, which leans heavily on the provider pattern (particularly for membership) in the Security Application Block, and the model appears to be used throughout the blocks.
I'm relatively new to programming and I have some questions. I'm simultaneously trying to learn ASP.NET MVC from the official Microsoft book, because I'm looking to get the certification in a few months, and working on a small project for a colleague. Instead of doing things with the .NET Framework I decided to just do everything in .NET Core, and there are some discrepancies between what I learned in the book and what seems to be available in EF Core.
I'm using SQLite for my database and I've already plugged that in for the most part, but I'm running into some issues where various people on the internet give me conflicting information.
So in the .NET Framework version of ASP.NET, the way Microsoft said you should do things is: you have a model PersonModel and a context PersonContext that goes in the model file and inherits DbContext. This PersonContext contains various DbSet<T> properties depending on which tables you want to create, etc. This has remained the same for me in .NET Core. However, after this Microsoft tells me to make a PersonInitializer, which is supposed to inherit DropCreateDatabaseAlways<PersonContext> and then override the Seed method (conveniently, that's also where a getFileBytes method is placed for images, etc.).
Now, in Core, there is no DropCreateDatabaseAlways class to inherit, leading me to believe this structure is not what .NET Core intends at all. I've heard various things here about using CLI commands, but in the end I'm just very confused about what the "proper" thing to do is.
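(For reference, the closest literal translation of that initializer pattern I've found for EF Core is something like the sketch below. DbInitializer is just a placeholder name I made up, and I'm assuming a People DbSet and a Name property, but I don't know whether this is actually the idiomatic approach:)

```csharp
using System.Linq;
using Microsoft.EntityFrameworkCore;

public static class DbInitializer
{
    // Called once at startup (e.g. from Program.Main) instead of relying on
    // a DropCreateDatabaseAlways-style base class, which EF Core doesn't have.
    public static void Initialize(PersonContext context)
    {
        context.Database.EnsureDeleted();  // the "drop always" part; dev/demo only
        context.Database.EnsureCreated();  // recreate the schema from the model

        if (context.People.Any())
            return; // already seeded

        context.People.Add(new PersonModel { Name = "Example Person" });
        context.SaveChanges();
    }
}
```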
I think your largest problem at this point is just sourcing information properly. Admittedly, it's a bit confusing, especially for someone completely new on the scene.
A little background to start: there's ASP.NET, ASP.NET MVC, and ASP.NET Core. ASP.NET is really nothing by itself, although it became shorthand for ASP.NET Web Forms, as that was the only thing you had to work with at the time. Then Microsoft introduced ASP.NET MVC, which took the foundation of ASP.NET and layered on an attempt at a modern web application framework modeled after the MVC (model-view-controller) pattern. It was a vast improvement over Web Forms, but did not break away from the older system enough to make it a truly great application development framework. Because of its reliance on System.Web (a monolithic family of DLLs) and the full .NET Framework, it was slow, hard to deploy, and completely incompatible with things like containerization. Enter ASP.NET Core, a complete rewrite from the ground up.
ASP.NET MVC had a sister framework in ASP.NET Web API, and technically you could still employ ASP.NET Web Forms in the same project. In fact, a single project could utilize all three frameworks at once, which is as convoluted as it sounds. ASP.NET Core, at least initially, did away with this confusion by having a single system for both traditional web applications and APIs. As such, if you're looking for information about Core, it's just "Core"; searching for plain ASP.NET MVC is just going to lead you to outdated information.
That said, frustratingly, Microsoft decided in their infinite wisdom that Web Forms, though reviled by most every developer who had ever used a real web application framework, were actually so awesome that they should be resurrected, at least in spirit. Thus was born Razor Pages, and ever since it has been necessary to distinguish between ASP.NET Core Razor Pages and ASP.NET Core proper, which has now been relabeled as ASP.NET Core MVC, making it that much more difficult to filter information on one framework versus the other. At least for the time being, Razor Pages are still not as prominent as the MVC style, and thankfully they don't change the fundamentals of Core enough to warrant separate discussions on most things. Long and short, you should pretty much just prefix every search with "asp.net core" (in quotes, so it's done as a phrase search). That will generally give you relevant information - for the most part.
The next set of issues is that ASP.NET Core was developed with high visibility, fully out in the open, with many, many previews, alphas, betas, and even full releases. On the one hand, that's great, and it's a large part of why it's as good as it is. Developers were able to provide input and steer the development, making it the first web application framework from Microsoft actually designed by and for the people in the trenches building web applications.
The downside, though, is that things changed - a lot - and still do. It now seems to be leveling off pretty well after 2.1, which I consider to be the first truly feature-complete release. Before we got here, however, we had ASP.NET vNext, ASP.NET MVC 6, DNX, and then ASP.NET Core 1.0, 1.1, 2.0, and finally 2.1 - all of which fundamentally changed at least how some things work. In other words, even if you confine your searches to ASP.NET Core, there's still a lot of outdated and incorrect information out there, simply because it was written about a previous version and things have changed since. To better your odds, consider confining your search to things published within the last year (2018+, or 2017 if you can't find anything newer). Anything older than that, you're going to have to take with a big, huge grain of salt.
And then that brings us to Entity Framework. Oh my. EF has had a storied history, most of it steeped in failure. The original EF wasn't even viable until version 3; the previous versions were so bad they were literally unusable for any serious production work. Even then, it wasn't until EF 4 that it could really be considered ready for prime time, and it finally began seeing some significant uptake among developers as a result. That was also the release where Code First was introduced (technically in 4.1).
Before that time there was what the EF team referred to as Database First or Model First. The only meaningful difference was that one would generate your models from an existing database, while the other would let you design your models and then generate a database from that. In either case, you ended up with a monstrosity called EDMX, an XML beast that attempted to keep up with your database state and all the translation to VB/C# classes that entails. It was an absolute nightmare to maintain, almost never worked correctly, and was a constant source of frustration.
Code First provided an alternative - a shining light in the darkened pit of EDMX hell. You could use POCOs (plain old CLR objects) and EF could generate migrations from the changes you made to them, updating your database accordingly. It could even be used with existing databases, though its name prevented it from being used as much as it should have been in that area (many people wrongly believed that if you had an existing database, you had to continue using the old, horrible Database First methodology). In truth, Code First was a complete alternative approach to managing databases, both new and existing. EF 5 and EF 6 continued to refine this new approach, and when it came time to work on EF 7, Microsoft decided it was high time EDMX went to its well-deserved grave.
However, work on EF 7 coincided with the development of ASP.NET Core, which itself spawned .NET Core and eventually .NET Standard. EF was firmly rooted in the full .NET Framework, so it became apparent that a rewrite was in order. As such, development of EF 7 was cancelled, and EF Core was born. EF Core, though, had a rough ride. ASP.NET Core was moving along like a freight train, but had no native way to interact with databases. You could use old EF, but then you were forced to target the full .NET Framework instead of .NET Core. As such, EF Core was rushed out way too soon, and 1.0 was a train wreck of epic proportions. There was so much basic functionality missing, and the workarounds were so obtuse, that virtually no one in their right mind would take anything into production with it. Then 1.1 was released, and things improved somewhat, but still not enough. Then came 2.0, and finally it was a workable ORM you could actually feel comfortable using in production. The release of 2.1 has brought even more improvements and refinements.
Now that you've had your history lesson, all I can say is that finding good documentation is unfortunately still a bit of a crapshoot. You need to be very specific with your searches, using the right terms (which is part of why I wanted to give you the history). You also need to pay close attention to dates. Consider suspect anything published before 2017, and treat even stuff from that year with a grain of salt. The official docs are your best friend. Microsoft has actually done a fantastic job with their documentation, keeping it up to date and accurate. It covers the large majority of things and is the ultimate source of truth.
Official ASP.NET Core Docs
Official EF Core Docs
Books are not going to be your friend here. The publication process takes far too long, and ASP.NET Core has moved far too fast. Any book on Core is outdated the minute it hits the streets. However, if you have a subscription to Safari Books, you might have luck with their prerelease and alpha offerings. You have to deal with more technical mistakes, grammatical errors, etc., but at least the information will be closer to the actual truth.
Pluralsight has some truly excellent videos that can help you. Unfortunately, video production has a similar problem to book publication, and there are more than a few courses that are now so outdated as to be useless.
Hope this at least gives you some better context and improves your ability to source good and accurate information. Honestly, I still struggle with that a lot of the time myself, and I think most developers working in this space have the same issue. It's fun and exciting, but it's not without its cons. Having to wade through a sea of information to find some pearls is one of them. Good luck.
I was about to use Serilog's existing SQL Server sink, but I realized that the latest pre-release and stable versions do not support ASP.NET Core.
Is there an alternative to this sink? What am I supposed to do? Should I write a new sink?
The Serilog sink for SQL Server depends on some types not yet in .NET Core. Work started to refactor the sink and remove the dependencies, but since then, the types in question have been added to the next .NET Core version:
https://github.com/dotnet/corefx/pull/12426
Due to this, the Serilog SQL Server sink will most likely remain .NET Framework-only until the next .NET Core/.NET Standard release, after which support will be quick to add.
In the interim, writing a quick implementation of ILogEventSink of your own would be a reasonable way to get unblocked.
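A minimal sketch of what that could look like - the Logs table and its columns are assumptions here, and a real sink would want batching and error handling:

```csharp
using System.Data.SqlClient;
using Serilog.Core;
using Serilog.Events;

public class SqlServerSink : ILogEventSink
{
    private readonly string _connectionString;

    public SqlServerSink(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void Emit(LogEvent logEvent)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = connection.CreateCommand())
        {
            command.CommandText =
                "INSERT INTO Logs ([Timestamp], [Level], [Message]) " +
                "VALUES (@ts, @level, @msg)";
            command.Parameters.AddWithValue("@ts", logEvent.Timestamp);
            command.Parameters.AddWithValue("@level", logEvent.Level.ToString());
            command.Parameters.AddWithValue("@msg", logEvent.RenderMessage());

            connection.Open();
            command.ExecuteNonQuery();
        }
    }
}

// Wire it up when configuring the logger:
// Log.Logger = new LoggerConfiguration()
//     .WriteTo.Sink(new SqlServerSink(connectionString))
//     .CreateLogger();
```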
I can't tell you what you're supposed to do, but I can describe a couple of options. The SQL Server Serilog sink's GitHub project would be a better place to ask what they intend to do.
Serilog is indeed on the .NET Core train, as many, many other mainstream .NET projects are. You are correct that, as of today, the SQL Server sink is .NET 4.5 only. You can:
Continue developing your ASP.NET Core project, target .NET 4.5 in your project.json, build and deploy to Windows only, and carry on using the SQL Server sink.
Many companies are migrating to .NET Core but targeting .NET 4.x in order to keep 100% backward compatibility with existing packages while the kinks are ironed out in the framework. This has been a viable solution for my large-scale projects.
Target .NET Core, and write your own logging repository layer to manage custom SQL and database log-dumping code.
If you're on Core, this is easier than it sounds, but it requires experience with data repositories and IoC: any code that needs to dump logs to the database would have to depend on some sort of "ILoggingRepository" (there's a sketch of this after the options below). It does, however, duplicate calls to logging methods, and it deviates from the ILoggerProvider interfaces in Microsoft.Extensions.Logging.Abstractions - forgoing the flexibility of log levels and such, unless you re-engineer your own. It's a working solution; I never said it was an elegant one.
Write your own Serilog sink.
I don't have experience with this one, but I have seen code samples that describe how to accomplish it. The reason I never pursued this option is a fear that by the time I finished writing my beastly database sink, the open-source community would have reworked the SQL Server sink into a fully Core-compliant and database-independent version. This would be the most heavy-handed solution, but also the most robust.
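To make the second option above concrete, the repository approach amounts to something like this sketch (all the names - ILoggingRepository, SqlLoggingRepository, the AppLog table - are illustrative):

```csharp
using System;
using System.Data.SqlClient;

public interface ILoggingRepository
{
    void LogMessage(string message);
    void LogError(string source, Exception exception);
}

public class SqlLoggingRepository : ILoggingRepository
{
    private readonly string _connectionString;

    public SqlLoggingRepository(string connectionString)
    {
        _connectionString = connectionString;
    }

    public void LogMessage(string message)
    {
        using (var connection = new SqlConnection(_connectionString))
        using (var command = connection.CreateCommand())
        {
            command.CommandText = "INSERT INTO AppLog ([Message]) VALUES (@msg)";
            command.Parameters.AddWithValue("@msg", message);
            connection.Open();
            command.ExecuteNonQuery();
        }
    }

    public void LogError(string source, Exception exception)
    {
        LogMessage(source + ": " + exception);
    }
}

// Registered once in Startup.ConfigureServices, then injected wherever needed:
// services.AddSingleton<ILoggingRepository>(
//     new SqlLoggingRepository(Configuration.GetConnectionString("Logging")));
```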
There may be other sinks available for .NET Core, but if you're looking for the SQL Server one specifically, then you are most likely working under constraints that rule out MongoDB sinks, file providers, and the like.
I guess the question is not really technical but rather a philosophical one, so I do not propose an answer, only a consideration:
In my humble opinion, Microsoft, with its killing of Silverlight, its UWP orientation, and its "escape to the clouds", admitted the defeat of its entire concept of proprietary software development; the liberation of the .NET platform is nothing more than a farewell gift to developers deceived in their hopes.
By itself, the .NET ecosystem is very promising, but its future has little to do with Microsoft products, unlike before. At least, that is what I hope, as a developer who has been working with Microsoft products for more than twenty years. Consequently, common infrastructure libraries that were focused on specific Microsoft products (MS SQL Server in this case) are dying off now.
Therefore, the conclusion is: if you already have a long-term project that is tightly coupled to SQL Server, maybe it is better to put some effort into adapting your current logging solution; otherwise, it is better to look for a logging solution that does not depend on MSSQL - probably one that supports different storage back ends via adapters or something like that.
Try looking at this; they announce Core support in the next version, and at least it is a live project.
Can you please provide me with some tips/guidelines for architecting, designing, and implementing a .NET Framework application, with the requirements given below:
It will be an analytical tool which will retrieve data from files, SQL databases, and maybe cubes, so the data layer should be able to handle all of those. The middleware should be totally independent of the other layers, so I probably need an IoC container (which one would you recommend?)
It will be deployed on the local intranet
The front end might be a WPF application or Silverlight in the future (for now I am concentrating on Silverlight, but the point is that it will change)
It should be easy to customise and improve in the future without changing much of the code, as the framework will be deployed for many clients
I need a way to store configuration information, which will be picked up by the application on application load to set its look and feel.
I have two months to implement it and am looking for as many tips as possible.
Separation of concerns (SoC) for a start
Break your application into several assemblies that use IoC (interfaces + implementations); there's a small sketch of the layering after this list:
application model assembly - all other assemblies will reference this one because these classes will be used for inter-communication - they will mostly be just POCOs
presentation assembly - references the app model and business services - this one is either WPF or Silverlight; in either case, use MVVM to make your testing life easier
business services assembly - references app model and data repositories assembly
data repositories - these define repositories that actually get data from the stores
Then I'd create three additional ones:
file data providers
database providers
cube providers
Data repositories would reference all three and use them to provide necessary data.
If configuration becomes very complex, with a lot of functionality, then you should put it in a separate assembly as well and reference it from the business services assembly.
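Here's the kind of shape that layering takes in code - a sketch only, with made-up names (Report, IReportRepository, SqlReportRepository, ReportService):

```csharp
// Application model assembly: POCOs shared by everyone.
public class Report { }

// Also in the application model assembly: the repository interface.
public interface IReportRepository
{
    Report GetReport(int id);
}

// Database providers assembly: the concrete implementation.
public class SqlReportRepository : IReportRepository
{
    public Report GetReport(int id)
    {
        // ...query SQL Server here...
        return new Report();
    }
}

// Business services assembly: depends only on the interface; the IoC
// container decides which provider implementation gets injected.
public class ReportService
{
    private readonly IReportRepository _repository;

    public ReportService(IReportRepository repository)
    {
        _repository = repository;
    }

    public Report LoadReport(int id)
    {
        return _repository.GetReport(id);
    }
}
```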
Which MVVM library to use
Since you mentioned time, I suppose you'll have a hard time meeting your deadline. When using MVVM (which I suggest you do), I also suggest you don't use full-blown PRISM (a.k.a. the Composite Application Guidance from P&P) but rather go with the MVVM Light Toolkit. It will take you less time to get on the bandwagon.
Code generation
Where appropriate, I suggest you use T4 to its full potential. I use it to generate stored procedure call wrappers, so I can avoid magic strings when calling stored procedures (and when supplying their parameters). Check my blog post about it as well.
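To give a flavor of the idea, the template's output might be a typed wrapper like this instead of magic strings scattered around the code base (purely illustrative - the real template generates these from the database):

```csharp
using System.Data;
using System.Data.SqlClient;

// Generated code: the procedure name and its parameters live in one place.
public static class GetPersonByIdProcedure
{
    public static SqlCommand Create(SqlConnection connection, int personId)
    {
        var command = new SqlCommand("dbo.GetPersonById", connection);
        command.CommandType = CommandType.StoredProcedure;
        command.Parameters.AddWithValue("@PersonId", personId);
        return command;
    }
}
```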
DAL technology/library
Don't write your own data access code using raw SqlConnection/SqlCommand plumbing. There are many data access layer libraries/technologies today that you can use rather than reinventing the wheel. If you know NHibernate, use that. If you know EF, use that. If you know anything else, use that - anything that will provide/generate as much already-tested and already-debugged code for you as possible.
So it all boils down to:
DRY + YAGNI
a.k.a. Don't Repeat Yourself and You Ain't Gonna Need It = don't over-engineer your code.
Agile developers are supposed to be lazy
They should develop just as much as is needed and no more! TDD implicitly enforces this through its red => green => refactor cycle.
I would recommend using MVVM and test-driven development. MVVM will give you good separation between the front end and the middleware, and TDD will help control the chaos that comes with any nontrivial app development.
Have a look at the Composite Application Guidance from Microsoft's Patterns and Practices group, it may not match what you are doing exactly but will give you some good ideas.
From an architectural standpoint, I highly recommend taking a look at the Microsoft Application Architecture Guide. Since you are already using the Microsoft technology stack, I would consider using Microsoft Unity for IoC. You indicated that your presentation layer might use WPF or Silverlight, so take a look at using Windows Communication Foundation, as you will be somewhat constrained in Silverlight when it comes to communication with your data layer.
We are in a situation whereby we have 4 developers with a bit of free time on our hands (talking about 3-4 weeks).
Across our code base, for different projects, there is a fair amount of framework-y code that gets rewritten for every new project we start. Since we have some free time on our hands, I'm in the process of creating a "standard" set of libraries that all projects can reuse, such as:
Caching
Logging
Although the two above rely on libraries such as Enterprise Library, each new project writes its own wrappers around them, etc., so we're consolidating all this code.
I'm looking for suggestions on standard libraries that you have built in-house and that are shared across many projects.
To give you some context, we build LOB internal apps and public facing websites - i.e. we are not a software house selling shrink-wrap, so we don't need stuff like a licensing module.
Any thoughts would be much appreciated - our developers are yearning to write some code, and I would very much love to give them something to do that would benefit the organization in the long run.
Cheers
Unit testing infrastructure - can you easily run all your unit tests? Do you have unit tests?
Build Process - can you build/deploy an app from scratch, with only 1 or 2 commands?
Some of the major things we do:
Logging (with some wrappers around TraceSource)
Serialization wrappers (so you can serialize/deserialize in one line of code; see the sketch after this list)
Compression (wrappers for the .NET functionality, to make it so you can do this in one line of code)
Encryption (same thing, wrappers for .NET Framework functionality, so the developer doesn't have to work in byte[]'s all the time)
Context - a class that walks the stack trace to bring back a data structure that has all the information about the current call (assembly, class, member, member type, file name, line number, etc)
etc, etc...
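As an example of what I mean by a one-line wrapper, here's a sketch of the serialization case (the names are illustrative; ours wrap more formats than this):

```csharp
using System.IO;
using System.Xml.Serialization;

public static class SerializationHelper
{
    // Serialize any public type to an XML string in one call.
    public static string ToXml<T>(T value)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var writer = new StringWriter())
        {
            serializer.Serialize(writer, value);
            return writer.ToString();
        }
    }

    // ...and back again.
    public static T FromXml<T>(string xml)
    {
        var serializer = new XmlSerializer(typeof(T));
        using (var reader = new StringReader(xml))
        {
            return (T)serializer.Deserialize(reader);
        }
    }
}
```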
Hope that helps
OK, most importantly: don't reinvent the wheel!
Spend some time researching libraries which you can easily leverage:
For logging I highly recommend Log4Net.
For testing, NUnit.
For mocking, Rhino Mocks.
Also, take a look at Inversion of Control Containers, I recommend Castle Windsor.
For indexing I recommend Solr (on top of Lucene).
Next, write some wrappers:
These should be the entry point of your API (a common library, but think of it as an API).
Focus on abstracting all the libraries you use internally behind your API, so that if you no longer want to use Log4Net or Castle Windsor, you can swap them out - by writing well-structured abstractions and concentrating on loosely coupled design patterns.
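For example, a logging abstraction over log4net might look like this sketch (ILogger and Log4NetLogger are illustrative names):

```csharp
using System;

public interface ILogger
{
    void Info(string message);
    void Error(string message, Exception exception);
}

// The only class that knows log4net exists; swap it out and callers never notice.
public class Log4NetLogger : ILogger
{
    private readonly log4net.ILog _log;

    public Log4NetLogger(Type type)
    {
        _log = log4net.LogManager.GetLogger(type);
    }

    public void Info(string message) { _log.Info(message); }

    public void Error(string message, Exception exception) { _log.Error(message, exception); }
}
```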
Adopt Domain-Driven Design:
Think of your API(s) as domains and modular abstractions that internally use other common APIs, like your common data access library.
Suggestions:
I'd start with a super-flexible general DAL library that makes it easy to access any type of data across multiple storage mediums.
I'd use Fluent NHibernate for the relational DB stuff, and I'd have all the method calls into your data access layer support LINQ, since it's a C# language feature.
Use LINQ to query databases, indexes, files, XML, etc.
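A sketch of what that LINQ-facing surface could look like - each store-specific repository exposes IQueryable<T>, so callers write the same query whether the data comes from a database, an index, or a file (all names are illustrative):

```csharp
using System.Linq;

public interface IRepository<T>
{
    IQueryable<T> Query();
}

// Trivial in-memory implementation; an NHibernate- or file-backed one
// would expose the same Query() surface.
public class InMemoryRepository<T> : IRepository<T>
{
    private readonly T[] _items;

    public InMemoryRepository(params T[] items) { _items = items; }

    public IQueryable<T> Query() { return _items.AsQueryable(); }
}

// Usage (same shape regardless of the backing store):
// var names = repository.Query().Where(p => p.IsActive).Select(p => p.Name).ToList();
```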
Here is one thing that can keep all developers busy for a month:
Run your apps' unit tests in a profiler with code coverage (NUnit or VS Code Coverage).
Figure out which areas need more tests.
Write unit tests for those sub-systems.
Now, if the system was not written using TDD, chances are it's very monolithic and will require significant refactoring to introduce test surfaces. Hopefully, at the end of it, you end up with a more modular, less tightly coupled, more testable system.
My attitude is that one should almost never write standard libraries. Instead, one should refactor existing, working code to remove duplication and improve ease of use and ease of testing.
The result will be very much like a "standard library", except that you will know that it works (you reran your unit tests after every change, right?), and you'll know that it will be used, since it was already being used. Otherwise, you run the risk of creating a wonderful standard library that isn't used and doesn't work when it is used.
A previous job of mine hit a little downtime while the business sorted out what the next version should be. There were a few things we did that helped:
Migrated from .NET Remoting to WCF
Searched for pain points in the code that all devs just hate to work with and refactored them
Introduced a good automated build system that would run unit tests and send out emails for failed builds. It would also package that version and place it in a shared directory for QA to pick up
Scripted the DB so that you can easily upgrade the database rather than being forced to take an out-of-date copy polluted with irrelevant data that other devs have been playing with
Introduced proper bug tracking and triage process
Researched how we could migrate from WinForms to WPF
Looked at CAB (the Composite UI Application Block) or plugin frameworks so configuration would get simpler (at the time, setup and configuration took a tremendous amount of time)
Other things I would do now might be
Look at PostSharp to weave cross-cutting concerns, which would simplify logging, exception handling, or anywhere code is repeated over and over again
Look at AutoMapper so that conversions from one type to another are driven by configuration rather than by changing code in many places (see the sketch after this list)
Look at education around TDD (if you don't do it already) or BDD-style unit tests
Invest time in streamlining automated integration tests (as these are difficult to set up and configure manually, they tend to get dropped from the SDLC)
Look at the viability of dev tools such as ReSharper
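A sketch of the AutoMapper idea from the list above, using the older static API of that era (Customer/CustomerDto are made-up types):

```csharp
using AutoMapper;

public class Customer    { public string Name { get; set; } }
public class CustomerDto { public string Name { get; set; } }

public static class MappingConfig
{
    public static void Configure()
    {
        // One configuration call replaces hand-written copy code in many places.
        Mapper.CreateMap<Customer, CustomerDto>();
    }
}

// Usage after Configure() has run once at startup:
// CustomerDto dto = Mapper.Map<Customer, CustomerDto>(customer);
```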
HTH
My company is interested in porting a large business application to .NET. We plan on developing a desktop version and a Silverlight version. I've mostly researched the CSLA framework (got Rocky's book, halfway through already) and found it a bit over-engineered; the data layer side didn't seem so polished either.
Are there any other frameworks that claim to do what CSLA does? I'm not talking about ORM tools (e.g. L2S, EF, NHibernate); I'm interested in a framework that supports business rules, easy n-tier architecture, domain-driven rather than database-driven objects, security on the business objects, etc.
I know I can find small frameworks that will do some of the work required (the Enterprise Library application blocks come to mind), but I'm looking for one that has everything included.
I would be interested in hearing more about why you think CSLA is over-engineered. I have found it to be very feature-rich, but most of the features just implement standard .NET Framework interfaces, so all the plumbing comes free and you definitely don't have to use all of it.
Your requirements seem to be a great fit for CSLA. Other frameworks (such as ORMs) contain validation/business rules, but the major issue is that you are (in most cases) stuck with your data schema. This leads to objects that are not friendly for UI development and forces you to know the intricacies of your database.
Here's a good blog post (archived version) courtesy of "Adam on the Net" discussing and comparing the following:
Castle Project
Spring.NET
Enterprise Library
CSLA
If I were you, I would either pick Spring.NET or just start building your own framework around ASP.NET MVC and Fluent NHibernate, then slowly add your own building blocks as and when you need them. The Enterprise Library blocks are good, but heavy in my opinion, and have a lot of things you may not really need.