I'm looking into NRules.Net to implement my pricing engine. After some testing, I don't really understand the benefits of using a .NET rules engine over a rich domain model...
Rules are still plain C# classes (there is nothing dynamic)
I have to deal with a black box (the rules engine) which does not let me see what is going on
In the end, why should I go for a .NET rules engine? Is there anything I'm missing?
Thanks
Seb
A rules engine allows you to reason about a Rule as a concept of its own and gives you an elaborate language to play with rules.
You can compose rules dynamically, classify and manipulate lists of rules with different priorities, manage conflicting rules, express them with a convenient DSL that business people might understand, create transformations to make an object comply with a rule, etc.
Without an abstraction of what a Rule is, you can't do all this in a conscious, consistent way.
Not all domains need it but if that general concept often comes up in discussions with your domain experts, a rules engine is probably worth having a look at.
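As a minimal sketch of what that abstraction buys you (the names and interfaces here are purely illustrative, not NRules' actual API), here is a reified Rule concept that can be listed, prioritized, and composed at runtime, rather than buried as if-statements inside domain methods:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Illustrative only: a first-class Rule concept, not NRules' API.
public interface IPricingRule
{
    string Name { get; }
    int Priority { get; }                      // lets conflicting rules be ordered
    bool AppliesTo(Order order);
    decimal Apply(decimal currentPrice, Order order);
}

public class Order
{
    public decimal BasePrice { get; set; }
    public int Quantity { get; set; }
}

public class BulkDiscountRule : IPricingRule
{
    public string Name => "Bulk discount";
    public int Priority => 10;
    public bool AppliesTo(Order o) => o.Quantity >= 100;
    public decimal Apply(decimal price, Order o) => price * 0.9m;
}

public static class PricingEngine
{
    // Because rules are first-class objects, they can be filtered,
    // sorted by priority, or loaded from configuration at runtime.
    public static decimal Price(Order order, IEnumerable<IPricingRule> rules) =>
        rules.Where(r => r.AppliesTo(order))
             .OrderBy(r => r.Priority)
             .Aggregate(order.BasePrice, (price, rule) => rule.Apply(price, order));
}
```

With this shape, `Price(new Order { BasePrice = 100m, Quantity = 150 }, new IPricingRule[] { new BulkDiscountRule() })` yields 90m, and adding or reordering rules never touches the engine.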
When last involved in .NET, Microsoft advocated an architectural approach where one programmed against a representation of the database in code - DataSets, DataTables, etc. This suited their tool-based, auto-generated-code approach. They never advocated a rich object domain model as the basis for your architecture.
Has this position changed? Is a rich domain model architectural approach now advocated or supported by Microsoft, especially as a result of introducing Entity Framework, or is a data-centric approach still advocated?
Entity Framework is now the recommended data access solution, as opposed to ADO.NET. F# has type providers and allows designing domain-specific languages; this embraces information-rich programming.
Now .NET Core is moving away from tooling and auto-generated approaches. It is cross-platform, agile, and focused on what open-source communities have been doing for years.
To summarize, .NET no longer revolves around the database and tooling.
Judging by this MSDN "Patterns and Practices" series book from 2012, they have been recommending a rich domain model approach for quite a while.
Not that this is an exclusive recommendation - Microsoft has basically moved to a less opinionated, right-tool-for-the-job stance about pretty much everything they provide, and the data-centric tools are still there. Regardless, it would be suicidal of them to lag behind everyone else and still recommend a data-first, purely code-generation-based approach these days.
[Edit]
You should note that CQRS and/or Event Sourcing are not at odds with a rich domain model - quite the opposite. CQRS commands trigger rich domain logic in entities, which then emit (rich) domain events. It's precisely what the book describes. Don't be fooled by the title.
I am creating an interface where users can build their own business rules out of domain-specific objects at runtime, have those rules persisted in the database, and then used by the application. Some of these are complex predicates and others require combinations of domain objects in what seem to be fairly complicated relations. So far I have looked into GoF, dynamics with eval, and CodeDom. Does anyone have a suggestion on what should be used?
Actually, you can just develop your application against the WF rules engine API without using WF itself. http://blogs.microsoft.co.il/blogs/bursteg/archive/2007/08/09/WF-Rules-Engine-without-Workflow.aspx This will save you a lot of work.
Kaizen, depending on the scope and kind of your dynamic rules, you could eventually use a workflow engine like MS WF to define the rules as workflow activities, for example... In this way you isolate the logic and do not need a full rebuild of the application when anything in the workflow changes.
This might not be the best solution but could be an alternative...
Having spent a year building a rules engine and fighting over approaches, I can tell you it's not easy, especially when you focus on what your goal is. If it's to get users to write the rules for the system, you really need to focus hard on that area. What's easy for a developer is perhaps much harder for most business users. We built a rules authoring platform in Excel that was compiled into C# and run dynamically... The problem was that users found the spreadsheets and flow of logic too complicated, and hired ASP.NET contractors to build the rules.
BizTalk has an engine that I believe can be used for .NET apps:
http://www.microsoft.com/biztalk/en/us/business-rule-framework.aspx
Have fun!
How often do the rules change? Building a system that lets the business build (and version) their own rules is significantly more challenging than building a system that lets a programmer update the rules dynamically.
When a similar requirement came up in a past project, the business admitted that while, yes, the rules will change, they won't change so often that it has to be them making the updates.
We ended up using IronPython for the dynamic parts and storing the code in the database and the system would pull up the appropriate rules on load. The rest of the app was written in C#. A win for us and for the business.
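A sketch of that embedding approach (assuming the IronPython package; in the real system the rule source would come from the database rather than a literal string, and the variable names here are invented for illustration):

```csharp
using System;
using IronPython.Hosting;                 // requires the IronPython NuGet package
using Microsoft.Scripting.Hosting;

public static class DynamicRuleRunner
{
    // Evaluates a Python rule body against a context value.
    // The rule script is expected to set a boolean named 'result'.
    public static bool Evaluate(string pythonRuleSource, double orderTotal)
    {
        ScriptEngine engine = Python.CreateEngine();
        ScriptScope scope = engine.CreateScope();
        scope.SetVariable("order_total", orderTotal);
        engine.Execute(pythonRuleSource, scope);
        return scope.GetVariable<bool>("result");
    }
}
```

A rule row in the database might then be as simple as the string `"result = order_total > 1000"`, editable without recompiling the C# host.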
I am currently writing a .NET application in C# and want to check a number of rules and, based on passing or failing them, perform an action. So I am looking to implement a generic, reusable solution adhering to good OOP principles. This has led me to the conclusion that I need to write a rules engine.
I have good knowledge of C#, but this is the first time I have needed to write a rules engine, so as part of my research into the design and development of such an engine, I am looking for any tips. Even better would be any examples out there that I could look at - any C#/.NET rules engine applications? What layer in a typical 3-tier architecture should such an engine reside at? I had a quick look on CodePlex and Google Code but none jumped out at me! So some direction would be great.
Actually, .NET has a top-notch rules engine that is meant to be used with workflows (as it was designed to be) but can easily be used outside of them: see the "Windows Workflow Foundation Rules Engine" and inspect the System.Workflow.Activities.Rules namespace.
Learning how to use rules outside of workflows takes only a bit of googling.
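In rough outline (a sketch from the .NET Framework API; the `Discount` type is an invented example), executing a RuleSet against a plain CLR object with no workflow involved looks like this:

```csharp
using System;
using System.Workflow.Activities.Rules;   // .NET Framework; reference System.Workflow.Activities

public class Discount
{
    public int Quantity;
    public decimal Price;
}

public static class WfRulesSketch
{
    // Executes a RuleSet (typically built in the designer or
    // deserialized from storage) against an arbitrary object.
    public static void Run(RuleSet ruleSet, Discount target)
    {
        RuleValidation validation = new RuleValidation(typeof(Discount), null);
        RuleExecution execution = new RuleExecution(validation, target);
        ruleSet.Execute(execution);       // rule actions mutate 'target' directly
    }
}
```

The RuleSet itself is usually serialized as XML, which is what makes it editable and redeployable without recompiling the host application.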
Edit: If you want to inspect the architecture, here are two prebuilt open-source engines:
NxBRE
.NET Application Block for Validation and Business Rules
Building and implementing your own rules engine can be a very difficult task with many things to consider. The biggest problem you will face is trying to decide which rules to fire and when; not giving this the proper care can lead to performance problems in the implementation. I would highly recommend focusing on the business problem and providing your subject matter experts (SMEs) with the ability to define and maintain their own rules. There are many good commercial products that do this; the one I have successfully implemented multiple times is www.inrule.com. They have a nice set of products that can help solve simple and complex problems. Hopefully this helps.
I am working on a project that is responsible for creating a "job" which is composed of one or more "tasks" which are persisted to a database through a DAL. The job and task columns are composed of values that are set according to business rules.
The class, as it exists now, is getting complicated and unwieldy because the business rules dictate that it needs access to many databases across our system to decide whether a job can be created and/or how it should be set up.
To further complicate things, it must be possible to submit a list of jobs, and it needs to be callable in a variety of ways (as a referenced assembly, via a Windows service, or via a web service).
Here are some examples of the things it does:
Generate a job cost estimate
Take in an account and/or user to which assign the job
Emit an event for job submission progress tracking
Merge in data from an outside, user-defined list (.csv, .xls, etc.)
Copy files from a local drive to a network accessible drive (if necessary)
My question is: What are the best practices or design patterns to make this as manageable and simple as possible?
Seems like the class needs to be refactored, as it would appear to violate the Single Responsibility Principle. I would recommend that each one of the bullet points above have its own implementation class. In this way you would be implementing the facade pattern, where your main class represents the high-level abstraction of what the system is doing.
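A rough shape of that refactoring (all class and interface names here are invented for illustration): each bullet becomes a collaborator behind an interface, and the facade only orchestrates, which also makes each piece testable and swappable for the different hosting scenarios.

```csharp
using System;

// One collaborator per responsibility from the bullet list above.
public interface ICostEstimator { decimal Estimate(JobRequest request); }
public interface IListMerger    { void MergeUserList(JobRequest request); }
public interface IFileStager    { void StageFiles(JobRequest request); }

public class JobRequest
{
    public string Account { get; set; }
    public decimal EstimatedCost { get; set; }
}

// Facade: the single entry point that the assembly, Windows
// service, and web service frontends all call.
public class JobSubmissionFacade
{
    private readonly ICostEstimator _estimator;
    private readonly IListMerger _merger;
    private readonly IFileStager _stager;

    // Event for the progress-tracking requirement.
    public event Action<string> ProgressChanged = delegate { };

    public JobSubmissionFacade(ICostEstimator e, IListMerger m, IFileStager s)
    {
        _estimator = e; _merger = m; _stager = s;
    }

    public void Submit(JobRequest request)
    {
        request.EstimatedCost = _estimator.Estimate(request);
        ProgressChanged("estimated");
        _merger.MergeUserList(request);
        _stager.StageFiles(request);
        ProgressChanged("submitted");
    }
}
```

The database-access logic that currently bloats the class would live behind these same seams, injected into the collaborators rather than into the facade.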
This type of program can get really messy if not kept clean from the ground up. I myself always try to stick with the basic 3-Tier Application (Presentation, Business, Data). There is a lot of good information out there for building applications in this manner, and it's best to do some demo projects, and read what others have to say about the subject. Here is the MSDN reference.
I myself had to redesign an application that did something very similar. Once I got my Data Layer separated and worked out from everything else my life became a lot easier.
My best advice is to take the time to plan a lot. Use diagrams, flowcharts, etc. When a program is this complex, I like to have the groundwork for my layers laid out before I ever start writing code.
Given your description of the requirements, there's no real "simple" way to go about this; its requisite functionality is massive and diverse. My only suggestions are to make the entire thing into a DLL library (or even a set of DLLs), to separate the various frontends so that referencing the assembly need not rely on the Windows service (for instance), and to stick to basic OOP commandments like loose coupling.
Besides recommending SOLID and going the extra mile to keep it DRY, I'll suggest introducing the concept of rules into the system.
By modeling the rules you can switch to a more configurable / flexible approach. You can combine multiple rules to expose different operations that affect the outcome in jobs and the related tasks.
This allows you to have rules that are composed of others. Depending on the scenario you have, that could greatly simplify how you deal with it, since some operations that involve implicit rules spread across all those systems can be expressed as a combination of simple rules. I'd keep it as simple as possible, but as you extend it you might find the need for different ways to combine the rules, and patterns will emerge on their own.
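One hedged way to model that composition (illustrative only, not a library API) is to make each rule a named predicate and build compound rules with And/Or combinators:

```csharp
using System;

// Illustrative composite-rule sketch: simple rules combine into
// compound ones, and the composed name documents the combination.
public class Rule<T>
{
    public string Name { get; }
    private readonly Func<T, bool> _predicate;

    public Rule(string name, Func<T, bool> predicate)
    {
        Name = name;
        _predicate = predicate;
    }

    public bool IsSatisfiedBy(T candidate) => _predicate(candidate);

    public Rule<T> And(Rule<T> other) =>
        new Rule<T>($"({Name} AND {other.Name})",
                    x => IsSatisfiedBy(x) && other.IsSatisfiedBy(x));

    public Rule<T> Or(Rule<T> other) =>
        new Rule<T>($"({Name} OR {other.Name})",
                    x => IsSatisfiedBy(x) || other.IsSatisfiedBy(x));
}
```

A hypothetical job-creation check could then read as `hasBudget.And(hasAccess.Or(isAdmin)).IsSatisfiedBy(jobRequest)`, with each simple rule defined and tested on its own.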
As for SOLID, I recommend checking the ebook here and trying to keep an evolving-code approach.
Having never worked with Ruby on Rails, I looked it up on Wikipedia. It says
It is intended to be used with an Agile development methodology that is used by web developers for rapid development.
This got me asking how a given language/framework can be more appropriate for given development methodologies. Are there certain languages that are more friendly for pair programming, for instance? Are there language features that make certain methodologies are more appropriate? Are there features that make certain methodologies impossible?
My initial reaction is to dismiss the connection (the design process is a business process, which is more dependent on business needs than language features). But I'm the only programmer in the firm, and I'm a partner, so I get to decide the business needs. What do you think?
Also, if the SO community finds that certain languages point towards certain methodologies, what methodology is most common for c#, which is what I use most of the time?
I certainly consider different languages to promote certain methodologies, inasmuch as they entail different philosophies.
Object-oriented languages are suitable for larger groups since code can be neatly decoupled. Dynamic languages instead allow much more explorative and straightforward approaches thanks to the possibility of interactively testing code (e.g. in interactive shells) without long compile times.
Convention-over-configuration as practiced by RoR is suitable for rapid development, whereas advanced functional type systems like F#'s or Haskell's promote a types-first approach.
And of course, it's even easier to prove the opposite - you can't rely on types in an insufficient type system, you can't decouple via interfaces where they're all implicit, and you can't easily explore a problem by trial and error in languages like C++. So yes, there is a connection.
Certain environments help, particularly for methodologies which stipulate rapid code and unit-test cycles. For example, it takes 4s for me to compile and run the basic tests for a 50,000-line project in C, but there was a post on here a year or so ago where someone was taking 2 minutes to run 'unit' tests on RoR due to database overhead.
Dynamic languages are natural friends of Agile methodologies (Agile is not a single methodology but a family of them: Scrum, XP, ...) because they come from the same spirit of pragmatism. But as business pressure requires it, even compiled languages are now used with Agile, and are in fact becoming more agile themselves by integrating dynamic features, as C# and the .NET Framework did with version 4.0.