I'm looking for some good real-world examples of interaction between Windows Presentation Foundation and Workflow Foundation. Most of the WF tutorials I see demonstrate use within console applications. I'm more curious about applications that use a rich WPF interface and WF. Particularly if they allow user defined workflows (allow users to design and run their own workflows on the fly).
I'm not sure what exactly you're looking for, but here are some links to information about actual real world applications using Workflow in desktop (WPF) applications in one way or another:
Sample Real World WF4 Integration
Infinity Workflow (there's a lot of info in the linked Word file)
Aderant Enterprise Workflow (also presented at PDC Windows Workflow Foundation Futures session)
Let me take the example of trying to make two workflows communicate with each other.
First you need to write a host. This is an extremely loaded proposition, because for two WF hosts to talk to each other you will also need to know WCF, plus all the mushy concepts of threading.
Then your WF will need to communicate with other WFs via the hosts. This makes sense because a WF doesn't keep running in memory for 3 months, when it is waiting for another WF to send an event. The WF sits in the database, and the communication occurs through the hosts.
Okay, even for simpler scenarios, for local in-process communication, you have the CallExternalMethod and HandleExternalEvent activities. Even in this case you have to talk via the host, because the WF might have been passivated to the database. So in order to do so, you have to remember to do three things: decorate your interface with the ExternalDataExchangeAttribute, derive your event args from ExternalDataEventArgs, and mark the event args as serializable.
If you mess up any of those three items, you get a very non-intuitive "InvalidOperationException". Sure, the message says "Service does not implement an interface with the ExternalDataExchange attribute", but it isn't until you look at the inner exception that you really know what happened - i.e. you forgot to make it serializable. Doh! But I did mark it as serializable. Actually, everything needs to be serializable, even the sender.
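To make that concrete, here is a minimal sketch of what those three requirements look like in WF 3.x. The IOrderService interface and OrderApprovedEventArgs class are hypothetical names invented for illustration:

```csharp
using System;
using System.Workflow.Activities;

// 1. The interface is decorated with [ExternalDataExchange].
[ExternalDataExchange]
public interface IOrderService
{
    // Called from the workflow via a CallExternalMethod activity.
    void ReportOrderProcessed(Guid instanceId);

    // Raised by the host; the workflow listens with a HandleExternalEvent activity.
    event EventHandler<OrderApprovedEventArgs> OrderApproved;
}

// 2. The event args derive from ExternalDataEventArgs, and
// 3. they are [Serializable] (as is everything else that travels across, including the sender).
[Serializable]
public class OrderApprovedEventArgs : ExternalDataEventArgs
{
    public OrderApprovedEventArgs(Guid instanceId) : base(instanceId) { }

    public string ApprovedBy { get; set; }
}
```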
Then you have to connect the WF activities, via the proper interface names and method names you are using to communicate.
Finally, even for in-process WF communication, you have to remember to add your service to the ExternalDataExchangeService, not to the WF runtime. Otherwise, it will look like nobody is subscribing to the event. Not to mention that this is one of those bugs that doesn't really throw an error - i.e. hard to track down!
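A rough sketch of that registration, assuming a hypothetical OrderService class that implements the interface from the earlier sketch:

```csharp
using System.Workflow.Activities;
using System.Workflow.Runtime;

static class WorkflowHostSetup
{
    public static WorkflowRuntime CreateRuntime()
    {
        WorkflowRuntime runtime = new WorkflowRuntime();

        // The local service must be added to the ExternalDataExchangeService,
        // which is itself added to the runtime...
        ExternalDataExchangeService dataExchange = new ExternalDataExchangeService();
        runtime.AddService(dataExchange);
        dataExchange.AddService(new OrderService());

        // ...adding it straight to the runtime would compile and run,
        // but HandleExternalEvent activities would never see the events:
        // runtime.AddService(new OrderService());   // looks fine, silently wrong

        return runtime;
    }
}
```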
So, in short, for the simplistic scenario of trying to make two workflows communicate, you need to have a good handle on the following:
*Writing windows apps (for the host),
*Threading,
*WCF,
*OOP Concepts,
*All concepts of serialization,
*Plenty of hooking up and non-intuitive details of WF itself,
*Ninja debugging skills.
Source: http://blah.winsmarts.com/2008-2-I've_been_here_before.aspx
The question is pretty vague, but here is a possible answer in a blog post I wrote. Basically I am rehosting the workflow designer to let end users change workflows as needed and run them right there and then. Of course your question could mean pretty much anything, like how to call a workflow service from a WPF form.
This is a sort of self promotion since the link is mine, but have a look.
Here is a sample project I did, which combines WF and WPF to simulate an ATM machine. The code addresses issues like handling bookmarks, how to keep the workflow alive, and how to manipulate the UI from the workflow.
https://wpfwf.codeplex.com/
We are faced with the problem of maintaining lots of Windows services.
The idea is to reorganize the Windows services into class libraries and connect the libraries to one master Windows service. Is this a good idea? Any advice, please.
There is a framework for hosting "services" within a single Windows Service called TopShelf. You might want to consider using that. https://github.com/Topshelf/Topshelf
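For reference, a minimal Topshelf host looks roughly like this; the SyncService class and service names below are placeholders, not part of your codebase:

```csharp
using Topshelf;

public class SyncService
{
    public void Start() { /* begin doing the service's work */ }
    public void Stop()  { /* clean shutdown */ }
}

class Program
{
    static void Main()
    {
        HostFactory.Run(x =>
        {
            x.Service<SyncService>(s =>
            {
                s.ConstructUsing(name => new SyncService());
                s.WhenStarted(svc => svc.Start());
                s.WhenStopped(svc => svc.Stop());
            });
            x.RunAsLocalSystem();
            x.SetServiceName("MySyncService");
            x.SetDisplayName("My Sync Service");
        });
    }
}
```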
I am interpreting your question to be "We have tons of little Windows applications that run as services - how can we simplify them?".
In general, lots of smaller programs are better. Single monolithic applications are difficult to maintain and test; when someone needs to make a small change it can trigger catastrophic consequences for dozens of other components of the application. It can also make it impossible to change one small application without taking down the whole service, as Chris Knight comments above.
On the other hand, lots of small programs suffer from the breadth problem. You probably want to make sure all your little programs run on a consistent framework - i.e. they all log their results to the same place, they all use a standardized configuration system, and they are all managed in the same place.
I have seen situations where people write services because they need to run a task "when a particular condition happens", so they make it a constantly running service and continuously check for that condition. Is it possible that you could take some of your services and turn them into triggered launches of individual applications?
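For example (purely illustrative - the folder and class names are made up), a service that polls a drop folder every few seconds could instead react to a file-created event:

```csharp
using System.IO;

class DropFolderWatcher
{
    static FileSystemWatcher watcher;   // kept alive for the lifetime of the process

    public static void Watch(string dropFolder)
    {
        watcher = new FileSystemWatcher(dropFolder);
        watcher.Created += (sender, e) => ProcessFile(e.FullPath);   // work happens only when triggered
        watcher.EnableRaisingEvents = true;
    }

    static void ProcessFile(string path)
    {
        // whatever the old polling loop used to do with the file
    }
}
```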
If this isn't the correct interpretation, please let me know :)
I have three programs (one in C++ with WinAPI, another in C# .NET, and the last one in Java) with different functions. I am about to choose one of them and implement the functions of the other two in it. Is it possible to somehow merge them? I need to have them in one GUI, under one process (at least visually). IPC isn't a problem.
Thanks for anything
I think the best/easiest thing you could do is make the GUI only in C#. For Windows clients you could use Windows Forms or WPF; for web-based clients you could use ASP.NET Web Forms or ASP.NET MVC.
In all these cases except MVC (Razor) you have really good tools for designing and customizing the GUI within Visual Studio.
Your C++ code can be wrapped in a class library or, as you say, accessed via some kind of IPC if it has to run as a separate application. The same goes for Java, but if you are 100% free to write and rewrite things you could also consider porting the Java code to C++; this could be easy, difficult, or impossible depending on what the Java code does.
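As a rough illustration of the wrapping option, the existing C++ code could expose a few C-style entry points and be called from C# via P/Invoke. The DLL and function names below are hypothetical:

```csharp
using System.Runtime.InteropServices;

static class LegacyEngine
{
    // Assumes the C++ project exports: extern "C" __declspec(dllexport) int ProcessItem(int itemId);
    [DllImport("LegacyEngine.dll", CallingConvention = CallingConvention.Cdecl)]
    public static extern int ProcessItem(int itemId);
}

// Usage from the C# GUI:  int result = LegacyEngine.ProcessItem(42);
```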
As a last resort, if both the C++ and Java applications must stay separate and run in the background on the same or another machine, and you still want to consume their services or methods from your C# GUI, then, as you mentioned, IPC is probably the way. I'm not sure what you can do in Windows with Java and IPC, but Java can certainly expose or consume XML web services.
The problem you will find is that each has its own process model, and you either need to get the process models to coexist or you'll need several communicating processes (though the user need not see them). With Java, eg, it can be "king", or you can set up a sort of sub-process in another process, or you can have it set up/take down its process model on every call. Which approach is best depends in part on the complexity of the operations you'll be doing in the "guest" language.
The one main thing, though, is that only one language can control the UI - in general, UIs from different languages don't coexist, and I doubt that you can have, e.g., a Java UI object inside a C# GUI, at least not without treating it as a foreign window.
If you're talking about the best language to rewrite the programs in, that depends entirely on the primary function. If you mainly need it to integrate with Windows nicely, C# would be the obvious choice. If your main idea is to make it cross-platform, it'd make most sense in Java.
If you're talking about running them together and using IPC then yes, that's possible too - you could use anything from a fully blown IPC framework to a custom protocol over sockets on localhost. There shouldn't be too much of an issue there, though remember that depending on how big the parts in the other languages are, it may cost just as much time to rewrite them (and there's less of a maintenance burden that way too). There are also complexities with controlling GUIs from other processes: it can sort of be done by passing native canvas IDs around cross-process, but it's hard to get working properly, may not be particularly safe, and makes it quite difficult to work out what's going on from a maintenance perspective.
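If you do go the sockets-on-localhost route, the C# side can be as small as this sketch; the port number and line-based protocol are, of course, whatever you agree on between the processes:

```csharp
using System.IO;
using System.Net;
using System.Net.Sockets;

class LocalIpcServer
{
    public static string ReadOneCommand()
    {
        TcpListener listener = new TcpListener(IPAddress.Loopback, 5050);   // arbitrary local port
        listener.Start();
        using (TcpClient client = listener.AcceptTcpClient())                // the Java/C++ process connects here
        using (StreamReader reader = new StreamReader(client.GetStream()))
        {
            return reader.ReadLine();                                        // e.g. "SCAN_COMPLETE 42"
        }
    }
}
```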
I am responsible for a team of developers who are about to start development of a lightweight insurance claims system. The system involves a lot of manual tasks and business workflows, and we are looking at using Windows Workflow (.NET 4.0).
An example of the business domain is as follows:
A policy holder calls the contact centre to lodge a claim. This “event” fires two sub-tasks which are manually actioned in parallel and may take a lengthy time to complete:
Check customer for fraud – A manual process whereby an operator calls various credit companies to check and assess the potential of a fraudulent customer. From here the sub task can enter a number of sub-statuses (Check in progress, Failed Reference Check, Passed Reference Check, etc)
Send item to repairs centre – A manual process whereby the item for which the policy holder lodged the claim is sent to the repairs centre to be fixed. From here the sub-task can enter a number of sub-statuses (Awaiting Repair, In Progress, Repaired, Posted, etc.).
The claim can only proceed once the status of each sub task has reached a predefined status (based on the business rules).
On the surface it seems that Workflow is indeed the best technology choice; however I do have a few concerns in using WF 4.0.
Skill set – Looking at the average developer skill set I do not see many developers who understand or know Workflow.
Maintainability – There seems to be little support within the community for WF 4.0 projects and this coupled with the lack of skill set raise concerns around maintainability.
Barrier to entry – I have a feeling that Windows Workflow has a steep learning curve and it’s not always that easy to pick up.
New product – As Workflow has been completely rewritten for .NET 4.0, I see it as a first-generation product that may not have the necessary stability.
Reputation – Previous versions of Workflow were not well received, considered difficult to develop with and resulted in poor business uptake.
So my question is should we use Windows Workflow (WF) 4.0 for this situation or is there an alternative technology (e.g., Simple State Machine, etc) or even a better workflow engine to use?
I have done several WF4 projects, so let's see if I can add any useful info to the other answers.
From the description of your business problem it sounds like WF4 is a good match, so no problems there.
Regarding your concerns: you are right. Basically WF4 is a new product, and it is lacking some important features and has some rough edges. There is a learning curve, and you do have to do some things differently. The main point is long-running workflows and serialization, which is something the average developer is not used to and which requires some thought to get right; far too often I hear that people have problems serializing an Entity Framework data context.
Most of the time, using workflow services hosted in IIS/WAS is the best route for these long-running types of workflows. That makes solving the versioning problem not too hard either: just have the first message return the workflow version and make that a part of each subsequent message, then put the WCF router in between to route the message to the correct endpoint based on the version. The basic rule is never to change an existing workflow; always create a new one.
So what is my advice to you?
Don't take a big gamble on an unknown, and for you unproven, piece of technology. Do a small, non-critical piece of the application using WF4. That way, if it works you can expand on it, and if it fails you can rip it out and replace it with more traditional .NET code. That way you get real experience with WF4 instead of having to base a decision on second-hand information, and you learn a new and powerful technology in the process. If possible, take a course on WF4, as that will save you a lot of time in getting up to speed (shameless self-plug here).
About the Simple State Machine: I have not used it, but I was under the impression it was for short-running, in-memory state machines. One of the main benefits of WF4 is the long-running aspect.
I have come to this dilemma a couple of times, and I chose not to use Workflow Foundation. Some of the considerations (similar to yours) were:
The workflows involved were a lot simpler (a combination of state machine and sequential actions), and doing it in WF seemed overkill for the effort involved.
The learning curve for developers to understand and use WF effectively was considered high. Instead, a status transition table describing valid transitions and the actions to be taken was used for additional flexibility; developers were comfortable with it and easily understood the concept and purpose (a sketch of this approach is shown below).
The chances of the business process changing were slim, and rudimentary changes were easily possible with the help of the transition table. A change in a transition would mean a database script, while a change in actions would result in a new release/patch. However, the probability of such an occurrence was deemed to be low.
Looking back after 13-14 months, I still think the decision not to use WF was correct. IMO, WF makes sense where there is a strong likelihood that the workflow and/or the business rules will change. WF allows you to isolate the workflow in a separate file, so making it configurable by users is simpler.
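For anyone curious, the transition-table approach mentioned above can be as small as a dictionary keyed by (state, trigger). The claim states and triggers below are invented for illustration; in the real system the rows came from a database table:

```csharp
using System;
using System.Collections.Generic;

public enum ClaimState   { Lodged, UnderReview, Approved, Rejected }
public enum ClaimTrigger { ReviewStarted, ReviewPassed, ReviewFailed }

public class ClaimStateMachine
{
    // Valid transitions live in data (here hard-coded, originally a database table),
    // so adding or changing one is a table edit, not new control flow in the code base.
    static readonly Dictionary<Tuple<ClaimState, ClaimTrigger>, ClaimState> Transitions =
        new Dictionary<Tuple<ClaimState, ClaimTrigger>, ClaimState>
        {
            { Tuple.Create(ClaimState.Lodged,      ClaimTrigger.ReviewStarted), ClaimState.UnderReview },
            { Tuple.Create(ClaimState.UnderReview, ClaimTrigger.ReviewPassed),  ClaimState.Approved },
            { Tuple.Create(ClaimState.UnderReview, ClaimTrigger.ReviewFailed),  ClaimState.Rejected },
        };

    public ClaimState Fire(ClaimState current, ClaimTrigger trigger)
    {
        ClaimState next;
        if (!Transitions.TryGetValue(Tuple.Create(current, trigger), out next))
            throw new InvalidOperationException("Invalid transition: " + current + " + " + trigger);
        return next;
    }
}
```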
We have been using WF 4.0 the last couple of months. I have to say it's challenging to think the Workflow way. However, I can tell you it's worth it. We knew very little when we started. We've bought a beginner and professional book for WF 4.0 that helped. I, myself, watched many videos online and followed PDC 2009 for their breaking news about WF 4.0 and how it's different from the previous somewhat sucky versions.
One major thing that we had to propose a solution for was how to deal with In/Out Arguments in a workflow without binding our custom activities to certain data types, and how to pass parameters between activities. I came up with a good solution for that, and the workflow experience we have so far is not bad at all. Actually, we have a workflow-intensive application that is getting bigger and bigger, and I really cannot imagine myself solving it in a different environment. I love the visual effect that it has: it keeps me away from the details of if/else etc. constructs and makes the business rules apparent in a way that doesn't force you to dive into lines of code to know what's going on or how to fix some bug.
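A minimal sketch of the general idea (not necessarily the exact solution referred to above): a custom WF4 activity can stay type-agnostic by being generic over its In/Out arguments, so the workflow author picks the data type at design time.

```csharp
using System.Activities;

// The workflow author supplies T when dropping the activity on the design surface,
// so the activity itself is not tied to any particular data type.
public sealed class PassThrough<T> : CodeActivity<T>
{
    public InArgument<T> Input { get; set; }

    protected override T Execute(CodeActivityContext context)
    {
        // Read the typed argument; a real activity would transform or route it here.
        return context.GetValue(this.Input);
    }
}
```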
By the way, the project that we worked on is very similar to what you described and it's a medium-sized project.
You can tell from my words that I like it, and I do recommend it, although it incorporates some risk as it's a new technology and you have to come up with some innovative ideas.
my 2 cents...
I did three projects in WF 3.5 and I have to say it is not easy. It forces you to think in a whole new way, especially when persistence is used. Updating an application which contains hundreds of incomplete persisted workflows is challenging. A single breaking change in serialization crashes them all. Introducing multiple versions of the same library to support new and old running workflows is common. It was challenging.
I haven't tried WF 4.0 yet, but based on my experience with BizTalk and WF 3.5, I think it will be similar.
Anyway, the best approach you can take is to do a proof of concept. Take a single workflow from your requirements and try to implement it in WF 4.0. You will spend some time on it, but you will prove whether you are able to do it in WF 4.0 and whether there are any visible benefits.
If you decide to use WF 4.0, I insist that you check the possibility of running the WF as a WCF service hosted in Windows AppFabric. AppFabric provides some out-of-the-box functionality for hosting WFs.
I think it does not really make sense today to talk about Workflow in WF4 as a technology choice for this kind of problem. What is really appropriate, as mentioned by Ladislav Mrnka above, is WCF WF Services hosted in AppFabric.
My experience with this is that it pays great dividends and is very enjoyable, but problems arise in the beginning because it is not properly appreciated that for many programmers this is a methodology shift more than a technology shift. On the other hand, generalists and those with a problem-solving mindset saw WCF WF AppFabric as a set of exciting opportunities. So if the mix of people on the project are fairly conservative C# devs attached to their daily set of OO and patterns, it will be hard to introduce. If the team is more innovative, then adoption will be much easier because the potential and new doorways multiply with each discovery.
Two main conceptual problems programmers had in moving to this technology were:
a) Message correlation and messaged exchange patterns
b) Workflows and unit testing
In standard systems in C# for example a workflow is rarely explicit and therefore rarely unit tested. The overall workflow is left for testing by acceptance scenarios or integration. Introduce an explicit WF as a software artifact and suddenly standard devs want to try and unit test it, which is usually not worth doing.
The message correlation aspect of it is a bit of mindset shift for those not familiar with message exchange patterns. Most devs have dealt with in process and remote calls, web service and SOAP, and usually focussed on one or two of those. To abstract above it all and work with a general message based system can be confusing at first.
On the positive side, though, the end result is something that saves a lot of time and creates a lot of opportunities. One main thing is that the workflow, if visually clear, is something that can be worked on by end user, developer and analyst together, eliminating unnecessary steps in the development lifecycle and focusing the parties on one artifact. Further, it discourages islands of functionality in dedicated apps, with dedicated glue layers, by encouraging a suite of business processes in WF per business domain. Further, with AppFabric, the plumbing for persistence, logging, and waking up scheduled activities is all done for you. WF4 performance is outstanding too.
My recommendation would be to find the most innovative or explorative team member do the initial scouting to discover the tricky parts, get the core functions working, and have that initial person be responsible for then compartmentalising the remaining work.
In order to do an insurance claim system of any complexity that involves roles and "sub-tasks", you really need a BPM solution, not just workflow. Workflow Foundation 4.0 is slick, but it really doesn't come close to the functionality of a BPM product.
BPM solutions, like Metastorm BPM, Global360, and K2.NET, provide human-centric workflow, tasks, roles, and system integration that can model and streamline business processes like your insurance claim system. Use ASP.NET to build the interface that integrates with the BPM workflow engine, as their built-in designers are usually limited and force you to use their custom-built web controls, which usually are not as full-featured as the ASP.NET web controls.
Go with the technology your team knows and feels comfortable with. Workflow Foundation is not a product that you can use straight away - it's rather a set of pieces you can embed in your application in order to build a workflow system.

IMHO the workflow logic is the least important piece of technology. First of all you have to concentrate on the GUI, because business owners will not see anything but the GUI. But if your system is a success, then you have to be prepared for never-ending change requests and new requirements, so you have to implement your business logic so that it's easy to change and easy to divide into separate processes to suit different (sometimes contradictory) user needs.

BPM helps with this task because it allows you to have separate, multiple versions of business processes suiting various business needs. You don't need a full-fledged BPM engine for that, but it's useful to code your business logic so that it can be versioned and divided into individual business processes - the worst thing to have is an unmaintainable and entangled blob of code that handles 'everything' and that no one can understand. There are many ideas for that - state machines, DSLs (domain-specific languages), scripts, etc. - you decide what the implementation should be. But you should always think in terms of business processes and organize your logic accordingly so that it reflects these processes.
And be prepared for coexistence of many variants of business logic and data structures - this is the most difficult design task imho.
I'm in a situation where I have to use 4.0, as .NET 4.5 isn't accredited for use in our prod environment yet. I had major pain understanding how to get long-running workflows going to suit our business needs, but eventually found an elegant solution. It's not something that anyone coming in later to support it can just pick up with ease, because there's so much to think about, but I do believe in WF as a tool for managing workflow states.
One big thing I take issue with in WF 4.0, though, is Maurice's comment:
The basic rule is never to change an existing workflow; always create a new one
That's great if you just want a new version, but what if you have 50,000 persisted workflows and realise at some point that there's a bug in the workflow? You need to be able to update the xamlx and still be coupled to the existing instances. I've tried ungzipping the various metadata columns in the SQL Server instances table to find something that ties the instance to the workflow definition without any luck.
I did write a synchronisation application for importing data from an old system into our new WF 4.0-driven one. We basically load the data into the system, then run a process which automatically calls into the workflow steps and calls validation methods, essentially mocking user interaction. This only worked well for us because of the architecture we implemented for access to the workflow service host. It's great as a one-off, where after running you can go through and do checks to ensure the consistency of the data migration, but having to use this approach for potentially hundreds of thousands of cases once a system is live isn't an approach that instils confidence, and it overburdens the process of integrating simple bug fixes.
My recommendation is that you avoid WF 4.0 altogether and just go straight to 4.5 if your environment supports it. The Dynamic Update and Side by Side Versioning it provides cater for bug fixing and WF versioning out of the box. I've yet to investigate exactly how, as 4.5 still isn't accredited for use by our client, but I am eagerly awaiting the opportunity.
What I'm desperately hoping for is that our client doesn't request changes to policy (and therefore workflow adjustments) and that the current workflows hold up without any bugs. The latter being a vain and empty hope as bugs always pop up.
I really can't understand what was going through the WF dev team's heads when they released a system where, out of the box, you can't fix bugs easily. They should have developed a technique for re-binding an instance to a new xamlx.
I created a Windows Form executable in .NET 3.5 that uses a dll to communicate with a machine that scans checks. I'm eventually going to need to move from an executable to a Web Form that can do the same thing. This will be months from now, but I wanted to start doing the research now as I have not done this before. I'm going to need to use ActiveX in order to communicate with the device via a Web Form. I've also not done this before.
I'd like to keep the functionality of my existing executable without having to rewrite most of it, although I do understand that some of it will need to be rewritten. I've done research on ActiveX and how to use it, but I wanted to know if someone has had a similar situation as this. What did you do to convert an exe to a web program? Are there good, specific sources out there that I'm overlooking that can point me in the right direction for this situation? Is there any advice that you can give from your experiences that can help me to reduce mistakes? The company that I work for does not have anyone else here that has done this before, so I've got to teach myself everything needed to do this.
Thanks in advance.
This is where separation of concerns and n-tier design shine. Hopefully your UI layer is loosely coupled to your domain model. If that is the case, you can code a second UI layer for the web without having to change your domain model at all, and then compile for each scenario.
*note - In practical use I have always had to extend my business domain to account for some issues with the second UI, but those modifications have usually been minor, and have pointed out places where I had coupled too tightly anyway.
Another option you may consider is creating a web services layer over your business domain code, and then coding a web application that communicates with your domain model via those web service calls. This may have performance implications and would not be my preferred method of accomplishing this, though you may find it more manageable if you don't have a well-designed application to start with.
"I'd like to keep the functionality of my existing executable without having to rewrite most of it"
In general if you extract as much logic as possible into its own assembly/dll, you can reuse that from whatever UI framework you want. Just make sure you're not doing anything UI specific in there (throwing up dialog boxes, etc).
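A tiny sketch of that shape, with hypothetical names: the scanner logic sits in its own class library with no UI dependencies, so the WinForms exe today and a web front end later can both reference it.

```csharp
namespace CheckScanning.Core
{
    // No WinForms, WPF or ASP.NET types anywhere in this assembly.
    public class ScanResult
    {
        public string Micr { get; set; }
        public decimal Amount { get; set; }
    }

    public class CheckScannerService
    {
        public ScanResult ScanNext()
        {
            // talk to the device DLL here; no MessageBox, no Controls, no HttpContext
            return new ScanResult();
        }
    }
}
```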
Normally, converting winforms to webforms is quite possible, although typically a slow development process. Even if you've got the cleanest domain layer in the world, the fact that objects in your web page are thrown away every time means that a web domain layer is normally written very differently to a desktop domain layer.
However, in your case the device - server communication is going to be extra difficult.
Have you looked at XBAP? It's basically a way to deploy WPF applications into a web page. It requires your clients to have the right version of .NET installed, but it's going to be the easiest path for you, especially considering that you can host WinForms in WPF...
You may take a look at Silverlight 4,
http://silverlight.net/getstarted/silverlight-4-beta/
It contains many features that ASP.NET Web Forms doesn't have.
If your team can accept something like ActiveX, why not Silverlight 4? The only disadvantage is that SL4 is still in Beta.
Does .NET (C#) have anything like Django's signals engine?
Our business logic has become really complicated over a few years of adding new features.
I'm going to re-architect it. Currently all the features are tightly coupled, which causes regression errors: changing something in one place may break some other place.
I really like Django's apps idea, where separate applications introduce new functionality and are completely separate. Communication between apps is implemented through signals.
I wonder if there is something in .NET that allows dividing the project's business logic into many separate "apps" (plug-ins, zones, modules, you name it) and handling communication using some kind of "signals".
For example we have simple order flow.
We can add "coupon app" that if exists in the project adds abilities to use discount coupon.
We can add "cross sale" module that if exists adds abilities to offer cross-sale products
Email notification module that if exists adds abilities to send order email notifications.
But in the same time all this modules are "self-contained" means that communication between them is done using emitting signals (ORDER_PROCCESS_START, ORDER_SUCCESS, etcs) and other modules can subscribe to this signals and process them in required way.
This architecture is not web-related; all business logic is processed on the server side without working with HTTP directly.
I wonder if this is a good architecture from a code maintenance and testing point of view, and whether it's possible to do this in .NET. Are there any drawbacks that I don't realize now?
I am not too familiar with Django, but two frameworks immediately come to mind:
1) Prism
2) MEF
Now, I know that Prism is really a UI pattern, but the event aggregator it uses may be useful for segmenting / messaging between loosely coupled projects (a small sketch follows below).
My hunch, though, is that something like MEF may be closer to what you are wanting, since it lets you make plugins to extend the functionality of the application.
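As a rough sketch of the event-aggregator style mentioned above, using Prism's aggregator; the event and module names are invented, and the exact namespace varies between Prism versions:

```csharp
using Microsoft.Practices.Prism.Events;   // Prism 4; later versions use PubSubEvent<T>

// A shared "signal" type; publishers and subscribers reference only this.
public class OrderSucceededEvent : CompositePresentationEvent<int> { }

public class EmailNotificationModule
{
    public EmailNotificationModule(IEventAggregator events)
    {
        // The coupon, cross-sale and email modules never reference each other;
        // they only know the shared event type.
        events.GetEvent<OrderSucceededEvent>().Subscribe(SendConfirmation);
    }

    void SendConfirmation(int orderId)
    {
        // send the order confirmation email
    }
}

// Somewhere in the core order flow:
//   eventAggregator.GetEvent<OrderSucceededEvent>().Publish(order.Id);
```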