Thoughts on this design - C#

I have this problem domain where I need to be able to run a background process that would:
Run a filter to get an object collection (a time-consuming operation)
Pass the object collection through a set of rules, maybe through a rule interface (a rough sketch appears below)
Be able to expose any changes that the rules caused to any interested listeners.
Each filter may have many rules, and there can be more than one filter.
What would be the practical way to approach this? I'm thinking:
Have a WCF app hosted in a Windows Service that would expose a callback for rule changes
Let the service do the grunt work of running filter->rules. Will this need to run on a separate thread?
Any thoughts or references to existing frameworks, design patterns, etc. are welcome.
thanks
Sunit
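
One minimal way to model the filter-and-rules pipeline described above; all type and member names here are illustrative assumptions, not from the question:

    using System.Collections.Generic;

    // Illustrative sketch only - IRule, IFilter and their members are assumed names.
    public interface IRule<T>
    {
        // Applies the rule and returns the items it changed, so interested
        // listeners can be notified about exactly those changes.
        IEnumerable<T> Apply(IEnumerable<T> items);
    }

    public interface IFilter<T>
    {
        // The time-consuming operation that produces the object collection.
        IEnumerable<T> Execute();

        // Each filter may carry many rules.
        IList<IRule<T>> Rules { get; }
    }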

If your background process needs to be constantly (24/7/365) accessible from remote machines, the Windows service makes a lot of sense to me. Assuming you are familiar with C#, it is trivial to create a Windows service. You can follow the step-by-step here. Once you've got the Windows service going, it's easy to host the WCF service by creating the System.ServiceModel.ServiceHost instance in the OnStart callback of the Windows service. As far as WCF patterns and good practices go, I'll refer you to Juval Lowy's website, IDesign.net. The site has a lot of free WCF-related downloads just for providing your email address.
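
As a rough sketch of that hosting arrangement (the FilterService/IFilterService names are placeholders, not a prescribed implementation):

    using System.ServiceModel;
    using System.ServiceProcess;

    [ServiceContract]
    public interface IFilterService
    {
        [OperationContract]
        void RunFilters();
    }

    // Placeholder WCF service implementation.
    public class FilterService : IFilterService
    {
        public void RunFilters() { /* run filter -> rules here */ }
    }

    // Windows service that hosts the WCF service for its lifetime.
    public class FilterWindowsService : ServiceBase
    {
        private ServiceHost _host;

        protected override void OnStart(string[] args)
        {
            _host = new ServiceHost(typeof(FilterService));
            _host.Open(); // endpoints/bindings come from app.config or code
        }

        protected override void OnStop()
        {
            if (_host != null)
            {
                _host.Close();
                _host = null;
            }
        }
    }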

You have a couple of options; the two most obvious are: the client calls a method that starts the job and then polls the server for status, or you set up a callback.
Either way, the job should run on a separate thread so it doesn't block the service.
If you go with the poll-for-status route, put the actual result in the returned status.
If you go with the callback, use the WSDualHttpBinding and set up a callback. This looks a little scary to set up, but it's really not that bad.
I'll let someone else chime in with actual patterns or frameworks; I'm just not sure. Also, check out MSMQ - this might be another viable solution.
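
A hedged sketch of what the duplex (callback) contract might look like, with the long-running work pushed to a worker thread as suggested above; all names are invented for illustration:

    using System.ServiceModel;
    using System.Threading;

    // Callback contract implemented by the client.
    public interface IJobCallback
    {
        [OperationContract(IsOneWay = true)]
        void JobCompleted(string result);
    }

    [ServiceContract(CallbackContract = typeof(IJobCallback))]
    public interface IJobService
    {
        [OperationContract(IsOneWay = true)]
        void StartJob();
    }

    public class JobService : IJobService
    {
        public void StartJob()
        {
            IJobCallback callback =
                OperationContext.Current.GetCallbackChannel<IJobCallback>();

            // Run the long filter->rules work on a worker thread so the
            // service call returns immediately and doesn't block the host.
            ThreadPool.QueueUserWorkItem(_ =>
            {
                string result = RunFilterAndRules(); // placeholder for the real work
                callback.JobCompleted(result);
            });
        }

        private string RunFilterAndRules()
        {
            return "done"; // placeholder
        }
    }

With WSDualHttpBinding (or NetTcpBinding on an intranet, which also supports duplex), the callback channel lets the service push results back instead of being polled.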

You could use Windows Workflow Foundation (WF) to take care of the rules. You should be able to host a workflow as a service.

Related

Process.Start Notepad++ from an application which is started by a Windows Service [duplicate]

I have done a lot of searching to find a way to start a GUI application from a Windows service on Windows 7. Most of what I have found is that with Windows 7, services now run in a separate user session and cannot display any graphical interface to the current user. I'm wondering if there is any kind of workaround or a different way of accomplishing something like this? Can the service start a process in a different user session?
This change was made for a reason and not simply to annoy developers. The correct approach is to put your UI in a different program and communicate with the session through a pipe, or some other IPC mechanism. The recommendation that services do not present UI is more than 10 years old now.
You should really try to follow these rules, even though it may seem inconvenient to begin with. On the plus side, you will enjoy the benefit of keeping your service logic and UI logic separate.
If your service runs under the LOCALSYSTEM account, then you can check "Allow service to interact with desktop", for the benefit of legacy services that would fail if they could not show UI. But it won't help you anyway, because the UI will show in session 0, where it is never seen!
I recommend you take a read of the official Microsoft document describing session 0 isolation.
There is a way to do this.
If you need to show a simple message box, you can use the WTSSendMessage routine.
If you need complex UI elements, put them in a separate program and launch it with the CreateProcessAsUser routine.
In this sample provided by Microsoft you can see the whole process.
http://blogs.msdn.com/b/codefx/archive/2010/11/26/all-in-one-windows-service-code-samples.aspx
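
As a rough illustration of the WTSSendMessage route (P/Invoke declarations simplified, error handling omitted):

    using System;
    using System.Runtime.InteropServices;

    // Minimal sketch: show a message box in the active console session
    // from a service via WTSSendMessage.
    static class SessionMessage
    {
        [DllImport("kernel32.dll")]
        private static extern uint WTSGetActiveConsoleSessionId();

        [DllImport("wtsapi32.dll", SetLastError = true, CharSet = CharSet.Unicode)]
        private static extern bool WTSSendMessage(
            IntPtr hServer, uint sessionId,
            string title, uint titleLengthInBytes,
            string message, uint messageLengthInBytes,
            uint style, uint timeoutSeconds,
            out uint response, bool wait);

        public static void Show(string title, string message)
        {
            uint response;
            WTSSendMessage(IntPtr.Zero,                  // WTS_CURRENT_SERVER_HANDLE
                WTSGetActiveConsoleSessionId(),
                title, (uint)(title.Length * 2),         // lengths are in bytes (UTF-16)
                message, (uint)(message.Length * 2),
                0, 0,                                    // MB_OK style, no timeout
                out response, false);
        }
    }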
Windows Vista introduced what is called "Session 0 isolation", which in practice means that every service (except system services) runs in a separate, non-interactive session. For this reason you cannot directly create a GUI from within the service, except if you run in legacy mode by flagging the "Interact with desktop" option, which is not good if you plan to keep running your service for some years into the future.
As David Heffernan said, the best approach is to use a client-server architecture. WCF makes it easy to communicate over named pipes; a minimal sketch follows below.
This page is a good starting point to read about Session 0 Isolation and this white paper is also very good.
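
An assumed, minimal sketch of WCF over named pipes between a service and a per-session UI app; the contract, types, and addresses are made up for illustration:

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IUiNotification
    {
        [OperationContract]
        void Notify(string message);
    }

    public class UiNotification : IUiNotification
    {
        public void Notify(string message) { Console.WriteLine(message); }
    }

    class PipeDemo
    {
        static void Main()
        {
            // Host side (e.g. the UI program running in the user's session).
            var host = new ServiceHost(typeof(UiNotification),
                new Uri("net.pipe://localhost/MyApp"));
            host.AddServiceEndpoint(typeof(IUiNotification),
                new NetNamedPipeBinding(), "notifications");
            host.Open();

            // Client side (e.g. the Windows service) - normally a separate process.
            var factory = new ChannelFactory<IUiNotification>(
                new NetNamedPipeBinding(),
                new EndpointAddress("net.pipe://localhost/MyApp/notifications"));
            factory.CreateChannel().Notify("Hello from the service");

            Console.ReadLine();
            host.Close();
        }
    }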

Is there a best practice for throttling service calls to Windows services?

My team ships a client API that allows applications to communicate with our Windows service. There is a concern that malicious apps could flood our service with requests, so we want to put some throttling logic in the client API to prevent DoS attacks like this.
Is there a best practice for implementing throttling logic for Windows services? All I can find online is throttling for web (which makes sense). I imagine the same ideas would apply, but I am wondering if there is an established mechanism to do this when it's all on the local system.
You might find an open-source project like TokenBucket suitable for implementing this.
See Wikipedia for a discussion of leaky-bucket algorithms in general.
For your application, you may need to consider how to prevent multiple instances of your API from being called from the same machine, the same process, or multiple threads, and that might affect how you implement this.
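
For illustration, here is a bare-bones token-bucket sketch. This is not the TokenBucket project mentioned above, just one way the idea can look in C#: the bucket allows bursts up to its capacity and a sustained rate of refillPerSecond.

    using System;

    public class TokenBucket
    {
        private readonly double _capacity;
        private readonly double _refillPerSecond;
        private double _tokens;
        private DateTime _lastRefill = DateTime.UtcNow;
        private readonly object _sync = new object();

        public TokenBucket(double capacity, double refillPerSecond)
        {
            _capacity = capacity;
            _refillPerSecond = refillPerSecond;
            _tokens = capacity;
        }

        // Returns true if the call is allowed, false if it should be rejected or delayed.
        public bool TryConsume()
        {
            lock (_sync)
            {
                var now = DateTime.UtcNow;
                _tokens = Math.Min(_capacity,
                    _tokens + (now - _lastRefill).TotalSeconds * _refillPerSecond);
                _lastRefill = now;
                if (_tokens < 1) return false;
                _tokens -= 1;
                return true;
            }
        }
    }

Each incoming request would pass through TryConsume(); calls that return false are dropped, queued, or answered with a "too busy" fault, depending on your policy.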

How to run business logic on one computer of the local network and GUI on another?

I have a WPF application running on a computer which is connected to a local network and there is a special device connected to this computer which is controlled by the application. Is there any easy way to migrate the GUI (WPF XAML) to another computer connected to the same network so that the GUI and BL stay coupled?
I have been looking into WCF, but there are quite a few limitations that would make it time-consuming to adapt WCF to my situation. WCF would work perfectly if it could handle both properties (not supported at all) and events in one ServiceContract.
No, there is no simple way to do that.
If your GUI and BL are running on different machines, then you have to implement special code to perform network communication between the apps (i.e. the GUI and BL become two different apps). If the GUI and BL are tightly coupled, it's almost impossible and pointless to do so - the network-related code would contain service contracts tremendous in size, and remoting every interaction would consume a large amount of time, making UI interaction slow and painful - so it should be refactored.
There is, however, a workaround for your problem - you can run your app on the machine with both BL and GUI, and then show that GUI on a different machine: Remote Desktop or RemoteApp (https://technet.microsoft.com/en-us/library/cc755055.aspx) could help with that.
An easy way? No.
You can, as you noted, use WCF. You could also use SignalR, a message bus, standard TCP sockets, or a number of other technologies.
None of these however will give you the absolute transparency you seem to desire. They all work off the concept that you are going to invoke a method on the server, and it may or may not return data. In the case of TCP, you send data and may get data back. You don't simply access/change a property or listen to an event (SignalR and WCF do support server->client invocation, which isn't exactly the same thing as an event, but it can work like one).
I think you need to look at implementing a standard client/server model, inconvenient as you may believe it to be.
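
To make the "standard client/server model" concrete: properties are usually modeled as explicit Get/Set operations and events as a callback contract. A hedged sketch, with every name invented for illustration:

    using System.ServiceModel;

    // Callback contract: stands in for the "event" the GUI wants to listen to.
    public interface IDeviceEvents
    {
        [OperationContract(IsOneWay = true)]
        void StatusChanged(string newStatus);
    }

    [ServiceContract(CallbackContract = typeof(IDeviceEvents))]
    public interface IDeviceService
    {
        [OperationContract]
        string GetStatus();                       // stands in for a read-only property

        [OperationContract]
        void SetTargetTemperature(double value);  // stands in for a settable property

        [OperationContract]
        void Subscribe();                         // client registers its callback channel here
    }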

How to have periodical events in a C# MVC web app?

I have a web app that I'm writing right now that is supposed to have "periodical events". For instance, at midnight, the web app should calculate "scores" for all users. I want this done only once during the day.
Is there a way that I can automate this, so it runs automatically at midnight (or whatever hour I choose)?
I don't like the idea of creating a separate script (VBS) to do this, as the calculation would depend on a lot of the app's business logic. I was thinking of putting it into a separate class library so it can use the web app logic (which is also in a class library), but is this the best way to go about it?
I also don't like the idea of using the Session_Start() event in Global.asax to trigger the event by checking the hour manually. There must be some easier way - especially because down the road I expect there will be a lot more periodical events; some may have to be triggered every fifteen minutes, for example...
Thanks a lot for any help you can give me.
You should not do this in the web app itself. You are correct to put the business logic in a separate library. Once you have done this, you can use the business logic from anywhere, and therefore, a good solution would be to create a console application that does the nightly jobs, and invoke the console application from Windows Task Scheduler. IIS is not suitable as a host for periodical events.
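
A small sketch of that approach, assuming the business logic lives in a shared class library; ScoreCalculator is a made-up placeholder for whatever type actually holds the calculation:

    using System;

    // Console runner invoked by Windows Task Scheduler, e.g.:
    //   schtasks /Create /SC DAILY /ST 00:00 /TN "NightlyScores" /TR "C:\Jobs\NightlyJob.exe"
    class NightlyJob
    {
        static int Main(string[] args)
        {
            try
            {
                // In the real app this type would come from the business-logic class library.
                new ScoreCalculator().CalculateDailyScores();
                return 0;
            }
            catch (Exception ex)
            {
                Console.Error.WriteLine(ex);
                return 1; // non-zero exit code so the scheduler records the failure
            }
        }
    }

    // Stand-in for the real class library type.
    class ScoreCalculator { public void CalculateDailyScores() { } }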
I guess you are missing the point of separation of concerns. What you are asking for is the job of a service. You need to develop a separate application, as a Windows service, that will do all your calculations and be triggered by a scheduler; even Windows Task Scheduler would do. This is basically what is done in large-scale applications.
Yeah... again awesome "change your architecture and hosting environment so that my answer can be relevant" responses.
Doing what you ask is actually quite easy, take a look at this article: http://www.codeproject.com/Articles/12117/Simulate-a-Windows-Service-using-ASP-NET-to-run-sc
This is a job for a Windows service or a scheduled task. A web application responds to HTTP requests. Essentially, the service's job would be to wake up, run the appropriate calculations, and write the results back to the database. Once they are in the database, your web application can use the newly calculated values.
Here is some information on windows services: http://msdn.microsoft.com/en-us/library/d56de412.aspx

.NET IPC without having a service mediator

I have two unrelated processes that use .NET assemblies as plugins. However, either process can be started/stopped at any time. I can't rely on a particular process being the server. In fact, there may be multiple copies running of one of the processes, but only one of the other.
I initially implemented a solution based on this article. However, it requires that the process implementing the server be running before the client.
What's the best way to implement some kind of notification to the server when the client(s) were running first?
Using shared memory is tougher because you'll have to manage the size of the shared memory buffer (or just pre-allocate enough). You'll also have to manually manage the data structures that you put in there. Once you have it tested and working though, it will be easier to use and test because of its simplicity.
If you go the Remoting route, you can use the IpcChannel instead of the TCP or HTTP channels for single-system communication over named pipes: http://msdn.microsoft.com/en-us/library/4b3scst2.aspx. The problem with this solution is that you'll need to come up with a registry-type solution (either in shared memory or some other persistent store) that processes can register their endpoints with. That way, when you're looking for them, you can query for all the endpoints running on the system and find the one you need. The benefits of going with Remoting are that the serialization and method calling are all pretty straightforward. Also, if you decide to move to multiple machines on a network, you could just flip the switch to use the networking channels instead. The cons are that Remoting can get frustrating unless you clearly separate what are "Remote" calls and what are "Local" calls.
I don't know much about WCF, but that also might be worth looking into. Spider sense says that it probably has a more elegant solution to this problem... maybe.
Alternatively, you can create a "server" process that is separate from all the other processes and that gets launched (use a system Mutex to make sure more than one isn't launched) to act as a go-between and registration hub for all the other processes.
One more thing to look into is the Publish-Subscribe (Pub/Sub) model for events. This technique helps when you have a listener that is launched before the event source is available, but you don't want to wait to register for the event. The "server" process will handle the event registry to link up publishers and subscribers.
Why not host the server and the client on both sides, and whoever comes up first gets to be the server? And if the server drops out, the client that is still active switches roles.
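
A rough sketch of that "first one up wins" idea using a named system mutex; the mutex name and structure are illustrative, and fail-over when the server exits is not handled here:

    using System;
    using System.Threading;

    class RoleElection
    {
        static void Main()
        {
            bool createdNew;
            // "Global\" makes the mutex visible across sessions on the machine.
            using (var mutex = new Mutex(true, @"Global\MyApp.IpcServer", out createdNew))
            {
                if (createdNew)
                {
                    Console.WriteLine("No server found - acting as the server.");
                    // StartServer();
                }
                else
                {
                    Console.WriteLine("Server already running - acting as a client.");
                    // ConnectAsClient();
                }

                Console.ReadLine(); // keep the role (and the mutex) for the process lifetime
            }
        }
    }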
There are many ways to handle IPC (.NET or not), and a TCP/HTTP tunnel is one of them... but it can be a very bad choice (depending on circumstances and environment).
Shared memory and named pipes are two ways (and yes, they can be done in .NET) that might be better solutions for you. There are also the IPC channel classes in the .NET Framework... but I personally don't like them due to some AppDomain issues...
I agree with Garo.
Using a pub/sub service would be a great solution. This obviously means that this service would need to be up and running before either of the other two.
If you want to skip the pub/sub, you can just implement the service in both applications with different endpoints. When either of the applications is launched, it tries to access the other known object via the IPC proxy. If the proxy call fails, the other object isn't up.
-Scott
I've spent 2 days meandering through all the options available for IPC while looking for a reliable, simple, and fast way to do full-duplex IPC. IPCLibrary, which I found on Codeplex.com, is so far working perfectly out of all the options that I tried. All with only 7 lines of code. :D If anyone stumbles across this trying to find a full-duplex IPC, save yourself a ton of time and give this library a try. Grab the source code, compile the data.dll and follow the examples given.
HTH,
Circ
