I created an application that provides several services. Each service provides a specific processing capability, except one service (the main service), which returns true or false to clients that ask whether a specified processing capability is available.
Now I would like to modify the application, leaving the main service unchanged and adding support for installing plugins with new processing capabilities: each plugin should add new processing capabilities without the need to implement a new service, but after installing a new plugin, a new service should be available. In this way, a plugin would not have to handle the communication layer. In other words, I would like to separate the communication layer from the processing layer, in order to simplify the creation of new plugins.
Is it possible?
I could create two services: the main service and the service for processing.
The first service may be used by clients to know if a certain feature is present on the server (for example, clients may ask the server if it has installed the plugin that provides the functionality for solving differential equations).
The second service could be used to send a generic task and to receive a general result, for example:
Result executeTask(Task task);
where Result and Task are abstract classes...
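A minimal sketch of what those base classes might look like (the PluginName property is only an assumption, based on the routing described below):
public abstract class Task
{
    // Hypothetical: identifies the plugin that should handle this task,
    // so the processing service can route it to the right place.
    public abstract string PluginName { get; }
}

public abstract class Result
{
    // Common result members (status, error message, ...) would go here.
}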
For example, if I develop a plugin to solve differential equations, I first create the classes for transferring data:
public class DifferentialEquationTask : Task
// This class contains the data of the differential equation to be solved.
...
public class DifferentialEquationResult : Result
// This class contains the result.
...
Therefore, the client should instantiate a new DifferentialEquationTask object and pass it to the second service's method:
DifferentialEquationTask myTask = new DifferentialEquationTask(...);
...
Result result = executeTask(myTask); // called by basic application
// The second service receives myTask as a Task object.
// This Task object also contains the destination plugin, so myTask is sent
// to the correct plugin, which converts it to DifferentialEquationTask
...
myResult = result as DifferentialEquationResult;
// received by the client
Moreover, each plugin should have a version for the application server and a version for the client application.
An alternative would be to include the service in the plugin itself: in this way, a new plugin would implement a new functionality and expose it via an additional service.
In summary, I came up with the following two alternatives:
a main service to ask the server if it has a plugin or not, and a second service to deliver tasks to the correct plugin;
a main service to ask if the server has a plugin or not, and various additional services (an additional service for each plugin installed).
In order to choose the best approach, I am considering the following questions:
Which of the two alternatives may provide better performance?
What advantages would be gained by using a new service for each plugin rather than a single service that delivers tasks to the correct plugin?
Which of the two alternatives simplifies the development of a new plugin?
Being a novice, I was wondering if there was a better approach...
Thanks a lot!
It seems like the main service could maintain a dictionary of plugins, indexed by name. Then for a client to see if the server provides a particular service, all the main service has to do is look up the name in the dictionary. And to process, the service just has to call a method on the object that's in the value portion of the dictionary entry. An example:
You have three abstract classes: Service, ServiceResult, and ServiceTask. The contents of ServiceTask and ServiceResult aren't really important for this discussion. Service must have a parameterless constructor and a method called Process that takes a ServiceTask as its sole parameter. So your differential equation solver would look like:
public class DiffeqSolver : Service
{
public DiffeqSolver()
{
// do any required initialization here
}
public override ServiceResult Process(ServiceTask task)
{
DiffeqTask dtask = task as DiffeqTask;
if (dtask == null)
{
// Error. User didn't pass a DiffeqTask.
// Somehow communicate error back to client.
}
// Here, solve the diff eq and return the result.
}
}
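For reference, those three base classes might be sketched roughly like this (their members are placeholders; all that matters here is the parameterless constructor and the Process method):
public abstract class ServiceTask
{
    // Task-specific data goes in derived classes (e.g. DiffeqTask).
}

public abstract class ServiceResult
{
    // Result-specific data goes in derived classes.
}

public abstract class Service
{
    // Every concrete service processes a task and returns a result.
    public abstract ServiceResult Process(ServiceTask task);
}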
The main service is somehow notified of existing plugins. It maintains a dictionary:
Dictionary<string, Service> Services = new Dictionary<string, Service>();
I assume you have some idea how you're going to load the plugins. What you want, in effect, is for the dictionary to contain:
Key = "DiffeqSolver", Value = new DiffeqSolver();
Key = "ServiceType1", Value = new ServiceType1();
etc., etc.
You can then have two methods for the main service: ServiceIsSupported and Process:
bool ServiceIsSupported(string serviceName)
{
return Services.ContainsKey(serviceName);
}
ServiceResult Process(string serviceName, ServiceTask task)
{
Service srv;
if (Services.TryGetValue(serviceName, out srv))
{
return srv.Process(task);
}
else
{
// The service isn't supported.
// Return a failure result
return FailedServiceResult;
}
}
I've simplified that to some extent. In particular, I'm using a Dictionary, which is not thread safe. You'd want to use a ConcurrentDictionary, or use locks to synchronize access to your dictionary.
The more difficult part, I think, will be loading the plugins. But there are many available examples of creating a plugin architecture. I think you can find what you need.
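As a rough starting point, here is a minimal sketch of one common approach, assuming plugins are assemblies dropped into a known folder and registered under their type names (both of which are assumptions, not requirements):
// Would live alongside the Services dictionary in the main service.
// Requires: using System; using System.IO; using System.Linq; using System.Reflection;
void LoadPlugins(string pluginFolder)
{
    foreach (string dll in Directory.GetFiles(pluginFolder, "*.dll"))
    {
        Assembly assembly = Assembly.LoadFrom(dll);
        var serviceTypes = assembly.GetTypes()
            .Where(t => typeof(Service).IsAssignableFrom(t) && !t.IsAbstract);
        foreach (Type type in serviceTypes)
        {
            // Register each plugin under its type name, e.g. "DiffeqSolver".
            Services[type.Name] = (Service)Activator.CreateInstance(type);
        }
    }
}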
I'm trying to develop a system to share information between two Windows applications with different update loops.
I developed a solution that uses a WCF service to store and retrieve data. However, the data turns out to be different across clients, so each application sees different values.
The service I tried to implement is similar to this:
namespace TEST_Service_ServiceLibrary
{
[ServiceContract]
public interface TEST_ServiceInterface
{
[OperationContract]
string GetData();
[OperationContract]
void StoreData(string data);
}
}
namespace TEST_Service_ServiceLibrary
{
// Core service of the application, stores and provides data:
public class TEST_Service : TEST_ServiceInterface
{
string TEST_string;
// Used to pull stored data
public string GetData()
{
return TEST_string;
}
// Used to store data
public void StoreData(string data)
{
TEST_string = data;
}
}
}
Each of the applications creates a TEST_Service client.
I tested the GetData and StoreData functions and they work fine independently; however, when I call StoreData from one application and then GetData from the other, the data appears to be empty.
I have looked around but haven't found a solution to this problem. Is there a workaround for this, or should I change my approach? I thought of using a local database but I'm not sure this is the best way to solve it.
Thanks a lot
You have more than one instance of your service class. If you want to have your data in memory, you will need to run it in single instance mode:
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
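For example, applied to the service class from the question:
[ServiceBehavior(InstanceContextMode = InstanceContextMode.Single)]
public class TEST_Service : TEST_ServiceInterface
{
    // With a single service instance, TEST_string is shared by all clients.
    string TEST_string;

    public string GetData()
    {
        return TEST_string;
    }

    public void StoreData(string data)
    {
        TEST_string = data;
    }
}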
Keeping your data in memory might not be the best option anyway. You should look for a data store of some kind and put a single interface in front of that persistent store; then it does not matter how many of your service instances are used.
If your WCF service was storing information in a database, then information stored on one request would go to the database, and when another request retrieved it, the result would come from that database. So if one client stored something, another could retrieve it.
The reason why this isn't working is because in response to each request your application is creating a new instance of the TEST_Service class. That means TEST_string, where you are storing values between requests, is a new string. It doesn't contain the previous value.
For experimentation you could try changing the string to static:
static string TEST_string;
...and then the value would persist between instances of the service class. But that still wouldn't be effective because your WCF service could be deployed to multiple servers, and then each one would have a separate instance of the class. Updating one wouldn't update the others. Or, if the service application restarted then the value would be lost. (From the context I assume that you're just experimenting with this.)
So ultimately you'd want some way to persist data that wouldn't depend on any of those factors, but would "survive" even when the instance of the service class goes out of scope or the application shuts down.
I'm about to start using Hangfire in C# in an ASP.NET MVC web application, and wonder how to create the right architecture.
We are going to use Hangfire as a message queue, so we can process (store in the database) the user data directly and then, for instance, notify other systems and send emails later in a separate process.
So our code now looks like this
void Xy(Client newClient)
{
_repository.save(newClient);
_crmConnector.notify(newClient);
mailer.Send(repository.GetMailInfo(), newClient);
}
And now we want to put the last two lines 'on the queue'
So following the example on the hangfire site we could do this
var client = new BackgroundJobClient();
client.Enqueue(() => _crmConnector.notify(newClient));
client.Enqueue(() => mailer.Send(repository.GetMailInfo(), newClient));
but I was wondering whether that is the right solution.
I once read about putting items on a queue and those were called 'commands', and they were classes especially created to wrap a task/command/thing-to-do and put it on a queue.
So for notifying the CRM connector this would then be:
client.Enqueue(() => new CrmNotifyCommand(newClient).Execute());
The CrmNotifyCommand would then receive the new client and have the knowledge to execute _crmConnector.notify(newClient).
In this case all items that are put on the queue (executed by HangFire) would be wrapped in a 'command'.
Such a command would then be a self-contained class which knows how to execute a piece of business functionality. When the command itself uses more than one other class it could also be known as a facade, I guess.
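A rough sketch of such a command might look like this (how the command obtains the CRM connector is up to you; CrmConnector below is a hypothetical concrete type):
public class CrmNotifyCommand
{
    private readonly Client _newClient;

    public CrmNotifyCommand(Client newClient)
    {
        _newClient = newClient;
    }

    public void Execute()
    {
        // The command encapsulates its own dependencies, so the caller
        // only needs to know about the command itself.
        var crmConnector = new CrmConnector(); // hypothetical; could come from a DI container
        crmConnector.notify(_newClient);
    }
}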
What do you think about such an architecture?
I once read about putting items on a queue and those were called 'commands', and they were classes especially created to wrap a task/command/thing-to-do and put it on a queue.
Yes, your intuition is correct.
You should encapsulate all dependencies and explicit functionality in a separate class, and tell Hangfire to simply execute a single method (or command).
Here is my example, that I derived from Blake Connally's Hangfire demo.
using System;
using System.ComponentModel;
using Hangfire;
using Hangfire.Server;

namespace HangfireDemo.Core.Demo
{
public interface IDemoService
{
void RunDemoTask(PerformContext context);
}
public class DemoService : IDemoService
{
[DisplayName("Data Gathering Task Confluence Page")]
public void RunDemoTask(PerformContext context)
{
Console.WriteLine("This is a task that ran from the demo service.");
BackgroundJob.ContinueJobWith(context.BackgroundJob.Id, () => NextJob());
}
public void NextJob()
{
Console.WriteLine("This is my next task.");
}
}
}
And then separately, to schedule that command, you'd write something like the following:
BackgroundJob.Enqueue("demo-job", () => this._demoService.RunDemoTask(null));
If you need further clarification, I encourage you to watch Blake Connally's Hangfire demo.
I have a WCF service (hosted in IIS) that is set up to use sessions. It seems to work. When Application_PostAcquireRequestState is called I have a session ID.
I end up using it like this (in my Global.asax):
if (Context.Handler is IRequiresSessionState)
{
log4net.ThreadContext.Properties["sessionId"] = Session.SessionID;
}
That seems to work fine. The value is stored off into my log4net property.
But when my service operation begins (my actual WCF service code) the log4net property is null again.
Since the property is stored per thread (ThreadContext), I can only assume that this means that the session is set up on one thread and then executed on another thread. Am I right?
Is there any way to get my log4net property set on the correct thread (without having to remember to make the above call at the start of every single service operation)?
Yes, IIS may use multiple threads to service multiple WCF requests. See http://msdn.microsoft.com/en-us/library/cc512374.aspx for more detail.
You might consider using different instances of a logger for each WCF request.
There are multiple scenarios where WCF might change threads on you:
The Global.asax thread is not guaranteed to be used for a service call (in fact it's unlikely).
If there are multiple calls during the same session, the thread may also change between calls to the same service instance.
In theory, state information like this should be stored in an OperationContext object. However, because log4net uses thread-local storage, that becomes an awkward solution.
Is there any way to get my log4net property set on the correct thread (without having to remember to make the above call at the start of every single service operation)?
Yes. Create a custom IOperationInvoker. The best example I know of is Carlos Figueira's blog. If you apply this as a service behavior your log4net property should always be defined for the service code.
One warning: when adding to thread-local storage, be sure to clean up. That's why log4net.ThreadContext.Stacks[].Push() returns an IDisposable. In other words, your Invoke method should look like this (incomplete and untested):
public object Invoke(object instance, object[] inputs, out object[] outputs)
{
using (log4net.ThreadContext.Stacks[key].Push(value))
{
return this.originalInvoker.Invoke(instance, inputs, out outputs);
}
}
See Carlos' blog to understand why you are calling the "originalInvoker". Note that if you want to support async operations that you need to implement additional methods.
Custom properties do not need to be strings. So you could store an instance of the following class in the global context:
public class SessionIdProperty
{
    public override string ToString()
    {
        // Error handling omitted: HttpContext.Current (or its Session)
        // may be null outside of an ASP.NET request.
        return System.Web.HttpContext.Current.Session.SessionID;
    }
}
This way log4net can access the Session object directly when it logs a message. Log4net calls the ToString() method on non-string properties.
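For example, set it once at application start-up (e.g. in Application_Start):
log4net.GlobalContext.Properties["sessionId"] = new SessionIdProperty();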
Our application calls external services like
//in client factory
FooServiceClient client = new FooServiceClient(binding, endpointAddress);
//in application code
client.BarMethod(); //or other methods
Is it possible to track all of these calls (e.g. by events or something like that) so that the application can collect statistics like the number of calls, response time, etc.? Note that my application itself needs to access the values, not only write them to a log file.
What I can think of is to create a subclass of the Visual Studio-generated FooServiceClient and then add code like this:
override void BarMethod()
{
RaiseStart("BarMethod");
base.BarMethod();
RaiseEnd("BarMethod);
}
and the RaiseStart and RaiseEnd method will raise events that will be listened by my code.
But this seems tedious (because there are a lot of methods to override), there is a lot of repeated code, my code needs to change every time the service contract changes, etc. Is there a simpler way to achieve this, for example by using reflection to create the subclass or by tapping into a built-in mechanism in WCF, if any?
The first thing I would look at is whether the counters available in your server's Performance Monitor can provide you with the kind of feedback you need. There are built-in counters for a variety of metrics for ServiceModel endpoints, operations and services. Here is some more info: http://msdn.microsoft.com/en-us/library/ms735098.aspx
You could try building an implementation of IClientMessageInspector, which has methods that are called before the request is sent and after the reply is received. You can inspect the message, make logs, etc. in these methods.
You provide an implementation of IEndpointBehavior which applies your message inspector, and then add the endpoint behavior to your proxy client instance.
client.Endpoint.Behaviors.Add(new MyEndpointBehavior());
Check out the docs for message inspectors and endpoint behaviors; there are many different ways of applying them (attributes, code, endpoint XML config). I can't remember off the top of my head which apply to which, as there are also IServiceBehavior and IContractBehavior. I do know for sure that endpoint behaviors can be added to the client proxy's behavior collection, though.
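As a rough sketch of that approach (the type names are placeholders, and the statistics gathering is left as a comment):
using System;
using System.Collections.Generic;
using System.ServiceModel;
using System.ServiceModel.Channels;
using System.ServiceModel.Description;
using System.ServiceModel.Dispatcher;

public class TimingMessageInspector : IClientMessageInspector
{
    public object BeforeSendRequest(ref Message request, IClientChannel channel)
    {
        // The returned object is handed back to AfterReceiveReply as the correlation state.
        return new KeyValuePair<string, DateTime>(request.Headers.Action, DateTime.UtcNow);
    }

    public void AfterReceiveReply(ref Message reply, object correlationState)
    {
        var state = (KeyValuePair<string, DateTime>)correlationState;
        TimeSpan elapsed = DateTime.UtcNow - state.Value;
        // Raise an event / update call-count and response-time statistics here.
    }
}

public class MyEndpointBehavior : IEndpointBehavior
{
    public void AddBindingParameters(ServiceEndpoint endpoint, BindingParameterCollection bindingParameters) { }

    public void ApplyClientBehavior(ServiceEndpoint endpoint, ClientRuntime clientRuntime)
    {
        clientRuntime.MessageInspectors.Add(new TimingMessageInspector());
    }

    public void ApplyDispatchBehavior(ServiceEndpoint endpoint, EndpointDispatcher endpointDispatcher) { }

    public void Validate(ServiceEndpoint endpoint) { }
}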
I found a simple way to do it by using a dynamic proxy, for example Castle's DynamicProxy.
Firstly, use a factory method to generate your client object
IFoo GetClient()
{
FooClient client = new FooClient(); //or new FooClient(binding, endpointAddress); if you want
ProxyGenerator pg = new ProxyGenerator();
return pg.CreateInterfaceProxyWithTarget<IFoo>(client, new WcfCallInterceptor());
}
And define the interceptor
internal class WcfCallInterceptor : IInterceptor
{
public void Intercept(IInvocation invocation)
{
try
{
RaiseStart(invocation.Method.Name);
invocation.Proceed();
}
finally
{
RaiseEnd(invocation.Method.Name);
}
}
//you can define your implementation for RaiseStart and RaiseEnd
}
I can also change the intercept method as I wish, for example I can add a catch block to call a different handler in case the method throw exception, etc.
I am enjoying creating and hosting WCF services.
Up until now I have been able to create services, defining contracts for the service and data (interfaces) and defining hosts and configuration options to reach them (endpoint specifications).
Well, consider this piece of code defining a service and using it (the endpoints are defined in app.config, which is not shown here):
[ServiceContract]
public interface IMyService {
[OperationContract]
string Operation1(int param1);
[OperationContract]
string Operation2(int param2);
}
public class MyService : IMyService {
public string Operation1(int param1) { ... }
public string Operation2(int param2) { ... }
}
public class Program {
public static void Main(string[] args) {
using (ServiceHost host = new ServiceHost(typeof(MyService))) {
host.Open();
...
host.Close();
}
}
}
Well, this structure is good when creating something that could be called a standalone service.
What if I needed my service to use objects belonging to a larger application?
For example, I need a service that does something based on a certain collection defined somewhere in my program (which is hosting the service). The service must look into this collection and search for and return a particular element.
The list I am talking about is a list managed by the program and edited and modified by it.
I have the following questions:
1) How can I build a service able to handle this list?
I know that a possible option is to use the overloaded ServiceHost constructor that accepts an object instance instead of a service Type.
So I could pass my list there. Is it good?
[ServiceContract]
public interface IMyService {
[OperationContract]
string Operation1(int param1);
[OperationContract]
string Operation2(int param2);
}
public class MyService : IMyService {
private List<> myinternallist;
public MyService(List<> mylist) {
// Constructing the service passing the list
}
public string Operation1(int param1) { ... }
public string Operation2(int param2) { ... }
}
public class Program {
public static void Main(string[] args) {
List<> thelist;
...
MyService S = new MyService(thelist)
using (ServiceHost host = new ServiceHost(S)) {
host.Open();
...
host.Close();
// Here my application creates functions and other components that manage the list. For this reason my application will edit the list (from a thread or from user-interface callbacks)
}
}
}
This example should clarify.
Is this a good way of doing it? Am I doing it right?
2) How to handle conflicts on this shared resource between my service and my application?
When my application runs, hosting the service, it can insert items into the list and delete them, and the service can do the same. Do I need a mutex? How should I handle this?
Please note that the concurrency issue concerns two actors: the main application and the service. It is true that the service is a singleton, but the application acts on the list too!
I assume that the service is called by an external entity; when this happens, the application is still running, right? Is there concurrency in this case?
Thank you
Regarding point 2, you can use Concurrent Collections to manage most of the thread safety required.
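For example, a ConcurrentDictionary could stand in for the shared list if the items can be keyed; the item type below is just a placeholder:
// Requires: using System.Collections.Concurrent;
// Shared between the main application and the service; adds, removes
// and lookups are thread safe without explicit locks.
ConcurrentDictionary<int, MyItem> items = new ConcurrentDictionary<int, MyItem>();

items.TryAdd(42, new MyItem());   // the application adds an element
MyItem found;
items.TryGetValue(42, out found); // the service searches for an element
MyItem removed;
items.TryRemove(42, out removed); // the application deletes an element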
I'm not sure what you mean by point 1. It sounds like you're describing basic polymorphism, but perhaps you could clarify with an example please?
EDIT: In response to comments you've made on Sixto's answer, consider using WCF's sessions. From what you've described it sounds to me like the WCF service should sit in a separate host application. The application you are currently using should have a service reference to the service and, using sessions, would be able to call an operation that mimics your requirement of instantiating the service with a list defined by the current client application.
Combine this with my comment on exposing operations that allow interaction with this list, and you'll be able to run multiple client machines working on session-stored lists.
Hope that's explained well enough.
Adding the constructor to MyService for passing the list will certainly work as you'd expect. Like I said in my comment on the question, however, the ServiceHost will only ever contain a single instance of the MyService class, so the list will not be shared because only one service instance will consume it.
I would look at a dependency injection (DI) container for WCF to do what you are trying to do. Let the DI container provide the singleton list instance to your services. Also, @Smudge202 is absolutely correct that the Concurrent Collection functionality is what you need to implement the list.
UPDATE based on the comments thread:
The DI approach works by getting all of an object's dependencies from the DI container instead of creating them manually in code. You register all the types that will be provided by the container as part of the application start-up. When the application (or WCF) needs a new object instance, it requests it from the container instead of "newing" it up. The Castle Windsor WCF integration library, for example, implements all the wiring needed to provide WCF a service instance from the container. This post explains the details of how to use the Microsoft Unity DI container with WCF if you want to roll your own WCF integration.
The shared list referenced in this question would be registered in the container as an already instantiated object from your application. When a WCF service instance is spun up from the DI container, all the constructor parameters will be provided including a reference to the shared list. There is a lot of information out there on dependency injection and inversion of control but this Martin Fowler article is a good place to start.
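Very roughly, the Windsor wiring might look something like this (MyItem and theSharedList are placeholders; the WcfFacility comes from the Castle WCF integration package mentioned above):
// At application start-up:
var container = new WindsorContainer();
container.AddFacility<WcfFacility>();
container.Register(
    // Register the already-created shared list as a singleton instance...
    Component.For<List<MyItem>>().Instance(theSharedList),
    // ...and let the container build service instances, injecting the list
    // through the MyService constructor shown in the question.
    Component.For<IMyService>().ImplementedBy<MyService>());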