This is my first time dabbling in Windows services.
I have a service I would like to manage, and I would like to be able to connect to it via a command line / REPL of sorts to avoid the development time of building a user interface. I was thinking the communication could work much like attaching to an Asterisk daemon, or somewhat like connecting to a MySQL server, which to me seems like nothing more than a simple custom shell spawned to handle requests. However, I am always concerned about how efficient my code is and would like to stick to common practices. This will be connecting on the same local machine.
My proposed solution:
I believe I can open a simple network stream and build a simple read-eval-print loop (REPL) on top of it.
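For illustration, a minimal sketch of that REPL idea using TcpListener; the port and the HandleCommand dispatcher are hypothetical, not from any existing framework:

    using System;
    using System.IO;
    using System.Net;
    using System.Net.Sockets;

    // Minimal line-based REPL over TCP, bound to loopback only.
    class CommandListener
    {
        public static void Main()
        {
            var listener = new TcpListener(IPAddress.Loopback, 9000);
            listener.Start();
            while (true)
            {
                using (TcpClient client = listener.AcceptTcpClient())
                using (var stream = client.GetStream())
                using (var reader = new StreamReader(stream))
                using (var writer = new StreamWriter(stream) { AutoFlush = true })
                {
                    writer.WriteLine("service> connected");
                    string line;
                    while ((line = reader.ReadLine()) != null)   // Read
                    {
                        if (line == "quit") break;
                        writer.WriteLine(HandleCommand(line));   // Eval + Print
                    }
                }
            }
        }

        static string HandleCommand(string line)
        {
            // Placeholder: route the command into the real service logic here.
            return "ok: " + line;
        }
    }

You can poke at it with any telnet client against localhost:9000.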
Another option is to use WCF, but my question would then be: how efficient is that compared to handling the packets myself?
My question:
What are some standard practices for communicating with or managing services on the local machine?
I'm trying to learn more about service-oriented design; any resources that could help explain common practice models would be much appreciated.
Of course there are many ways to do this. The way I would recommend is to make sure you use log4net (or some other logging framework) and log the important information. Create the solution with three projects: the first is the "service logic" or business service; the second is the Windows service wrapper that starts that service; and the third is a console app that does much the same as the Windows service, only giving you the ability to interact as you wish. The advantage of the console logging appender is that you still get the console output without actually writing to the console yourself... it gives good separation.
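A minimal sketch of that three-project split (all class names here are hypothetical):

    // Project 1: the business service, with no knowledge of its host.
    public class BusinessService
    {
        public void Start() { /* start timers, listeners, etc. */ }
        public void Stop()  { /* tear everything down */ }
    }

    // Project 2: the Windows service wrapper around the same logic.
    public class BusinessWindowsService : System.ServiceProcess.ServiceBase
    {
        private readonly BusinessService _service = new BusinessService();
        protected override void OnStart(string[] args) { _service.Start(); }
        protected override void OnStop() { _service.Stop(); }
    }

    // Project 3: the console host, used for interactive development.
    public static class ConsoleHost
    {
        public static void Main()
        {
            var service = new BusinessService();
            service.Start();
            System.Console.WriteLine("Running. Press Enter to stop.");
            System.Console.ReadLine();
            service.Stop();
        }
    }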
I will give another option that I have used in the past, but would offer with caution. You can self-host a WCF service inside a Windows service. It gives a nice interface that gets away from the messy self-rolled TCP server approach. The caution is that, if done wrong, it can eat up lots of memory and CPU cycles.
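For reference, a minimal self-hosting sketch over named pipes; the contract, address, and class names are made up for illustration:

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IControlService
    {
        [OperationContract]
        string Execute(string command);
    }

    public class ControlService : IControlService
    {
        public string Execute(string command)
        {
            // Route the command into the service's real logic here.
            return "ok: " + command;
        }
    }

    public static class WcfHost
    {
        public static void Main()
        {
            // Named pipes are local-machine only, which fits this scenario.
            using (var host = new ServiceHost(typeof(ControlService),
                new Uri("net.pipe://localhost/MyService")))
            {
                host.AddServiceEndpoint(typeof(IControlService),
                    new NetNamedPipeBinding(), "control");
                host.Open();
                Console.WriteLine("Host is up. Press Enter to exit.");
                Console.ReadLine();
            }
        }
    }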
Related
Please forgive the newbie question. I've spent the last three hours researching this, and I can't quite find the right answer, or perhaps I just don't believe it's as simple as it looks.
I need to deploy an application such that an application on the server side does the heavy lifting, database-wise, and the client-side version is fairly lightweight.
I have built a Data Access Layer class library (or at least a dll) that does all the heavy lifting. I have built a Windows Forms application that could serve as the lightweight client. They see each other. They talk to each other. They work happily together.
I'm kind of hoping all I need to do is put the dll on the server, point the reference to it in the client, and all will be well. The dll will run its code on the server, using server resources, and the client will run on the client. It's what the various websites seem to suggest, but it looks too simple.
Do I need to configure something like remoting on the server? Do I need to use System.Runtime.Remoting for something? Or is it really as simple as it looks?
Again, please forgive so basic a question.
What you are trying to do is build a client/server application, where you have:
Client
Client domain
Server domain
DAL
Data
You will need to establish an HTTP connection between the client domain and the server domain. The common way to do this is using WCF; a minimal sketch follows the links below.
Explain the different tiers of 2 tier & 3 tier architecture?
http://www.codeproject.com/Tips/642296/Hello-World-Basic-Server-Client-Example-of-WCF
http://www.codeproject.com/Articles/14493/WCF-Basic-Client-Server
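To give a feel for the client side of such a setup (the contract and address below are illustrative, not taken from those articles), the client only needs the shared contract and the server's address:

    using System;
    using System.ServiceModel;

    // Shared contract, referenced by both client and server projects.
    [ServiceContract]
    public interface ICustomerService
    {
        [OperationContract]
        string GetCustomerName(int id);
    }

    public static class Client
    {
        public static void Main()
        {
            // The heavy lifting runs on the server; the client just calls the proxy.
            var factory = new ChannelFactory<ICustomerService>(
                new BasicHttpBinding(),
                new EndpointAddress("http://yourserver:8080/CustomerService"));
            ICustomerService proxy = factory.CreateChannel();
            Console.WriteLine(proxy.GetCustomerName(42));
            factory.Close();
        }
    }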
Oh, and welcome to Stack Overflow!
Well, you could use System.Runtime.Remoting, but that is a deprecated technology; I suggest using WCF for communication between the client and the server.
I have been developing n-tier applications using .NET for many years, but I still have no idea how to distribute the tiers/layers (DLLs) to other servers.
Let say, I have an MVC web application with 4 projects, i.e. MVC (UI), Business, Service and Data. Everything works fine if all class library dlls are in one server.
If I want to scale out the application by distributing the Service layer (DLL) and Data layer (DLL) to two other servers, should I convert the class libraries to WCF Service Library projects (with TCP or pipes as the communication protocol, for better performance)? Or should I use another technology like .NET Remoting or Web API?
Will that be a lot of work?
Is that one of the purpose of creating multi-tier application?
Thanks.
Update:
Do you have any links (from Microsoft) that explain in detail how to scale out an n-tier architecture application to multiple servers by distributing the DLLs?
If I want to scale out the application by distributing the Service layer (DLL) and Data layer (DLL) to two other servers, should I convert the class libraries to WCF Service Library projects (with TCP or pipes as the communication protocol, for better performance)?
Yep, since they are on different machines, you need some kind of communication mechanism that goes beyond simple DLL invocation.
Or should I use another technology like .NET Remoting or Web API?
Which approach you choose depends on many factors, like complexity, performance, and so on. There are many options, like:
WCF webservices
Simple REST calls with WebApi
a message bus, e.g. NServiceBus
...
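As a rough sketch of the Web API option (the controller and route here are hypothetical), the service layer would expose endpoints like this, and the MVC front end would call them over HTTP instead of referencing the DLL directly:

    using System.Web.Http;

    public class OrdersController : ApiController
    {
        // GET api/orders/5
        public string Get(int id)
        {
            // Delegate to the existing service-layer class here.
            return "order " + id;
        }
    }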
Obviously, remote calls will also be slower, with a potential impact on performance, etc.
Will that be a lot of work?
It will be more work and in my opinion that "more work" should really be justified. Keep your architecture as simple as possible or better, only as complex as really needed.
An alternative approach could be to have some deployment pipeline that deploys your entire application on different server instances and have some intelligent load balancing strategy. The only thing you need to pay attention to in that case is to properly share the sessions between your instances (stateless would be better ;) ).
My 50 cents...
As far as I know, WCF replaced .NET Remoting (MSDN).
Anyway... as someone said before me: if you don't have to scale the application, don't do it. The communication cost alone between services of any kind will slow things down considerably, probably to the extent that it would be slower than it is now (and speed, I am assuming, is the reason for scaling).
Prior to scaling, I would first see where the bottleneck really is. For instance, if the problem is your DB server, then moving the services and data layer to another server is useless, as you will still be using the same database. So, you need to find out first what your bottleneck is.
The easiest and least painful way to scale (in my opinion) would be to just add another IIS server and a load balancer that directs traffic to one or the other. You would need to store sessions in a database or use a dedicated session server, but that is about all the change you will need. Plus, if one of your servers fails, the other will still operate.
By default, avoid premature optimization.
If you have only a web site, I would keep it as simple as possible and create only logical layering. There are a number of options: a typical 3-tier design, onion architecture, etc. The key is that later, if really, really needed, you could still refactor your code and make your data layer a separate physical layer. But unless you are creating a new Amazon or something, this will probably not be the case.
If you are in the situation, for example, that you have a web site but also have to expose a web API, you could choose to have the web site consume the web API. In fact, your web site would then become a very thin layer (maybe not even using ASP.NET MVC), because most of the logic would live in the web API.
PS - .NET Remoting is old technology; consider WCF or Web API instead.
We are faced with the problem of maintaining lots of Windows services.
The idea is to reorganize the Windows services into class libraries and connect the libraries to one master Windows service. Is this a good idea? Any advice, please?
There is a framework for hosting "services" within a single Windows Service called TopShelf. You might want to consider using that. https://github.com/Topshelf/Topshelf
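For illustration, a minimal host along the lines of Topshelf's quick-start; the ReportingService class is a hypothetical stand-in for one of your services:

    using System;
    using Topshelf;

    // The logic you want hosted; Topshelf calls Start/Stop for you.
    public class ReportingService
    {
        public void Start() { Console.WriteLine("started"); }
        public void Stop()  { Console.WriteLine("stopped"); }
    }

    public static class Program
    {
        public static void Main()
        {
            HostFactory.Run(x =>
            {
                x.Service<ReportingService>(s =>
                {
                    s.ConstructUsing(name => new ReportingService());
                    s.WhenStarted(svc => svc.Start());
                    s.WhenStopped(svc => svc.Stop());
                });
                x.RunAsLocalSystem();
                x.SetServiceName("ReportingService");
                x.SetDescription("One of several small services in one host");
            });
        }
    }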
I am interpreting your question to be "We have tons of little Windows applications that run as services - how can we simplify them?".
In general, lots of smaller programs are better. Single monolithic applications are difficult to maintain and test; when someone needs to make a small change it can trigger catastrophic consequences for dozens of other components of the application. It can also make it impossible to change one small application without taking down the whole service, as Chris Knight comments above.
On the other hand, lots of small programs suffer from the breadth problem. You probably want to make sure all your little programs run on a consistent framework - e.g. they all log their results to the same place, they all use a standardized configuration system, and they are all managed in the same place.
I have seen situations where people write services because they need to run a task "when a particular condition happens", so they make it a constantly running service and continuously check for that condition. Is it possible that you could take some of your services and turn them into triggered launches of individual applications?
If this isn't the correct interpretation, please let me know :)
I created a Windows Forms executable in .NET 3.5 that uses a DLL to communicate with a machine that scans checks. I'm eventually going to need to move from an executable to a Web Form that can do the same thing. This will be months from now, but I wanted to start the research now, as I have not done this before. I'm going to need to use ActiveX in order to communicate with the device via a Web Form. I've also not done this before.
I'd like to keep the functionality of my existing executable without having to rewrite most of it, although I do understand that some of it will need to be rewritten. I've done research on ActiveX and how to use it, but I wanted to know if someone has had a similar situation as this. What did you do to convert an exe to a web program? Are there good, specific sources out there that I'm overlooking that can point me in the right direction for this situation? Is there any advice that you can give from your experiences that can help me to reduce mistakes? The company that I work for does not have anyone else here that has done this before, so I've got to teach myself everything needed to do this.
Thanks in advance.
This is where separation of concerns and n-tier design shine through. Hopefully your UI layer is loosely coupled from your domain model. If this is the case, you can code a second UI layer for the web and not have to change your domain model at all. Then you can compile for each scenario.
*note - In practical use I have always had to extend my business domain to account for some issues with the second UI, but those modifications have usually been minor, and have pointed out places where I had coupled too tightly anyway.
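As a sketch of what that loose coupling looks like (all names here are illustrative):

    // Domain layer: no UI dependencies at all.
    public interface ICheckScanner
    {
        ScanResult Scan();
    }

    public class ScanResult
    {
        public string AccountNumber { get; set; }
        public decimal Amount { get; set; }
    }

    // Both the WinForms UI and the web UI depend only on the interface,
    // so the domain model compiles unchanged for each front end.
    public class ScanPagePresenter
    {
        private readonly ICheckScanner _scanner;
        public ScanPagePresenter(ICheckScanner scanner) { _scanner = scanner; }

        public string Describe()
        {
            ScanResult result = _scanner.Scan();
            return result.AccountNumber + ": " + result.Amount;
        }
    }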
Another option you may consider is creating a web services layer over your business domain code, and then coding a web application that communicates with your domain model via those web service calls. This may have performance implications, and it would not be my preferred method of accomplishing this, though you may find it more manageable if you don't have a well-designed application to start with.
"I'd like to keep the functionality of my existing executable without having to rewrite most of it"
In general, if you extract as much logic as possible into its own assembly/DLL, you can reuse that from whatever UI framework you want. Just make sure you're not doing anything UI-specific in there (throwing up dialog boxes, etc.).
Normally, converting WinForms to WebForms is quite possible, although it's typically a slow development process. Even if you've got the cleanest domain layer in the world, the fact that the objects in your web page are thrown away on every request means that a web domain layer is normally written very differently from a desktop domain layer.
However, in your case the device-to-server communication is going to be extra difficult.
Have you looked at XBAP? It's basically a way to deploy WPF applications into a web page. It requires your clients to have the right version of .NET installed, but it's going to be the easiest path for you, especially considering that you can host WinForms in WPF...
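On that last point, hosting an existing WinForms control inside WPF takes only a few lines via WindowsFormsHost; ExistingScannerControl below is a placeholder for your current WinForms UserControl:

    using System.Windows;
    using System.Windows.Forms.Integration;

    public class ScannerWindow : Window
    {
        public ScannerWindow()
        {
            // Wrap the existing WinForms control so the WPF shell can show it.
            var host = new WindowsFormsHost
            {
                Child = new ExistingScannerControl() // hypothetical WinForms control
            };
            Content = host;
        }
    }

Note that hosting WinForms content from an XBAP requires the application to run with full trust.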
You may take a look at Silverlight 4,
http://silverlight.net/getstarted/silverlight-4-beta/
It contains many features that ASP.NET Web Forms doesn't have.
If your team can accept something like ActiveX, why not Silverlight 4? The only disadvantage is that SL4 is still in beta.
I've been asked to research approaches to deal with an app we're supposed to be building. This app, hypothetically a Windows Forms application written in C#, will issue commands directly to the server if it's connected; but if the app is offline, the state must be maintained as if it were connected, and the app must then sync up and issue the data changes/commands to the server once it is reconnected.
I'm not sure where to start looking. This is something akin to Google Gears, but I don't think I have that option if we go a Winform route (which looks likely, given that there are other functions the application needs that a web app couldn't perform). Is the Microsoft Sync framework a viable option? Does Silverlight do anything like this? Any other options? I've Googled around a bit but would like the community input on what's best given the scenario.
The Microsoft Sync Framework definitely supports the scenario you describe, although I would say that it's fairly complicated to get it working.
One thing to understand about the Sync Framework is that it's really two quite distinct frameworks shipping in the same package:
Sync Framework
ADO.NET Sync services v. 2
The ADO.NET Sync services are by far the easiest to set up, but they are constrained to synchronizing two relational data stores (although you can set up a web service as a remote facade between the two).
The core Sync Framework has no such limitations, but is far more complex to implement. When I used it about six months ago, I found that the best source to learn from was the SDK, and particularly the File/Folder sync sample code.
As far as I could tell, there was little to no sharing of code and types between the two 'frameworks', so you will have to pick one or the other.
In either case, there are no constraints on how you host the sync code, so Windows Forms is just one option among many.
If I understand correctly, this doesn't sound like an actual data synchronization issue to me, where you want to keep two databases in sync. It sounds more like you want a reliable mechanism for a client to call functions on a server in an environment where the connection is unstable, and if the connection is not present at the time, you want the function called as soon as the connection is back up.
If my understanding is right, this is one option. If not, this will probably not be helpful.
This is a very short answer to an in-depth problem, but we had a similar situation and this is how we handled it.
We have a client application that needs to monitor some data on a PC in a store. When certain events happen, this client application needs to update our server in the corporate offices, preferably in real time. However, the connection is not 100% reliable, so we needed a similar mechanism.
We solved this by trying to write to the server via a web service. If there is an error calling the web service, the command is serialized as an XML file in a folder named "waiting to upload".
We have a routine running in our client app on a timer set for every n minutes. When the timer elapses, it checks for XML files in this folder. If any are found, it attempts to call the web service using the information saved in the file, and so on until it is successful. Upon a successful call, the XML file is deleted.
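A compressed sketch of that store-and-forward pattern (the folder path and web-service call are placeholders):

    using System;
    using System.IO;

    public static class OutboxRetry
    {
        const string PendingFolder = @"C:\App\waiting-to-upload";

        // Called whenever the client wants to notify the server.
        public static void Send(string commandXml)
        {
            try
            {
                CallWebService(commandXml);
            }
            catch (Exception)
            {
                // Server unreachable: park the command on disk for the timer.
                Directory.CreateDirectory(PendingFolder);
                File.WriteAllText(
                    Path.Combine(PendingFolder, Guid.NewGuid() + ".xml"),
                    commandXml);
            }
        }

        // Wired to a timer that fires every n minutes.
        public static void RetryPending()
        {
            if (!Directory.Exists(PendingFolder)) return;
            foreach (string file in Directory.GetFiles(PendingFolder, "*.xml"))
            {
                try
                {
                    CallWebService(File.ReadAllText(file));
                    File.Delete(file); // delete only after a successful call
                }
                catch (Exception)
                {
                    // Still offline; leave the file for the next tick.
                }
            }
        }

        static void CallWebService(string commandXml)
        {
            // Placeholder for the real web-service proxy call.
        }
    }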
It sounds hack-ish, but it was simple to code and has worked flawlessly for five years now. It's actually been our most trouble-free application all around, and we've implemented the pattern elsewhere successfully.