WinForms application: Smart Client Disconnected Service Agent Application Block evolution - C#

I have a .NET Framework 4.0 WinForms application that relies on the Disconnected Service Agent Application Block (Smart Client) to call backend WCF services: https://www.microsoft.com/en-us/download/details.aspx?id=572
This pattern has a huge footprint in the application: every call to an agent is paired with its respective callbacks for the success result and for the exception result.
The thing is that I'm evaluating the possibility of moving this application to .NET 6, where this pattern is no longer supported. But changing it to a more traditional request/reply pattern would amount to almost rewriting the application just to keep it working.
Is there any alternative I can use in WinForms that allows me to achieve the same or a similar result?
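For illustration, here is a minimal sketch of one way to keep the agent-style surface (a call plus success/exception handlers) while moving the plumbing to Task-based calls, which .NET 6 does support. The IOrdersService contract, the Order type, and all method names are invented for the example; they are not part of the original application block.

```csharp
using System;
using System.Threading.Tasks;

// Hypothetical contract and DTO standing in for a real WCF/gRPC client.
public interface IOrdersService
{
    Task<Order[]> GetOrdersAsync(string customerId);
}

public class Order
{
    public int Id { get; set; }
}

public class OrdersAgent
{
    private readonly IOrdersService _service;

    public OrdersAgent(IOrdersService service)
    {
        _service = service;
    }

    // Old shape: the agent call registers one callback for success and one
    // for exceptions. New shape: await the call and let try/catch play the
    // role of the exception callback, so existing callers keep their handlers.
    public async Task LoadOrdersAsync(string customerId,
                                      Action<Order[]> onSuccess,
                                      Action<Exception> onException)
    {
        try
        {
            Order[] result = await _service.GetOrdersAsync(customerId);
            onSuccess(result);       // equivalent of the success callback
        }
        catch (Exception ex)
        {
            onException(ex);         // equivalent of the exception callback
        }
    }
}
```

When awaited from the UI thread, the continuation (and therefore both handlers) runs back on the WinForms synchronization context, so form updates stay on the right thread.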

Related

How can I run my WCF Service Library "in process"

I have a WCF CRUD REST API up and running in a Windows Service. All is well.
I'd like to offer the user the ability to run it in-process too; so, instead of running a service (which would require admin rights), I'd also like to have a plain library version.
With .NET (C#) how would I go about this? Right now I have:
ServiceLib (interesting code)
ConsoleHost
GUI
I'd like the GUI to be able to selectively run the ServiceLib code either as a full-fledged Windows service or just as in-process code. The service route already works, which I assume is the harder part.
If you run it in a Windows service now, it's already self-hosting, I assume? If the GUI app will be the only one using the service in the "in process" mode, you can split your "WCF CRUD REST API" ServiceLib into two parts - one implementing the CRUD part and a second implementing the WCF REST API on top of it. In the GUI you'll only need the CRUD part, so there's no need to bother with self-hosting the REST API in the same (and only) application that's going to call it.
Running a GUI app as a Windows service is usually a pretty bad idea anyway from the sysadmin's point of view. I often run console apps that can act either as WCF-hosting Windows services or just as WCF hosts, but their only GUI is the ability to react to Ctrl-C.
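For the case where the GUI should expose the same REST endpoint in-process (rather than just calling the CRUD classes directly, as suggested above), self-hosting is only a few lines. This is a minimal sketch assuming a hypothetical ICrudService/CrudService pair standing in for the real ServiceLib code; HTTP listening may still need a URL ACL or a port the current user is allowed to bind.

```csharp
using System;
using System.ServiceModel;
using System.ServiceModel.Web;

// Hypothetical contract/implementation standing in for the ServiceLib code.
[ServiceContract]
public interface ICrudService
{
    [OperationContract]
    [WebGet(UriTemplate = "items/{id}")]
    string GetItem(string id);
}

public class CrudService : ICrudService
{
    public string GetItem(string id)
    {
        return "item " + id;
    }
}

class InProcessHost
{
    static void Main()
    {
        // WebServiceHost wires up the webHttp (REST) behavior automatically,
        // so the same ServiceLib class the Windows service hosts can also be
        // hosted from the GUI/console process.
        using (var host = new WebServiceHost(
            typeof(CrudService), new Uri("http://localhost:8731/crud")))
        {
            host.Open();
            Console.WriteLine("In-process host listening; press Enter to stop.");
            Console.ReadLine();
        }
    }
}
```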

Windows C# job infrastructure

I'm looking for a better infrastructure setup for managing and deploying internally developed applications which are executed periodically.
The current setup grew into an unmonitorable, heterogeneous collection of applications that can only be executed directly on the scheduler VM.
Current situation:
Windows environment
A bunch of jobs written in PowerShell or as C# applications; some contain rather complex logic, some perform ETL operations
Jobs configured either as services or as console applications triggered by the default Windows Task Scheduler, running on a dedicated VM
Application-specific logging to log files (some applications)
Configuration via an app.config file for each C# console application
The Windows Task Scheduler doesn't provide a nice web GUI to watch and monitor job executions.
Ideal situation:
Central monitoring: overview of all jobs (when they ran, failures)
Trigger manually via a web frontend
Trigger job execution via an API, with the possibility to check whether execution succeeded.
Central job configuration (connection strings, configuration parameters)
Constraints:
No cloud: Due to internal restrictions the software has to reside inside our own network. Our company owns a sufficiently dimensioned server rack for hosting the required servers internally.
Considered Options
Azure WebJobs
From what I have read, this would be exactly the solution I'm looking for. Due to our "no cloud" policy, we'd need to host our own Azure Pack internally, which might require quite some effort to set up and is possibly technical overkill for these requirements.
Self-written Web-API Project
Another option would be to write a dedicated Web API project that contains all job functions, has one central configuration, exposes the job functions as Web API methods, and uses Quartz.NET for scheduling.
However, if possible I'd prefer to use some standard software, so I won't be responsible for maintaining yet another central piece of our infrastructure.
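For what it's worth, a rough sketch of that self-written option using the Quartz.NET 2.x-style API is below; the job class, the identities, and the cron expression are placeholders, not part of the original setup.

```csharp
using System;
using Quartz;
using Quartz.Impl;

// Hypothetical job wrapping one of the existing C# console jobs.
public class NightlyEtlJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // Call into the existing ETL logic here and report the outcome
        // to the central monitoring/logging store.
    }
}

class SchedulerBootstrap
{
    static void Main()
    {
        IScheduler scheduler = StdSchedulerFactory.GetDefaultScheduler();
        scheduler.Start();

        IJobDetail job = JobBuilder.Create<NightlyEtlJob>()
            .WithIdentity("nightly-etl")
            .Build();

        // Example schedule: every day at 02:30. A Web API controller could
        // call scheduler.TriggerJob(job.Key) for manual, on-demand runs.
        ITrigger trigger = TriggerBuilder.Create()
            .WithIdentity("nightly-etl-trigger")
            .WithCronSchedule("0 30 2 * * ?")
            .Build();

        scheduler.ScheduleJob(job, trigger);
        Console.WriteLine("Scheduler running; press Enter to exit.");
        Console.ReadLine();
        scheduler.Shutdown();
    }
}
```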
Which option would you choose? Or are there any better alternatives?
I think what you are looking for is so-called Master Data Management, so you should check out:
http://www.talend.com/
https://www.informatica.com/
http://www.tibco.com/

What service type should I use for a SOA solution to execute a long-running task?

I am developing a solution in .NET utilising the VMware Web Service API to create new, isolated, virtualised development environments.
The front end will be web based. Requestors input details for the specific environment, which will be persisted to SQL. So I need to write an engine of some sort to pull the data from SQL and work on the long-running task of creating resource pools and switch port groups, cloning existing VM templates, etc. As work progresses, events will be raised to write logs and update info back to SQL. This allows requestors to pull data back into a web page to see how it's progressing or whether it's completed.
The thing I am struggling with is how to engineer the engine that will execute the long-running task of creating the environment. I cannot write a Windows service (which I would like to) because we work in a very secure environment and it's not possible (group policy, etc.). I could write a web service to execute the tasks, extending the httpRuntime executionTimeout to allow the task to complete. But I'm keen to hear what you think may be a better solution (based on .NET 3.5). The solution needs to be service-oriented as we may be using it on other projects within our organisation. What about WF or WCF? I have not used any of the technologies released since .NET 2.0, as we've only just been approved to move up from .NET 2.0.
First of all, a Windows service isn't insecure. One of the devils of software development is discarding a solution out of ignorance, or for lack of investigation, requirements analysis, and collaborative decision-making.
Anyway, if you want to do it in a SOA way, WCF is going to be your best friend.
But maybe you can mix a WCF service hosted in Internet Information Services (IIS) with Microsoft Message Queuing (MSMQ).
A client calls a WCF service operation, which enqueues a message to some MSMQ queue. Later, a Windows scheduled task that runs periodically checks whether the queue has at least one incoming message and processes messages until the queue is empty.
Done this way, the IIS-hosted WCF service never needs an increased execution timeout and it'll serve more requests, while the heavy work of executing these long tasks is performed by a Windows scheduled task, for example a console application or a PowerShell cmdlet (or just a PowerShell script, why not?).
Check this old but useful MSDN article about MSMQ: http://msdn.microsoft.com/en-us/library/ms978430.aspx
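To make the hand-off concrete, here is a minimal sketch of the enqueue/drain pattern using System.Messaging; the queue path and the ProvisionRequest type are invented for the example, and a reference to System.Messaging.dll is required.

```csharp
using System;
using System.Messaging;

// Hypothetical request describing one environment to provision.
public class ProvisionRequest
{
    public string EnvironmentName { get; set; }
    public string TemplateId { get; set; }
}

public static class ProvisionQueue
{
    private const string QueuePath = @".\Private$\EnvironmentRequests"; // placeholder

    // Called from the WCF service operation: just enqueue and return quickly.
    public static void Enqueue(ProvisionRequest request)
    {
        if (!MessageQueue.Exists(QueuePath))
            MessageQueue.Create(QueuePath);

        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ProvisionRequest) });
            queue.Send(request, request.EnvironmentName);
        }
    }

    // Called from the scheduled console app: drain the queue and do the
    // long-running VMware work for each request.
    public static void Drain(Action<ProvisionRequest> process)
    {
        using (var queue = new MessageQueue(QueuePath))
        {
            queue.Formatter = new XmlMessageFormatter(new[] { typeof(ProvisionRequest) });
            while (true)
            {
                try
                {
                    // Zero timeout: stop as soon as the queue is empty.
                    Message message = queue.Receive(TimeSpan.Zero);
                    process((ProvisionRequest)message.Body);
                }
                catch (MessageQueueException e)
                {
                    if (e.MessageQueueErrorCode == MessageQueueErrorCode.IOTimeout)
                        break;
                    throw;
                }
            }
        }
    }
}
```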
I don't understand your comment regarding services being insecure; I think that is false.
However, I would look into creating something that uses a message bus/daemon.
RabbitMQ: http://www.rabbitmq.com/
MassTransit: http://masstransit-project.com/
NServiceBus: http://nservicebus.com/
Then you can basically use something like WCF, a console app, or a Windows service as your daemon.

When to use a web service over a Windows service?

I have a data loading application that has to be executed multiple times per day at irregular intervals. I am planning to write a service to kick off the downloads and import the data into a database server. Are there advantages to using a standard Windows service over a web service, or vice versa?
I think you're missing the point here.
Web services are typically used for communication or remote execution: you call a remote function on a web service either to adjust the behavior of the machine it's running on or to retrieve data from it.
Windows services are background processes that run on a machine without requiring any logged-on user. They can perform tasks while the user is at the login screen, or carry out elevated operations. You can talk to services to adjust their behavior or retrieve information, but their general purpose is different from that of a web service.
The most notable difference here is that web services must be called; they don't run on their own.
For your application I would suggest using a standard Windows service, as you can have it execute code on a schedule. I would only use a web service if you've got something else automating the calls to it and you require a response from the server detailing its execution result (success/fail/warning/etc.).
You could also consider writing a normal (Windows or console) application that is triggered by a Windows scheduled task. What you've described doesn't necessarily sound like something that would require a service.
Sounds like a good use of a Windows service to me. Off the top of my head, I'd use a Windows service if:
1. Work is performed on a scheduled basis (regular or irregular intervals) in the background;
2. No interaction is needed - work is just done in the background and kicks off based on polling or some other type of trigger (message dropped in queue, database value trigger, scheduled timespan, etc.);
3. It needs to be monitored (starts/stops along with logging) and you can take advantage of WMI, perfmon, and the event log with little effort.
A web service is better for tasks that are interactive (like if you wanted to initiate the download based upon a request received).
Sounds like a windows service is the approach you should take.
Hope this helps. Good luck!
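For reference, here is a bare-bones sketch of the polling-style Windows service described above, using ServiceBase and a timer; the service name, the 15-minute interval, and the RunImport body are placeholders for the real download/import logic.

```csharp
using System.ServiceProcess;
using System.Timers;

// Minimal polling service: starts a timer on OnStart and logs lifecycle
// events so it can be monitored via the Windows event log.
public class DataLoaderService : ServiceBase
{
    private readonly Timer _timer = new Timer(15 * 60 * 1000); // poll every 15 minutes

    public DataLoaderService()
    {
        ServiceName = "DataLoaderService"; // placeholder name
        _timer.Elapsed += delegate { RunImport(); };
    }

    protected override void OnStart(string[] args)
    {
        _timer.Start();
        EventLog.WriteEntry("Data loader started.");
    }

    protected override void OnStop()
    {
        _timer.Stop();
        EventLog.WriteEntry("Data loader stopped.");
    }

    private void RunImport()
    {
        // Kick off the download and import the data into the database here,
        // logging success/failure so problems show up in monitoring.
    }

    public static void Main()
    {
        ServiceBase.Run(new DataLoaderService());
    }
}
```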
If "irregular interval" does mean, the application is invoked by another application, I would use a web service.
If the action is scheduled, I would use a windows service.
If you are working with SQL Server (scheduled or not), I would also consider SQL Server Integration Services.

How to communicate with a Windows service from an application that interacts with the desktop?

With .NET, what is the best way to interact with a service (i.e., how do most tray apps communicate with their servers)? It would be preferred if this method were cross-platform as well (working in Mono, so I guess Remoting is out?).
Edit:
Forgot to mention: we still have to support Windows 2000 machines in the field, so WCF and anything above .NET 2.0 won't fly.
Be aware that if you are planning to eventually deploy on Windows Vista or Windows Server 2008, many ways that this can be done today will not work. This is because of the introduction of a new security feature called "Session 0 Isolation".
Most Windows services now run in Session 0 in order to properly isolate them from the rest of the system. An extension of this is that the first user to log in to the system is no longer placed in Session 0; they are placed in Session 1. Hence, the isolation will break code that does certain types of communication between services and desktop applications.
The best way to write code today that will work on Vista and Server 2008 going forward, when doing communication between services and applications, is to use a proper cross-process API like RPC, named pipes, etc. Do not use SendMessage/PostMessage, as that will fail under Session 0 Isolation.
http://www.microsoft.com/whdc/system/vista/services.mspx
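As a concrete illustration of the named-pipe route, here is a minimal sketch using System.IO.Pipes. Note that System.IO.Pipes only shipped with .NET 3.5, so it doesn't fit the asker's .NET 2.0/Mono constraint; the pipe name and the line-based message format are made up for the example.

```csharp
using System;
using System.IO;
using System.IO.Pipes;

class PipeServer
{
    // Runs inside the Windows service (Session 0); handles one request.
    static void Main()
    {
        using (var server = new NamedPipeServerStream("MyServicePipe", PipeDirection.InOut))
        {
            server.WaitForConnection();
            using (var reader = new StreamReader(server))
            using (var writer = new StreamWriter(server) { AutoFlush = true })
            {
                string request = reader.ReadLine();
                writer.WriteLine("ACK " + request);   // reply to the desktop app
            }
        }
    }
}

class PipeClient
{
    // Runs in the tray/desktop application (the user's session).
    public static void Send(string command)
    {
        using (var client = new NamedPipeClientStream(".", "MyServicePipe", PipeDirection.InOut))
        {
            client.Connect(5000);   // wait up to 5 seconds for the service
            using (var writer = new StreamWriter(client) { AutoFlush = true })
            using (var reader = new StreamReader(client))
            {
                writer.WriteLine(command);
                Console.WriteLine(reader.ReadLine());
            }
        }
    }
}
```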
Now, given your requirements, you are going to be in a bit of a pickle. For the cross-platform concerns, I'm not sure if Remoting would be supported. You may have to drop down and go all the way back to sockets: http://msdn.microsoft.com/en-us/library/system.net.sockets.aspx
If this is a tray app, and not a true service, be wary of how you set up your communications if using pipes or TCP/IP. If multiple users are logged into a machine (Citrix, Remote Desktop) and each user launches a tray-app "service", you can run into a situation where multiple processes try to use the same well-known port or pipe. Of course, this isn't a problem if you don't plan on supporting multiple users, or if you have a true service as opposed to a tray app that runs in each user's shell.
Have your service listen to 127.0.0.1 on a predefined port with a plain old TCP stream socket. Connect to that port from your desktop application.
It's dead simple and it's completely cross-platform.
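A minimal sketch of that loopback-socket approach, assuming a simple line-based text protocol and port 9000 (both made up for the example):

```csharp
using System;
using System.IO;
using System.Net;
using System.Net.Sockets;

class LoopbackServer
{
    // Runs inside the service: accepts one command per connection.
    static void Main()
    {
        var listener = new TcpListener(IPAddress.Loopback, 9000);
        listener.Start();
        while (true)
        {
            using (TcpClient client = listener.AcceptTcpClient())
            using (var reader = new StreamReader(client.GetStream()))
            using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
            {
                string command = reader.ReadLine();    // e.g. "STATUS"
                writer.WriteLine("OK " + command);     // reply to the desktop app
            }
        }
    }
}

class LoopbackClient
{
    // Runs in the desktop/tray application.
    public static void Query()
    {
        using (var client = new TcpClient("127.0.0.1", 9000))
        using (var writer = new StreamWriter(client.GetStream()) { AutoFlush = true })
        using (var reader = new StreamReader(client.GetStream()))
        {
            writer.WriteLine("STATUS");
            Console.WriteLine(reader.ReadLine());
        }
    }
}
```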
Did anyone here actually try Remoting with Mono? It works just fine. You might bump into some corner cases, but this is highly unlikely. Just test your application for cross-platform (MS .NET <-> Mono) Remoting from time to time to catch any possible glitches. And start with a recent Mono; 2.4.2 is current.
Remoting is an option, but it's not cross-platform. Some other ways are to use named pipes, IPC, or kernel events.
Funnily enough, I was going to suggest Remoting! The Mono 1.0 Release Notes (from archive.org, because the original location is missing) mention System.Runtime.Remoting.dll as a supported library and don't say anything about known issues.
If remoting is out then you probably have to implement your own TCP message framing protocol. Windows doesn't have an equivalent of UNIX-domain sockets for communication on the same machine.
Most services that have a GUI component are run as a named user and are allowed access to the desktop. This lets you access them via COM or .NET, but only locally (unless you want to get complicated).
Personally, I open an ordinary old socket on the service - it's cross-platform, allows multiple clients, allows any app to access it, doesn't rely on Windows security being opened up for it, and allows your GUI to be written in any language you like (as everything supports sockets).
For a tray app, you'd want a simple protocol to communicate - you might as well use a REST-style system to send commands to it, and stream XML (yuk) or a custom data format back.
