Set service dependencies after install - C#

I have an application that runs as a Windows service. It stores various settings in a database that are looked up when the service starts. I built the service to support several types of databases (SQL Server, Oracle, MySQL, etc.). End users often choose to configure the software to use SQL Server (they can simply modify a config file with the connection string and restart the service). The problem is that when their machine boots, SQL Server is often started after my service, so my service errors out on startup because it can't connect to the database. I know that I can specify dependencies for my service to guide the Windows service manager to start the appropriate services before mine. However, I don't know which services to depend upon at install time (when my service is registered), since the user can change databases later on.
So my question is: is there a way for the user to manually indicate the service dependencies based on the database that they are using? If not, what is the proper design approach I should be taking? I've thought about waiting 30 seconds after my service starts before connecting to the database, but that seems really flaky for various reasons. I've also considered connecting to the database "lazily"; the problem is that I need a connection immediately upon startup, since the database contains various pieces of vital information that my service needs when it first starts. Any ideas?

Dennis
What you're looking for is SC.exe. This is a command-line tool that users can use to configure services:
sc [ServerName] command ServiceName [OptionName= OptionValue...]
More specifically, you would want to use:
sc [ServerName] config ServiceName depend= ServiceToDependOn
Here is a link to the command-line options for SC.exe:
http://msdn.microsoft.com/en-us/library/ms810435.aspx
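For example, a user who has switched to the default SQL Server instance could declare the dependency like this (the service names here are assumptions; the actual name can be checked in services.msc):
sc config MyService depend= MSSQLSERVER
Multiple dependencies are separated by forward slashes, e.g. depend= MSSQLSERVER/Tcpip, and note that the space after "depend=" is required by SC.exe.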

A possible (far from ideal) code solution:
In your startup method, code it as a loop that terminates when you've got a connection. Inside that loop, trap any database connection errors and keep retrying, as the following pseudo-code illustrates:
bool connected = false;
while (!connected)
{
    try
    {
        connected = openDatabase(...);
    }
    catch (DbException)
    {
        // It might be worth waiting for some time here, e.g.
        // Thread.Sleep(TimeSpan.FromSeconds(5)), so the database
        // server isn't hammered with back-to-back retries.
    }
}
This means that your program doesn't continue until it has a connection. However, it could also mean that your program never gets out of this loop, so you'd need some way of terminating it - either manually or after a certain number of tries.
As you need your service to start in a reasonable time, this code can't go in the main initialisation. You have to arrange for your service to "start" successfully but not do any processing until this method has returned connected = true. You could achieve this by putting the code in a background thread and then starting your actual application code on the "thread completed" event.
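A minimal sketch of that pattern, assuming a standard ServiceBase-derived service (OpenDatabase and RunService are hypothetical placeholders):
using System;
using System.ServiceProcess;
using System.Threading;

public class MyService : ServiceBase
{
    protected override void OnStart(string[] args)
    {
        // Return from OnStart quickly so the SCM reports a successful
        // start; the real work waits for the database on a background thread.
        var worker = new Thread(() =>
        {
            while (!OpenDatabase())                      // hypothetical connect attempt
            {
                Thread.Sleep(TimeSpan.FromSeconds(5));   // back off between retries
            }
            RunService();                                // hypothetical main processing
        });
        worker.IsBackground = true;
        worker.Start();
    }

    private bool OpenDatabase() { /* hypothetical */ return true; }
    private void RunService() { /* hypothetical */ }
}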

Not a direct answer, but some points you can look into:
A Windows service can be started automatically with a delay. You can check this question on SO for some information about it:
How to make Windows Service start as "Automatic (Delayed Start)"
Check this post: How to: Code Service Dependencies (see the sketch below)
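Expanding on that second point, here is a minimal sketch of declaring the dependency at install time via ServiceInstaller.ServicesDependedOn (the installer class and service names are assumptions):
using System.ComponentModel;
using System.Configuration.Install;
using System.ServiceProcess;

[RunInstaller(true)]
public class MyServiceInstaller : Installer
{
    public MyServiceInstaller()
    {
        var process = new ServiceProcessInstaller { Account = ServiceAccount.LocalSystem };
        var service = new ServiceInstaller
        {
            ServiceName = "MyService",
            StartType = ServiceStartMode.Automatic,
            // Ask the service control manager to start SQL Server first
            // (service name assumed; it differs for named instances).
            ServicesDependedOn = new[] { "MSSQLSERVER" }
        };
        Installers.Add(process);
        Installers.Add(service);
    }
}
Since the user can change databases after install, this only covers the initial configuration; the sc config command shown earlier remains the way to change the dependency later.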

Related

OptimisticConcurrencyException: Multiple EF based applications using shared AppFabric cache and same database

I am using a web application and a Windows service on the same machine as AppFabric.
Both applications reuse the same DAL code (a DLL), which is EF (Entity Framework) Code-First based, and both access the same cache in AppFabric. The code in the Windows service is implemented as a job within Quartz.Net.
The web application has to support multiple requests, of course, and the Windows service multiple threads (scheduler and events).
For both, the shared DAL DLL creates a DbContext object per HTTP session plus thread ContextID, or just thread ContextID for the latter. The DAL uses the EFCachingProviders from here. Also, my EF solution uses optimistic concurrency with a timestamp column and IsRowVersion in the mapping.
As stated here, the benefit of having a second-level cache is to have access to a representation of the original state across processes! But that does not seem to work for me; I get an 'OptimisticConcurrencyException' in my use case, as follows:
Restart cache cluster, restart Windows service, restart IIS -> clean slate :)
Using the web app (Firefox), I insert a new object A with a reference to existing object B. I can see the new row in the database. All OK.
Using the web app in another browser (Chrome) = new session, I can see the new object.
Next, the Windows service tries to do some background processing and tries to update object B. This results in an 'OptimisticConcurrencyException'. Apparently the process in the Windows service is holding a version of object B with a dated rowversion.
If I restart the Windows service, it tries the same logic again and works with no exception...
So both applications are multithreaded, use the same DAL code, connect to the same database, and use the same cache cluster and the same cache. I would expect the update and insert to be in the AppFabric cache. I would expect the EF context of the Windows service to use the newest information. Somehow, it seems, its first-level cache is holding on to old information... or something else is going wrong.
Please advise...
Update
OK, after digging around, I fixed the update problem in my Windows service. Each Manager object that queries the DAL uses a DbContext bound to its process ID + thread ID. So in the Execute function of my Quartz job, all Managers (of different object types) should share the same DbContext, which is created by the first Manager.
The problem was that after the function finished, the DbContext was not disposed (which happens automatically in the HTTP-session-based DbContext manager). So the next time the job executed, the same DbContext was found and used, which by that time was dated already (old first-level cache???). The second-level cache should not be a problem, because that is shared and SHOULD contain the newest objects... if any.
So this part is fixed.
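A minimal sketch of that per-job disposal, assuming a Quartz.Net 2.x-style IJob and a hypothetical per-thread DbContext manager:
using Quartz;

public class BackgroundJob : IJob
{
    public void Execute(IJobExecutionContext context)
    {
        // All Managers running on this thread resolve to the same DbContext.
        var dbContext = DbContextManager.GetForCurrentThread(); // hypothetical helper
        try
        {
            RunManagers(); // hypothetical: the actual background processing
        }
        finally
        {
            // Dispose so the next run starts with a fresh first-level cache
            // instead of reusing stale tracked entities.
            dbContext.Dispose();
            DbContextManager.RemoveForCurrentThread(); // hypothetical helper
        }
    }

    private void RunManagers() { /* hypothetical */ }
}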
New problem
So the web app creates a new object A and updates an existing object B; the Windows service now works and is able to update the existing (changed) object B with no problem.
Problem:
When I do a refresh of the web app, it does not see the changes (made by the Windows service) to object B...
So if the web app changed a count to 5, and 10 minutes later the Windows service changes that count to 6, and I open the web app in the same or a new window/browser, I still see 5, not 6!
A restart of the web app (IIS) does not help; an iisreset doesn't either.
When I do Restart-CacheCluster... it works and shows 6...
So it looks like the item is in the cache. The Windows service updates it, but does not invalidate the item, which is old and still used by the web app...
Or... although it's the same object, the web app has its own entry in the cache and the Windows service has its own entry (which does get invalidated)...
Which one?
Solution
I solved this myself. The EF wrapper uses the query string as the key to store items in the cache, it seems. So two different queries (no matter whether they originate from two different applications sharing the same distributed cache or from the same application) referencing the same data in the database will have different keys (different query strings) and thus different places in the cache. Perhaps it's not quite this black-and-white, but it's something like this...
I don't think any algorithm is used internally to check whether a query touches existing cached objects.
This causes my problem, where my Windows service does an update and the web app still sees the old value from the cache, which could only be fixed by running the Restart-CacheCluster command.
So here is how I fixed it:
My Windows service is a batch job triggered by the Quartz scheduler. After it is done, I clear the whole cache:
private void InvalidateCache()
{
    try
    {
        DataCache myCache = ...
        // Clear every region so the web app cannot read stale entries.
        foreach (String region in myCache.GetSystemRegions())
        {
            myCache.ClearRegion(region);
        }
    }
    catch (Exception ex)
    {
        eventLog.WriteEntry("InvalidateCache exception : " + ex.Message);
    }
}
I don't have an answer, but I hope the thoughts below might point you into the right direction.
If this is only an issue on updates, I would read a fresh instance of the record from the database on every update, and update that. This avoids optimistic concurrency errors. Note that the DbContext is not thread-safe - I don't know if this is causing the issue, but reading fresh every time would sidestep it.
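A minimal sketch of that read-fresh-then-update idea (the context, entity, and property names are hypothetical):
using (var ctx = new MyDbContext())
{
    // Load the current row, including its latest rowversion/timestamp.
    var b = ctx.BObjects.Single(x => x.Id == targetId);

    // Apply the change to the fresh instance; the optimistic concurrency
    // check on SaveChanges now compares against the current timestamp.
    b.Count = newCount;
    ctx.SaveChanges();
}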
If you are having this issue on reads, then you would have to track down where the various caches are and which one is not getting updated and why. I am guessing there are various configuration options for caching at each point of usage. Good luck with that.... :)

Programmatically restarting IIS Hosted WCF service - change Bindings issue

I've got into this not-very-good situation... When the web application starts, I set up different routes for my services, so that tenants of my multi-user app connect to:
private static void RegisterRoutes()
{
    // Set up URLs for each customer
    using (var cmc = new CoreModelContext())
    {
        foreach (var account in cmc.Accounts.Where(aa => aa.IsActive).ToList())
        {
            RouteTable.Routes.Add(
                new ServiceRoute(account.AccountId + "/mobile",
                    new MyServiceHostFactory(), typeof(MobileService)));
        }
    }
}
So, when my site/service starts, it grabs all accounts from the database and sets up the routes.
This is a single point of failure right there. Sometimes the servers are rebooted in the wrong order, and if SQL Server hasn't started, this service starts in a "weird" mode.
Today the web service stopped responding. I checked the logs - IIS had recycled the pool as scheduled (default settings) and started a different worker process. Something didn't click, and boom - the server stopped responding. The routes weren't registered...
So, my question is: how do I best fix this? I could put the routes in a config file, but that would mean maintaining those IDs in two places. That's probably not so bad, but I'd rather do it differently if possible.
Is it possible to programmatically try and restart the pool? What happens when an exception is thrown in Application_Start? Right now I'm not trapping it.
Not sure if this is a "fix", but when we've had similar dependency issues, we make sure the dependent components cannot successfully start in "weird" mode. In this case, I would bring the app down hard if SQL Server isn't available, at least in production. Far better to have nothing being processed than to have things being processed wrong.
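A minimal sketch of that fail-fast idea, reusing CoreModelContext from the question and assuming it is an EF 4.1+ DbContext:
protected void Application_Start()
{
    // Probe the database first: if SQL Server is unreachable this throws,
    // so the app refuses to start instead of coming up with no routes.
    using (var cmc = new CoreModelContext())
    {
        cmc.Database.Connection.Open();
    }
    RegisterRoutes();
}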

Would like to execute a query once per day in ASP.NET MVC

I would like my ASP.NET MVC app to execute a query once per day. What is the recommended way to do this?
My first thought is to put a timer in Global.asax that goes off every 24 hours, then call my query from the Elapsed handler. Any pitfalls with doing it this way? Is there a better way?
Edit
Let me add a little detail to what I'm trying to do. I'd specifically like the query to execute at midnight every day. If a day is missed (say, due to server maintenance or upgrading the app), that wouldn't be a major issue.
Edit 2
Couple more details:
The query is actually an INSERT, not a SELECT. The purpose is to add a "renewal" record for any member who is due to renew his/her membership at the end of the month.
I'm using SQL Server Compact (it's a very small database).
Does it have to originate in the web layer? Who'd be there to consume the HTML? Typically, periodic SQL queries are scheduled within the database - in the case of MS SQL Server, via the SQL Agent job facility. SQL Server can even send e-mail.
RE: edit 2: You should've said so right away. SQL Server Compact is not the same as SQL Server - for one, it does not have SQL Agent, IIRC. Still, invoking the web layer is overkill. I'd use a Windows Script Host file (.js) in conjunction with the Windows Task Scheduler. WSH files can connect to databases via ADO and do whatever they want - inserts, selects, anything.
To detect missed scheduled runs, introduce an extra table with a log of scheduled runs. Then, on subsequent runs, you can analyse the date of the last run and act accordingly.
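A minimal sketch of that catch-up check (the RunLog table and helper methods are hypothetical):
// On each run (or on app start), compare today against the last logged run.
DateTime lastRun = GetLastRunDate();      // e.g. SELECT MAX(RunDate) FROM RunLog
if (lastRun.Date < DateTime.Today)
{
    InsertRenewalRecords();               // the daily INSERT from the question
    LogRun(DateTime.Now);                 // INSERT INTO RunLog (RunDate) VALUES (...)
}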
Edit 2: So, no administrative access. You should really mention all those details in the question. In this case, I would go through the web layer after all, but the scheduling would be on MY end - where I do have control. Run Task Scheduler on your end and have it invoke an HTTP URL on the server. To invoke URLs, you can use something like the free cURL utility. Running IE on a schedule has the disadvantage of leaving the window open.
IIS is not a scheduling engine.
Edit 3, re: comment: Sorry, I misunderstood the nature of your setup; my own experiences have clouded my judgement :) Can you just run a check during every logon operation and, if it's been a while since the last maintenance operation, run it right then and there? How long does the maintenance take? If it's ~1 min+, it makes sense to run it in a worker thread so that the logging-on user is not made to wait.
Scheduling daily maintenance is a good idea in general, and it is implemented fairly often, but it seems you simply don't have the capability.
I do this very thing in my web apps, but using asynchronous HTTP handlers (http://msdn.microsoft.com/en-us/library/ms227433.aspx#Y512); I believe this would be the recommended approach. I just start it off on application start and shut it down on application end (Global.asax).
The thing to remember is that you'll probably have to store the last time the query ran in the database, because you'll lose track of that when your application pool recycles.
I'm doing this by putting a placeholder entry in the "Cache" with the expiry period I want, handling the "_onCacheRemove" event to do whatever I want to do, and then recreating the "CacheItem" again:
e.g.
I put my tasks in an enum, with each value being the interval (in seconds) at which I want that task to rerun:
public enum ScheduledTasks
{
    CleanGameRequests = 120,
    CleanUpOnlineUsers = 6
}
Then I deal with them in "Application_Start":
protected void Application_Start()
{
    AreaRegistration.RegisterAllAreas();
    RegisterRoutes(RouteTable.Routes);
    // Add the tasks I want at App_Start, so if the application
    // restarts, my tasks will be refreshed.
    AddTask(ScheduledTasks.CleanGameRequests);
    AddTask(ScheduledTasks.CleanUpOnlineUsers);
}

// Callback to handle the cache-removal event
private static CacheItemRemovedCallback _onCacheRemove;

private void AddTask(ScheduledTasks task)
{
    _onCacheRemove = new CacheItemRemovedCallback(CacheItemRemoved);
    HttpRuntime.Cache.Insert(task.ToString(), (int)task, null,
        DateTime.Now.AddSeconds((int)task), Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable, _onCacheRemove);
}
public void CacheItemRemoved(string key, object time, CacheItemRemovedReason r)
{
    var task = (ScheduledTasks)Enum.Parse(typeof(ScheduledTasks), key);
    switch (task)
    {
        case ScheduledTasks.CleanGameRequests:
            // Do whatever this task needs to do.
            GameRequest.CleanUp();
            break;
        case ScheduledTasks.CleanUpOnlineUsers:
            OnlineUsers.CleanUp();
            break;
        default:
            break;
    }
    // Don't forget to recreate the "CacheItem" again.
    AddTask(task);
}
Note: You can manage the timing however you want. In my case, I want these tasks to run every fixed period of time, regardless of what time it is. In your case, you should check the time first and only then recreate the CacheItem.
Hope this helped :)
Unless you have a very active site, chances are that IIS will bring your application down and there will be no process left to execute your task.
Alternatives:
just run the query during, or immediately after, a request that arrives close enough to the scheduled time
have an external task trigger the operation on your site via GET/POST
reconfigure IIS to never recycle/stop your app pool - then your timer has a chance to execute
use some external service on the server to schedule the task ("at" or even SQL tasks)

C# program parameters from the command line?

I'm trying to start a C# program running, and then give it commands from cmd.exe after it has started. For instance, suppose I started my .exe from the command line (C://FILEPATH/my_program.exe). I'd then like that program to continue running, and to be able to pass it commands that it is capable of handling. In my ideal world this would be something like "C://FILEPATH/my_program.exe run_my_command()", which would execute the run_my_command function, or "C://FILEPATH/my_program.exe k", which would do something in response to the char k that I'd pre-programmed in. I know that, as I've typed it, this would start a new copy of my_program.exe; I'd like to have only one running while I pass commands like that in.
Does anyone know how to do this? Sample code would be wonderfully appreciated. Thanks!!
The simplest solution would be for your second instance of "my_program.exe" to look for an existing instance that's already running, "pass" the message over to it and then exit immediately.
The usual way this is implemented is via named pipes (System.IO.Pipes in .NET 3.5+). When your program starts up, listen on a named pipe with a given name. If there's something else already listening on that pipe, send the message to it and exit.
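A minimal sketch of that pattern (the pipe name and message format are arbitrary choices for illustration):
using System;
using System.IO;
using System.IO.Pipes;

static class SingleInstance
{
    const string PipeName = "my_program_pipe"; // assumed name

    // Returns true if an existing instance accepted the command.
    public static bool TrySendToExistingInstance(string command)
    {
        try
        {
            using (var client = new NamedPipeClientStream(".", PipeName, PipeDirection.Out))
            {
                client.Connect(200); // fails fast if nobody is listening
                using (var writer = new StreamWriter(client))
                {
                    writer.Write(command);
                }
                return true;
            }
        }
        catch (TimeoutException)
        {
            return false; // no existing instance: this process becomes the server
        }
    }
}
If TrySendToExistingInstance returns false, the program would then create a NamedPipeServerStream with the same name and loop, reading commands from it.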
You are describing a typical service-and-command-tool setup. The service (daemon) runs in the background and executes commands. The command tool takes user commands and passes them to the service. See Windows Service Applications. Having a service instead of starting several processes takes care of some issues your approach has, like security isolation between processes (e.g. one user starts a command, another user starts a different command, and it gets executed in the context of the first user) and process-lifetime issues (a user launches a command and then closes his session).
The command tool would communicate with the service via classic IPC (local RPC, pipes, shared memory, etc.).

Custom API requirement

We are currently working on an API for an existing system.
It basically wraps some web-requests as an easy-to-use library that 3rd party companies should be able to use with our product.
As part of the API, there is an event mechanism where the server can call back to the client via a constantly-running socket connection.
To minimize load on the server, we want to have only one connection per computer. Currently a socket is opened per process, and that could eventually cause load problems if multiple applications were using the API.
So my question is: if we want to deploy our API as a single standalone assembly, what is the best way to fix our problem?
A couple options we thought of:
Write an out-of-process COM object (I don't know if that works in .NET)
Include a second .exe file that would be required for events; it would have to single-instance itself and open a named pipe or something to communicate across multiple processes
Extract this exe file from an embedded resource and execute it
None of those really seem ideal.
Any better ideas?
Do you mean something like Net.TCP port sharing?
You could fix the client-side port when opening your socket, say to 45534. Since a port can be opened by only one process, only one process at a time would be able to open a socket connection to the server.
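A minimal sketch of that fixed-local-port idea (the server host and port are placeholders):
using System.Net;
using System.Net.Sockets;

// Bind the outgoing socket to a fixed local port. A second process on the
// same machine gets a SocketException ("address already in use") and knows
// another instance already owns the server connection.
var localEndPoint = new IPEndPoint(IPAddress.Any, 45534);
var client = new TcpClient(localEndPoint);       // throws if the port is taken
client.Connect("events.example.com", 9000);      // hypothetical server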
Well, there are many ways to solve this, as expressed in all the answers and comments, but perhaps the simplest is to keep a global status in a store accessible to all users of the current machine (several users might be logged in at once), recording WHO currently has the right to hold the connection - something like a "lock", as it used to be called. That store can be a field in a local or intranet database, a simple file, or whatever. That way you don't need to build or distribute extra binaries.
When a client connects to your server, you create a new thread to handle it (not a process). You can store its IP address in a static dictionary (shared between all threads).
Something like:
static Dictionary<string, TcpClient> clients = new Dictionary<string, TcpClient>();

// This method is executed in a thread
void ProcessRequest(TcpClient client)
{
    string ip = null;
    // TODO: get client IP address
    lock (clients)
    {
        ...
        if (clients.ContainsKey(ip))
        {
            // TODO: Deny connection
            return;
        }
        else
        {
            clients.Add(ip, client);
        }
    }
    // TODO: Answer the client
}
// TODO: Delete client from list on disconnection
The best solution we've come up with is to create a Windows service that opens a named pipe to manage multiple client processes through one socket connection to the server.
Our API will then be able to detect whether the service is running/installed, and fall back to creating its own connection for the client otherwise.
Third parties can decide whether they want to bundle the service with their product or not, but core applications from our system will have it installed.
I will mark this as the answer in a few days if no one has a better option. I was hoping there was a way to execute our assembly as a new process, but none of the roads to doing that seem very reliable.
