I have a website that constantly scans a folder for files and then performs some actions on them. I dropped several thousand of these files into the directory and it started chugging, but when I came back in the morning there were still files there. When I opened a new session it started working again. Is it possible to keep this function running without a user having a session open?
In IIS, remove the idle time-out on the application pool and the session expiration on the website. That way, once the site has been hit once, it never shuts down. But as #Yuriy mentioned, what you're doing shouldn't be done via a website: gather the information and perform your tasks in a Windows service, and, if need be, implement a page in the website that displays the tasks that were done. The website shouldn't be doing the heavy lifting.
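For illustration, here is a minimal sketch of that Windows-service approach; the folder path and ProcessFile body are placeholders for your own processing logic:

```csharp
using System.IO;
using System.ServiceProcess;

// Sketch: a Windows service that watches the drop folder and processes
// files as they arrive, independent of any web session.
public class FolderWatcherService : ServiceBase
{
    private FileSystemWatcher _watcher;

    protected override void OnStart(string[] args)
    {
        _watcher = new FileSystemWatcher(@"C:\DropFolder"); // hypothetical path
        _watcher.Created += (s, e) => ProcessFile(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    protected override void OnStop()
    {
        _watcher?.Dispose();
    }

    private void ProcessFile(string path)
    {
        // Placeholder: perform your per-file actions here.
    }

    public static void Main()
    {
        ServiceBase.Run(new FolderWatcherService());
    }
}
```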
Session is user-specific. If you do not want session state enabled for your application, you can set the session mode to Off. Please refer to this MSDN post.
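For reference, turning session state off is a one-line change in web.config:

```xml
<!-- web.config: disable session state application-wide -->
<configuration>
  <system.web>
    <sessionState mode="Off" />
  </system.web>
</configuration>
```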
Try using Quartz.NET. It is a scheduler that will run tasks for you.
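For example, a minimal Quartz.NET 3.x sketch that scans the folder every minute might look like this (the path and job body are placeholders):

```csharp
using System.IO;
using System.Threading.Tasks;
using Quartz;
using Quartz.Impl;

// Sketch: a Quartz.NET job that scans the drop folder on a fixed interval.
public class FolderScanJob : IJob
{
    public Task Execute(IJobExecutionContext context)
    {
        foreach (var file in Directory.GetFiles(@"C:\DropFolder")) // hypothetical path
        {
            // Placeholder: process each file here.
        }
        return Task.CompletedTask;
    }
}

public static class SchedulerSetup
{
    public static async Task StartAsync()
    {
        var scheduler = await StdSchedulerFactory.GetDefaultScheduler();
        await scheduler.Start();

        var job = JobBuilder.Create<FolderScanJob>().Build();
        var trigger = TriggerBuilder.Create()
            .StartNow()
            .WithSimpleSchedule(x => x.WithIntervalInMinutes(1).RepeatForever())
            .Build();

        await scheduler.ScheduleJob(job, trigger);
    }
}
```

Note that if this runs inside the web app it is still subject to app-pool recycling, so the Windows-service advice above still applies.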
Related
Our web application uses a .NET Core Web API running behind a load balancer and an Angular client. We access the DB using EF Core.
We have a long-running background task that does a great amount of calculation and takes about 2-3 hours to do so, but it will only be initiated by administrators of the application 3-4 times a year.
While the job is running we want to prevent users from adding/editing/deleting data, and our client told us it's even fine if the application is not available for the duration, as they will mostly do it overnight.
The easiest way to do this is to redirect users to an information page while the job is running, but I have found no way of actually finding out whether the task is running or not.
I could set a flag indicating whether the job is running and check that flag on every request, but I have found no way to access an application-wide state.
I cannot save a flag to the DB, because while the transaction is committing at the end of the job (~1 hour) we cannot read from the DB.
What baffles me most is that I have not found a single article or question about a problem like this, which doesn't seem too outlandish to me, so I guess I'm missing something very obvious.
The simplest way is to store the value for your "Maintenance Mode" in a singleton class on the server (no database call needed). The value will remain there for as long as the server process is actively running.
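A minimal sketch of that idea, assuming ASP.NET Core middleware; the class names and response text are illustrative:

```csharp
using System.Threading.Tasks;
using Microsoft.AspNetCore.Http;

// Sketch: a process-wide flag the job toggles before and after running.
public static class MaintenanceState
{
    private static volatile bool _on; // volatile so all request threads see updates
    public static bool IsOn
    {
        get => _on;
        set => _on = value;
    }
}

// Sketch: middleware that short-circuits every request during maintenance.
public class MaintenanceMiddleware
{
    private readonly RequestDelegate _next;
    public MaintenanceMiddleware(RequestDelegate next) => _next = next;

    public async Task InvokeAsync(HttpContext context)
    {
        if (MaintenanceState.IsOn)
        {
            context.Response.StatusCode = StatusCodes.Status503ServiceUnavailable;
            await context.Response.WriteAsync("Maintenance in progress.");
            return;
        }
        await _next(context);
    }
}

// Startup.Configure: app.UseMiddleware<MaintenanceMiddleware>(); (before MVC)
```

One caveat given your load balancer: the flag lives in a single process, so with multiple servers each instance would need to be told to flip it, or you are back to the distributed-cache approach.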
If a distributed cache (as already mentioned) is not an option, you can run the long-running task in a (uniquely) named transaction and then check the list of active transactions to determine whether the task is still running.
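A sketch of that check, assuming SQL Server and an illustrative transaction name; the job would open its work with BEGIN TRANSACTION LongCalcJob:

```csharp
using System.Data.SqlClient;

// Sketch: ask SQL Server whether the named transaction is still active.
// 'LongCalcJob' is an illustrative name the job must use when it begins.
public static bool IsJobRunning(string connectionString)
{
    using (var conn = new SqlConnection(connectionString))
    using (var cmd = new SqlCommand(
        "SELECT COUNT(*) FROM sys.dm_tran_active_transactions WHERE name = 'LongCalcJob';",
        conn))
    {
        conn.Open();
        return (int)cmd.ExecuteScalar() > 0;
    }
}
```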
This is completely dependent on your setup but a simple way to approach this problem might be to make it the long-running job's responsibility to divert traffic from your site while it is running, and then undo that once it is finished.
As an example, if you were running this with an old-school .NET site in IIS the job could drop an app_offline.htm file into the site folder, run, then delete it again. Your setup is different, but if you could do something similar with your load-balancer (configure it to serve some static file instead of routing the requests to your servers) then it could work for you.
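A sketch of that job-side toggle, assuming the job runs outside the site process and knows the site's physical path (sitePath and RunLongJob are placeholders):

```csharp
using System.IO;

public static class OfflineToggle
{
    // Sketch: the job takes the site offline, works, then brings it back,
    // even if the work throws.
    public static void RunWithSiteOffline(string sitePath)
    {
        var offlineFile = Path.Combine(sitePath, "app_offline.htm");
        File.WriteAllText(offlineFile,
            "<html><body>Down for maintenance. Back soon.</body></html>");
        try
        {
            RunLongJob(); // placeholder for the 2-3 hour calculation
        }
        finally
        {
            File.Delete(offlineFile);
        }
    }

    private static void RunLongJob() { /* ... */ }
}
```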
I have a project where users upload multiple 4-6 MB Access database (.mdb) files to migrate into an AWS-hosted SQL Server database. I think what is happening is that IIS may be shutting down the process after some amount of time.
It's basically a file upload and then a C# static class that reads the .mdb file and uses EF Core to copy it into SQL Server. Using Visual Studio, I don't see any shutdowns and all test files migrate successfully.
Website built using .NET Core 2.1 / C#.
I have the Web.config timeout set to requestTimeout="00:59:00", which basically stops some 503 errors.
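For reference, the relevant part of my Web.config looks roughly like this (process path and arguments here are illustrative):

```xml
<!-- web.config for an ASP.NET Core app hosted behind IIS -->
<system.webServer>
  <handlers>
    <add name="aspNetCore" path="*" verb="*"
         modules="AspNetCoreModule" resourceType="Unspecified" />
  </handlers>
  <aspNetCore processPath="dotnet"
              arguments=".\MyApp.dll"
              requestTimeout="00:59:00"
              stdoutLogEnabled="false" />
</system.webServer>
```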
What I'm seeing is that if all browsers are closed (or maybe even if not), so that technically there are no connections to IIS, then after a few minutes the migration stops. I'm not seeing new rows added to the database.
I want my C# .NET Core process to continue migrating whatever databases the user uploaded without IIS or some other process killing the connection.
Looking for tips. I'm assuming there are other Web.config settings I can update?
It sounds like your code that processes the files is a background job in your application, so I think I know what's going on.
By default, IIS app pools are configured to shut down after 20 minutes of inactivity (no incoming requests). You need to disable that.
The option is called "Idle Time-out (minutes)" in the Advanced Settings of the application pool. Set it to 0 to disable it.
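If you'd rather script it than click through the UI, the same setting can be changed with appcmd (pool name is illustrative; a timeout of zero disables it):

```
%windir%\system32\inetsrv\appcmd.exe set apppool "MyAppPool" /processModel.idleTimeout:00:00:00
```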
There are also options in there to automatically recycle the app pool, so review those and make sure they are acceptable to you (I forget what the defaults are).
It's not a good design decision to keep an HTTP connection open for that long. First of all, ASP.NET Core will drop the thread after a while, and second, the user could drop the connection by mistake. I would recommend doing this asynchronously from the user's point of view (a sketch follows the list):
Receive the file and validate it.
Schedule a background job using either Hangfire or the built-in IHostedService interface.
When the migration is complete, email the user or notify them some other way.
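Here is a rough sketch of the built-in route with a hosted service, using illustrative names; a Hangfire job would fill the same role:

```csharp
using System.Threading;
using System.Threading.Channels;
using System.Threading.Tasks;
using Microsoft.Extensions.Hosting;

// Sketch: requests enqueue uploaded file paths; one background worker
// migrates them outside the request, surviving closed browsers.
public class MigrationQueue
{
    private readonly Channel<string> _channel = Channel.CreateUnbounded<string>();
    public void Enqueue(string mdbPath) => _channel.Writer.TryWrite(mdbPath);
    public ChannelReader<string> Reader => _channel.Reader;
}

public class MigrationWorker : BackgroundService
{
    private readonly MigrationQueue _queue;
    public MigrationWorker(MigrationQueue queue) => _queue = queue;

    protected override async Task ExecuteAsync(CancellationToken stoppingToken)
    {
        while (await _queue.Reader.WaitToReadAsync(stoppingToken))
        {
            while (_queue.Reader.TryRead(out var mdbPath))
            {
                // Placeholder: run the .mdb -> SQL Server migration here,
                // then notify the user (e.g. by email).
            }
        }
    }
}

// Startup.ConfigureServices:
//   services.AddSingleton<MigrationQueue>();
//   services.AddHostedService<MigrationWorker>();
```

Note that an in-process worker still dies with the app pool, so combine this with the idle-timeout fix above, or move the heavy work out of IIS entirely.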
I have a C# MVC app, and one of the calls, I know, will take about 12 hours; I'm generating some reports.
I want to know whether IIS will keep the process running that long.
If I do it async, will it run, and will IIS somehow put it aside and let it run for that long?
In general, this is a very poor practice. Your app pool may be recycled for many reasons outside of your control.
You would greatly benefit from processing this in an external process and then providing the user the results via IIS.
However, starting with .NET 4.5.2 (https://blogs.msdn.microsoft.com/dotnet/2014/05/05/announcing-the-net-framework-4-5-2/) you can use the HostingEnvironment.QueueBackgroundWorkItem API.
The idea behind this is that IIS will try to finish the queued work in the case of a graceful shutdown of the app pool.
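Usage looks roughly like this (GenerateReportsAsync is a placeholder for the actual report work):

```csharp
using System.Threading;
using System.Threading.Tasks;
using System.Web.Hosting;
using System.Web.Mvc;

public class ReportsController : Controller
{
    public ActionResult Start()
    {
        // Sketch: hand the work to ASP.NET so a graceful app-pool shutdown
        // will try to let it finish before tearing the AppDomain down.
        HostingEnvironment.QueueBackgroundWorkItem(ct => GenerateReportsAsync(ct));
        return Content("Report generation started.");
    }

    private static async Task GenerateReportsAsync(CancellationToken ct)
    {
        // Placeholder: build the reports, checking ct regularly so a
        // shutdown can cancel cleanly.
        await Task.CompletedTask;
    }
}
```

Bear in mind the shutdown grace period is short (seconds, not hours), so for a genuine 12-hour job the external-process advice above still stands.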
I'm assuming the report must be using something like SSRS. Why don't you create a batch job at the back end to run the reports at a specified time? Update a table with the status of the report and just poll that status from the front end; when it's ready, download it. Imagine if your report had been running for 6 hours and it's reliant on the website being up: if someone restarts the website, that's 6 hours of processing gone.
The goal is that I want my overall website response time to be instantaneous.
The problem is that I do not have IIS access; my website is hosted with an external service and I have no access to the IIS panel.
My current approach is to have scheduled code that keeps my website alive. The problem with this approach alone is that the hosting service has an algorithm that shuts down all of its hosted websites every few hours.
This is why I need to implement another approach: warming up / pre-loading the website each time that happens.
How to do this when there is no access to the IIS panel?
The solution requires no third-party sites, robots, or apps; you merely write a very simple app yourself that periodically performs a trivial web request, perhaps a REST GET. By performing this request, say, every few minutes, you not only guarantee that the IIS pool won't time out and be cold for a client, but you also get the nice side effect of ensuring your website is up and running in a warm condition (JIT'd and running), ready for real, non-heartbeat requests.
e.g.
In your website, expose a REST API, say www.misspiggy.com/api/hiyaaaa, that does nothing other than return HTTP 200 OK.
By implementing this in your ASP.NET app, any request to the above URL will cause your stopped or cold ASP.NET website to be JIT'd during (a sketch of the endpoint follows this list):
first deployment (and even then only once a request is made to it)
after the IIS AppPool has timed out and needs to restart on demand
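The endpoint can be as small as this (ASP.NET Web API 2 assumed, with attribute routing enabled via config.MapHttpAttributeRoutes(); adjust to your stack):

```csharp
using System.Web.Http;

// Sketch: a heartbeat endpoint matching the example URL above.
[RoutePrefix("api")]
public class KeepAliveController : ApiController
{
    [HttpGet]
    [Route("hiyaaaa")]
    public IHttpActionResult Get()
    {
        return Ok(); // HTTP 200, no body needed
    }
}
```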
The client code that makes the REST request can be anything:
a console app
a Windows service
WinForms/WPF app
The console app can be triggered via Windows Task Scheduler, say every 5 minutes, thus saving you the hassle of building in a scheduler.
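The client can be equally trivial; here is a console sketch suitable for a five-minute Task Scheduler trigger (the URL is the example endpoint from above):

```csharp
using System;
using System.Net.Http;
using System.Threading.Tasks;

// Sketch: one GET per run; schedule the executable with Task Scheduler.
class HeartbeatClient
{
    static async Task Main()
    {
        using (var client = new HttpClient { Timeout = TimeSpan.FromSeconds(30) })
        {
            var response = await client.GetAsync("https://www.misspiggy.com/api/hiyaaaa");
            Console.WriteLine($"{DateTime.UtcNow:O} -> {(int)response.StatusCode}");
        }
    }
}
```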
My current approach is to have scheduled code that keeps my website alive. The problem with this approach alone is that the hosting service has an algorithm that shuts down all of its hosted websites every few hours.
I suggest you set your ping period to be a matter of minutes rather than hours.
No admin access to server required
The problem is that I do not have IIS access; my website is hosted with an external service and I have no access to the IIS panel.
It should be pointed out that this solution does not require you to install anything new on the server nor make any changes on the server.
Azure
It is interesting to note that Azure Application Insights has Availability Tests that, though designed for testing website availability, can be used for this exact purpose of keeping your website alive and warm, ready to go for web clients. In fact, this is what I do for my web apps.
Doing so keeps response times and latency as low as possible.
There are a number of things you can do, but a really simple solution is to use a website monitoring service such as StatusCake or Uptime Robot; there are a large number of them out there. You set them up to call a page or pages on your website at set intervals to ensure it is still up, which has the added bonus of keeping the site warm.
I would also precompile your MVC app if you aren't already doing that.
HTH
I'm creating an ASP.NET .NET 4.0 website, and part of this site requires an "always running" application. Normally I would create a Windows Service for this, but the site will be hosted in a shared hosting environment, so unless I get a virtual server, that isn't a possibility.
My first thought was to have a thread running in the background that would do this; it would be created in Application_Start and destroyed in Application_End. I've looked around and this seems like it could be an option, but I would of course have to hit the site in order to cause Application_Start to be called, and if the associated AppPool is recycled, then this process would have to be repeated (or so I believe?!?).
Does this seem possible within a normal ASP.NET website?
In the end I had a separate thread that sits and waits for a signal to be set. Once the signal is set, it does its work. To make sure the thread is always active, I make an HTTP request for a "dummy" page to ensure that, if the AppPool was recycled, the Application_Start event is triggered and the thread restarted.
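In outline it looked something like this (a Global.asax sketch with illustrative names; the "dummy" page request comes from an external scheduler as described above):

```csharp
using System;
using System.Threading;

// Sketch: a background thread created on Application_Start that sleeps
// until signalled, then does its work.
public class Global : System.Web.HttpApplication
{
    private static readonly AutoResetEvent Signal = new AutoResetEvent(false);

    protected void Application_Start(object sender, EventArgs e)
    {
        var worker = new Thread(() =>
        {
            while (true)
            {
                Signal.WaitOne(); // block until work is signalled
                DoWork();         // placeholder for the actual job
            }
        })
        { IsBackground = true };
        worker.Start();
    }

    // Called from wherever work is queued.
    public static void SignalWork() => Signal.Set();

    private static void DoWork() { /* ... */ }
}
```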
It depends on what you mean by an 'always running application':
If it's a real-time service, it still makes sense to run it as a separate process, even if it has a web front-end. That's because the ASP.NET server was designed by Microsoft for specific tasks (running web apps, rendering pages, etc.) in many respects, such as memory usage and multithreading. And I'd prefer to use at least a VDS in such a case.
Another case is when it's a periodically (say, every hour) triggered application that does some uncomplicated work. Perhaps your shared hoster has a mechanism to trigger a specific page to do some work (as my hoster does). For example, I have an ASP.NET page that monitors the tour date list of my favourite band and sends an email notification when they are going to play a gig in my town; it's triggered by the hoster every 4 hours.