I have my web application running, and every time I change some little piece of logic I have to stop the app and wait for IIS to restart entirely.
Somewhere on the web I saw someone say that one of the cool features of MVC5 (or maybe MVC6 on ASP.NET Core) is that you can make changes "on the fly".
So can I avoid stopping and restarting IIS every time, or did I just misunderstand something?
It depends on how the ASP.NET Core app is deployed. Its ability to pick up changes on the fly comes from being deployable as plain code rather than as a compiled application; the web server then essentially compiles it on the fly. For that to happen, though, you need a web server that can actually do that compilation. IIS cannot. IIS can, however, act as a reverse proxy for Kestrel, and Kestrel can compile on the fly. If you deploy the app with the traditional "compile and publish directly to the IIS application directory" approach, you will not benefit from this.
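For what it's worth, on current ASP.NET Core versions the closest built-in equivalent of "changes on the fly" is Razor runtime compilation, which only covers .cshtml views; compiled C# still needs a rebuild. A minimal sketch, assuming the Microsoft.AspNetCore.Mvc.Razor.RuntimeCompilation package is referenced:

    var builder = WebApplication.CreateBuilder(args);

    // Assumes the Microsoft.AspNetCore.Mvc.Razor.RuntimeCompilation package is referenced.
    // With this enabled, edits to .cshtml views are picked up without restarting the app;
    // changes to compiled C# code still require a rebuild and redeploy.
    builder.Services.AddControllersWithViews()
                    .AddRazorRuntimeCompilation();

    var app = builder.Build();
    app.MapDefaultControllerRoute();
    app.Run();

During development, dotnet watch run gives a similar edit-and-refresh loop without involving IIS at all.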
Actually, you don't need to restart IIS after each deploy. Whenever a change is detected in the DLLs, the app (not IIS) will recycle and load the new DLLs. This only affects that particular application and reloads its app domain.
Also, editing the web.config of a web application will recycle the app as well.
You can read more in this article.
I am trying to run a background service which just writes to a file on a specified interval.
There are two methods that I tried
1) Created the project with the Console Application template
2) Created the project with the Web Application template
When I run the apps from Visual Studio, both of them run fine. But when I deploy them to IIS, only the web application version works. It must be noted that there is absolutely no difference between the code of the two projects: I have used WebHost as the hosting strategy in both, and the console application has all the same dependencies installed as the web application version.
I must also mention that I have used the preloadEnabled="true" option in IIS, as IIS needs a web request to start the application.
I am wondering: what is the difference between the two project types when the code is the same? I don't want to use the Web Application template.
Edit 1: I forgot to mention that the service will also need to expose an API endpoint for health-check purposes. Will the Windows service approach listen for HTTP requests?
I used the following article for implementing my background service.
https://learn.microsoft.com/en-us/dotnet/architecture/microservices/multi-container-microservice-net-applications/background-tasks-with-ihostedservice
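For reference, a minimal timed worker in the spirit of that article looks roughly like the sketch below (the file path and interval are placeholders, and FileWriterService is just an illustrative name):

    using System;
    using System.IO;
    using System.Threading;
    using System.Threading.Tasks;
    using Microsoft.Extensions.Hosting;

    // Appends a timestamp to a file on a fixed interval.
    public class FileWriterService : BackgroundService
    {
        private static readonly TimeSpan Interval = TimeSpan.FromMinutes(1);   // placeholder interval
        private const string FilePath = @"C:\temp\heartbeat.log";              // placeholder path

        protected override async Task ExecuteAsync(CancellationToken stoppingToken)
        {
            while (!stoppingToken.IsCancellationRequested)
            {
                await File.AppendAllTextAsync(FilePath, DateTime.UtcNow.ToString("O") + Environment.NewLine, stoppingToken);
                await Task.Delay(Interval, stoppingToken);
            }
        }
    }

It is registered with services.AddHostedService<FileWriterService>() in both projects, which is why the code is identical between the two templates.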
After years of building background services, I learned that Windows services are the best tools to implement these applications. While there are different techniques to keep an IIS application up and running in the background and prevent it from getting recycled, in practice, the applications on IIS are not meant to be executed forever.
If you had intended to build your app in the cloud, I would have suggested something like Azure WebJobs or Azure Functions timer-triggered functions, but for on-premises, even using something like Hangfire inside a web app is not sustainable. The worst case is when you need backward compatibility with Windows servers that don't have the "Application Initialization" module.
My suggestion is to move your application to a simple Windows Service if you control your environment. Windows services consume less memory, are easier to manage, and can run forever without getting recycled.
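A minimal sketch of that direction, assuming .NET Core 3.x or later and the Microsoft.Extensions.Hosting.WindowsServices package (FileWriterService stands in for whatever IHostedService you already have):

    using Microsoft.Extensions.DependencyInjection;
    using Microsoft.Extensions.Hosting;

    public static class Program
    {
        public static void Main(string[] args) =>
            Host.CreateDefaultBuilder(args)
                .UseWindowsService()   // integrate with the Windows Service Control Manager
                .ConfigureServices(services =>
                    services.AddHostedService<FileWriterService>())   // the existing timed worker
                .Build()
                .Run();
    }

Regarding Edit 1: a Windows service does not listen for HTTP on its own, but the same host can also run Kestrel (for example via ConfigureWebHostDefaults) to expose a health-check endpoint, so moving out of IIS does not mean giving up the endpoint.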
Web applications are plainly the wrong tool for this.
Being always on and always reachable, web servers are primary targets for hacking. To compensate for that, they are usually run under the most restrictive user rights you can imagine: read rights to their program and instance content directories. While I do not know why it worked at all, it probably will stop working in production.
What you want to write is either a service or something executed by the Windows Task Scheduler. Personally, I advise the Task Scheduler, as services have their own set of restrictions. Unless, of course, there is some detail of the requirements that you did not tell us.
This article could be helpful. It's a step by step tutorial on how to convert a console application to a web application.
We are currently developing an ASP.NET MVC app to replace our old classic ASP applications.
Because in the past we just uploaded our .asp files to the production servers, we want to do the same thing with the compiled DLLs that .NET produces.
Now, I was wondering: what happens to open requests to the application when the web server reloads the assembly?
Classic ASP is a scripting language and does not need to be compiled, so you can overwrite files in production to your heart's content.
MVC, however, requires compilation; temporary files are created and used. As such, you cannot simply overwrite DLLs in a running system, because the DLLs will be in use.
You will need to stop your application (which kills all connections and stops accepting new requests), upload the new files, recompile the application, and then restart it.
All of this is usually done very quickly with a CI/CD pipeline. You will probably want to create a process around it, maybe make the website unavailable for a while, but this depends on what kind of system you are running and how critical it is. If you need it up 24/7 without any downtime, you'll need some creative ideas.
I have a C# program (I'll call it ProgramA) running on an Azure VM running Windows Server, and it references my MVC site. Currently, when I make changes to my MVC site's database structure, the program on the VM runs into errors because the database structure has changed.
In order to resolve these errors I simply need to stop the service, update the dll, and restart the service. Currently I do this manually, which is awful as it causes downtime. I am looking to automate this deployment.
UPDATED - DEPLOYING CONSOLE APPLICATION IN CLOUD WITH DEPLOYMENT FROM VS2013
I was thinking of achieving this by creating a small app to run on the VM to check for updates and perform them, but I am now aware there are easier ways to achieve my goal by deploying my application from VS to run as a service in the cloud.
My app is a console app. I simply want it running in the cloud in a way that makes it easier to deploy. I have found https://azure.microsoft.com/en-us/documentation/articles/websites-dotnet-deploy-webjobs/#convert, but following the steps leads me to deploying a website, and I'm not sure that's what I want. This isn't a website; it is simply a console application.
What is the best way of achieving this?
So, I've been working on a project that makes use of the Katana/Owin pipeline, and I'm trying to use Owin to serve static files securely. Now, when I run the project locally through VS, everything works perfectly fine.
However, when I deploy it to a server (Windows Server 2008 R2, IIS 7), IIS has decided that it'll be damned before it lets any dirty managed handler manage its static files. Even when I've removed the static file handler. Even when I've added my own handler for a specific path. Even when I've told it to run All Managed Modules For All Requests. And since the path that's being passed in is a virtual path, IIS of course craps out and dies and spits out either a 500 or a 404 error. I'm not entirely sure what causes it to spit out a 500 rather than a 404, but any time the request is more than one directory deep it gives a 500 instead of a 404 (/Client/Content/Folder/static.html gives a 404 vs /Client/Content/Folder/SubFolder/static.html gives a 500).
I'm just not sure where to go from here at all. Worse still, when I deploy to my local IIS (not running it out of VS, that is), it works perfectly: it respects the web.config, the Owin middleware handles the static requests perfectly, and the security is in place. Everything is peachy. Granted, I'm running IIS 8 Express on my box, but I'm not convinced that that is it.
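For context, the kind of Katana setup involved here looks roughly like the sketch below; MyApp, the auth check, and the paths are all illustrative, and it assumes the Microsoft.Owin.StaticFiles and Microsoft.Owin.FileSystems packages:

    using Microsoft.Owin;
    using Microsoft.Owin.FileSystems;
    using Microsoft.Owin.StaticFiles;
    using Owin;

    [assembly: OwinStartup(typeof(MyApp.Startup))]

    namespace MyApp
    {
        public class Startup
        {
            public void Configuration(IAppBuilder app)
            {
                // Illustrative security gate in front of the static files;
                // the real project presumably has its own middleware here.
                app.Use(async (context, next) =>
                {
                    if (context.Authentication.User?.Identity?.IsAuthenticated != true)
                    {
                        context.Response.StatusCode = 401;
                        return;
                    }
                    await next();
                });

                // Serve files under the /Client/Content virtual path from a physical folder.
                app.UseFileServer(new FileServerOptions
                {
                    RequestPath = new PathString("/Client/Content"),
                    FileSystem = new PhysicalFileSystem(@".\ClientContent")   // illustrative physical path
                });
            }
        }
    }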
I'm going to post my comment as an answer since the more I think about it, the more sure I am that it could be the issue.
Check that all the necessary ASP.NET and IIS features have been installed correctly via Programs and Features; do a comparison between your dev and server environments. I've experienced essentially the same issues as you, and this was usually the reason.
If this isn't the reason, check the IIS settings and application pool settings and make sure they are the same on the dev and production environments.
I have created a C# web service and a web client, and successfully debugged the service with the ASP.NET Development Server (that thingy that gets activated when you just press F5). All fine. Now I need a web service that is almost the same as the previous one but differs in a few lines of code. For this purpose I created two new configurations, DebugNew and ReleaseNew, and set the output directory to binNew (instead of the default "bin"). The problem is that the original web service is executed in the debugger instead of the new one; the debugger is unaware of the binNew folder. How do I set up the environment to start the new web service when the DebugNew configuration is active?
As far as I know, web applications will only run out of the bin folder. If somebody knows how to change that, I would be happy to call myself wrong in order to learn that trick myself.
Assuming that I am actually correct for once there, you could write a post-compile script that checks which build configuration is active. If it's either DebugNew or ReleaseNew, copy the contents of binNew to bin.
If there are really only a few lines of code that differ, though, I question whether putting in a configuration setting and adjusting the code accordingly isn't the better way to go, as sketched below. But I certainly don't know all the facts. Just a thought I had.
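A sketch of that idea with a classic appSettings flag (the key name and helper class are made up):

    using System.Configuration;

    public static class ServiceVariant
    {
        // Reads a flag from <appSettings> in web.config; "UseNewBehavior" is a made-up key.
        public static bool UseNewBehavior
        {
            get
            {
                bool flag;
                return bool.TryParse(ConfigurationManager.AppSettings["UseNewBehavior"], out flag) && flag;
            }
        }
    }

The few lines that differ then become if (ServiceVariant.UseNewBehavior) { ... } else { ... } in a single web service, and the second build configuration (and the binNew folder) goes away entirely.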
I established a truce with Visual Studio (or unconditionally surrendered, depending on your point of view) and reverted the output directory back to \bin. At least the setup project, which is in the same solution, maintains separate folders for each configuration.