We're trying to move a fairly complicated set of web applications from .NET 4.5 to .NET 5.0, hosted in IIS.
One problem we need to figure out is how to manage per-environment configuration during deploys.
With .NET 4.5 we are using web deploy packages. The production team already has hundreds of xxx.SetParameters.xml files for the many customers and environments.
In dev and test, we're using appsettings.%ASPNETCORE_ENVIRONMENT%.json, but that's not going to work for our production deploys.
The production team would like to continue using web deploy packages with xxx.Parameters.xml and xxx.SetParameters.xml files, so they don't have to reconstruct them in some new mechanism.
We're fine with telling them they have to change, if they really do have to change.
How are per-environment configuration settings usually managed in ASP.NET Core 5.0?
Some additional clarification.
In our existing process, a build generates a deployment package.
A deploy to a particular environment then takes that package and a file defining the configuration for that particular environment, and creates a site and applies that configuration.
This is currently using msdeploy, but that's not important. What is important is that the per-environment configuration files are maintained in a separate repository, and are not stored with the source code.
And we don't want them to be.
The change history of the per-environment configuration files is completely unrelated to the change history of the source code used to create the deploy packages.
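For context, here is a rough sketch of how configuration is typically layered in an ASP.NET Core 5 app, plus one way an externally maintained per-environment file could be layered on top at deploy time. This is illustrative only: `DEPLOY_CONFIG_PATH` is a hypothetical variable a deploy step could set, and `Startup` is just the standard template class, not anything from our existing packages.

```csharp
// Program.cs - minimal sketch of ASP.NET Core 5 configuration layering.
using System;
using Microsoft.AspNetCore.Hosting;
using Microsoft.Extensions.Configuration;
using Microsoft.Extensions.Hosting;

public class Program
{
    public static void Main(string[] args) =>
        CreateHostBuilder(args).Build().Run();

    public static IHostBuilder CreateHostBuilder(string[] args) =>
        // CreateDefaultBuilder already loads appsettings.json,
        // appsettings.{ASPNETCORE_ENVIRONMENT}.json, environment variables
        // and command-line arguments, in that order (later sources win).
        Host.CreateDefaultBuilder(args)
            .ConfigureAppConfiguration((context, config) =>
            {
                // Hypothetical: a deploy step points this variable at a
                // per-environment JSON file kept outside the source repo.
                // Adding it last means it overrides the packaged settings.
                var external = Environment.GetEnvironmentVariable("DEPLOY_CONFIG_PATH");
                if (!string.IsNullOrEmpty(external))
                {
                    config.AddJsonFile(external, optional: false, reloadOnChange: true);
                }
            })
            .ConfigureWebHostDefaults(webBuilder => webBuilder.UseStartup<Startup>());
}
```

Because environment variables are also a default configuration source, a deploy step could instead write overrides as environment variables (for IIS, into the site's web.config) rather than supplying an extra file.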
Related
I hope this is not a duplicate, but I couldn't find any related topic. Every developer knows the problem: I create my app, deploy it, and on some other machine it does not run because some dependency is missing. Often those dependencies are part of the workloads installed along with Visual Studio or some SDKs.
My question is whether there is a way to test my app on my dev machine as if it did not have SDKs and VS installed. Basically, I want the app to only consider dependencies I explicitly specified through project references, NuGet packages, or locally copied files. Or in short: every dependency should be part of the app's deployment target folder.
Of course some really basic OS-dependent stuff needs to be used as well, but I don't want the app to use things like OpenAL, GLFW, the Windows SDK, or similar just because I have installed them on my machine beforehand.
I hope you can understand what I mean. So I basically need some kind of sandbox. I know there are things like VMs, Docker, etc., but I would like this to work when I run my app from Visual Studio. So if I hit F5, I want the app to ignore globally installed stuff entirely.
I work with VS 2022. Thanks for any advice.
You could use a continuous integration system to build (from scratch), publish, and test on a fixed, known build agent configuration. I used TeamCity.
You could use a virtual machine or a Docker image as the agent machine.
Moreover, you can configure additional agents with different configurations.
As a general rule of thumb, you can reference NuGet packages instead of assemblies in the GAC. That way, they will be copied to your application's bin folder.
You can also use .NET application publishing to create a deployable folder for your application. If you're targeting .NET Core and the target machine may not have .NET Core installed, or you don't know what version it will have, you can create a self-contained release, which includes the .NET Core binaries in the output.
See https://learn.microsoft.com/en-us/dotnet/core/deploying/
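If you want to sanity-check a published output folder, something like this rough diagnostic (illustrative only, not part of the docs linked above) lists which loaded assemblies were resolved from the application's own directory versus somewhere else, such as a machine-wide runtime or SDK install:

```csharp
using System;
using System.Linq;

class DependencyAudit
{
    static void Main()
    {
        // Assemblies reported as GLOBAL were loaded from outside the app's
        // base directory (e.g. a globally installed runtime or SDK), so they
        // would be missing on a clean target machine unless deployed with it.
        var baseDir = AppContext.BaseDirectory;
        foreach (var asm in AppDomain.CurrentDomain.GetAssemblies()
                                     .Where(a => !a.IsDynamic && !string.IsNullOrEmpty(a.Location))
                                     .OrderBy(a => a.GetName().Name))
        {
            var local = asm.Location.StartsWith(baseDir, StringComparison.OrdinalIgnoreCase);
            Console.WriteLine($"{(local ? "LOCAL " : "GLOBAL")}  {asm.GetName().Name}  ->  {asm.Location}");
        }
    }
}
```

In a framework-dependent deployment the framework assemblies will show up as GLOBAL; in a self-contained publish everything should resolve as LOCAL.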
I made a C# ASP.NET solution that has multiple projects in it: 3 API projects and 2 web projects. The client wants all of these web and API projects on different servers, so that if one of them goes down due to overload or an error, the others can still work. There are also projects for the BLL and DAL, similar to a repository structure, but they can be on the same server. Is there any way to do that on Azure or AWS using gateways or multi-tenancy? I'm not sure if that's possible.
The best practice is: if you have different deployment targets (different servers) for different projects (API, web, etc.), then break up the solution, create individual solutions for your web and API projects, and commit them to separate repositories. Since you only have 4-5 projects, this approach will work well for you. Then deploy using a CI/CD pipeline.
Deploying directly from Visual Studio to a production server is never considered a good approach in the DevOps era.
However, if you do not want to break up the repository and still want to use Visual Studio for deployment (or don't have Jenkins, Azure DevOps, etc.), right-click on the project that you want to deploy and click Publish. You will then have to select your publish target (IIS, Azure, Folder, FTP, etc.) and publish the projects to the servers one by one.
I am developing a C#, MVC4, EF5 Code First application on .NET in Visual Studio 2012 and have used the VS publish mechanism to deploy it to an Azure Website with an Azure SQL Database.
I now want to use Git and GitHub for version control and involve others in the project.
However, although I am familiar with using Git in a LAMP environment, I have no experience of using Git with Windows, Azure Websites and a compiled environment.
I would like to use the Azure Website as the production server, another Azure Website as a Staging server, developer Windows machines using Visual Studio for development and GitHub as the central repository.
There is a helpful article here: http://www.windowsazure.com/en-us/develop/net/common-tasks/publishing-with-git/ . I can get my head around what would be needed here for, say, a PHP application on Azure. But I am unsure of the best approach with a compiled application and what I can achieve using Azure Websites and Visual Studio.
A nudge or two in the right direction would be greatly appreciated!
Don't publish from VS to Azure; instead, set up your Azure Website to pull from the GitHub repo. The deployment process compiles your solution.
Watch http://www.youtube.com/watch?v=5NGieL0tinw&feature=youtu.be&hd=1 or read http://vishaljoshi.blogspot.com/2012/09/continuous-deployment-from-github-to.html
ScottGu also announced this on his blog at http://weblogs.asp.net/scottgu/archive/2012/09/17/announcing-great-improvements-to-windows-azure-web-sites.aspx where he talks about a nice feature, publishing branches, which covers your requirement for a staging server and a production server: have a stage branch and a production branch and merge to them as desired. See the section "Support for multiple branches".
It looks like they finally added support for private repos as well.
AppHarbor is a competitor to Azure that does something similar.
You are basically introducing a new step with the requirement that the source code must be compiled before it can be deployed to the server. Where you implement this step is up to you. You could:
Ensure that your target server is able to compile the source code (some continuous integration tools, such as CruiseControl.NET, could help with this). The caveat is that the target server must be able to compile source code, possibly even requiring Visual Studio to be installed, so that may not be an option.
Check the compiled binaries into source control. You could keep these compiled binaries separate from the main source branch, to keep things clean. Deploy the binaries to the target server.
A hybrid of the previous two options is also possible: you could set up a continuous integration server with CruiseControl.NET, which checks out the current source, builds it, and checks the resulting binaries back into a special branch, and then deploy that branch to your target server.
I am working on my first ASP.NET MVC 4 app. The client is deploying directly from the SVN repo, which I am pushing to. Can/should I be checking in release builds, or should they be running builds on their end as part of the deploy process? I want to make it as simple for them as possible. Thanks for any advice!
You shouldn't be checking any builds into a source control repository, only source code. A build server should be used to precompile the application using the target configuration (Release if you are pushing to production). Also be careful not to leave any production connection strings or URLs in the source code you have committed; an innocent developer could check out the code and do a lot of damage without realizing it.
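As a minimal illustration of the connection-string point (the names here are hypothetical, not taken from the question), keep the value in configuration and read it at runtime rather than hard-coding it in committed source:

```csharp
using System.Configuration;

public static class Db
{
    // "DefaultConnection" is a hypothetical entry defined in Web.config; the
    // per-environment value can then be swapped at deploy time (e.g. with
    // config transforms) without touching the committed code.
    public static string ConnectionString
    {
        get { return ConfigurationManager.ConnectionStrings["DefaultConnection"].ConnectionString; }
    }
}
```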
What is the best strategy for making changes to a specific file within a C# .NET project on a DEV server and then moving that file to a different environment, say server B? I noticed it always wants me to recompile on the destination server, and I figured I was doing something wrong because I didn't think I would have to (plus the server isn't in-house, so it is really slow and time-consuming).
Any suggestions or strategies you or your company uses would be appreciated.
Make sure you are using a Web Application project, which compiles to a DLL, not a Web Site project, which uses loose code files.
You could use a source code versioning system like Subversion.
Use a source control program (like Subversion) for source files and CruiseControl for the binaries built from those files.
For web application development my experience has been:
Developers have a development environment on their local machines that is attached to source control
A DEV web server with shares to the project folders, which allows developers to COPY files to the web application folders manually.
A TEST web server where MSI installations ONLY are used to distribute the changes for UAT
A PROD web server where MSI installations ONLY are used to distribute the UAT approved MSI
The size of the projects I am involved with usually makes build scripts overkill; most of the time a project is being worked on, it is built many times for debugging anyway.