In my search for the meaning of life, I stumbled upon a blog post which argued that your deployment strategy is not your architecture; it is simply an implementation detail, and as such we should design systems that allow different deployment patterns, whether you want to deploy to one node, to many nodes, or in some other topology.
Do the latest versions of Visual Studio provide any flexibility (besides Azure) for deploying services using a variety of strategies?
For example, let's say I have a solution
Acme Solution
--Acme Startup Proj
--Acme Service A.csproj
--Acme Service B.csproj
--Acme Service C.csproj
I want to be able to deploy this entire solution as one unit, or deploy three separate binaries, one for each microservice:
AcmeServiceA.exe
AcmeServiceB.exe
AcmeServiceC.exe
What does Visual Studio give you in terms of flexibility of deployment configuration?
Deployment techniques will vary depending on the technologies your app is built with. For the sake of an example, I'm going to assume we're dealing with web services or sites.
You've specified two deployment scenarios: deploying a single project (e.g. microservice), and deploying all projects (full rollout). Let's start small...
Deploying an individual project
The main thing to plan for is your deployable atom: this could be a project, or a service plus its DB backend... whatever is small enough that you would prefer not to split it into smaller deployments.
For web projects (whether Web API projects or other types), Visual Studio's built-in options can generally be summarized as: WebDeploy, Azure, and, with .NET Core, Docker images. I'm not going to go into the details of each, because those are separate questions, but I may point out details worth researching if they sound interesting. (I'm more familiar conceptually with WebDeploy, so I'll refer to it a lot, but I'm not advocating for or against it.)
If you were using WebDeploy, for example, you could have each project produce a WebDeploy package. (Again, look this up for the details of how to do it.) This package can be crafted to contain a file payload (the site/service files) as well as a database payload, or other sub-atoms, using the WebDeploy provider model. Visual Studio has pretty decent support for this scenario, and there is documentation on it.
Or you could generate a Docker image. From my understanding (I lack hands-on Docker experience as yet), if you want to deploy your web service and a database, they ought to be in separate containers. You'll soon find yourself building these yourself outside of VS. That's not a bad thing; Docker sounds very flexible once you get the hang of it, but you are leaving the IDE for this.
Either way, now you can deploy the atomic package. This was the easy part.
Deploying the solution
So, you've got lots of these atomic deployment packages. How do you roll them all out?
Well, at this point VS doesn't provide a lot for you, and it's hard to justify what VS should do here. Almost every organization is going to come up with slightly different rules. Do you deploy from your CI server? Do you create packages and deploy them to different environments in your release pipeline? Or do you do it in the cloud and hot-swap environments (like Azure deployment slots)?
A VS-native solution has to be either extremely configurable (and hence extremely complicated), or it will be too simple to fit most customers' needs. (As an aside, the initial support for WebDeploy back in VS2010 erred toward the first of these. It was extremely configurable, and very difficult for customers, or even the product team, to wrap their heads around all of the possible scenarios. Source: I was the QA for that feature once upon a time.)
Really, at this point you need to determine how and when you roll out your deployments. You need something to orchestrate each of these deployments.
VS generally orchestrates things with MSBuild. Again, I'm not advocating MSBuild as your orchestration platform (I actually dislike it for that; it's OK for project configuration, but IMO not a good fit for task management), but if this is what you want to use, it can work. It's actually pretty simple if you're using it for the web project scenario. You can build your solution and pass the parameter /p:DeployOnBuild=true. If you are using WebDeploy to publish directly, you're done! If you're creating WebDeploy packages, then you still need to push those, but at least you've created them all at once.
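In concrete terms, building the whole solution and producing WebDeploy packages for every web project looks roughly like the sketch below (standard Web Publishing MSBuild properties; the solution name and configuration are placeholders you would adjust):

    msbuild AcmeSolution.sln /p:Configuration=Release /p:DeployOnBuild=true /p:WebPublishMethod=Package /p:PackageAsSingleFile=true

Each web project then drops a .zip package (plus supporting publish files) typically into its obj\Release\Package folder, which is what you would push in the next step.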
If you are using WebDeploy packages, each one will generate a script to use for publishing. There are also ways of passing in different WebDeploy parameters, so you can reuse the same package (build output) to publish to different environments. However, you'll have to write your own script to combine all of these into one megalithic deployment.
The same goes for Docker. You may end up with a set of images, but you still need something to orchestrate publishing all of them. Tools like Kubernetes can help you roll out or, in the event of issues, roll back.
There's also more generic orchestration platforms like Octopus Deploy.
How Unsatisfying!
Yeah, it kind of sucks that there isn't an out-of-the-box solution for large-scale deployments. But if there were, it wouldn't work for 95% of teams. Most of what VS does provide is enough for an individual or a very small development team to get their code to their servers. With any larger team, you'll get better mileage out of building a system that is tailored to how your team operates. There are plenty of tools out there, and none of them work perfectly in all cases. Find one that works for you, and you'll be fine. In the end, it all comes down to pushing files and running scripts. If you don't like one system or tool, you can try another.
If you are looking for an improved deployment experience in Visual Studio, check out Flexera's InstallShield Limited Edition in-box solution (ISLE, http://blogs.msdn.com/b/visualstudio/archive/2013/8/15/what-s-new-in-visual-studio-2013-and-installshield-limited-edition.aspx). ISLE is a great solution for customers looking for capabilities not found in Visual Studio Installer Projects, such as TFS and MSBuild integration, support for creating new web sites, and ISO 19770-2 tagging support.
VS2015: https://marketplace.visualstudio.com/items?itemName=VisualStudioProductTeam.MicrosoftVisualStudio2015InstallerProjects
VS2017: https://marketplace.visualstudio.com/items?itemName=VisualStudioProductTeam.MicrosoftVisualStudio2017InstallerProjects
With the Setup and Deployment project templates you can choose to package all assemblies in the solution, or each one individually as a microservice, using Setup, Web, CAB or Merge Module projects:
Then choose which assemblies are included:
How to achieve the requested flexibility really depends on the exact use case, and on your definition of an acceptable level of flexibility.
Taking your example with these three different executables as separate microservices (Service A, B, C) and as a complete service (Startup), in the context of Web API you could do the following:
Each project (Service A, B, C) can be designed as a separate OWIN self-hosted executable (as outlined in Use OWIN to Self-Host ASP.NET Web API 2) that provides one or more endpoints to be exposed.
The main project (Startup) could also be an OWIN self-host, or a regular IIS Web API application, that references the three projects (Service A, B, C) and loads their respective endpoints in its own Startup routine (optionally alongside additional endpoints of its own).
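As a rough illustration of that layout (this sketch assumes the Microsoft.AspNet.WebApi.OwinSelfHost package; the per-service WebApiConfig.Register helpers are hypothetical methods each service project would expose):

    using System;
    using System.Web.Http;
    using Microsoft.Owin.Hosting;
    using Owin;

    public class Startup
    {
        public void Configuration(IAppBuilder app)
        {
            var config = new HttpConfiguration();
            config.MapHttpAttributeRoutes();

            // The combined "Acme Startup" host pulls in the routes of every service;
            // a single-service host would call only its own Register method here.
            Acme.ServiceA.WebApiConfig.Register(config);   // hypothetical per-project registration
            Acme.ServiceB.WebApiConfig.Register(config);
            Acme.ServiceC.WebApiConfig.Register(config);

            app.UseWebApi(config);
        }
    }

    class Program
    {
        static void Main()
        {
            using (WebApp.Start<Startup>("http://localhost:9000/"))
            {
                Console.WriteLine("Listening on http://localhost:9000/");
                Console.ReadLine();
            }
        }
    }

This way the same three projects can be shipped as three self-hosted .exe files or as one combined host, without touching the service code itself.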
You can then use a separate configuration project in Visual Studio (or an external project in a completely different environment) and use deployment technologies like Puppet, Chef, or whatever else to deploy according to your scenarios.
Your code would then be unaffected by the deployment you actually wish to perform, and the respective configuration would be managed separately.
If this does not answer your question or if I have misunderstood your question, could you please clarify it and give more details?
When we talk about the meaning of life, here are two cents about it from a deployment (install) specialist :-) The answer seems long, but it contains specific pointers on where to look for every point.
First of all, let's state that deployment is NOT a no-brainer, though many developers would like to see it that way (and as a deployment specialist, I quite often observe stakeholders in the software development process actually thinking like this; simply put, deployment is kind of forgotten until the day before shipment :-)
Compare it with Coke, for a bold and simple example. The "developers" produce the liquid, but it is quite easy to realize that the job isn't done yet. :-)
Visual Studio itself does not really have support for deployment strategies. For the several areas of deployment mentioned in the following list there are of course a lot of technologies, some from Microsoft, that help with this.
What I would do is build setup bundles for different customers or scenarios, which install subsets of services, such as client/server scenarios or others (see no. 3 in the following list).
Second, as you may have seen from other answers, deployment is not deployment.
Partly, this depends on whether one sees deployment as just producing binary files with MSBuild, as deploying to a test system, or as deploying to the customer, e.g. by updating the production web site, producing DVDs or uploading executables to the update web site...
There are several different areas which certainly relate to each other, but each numbered area below is large and complicated enough to have its own specialists:
1. Deployment seen as part of the architecture has to deal with source and binary structures and entities, e.g. project and binary structure (how many .exe and .dll files there are, what their dependencies are, variation planning).
=> As you mentioned, here you are in the area of (Visual Studio, etc.) solutions and projects, as well as namespaces; especially in the WCF area you have contracts, etc., and you have (POC#) interfaces, etc. You have NuGet or other tools to resolve and manage dependencies.
.NET offers the concept of the assembly to deal with this; the architecture, e.g. whether to deploy interfaces and contracts in their own assemblies, how to deal with client/server scenarios, and how the assemblies depend on each other, is up to you and your architecture.
Concerning services, there is an interesting subtask: how to host services. They can be hosted on a web server, they can be self-hosted in an .exe, they can be hosted with IIS or OWIN, etc. Links for more information (a short self-hosting sketch follows after these links):
Selfhosting in WCF:
https://msdn.microsoft.com/en-us/library/ee939340.aspx
Selfhosting in a Windows service, here with SignalR:
https://code.msdn.microsoft.com/windowsapps/SignalR-self-hosted-in-6ff7e6c3
Hosting with OWIN:
https://en.wikipedia.org/wiki/Open_Web_Interface_for_.NET
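To make the self-hosting option concrete, here is a minimal WCF self-hosting sketch (service name, contract and port are placeholders; hosting an HTTP endpoint may require admin rights or a URL reservation via netsh http add urlacl):

    using System;
    using System.ServiceModel;

    [ServiceContract]
    public interface IAcmeServiceA
    {
        [OperationContract]
        string Ping();
    }

    public class AcmeServiceA : IAcmeServiceA
    {
        public string Ping() { return "pong from Service A"; }
    }

    class Program
    {
        static void Main()
        {
            var baseAddress = new Uri("http://localhost:8733/AcmeServiceA");
            using (var host = new ServiceHost(typeof(AcmeServiceA), baseAddress))
            {
                host.AddServiceEndpoint(typeof(IAcmeServiceA), new BasicHttpBinding(), "");
                host.Open();
                Console.WriteLine("Service A listening at " + baseAddress);
                Console.ReadLine();   // keep the .exe alive until Enter is pressed
            }
        }
    }

Each service hosted this way ships as just an .exe plus its config file, which keeps the deployable unit small.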
2. Deployment as part of integration with the local Windows (or another) operating system: You have to think about which system directories certain files or data should generally be placed in. You have to think about shared DLLs, shared data, project data, temporary data, user-specific data, the registry, the file system, Windows logo requirements, best practices, service configuration, etc.
3. Deployment as the process of creating setups, your own installations, which, besides other things, accomplish the actions mentioned in 2, with additional tasks like a graphical installation front-end (setup GUI), license acknowledgement, a what's-new section, selection of optional components (just think of the Visual Studio setup), uninstall/repair/modify options, and so on.
4. Deployment as a DevOps process, e.g. part of continuous integration, continuous delivery and/or continuous deployment. There are two main points here. The first, technically, is to have a defined process which does the things mentioned in 2 and 3 (or, alternatively, web deploy steps) automatically as part of the build process (a "post-build step").
This can include creating setups or hierarchies of setups, or working without setups at all. The second is to enable testers, developers and managers (or even customers) to see, at least every morning or even more often, an already installed example of the last nightly or daily build, maybe in several deployment variants (client/server? basic/professional?) or on different systems.
Here you are half in the developer world, half in the admin world.
The main point here is often not creating complicated setups as in 3, but primarily defining your own "pack" and copy (and sign, etc.) processes, and automating them as part of the development (and test and delivery) process. Puppet and Chef were already mentioned.
5. Deployment as web or cloud deployment (this can also be the endpoint of a DevOps process). Others have already said something about that, so I will omit the details here, but an important distinction is whether you are talking about deployment to the customer or deployment to an intermediate test or staging system.
Maybe one thing that makes this point worth mentioning in addition to DevOps is that deploying to online servers, server farms or a cloud comes with very particular challenges of its own.
6. Deployment seen primarily as an administrative process of distributing shippable, bought and/or in-house developed software to all the thousands of PCs in a company and its subsidiaries. There are of course special tools for this, including update strategy, monitoring, license management and more. Here you are in the admin world, not the developer world anymore. Microservices will be a new and serious challenge to admins, who are mostly used to installing and distributing "large" packages like MS Office or Oracle or whatever.
This topic is not as boring for developers as it may seem, primarily because the two "worlds" of developers and admins are merging. And developers have to care about the customer's view of "running the software in the real world". DevOps is only the beginning. Everybody knows virtual machines, but now we have software-defined networking, virtual apps, virtual server farms, the cloud, etc. You can define a deployment architecture through dependencies, without any programming, just by configuration. So deployment should be part of your application architecture, but mostly it isn't (or not enough). In fact, until now the admin view has hardly been integrated anywhere with the view of the software producers/developers. Concerning Microsoft, a lot of work has been done here by the Windows team, especially in the server product line, but AFAIK it was never really strategically coordinated with the developer team (this is probably true of EVERY software shop so far :-)
Currently, a lot of the people publishing about DevOps or the continuous-* buzzwords are not very experienced with setups. Building setups can be seen as a specialized technology among the other necessary steps.
Given that you are interested in knowing more about 3 (setups):
If you don't just want to copy executables, but want the functionality of full setups, which do more work than copying, part of your setup strategy can be to have bundle setups (sometimes called suite setups or bootstrapper setups) with their own selection features. They can call the underlying small setups, e.g. one for each of your microservices.
Visual Studio itself no longer has its own support for the more sophisticated setup types like MSI, and it especially never had support for grouping setups into bundles, which can be one possible way of deploying a bunch (or variants of a bunch) of services. VS does have some support for "ClickOnce" deployment, but that was made more for database ("smart") clients than for services, let alone microservices.
ClickOnce: https://msdn.microsoft.com/de-de/library/31kztyey.aspx
A replacement for the lack of "real" setup creation in Visual Studio can be the WiX toolset, an open-source project started by Microsoft employees, or InstallShield Express (which is free, but a limited variant of the commercial editions).
With both you can create full MSI setups, which are maybe the most sophisticated setup type in the Windows setup zoo.
a) Of course there are other setup types besides MSI (aka Windows Installer); they come from third-party vendors and are more or less proprietary, but simpler, e.g. Nullsoft NSIS and Inno Setup.
I will not give links for creating single MSI setups, because they can easily be found starting from the links below about creating bundles of MSI setups:
b)
The tool for creating setups that select and install other setups (defined subsets of underlying setups) in the WiX "world" is called "Burn":
Creating bundles of setups with Burn:
http://wixtoolset.org/documentation/manual/v3/bundle/
Special (paid) support for this can be obtained, for example, from the founder of WiX, who created a company especially for this:
https://www.firegiant.com/wix/tutorial/net-and-net/bootstrapping/
Rob Mensching, the founder, can be found here on Stack Overflow as well, answering related questions.
c) InstallShield Suite setups:
Another option is the already mentioned tool InstallShield, but for this you will need their InstallShield Premium variant, which costs money:
http://helpnet.installshield.com/installshield21helplib/helplibrary/SteCreatingSuites.htm
d) Setup Factory:
https://www.indigorose.com/setup-factory/
e) I am sure many people would advise you to take a look at Docker.
Virtual applications are not just setups; in the "installed" state they also isolate themselves from other apps, like a sandbox.
See for example https://docs.docker.com/docker-for-windows/
f)
The list would not be complete without mentioning App-V, a virtual application installation technology which shares some, but not all, features with Docker. These technologies are not really made for orchestrating multiple deliveries, though, but for delivering a single app.
Microsoft has also defined a newer setup type called AppX.
In particular, you have to distinguish whether you want to create "legacy" (full) desktop applications for Windows, for which MSI setups are the established technology, or store apps, which are the new type since Windows 8 (aka Universal Windows apps, aka Windows Store apps, aka modern apps, aka Metro apps).
AppX:
https://msdn.microsoft.com/en-us/library/windows/desktop/hh446767(v=vs.85).aspx
AppX is a simpler setup type than MSI.
Universal Windows apps (UWP):
https://learn.microsoft.com/en-us/windows/uwp/get-started/whats-a-uwp
For anything more detailed, we would have to know more about your requirements.
My friends and I will start working on a C# database project. We will use Microsoft VS 2015 and SQL Server 2014. Is there any way that our Visual Studio installations (on separate laptops) can connect to the same project?
For example, if one of my friends removes a class from the project, that class should also be removed from my copy of the project. Also, if he adds something, that change should also show up in our VS solution.
If you have a DB project in Visual Studio, you should connect it to some version control system. After that, every change made by your friends can be fetched/pulled to your local machine, and you can then build the DB project. The same goes for code changes in your main project. Read about SVN and Git and choose what is better for you.
Git is a free and open source distributed version control system
designed to handle everything from small to very large projects with
speed and efficiency.
Getting your project on GitHub
Subversion is a free/open source version control system (VCS). That
is, Subversion manages files and directories, and the changes made to
them, over time. This allows you to recover older versions of your
data or examine the history of how your data changed. In this regard,
many people think of a version control system as a sort of “time
machine.”
Subversion can operate across networks, which allows it to be used by
people on different computers. At some level, the ability for various
people to modify and manage the same set of data from their respective
locations fosters collaboration. Progress can occur more quickly
without a single conduit through which all modifications must occur.
And because the work is versioned, you need not fear that quality is
the trade-off for losing that conduit—if some incorrect change is made
to the data, just undo that change.
Some version control systems are also software configuration
management (SCM) systems. These systems are specifically tailored to
manage trees of source code and have many features that are specific
to software development—such as natively understanding programming
languages, or supplying tools for building software. Subversion,
however, is not one of these systems. It is a general system that can
be used to manage any collection of files. For you, those files might
be source code—for others, anything from grocery shopping lists to
digital video mixdowns and beyond.
Importing Data Into A Repository SVN
If you are not familiar with Git/SVN, I advise you to use SVN; it is easier to understand. Git has its advantages when your team is really big and for open source. At the moment, Git is the "future" of version control.
Team Foundation
You can use Team Foundation Version Control (TFVC) to scale from small
to large projects, and by using server workspaces, you can scale up to
very large codebases with millions of files per branch and large
binary files. TFVC is a centralized version control system that lets
you apply granular permissions and restrict access down to a file
level. Because your team checks in all their work into your Team
Foundation server, you can easily audit changes and identify which
user checked in a changeset. By using compare and annotate you can
identify the exact changes that they made.
https://www.visualstudio.com/tfs/
GitLab
GitLab Inc. is a company based on the GitLab open-source project.
GitLab is an application to code, test, and deploy code together. It
provides Git repository management with fine grained access controls,
code reviews, issue tracking, activity feeds, wikis, and continuous
integration.
https://about.gitlab.com/
Bitbucket
Bitbucket is a web-based hosting service for projects that use either
the Mercurial (since launch) or Git (since October 2011) revision
control systems. Bitbucket offers both commercial plans and free
accounts. It offers free accounts with an unlimited number of private
repositories (which can have up to five users in the case of free
accounts) as of September 2010, but by inviting three users to join
Bitbucket, three more users can be added, for eight users in total.
Bitbucket is written in Python using the Django web framework.
https://www.atlassian.com/software/bitbucket
What advice or points would you give developers to follow or avoid in the early stages of developing a database-driven ASP.NET website, so that we can have easy and efficient deployment, especially for developing (creating) the database so that it can easily be deployed later to my shared hosting server?
Edit 1
I'm sorry, but I still didn't get any detailed advice, especially about the database.
I mean, I'm creating my website database using SQL Express (I'm not sure which version; this is from the connection string: AttachDbFilename="C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\DB.mdf"). I use the Database Diagrams option in the studio to create FKs and the relations between tables.
So how can I copy this database structure and data later so it can be used on a server? I was thinking maybe I should do it all in SQL, save the script, and run it later on a database that I'd create on the deployment server.
Just some thoughts! I hope to hear some great ways to do it from developers who have already deployed websites before!
If possible, you should practice an iterative development approach including continuous deployment. Even if you only deploy iteratively to a staging area, you will be exercising many of your processes. This gives you a chance to fail early and often, thereby making your final deployment smoother.
From a prioritisation perspective: software development has the end goal of delivering functionality, and if you can't deploy, then you can't deliver any functionality.
For most projects - web or otherwise - the first story should be something like "As a user, I want to be able to install the product, so I can run it." This usually causes the development of the deployment mechanism to be done very early, and maintained as the codebase changes when additional stories are completed.
The deployment mechanism should be your way of delivering functionality to the customer for approval and testing.
It is very important to avoid getting to the end of a project and having to ask "okay, now how do we deploy it?"
Edited to add: Also, make 100% certain you're aware of the licensing and distribution restrictions on any third-party components you're using. Pay particular attention to any free code that may be covered by licences like the GPL. Check whether any commercial components you're using require royalties per deployment, or require special 'server' licences.
I have a client with an ASP.NET web application they sell to people, who then either host it with our company or elsewhere. The end result is that source code and database setups can be spread across multiple servers, so when we push updates, we have to do it manually by copying over the source code and then updating the databases as needed. Are there any good alternatives for doing this across multiple servers?
If you're using Visual Studio 2010, then you could consider the new Web Deployment Package technology. See ASP.NET Web Application Project Deployment Overview, which explains how you can also deploy IIS settings and even the necessary databases as part of the deployment.
Have a look at the video on this page. It gives a decent introduction to MSDeploy.
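If you go the Web Deployment Package / MSDeploy route, pushing the same package to every server is essentially one msdeploy call per target. A rough sketch (package name, server name and parameter file are placeholders):

    msdeploy.exe -verb:sync -source:package="AcmeWebApp.zip" -dest:auto,computerName="Server01" -setParamFile:"AcmeWebApp.SetParameters.xml"

A small script that loops over your list of servers already beats copying source files around by hand.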
An MSI would probably work best. Odds are your customers will want to be in control of when, or if, the update happens, and an MSI is probably the simplest mechanism for delivering such an update.
John and Trip have great answers. I just wanted to add one little thing, unless this is an open source project:
Don't deploy source code.
Your client's app should be compiled and obfuscated.
I have a three-tier application which is installed in corporate environments. With every server version update, all clients have to be updated too. Currently, I provide an MSI package which is deployed automatically via Active Directory, but my customers (mostly with 20-300 users each) seem to hate the MSI solution because:
It is complicated to get running (they have little Active Directory knowledge);
The update process can't be triggered by the server when a new version is detected;
Customers can't install multiple versions of the client (e.g. 2.3 and 2.4) at the same time to talk to different servers;
The update process itself doesn't always work as expected (sometimes there is very strange behaviour that heals itself after a few hours).
I've now experimented a bit with ClickOnce, but that approach is too inflexible for me and too hard to integrate into my automated build process. Also, it produces cryptic error messages which would surely confuse my customers.
I would have no problem writing the update logic myself, but the problem there is that the users running the self-updating applications have rights too restricted to perform an update. I've found that they are able to write to their Local Application Data directory, but I don't think this is the typical place to install application files.
Do you know of a way to implement an update that "just works"?
You can somewhat replicate what ClickOnce does, just adjust it for your needs.
Create a lightweight executable that checks a network/web location for updates.
If there are updates, it copies them locally and replaces the "real" application files.
It runs the "real" application.
The location for the application files should be determined by permissions and the operating system. If users only have write permission to a limited set of folders, then you have no choice but to use one of those folders. Another option is to provide an initial installation package that installs the lightweight executable and grants read/write permission on a specific folder such as "C:\Program Files\MyApp". This approach usually requires buy-in from IT.
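A minimal sketch of such a lightweight launcher, assuming a hypothetical update share and using the Local Application Data folder as the install location (all paths and file names are placeholders):

    using System;
    using System.Diagnostics;
    using System.IO;

    class Launcher
    {
        // Assumption: new builds are dropped here together with a version.txt file.
        const string UpdateShare = @"\\server\updates\MyApp";

        static readonly string AppDir = Path.Combine(
            Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "MyApp");

        static void Main()
        {
            try
            {
                var remote = new Version(File.ReadAllText(Path.Combine(UpdateShare, "version.txt")).Trim());
                var localFile = Path.Combine(AppDir, "version.txt");
                var local = File.Exists(localFile) ? new Version(File.ReadAllText(localFile).Trim()) : new Version(0, 0);

                if (remote > local)
                {
                    // Copy the new files over the old ones; the "real" app is not running yet, so nothing is locked.
                    Directory.CreateDirectory(AppDir);
                    foreach (var file in Directory.GetFiles(UpdateShare))
                        File.Copy(file, Path.Combine(AppDir, Path.GetFileName(file)), true);
                }
            }
            catch (Exception ex)
            {
                // If the update check fails (share unreachable, etc.), just start the installed version.
                Console.Error.WriteLine("Update check failed: " + ex.Message);
            }

            Process.Start(Path.Combine(AppDir, "MyApp.exe"));   // run the "real" application
        }
    }

The initial installation package then only has to put this launcher (and a shortcut to it) on the machine; every later version is pulled from the share under the user's own rights.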
I hope this helps.
It is really hard to give you exact answers because critical information about the client-side installer is missing. Do you install client-side files into Program Files? Then you may run into problems when users' rights are restricted.
You don't think Local Application Data is a folder to deploy an application to, but Google does: its Chrome browser installs that way on Windows, and its automatic update process is even unnoticeable (which may sound horrible). So why not deploy your application into this folder for restricted users? You can find more about the Chrome installer here:
http://robmensching.com/blog/archive/2008/09/04/Dissecting-the-Google-Chrome-setup.aspx
Here's an open-source solution I wrote to address specific needs we had for WinForms and WPF apps. The general idea is to have the greatest flexibility at the lowest possible overhead. It should give you all the flexibility you need for everything you have described.
Integration is super easy, and the library does pretty much everything for you, including synchronizing operations. It is also highly flexible and lets you determine which tasks to execute and under what conditions; you make the rules (or use some that are already there). Last but not least is the support for any update source (web, BitTorrent, etc.) and any feed format; whatever is not implemented, you can just write yourself.
Cold updates (requiring an application restart) are also supported and are done automatically unless "hot-swap" is specified for the task.
This all boils down to one DLL, less than 70 KB in size.
More details at http://www.code972.com/blog/2010/08/nappupdate-application-auto-update-framework-for-dotnet/
Code is at http://github.com/synhershko/NAppUpdate (Licensed under the Apache 2.0 license)
I plan on extending it more when I get some time, but honestly you should be able to quickly enhance it yourself for anything it doesn't currently support.
If you don't want to give your users too many rights, it is possible to write a Windows service which runs on each computer under an account with the appropriate privileges, and which can update your application when a new version becomes available.
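A rough sketch of what such a service could look like (share path, install directory and polling interval are placeholder assumptions; a real implementation would also have to handle files locked by a running client, e.g. by staging the update and swapping it in on the next start):

    using System;
    using System.IO;
    using System.ServiceProcess;
    using System.Timers;

    public class UpdateService : ServiceBase
    {
        // Check for new versions once per hour (placeholder interval).
        private readonly Timer _timer = new Timer(TimeSpan.FromHours(1).TotalMilliseconds);

        protected override void OnStart(string[] args)
        {
            _timer.Elapsed += (s, e) => CheckForUpdate();
            _timer.Start();
        }

        protected override void OnStop()
        {
            _timer.Stop();
        }

        private void CheckForUpdate()
        {
            const string updateShare = @"\\server\updates\MyApp";   // assumption: where new builds live
            const string installDir = @"C:\Program Files\MyApp";    // the service account needs write access here

            // Naive approach: copy any file that is newer on the share than the installed copy.
            foreach (var source in Directory.GetFiles(updateShare))
            {
                var target = Path.Combine(installDir, Path.GetFileName(source));
                if (!File.Exists(target) || File.GetLastWriteTimeUtc(source) > File.GetLastWriteTimeUtc(target))
                    File.Copy(source, target, true);
            }
        }

        public static void Main()
        {
            Run(new UpdateService());
        }
    }

Because the service, not the logged-on user, performs the copy, the clients themselves can keep running with restricted rights.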