How and when should I run database migrations in this case? - c#

I am maintaining an open-source Software as a Service project, which uses AWS RDS (PostgreSQL) as its database, and GitHub Actions as CI/CD pipeline. I am using Entity Framework Core as my ORM.
I've heard that it's a good idea not to run migrations during application startup - especially if you're on a scale-out system like mine with multiple application nodes.
That's fine, and I guess running them at the CI/CD step would be the best thing to do? However, I would rather not give my GitHub Actions workflows access to the database (even if the connection string is stored in secrets), because right now it is not publicly accessible, and I would like it to stay that way.
In addition, I don't feel like my project is big enough to justify having a release management system like Octopus Deploy.
What do I do here? Is it possible to whitelist GitHub Actions only but not have public access in general, in terms of RDS? Am I even on the right track here?
I am also using AWS Systems Manager's Parameter Store (KMS encrypted), if that's of interest.

Is it possible to whitelist GitHub Actions only but not have public access in general, in terms of RDS?
This should be possible; however, the range of IPs shared by the hosted agents could be huge. You can get it via a REST API call, as mentioned here. The list is updated each week, so you would have to maintain your whitelist.
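As a rough sketch of what automating that maintenance could look like: GitHub's Meta API (https://api.github.com/meta) publishes the hosted runners' CIDR blocks under an "actions" field. A small job along these lines could fetch the current list; what you then do with the output (and the job's name) is an assumption, not something from the question:

    // Sketch: fetch the current GitHub-hosted runner IP ranges from the
    // GitHub Meta API. The list changes over time, so a job like this
    // would need to re-run periodically and update the RDS security group.
    using System;
    using System.Net.Http;
    using System.Text.Json;
    using System.Threading.Tasks;

    class FetchActionsRanges
    {
        static async Task Main()
        {
            using var http = new HttpClient();
            // GitHub's API rejects requests without a User-Agent header.
            http.DefaultRequestHeaders.UserAgent.ParseAdd("whitelist-updater");

            var json = await http.GetStringAsync("https://api.github.com/meta");
            using var doc = JsonDocument.Parse(json);

            // Each entry is a CIDR block you would allow in the security group.
            foreach (var cidr in doc.RootElement.GetProperty("actions").EnumerateArray())
                Console.WriteLine(cidr.GetString());
        }
    }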
Another approach could be a self-hosted agent: then you own the agent used to build and deploy your code. You can assign it a static IP and make whitelist maintenance easy.
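Whichever way the runner reaches the database, the migration step itself can then live outside application startup, e.g. as a small one-shot console project the pipeline runs before deploying the app nodes. A minimal sketch, assuming an EF Core context named AppDbContext (illustrative) and the Npgsql.EntityFrameworkCore.PostgreSQL provider package:

    // One-shot migration runner: executed as a dedicated pipeline step rather
    // than at application startup, so multiple app nodes never race to migrate.
    using System;
    using Microsoft.EntityFrameworkCore;

    public class AppDbContext : DbContext
    {
        public AppDbContext(DbContextOptions<AppDbContext> options) : base(options) { }
        // The real DbSet<> properties live in the application project.
    }

    public static class MigrationRunner
    {
        public static void Main()
        {
            // Injected by the pipeline (e.g. resolved from Parameter Store);
            // never committed to the repository.
            var connectionString = Environment.GetEnvironmentVariable("DB_CONNECTION_STRING")
                ?? throw new InvalidOperationException("DB_CONNECTION_STRING is not set.");

            var options = new DbContextOptionsBuilder<AppDbContext>()
                .UseNpgsql(connectionString)
                .Options;

            using (var context = new AppDbContext(options))
            {
                // Applies any pending migrations, creating the database if needed.
                context.Database.Migrate();
            }

            Console.WriteLine("Migrations applied.");
        }
    }

Because this runs as its own process, the application itself never needs migration rights against the database.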

Related

How to allow for multiple types of deployment?

In my search for the meaning of life, I stumbled upon a blog post that mentioned that your deployment strategy is not your architecture; it is simply an implementation detail, and as such we need to design to allow different deployment patterns, whether you want to deploy your system to one node or multiple nodes, or in another type of structure.
Do the latest versions of Visual Studio provide some kind of flexibility (besides Azure) to be able to deploy services in a variety of strategies?
For example, let's say I have a solution
Acme Solution
--Acme Startup Proj
--Acme Service A.csproj
--Acme Service B.csproj
--Acme Service C.csproj
I want to be able to deploy this entire solution as 1 solution, or I would like to be able to deploy 3 separate binaries, one for each microservice.
AcmeServiceA.exe
AcmeServiceB.exe
AcmeServiceC.exe
What does Visual Studio give you in terms of flexibility of deployment configuration?
Deployment techniques will vary with the technologies your app is built on. For the sake of an example, I'm going to assume we're dealing with web services or sites.
You've specified two deployment scenarios: deploying a single project (e.g. microservice), and deploying all projects (full rollout). Let's start small...
Deploying an individual project
The main thing to plan for is each deployable atom (this could be a project, or a service plus its DB backend... something small enough that you would prefer not to split it into smaller deployments).
For web projects (whether Web API projects or other types), Visual Studio's built-in options can be generally summarized as: WebDeploy, Azure, and now with .NET Core, Docker images. I'm not going to go into the details of each, because those are separate questions. But I may refer to some details for you to research if they sound interesting (I'm more familiar conceptually with WebDeploy, so I'll refer to that a lot; but I'm not advocating for or against it).
If you were using WebDeploy for example, you could have each project produce a WebDeploy Package. (Again, look this up for more details on how to do it). This package can be crafted to contain a file payload (the site/service files) as well as a database payload, or other subatoms using the WebDeploy provider model. Visual Studio has pretty decent support for this scenario, and there is documentation on it.
Or you could generate a Docker image. From my understanding (and my lack of experience with Docker as yet), if you wanted to deploy your web service and database, they ought to be in separate containers. You'll soon find yourself building these yourself outside of VS. That's not a bad thing; Docker sounds very flexible once you get the hang of it, but you are leaving the IDE for this.
Either way, now you can deploy the atomic package. This was the easy part.
Deploying the solution
So, you've got lots of these atomic deployment packages. How do you roll them all out?
Well, at this point VS doesn't provide a lot for you. And it's hard to justify what VS should do here. Almost every organization is going to come up with slightly different rules. Do you deploy from your CI? Do you create packages and deploy them to different environments in your release pipeline? Or do you do it in the cloud and hotswap environments (like Azure deployment slots)?
A VS native solution has to be either extremely configurable (and hence extremely complicated), or it will be too simple to fit most customers' needs. (As an aside, the initial support for WebDeploy back in VS2010 erred toward the first of these. It was extremely configurable, and very difficult for customers or even the product team to wrap their heads around all of the possible scenarios. Source: I was the QA for that feature once upon a time.)
Really, at this point you need to determine how and when you roll out your deployments. You need something to orchestrate each of these deployments.
VS generally orchestrates things with MSBuild. Again, I'm not advocating this as your orchestration platform (I actually dislike it for that... it's OK for your project configuration, but IMO not a good fit for task management), but if this is what you want to use, it can work. It's actually pretty simple if you're using it for the Web Project scenario. You can build your solution with the parameter /p:DeployOnBuild=true. If you are using WebDeploy to publish directly, you're done! If you're creating WebDeploy Packages, then you still need to push those, but at least you've created them all at once.
If you are using WebDeploy Packages, they will each generate a script to use for publishing. There are ways of passing in different WebDeploy parameters as well, so you can reuse the same package (build output) to publish to different environments. However, you'll have to write your own script to combine all of these into one megalithic deployment.
Ditto for Docker as well. You may get a set of images, but you still need something to orchestrate publishing all of them. Tools like Kubernetes can help you roll out, or in the event of issues, roll back.
There's also more generic orchestration platforms like Octopus Deploy.
How Unsatisfying!
Yeah, it kind of sucks that there isn't an out-of-the-box solution for large scale deployments. But if there was, it wouldn't work for 95% of teams. Most of what VS does provide is enough for an individual or very small development team to get their code to their servers. Any larger of a team, and you'll get better mileage out of building a system that is tailored for how your team operates. There are plenty of tools out there and none of them work perfectly in all cases. Find one that works for you, and you'll be fine. And in the end, it all comes down to pushing files and running scripts. If you don't like one system or tool, you can try another one.
If you are looking for an improved deployment experience in Visual Studio, check out Flexera's InstallShield Limited Edition in-box solution (ISLE, http://blogs.msdn.com/b/visualstudio/archive/2013/8/15/what-s-new-in-visual-studio-2013-and-installshield-limited-edition.aspx). ISLE is a great solution for customers looking for capabilities not found in Visual Studio Installer Projects, such as TFS and MSBuild integration, support for creating new web sites, ISO 19770-2 tagging support, etc.
VS2015: https://marketplace.visualstudio.com/items?itemName=VisualStudioProductTeam.MicrosoftVisualStudio2015InstallerProjects
VS2017: https://marketplace.visualstudio.com/items?itemName=VisualStudioProductTeam.MicrosoftVisualStudio2017InstallerProjects
With the Setup and Deployment project templates you can choose to package all assemblies in the solution, or each one individually as a microservice, using Setup, Web, CAB or Merge Module Projects, and then choose which assemblies are included.
How to achieve the requested flexibility really depends on the exact use case and on your definition of an acceptable level of flexibility.
Taking your example, with the three executables as separate microservices (Service A, B, C) and as a complete service (Startup), in the context of Web API you could do the following:
Each project (Service A, B, C) can be designed as a separate OWIN self-hosted executable (as outlined in Use OWIN to Self-Host ASP.NET Web API 2) and provide one or more endpoints to be exposed (a sketch of this follows below).
The main project (Startup) could also be an OWIN self-host, or a regular IIS Web API application that references the three projects (Service A, B, C) and loads their respective endpoints in its own Startup routine (and optionally additional endpoints of its own).
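A rough sketch of the self-hosted variant, following the linked walkthrough (the class name, port, and routes are illustrative; the Microsoft.AspNet.WebApi.OwinSelfHost package provides the hosting pieces):

    // Minimal OWIN self-host for one microservice (Service A).
    using System;
    using System.Web.Http;
    using Microsoft.Owin.Hosting;
    using Owin;

    public class ServiceAStartup
    {
        // Called by the OWIN host to configure the HTTP pipeline.
        public void Configuration(IAppBuilder app)
        {
            var config = new HttpConfiguration();
            config.MapHttpAttributeRoutes();
            config.Routes.MapHttpRoute(
                name: "DefaultApi",
                routeTemplate: "api/{controller}/{id}",
                defaults: new { id = RouteParameter.Optional });
            app.UseWebApi(config);
        }
    }

    public static class Program
    {
        public static void Main()
        {
            const string baseAddress = "http://localhost:9000/";

            // The same ServiceAStartup configuration could instead be invoked
            // from a combined Startup host that composes all three services.
            using (WebApp.Start<ServiceAStartup>(url: baseAddress))
            {
                Console.WriteLine("Service A listening on " + baseAddress);
                Console.ReadLine();
            }
        }
    }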
You can then use a separate configuration project in Visual Studio (or an external project in a completely different environment) and make use of deployment technologies like Puppet, Chef, or whatever to deploy according to your scenarios.
Your code would then be unaffected by the deployment you are actually wishing to perform and that respective configuration would be managed separately.
If this does not answer your question or if I have misunderstood your question, could you please clarify it and give more details?
When we talk about the meaning of life, here are two cents about it from a deployment (install) specialist :-) The answer is (or at least seems :-) long, but it contains specific pointers on where to look for every point.
First of all, let's state that deployment is NOT a no-brainer, though many developers would like to see it like that (and as a deployment specialist, I quite often observe stakeholders in the software development process actually thinking like this; simply put, deployment is kind of forgotten until the day before shipment :-)
Compare it with Coke, for a bold and simple example. The "developers" produce the liquid, but it is quite easy to realize here that the job isn't done yet. :-)
Visual Studio itself does not really have support for deployment strategies. For the several areas of deployment mentioned in the following list, there are of course a lot of technologies, some from Microsoft, that help with that.
What I would do is build setup bundles for different customers or scenarios, which install subsets of services such as client/server scenarios or others (see no. 3 in the following list).
Second, as you may have seen from other answers, deployment is not deployment.
Partly, this depends on whether one sees deployment as just producing binary files with MSBuild, deploying to a test system, or deploying to the customer, e.g. by updating the production web site, producing DVDs, or uploading executables to the update web site...
There are several different areas which surely have relations to each other, but every numbered area below is large and complicated enough to have its own specialists:
Deployment seen as part of architecture has to deal with source and binary structures and entities, e.g. project and binary structure (how many .exe and .dll files, what their dependencies are, variation planning).
=> As you mentioned, you are here in the area of (Visual Studio, etc.) solutions and projects, as well as namespaces; especially in the WCF area you have contracts, etc., you have (POC#) interfaces, etc. You have NuGet or other tools to resolve and manage dependencies.
.NET offers the concept of the assembly to deal with this; the architecture, e.g. whether to deploy interfaces and contracts in their own assemblies, how to deal with client/server scenarios, and how the assemblies depend on each other, is up to you and your architecture.
Concerning services, there is an interesting subtask of how to host services. They can be hosted on a web server, they can be self-hosted in an .exe, they can be hosted with IIS or OWIN, etc. Links for more information:
Self-hosting in WCF:
https://msdn.microsoft.com/en-us/library/ee939340.aspx
Self-hosting in a Windows service, here with SignalR:
https://code.msdn.microsoft.com/windowsapps/SignalR-self-hosted-in-6ff7e6c3
Hosting with OWIN:
https://en.wikipedia.org/wiki/Open_Web_Interface_for_.NET
Deployment as part of local Windows or other operating system integration: you have to think about which system directories to place certain files or data in generally. You have to think about shared DLLs, shared data, project data, temporary data, user-specific data, the registry, the file system, Windows logo requirements, best practices, service configuration, etc.
Deployment as a process of creating setups, your own installations, which, besides other things, accomplish the needed actions mentioned in 2, with additional tasks like a graphical installation front-end (setup GUI), license acknowledgement, a what's-new section, selection of optional components (just think of the Visual Studio setup), uninstall/repair/modify possibilities, and so on.
Deployment as a devops process, e.g. part of continuous integration, continuous delivery, and/or continuous deployment. Here there are two main points. The first, technically, is to have a defined process which does the things mentioned in 2 and 3 (or alternatively web-deploy steps) automatically as part of the build process (a "post-build step").
This can include creating setups or hierarchies of setups (or working without setups at all). The second is to enable testers, developers, and managers (or even customers) to see, at least every morning or even more often, an already-installed example of the last nightly or daily build, maybe in several deployment variants (client/server? basic/professional?) or on different systems.
You are here half in the developer world, half in the admin world.
Here the main point is often not creating complicated setups as in 3, but primarily defining your own "pack" and copy (and sign, etc.) processes, and automating them as part of the development (and test and delivery) process. Puppet and Chef were already mentioned.
Deployment as web or cloud deployment (this can also be the endpoint of a devops process): others have said something about that, so I will omit details here, but an important distinction is whether you are talking about deployment to the customer or deployment to an intermediate test or staging system.
Maybe one thing that makes this point worth mentioning in addition to devops is that deploying to online servers, server farms, or a cloud has its very own challenges.
Deployment seen primarily as an administrative process of distributing shippable, bought, and/or in-house software to all the thousands of PCs in a company and its subsidiaries. There are of course special tools for this, including update strategy, monitoring, license management, and more. You are here in the admin world, not the developer world anymore. Microservices will be a new and big challenge to admins, who are mostly used to installing and distributing "large" packages like MS Office or Oracle or whatever.
This topic is not as boring for developers as it seems, primarily because the two "worlds" of developers and admins are merging, and developers have to care about the customer's view of "running the software in the real world". Devops is only the beginning. Everybody knows virtual machines, but now we have software-defined networking, virtual apps, virtual server farms, the cloud, etc. You can define a deployment architecture by dependencies, without any programming, just by configuration. So deployment should be part of your application architecture, but mostly it isn't (enough). In fact, until now the admin view has hardly been integrated with the view of the software producers/developers. Concerning Microsoft, a lot of work has been done here by the Windows team, especially in the server product line, but AFAIK that was never really strategically coordinated with the developer team (this is probably true of EVERY software shop until now :-)
Currently, a lot of people publishing about devops or the continuous-* buzzwords are not very experienced with setups. Building setups can be seen as a specialty among the other necessary steps.
Given that you are interested in knowing more about 3 (setups):
If you don't want to only copy executables, but to have the functionality of full setups, which do more work than just copying, part of your setup strategy can be to have bundle setups (sometimes called suite setups or bootstrapper setups) with their own selection features. These can invoke the underlying small setups, e.g. for your microservices.
Visual Studio itself no longer has its own support for the more sophisticated setup types like MSI, and especially never had support for grouping setups into bundles, which can be one possible solution for deploying a bunch (or variants of bunches) of services. VS has, e.g., some support for "ClickOnce" deployment, but that was made more for database ("smart") clients than for services or even microservices.
ClickOnce: https://msdn.microsoft.com/de-de/library/31kztyey.aspx
A replacement for the lack of "real" setup creation in Visual Studio can be the WiX toolset, an open-source project started by Microsoft employees. Or InstallShield Express (which is free, but a limited variant of the commercial editions).
With both you can create full MSI setups, which are maybe the most sophisticated setup type in the Windows setup zoo.
a) Of course there are other setup types besides MSI (aka Windows Installer); they come from third-party vendors and are more or less proprietary, but simpler, e.g. Nullsoft's NSIS and Inno Setup.
I will not give links for creating single MSI setups, because they can easily be found starting from the following links about creating bundles of MSI setups:
b) The tool in the WiX "world" for creating setups that select and install other setups (defined subsets of the underlying ones) is called "Burn":
Creating bundles of setups with Burn:
http://wixtoolset.org/documentation/manual/v3/bundle/
Special (paid) support for this can be obtained, for example, from the founder of WiX, who created a company especially for this:
https://www.firegiant.com/wix/tutorial/net-and-net/bootstrapping/
Rob Mensching, the founder, can be found here on Stack Overflow as well, answering dedicated questions.
c) InstallShield Suite setups:
Another option is the already-mentioned tool InstallShield, but for this you will need their InstallShield Premium variant, which costs money:
http://helpnet.installshield.com/installshield21helplib/helplibrary/SteCreatingSuites.htm
d) Setup Factory:
https://www.indigorose.com/setup-factory/
e) I am sure many people would advise taking a look at Docker.
Virtual applications are not only setups; they also isolate themselves in the "installed" state from other apps, like a sandbox.
See for example https://docs.docker.com/docker-for-windows/
f) The list would not be complete if I did not mention App-V, a virtual-application installation technology which shares some but not all features with Docker. However, these technologies are not really made for orchestrating multiple deliveries; they deliver just one app.
And Microsoft has defined a new setup type called AppX.
In particular, you have to distinguish whether you want to create "legacy" (full) desktop applications for Windows, for which MSI setups are the established technology, or store apps, which are the new type since Windows 8 (aka Universal Windows apps, aka Windows Store apps, aka modern apps, aka Metro apps).
AppX:
https://msdn.microsoft.com/en-us/library/windows/desktop/hh446767(v=vs.85).aspx
AppX aims to be a simpler setup type than MSI.
Universal Windows apps (UWP):
https://learn.microsoft.com/en-us/windows/uwp/get-started/whats-a-uwp
For anything more detailed we have to know more of your requirements.

NuGet/OpenWrap for Deployments and Run Time Dependency Management

I fully understand what NuGet/OpenWrap were primarily made and designed for, and how they have been adopted and applied since they were released a while ago.
I can, however, see cases for using them in yet another way. One of the things I was thinking of concerns run-time dependencies.
The enterprise product suite that I'm working on basically comes with a core that consists of various services and optional modules. These modules plug right in to make specific functionality available, forming unique solutions as per requirements. These unique solutions get deployed to remote servers in-house, in data centers, in the cloud, on your patio... pretty much anywhere.
Needless to say, deployments of updates for bugfixes and maintenance are complicated and have to be carried out manually, which has proven to be error-prone and clumsy, especially since interface revisions and other components have to match, and major deployments usually require a deployment of each and every module.
Personally I'm not a big fan of creating installer packages (MSI, Web Installer, etc.) for every unique solution as this would get out of hand soon and doesn't scale very well.
I was wondering whether a package manager and custom feeds could help us streamline this process. Maybe I'm thinking in the wrong direction; I would appreciate comments and thoughts.
We've done that successfully. OpenWrap can simply be called to update packages into specific directories. Deploying an app is then just a matter of adding a new descriptor with the packages you want to see deployed, and letting OpenWrap do the resolution for you.
This works well especially because OpenWrap has the concept of a system repository (which is per user), which can also be redirected (in case you want to partition multiple repositories, one per application, or for testing...).
Deploying a new app is then only a matter of either adding a new folder with an associated descriptor, or adding the application straight into the system repository. Auto-update can be implemented by simply running the OpenWrap command-line tools in a batch job.
If you want to go one level up, you can make your application composite by leveraging the OpenWrap API and adding/removing packages dynamically. We have runtime assembly resolution available.

Rules to follow in the development stage, so it makes the deployment stage easier

What advice/points would you give developers to follow or avoid in the development and early stages of a database-driven ASP.NET website, so that we could have an easy and efficient deployment, especially creating the database in development so that it can easily be deployed later to my shared hosting server?
Edit 1
I'm sorry, but I still haven't gotten any detailed advice, especially about the database.
I mean, I'm creating my website database using SQL Server Express (I'm not sure which version; this is from the connection string: AttachDbFilename=C:\Program Files\Microsoft SQL Server\MSSQL10_50.MSSQLSERVER\MSSQL\DATA\DB.mdf). I use the Database Diagrams option in the studio to create FKs and the relations between tables.
So how can I copy this database structure and data in the future to be used on a server? I was thinking maybe I should do it all in SQL, save the script, and run it later on a database that I'd create on the deployment server.
Just some thoughts! I hope to hear some great ways to do it from developers who have already deployed websites before!
If possible you should practice an iterative development approach, including continuous deployment. Even if you only deploy iteratively to a staging area, you will be exercising many of your processes. This gives you a chance to fail early and often, thereby making your final deployment smoother.
From a prioritisation perspective: software development has the end goal of delivered functionality, and if you can't deploy, then you can't deliver any functionality.
For most projects - web or otherwise - the first story should be something like "As a user, I want to be able to install the product, so I can run it." This usually causes the development of the deployment mechanism to be done very early, and maintained as the codebase changes when additional stories are completed.
The deployment mechanism should be your way of delivering functionality to the customer for approval and testing.
It is very important to avoid getting to the end of a project and having to ask "okay, now how do we deploy it?"
Edited to add: Also make 100% certain you're aware of the licensing and distribution restrictions on any third-party components you're using. Pay particular attention to any Free code that may be covered by licences like the GPL. Check whether any commercial components you're using require royalties per deployment, or require special 'server' licences.

Guidance for Migrating MS Access Apps to .Net Apps

I will soon begin the painful (kidding) process of migrating multiple, separate Access applications to "real" applications (notice the quotes, no flame wars please). Most likely these will be web apps, as the usual reason is multiple users and deployability, but I will take it case by case.
Some of these are traditional Access apps using Access as the back end, and others are using SQL Server (a central one) as the back end.
What I am looking for is a combination of your experience doing this and what resources you used to help.
Websites, apps, standards, best practices, gotcha's, don't forget's, etcetera.
I am a one-person C# shop with a SQL Server back end, so whether web or not, I will be looking in that direction.
Also, is it overkill or unattainable to try and develop a Framework for this kind of thing? Would there just be TOO MANY variables to even try and walk this path? Anyone ever try this?
Some further info based on the questions below: we currently have ~250 users, and they are spread across 5 locations.
What I meant by deployability is perhaps a little vague. I simply meant that we are a non-profit organization, and as such we do not have the best bandwidth available, so deploying full apps, even through ClickOnce, can be tricky when combined with the highly fickle nature of my users (I want that box purple, no green, no get rid of it altogether... that type of stuff).
My idea is to try and develop a "framework", of sorts, that will help to streamline the process of moving an Access App to a .Net App.
Now I fully understand that this "framework" may be nothing more than a set of steps and guidelines, like: use an ORM (LINQ to SQL or SubSonic) to generate the DAL, copy the UI to corresponding UserControls, rewrite the business logic.
I am just looking for your experience/expertise to help me streamline my streamlining process... ;)
Those apps which use an Access database to store tables and which need web access should first be upsized to SQL Server. There is a tool for this from the SQL Server group: SQL Server Migration Assistant for Access (SSMA for Access).
Then consider moving to the web only the portion of the app that requires remote access, and leaving the rest of the app in Access. That could save a considerable amount of time.
Alternatively consider going to Terminal Server. That along with a VPN means just some software licensing costs and next to no work on your part.
That said, what do you mean by "multiple users" and "deployability"? Possibly we can give you some suggestions there. Access is multi-user out of the box. However, if you have mission-critical data, can't rekey the data in the event of a corruption, or have more than 25-50 users on the LAN, then you should be moving the data to SQL Server.
Now that it's public: Access 2010 can deploy applications to the web. All kinds of very interesting stuff can be done. For more information check the Microsoft Access product group blog, or my blog with the appropriate Access 2010 tags.
Speaking from experience, I think you would need to upgrade on a case-by-case basis. Upgrading is essentially a re-write from scratch, and you should take the opportunity to re-design as necessary. The type of application structure and code style used for Access (likely to be procedural, I'm guessing) is very different from a well-designed OO .NET app.
You will be able to re-use the SQL Server databases of course and, depending on the apps, maybe even the Access ones. If you're feeling brave you could even try the upsizing wizard, although I wouldn't recommend it, as we found the results less than ideal.
I would also advise you to take a look at some kind of ORM tool (we use SubSonic), as this can massively reduce the amount of boilerplate code you need to write. Some ORM tools will also generate the DDL for your database too.
We follow these standards (we found it a good idea to pick a standard early on and stick to it) and also found this really useful for getting up and running.
Hope this was some help.

Offline synchronization options with .NET

I've been asked to research approaches to deal with an app we're supposed to be building. This app, hypothetically a Windows Forms app written in C#, will issue commands directly to the server if it's connected, but if the app is offline, the state must be maintained as if it were connected, and then sync up and issue data changes/commands to the server once it is connected.
I'm not sure where to start looking. This is something akin to Google Gears, but I don't think I have that option if we go a Winform route (which looks likely, given that there are other functions the application needs that a web app couldn't perform). Is the Microsoft Sync framework a viable option? Does Silverlight do anything like this? Any other options? I've Googled around a bit but would like the community input on what's best given the scenario.
The Microsoft Sync Framework definitely supports the scenario you describe, although I would say that it's fairly complicated to get it working.
One thing to understand about the Sync Framework is that it's really two quite distinct frameworks shipping in the same package:
Sync Framework
ADO.NET Sync services v. 2
The ADO.NET Sync services are by far the easiest to set up, but they are constrained to synchronizing two relational data stores (although you can set up a web service as a remote facade between the two).
The core Sync Framework has no such limitations, but is far more complex to implement. When I used it about six months ago, I found that the best source to learn from was the SDK, and particularly the File/Folder sync sample code.
As far as I could tell, there was little to no sharing of code and types between the two 'frameworks', so you will have to pick one or the other.
In either case, there are no constraints on how you host the sync code, so Windows Forms is just one option among many.
If I understand correctly, this doesn't sound like an actual data-synchronization issue to me, where you want to keep two databases in sync. It sounds more like you want a reliable mechanism for a client to call functions on a server in an environment where the connection is unstable, and if the connection is not present at the time, you want the function called as soon as the connection is back up.
If my understanding is right, here is one option; if not, this will probably not be helpful.
This is a very short answer to an in-depth problem, but we had a similar situation and this is how we handled it.
We have a client application that needs to monitor some data on a PC in a store. When certain events happen, this client application needs to update our server in the corporate offices, preferably Real-Time. However, the connection is not 100% reliable, so we needed a similar mechanism.
We solved this by trying to write to the server via a web service. If there is an error calling the web service, the command is serialized as an XML file in a folder named "waiting to upload".
We have a routine running in our client app on a timer set for every n minutes. When the timer elapses, it checks for XML files in this folder. If found, it attempts to call the web service using the information saved in the file, and so on until it is successful. Upon a successful call, the XML file is deleted.
It sounds hack-ish, but it was simple to code and has worked flawlessly for five years now. It's actually been our most trouble-free application all-around, and we've implemented the pattern elsewhere successfully.
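For illustration, here is a minimal sketch of that store-and-forward pattern using only the BCL; the Command shape and the injected web-service call are placeholders, not the actual production code described above:

    using System;
    using System.IO;
    using System.Timers;
    using System.Xml.Serialization;

    // The command to send to the server; fields are illustrative.
    public class Command
    {
        public string Name { get; set; }
        public string Payload { get; set; }
    }

    public class StoreAndForwardClient
    {
        private readonly Action<Command> _callWebService; // the real web-service call
        private readonly string _pendingDir = "waiting-to-upload";
        private readonly XmlSerializer _serializer = new XmlSerializer(typeof(Command));
        private readonly Timer _retryTimer;

        public StoreAndForwardClient(Action<Command> callWebService, double retryIntervalMs)
        {
            _callWebService = callWebService;
            Directory.CreateDirectory(_pendingDir);

            // Every n minutes, retry anything that failed to upload earlier.
            _retryTimer = new Timer(retryIntervalMs);
            _retryTimer.Elapsed += (s, e) => RetryPending();
            _retryTimer.Start();
        }

        public void Send(Command command)
        {
            try
            {
                _callWebService(command);
            }
            catch (Exception)
            {
                // Connection failed: persist the command for a later retry.
                var path = Path.Combine(_pendingDir, Guid.NewGuid() + ".xml");
                using (var stream = File.Create(path))
                    _serializer.Serialize(stream, command);
            }
        }

        private void RetryPending()
        {
            foreach (var file in Directory.GetFiles(_pendingDir, "*.xml"))
            {
                try
                {
                    Command command;
                    using (var stream = File.OpenRead(file))
                        command = (Command)_serializer.Deserialize(stream);

                    _callWebService(command);
                    File.Delete(file); // delete only after a successful call
                }
                catch (Exception)
                {
                    // Still offline; leave the file for the next timer tick.
                }
            }
        }
    }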
