I have a small production instance on Azure for my web app. To quickly make changes to the live site, I open the instance via RDC and copy across any dlls and files that need to be added. That seemed to be working fine.
However, last night, the instance seemed to have been reset (I'm still investigating why) and the version of the site was rolled back to a month or so ago.
I have read on Stack Overflow that any changes made via RDC are not saved when Microsoft resets an instance, and that it rolls back to the previous Publish.
Surely there has to be a quicker way to make changes to a production instance than having to Publish the app each time? Each Publish takes approximately 45 minutes. If I'm making multiple deployments per day then is there a better solution?
No, all changes must be published.
Microsoft guarantees that there will be a working instance, but not that it will be the same instance.
We looked for other solutions but now just publish changes.
But we are probably lucky in that a publish only takes us about 10 minutes.
There are a few things you can look at:
The size of your *.cspkg file. Try to shrink the *.cspkg file to reduce the upload time. You could, for example, store static files like images and videos in blob storage.
Use a synchronization mechanism that synchronizes all files from a blob container to your IIS website. If you use this, you can simply copy the files to blob storage in order to update your instance. Note that there are things you need to consider, like what happens to your startup tasks, what about rolling upgrades, etc. Steve wrote a great blog post about this: Update Your Windows Azure Website in Just Seconds by Syncing with Blob Storage (this uses a WorkerRole instead of a WebRole). A rough sketch of the sync idea follows this list.
Go for Windows Azure Web Sites, which allows you to deploy immediately using FTP, Git, etc.
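For what it's worth, here is a rough sketch of the blob-to-folder sync idea from the second option, written against the current Azure.Storage.Blobs package (the original answer predates it); the connection string, container name and web root path are placeholders. A real implementation would also skip unchanged blobs and handle deletions, as Steve's post describes.

    using System.IO;
    using Azure.Storage.Blobs;
    using Azure.Storage.Blobs.Models;

    // Placeholder names: adjust the connection string, container and web root to your setup.
    string connectionString = "<storage account connection string>";
    var container = new BlobContainerClient(connectionString, "site-content");

    foreach (BlobItem blob in container.GetBlobs())
    {
        string localPath = Path.Combine(@"C:\inetpub\wwwroot", blob.Name);
        Directory.CreateDirectory(Path.GetDirectoryName(localPath));   // create nested folders as needed
        container.GetBlobClient(blob.Name).DownloadTo(localPath);      // pull the file down to the IIS site
    }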
I have an existing Umbraco-project running on an IIS-server. When I started the project I basically installed Umbraco directly to the server and coded through the Admin-interface until the site was launched and went live.
Now, the customer wants some changes and it feels like I've painted myself into a corner here, since I obviously can't make changes in the code while the site is running. So, my question, which I hope this excellent community can help me out with:
How do I proceed when I get a copy of the project locally, to develop in Visual Studio and then publish it back to the live server? Should I create a Git repo, or is Web Deploy an option? Which workflow is the most convenient when developing and maintaining an Umbraco site?
I am also quite unsure about what configuration is needed on the IIS server to support deployment from my local machine, so some input about that would be much appreciated!
Don't tell anyone, but for file updates only (like stylesheets or templates), I sometimes just FTP them up or publish to file system and copy/paste the files via RDP. If you make changes to document types etc., consider if you can't just test them locally and recreate the changes in the live environment afterwards.
Of course, this is only viable for small changes/solutions and one-man teams, but I feel that it's perfectly okay since the alternative(s) get very complex very quickly.
Edit: and also it requires you to keep your local copy pretty close to the live one, so you can be relatively sure stuff doesn't break when you "publish".
You'll have to get a copy of the files from the server, but also use a local copy of the database and update the connection string in web.config to point to it; otherwise, any backoffice changes you make will be made on the live server.
If you are just making changes to views, then some kind of publish from Visual Studio will probably work, but if you are making new document types, macros, etc., this also involves data changes in the database, which won't be published from Visual Studio at the same time.
There are various tools out there which cater for this situation of making development changes and moving them to live.
The obvious one is to repeat all the changes you made in dev on the live server, although this can become tedious if there are lots of changes.
There are also things like Umbraco Courier and another one which I can't remember the name of for the life of me that copy all the relevant things from dev to live in one go.
I have a system with two web applications, one web service, one Windows service and a WPF application running 24 hours a day on a touch screen. All of them are connected to a database.
I want to be able to upgrade all of those applications by uploading upgrade files to the database and set the date and time for the upgrade to occur.
I have one idea on how to do this.
An application has a thread running to look for available upgrades.
When an upgrade is found, the file is downloaded to the application's computer.
When the download is complete, the application triggers a restart.
When application starts, it looks for an upgrade file on the local computer.
If upgrade is available, the application upgrades itself.
I'm not really sure how all these steps should be done yet, especially the last one. But I want some comments about this. Is this completely wrong? Am I on the right track? Any tips on how to do it like this or in another way?
I think you're along the right lines here. A polling application that checks the database for the existence of a new update, followed by an xcopy deployment script, would do it.
This might be doable from a PowerShell script too, one that runs on a schedule, say every 10 minutes. It could check the database, close the process and service, xcopy the application (from a shared source) and restart the service and app.
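If you would rather keep the check inside the application itself (as the question suggests) than in a scheduled script, the database poll might look something like this minimal C# sketch; the Updates table, its columns and the connection string are assumptions, not something from the question.

    using System;
    using System.Data.SqlClient;
    using System.Reflection;

    string connectionString = "<your connection string>";             // assumption
    using (var connection = new SqlConnection(connectionString))
    using (var command = new SqlCommand(
        "SELECT MAX(Version) FROM Updates WHERE ReleaseDate <= GETDATE()", connection))
    {
        connection.Open();
        object result = command.ExecuteScalar();                      // e.g. "2.1.0" stored as text
        if (result != DBNull.Value &&
            new Version((string)result) > Assembly.GetExecutingAssembly().GetName().Version)
        {
            // Download the upgrade files and trigger the restart/upgrade steps described above.
        }
    }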
All this assumes that you are not using Windows Installer to package and deploy your application initially. Although an xcopy to directly replace binaries wouldn't hurt an MSI package, it's not recommended. We use AD MSI deployment at work and it's a pain at the best of times!
MSDN contains references for MSI vs XCopy deployment for WPF applications (as well as the security requirements).
This was the first link I found for querying SQL from PowerShell: http://elegantcode.com/2008/03/27/discovering-windows-powershell/
Good luck!
You will have trouble doing this with ClickOnce. ClickOnce would only work for your WPF app, it can't do anything with the services or web application. You could write a separate ClickOnce-deployed "Updater" app whose job is to update the other apps, but that still seems a little iffy.
It may sound stupid, but I'd start with the simplest thing I could think of. How about using Dropbox to push your update files; then an AutoHotKey script that runs on startup, watches the Dropbox folder for new updates, and runs them?
Sounds hokey, but it's something you could prove out in an hour or two.
Microsoft have an Updater Application Block which might be what you are looking for.
Do you really want to run an update from the database or is this just a possible solution? You are reinventing the wheel.
Have a look at ClickOnce deployment; everything you need is already done for you and integrated into Visual Studio. If you use something that already exists, you get the benefit of existing documentation, helpful blogs from people who have already gone through the pain points, and ongoing updates and fixes.
ClickOnce Deployment
ClickOnce Deployment in .NET Framework 2.0
How you want to use ClickOnce depends on what you want to get out of it. Out of the box you can very easily create a deployment that checks for an upgrade every time you run the application, but with a little bit of code you can also have the application check for updates while it is running (see the sketch below).
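For example (a minimal sketch; this only works when the app was actually launched through ClickOnce):

    using System.Deployment.Application;   // reference System.Deployment

    if (ApplicationDeployment.IsNetworkDeployed)
    {
        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        if (deployment.CheckForUpdate())
        {
            deployment.Update();                         // download and apply the new version
            System.Windows.Forms.Application.Restart();  // restart so the new version runs
        }
    }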
The Updater Application Block that Dominic Zukiewicz mentioned is the precursor to ClickOnce.
EDIT
ClickOnce provides a roll-back scenario on both the server and client end. The client can roll back to a previous version using the normal Add/Remove Programs dialog, and you can easily republish a previous version.
You could create another Windows Service that performs the updates on a daily basis. The service would look in a specific folder for any updates to process. For example, it could look for an XML file which tells it the new version of the application and which files to update. It would shut down the application/services, back up the files that it needs to update, copy in the new files, restart the application/services, and clean up backup files, keeping at least the three most recent backups. The service should keep track of the last and current installed versions so that when it reads the XML file it can check whether it is a new update or not, or you can simply delete the XML file when it completes.
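A minimal sketch of that manifest check, assuming a hypothetical update.xml format; the folder paths and version-tracking file are assumptions, and stopping/restarting the target applications and services is left as a comment:

    using System;
    using System.IO;
    using System.Linq;
    using System.Xml.Linq;

    // Hypothetical manifest: <update version="1.2.0"><file>MyApp.exe</file><file>MyApp.Core.dll</file></update>
    string updateFolder = @"C:\MyApp\Updates";
    string appFolder    = @"C:\MyApp\Bin";
    string backupFolder = @"C:\MyApp\Backups\" + DateTime.Now.ToString("yyyyMMdd-HHmmss");
    string versionFile  = @"C:\MyApp\installed.version";            // tracked by the service

    XDocument manifest = XDocument.Load(Path.Combine(updateFolder, "update.xml"));
    var newVersion = new Version((string)manifest.Root.Attribute("version"));
    var currentVersion = new Version(File.ReadAllText(versionFile));

    if (newVersion > currentVersion)
    {
        // ...stop the applications/services here (e.g. via ServiceController)...
        Directory.CreateDirectory(backupFolder);
        foreach (string file in manifest.Root.Elements("file").Select(e => (string)e))
        {
            string target = Path.Combine(appFolder, file);
            File.Copy(target, Path.Combine(backupFolder, file), overwrite: true);   // back up the old file
            File.Copy(Path.Combine(updateFolder, file), target, overwrite: true);   // apply the new one
        }
        File.WriteAllText(versionFile, newVersion.ToString());
        // ...restart the applications/services and prune old backup folders here...
    }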
How about Google Omaha? It's an open source tool, currently used to push updates of Google Chrome and Google Earth. Omaha can handle application installation, too. A high-level design overview can be found here.
I need to create a patching routine for my application.
It's really small, but I need to update it daily or weekly.
How do xdelta and the other tools work?
I've read around about them but didn't understand much of it.
The user shouldn't be prompted at all.
OK, this post got flagged on meta for the answers given, so I'm going to weigh in on this.
xdelta is a binary difference program that, rather than providing you with a full image, only gives you what has changed and where. An example of a text diff will have + and - signs before lines of text showing you that these have been added or removed in the new version.
There are two ways to update a binary image: replace it using your own program, or replace it using some form of package management. For example, Linux systems use rpm etc. to push out updates to packages. In a Windows environment your options are limited by what is installed if you're not on a corporate network. If you are, try WSUS and MSI packaging. That'll give you an easier life, or ClickOnce as someone has mentioned.
If you're not however, you will need to bear in mind the following:
You need to be an administrator to update anything in certain folders as others have said. I would strongly encourage you to accept this behaviour.
If the user is an administrator, you can offer to check for updates. Then, you can do one of two things. You can download a whole new version of your application and write it over the image on the hard disk (i.e. the file - remember images are loaded into memory so you can re-write your own program file). You then need to tell the user the update has succeeded and reload the program as the new image will be different.
Or, you can apply a diff if bandwidth is a concern. Probably not in your case but you will need to know from the client program the two versions to diff between so that the update server gives you the correct patch. Otherwise, the diff might not succeed.
I don't think for your purposes xdelta is going to give you much gain anyway. Just replace the entire image.
Edit: if the user must not be prompted at all, just reload the app. However, I would strongly encourage informing the user that you are talking on their network and asking permission to do so / enabling a manual update mode; otherwise people like me will block it.
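A minimal sketch of the "replace the entire image" approach: on Windows a running exe can be renamed but not overwritten in place, so the usual trick is to move it aside, write the new one, and relaunch. The download URL is a placeholder.

    using System;
    using System.Diagnostics;
    using System.IO;
    using System.Net;

    string exePath = Process.GetCurrentProcess().MainModule.FileName;
    string oldPath = exePath + ".old";

    if (File.Exists(oldPath))
        File.Delete(oldPath);               // clean up the previous update's leftover
    File.Move(exePath, oldPath);            // renaming a running exe is allowed; overwriting it is not

    using (var client = new WebClient())
        client.DownloadFile("https://example.com/myapp/latest.exe", exePath);  // placeholder URL

    Process.Start(exePath);                 // launch the new image
    Environment.Exit(0);                    // and shut the old one down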
What kind of application is this? Perhaps you could use ClickOnce to deploy your application. ClickOnce very easily allows you to push updates to your users.
The short story is, ClickOnce creates an installation that allows your users to install the application from a web server or a file share. You enable automatic updates, and whenever you place a new version of the app on the server the app will automatically (or after asking the user whether to) update itself. The ClickOnce framework takes care of the rest: fetching the update, figuring out which files have changed and need to be downloaded again, and performing the update. You can also check for/perform the update programmatically.
That said, clickonce leaves you with little control over the actual installation procedure, and you have nowhere close to the freedom of building your own .msi.
I wouldn't go with a patching solution, since it really complicates things when you have a lot of revisions. How will the patching solution handle different versions asking to be updated? What if user A is 10 revisions behind the current revision? Or 100 revisions, etc? It would probably be best to just download the latest exe(s) and dll(s) and replace them.
That said, I think this SO question on silent updates might help you.
There is a solution for efficient patching - it works on all platforms and can run in completely silent mode, without the user noticing anything. On .NET, it provides seamless integration of the update process using a custom UserControl declaratively bound to events from your own UI.
It's called wyUpdate.
While the updating client (wyUpdate) is open source, a paid-for wyBuild tool is used to build and publish the patches.
Depending on the size of your application, you'd probably have it split up into several dll's, an exe, and other files.
What you could do is have the main program check for updates. If updates are available, the main program would close and the update program would take over - updating old files, creating new ones, and deleting current files as specified by the instructions sent along with a patch file (probably a compressed format such as .zip) downloaded by the updater.
If your application is small (say, a single exe) it would suffice to simply have the updater replace that one exe.
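A minimal sketch of the updater's "apply the patch" step, assuming the patch is a plain .zip of replacement files and that the main application has already exited; the paths are placeholders.

    using System.Diagnostics;
    using System.IO;
    using System.IO.Compression;

    string appDir = @"C:\Program Files\MyApp";                      // placeholder install folder
    using (ZipArchive patch = ZipFile.OpenRead(@"C:\Temp\patch.zip"))
    {
        foreach (ZipArchiveEntry entry in patch.Entries)
        {
            if (string.IsNullOrEmpty(entry.Name))                   // skip directory entries
                continue;
            string destination = Path.Combine(appDir, entry.FullName);
            Directory.CreateDirectory(Path.GetDirectoryName(destination));
            entry.ExtractToFile(destination, overwrite: true);      // replace the old file
        }
    }
    Process.Start(Path.Combine(appDir, "MyApp.exe"));               // relaunch the updated app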
Edit:
Another way to do this would be (upon compilation of the new exe) to compare the new one to the old one and just send the differences over to the updater. It would then make the appropriate adjustments.
You can make your function reside in a separate DLL. So you can just replace the DLL instead of patching the whole program. (Assuming Windows as the target platform for a C# program.)
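A minimal sketch of that idea, with hypothetical type and file names; note that a DLL is locked while it is loaded, so it can only be swapped while the application (or at least the AppDomain that loaded it) is not running.

    using System;
    using System.IO;
    using System.Reflection;

    string dllPath = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "MyApp.Logic.dll");  // hypothetical file
    Assembly logic = Assembly.LoadFrom(dllPath);

    Type calculatorType = logic.GetType("MyApp.Logic.Calculator");                 // hypothetical type
    object calculator = Activator.CreateInstance(calculatorType);
    object result = calculatorType.GetMethod("Calculate")
                                  .Invoke(calculator, new object[] { 42 });        // hypothetical method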
Recently I was working on displaying workflow diagram images in our web application. I managed to use the rehosted WF designer and create images on-the-fly on the server, but imagining how large the workflow diagrams can very quickly become, I wanted to give a better user experience by using some ajax control for displaying images that would support zoom & pan functionality.
I happened to come across the Seadragon website, which seems to be just an amazing piece of work that I could use. There is just one disadvantage: in order to use their library for generating Deep Zoom versions of images, I have to use the file structure on a server. Because of the temporary nature of the images I am using (workflow diagrams with progress indicators), it is important not only to be able to create such images but also to get rid of them after some time.
Now the question is how I can best ensure that the temporary image files and the folder hierarchy can be created on a server (ASP.NET web app) and later cleaned up. I was thinking of using the cache functionality and, on the expiration of the cache item, deleting the corresponding image folder hierarchy, or simply deleting the content of the whole temporary folder in the Application_Start and Application_End of Global.asax, but I'm not really sure whether this is a good idea and whether there are some security restrictions or file-system-related troubles. What do you think?
We do something similar for creating PDF reports and found the easiest way is to use a timestamp check to determine how "old" files are, and then delete them based on a period of time, in our case more than 2 hours old. This is done before the next PDF document is created, but as part of the creation process. We also created a specific folder and gave the ASP.NET user read/write access to it.
The only disadvantage is that if the process of creating PDFs is not used regularly there will be a build-up of files; however, they will be cleaned up eventually. In 2 years and close to 4,000 PDFs we have yet to have an error doing it this way.
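A minimal sketch of that cleanup step, run just before the next document is generated; the folder name is a placeholder and the 2-hour window is the one mentioned above.

    using System;
    using System.IO;
    using System.Web;

    string tempFolder = HttpContext.Current.Server.MapPath("~/App_Data/TempReports");   // placeholder folder
    foreach (string file in Directory.GetFiles(tempFolder))
    {
        if (DateTime.UtcNow - File.GetLastWriteTimeUtc(file) > TimeSpan.FromHours(2))
            File.Delete(file);                                   // older than 2 hours: remove it
    }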
Use the App_Data folder. This folder is inside your application and writable by your app without having to go outside the context of the app, but it's also secured from casual browsing. It's meant to hold data files for your application.
Application_Start and Application_End will only fire once each, so if you need better cleanup than that, I would consider using a cache structure or a simple windows service to handle the cleanup.
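If you do go with the cache idea from the question, a minimal sketch would be to insert a marker item per image folder and delete the folder in the removal callback (the path and expiration are placeholders). Bear in mind the callback only fires while the app domain is alive, so a fallback sweep is still a good idea.

    using System;
    using System.IO;
    using System.Web;
    using System.Web.Caching;

    string folderPath = @"C:\inetpub\myapp\App_Data\DeepZoom\workflow-42";   // placeholder folder
    HttpRuntime.Cache.Insert(
        "deepzoom:" + folderPath,                   // cache key
        folderPath,                                 // value: the folder to clean up
        null,                                       // no dependency
        DateTime.UtcNow.AddHours(1),                // absolute expiration
        Cache.NoSlidingExpiration,
        CacheItemPriority.NotRemovable,
        (key, value, reason) => Directory.Delete((string)value, recursive: true));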
First, you have to make sure your IIS worker process has rights to write/delete files from your cache directory (and NOT the rest of your site, just in case)
Second, I would stay away from using Application_Start and Application_End; Application_End is not 100% guaranteed to fire to clean up files, and you could end up with a growing pile of orphaned images.
I would instead make a scheduled process, one that runs maybe once per hour or once a day, depending on what you want, and have it check how old each image in your cache is; if it's older than your arbitrary "expire time", delete it.
Other than that there's not much to it.
What would be the most secure and safe way to allow software to auto-update without opening too many holes that would give a hacker easy access to a system?
Have you looked into ClickOnce Deployment?
http://msdn.microsoft.com/en-us/library/t71a733d(VS.80).aspx
The short overview is here:
http://msdn.microsoft.com/en-us/library/142dbbz4(VS.80).aspx
I recommend not building your own auto-update, use ClickOnce if it works for you or a commercial auto-update component if not.
If you want to see what is involved I wrote a series about writing an auto-update component on my blog some time ago, the last post with links to all the posts in the series is at: http://www.nbdtech.com/blog/archive/2007/08/07/How-To-Write-an-Automatic-Update-System-Part-8.aspx
If you are going to make your own system then you will probably want to have a public/private key pair.
So, you would zip up the update.
Then sign it (encrypt a hash of it with the private key) on the server.
The client can then verify the signature with the public key, unzip it, and install it.
That way, as long as your private key is secure, you can ensure that the update is legit.
The only weakness here is that if someone changed the public key stored on the client to some other key, they could fool the program into thinking that a trojan is a valid update.
There are various schemes you can use to get around this, but that would depend on how much work you want to put into this.
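For what it's worth, a minimal sketch of that scheme using RSA signatures; ImportFromPem requires .NET 5 or later, and the file names and key handling are placeholders.

    using System;
    using System.IO;
    using System.Security.Cryptography;

    // On the build server: sign the zipped update with the private key.
    byte[] package = File.ReadAllBytes("update.zip");
    using (RSA rsa = RSA.Create())
    {
        rsa.ImportFromPem(File.ReadAllText("private.pem"));         // keep this key off the clients
        byte[] signature = rsa.SignData(package, HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
        File.WriteAllBytes("update.zip.sig", signature);
    }

    // On the client: verify with the public key shipped inside the application before installing.
    string publicKeyPem = File.ReadAllText("public.pem");           // in practice, embed this in the app
    using (RSA rsa = RSA.Create())
    {
        rsa.ImportFromPem(publicKeyPem);
        bool valid = rsa.VerifyData(package, File.ReadAllBytes("update.zip.sig"),
                                    HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
        if (!valid)
            throw new InvalidOperationException("Update package failed the signature check.");
    }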
ClickOnce auto-update is all fair and well, but anyone can admit that it is not the most fashionable solution. I've recently developed a solution that requires such an auto-update feature. Here is a list of brief steps I took to deploy my very own updating service, which also allows for roll-backs with 'minimal' know-how.
Add a Setup project to the solution so that the project could be wrapped up neatly in a .exe or .msi installer package.
Next, set up an FTP server with user credentials that only your application knows. On the FTP server, set up a default directory where you will put any new updates.
Your application will check for an internet connection on start-up, log into your remote FTP server and check for any new files to download (a rough sketch of this step appears at the end of this answer).
Download new updates to your client application and put them in a date-time named folder for future reference. Some checks need to be in place to make sure that you don't download the same old files.
Close the application and run the new installation. Depending on how you set up your Setup project, the installation wizard may remove the previous version completely or just apply a partial update (patches, etc.).
Your application may have a feature to roll-back to previous version by going into the local update directory and fish out the previously downloaded files. This is where the date-time stamped files come in handy for reference.
This solution offers a level of customization that I think most enterprise solutions will need, and I found that it works very effectively for me. FTP servers are secure and reliable as far as file downloads are concerned. You can find a lot of FTP download helper libraries on the internet, so it's a matter of making it work the way you want and not worrying too much about how it works.
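A rough sketch of the "check for new files" and "download to a date-time named folder" steps, using the built-in FtpWebRequest; the server address, credentials and local folder are placeholders.

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Net;

    var credentials = new NetworkCredential("updateuser", "secret");          // placeholder credentials

    // List the update directory on the FTP server.
    var listRequest = (FtpWebRequest)WebRequest.Create("ftp://updates.example.com/myapp/");
    listRequest.Method = WebRequestMethods.Ftp.ListDirectory;
    listRequest.Credentials = credentials;

    var remoteFiles = new List<string>();
    using (var response = (FtpWebResponse)listRequest.GetResponse())
    using (var reader = new StreamReader(response.GetResponseStream()))
    {
        string line;
        while ((line = reader.ReadLine()) != null)
            remoteFiles.Add(line);
    }

    // Download each file into a date-time named folder for future roll-backs.
    string localFolder = Path.Combine(@"C:\MyApp\Updates", DateTime.Now.ToString("yyyyMMdd-HHmmss"));
    Directory.CreateDirectory(localFolder);
    foreach (string file in remoteFiles)
    {
        using (var client = new WebClient { Credentials = credentials })
            client.DownloadFile("ftp://updates.example.com/myapp/" + file, Path.Combine(localFolder, file));
    }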