Azure instance change code roll back - c#

I am new to Azure. I have a small Cloud Services instance. In the last week my instance has been reset twice and all my project data was lost; it rolled back to a state from a month ago. All my client data is gone. Is there any way to recover that data, and why does this happen?

There is no way to recover your data and there's no way to prevent this from happening. This is by design.
Whenever your machine crashes or there's an update to the system, it is completely wiped: a new system image is copied, the machine boots again, and your application is copied over. Azure cloud services are Platform-as-a-Service (PaaS).
This leaves you with two possible options. The first is to not store persistent data on the cloud service instance in the first place; local storage is not the proper place for it with Azure Cloud Services anyway. Instead, store your data in Azure Storage or an Azure SQL database (or wherever you like).
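For illustration, a minimal sketch of writing client data to blob storage instead of the local disk, assuming the classic Azure Storage client library (Microsoft.WindowsAzure.Storage); the connection string and the container/blob names here are placeholders:
```csharp
using System.IO;
using System.Text;
using Microsoft.WindowsAzure.Storage;        // classic Azure Storage client library
using Microsoft.WindowsAzure.Storage.Blob;

class ClientDataStore
{
    // Placeholder connection string; in practice read it from your service configuration.
    private const string ConnectionString =
        "DefaultEndpointsProtocol=https;AccountName=<account>;AccountKey=<key>";

    public void SaveClientRecord(string clientId, string json)
    {
        CloudStorageAccount account = CloudStorageAccount.Parse(ConnectionString);
        CloudBlobClient blobClient = account.CreateCloudBlobClient();

        // "client-data" is a hypothetical container name.
        CloudBlobContainer container = blobClient.GetContainerReference("client-data");
        container.CreateIfNotExists();

        CloudBlockBlob blob = container.GetBlockBlobReference(clientId + ".json");
        using (var stream = new MemoryStream(Encoding.UTF8.GetBytes(json)))
        {
            // The data now survives re-imaging of the instance.
            blob.UploadFromStream(stream);
        }
    }
}
```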
Another option would be to use a virtual machine instead of a cloud service. That machine is completely in your hands. It's your duty to update it, keep it secure and do whatever it takes to keep it running. With this approach you also have to take care of a load balancer, multiple instances, etc. yourself, so scaling out becomes much harder. This is Infrastructure-as-a-Service (IaaS).
So it actually depends on what you want to do.

Cloud instances are stateless. This means that anything you've stored on the local storage of the virtual machines can and will be deleted in the event of a node failure, a system upgrade, or even a new deployment of a package that you upload.
A couple of things you can do:
If you need to add additional files or configuration to your project upon deployment, make use of OnStart() to perform it (see the sketch after this list). This ensures that on each deployment or failure recovery you get back the same environment you always had.
To avoid losing your source code I recommend you set up source control and integrate it with your cloud instance implementation. You can do this either with Git or with Team Foundation Service (check out tfspreview.com).
If you need to store files on the server, such as assets or client-updated media, consider using Azure Blob Storage. Blob storage is replicated locally within the datacenter and can also be geo-replicated to other datacenters if you choose to do so.
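As a rough illustration of the OnStart() point above, a sketch assuming a standard web role project; the local resource name and the file written here are hypothetical:
```csharp
using System.IO;
using Microsoft.WindowsAzure.ServiceRuntime;

public class WebRole : RoleEntryPoint
{
    public override bool OnStart()
    {
        // Re-create any environment the role needs on every (re)deployment,
        // because local changes do not survive re-imaging.
        // "extra-config" is a hypothetical local resource declared in the service definition.
        LocalResource scratch = RoleEnvironment.GetLocalResource("extra-config");
        File.WriteAllText(Path.Combine(scratch.RootPath, "settings.txt"), "recreated on start");

        return base.OnStart();
    }
}
```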
Hope that helps

Related

What is the best way to deploy (install) this application?

I have an application that has two main parts. First, the client, which is basically the user interface; second, a repository, which is a library that connects to the database and has all the logic to insert, update, delete... and ensures the coherence of the data.
The application is not deployed yet, and for the moment the client uses the repository directly to access the database. But when I have to deploy the application to be used by many users inside the LAN, I think this is not the best solution.
First solution
Install the client and the repository on all the computers of the users that need the application.
This has the disadvantage that when I update the application, I have to update many installations, and perhaps not all of them get updated for one reason or another. So if an update to the repository fixes a data-coherence problem, a client that has not been updated could keep introducing incoherent data into the database, the very problem the fix was meant to correct.
Second solution
The client still uses the repository directly, but the application is installed on a network drive. There is only one installation, so if I need to update the application, I only have to do it once.
The application is not that big, about 12 MB, but it could be a bit slow because it has to travel over the network from the server to the user's computer. Some users might therefore copy the application to their local computer, so I can't rule out the same problem as with the first solution.
Third solution
The client application does not use the repository directly; the repository lives on the server, the client uses WCF to communicate with the server, and the server uses the repository to access the database.
The disadvantage is that the server has to run the repository, so if there are many clients connected it needs a lot of RAM, whereas if the users' computers run the application locally, the memory is needed on the local computers instead.
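To make this third option more concrete, a minimal sketch of what the WCF boundary could look like; the contract name, operation, and repository call are hypothetical:
```csharp
using System.ServiceModel;

// The service contract that the clients call over the LAN.
[ServiceContract]
public interface IRepositoryService
{
    [OperationContract]
    void UpdateCustomer(int customerId, string name);
}

// Server-side implementation that wraps the existing repository library,
// so coherence rules are enforced in one place.
public class RepositoryService : IRepositoryService
{
    public void UpdateCustomer(int customerId, string name)
    {
        // Hypothetical call into the repository library:
        // new Repository().UpdateCustomer(customerId, name);
    }
}
```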
In summary, when I have to deploy this kind of application, which is the best solution, or which solution would you use in your projects?
Thank you so much.
This really depends on your deployment method. Are you using ClickOnce to deploy it? If so, you could keep the data local to each PC and avoid the RAM issue, and when you send out a new update, change the required version number and set the application to check for updates prior to running; that way users will be unable to run the program without updating it. The catch is that they must have network access, but that would also be an issue with remote data. With this approach you only need network access during the update; I'm not sure whether that would be a problem for you.
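If you go the ClickOnce route, here is a minimal sketch of checking for a required update at startup using the System.Deployment API; whether you force the update here or via the publisher's minimum-required-version setting is up to you:
```csharp
using System.Deployment.Application;
using System.Windows.Forms;   // assumes a WinForms client; WPF would restart differently

static class UpdateCheck
{
    public static void EnsureLatestVersion()
    {
        if (!ApplicationDeployment.IsNetworkDeployed)
            return;   // running outside ClickOnce (e.g. from the debugger)

        ApplicationDeployment deployment = ApplicationDeployment.CurrentDeployment;
        UpdateCheckInfo info = deployment.CheckForDetailedUpdate();

        if (info.UpdateAvailable)
        {
            deployment.Update();      // pull the new version from the deployment location
            Application.Restart();    // restart so the new version is the one running
        }
    }
}
```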

Handling multiple deployments of an ASP.NET application

I have a product, and a front end website where people can purchase the product. Upon purchase, I have a system that creates an A record in my DNS server that points to an IP address. It then creates a new IIS website with the bindings required.
All this works well, but I'm now looking at growing the business and to do this I'll need to handle upgrades of the application.
Currently, my application runs as 40 websites. They all share the same code base, and each website uses its own SQL Server database. Each website runs in a separate application pool and operates completely independently.
I've looked at using TeamCity to build the application and then have a manual step that runs MSDeploy for each website but this isn't particularly ideal since I'd need to a) purchase a full license and b) always remember to add a new website to the TeamCity build.
How do you handle the upgrade and deployments of the same code base running many different websites and separate SQL Server databases?
First thing, it is possible to have a build configuration in TeamCity that builds and deploys to a specific location...whether a local path or a network drive. I don't remember exactly how but one of the companies I worked with in Perth had exactly the same environment. This assumes that all websites are pointing to the same physical path in the file system.
Now, a word of advice: I don't know how you have it all set up, but if this A record is simply creating a subdomain, I'd shift to a real multi-tenant environment. That is, one single website and one single app pool for all clients, with multiple bindings, each associated with a specific subdomain. This approach is far more scalable and uses far less memory; I've done some benchmark profiling in the past, and the amount of memory each process (app pool) consumed was a massive waste of resources. There's a catch, though: you will need to prepare your app for a multi-tenant architecture to avoid any sort of bleeding between clients, such as
Avoiding any per-client singleton component
Avoiding static variables
The cache cannot be global and MUST have a client context associated with it
Pay special attention to how you save client files to the file system
Among other things. If you need more details about setting up TeamCity in your current environment, let me know; I can probably dig up some useful info.
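A rough sketch of what "no bleeding between tenants" can look like in ASP.NET code; the tenant lookup and cache-key scheme are assumptions for illustration, not a prescribed API:
```csharp
using System.Web;

public static class TenantContext
{
    // Resolve the tenant from the subdomain of the current request,
    // e.g. "client1.example.com" -> "client1".
    public static string CurrentTenant
    {
        get
        {
            string host = HttpContext.Current.Request.Url.Host;
            return host.Split('.')[0];
        }
    }

    // Never use a global cache key; always prefix it with the tenant
    // so one client's data can't leak into another's.
    public static void CachePut(string key, object value)
    {
        HttpContext.Current.Cache.Insert(CurrentTenant + ":" + key, value);
    }

    public static object CacheGet(string key)
    {
        return HttpContext.Current.Cache[CurrentTenant + ":" + key];
    }
}
```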

How to deploy the same application to multiple servers

I'm looking for examples of what people have done in order to deploy the same web app or processes to multiple servers.
The deployment process right now consists of copying the same file multiple times to different servers within our company. There has to be a better way to do this. Right now I am looking into MSBuild; does anyone have other ideas? Thanks in advance.
Take a look at msdeploy and Web Deploy.
I've done this using a variety of methods. However, I think the best one is what I call a "rolling" deployment.
The following assumes a code only deployment:
Take one or more web servers "offline" by removing them from the load balancing list; let's call this group A. You should keep enough servers running to keep up with existing traffic; we'll call those group B. Push the code to the offline servers (group A).
Then, put group A back into rotation and pull group B out. Make sure the app is still functional with the new code. If all is good, update group B and put them back in rotation. In the event of a problem, just put group B back in and take A out again.
In the case of a database update there are other factors to consider. If you can take the whole site down for a limited period then do so and perform all necessary updates. This is by far the easiest.
However, if you can't then do a modified "rolling" deployment which requires multiple database servers. Pick a point in time and move a copy of the production database to the second one. Apply your changes. Then pull a group of web servers out, update their code to production and test. If all is good, put those web servers back into rotation and take out group B. Update the code on B while pointing them to the second DB server. Put them back into rotation.
Finally, apply all data changes that occurred on the primary production database to the secondary one.
Note, I don't use Web Deploy or MS Deploy for pushes to production. Quite frankly I want the files ready to be copy/pasted into the correct directory on the server so that the push can run as quickly as possible. Both Web and MS Deploy options involve transferring those files over a network connection; which is typically much slower than simply copy/pasting from one local directory to another.
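As a very rough sketch of the rolling approach described above; the server groups and the load-balancer calls are hypothetical placeholders (in practice this might be a script rather than a C# program):
```csharp
using System;
using System.IO;

class RollingDeploy
{
    static void Main()
    {
        // Hypothetical server shares; keep group B serving traffic while A is updated.
        string[] groupA = { @"\\web01\site", @"\\web02\site" };
        string[] groupB = { @"\\web03\site", @"\\web04\site" };
        string buildOutput = @"C:\builds\latest";

        Deploy(groupA, buildOutput);   // update A while B handles traffic
        // ...verify group A, swap which group is in the load balancer, then:
        Deploy(groupB, buildOutput);
    }

    static void Deploy(string[] servers, string source)
    {
        foreach (string target in servers)
        {
            // RemoveFromLoadBalancer / AddToLoadBalancer stand in for whatever
            // API or script your load balancer actually exposes.
            // RemoveFromLoadBalancer(target);
            CopyDirectory(source, target);
            // AddToLoadBalancer(target);
            Console.WriteLine("Deployed to " + target);
        }
    }

    static void CopyDirectory(string source, string target)
    {
        foreach (string file in Directory.GetFiles(source, "*", SearchOption.AllDirectories))
        {
            string dest = Path.Combine(target, file.Substring(source.Length + 1));
            Directory.CreateDirectory(Path.GetDirectoryName(dest));
            File.Copy(file, dest, overwrite: true);
        }
    }
}
```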
You can build a simple console app that connects to a fixed SFTP location, downloads and uncompresses the package, and runs all the files in a fixed directory. A meta XML file can be useful for defining rules such as which application each machine should run, pre-requisites, and so on.
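A minimal sketch of such a console app, assuming the SSH.NET library (Renci.SshNet) for the SFTP part; the host, credentials, and paths are placeholders:
```csharp
using System.Diagnostics;
using System.IO;
using System.IO.Compression;
using Renci.SshNet;   // SSH.NET NuGet package (an assumption for the SFTP step)

class Fetcher
{
    static void Main()
    {
        const string localZip = @"C:\deploy\package.zip";
        const string installDir = @"C:\deploy\app";

        // Download the latest package from a fixed SFTP location.
        using (var sftp = new SftpClient("deploy.example.com", "deployuser", "password"))
        {
            sftp.Connect();
            using (var file = File.OpenWrite(localZip))
            {
                sftp.DownloadFile("/packages/package.zip", file);
            }
        }

        // Uncompress into a fixed directory and start the entry point.
        if (Directory.Exists(installDir)) Directory.Delete(installDir, recursive: true);
        ZipFile.ExtractToDirectory(localZip, installDir);
        Process.Start(Path.Combine(installDir, "App.exe"));   // hypothetical executable name
    }
}
```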
You can also use the Dropbox API to download your files if you don't have a centralized server to unify your apps.
Have a look at kwateeSDCM. It's language- and platform-agnostic (Windows, Linux, Solaris, MacOS). There's an article dedicated to deploying a webapp to multiple Tomcat servers.

Implementing a desktop .NET application that can work offline.

I need to create a desktop WPF application in .NET.
The application communicates with a web server, and can work in offline mode when the web server isn't available.
For example, the application needs to calculate how much time the user works on a project. The application connects to the server and gets a list of projects, the user selects one project and presses a button to start a timer. The user can later stop the timer. The project start and stop times need to be sent to the server.
How do I implement this functionality when the application is in offline mode?
Are there existing solutions or libraries to simplify this task?
Thanks in advance.
You'll need to do a couple of things differently in order to work offline.
First, you'll need to cache a list of projects. This way, the user doesn't have to go online to get the project list - you can pull it from your local cache when the user is offline.
Secondly, you'll need to save your timing results locally. Once you go online again, you can update the server with all of the historic timing data.
This just requires saving the information locally. You can choose to save it anywhere you wish, and even a simple XML file would suffice for the information you're saving, since it's simple - just a project + a timespan.
It sounds like this is a timing application for business tracking purposes, in which case you'll want to prevent the user from easily changing the data. Personally, I would probably save this in Isolated Storage, and potentially encrypt it.
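A minimal sketch of the "simple XML file" idea above, caching the pending timing entries locally until the server is reachable again; the types, folder, and file names are made up for illustration:
```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Xml.Serialization;

public class TimeEntry
{
    public string Project { get; set; }
    public DateTime Start { get; set; }
    public DateTime Stop { get; set; }
}

public static class OfflineStore
{
    // Hypothetical location; Isolated Storage or an encrypted file would work the same way.
    private static readonly string PendingFile =
        Path.Combine(Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData),
                     "TimeTracker", "pending.xml");

    public static void Save(List<TimeEntry> entries)
    {
        Directory.CreateDirectory(Path.GetDirectoryName(PendingFile));
        var serializer = new XmlSerializer(typeof(List<TimeEntry>));
        using (var stream = File.Create(PendingFile))
            serializer.Serialize(stream, entries);
    }

    public static List<TimeEntry> Load()
    {
        if (!File.Exists(PendingFile)) return new List<TimeEntry>();
        var serializer = new XmlSerializer(typeof(List<TimeEntry>));
        using (var stream = File.OpenRead(PendingFile))
            return (List<TimeEntry>)serializer.Deserialize(stream);
    }
}
```
When the server becomes reachable again, push the pending entries up and clear the local file.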
You can use SQL Server Compact for your local storage and then use the Microsoft Sync Framework to sync your local database to the server database. I recommend doing some research on the Microsoft Sync Framework.
Hello all, I implemented this application by creating my own offline framework based on this article and the Microsoft Disconnected Service Agent (DSA). I adapted the framework for my needs.
Thank you all.
You can use a typed or untyped DataSet for offline storage.
When online (connected to the internet) you can download the data into a DataSet and upload it back to the database server. The DataSet can be loaded from and saved to a local file.
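For example, a minimal sketch of persisting a DataSet locally; the file name is a placeholder:
```csharp
using System.Data;

class DataSetCache
{
    const string CacheFile = "offline-cache.xml";

    // While online: fill the DataSet from the server, then persist it locally.
    static void SaveLocally(DataSet data)
    {
        data.WriteXml(CacheFile, XmlWriteMode.WriteSchema);
    }

    // While offline: work against the locally saved copy.
    static DataSet LoadLocally()
    {
        var data = new DataSet();
        data.ReadXml(CacheFile);
        return data;
    }
}
```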

Is caching infrequently changed data to speed client application load times an appropriate use of Isolated Storage?

I'm working on an enterprise application re-write using Silverlight. The project is still in its early stages of development, but there is a heavy initial data load on the application start as it pulls several sets of business objects from the server. Some of these data sets are infrequently changed once they're set up by the user; like the list of all customized data types in use by the user.
In these cases, the thought is to cache the data objects (probably in a serialized form) in Isolated Storage so there's no wait on an asynchronous call to the server to grab the data after the first application load.
I thought that Isolated Storage was meant to store configuration data such as user preferences, or to share data across the in-browser and out-of-browser versions of an app; that it works a lot like a cookie store.
My main concern is that I'm unsure of how secure Isolated Storage is, and I don't trust caching application data in it. To be fair, the user would also have access to the Silverlight .xap file.
Is this an appropriate use for Isolated Storage, why or why not?
It's a fair use of isolated storage, if you're comfortable with the caveats.
The first caveat in my mind is that whatever you store in isolated storage on one machine will not be available when the user fires up your app on another machine - you lose the mobility advantage of web applications over desktop installed apps. If the user spends some time configuring their preferences, etc, they will be irritated that they have to do it all over again just because they switched to a different computer to view your web app. To solve this, you should replicate the user's customizations to cloud storage so that it can be copied down to whatever machine they choose to run your web app on. Treat the isolated storage as a performance optimization cache for data that officially lives in the cloud.
I believe Silverlight isolated storage is written to disk in the user's private data area in the file system. \users\\AppData or similar. This will keep it isolated away from other users on the same machine, but will not provide any protection from other programs running for the same user. I don't recall if Silverlight isolated storage is encrypted on disk. I highly doubt it.
A second caveat is that Silverlight isolated storage has a quota limit, and it's fairly small by default (1MB). The quota can be increased with a call to IncreaseQuotaTo(), which will prompt the end user to ok the request.
The third caveat is that if you're going to use local storage as a cache of data that lives in the cloud, you have to manage the data synchronization yourself. If the user makes changes locally, you need to push that up to the storage of authority in the cloud, and you'll have to decide when or how often to refresh the local cache from the cloud, and what to do when both have been changed at the same time (collision).
The browser cookie store is not a great metaphor for describing Silverlight isolated storage. Browser cookies for a given domain are attached to every http request that is made from the client to the server. The cookies are transmitted to the server constantly. The data in Silverlight isostorage is only accessible to the Silverlight code running on the client machine - it is never transmitted anywhere by Silverlight or the browser.
Treat Silverlight's isolated storage as a local cache of cloud data and you should be fine. Treat isostorage as a permanent storage and you'll piss off your customers because the data won't follow them everywhere they can use your web app.
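A minimal sketch of that local-cache pattern in Silverlight isolated storage, including the quota caveat mentioned above; the file name and the quota increase are arbitrary choices for illustration:
```csharp
using System.IO;
using System.IO.IsolatedStorage;

public static class IsoCache
{
    const string CacheFile = "dataTypes.cache.xml";

    public static void Save(string serializedData)
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            // Ask for more room if the default 1 MB quota is too small. Note that in
            // Silverlight, IncreaseQuotaTo only succeeds inside a user-initiated event
            // (e.g. a button click handler).
            if (store.AvailableFreeSpace < serializedData.Length)
                store.IncreaseQuotaTo(store.Quota + 5 * 1024 * 1024);

            using (var stream = store.OpenFile(CacheFile, FileMode.Create))
            using (var writer = new StreamWriter(stream))
                writer.Write(serializedData);
        }
    }

    public static string Load()
    {
        using (var store = IsolatedStorageFile.GetUserStoreForApplication())
        {
            if (!store.FileExists(CacheFile))
                return null;   // cache miss: fall back to the asynchronous server call
            using (var stream = store.OpenFile(CacheFile, FileMode.Open))
            using (var reader = new StreamReader(stream))
                return reader.ReadToEnd();
        }
    }
}
```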
Not a complete answer to your story but a data point to consider:
Beware the IO speeds of IsolatedStorage. While there has been considerable effort put into speeding it up, you may want to consider other options if you plan to do multiple small reads/writes as it can be extremely slow. (That, or use appropriate buffering techniques to ensure your reads/writes are larger and infrequent.)
