Recently I was working on displaying workflow diagram images in our web application. I managed to use the rehosted WF designer and create images on the fly on the server, but since workflow diagrams can very quickly grow large, I wanted to give a better user experience by using an AJAX control for displaying images that supports zoom and pan.
I happened to come across the website of Seadragon, which seems to be just an amazing piece of work that I could use. There is just one disadvantage: in order to use their library for generating Deep Zoom versions of images, I have to use a particular file structure on the server. Because of the temporary nature of the images I am using (workflow diagrams with progress indicators), it is important not only to be able to create such images but also to get rid of them after some time.
Now the question is: how can I best ensure that the temporary image files and the folder hierarchy can be created on the server (an ASP.NET web app), and later cleaned up? I was thinking of using the cache and deleting the corresponding image folder hierarchy when a cache item expires, or simply deleting the contents of the whole temporary folder in Application_Start and Application_End of Global.asax, but I'm not really sure whether this is a good idea and whether there are security restrictions or file-system-related troubles. What do you think?
We do something similar for creating PDF reports and found the easiest way is to use a timestamp check to determine how "old" files are, and then delete them once they pass a certain age, in our case more than 2 hours old. This is done before the next PDF document is created, as part of the creation process. We also created a specific folder and gave the ASP.NET user read/write access to it.
The only disadvantage is that if the PDF-creation process is not used regularly there will be a build-up of files; however, they will be cleaned up eventually. In 2 years and close to 4,000 PDFs we have yet to have an error doing it this way.
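A minimal sketch of that timestamp check (the class name, the `*.pdf` filter, and the two-hour window are illustrative, not the poster's actual code):

```csharp
using System;
using System.IO;

static class TempFileCleaner
{
    // Delete files in 'folder' older than 'maxAge'. Intended to be called
    // just before creating the next document, as part of the creation process.
    public static int DeleteOldFiles(string folder, TimeSpan maxAge)
    {
        int deleted = 0;
        foreach (string file in Directory.GetFiles(folder, "*.pdf"))
        {
            // The last-write timestamp is the "how old" check: anything
            // past the cutoff is considered stale and safe to remove.
            if (DateTime.UtcNow - File.GetLastWriteTimeUtc(file) > maxAge)
            {
                try { File.Delete(file); deleted++; }
                catch (IOException) { /* file still being served; skip it this round */ }
            }
        }
        return deleted;
    }
}
```

Swallowing the `IOException` means a file that happens to be mid-download is simply picked up on the next sweep, which matches the "cleaned up eventually" behaviour described above.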
Use the App_Data folder. This folder is inside your application and writable by your app without having to go outside the context of the app, but it's also secured from casual browsing. It's meant to hold data files for your application.
Application_Start and Application_End will only fire once each, so if you need better cleanup than that, I would consider using a cache structure or a simple windows service to handle the cleanup.
First, you have to make sure your IIS worker process has rights to write/delete files in your cache directory (and NOT the rest of your site, just in case).
Second, I would stay away from Application_Start and Application_End. Application_End is not 100% guaranteed to fire, so you could end up with a growing pile of orphaned images.
I would instead create a scheduled process that runs maybe once per hour or once a day, depending on what you want, and have it check how old each image in your cache is; if it's older than your arbitrary expiry time, delete it.
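Sketched out, such a sweeper might look like this (the class name, hourly interval, and directory layout are assumptions for illustration; the same `Sweep` logic could equally run inside a Windows service or a scheduled task):

```csharp
using System;
using System.IO;
using System.Threading;

// Periodically deletes cached images older than a configured expiry time.
class CacheSweeper
{
    private readonly string _cacheDir;
    private readonly TimeSpan _expiry;
    private Timer _timer;

    public CacheSweeper(string cacheDir, TimeSpan expiry)
    {
        _cacheDir = cacheDir;
        _expiry = expiry;
    }

    // Fire immediately, then once per hour.
    public void Start() =>
        _timer = new Timer(_ => Sweep(), null, TimeSpan.Zero, TimeSpan.FromHours(1));

    public void Sweep()
    {
        foreach (string file in Directory.GetFiles(_cacheDir, "*", SearchOption.AllDirectories))
            if (DateTime.UtcNow - File.GetLastWriteTimeUtc(file) > _expiry)
                File.Delete(file);
    }
}
```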
Other than that there's not much to it.
I have a column in my grid view with images of a progress bar. These images are created on each render and written to my 'write' folder.
However, after Microsoft's patch KB3052480, IIS resets the application once files in its directory have been created, changed, or overwritten.
This can be changed in IIS's settings so that it never resets on update. However, this means the application would need to be restarted manually whenever a patch is applied (not an acceptable outcome).
Is there a way to keep the setting (so that IIS still resets on updates such as changes to .dll files) but still create and write images without triggering a reset?
I have looked around a lot but there is not much information on this particular issue.
What I was thinking is to somehow stop monitoring file changes right before the save takes place, and then resume monitoring again afterwards.
How would this be done, or is there another way to prevent IIS from recycling after this specific change?
To answer the question mentioned in the comment, which I think is your real question: to prevent the app domain from recycling on file save, don't put the files you are saving inside the website's folder. Instead, save them to some other path that is not part of the application.
I'm a bit late, but if you are using ASP.NET Framework, you can store the "dynamic" files in App_Data; I think it's an exception to the recycle rule.
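As a sketch, the save path just needs to resolve to somewhere the file-change monitor isn't watching. The helper below is illustrative (class name and parameters are made up); in a real app the external root would come from configuration and the IIS worker process would need write permission on it:

```csharp
using System;
using System.IO;

static class ImageStore
{
    // Save rendered bytes under a root that is NOT inside the website's
    // folder, so the write never triggers an app-domain recycle.
    public static string Save(string externalRoot, byte[] pngBytes)
    {
        Directory.CreateDirectory(externalRoot);
        string path = Path.Combine(externalRoot, Guid.NewGuid() + ".png");
        File.WriteAllBytes(path, pngBytes);
        return path;
    }
}
```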
I have a stylesheet in my application ~/Content/theme/style.css. It is referenced in my application using standard bundling as such:
bundles.Add(new StyleBundle("~/Content/css").Include(
"~/Content/font-awesome/font-awesome.css",
"~/Content/theme/style.css"));
Now, I have used a Sass compiler (Libsass) to allow me to change the output style.css file to a customised user output file as required.
So basically I do something like this.
CompilationResult compileResult = SassCompiler.CompileFile(
    Server.MapPath(Path.Combine(WebConfigSettings.RootSassPath, "style.scss")),
    options: new CompilationOptions {
        SourceMap = true,
        SourceMapFileUrls = true
    });
and then I save like this.
string outputPath = Server.MapPath(WebConfigSettings.ThemeOutputPath);
if (System.IO.File.Exists(outputPath))
System.IO.File.Copy(outputPath, string.Format("{0}.bak", outputPath), true);
System.IO.File.WriteAllText(outputPath, compileResult.CompiledContent);
However, intermittently I receive the following dreaded access error: "The process cannot access the file 'C:....\style.css' because it is being used by another process." (Note: this occurs at the File.WriteAllText line.)
This doesn't make sense because I do not open any streams to the file and perform what I assume to be a single atomic operation using File.WriteAllText.
Now I have also noticed that this error is particularly likely when I use two different browsers to modify this file consecutively.
My assumption is that one of two things is happening.
Either:
a. The bundling packager is somehow locking the file because it has been modified while it updates the bundles and not releasing the lock or
b. Because two different connections access the file somehow a lock persists across them.
So, has anyone run into anything similar? Any suggestions on how I might be able to fix this issue?
PS: I have tried using HttpRuntime.UnloadAppDomain(); as a hacky way to try and release any locks on the file but this doesn't seem to be helping.
Your web server itself will get a read lock on the file(s) when they are served. So, if you are going to be writing files at the same time, collisions will be inevitable.
Option 1
Write to disk in a retry loop and ignore this exception. The files are likely to be available for writing within a very short time span.
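A sketch of such a retry loop (the attempt count and delay are arbitrary defaults, not recommendations):

```csharp
using System;
using System.IO;
using System.Threading;

static class RetryWriter
{
    // Write text to a file, retrying on IOException (e.g. a transient read
    // lock held by the web server while it serves the file).
    public static void WriteAllTextWithRetry(string path, string contents,
                                             int maxAttempts = 5, int delayMs = 100)
    {
        for (int attempt = 1; ; attempt++)
        {
            try
            {
                File.WriteAllText(path, contents);
                return;
            }
            catch (IOException) when (attempt < maxAttempts)
            {
                Thread.Sleep(delayMs);  // give the lock holder time to finish
            }
        }
    }
}
```

On the final attempt the exception filter stops matching, so a persistent lock still surfaces as an `IOException` rather than being silently swallowed.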
Option 2
Avoid the web server locking the files by serving them yourself from a cache.
From this answer:
...if you are updating these [files] a lot, you're really defeating IIS's caching mechanisms here. And it is not healthy for the web server to be serving files that are constantly changing. Web servers are great at serving static files.
Now if your [files] are so dynamic, perhaps you'll need to serve it through a server-side program instead.
Since you mentioned in a comment that your end users are changing the files, I would suggest doing the following to ensure there is no chance of a locking conflict:
Use an action method to serve the content of the bundle.
By default, read the files from disk.
When an end user loads the "edit" functionality of the application, load the content from the file(s) into a cache. Your action method that serves the content should check this cache first, serving it if available, and serve the file(s) from disk if not.
When the end user saves the content, compile the content, write it to disk, then invalidate the cache. If the user doesn't save, the cache will just time out eventually and the files will be read from disk again by end users.
See How can I add the result of an ASP.NET MVC controller action to a Bundle? for some potential solutions on how to serve the bundle from an action method. I would probably use a solution similar to this one (although the caching strategy might need to be different).
Alternatively, you could make the cache reload every time it is empty in a user request and update both the files and cache during the "save" operation which would probably be simpler and reduce the chance of a file lock issue to zero, but wouldn't scale as well.
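The cache-first lookup from the steps above could be sketched like this (the class and method names are made up for illustration; in MVC, the `GetContent` result would be returned from an action method as `Content(css, "text/css")`):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Serve edited content from an in-memory cache when present,
// otherwise fall back to reading the file from disk.
class CssContentProvider
{
    private readonly ConcurrentDictionary<string, string> _cache =
        new ConcurrentDictionary<string, string>();

    public string GetContent(string path) =>
        _cache.TryGetValue(path, out string cached) ? cached : File.ReadAllText(path);

    // Called when the end user opens the "edit" functionality.
    public void BeginEditing(string path) => _cache[path] = File.ReadAllText(path);

    // Called after the compiled CSS has been written back to disk.
    public void Invalidate(string path) => _cache.TryRemove(path, out _);
}
```

While editing is in progress, requests never touch the file on disk, which is what removes the chance of a read lock colliding with the compiler's write.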
When a page is rendered in the browser, the optimizer processes the bundled CSS and scripts into the cache. Once the page is cached, on a re-request the browser first checks for cached content and only makes a server call if it is not present. There are really only two options for your LESS- or Sass-type CSS usage:
turn off bundling, or
use a bundling setup that understands LESS/CoffeeScript/SCSS/Sass.
Using Awesomium.NET 1.7 RC3, if I create a WebSession and a WebView in my application like so:
var webSession = WebCore.CreateWebSession("C:\\AwCache", new WebPreferences {...});
var webView = WebCore.CreateWebView(500, 500, webSession);
...and then exit the app, will the cached data (images, css etc.) be available the next time my app starts and creates a WebSession using the same location for the cache?
I believe the cache will still be available. While most of my experience with caching was in Awesomium 1.6.6 and was done by setting the WebCoreConfig.UserDataPath property when calling WebCore.Initialize(), a little testing hints that it is still available.
If you look at the files created when you first run your code and access a web page (I chose Flickr just so there would be a reasonable amount of images on the page), you'll see that inside your AwCache folder, there's another folder called 'Cache'. This folder contains 4 'data_X' files, an index file and a number of 'f_XXXXXX' files. One other thing worth noting is how quickly those files are generated on the first app run. When you rerun the app, no new files are created as long as you're visiting the same URL, but the time stamp on the data_X files, the index files, and maybe a couple of the f_X files get updated, but many f_X files remain the same. The file changes also happen very quickly.
I believe the f_X files are the actual cached items from the site, as visiting a different site will result in an increasing number of f_X files, while revisiting the same site will not.
Obviously, this is far from a matter-of-fact answer, but based on these observations it seems apparent that the cache is maintained. One final piece: if you look at the Awesomium 1.7 documentation, CreateWebSession(WebPreferences) specifies in bold that it creates an in-memory cache, whereas the CreateWebSession(string, WebPreferences) overload that you are calling does not.
I am developing a WinForms application using C# 3.5. I have a requirement to save a file on a temporary basis. Let's just say, for argument's sake, that it's for a short duration while the user is viewing a particular tab in the app. After the user navigates away from the tab, I am free to delete this file. Each time the user navigates to the tab (which is typically only done once), the file will be created (using a GUID name).
To get to my question - is it considered good practice to save a file to the temp directory? I'll be using the following logic:
Path.GetTempFileName();
My intention would be to create the file and leave it without deleting it. I'm going to assume here that the Windows OS cleans up the temp directory at some interval based on % of available space remaining.
Note: I had considered using the IsolatedStorage option to create the file and manually delete it when I was finished with it, i.e. when the user navigates away from the tab. However, it's not going so well, as I have a requirement to get the absolute or relative path to the file, and that does not appear to be a straightforward/safe task when interacting with IsolatedStorage. My opinion is that it's just not designed to allow this.
I write temp files quite frequently. In my humble opinion, the key is to clean up after oneself by deleting unneeded temp files.
In my opinion, it's a better practice to actually delete the temporary files when you don't need them. Consider the following remarks from Path.GetTempFileName() Method:
The GetTempFileName method will raise an IOException if it is used to create more than 65535 files without deleting previous temporary files.
The GetTempFileName method will raise an IOException if no unique temporary file name is available. To resolve this error, delete all unneeded temporary files.
Also, you should be aware of the following hotfix for Windows 7 and Windows Server 2008 R2.
Creating temp files in the temp directory is fine. It is considered good practice to clean up any temporary file when you are done using it.
Remember that temp files shouldn't persist any data you need on a long-term basis (defined as across user sessions). Examples of data needed "long term" are user settings or a saved data file.
Go ahead and save there, but clean up when you're done (closing the program). Keeping them until the end also allows re-use.
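A sketch of that lifecycle with a GUID-named file, as described in the question (the helper names and the `.tmp` extension are assumptions):

```csharp
using System;
using System.IO;

static class TabScratchFile
{
    // Create a GUID-named file in the user's temp directory when the tab opens.
    public static string Create()
    {
        string path = Path.Combine(Path.GetTempPath(), Guid.NewGuid() + ".tmp");
        File.WriteAllText(path, string.Empty);
        return path;   // a real absolute path, unlike with IsolatedStorage
    }

    // Delete the file when the user navigates away from the tab.
    public static void Cleanup(string path)
    {
        if (File.Exists(path))
            File.Delete(path);
    }
}
```

Building the name from `Path.GetTempPath()` plus a GUID also sidesteps the 65535-file limit of `Path.GetTempFileName()` quoted in the other answer, since nothing is pre-created on your behalf.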
I need to create a patching routine for my application.
It's really small, but I need to update it daily or weekly.
How do xdelta and the others work? I've read around about them, but I didn't understand much.
The user shouldn't be prompted at all.
OK, this post got flagged on meta for the answers given, so I'm going to weigh in on this.
xdelta is a binary difference program that, rather than providing you with a full image, only gives you what has changed and where. An example of a text diff will have + and - signs before lines of text showing you that these have been added or removed in the new version.
There are two ways to update a binary image: replace it using your own program or replace it using some form of package management. For example, Linux systems use RPM etc. to push out updates to packages. In a Windows environment your options are limited by what is installed if you're not on a corporate network. If you are, try WSUS and MSI packaging. That'll give you an easier life, or ClickOnce as someone has mentioned.
If you're not however, you will need to bear in mind the following:
You need to be an administrator to update anything in certain folders as others have said. I would strongly encourage you to accept this behaviour.
If the user is an administrator, you can offer to check for updates. Then, you can do one of two things. You can download a whole new version of your application and write it over the image on the hard disk (i.e. the file - remember images are loaded into memory so you can re-write your own program file). You then need to tell the user the update has succeeded and reload the program as the new image will be different.
Or, you can apply a diff if bandwidth is a concern. Probably not in your case but you will need to know from the client program the two versions to diff between so that the update server gives you the correct patch. Otherwise, the diff might not succeed.
I don't think for your purposes xdelta is going to give you much gain anyway. Just replace the entire image.
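For the whole-image replacement, the usual Windows trick is that a running .exe cannot be deleted but can be renamed. The sketch below separates the swap from the restart so the moving parts are clear; the class and method names are illustrative:

```csharp
using System;
using System.Diagnostics;
using System.IO;

static class SelfUpdater
{
    // Move the running image aside and write the new version in its place.
    // Windows allows renaming a running .exe even though deleting it fails.
    public static void SwapInNewImage(string exePath, byte[] newImage)
    {
        string backup = exePath + ".old";
        if (File.Exists(backup)) File.Delete(backup);
        File.Move(exePath, backup);              // rename the running image aside
        File.WriteAllBytes(exePath, newImage);   // drop the new version in place
    }

    // After swapping, relaunch the new binary and exit the old process,
    // since the in-memory image is still the old version.
    public static void Restart(string exePath)
    {
        Process.Start(exePath);
        Environment.Exit(0);
    }
}
```

The leftover `.old` file can be deleted by the new version on its next startup.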
Edit: if the user must not be prompted at all, just reload the app. However, I would strongly encourage informing the user that you are talking on their network and asking permission to do so, or enabling a manual-update mode; otherwise people like me will block it.
What kind of application is this? Perhaps you could use ClickOnce to deploy your application. ClickOnce very easily allows you to push updates to your users.
The short story is: ClickOnce creates an installation that allows your users to install the application from a web server or a file share. You enable automatic updates, and whenever you place a new version of the app on the server, the app will automatically update (or ask the user whether to). The ClickOnce framework takes care of the rest: fetching the update, figuring out which files have changed and need to be downloaded again, and performing the update. You can also check for and perform the update programmatically.
That said, ClickOnce leaves you with little control over the actual installation procedure, and you have nowhere near the freedom of building your own .msi.
I wouldn't go with a patching solution, since it really complicates things when you have a lot of revisions. How will the patching solution handle different versions asking to be updated? What if user A is 10 revisions behind the current revision? Or 100 revisions, etc? It would probably be best to just download the latest exe(s) and dll(s) and replace them.
That said, I think this SO question on silent updates might help you.
There is a solution for efficient patching - it works on all platforms and can run in completely silent mode, without the user noticing anything. On .NET, it provides seamless integration of the update process using a custom UserControl declaratively bound to events from your own UI.
It's called wyUpdate.
While the updating client (wyUpdate) is open source, a paid-for wybuild tool is used to build and publish the patches.
Depending on the size of your application, you'd probably have it split up into several dll's, an exe, and other files.
What you could do is have the main program check for updates. If updates are available, the main program would close and the update program would take over - updating old files, creating new ones, and deleting current files as specified by the instructions sent along with a patch file (probably a compressed format such as .zip) downloaded by the updater.
If your application is small (say, a single exe) it would suffice to simply have the updater replace that one exe.
Edit:
Another way to do this would be to (upon compilation of the new exe), compare the new one to the old one, and just send the differences over to the updater. It would then make the appropriate adjustments.
You can make your function reside in a separate DLL. So you can just replace the DLL instead of patching the whole program. (Assuming Windows as the target platform for a C# program.)