How to store temp files in SharePoint 2010 hosted code? - c#

I have a hosted web service in SharePoint 2010 that uploads files to and downloads files from SharePoint.
Because the files can be large (100 MB+), I would much rather stream them through temp files than through memory streams, to avoid a 100 MB memory allocation on every download/upload.
The problem is that I couldn't find a location on the server to store temp files. System.IO.Path.GetTempFileName() throws an error because the authenticated user doesn't have permissions to the %TEMP% folder on the server.
"%systemroot%\temp" allows writing files but not deleting them.
Any idea how I can get a location from SharePoint that any authenticated user can use to store files?
A few notes:
The files are temporary and need to be deleted right away, so there is no need to consider clustering issues.
I don't want a solution that requires any manual action on the servers, as this plugin might be deployed on farms with many servers and I'd hate to ask the customer to go through each one.
Thanks.

You need to access the files under SharePoint's "system account". And yes, System.IO.Path.GetTempFileName() is the correct location.
Starting point: SPSecurity.RunWithElevatedPrivileges.
Notes:
If you can, open the files as "temporary + delete on close" (check the appropriate flags on the FileStream class).
Be extremely careful not to access other SharePoint resources (SPFile/SPItem...) while running code inside the RunWithElevatedPrivileges delegate.
You may only need to open the file under RunWithElevatedPrivileges; read/write may work outside the delegate - please verify yourself. I'd keep all file access inside delegates running with RunWithElevatedPrivileges.
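A minimal sketch of the approach described above, assuming a SharePoint 2010 (.NET 3.5) context; the class and method names are illustrative:

```csharp
using System.IO;
using Microsoft.SharePoint;

public static class TempFileHelper
{
    public static void CopyToTempFile(Stream input)
    {
        SPSecurity.RunWithElevatedPrivileges(delegate()
        {
            // GetTempFileName now runs under the elevated (app pool) identity,
            // so the %TEMP% permission error from the question goes away.
            string path = Path.GetTempFileName();

            // FileOptions.DeleteOnClose removes the file automatically when the
            // stream is disposed, even if the request fails midway.
            using (FileStream temp = new FileStream(path, FileMode.Create,
                FileAccess.ReadWrite, FileShare.None, 81920, FileOptions.DeleteOnClose))
            {
                byte[] buffer = new byte[81920];
                int read;
                while ((read = input.Read(buffer, 0, buffer.Length)) > 0)
                {
                    temp.Write(buffer, 0, read); // buffered copy, no 100 MB allocation
                }
                temp.Position = 0;
                // ... stream into/out of SharePoint here, still inside the delegate ...
            }
        });
    }
}
```

Note the manual copy loop: SharePoint 2010 targets .NET 3.5, which does not have Stream.CopyTo.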

It might not be the best place, but I have used the _layouts directory in the hive (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS) for something similar before.
You can get this location with Microsoft.SharePoint.Utilities.SPUtility.GetGenericSetupPath(), and you should be able to read and write in that directory. You may need to run with elevated permissions.
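A short sketch of that idea, resolving the hive path and writing a scratch file there; the file name and contents are illustrative, and the elevation may or may not be needed in your environment:

```csharp
using System.IO;
using Microsoft.SharePoint;
using Microsoft.SharePoint.Utilities;

string layoutsDir = SPUtility.GetGenericSetupPath(@"TEMPLATE\LAYOUTS");

SPSecurity.RunWithElevatedPrivileges(delegate()
{
    // Use a random name to avoid collisions between concurrent requests.
    string tempPath = Path.Combine(layoutsDir, Path.GetRandomFileName());
    File.WriteAllText(tempPath, "scratch data");
    // ... use the file, then clean it up promptly ...
    File.Delete(tempPath);
});
```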

Related

Opening files locally from web application

I recently added a way for my web application (ASP.NET, written in C#) to go to a folder that contains a bunch of spreadsheets and import them into SQL Server tables. I set the folders and file names using an admin table so it knows how to handle each file, which table it should go to, etc. It even keeps track of the file dates and times, so it ignores anything that isn't new since the last import. Very cool, but it only works on my development machine, most likely because the path is easily recognized there.
I'd like others to be able to do this, but I can't seem to get the web application to access a pre-arranged path on the user's local machine. Now, I'm assuming this is normal (we shouldn't be able to have a web application reach into someone's machine and grab files!), but is there some way to do it, either using a known path or by having the user select the local folder? Would it be easier if I put the files in a folder within the site?
Dana
If I understand your question correctly, you want a user to type in a local file path and have the server process it.
This will not work through a website, and from a security perspective that is very wise, as you point out. So unless you install some client application on the local machine, it is not possible.
You will need a file-upload dialog and have the user explicitly locate the files for you, click upload, and then process them on the server.
Some other strategies here:
https://developer.mozilla.org/en-US/docs/Using_files_from_web_applications
but it still requires the user to select them manually.
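A sketch of the upload-and-process approach: an ASP.NET Web Forms code-behind handler for a FileUpload control. The control IDs, the App_Data target folder, and the ImportSpreadsheet routine are all illustrative stand-ins for the asker's existing import logic.

```csharp
using System;
using System.IO;

protected void btnUpload_Click(object sender, EventArgs e)
{
    if (!fileUpload.HasFile) return;

    // Save into a folder under the site, then hand off to the existing
    // parse-and-import logic that already works on the dev machine.
    string savePath = Server.MapPath("~/App_Data/Imports/"
        + Path.GetFileName(fileUpload.FileName));
    fileUpload.SaveAs(savePath);
    ImportSpreadsheet(savePath); // hypothetical: the existing import routine
}
```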

Shares files folder between two web servers

I am implementing a C# (.NET) application in which users can upload files and images. Due to their huge size (more than 80 GB in total), we are storing these files on the file system. Now I am preparing another web server that needs to access those files.
My question is: how can I access files in IIS from another server? Both servers are on the same network, and a shared folder does not solve the problem for me.
I read about using a virtual folder, but I am concerned it will put extra load on IIS when handling user requests.
Thanks in advance.
If it's on the same network, just access it via a network share. Using a virtual directory can be helpful if, for example, you'd like to change the network location without having to change any code that references it. The virtual directory itself won't put any extra strain on IIS.
As far as I can tell, local network speed will be your only bottleneck compared to files stored on the same machine as IIS, since the files are transferred first across your local network from one machine to the IIS machine, and then to the end user.
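For reference, a virtual directory pointing at a network share can be created from the command line with appcmd; the site name, path, and share below are illustrative:

```shell
REM Sketch: map /files on the default site to a share on another server.
%windir%\system32\inetsrv\appcmd.exe add vdir /app.name:"Default Web Site/" ^
    /path:/files /physicalPath:\\FILESERVER\uploads
```

Remember that the app pool identity needs read access to the share for this to work.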

.NET MVC Websites - shared folders between websites

I have a production website that sits on two servers and uses local label files to drive its page labels (requests go round robin between the two).
Users need the ability to upload new label files, but once a file is uploaded on one server I need it updated on the second immediately. I tried using a shared folder on one of the servers, but even after giving Everyone full access I get the error "Exception message: Unable to find label folder at \\MACHINENAME\LabelFiles" when reading from the other server. I've also tried giving full permissions to "IIS AppPool\DefaultAppPool", but get the same issue.
I'm using IIS 7.5 on Windows Server 2008 R2
Question-
Is there a way to share a folder between the two sites?
Is there a better alternative solution?
Thanks
Both websites should have a virtual folder pointing to the same physical folder where users can upload files.
Also make sure that Anonymous access is disabled.
One approach is to map the folder as a drive on each of the production machines; it should then be as simple as referring to that particular drive letter.
This can be done by navigating to the folder in Windows Explorer and clicking Map Network Drive.
I cannot guarantee this will work, but it might be worth a go.
The IIS AppPool\DefaultAppPool user is a special account that is only local to the machine where it is being used. You cannot access network file shares with that account. You will either need to use the 'Network Service' account or a domain account as the app pool user.
Also, since you are load balancing this site between two servers, you might want to consider using some type of SAN or NAS storage that is shared between the two servers. Otherwise, you will need to come up with some kind of process to synchronize the file share on both servers.
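The identity change mentioned above can be made per app pool with appcmd; the pool name is illustrative, and a domain account (identityType SpecificUser plus credentials) would work similarly:

```shell
REM Sketch: switch the app pool from the machine-local ApplicationPoolIdentity
REM to NetworkService so it can authenticate to network file shares.
%windir%\system32\inetsrv\appcmd.exe set apppool "DefaultAppPool" ^
    /processModel.identityType:NetworkService
```

After changing the identity, grant that account permissions on the share and the underlying NTFS folder.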

Building, and Publishing, and User-Data (oh my!)

What is the best way to update a "Web-Application" ("re-publish" from dev server to live server) while preserving user-data (such as images, videos, and audio stored in the filesystem) in a VS 2010 build/publish setup?
Additionally, what is the best way to minimize site downtime during these updates?
My backstory:
Usually I build/publish the site to a folder on my dev machine, FTP into the live server, then drag and drop the new published files and folders onto the live site while making sure not to overwrite any user-generated directories.
Obviously this method comes from my static-HTML days, where it didn't matter. And obviously it is dangerous, flawed, and counter-productive for any web application with user-generated data in the file system.
The easiest way is to have a directory outside of your code folder where you store the user data. You can even map this folder as a virtual folder in IIS when you need it to be available from the internet. For example:
C:\Inetpub
    \ProjectWebsite
    \ProjectFiles

sync database to uploaded file using Windows Service or website code

I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files in through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or another?
Performance benefits?
ASPX page wins the "ease of maintenance" category.
I would create a Windows Service to watch a secure folder and use a directory watcher to look for new files. Since the files are coming from another system, it is asynchronous in nature, and it is much more performant to have a Windows Service running separately to watch for updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant the service permission to access the files on a secure, shared folder. Then you won't need to do anything manually.
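A sketch of the watch-a-folder approach using FileSystemWatcher inside a Windows Service; the drop-folder path, file filter, and processing routine are illustrative placeholders for the asker's existing parse-and-update code:

```csharp
using System.IO;

public class ImportWatcher
{
    private readonly FileSystemWatcher _watcher;

    public ImportWatcher(string dropFolder)
    {
        // Raise an event whenever a new comma-delimited text file appears.
        _watcher = new FileSystemWatcher(dropFolder, "*.txt");
        _watcher.Created += (s, e) => ProcessFile(e.FullPath);
        _watcher.EnableRaisingEvents = true;
    }

    private void ProcessFile(string path)
    {
        // The file may still be locked by the FTP transfer when Created fires;
        // retry briefly before opening it.
        // ... existing parse / compare-against-database / update logic here ...
    }
}
```

One design note: Created can fire before the upload finishes, so production code usually retries opening the file with a short delay until the writer releases its lock.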
