scheduling files for upload - c#

I want to schedule files for upload. The files reside on the company's network (intranet).
I use the "FileUpload" control to browse, on screen, the files that are scheduled for upload. But FileUpload does not give the full path of the file. What is the way to work around this limitation?

It's not possible, for security reasons. It is, however, possible in IE with some JavaScript.
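To illustrate the limitation: the browser deliberately strips the client-side path, so only the bare file name reaches the server. A minimal sketch with Path.GetFileName shows what you are left with (the path below is made up for the example):

```csharp
using System;
using System.IO;

public class FileNameOnly
{
    public static void Main()
    {
        // Hypothetical path the user browsed to; the browser never sends it.
        string clientPath = "C:/intranet/reports/q1.pdf";

        // On the server, all you can rely on is the bare name, which is
        // effectively what ASP.NET's FileUpload.FileName gives you.
        Console.WriteLine(Path.GetFileName(clientPath)); // q1.pdf
    }
}
```

If you need to associate the upload with a location on the intranet, you have to ask the user for that information separately; the browser will not supply it.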


Check whether an SFTP file is in use [duplicate]

How can I make sure that a file uploaded through SFTP (on a Linux-based system) stays locked during the transfer, so that an automated system will not read it?
Is there an option on the client side? Or server side?
SFTP protocol supports locking since version 5. See the specification.
You didn't specify what SFTP server you are using, so I'm assuming the most widespread one, OpenSSH. OpenSSH supports SFTP version 3 only, so it does not support locking.
Anyway, even if your server supported file locking, most SFTP clients/libraries won't support SFTP version 5; and even if they do, they may not support the locking feature. Note that the lock is explicit: the client has to request it.
There are some common workarounds for the problem:
As suggested by @user1717259, you can have the client upload a "done" file once an upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after an upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for an example of implementing this approach.
Also, some SFTP servers have this functionality built-in. For example, ProFTPD with its HiddenStores directive (courtesy of @fakedad).
A gross hack is to periodically check the file attributes (size and timestamp) and consider the upload finished if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have a clear end-of-file marker (like XML or ZIP), so you can tell when you have downloaded an incomplete file.
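The ".filepart" convention from the list above can be sketched with local file APIs; the rename at the end is the same atomic step an SFTP client would perform with its rename request (for example, SSH.NET's SftpClient.RenameFile). Directory and file names here are made up:

```csharp
using System;
using System.IO;
using System.Linq;

public class FilePartConvention
{
    public static void Main()
    {
        string dir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(dir);

        // 1. Upload under a temporary name that the automated system ignores.
        string partial = Path.Combine(dir, "data.csv.filepart");
        File.WriteAllText(partial, "col1,col2\n1,2\n");

        // Reader side: only pick up files WITHOUT the .filepart suffix.
        var visible = Directory.GetFiles(dir).Where(f => !f.EndsWith(".filepart"));
        Console.WriteLine(visible.Count()); // 0 -- upload still in progress

        // 2. Atomically rename to the final name once the upload completes.
        //    (Over SFTP the client would issue a rename request instead.)
        File.Move(partial, Path.Combine(dir, "data.csv"));

        visible = Directory.GetFiles(dir).Where(f => !f.EndsWith(".filepart"));
        Console.WriteLine(visible.Count()); // 1 -- now safe to read

        if (visible.Count() != 1) throw new Exception("rename did not publish the file");
    }
}
```

The key property is that the rename is a metadata operation, so the reader never observes a half-written "data.csv".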
A typical way of solving this problem is to upload your real file, and then to upload an empty 'done.txt' file.
The automated system should wait for the appearance of the 'done' file before trying to read the real file.
A simple file-locking mechanism for SFTP is to first upload the file to a directory (folder) where the read process isn't looking. You can create an alternate folder using the sftp> mkdir command. Upload the file to the alternate directory instead of the ultimate destination directory. Once the sftp> put command completes, do a move like this:
sftp> rename alternate_path/filename destination_path/filename. Since the SFTP "move" just switches the file pointers, it is atomic, so it is an effective lock.

How to store temp files in SharePoint 2010 hosted code?

I have a hosted web service in SharePoint 2010 that does uploads and downloads to SharePoint.
Because the files can be large (100 MB+), I would much rather use temp files as the streams the code goes through, instead of memory streams, to avoid 100 MB memory allocations on every download/upload.
The problem is that I couldn't find a location on the server to store temp files. System.IO.Path.GetTempFileName() throws an error because the authenticated user doesn't have permissions to the %TEMP% folder on the server.
"%systemroot%\temp" allows writing files but not deleting them.
Any idea whether I can get a location from SharePoint that any authenticated user can access to store the files?
A few notes:
The files are temporary and need to be deleted right away, so there is no need to consider clustering issues.
I don't want a solution that requires any active action on the servers, as this plugin might be deployed on farms with many servers and I'd hate to ask the customer to go through each one.
Thanks.
You need to access the files under SharePoint's "system account". And yes, System.IO.Path.GetTempFileName() is the correct location.
Starting point - SPSecurity.RunWithElevatedPrivileges.
Notes
If you can, open files as "temporary + delete on close" (check the appropriate flags on the FileStream class).
Be extremely careful not to access other SharePoint resources (SPFile/SPItem, ...) while running code inside the RunWithElevatedPrivileges delegate.
You may only need to open the file under RunWithElevatedPrivileges; read/write may work outside of it, but please verify that yourself. I'd keep all file access inside delegates running with RunWithElevatedPrivileges.
It might not be the best place, but I have used the _layouts directory in the hive (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS) for something similar before.
You can get this location with Microsoft.SharePoint.Utilities.SPUtility.GetGenericSetupPath() and you should be able to read/write in the directory. You may need to run with elevated permissions.
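The "temporary + delete on close" flags mentioned above look like this in plain .NET. This is a minimal sketch; in SharePoint you would wrap the FileStream creation in SPSecurity.RunWithElevatedPrivileges, which is omitted here because it requires the SharePoint assemblies:

```csharp
using System;
using System.IO;

public class TempDeleteOnClose
{
    public static void Main()
    {
        string path = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());

        // "Temporary + delete on close": the file is removed as soon as the
        // last handle is closed, so no explicit cleanup pass is needed.
        using (var fs = new FileStream(path, FileMode.CreateNew,
                   FileAccess.ReadWrite, FileShare.None, 4096,
                   FileOptions.DeleteOnClose))
        {
            fs.Write(new byte[] { 1, 2, 3 }, 0, 3);
        }

        Console.WriteLine(File.Exists(path)); // False -- gone after dispose

        if (File.Exists(path)) throw new Exception("file survived DeleteOnClose");
    }
}
```

This also sidesteps the "%systemroot%\temp allows writing but not deleting" problem, since no separate delete call is ever issued under the restricted identity.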

Upload to web on dropping file to a folder like dropbox

There are similar questions on Stack Overflow, but none of them fulfills my requirements.
I want that when a user moves a file into his folder on the desktop, that file is uploaded to a web server (i.e., a one-way Dropbox-like feature).
Technically, I want a listener that detects when a file is dropped into a folder and triggers an upload function.
P.S. I'd prefer code or resources in .NET.
You can use a FileSystemWatcher to watch the folder.
You should use a FileSystemWatcher to monitor the folder. Then you upload the changed files with one of the methods available on your web server.
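A minimal FileSystemWatcher sketch of the two answers above. The upload itself is simulated by recording the file name; a real implementation might POST the file to the server with HttpClient, which is not shown here:

```csharp
using System;
using System.IO;
using System.Threading;

public class DropFolderUploader
{
    public static void Main()
    {
        // Stand-in for the user's drop folder.
        string folder = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(folder);

        var uploaded = new ManualResetEventSlim(false);
        string uploadedFile = null;

        using var watcher = new FileSystemWatcher(folder);
        watcher.Created += (sender, e) =>
        {
            // Real code would open e.FullPath and upload it here.
            uploadedFile = e.Name;
            uploaded.Set();
        };
        watcher.EnableRaisingEvents = true;

        // Simulate the user dropping a file into the folder.
        File.WriteAllText(Path.Combine(folder, "report.txt"), "hello");

        if (!uploaded.Wait(TimeSpan.FromSeconds(5)))
            throw new Exception("watcher did not fire");
        Console.WriteLine("Would upload: " + uploadedFile);
    }
}
```

Note that a Created event can fire before the writer has finished the file, so production code usually retries opening the file (or also watches for the Changed/Renamed events) before uploading.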

Concurrency Issue on IO operation

I'm writing a multi-threaded console application which downloads PDF files from the web and copies them to our content server location (a Windows server). This is also the location from which the files are served to our website.
I am skeptical about the approach because of concurrency issues: if a user on the website requests a PDF file from the content server at the same time that the file is being written to or updated by the console application, there might be an IOException. (The application also updates the PDF files if the original contents change over time.)
Is there a way to control the concurrency issue?
You probably want the operations that create and update the files in the served location to be atomic, so that any other process dealing with those files gets the correct version, not one that is still open for writing.
Instead of actually writing the files to where they will be served, you could write them to a temporary directory and then move them into the directory where they will be served from.
Similarly, for updating them, you should check that when your application is updating those pdfs that the files themselves are not changed until writing has finished. You could test this by making your application sleep after it has started writing to the file, for example.
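The write-to-temp-then-move approach above can be sketched as follows. File names are made up for the example, and File.Move's overwrite parameter requires .NET Core 3.0 or later:

```csharp
using System;
using System.IO;

public class AtomicPublish
{
    // Write to a temp file in the SAME directory (moves are only atomic
    // within a single volume), then move it into place.
    static void Publish(string destination, byte[] content)
    {
        string temp = destination + ".tmp";
        File.WriteAllBytes(temp, content);
        // Readers see either the old complete file or the new complete
        // file, never a partially written one.
        File.Move(temp, destination, overwrite: true);
    }

    public static void Main()
    {
        string dir = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName());
        Directory.CreateDirectory(dir);
        string dest = Path.Combine(dir, "doc.pdf");

        Publish(dest, new byte[] { 1, 2, 3 });     // initial download
        Publish(dest, new byte[] { 4, 5, 6, 7 });  // later update

        Console.WriteLine(File.ReadAllBytes(dest).Length); // 4

        if (File.ReadAllBytes(dest).Length != 4)
            throw new Exception("update was not published atomically");
    }
}
```

Keeping the temp file next to the destination matters: a move across volumes degrades to a copy-then-delete, which is no longer atomic.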
The details depend on which web server software you are using, but the key to this problem is to give each version of the file a different name. The same URL, mind you, but a different name on the underlying file system.
Once a newer version of the file is ready, change the web server's configuration so that the URL points to the new file. In any reasonably functional web server this should be an atomic operation.
If the web server doesn't have built-in support for this, you could serve the files via a custom server-side script.
Mark the files hidden until the copy or update is complete.

open a file before uploading through file uploader in asp.net

I want to upload an exe to a web server from a client system through a file uploader, and I want to run/open that exe before uploading. How can I run/open the exe before uploading it?
Short answer: No way!
If you really want to execute it client-side, the user has to do it manually; JavaScript and jQuery are not going to execute an application locally.
If you want to execute it on the server side, you first have to upload it to the server.
Why are you trying to do this? Can you explain your use case a bit?
If you're trying to execute on the server then you'll have to upload it first. Plain and simple.
You cannot make the web client open a file, or even access files at all, because of browser security restrictions. All you can do is access the immediate file name (e.g., file.ext) and the file content once the user browses, manually selects the file, and submits the form.
The reason for this restriction is that, if a website could execute files, any site could very easily install malware on a person's machine.
On the other hand, to execute the EXE on the server, it must first be uploaded.
