Upload to web on dropping a file into a folder, like Dropbox - C#

There are similar questions on Stack Overflow, but none of them fulfills my requirements.
I want that when a user moves a file into a folder on their desktop, that file is uploaded to a web server (i.e. a one-way Dropbox feature).
Technically, I want a listener that detects when a file is dropped into a folder and triggers an upload function.
P.S. I would prefer code or resources in .NET.

You can use a FileSystemWatcher to watch the folder.

You should use a FileSystemWatcher to monitor the folder. Then you upload the changed files with one of the methods available on your web server.
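A minimal sketch of that setup is below. The watched folder path and the upload step are placeholders: a real implementation would POST the file with HttpClient (or push it over FTP/SFTP) instead of just recording the name.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Threading;

// Folder to treat as the one-way "dropbox"; point this at the user's real folder.
string dropFolder = Path.Combine(Path.GetTempPath(), "one-way-dropbox");
Directory.CreateDirectory(dropFolder);

var uploaded = new List<string>();

// Placeholder for the real upload call (e.g. an HttpClient POST to your server).
void UploadToServer(string path) => uploaded.Add(Path.GetFileName(path));

var watcher = new FileSystemWatcher(dropFolder)
{
    NotifyFilter = NotifyFilters.FileName | NotifyFilters.LastWrite,
    EnableRaisingEvents = true,
};

// Created fires as soon as the file appears; a robust implementation should
// also retry until the file is no longer locked by the copying process.
watcher.Created += (sender, e) => UploadToServer(e.FullPath);
```

Note that large files may still be mid-copy when `Created` fires, so production code usually retries opening the file exclusively before uploading.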

Related

Check whether an SFTP file is in use [duplicate]

How can I make sure that a file uploaded through SFTP (on a Linux-based system) stays locked during the transfer so an automated system will not read it?
Is there an option on the client side? Or server side?
The SFTP protocol has supported locking since version 5; see the specification.
You didn't specify what SFTP server you are using, so I'm assuming the most widespread one, OpenSSH. OpenSSH supports SFTP version 3 only, so it does not support locking.
Anyway, even if your server supported file locking, most SFTP clients/libraries won't support SFTP version 5; and even if they do, they won't support the locking feature. Note that the lock is explicit: the client has to request it.
There are some common workarounds for the problem:
As suggested by @user1717259, you can have the client upload a "done" file once the upload finishes. Make your automated system wait for the "done" file to appear.
You can have a dedicated "upload" folder and have the client (atomically) move the uploaded file to a "done" folder. Make your automated system look to the "done" folder only.
Have a file naming convention for files being uploaded (".filepart") and have the client (atomically) rename the file after an upload to its final name. Make your automated system ignore the ".filepart" files.
See (my) article Locking files while uploading / Upload to temporary file name for example of implementing this approach.
Also, some SFTP servers have this functionality built in. For example ProFTPD with its HiddenStores directive (courtesy of @fakedad).
A gross hack is to periodically check for file attributes (size and time) and consider the upload finished, if the attributes have not changed for some time interval.
You can also make use of the fact that some file formats have clear end-of-the-file marker (like XML or ZIP). So you know, when you download an incomplete file.
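The ".filepart" rename convention from the list above can be sketched like this. It is shown against the local file system for simplicity; with an SFTP library the write and the rename would be remote calls, and the folder path here is just an example.

```csharp
using System;
using System.Collections.Generic;
using System.IO;
using System.Linq;

string dir = Path.Combine(Path.GetTempPath(), "filepart-demo-" + Guid.NewGuid().ToString("N"));
Directory.CreateDirectory(dir);

// Upload under a temporary name, then rename; the rename is atomic on the
// same volume, so readers never observe a half-written final file.
void UploadAtomically(string finalName, byte[] data)
{
    string partPath = Path.Combine(dir, finalName + ".filepart");
    File.WriteAllBytes(partPath, data);                // the slow transfer
    File.Move(partPath, Path.Combine(dir, finalName)); // the atomic publish
}

// The automated reader simply ignores *.filepart entries.
IEnumerable<string> VisibleFiles() =>
    Directory.EnumerateFiles(dir)
             .Select(p => Path.GetFileName(p))
             .Where(name => !name.EndsWith(".filepart"));
```

The same shape works for the "upload folder + done folder" variant: replace the rename with a move between directories on the same volume.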
A typical way of solving this problem is to upload your real file, and then to upload an empty 'done.txt' file.
The automated system should wait for the appearance of the 'done' file before trying to read the real file.
A simple file locking mechanism for SFTP is to first upload the file to a directory (folder) where the reading process isn't looking. You can create an alternate folder with the sftp> mkdir command. Upload the file to the alternate directory instead of the ultimate destination directory. Once the sftp> put command completes, do a rename like this:
sftp> rename alternate_path/filename destination_path/filename. Since the SFTP rename just switches the file pointers, it is atomic, so it is an effective lock.

Display folders and files from a remote desktop computer on a website

I've been trying to make a website that can display all the folders and files on a remote computer. Is there a good way to do this?
I've been looking at RDP and RDC, but there you have to use a remote desktop application to do this.
I'm working with ASP.NET, C# and JavaScript.
Is there a way to display the folder system on a website,
like:
C: drive (free space)
(then come all the folders on the C drive)
  folder 1
  (and in each folder, all the files)
    file 1
    file 2
    file 3
  folder 2
  folder 3
I've been looking a little at ConnectionOptions and it looks like it would do the trick, but does it work? And/or is there another way?
I found this link http://social.msdn.microsoft.com/Forums/vstudio/en-US/27ea1e6d-dc11-4ed0-a3d8-1d1462231848/remotely-access-the-computer-c but I'm not sure if it would work.
Need your help!
Regards, Kasper
If you are looking for a file manager, see this question:
Best free file manager for ASP.Net
The best option is this one:
http://www.izwebfilemanager.com/
With that said, exposing the entire hard drive of a remote machine in a web app is a dubious security proposition.
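If you do go ahead, a server-side sketch of building such a tree (which an ASP.NET page could then render) might look like the following. The depth cap and the plain-text formatting are arbitrary choices, not requirements.

```csharp
using System;
using System.IO;
using System.Text;

// Build an indented text tree for one root folder, up to maxDepth levels deep.
// For whole drives you would start from DriveInfo.GetDrives() and use
// d.Name plus d.AvailableFreeSpace for the "C: drive (free space)" line.
string ListTree(string root, int maxDepth)
{
    var sb = new StringBuilder();
    void Walk(string dir, int level)
    {
        sb.AppendLine(new string(' ', level * 2) + Path.GetFileName(dir));
        if (level >= maxDepth) return;
        foreach (string sub in Directory.EnumerateDirectories(dir))
            Walk(sub, level + 1);
        foreach (string file in Directory.EnumerateFiles(dir))
            sb.AppendLine(new string(' ', (level + 1) * 2) + Path.GetFileName(file));
    }
    Walk(root, 0);
    return sb.ToString();
}
```

In a real deployment you would restrict `root` to a whitelisted folder and run the enumeration under an account with only the permissions it needs.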

How to store temp files in SharePoint 2010 hosted code?

I have a hosted web service in SharePoint 2010 that does uploads and downloads to SharePoint.
Because the files can be large (100 MB+), I would much rather have the code stream through temp files instead of MemoryStreams, to avoid 100 MB memory allocations on every download/upload.
The problem is that I couldn't find a location on the server to store temp files. System.IO.Path.GetTempFileName() throws an error because the authenticated user doesn't have permissions to the %TEMP% folder on the server.
"%systemroot%\temp" allows writing files but not deleting them.
Any idea if I can get a location from SharePoint that any authenticated user can access to store the files?
A few notes:
The files are temporary and need to be deleted right away, so there is no need to consider clustering issues.
I don't want a solution that requires any manual action on the servers, as this plugin might be deployed on farms with many servers and I'd hate to ask the customer to go through each one.
Thanks.
You need to access the files under SharePoint's "system account". And yes, System.IO.Path.GetTempFileName() gives the correct location.
Starting point - SPSecurity.RunWithElevatedPrivileges.
Notes
If you can, open files as "temporary + delete on close" (check the appropriate flags on the FileStream class).
Be extremely careful not to access other SharePoint resources (SPFile/SPItem, ...) while running code inside a RunWithElevatedPrivileges delegate.
You may only need to open the file under RunWithElevatedPrivileges; read/write may work outside it, but please verify this yourself. I'd keep all file access inside delegates running with RunWithElevatedPrivileges.
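The "temporary + delete on close" idea from the notes above can be sketched with FileOptions.DeleteOnClose. This is shown against a plain temp path; in the SharePoint case the open would happen inside the elevated delegate.

```csharp
using System;
using System.IO;

string tempPath = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString("N") + ".tmp");

// DeleteOnClose makes the OS remove the file when the stream is disposed,
// so no explicit cleanup (and no delete permission afterwards) is needed.
using (var stream = new FileStream(tempPath, FileMode.CreateNew,
                                   FileAccess.ReadWrite, FileShare.None,
                                   bufferSize: 4096, FileOptions.DeleteOnClose))
{
    // Stream the large upload/download through this file instead of a
    // 100 MB MemoryStream.
    stream.Write(new byte[] { 1, 2, 3 }, 0, 3);
}

bool gone = !File.Exists(tempPath);   // the file is removed once the stream closes
```

This sidesteps the "%systemroot%\temp allows writing but not deleting" problem, because the deletion is done by the OS on behalf of the handle owner rather than as a separate delete operation.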
It might not be the best place, but I have used the _layouts directory in the hive (C:\Program Files\Common Files\Microsoft Shared\Web Server Extensions\14\TEMPLATE\LAYOUTS) for something similar before.
You can get this location with Microsoft.SharePoint.Utilities.SPUtility.GetGenericSetupPath() and you should be able to read/write in the directory. You may need to run with elevated permissions.

scheduling files for upload

I want to schedule files for upload. The files reside on the company network (intranet).
I use FileUpload to browse to the files on screen that are scheduled for upload, but FileUpload does not give the full path of the file. What is the way to work around this limitation?
It's not possible, for security reasons. It may be possible in IE with some JavaScript, but that approach is not recommended.

sync database to uploaded file using Windows Service or website code

I have a website that occasionally needs to have a handful of the tables in its database updated. The updates come from another system that exports to comma delimited text files. I can then either FTP the text files to the web server, send them in through an admin upload page, or manually log in to Remote Desktop to download the text files. I have all my C# code written to parse the files, check the database contents, and decide what to do.
Should I code the sync logic to be part of a file upload page, protected in the admin section of the site or should I create a Windows Service that constantly looks for files to process in a particular directory that I can drop files in through FTP?
I have used Windows Services in the past and they have worked great, but if I ever have to make a change to the code it can take longer than it would if I just had to modify an ASPX.
Are there security benefits one way or another?
Performance benefits?
The ASPX page wins the "ease of maintenance" category.
I would create a Windows Service that watches a secure folder, using a directory watcher to look for new files. Since the files come from another system, the process is asynchronous in nature, and it is much more performant to have a Windows Service running separately to pick up updates as they happen. It can also parse the files and update the database for you.
Depending on who maintains the remote system, the easiest way is to grant permission to the service to access the files on a secure, shared folder. Then you won't need to do anything manually.
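A sketch of the processing core such a service could host follows. The CSV handling and the database call are stand-ins for the poster's existing code, and the inbox path is an example; a Windows Service would call ProcessPendingFiles from a FileSystemWatcher event or a timer.

```csharp
using System;
using System.IO;

string inbox = Path.Combine(Path.GetTempPath(), "csv-inbox-" + Guid.NewGuid().ToString("N"));
string processed = Path.Combine(inbox, "processed");
Directory.CreateDirectory(processed);   // also creates the inbox itself

// Stand-in for the existing parse/check/update logic; returns rows applied.
int ApplyRowToDatabase(string[] fields) => fields.Length > 0 ? 1 : 0;

// One pass over the inbox; the service triggers this on each new-file event
// (or on a schedule) and then waits for the next trigger.
int ProcessPendingFiles()
{
    int rowsApplied = 0;
    foreach (string file in Directory.GetFiles(inbox, "*.txt"))
    {
        foreach (string line in File.ReadLines(file))
            rowsApplied += ApplyRowToDatabase(line.Split(','));
        // Move the file out of the inbox so it is processed exactly once.
        File.Move(file, Path.Combine(processed, Path.GetFileName(file)));
    }
    return rowsApplied;
}
```

Keeping the processing logic in a plain class like this also lets you host the same code from an ASPX admin page if you later change your mind about the service.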
