I am creating an SFTP upload program. It works great: it connects to the remote SFTP server and uploads the files as intended. The issue I am having is that I want the files, once uploaded, moved to a new directory on the local server. I have searched the WinSCP site and done Google searches, but the code I came up with is not working. Here is what I have:
foreach (TransferEventArgs transfer in transferResult.Transfers)
{
    Console.WriteLine("Upload of {0} succeeded", transfer.FileName);
    session.MoveFile(transfer.FileName, Local_Processed);
}
The log states that the files are being moved, but they remain in the original folder and nothing appears in the processed folder.
Session.MoveFile is for moving a remote file to another remote directory, or for renaming a remote file. It is not for moving a remote file to a local directory.
To move a remote file to a local directory, use the remove parameter of Session.GetFiles.
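A minimal sketch of that variant, with placeholder paths; passing true for the remove parameter deletes the remote files after a successful download:
// Download everything matching the remote mask and delete the originals.
// Both paths here are placeholders, not values from the question.
session.GetFiles("/remote/path/*", @"C:\local\path\", true).Check();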
Though for me it looks like you actually want to move an original local file (that was uploaded) to another local directory. So it has actually nothing to do with WinSCP.
To move a local file, use File.Move:
File.Move(transfer.FileName, destinationPath);
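Note that File.Move expects a full destination file path, not just a directory. A sketch, assuming Local_Processed from the question is the local destination folder:
// Combine the processed folder with the uploaded file's name;
// for uploads, transfer.FileName is the path of the source local file.
string destinationPath = Path.Combine(Local_Processed, Path.GetFileName(transfer.FileName));
File.Move(transfer.FileName, destinationPath);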
Here is what I ended up with after Martin Prikryl posted. I ended up having to add a second foreach after my first one, used just to move the files. I also found that the *.* in my original directory call had to be left out, as this was also causing issues.
I ended up creating a second variable in my app.config file. It had the exact same path as the original directory variable, except it didn't have the *.* file mask.
foreach (var file in Directory.GetFiles(OrgPath))
{
    File.Move(file, Path.Combine(Processed, Path.GetFileName(file)));
}
I am currently using FtpWebRequest to move a local file over to a server. I am able to FTP to the root directory at ftp://ftp.xxxx.com, but whenever I try to FTP the file to a folder within that directory, like ftp://ftp.xxxx.com/firstfolder, nothing happens. I don't get any hard halts in the code, and I also set up an FtpWebResponse stating that the transfer is complete.
string dest = "ftp://username:password@ftp.xxxx.com/firstfolder/" + fileName;
ftp = (FtpWebRequest)FtpWebRequest.Create(dest);
I have also tried using %2f to mimic the CD command.
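E.g., a variant of the destination string above with the root-relative path encoded (%2f is a URL-encoded leading slash, so the path is resolved from the FTP root rather than the login directory):
// Same placeholder credentials and host as above, with the %2f trick applied.
string dest = "ftp://username:password@ftp.xxxx.com/%2ffirstfolder/" + fileName;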
Here are a few links I have been looking at with no luck:
https://blogs.msdn.microsoft.com/mariya/2006/03/06/changing-to-the-root-directory-with-ftpwebrequest
https://social.msdn.microsoft.com/Forums/en-US/91e2bed0-9e5e-4503-9e66-d224086e43a8/change-directory-with-ftpwebrequest
https://msdn.microsoft.com/en-us/library/system.net.ftpwebrequest(v=vs.110).aspx
In IE the file did not appear in the server's directory. I used Google Chrome and I was able to view the file successfully. It was uploaded the whole time.
I am simply trying to transfer text files from one FTP server to another using a Windows service. I download the required files from the source FTP server, save them locally on my system, and then upload the saved files to the destination server. For downloading and uploading files I am using the WinSCP .NET assembly. Here is the code that I am using to transfer files to the destination server:
WinSCP.SessionOptions sessionOptions = new WinSCP.SessionOptions();
sessionOptions.Protocol = WinSCP.Protocol.Ftp;
sessionOptions.UserName = "myUsername";
sessionOptions.Password = "myPassword";
sessionOptions.PortNumber = 21;
sessionOptions.HostName = serverIPAddress;

// Open the session before transferring.
WinSCP.Session session = new WinSCP.Session();
session.Open(sessionOptions);

WinSCP.TransferOptions transferOptions = new WinSCP.TransferOptions();
transferOptions.TransferMode = WinSCP.TransferMode.Binary;

WinSCP.TransferOperationResult transferResult;
transferResult = session.PutFiles(PathToLocalFile + filename, destinationFilePath, false, transferOptions);
transferResult.Check();
It works fine and uploads the file to the server, but if a connectivity issue occurs while transferring the file, an incomplete chunk of the file is left on the destination server.
I have searched the official WinSCP documentation, but I couldn't find anything related to this.
Is there any way to ensure that only complete files get transferred to the destination, and otherwise (if an error occurs during the transfer) the transferred chunk gets deleted automatically, without manually deleting the incomplete file?
There's no way to make this automatic.
You have to code it yourself: check if the transfer failed, reconnect (if needed), and delete the partially uploaded file.
Though, as already mentioned in the comments, if the transfer fails because of connection problems, you may not be able to reconnect to delete the file.
There's no magic solution. The server should be able to deal with partial files in the first place.
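For illustration, a sketch of that manual cleanup, assuming destinationFilePath from the question is the full remote path of the uploaded file:
// Upload without Check(), so a failed transfer does not throw here;
// inspect the result instead.
WinSCP.TransferOperationResult transferResult =
    session.PutFiles(PathToLocalFile + filename, destinationFilePath, false, transferOptions);

if (!transferResult.IsSuccess)
{
    // Reconnect if the connection was lost; this can itself fail.
    if (!session.Opened)
    {
        session.Open(sessionOptions);
    }

    // Remove the partial remote file, if one was created.
    if (session.FileExists(destinationFilePath))
    {
        session.RemoveFiles(destinationFilePath).Check();
    }
}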
See also:
How to detect that a file is being uploaded over FTP (while a seemingly different topic, detecting whether a file is being uploaded is basically the same thing as detecting whether a file has not been uploaded completely)
File upload with WinSCP .NET/COM with temporary filenames
I have multiple web servers and one central file server inside my data center,
and all my web servers store the user-uploaded files on the central internal file server.
I would like to know what is the best way to pass the file from the web server to the file server in this case?
As suggested, I will try to add more details to the question.
The solution I came up with was:
After receiving files from the user at the web server, I should just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be entirely loaded into memory twice (once at the web server and once at the file server).
Is your file server just another Windows/Linux server, or is it a NAS device? I can suggest a number of approaches based on your requirement. The question is why you want to use the HTTP protocol when you have a much better way to transfer files between servers.
The HTTP protocol is best when you send text data, as HTTP itself is text-based. From the client side to the server side, HTTP is used because that is the only option your browsers give you. But between your servers, I feel you should use the SMB protocol (I am assuming you are using Windows, as the question is tagged IIS) to move the data. It will be orders of magnitude faster and much more efficient to transfer the same data over SMB vs. HTTP.
And with the SMB protocol, you do not have to write any code or complex scripts to do this. As noted in one of the answers above, you can just issue a simple copy command and it will happen for you.
So, just summarizing the options for you (based on my preference):
1. Let the files get uploaded to some location on each IIS web server, e.g. C:\temp\UploadedFiles. You can write a simple two- or three-line PowerShell script which copies the files from this C:\temp\UploadedFiles to \\FileServer\Files\UserID\<FILEGUID>\uploaded.file. The same PowerShell script can delete the file once it has been moved to the other server successfully.
E.g., the script can be this simple, and it is easy to set up as a Windows scheduled task:
# Source folder on the web server; destination share on the file server.
$Source = "C:\temp\UploadedFiles"
$Destination = "\\FileServer\Files\UserID\<FILEGUID>\"
New-Item -ItemType directory -Path $Destination -Force
Copy-Item -Path $Source\*.* -Destination $Destination -Force
This script can be modified to suit your needs, e.g. to delete the files once the copy is done :)
2. In the ASP.NET application, you can save the file directly to the network location, i.e. in the SaveAs call you can give the network path itself. You have to make sure this network share is accessible to the IIS worker process and that it has write permission. Also, to my understanding, ASP.NET saves the file to a temporary location first (you have no control over this if you are using the ASP.NET HttpPostedFileBase or FormCollection). More details here.
You can even run this asynchronously so that your requests are not blocked:
if (FileUpload1.HasFile)
{
    // Save the uploaded file directly to the network share.
    FileUpload1.SaveAs(@"\\networkshare\filename");
}
https://msdn.microsoft.com/en-us/library/system.web.ui.webcontrols.fileupload.saveas(v=vs.110).aspx
3. Save the file the current way to a local directory and then use HTTP POST. This is the worst design possible, as you first read the contents and then transfer them, chunked, to the other server, where you have to set up another web service which receives the file. Then you have to read the file from the request stream and save it again to your location. I am not sure you need to do this.
Let me know if you need more details on any of the listed methods.
Or you could just write it to a folder on the web servers and create a scheduled task that moves the files to the file server every x minutes (e.g. via robocopy). This also makes sure your web servers are not reliant on your file server.
Assuming that you have an HttpPostedFileBase then the best way is just to call the .SaveAs() method.
You need the UNC path to the file server and that is it. The simplest version would look something like this:
public void SaveFile(HttpPostedFileBase inputFile) {
    var saveDirectory = @"\\fileshare\application\directory";
    var savePath = Path.Combine(saveDirectory, inputFile.FileName);
    inputFile.SaveAs(savePath);
}
However, this is simplistic in the extreme. Take a look at the OWASP Guidance on Unrestricted File Uploads. File uploads can be the source of many vulnerabilities in your application.
You also need to make sure that the web application has access to the file share. Take a look at this answer, Creating a file on network location in asp.net, for more info. Generally the best solution is to run the application pool under a special identity which is only used to access the folder.
The solution I came up with was: after receiving files from the user at the web server, I should just do an HTTP POST to the file server. But I think there is something wrong with this, because it causes large files to be entirely loaded into memory twice (once at the web server and once at the file server).
I would suggest not posting the file all at once; it is then fully held in memory, which is not needed.
You could post the file in chunks, using Ajax. When a chunk arrives at your server, just append it to the file.
With the File Reader API, you could read the file in chunks in Javascript.
Something like this:
/** Upload a file in chunks. */
function upload(file) {
    var chunkSize = 8000;
    var start = 0;
    while (start < file.size) {
        var end = start + chunkSize;
        var chunk = file.slice(start, end);
        var xhr = new XMLHttpRequest();
        xhr.onload = function () {
            // Check whether all chunks have arrived, and then send the
            // file name, or send it in the first/last request.
        };
        xhr.open("POST", "/FileUpload", true);
        xhr.send(chunk);
        start = end;
    }
}
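On the server side, a hypothetical ASP.NET MVC action matching the /FileUpload URL above could append each chunk as it arrives; the target path and how the file name is communicated are assumptions:
[HttpPost]
public ActionResult FileUpload()
{
    // Placeholder target; in practice the file name would be sent along
    // with the chunks (e.g. in a header or in the first request).
    string path = @"\\FileServer\Files\uploaded.file";

    // Append the raw request body (one chunk) to the file.
    using (var target = new FileStream(path, FileMode.Append, FileAccess.Write))
    {
        Request.InputStream.CopyTo(target);
    }

    return new HttpStatusCodeResult(200);
}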
It can be implemented in different ways. If you are storing files on the file server as files in the file system, and all of your servers are inside the same virtual network,
then it will be better to create a shared folder on your file server, and once you receive files at the web server, just save them into this shared folder directly on the file server.
Here are the instructions on how to create shared folders: https://technet.microsoft.com/en-us/library/cc770880(v=ws.11).aspx
Just map a drive
I take it you have a means of saving the uploaded file on the web server's local filesystem. The question pertains to moving the file from the web server (which is probably one of many load-balanced nodes) to a central file system all web servers can access.
The solution to this is remarkably simple.
Let's say you are currently saving the files to some folder, say c:\uploadedfiles. The path to uploadedfiles is stored in your web.config.
Take the following steps:
Sign on as the service account under which your web site executes
Map a persistent network drive to the desired location, e.g. from the command line:
NET USE f: \\MyFileServer\MyFileShare /user:SomeUserName password
Modify your web.config and change c:\uploadedfiles to f:\
Ta da, all done.
Just make sure the drive mapping is persistent, and make sure you use a user with adequate permissions, and voila.
I'm having an issue here. I developed an application in C# which creates a text file. This text file is saved to X:\Public\3rd\ASN\. The problem is that in development the files are created and saved with no issues, but once I moved the application to our web server, the application fails and throws this error: "Could not find a part of the path X:\Public\3rd\ASN\1175_0001.txt".
This is the code I'm using to save the file in the directory:
w = File.CreateText(@"X:\Public\Public\3rd\ASN\1175ASN_0001.txt");
Keep in mind that this directory is on another server.
Any help will be really appreciated.
Your X: drive is a mapped network drive. You need to use the network path instead, e.g. \\server\directory\Public\3rd\ASN\1175_0001.txt.
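For illustration, a sketch with the UNC path used directly (server and share names follow the example above; the account the web application runs under must have write access to the share):
// Use the UNC path instead of the drive letter; mapped drives do not
// exist for the service account the web application runs under.
using (StreamWriter w = File.CreateText(@"\\server\directory\Public\3rd\ASN\1175_0001.txt"))
{
    w.WriteLine("..."); // hypothetical content
}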
I have some files on an FTP server, in a directory named, say, "ParentDirectory", which has a child folder named "Child1". How can I move a file from ParentDirectory to its child folder?
Say ParentDirectory has a file named "File01.pdf" in it; now I want to move it to
ParentDirectory/Child1/
without downloading the file, uploading it again to the server, and then deleting it from the FTP server. Is there any way to directly move the file to its child directory?
Would it be an option for you to program a kind of small client-server app which moves/deletes files on your FTP host when it gets a signal from the client?
There might be an issue with files which are in use by the FTP server, but you can check this programmatically.
The link posted by @Petoj helped me get to the solution.
It was provided as a comment to the question; as it was not in the answer section, I could not mark the question as answered:
possible duplicate of How can I use FTP to move files between directories? – Petoj Nov 30 '11 at 9:23
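For completeness, a minimal sketch of the approach from the linked question, using FtpWebRequest's rename method, which can move a file between directories on the same server (host, credentials, and paths are placeholders):
// Renaming to a relative target path moves the file into the child folder.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(
    "ftp://ftp.example.com/ParentDirectory/File01.pdf");
request.Method = WebRequestMethods.Ftp.Rename;
request.Credentials = new NetworkCredential("username", "password");
request.RenameTo = "Child1/File01.pdf";

using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine("Move completed: {0}", response.StatusDescription);
}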
Using your favorite FTP GUI client, simply connect to the remote server and drag and drop your file from the parent directory to the child directory.