I'm working on a program where I am uploading multiple files to an FTP server. I need to complete seven actions in doing this:
Create a new folder within the FTP
Upload three files to the new directory
Create three subdirectories within the new directory
From what I've gathered, I'm only able to process one method in FtpWebRequest, much like this:
FtpWebRequest request = WebRequest.Create("ftp://microsoft.com/NewDir/") as FtpWebRequest;
request.Method = WebRequestMethods.Ftp.MakeDirectory;
request.Credentials = new NetworkCredential("username", "password");
FtpWebResponse response = request.GetResponse() as FtpWebResponse;
response.Close();
So am I going to have to write code that also creates the subdirectories and streams the file uploads separately? Or is it possible to complete all of this within a single connection?
This is my first post, so if formatting is messed up I apologize.
From what I've gathered, I'm only able to process one method in FtpWebRequest, much like this: ...
You don't have to close the connection. Just process your request, get the response from the server as you already do (request.GetResponse()), and move on to the next method. When you are done with everything, close the connection.
The list of methods you can set is in the WebRequestMethods.Ftp enumeration; see its documentation for the full set.
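For illustration, here is a minimal sketch of that pattern (host, file names, and credentials are made up): one FtpWebRequest per operation, with KeepAlive left at its default of true so the underlying control connection can be pooled and reused between requests.

// Minimal sketch -- host, paths, and credentials are hypothetical.
using System;
using System.IO;
using System.Net;

class FtpBatch
{
    static readonly NetworkCredential Creds = new NetworkCredential("username", "password");

    // runs a bodyless FTP method (e.g. MakeDirectory) against the given URL
    static void RunFtpMethod(string url, string method)
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = method;
        request.Credentials = Creds;
        request.KeepAlive = true; // allow the control connection to be reused
        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
            Console.WriteLine("{0} -> {1}", method, response.StatusDescription);
    }

    // uploads one local file to the given URL
    static void Upload(string url, string localFile)
    {
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = Creds;
        request.KeepAlive = true;
        using (Stream source = File.OpenRead(localFile))
        using (Stream target = request.GetRequestStream())
            source.CopyTo(target);
        using (FtpWebResponse response = (FtpWebResponse)request.GetResponse())
            Console.WriteLine("STOR -> {0}", response.StatusDescription);
    }

    static void Main()
    {
        RunFtpMethod("ftp://example.com/NewDir/", WebRequestMethods.Ftp.MakeDirectory);
        foreach (string file in new[] { "a.txt", "b.txt", "c.txt" })
            Upload("ftp://example.com/NewDir/" + file, file);
        foreach (string sub in new[] { "Sub1", "Sub2", "Sub3" })
            RunFtpMethod("ftp://example.com/NewDir/" + sub + "/", WebRequestMethods.Ftp.MakeDirectory);
    }
}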
I bought a server on myasp.com, and I want to create a folder dynamically for every user inside a folder called "UserData". For now, when a user registers, I create the directory with an FTP client and the folder gets the username. This method doesn't always work, so I found the traditional method:
Directory.CreateDirectory(Server.MapPath("~") + "hey");
Using this method I get an error: Access to the path 'h:\root\home\sagigamil-001\www\site1\hey' is denied. However, I can check whether the folder exists.
What should I do? Is there a way to give the server permission to write to its own directories? What is the right way?
You probably need to ask your host to give the ASP.NET process write permissions. They might be reluctant to do so for security reasons. If you can't get such permission, there will be no way for ASP.NET to create the folder.
You can create a directory over FTP by using this snippet:
var request = WebRequest.Create(new Uri("ftp://host/directory"));
request.Method = WebRequestMethods.Ftp.MakeDirectory;
using (var response = (FtpWebResponse)request.GetResponse())
{
    Console.WriteLine("Response status code: {0}", response.StatusCode);
}
Don't forget to set your credentials if needed (assign them to request.Credentials).
If you're still running into trouble, don't forget to post the error you're getting.
I have a question regarding the FTP library in C#. I need to download 9000 .txt files from an FTP server. station.ToUpper() gives the file name, so for every file I need a new FTP connection. One file takes around one second, and each .txt file contains only two lines, so all the files together take around an hour and a half. Is there a better/faster solution?
// Get the object used to communicate with the server.
FtpWebRequest request = (FtpWebRequest)WebRequest.Create(ftpAddress + station.ToUpper());
//request.UsePassive = false;
request.Method = WebRequestMethods.Ftp.DownloadFile;
// This example assumes the FTP site uses anonymous logon.
request.Credentials = new NetworkCredential("anonymous", "janeDoe@contoso.com");
FtpWebResponse response = (FtpWebResponse)request.GetResponse();
Stream responseStream = response.GetResponseStream();
StreamReader reader = new StreamReader(responseStream);
There's not much you're doing wrong in this code, except that you're not calling Dispose() on your streams or response objects. Do that first to make sure you're not somehow running out of resources on the client.
Other than that, you don't have too many options here, and a lot depends upon what you can do on the server side.
First, you could try to use threading to download a bunch of files at once. You'll need to experiment with how this affects your throughput. It will probably scale linearly for a while, then fall off. If you open up too many connections, you could anger the maintainer of the server, or it could start denying you connections. Be conservative.
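As an illustration, here is a rough sketch of throttled parallel downloads with Parallel.ForEach (ftpAddress and the stations collection are assumed to exist as in the question; the degree of parallelism is a number to experiment with):

// Rough sketch -- ftpAddress and stations come from the question's code.
using System.Collections.Concurrent;
using System.Net;
using System.Threading.Tasks;

var results = new ConcurrentDictionary<string, string>();
Parallel.ForEach(stations,
    new ParallelOptions { MaxDegreeOfParallelism = 4 }, // be conservative
    station =>
    {
        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("anonymous", "janeDoe@contoso.com");
            // each two-line file is small, so read it in one call
            results[station] = client.DownloadString(ftpAddress + station.ToUpper());
        }
    });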
Optimally, the files would be zipped (.ZIP or .TGZ) on the server. This will likely not be an option if you don't have more control over the process.
Use the MGET command to avoid re-establishing a connection each time. The System.Net client does not support MGET, so you would have to use a third party library or script ftp.exe. Regardless of the client you choose, the FTP log would look like the following:
USER anonymous
PASS janeDoe@contoso.com
CWD path/to/file
// to get 3 named files
MGET file1.txt file2.txt file3.txt
// or to get all files matching a pattern
MGET *.txt
The file transfers will use the same control session, avoiding login and other network overhead.
One library which could be of interest is FTPLib, which avoids tearing down the channel on each command. Be careful, however, as FTPLib is based on WinInet, which is not supported for use in an NT service.
I would also take a look at LumiSoft, an open source project with a friendly license, and DotNetFtpLib, though I have used neither and cannot speak to their stability or feature set. On the scripting side, take a look at "Using FTP Batch Scripts".
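If you go the scripting route, a hedged sketch: generate a command script and drive the stock Windows ftp.exe with it (the host, remote path, and file pattern here are placeholders):

// Sketch: script ftp.exe from C#. Host, path, and pattern are placeholders.
using System.Diagnostics;
using System.IO;

string[] commands =
{
    "anonymous",           // user name, read at the login prompt
    "janeDoe@contoso.com", // password
    "cd path/to/files",
    "prompt",              // turn off per-file confirmation so mget runs unattended
    "mget *.txt",
    "bye"
};
File.WriteAllLines("ftpscript.txt", commands);

using (var proc = Process.Start(new ProcessStartInfo("ftp", "-s:ftpscript.txt ftp.example.com")
{
    UseShellExecute = false
}))
{
    proc.WaitForExit();
}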
I want to write a file to a virtual directory path in the same cloud.
For writing files locally we use
File.WriteAllText(@"c:\temp\sample.text", text);
Similarly, I want to write to a network share like:
File.WriteAllText(@"\\10.11.144.29\e$\projects\Map.text", text);
And to a virtual directory location like:
File.WriteAllText("http://10.11.144.29/map/test.svg", text);
Is it possible to write to a URL location using C#? If so, what class can be used?
Any help will be appreciated.
WebClient client = new WebClient();
//client.Credentials = new NetworkCredential("username", "password");
// uploads the local file "test.svg" to the given URL
client.UploadFile("http://10.11.144.29/map/test.svg", "test.svg");
The last option isn't possible as-is: you need an HTTP PUT or POST to send data to a URL, using WebClient, HttpWebRequest, or similar (see the sketch below).
Examples #1 and #2 you gave should be just fine, though, provided the running code has sufficient permissions at the given network location (i.e. write access). Inside a web application you would resolve the physical path first, for example:
File.WriteAllText(Server.MapPath(@"~/App_Data/sample.text"), text)
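If the target server accepts uploads over HTTP (WebDAV or a custom handler; that part is an assumption), a minimal sketch could look like this, where svgContent is the text you want to write:

// Sketch: write text to a URL with an HTTP PUT.
// Assumes the server accepts PUT (e.g. WebDAV or a custom handler).
using System.Net;
using System.Text;

using (var client = new WebClient())
{
    client.Credentials = new NetworkCredential("username", "password");
    byte[] data = Encoding.UTF8.GetBytes(svgContent); // svgContent: the text to write
    client.UploadData("http://10.11.144.29/map/test.svg", "PUT", data);
}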
I need to download a zip file created in realtime from a webservice.
Let me explain.
I am developing a web application that uses a SOAP/XML webservice. The webservice has an Export function that returns a temporary URL for downloading the file. Upon a download request, the server creates the file and makes it available for download after a few seconds.
I'm trying to use
webClient.DownloadFile(url, @"c:/etc../")
This function downloads the file, but it saves a 0 KB file. It's too fast! The server does not have time to create the file. I also tried to put
webClient.OpenRead(url);
System.Threading.Thread.Sleep(7000);
webClient.DownloadFile(url, @"c:/etc../");
but it does not work.
In debug mode, if I put a BREAKPOINT on webClient.DownloadFile and resume after 3 or 4 seconds, the server has time to create the file and I get a full download.
The developers of the webservice suggested that I use "polling" on the URL until the file is ready for download. How does that work?
How can I solve my problem? (I also tried DownloadFile in asynchronous mode.)
I have a similar mechanism in my application, and it works perfectly. WebClient makes the request and waits, because the server is creating the response (the file). If WebClient downloads 0 KB, that means the server responded to the request with an empty response. This may not be a bug, but a design choice. If creating the file takes a long time, this approach could result in timeouts; on the other hand, if creating the file is quick, the server side should respond with the file (making WebClient wait on the request till the file is ready). I would try to discuss this matter with the other developers and maybe redesign the "file generator".
EDIT: Polling means making requests in a loop, for example every 2 seconds. I'm using DownloadData because it is useless, and resource-consuming, to save an empty file every time, which DownloadFile does.
public void PollAndDownloadFile(Uri uri, string filePath)
{
    using (var webClient = new WebClient())
    {
        // keep re-requesting until the server returns a non-empty body
        byte[] downloadedBytes = webClient.DownloadData(uri);
        while (downloadedBytes.Length == 0)
        {
            Thread.Sleep(2000);
            downloadedBytes = webClient.DownloadData(uri);
        }
        File.WriteAllBytes(filePath, downloadedBytes);
    }
}
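Usage would then be something along these lines (the URL and local path are placeholders):

PollAndDownloadFile(new Uri("http://example.com/export/archive.zip"), @"c:\temp\archive.zip");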
So, I'm actually trying to set up a WOPI host for a web project.
I've been working with this sample (the one from Shawn Cicoria, if anyone knows it), which provides a whole code sample showing how to build the links that let your Office Web Apps server open some files.
My problem here is that his sample works with files that are ON the OWA server, and I need it to work with online files (like http://myserv/res/test.docx). So where he reads his file content, he uses this:
var stream = new FileStream(myFile, FileMode.Open, FileAccess.Read);
responseMessage.Content = new StreamContent(stream);
But that doesn't work for "http" files, so I changed it to this:
byte[] tmp;
using (WebClient client = new WebClient())
{
client.Credentials = CredentialCache.DefaultNetworkCredentials;
tmp = client.DownloadData(name);
}
responseMessage.Content = new ByteArrayContent(tmp);
which DOES compile. And with this sample I managed to open Excel files in my Office Web Apps, but Word and PowerPoint files aren't opened. So, here's my question.
Is there a difference between these two methods which could alter the content of the files I'm reading, despite the fact that WebClient allows "online reading"?
Sorry for the unclear post; it's not that easy to explain such a problem. I did my best.
Thanks for your help!
Is there a difference between these two methods which could alter the content of the files I'm reading, despite the fact that WebClient allows "online reading"?
FileStream opens a handle to a file located on a local disk, or on a remote disk elsewhere on the network. When you open a FileStream, you're directly manipulating that particular file.
On the other hand, WebClient is a wrapper around the HTTP protocol. Its responsibility is to construct HTTP request and response messages, letting you work with them conveniently. It has no direct knowledge of resources such as files, or of where they're located; all it knows is how to construct a message complying with the specification, send a request, and expect a response.
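To make the distinction concrete, here is a small sketch of both approaches (the paths and URL are illustrative). When the HTTP route misbehaves for some file types, checking the response's content type and length before trusting the bytes is a reasonable first diagnostic:

using System;
using System.IO;
using System.Net;

// Local (or UNC) file: FileStream reads the file's bytes directly.
using (var stream = new FileStream(@"c:\files\test.docx", FileMode.Open, FileAccess.Read))
{
    // bytes come straight from the file on disk,
    // e.g. to wrap in StreamContent as the sample does
}

// HTTP resource: WebClient only ever sees an HTTP response body.
using (var client = new WebClient())
{
    byte[] tmp = client.DownloadData("http://myserv/res/test.docx");
    // If the server rewrote, compressed, or truncated the response,
    // tmp holds that body rather than the original file -- worth checking:
    Console.WriteLine(client.ResponseHeaders[HttpResponseHeader.ContentType]);
    Console.WriteLine(tmp.Length);
}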