I need to download a zip file created in realtime from a webservice.
Let me explain.
I am developing a web application that uses a SoapXml webservice. There is the Export function in the webservice that returns a temporary url to download the file. Upon request to download, the server creates the file and makes it available for download after a few seconds.
I'm trying to use
webClient.DownloadFile(url, @"c:/etc../");
This call downloads the file but saves it as 0 KB. It is too fast! The server does not have time to create the file. I also tried to put
webClient.OpenRead(url);
System.Threading.Thread.Sleep(7000);
webClient.DownloadFile(url, @"c:/etc../");
but it does not work.
In debug mode, if I put a breakpoint on webClient.DownloadFile and resume after 3 or 4 seconds, the server has time to create the file and I get a complete download.
The developers of the webservice suggested that I use "polling" on the URL until the file is ready for download. How does that work?
How can I solve my problem? (I also tried DownloadFile in asynchronous mode.)
I have a similar mechanism in my application, and it works perfectly. WebClient makes the request and waits, because the server is creating the response (the file). If WebClient downloads 0 KB, that means the server responded with an empty response. This may not be a bug, but a design choice. If creating the file takes a long time, this approach could result in timeouts. On the other hand, if creating the file is quick, the server side should respond with the file (making WebClient wait on the request until the file is ready). I would discuss this with the other developers and maybe redesign the "file generator".
EDIT: Polling means making requests in a loop, for example every 2 seconds. I'm using DownloadData because it's pointless, and resource consuming, to save an empty file on every attempt, which is what DownloadFile does.
public void PollAndDownloadFile(Uri uri, string filePath)
{
    WebClient webClient = new WebClient();
    // Keep requesting until the server responds with a non-empty body.
    byte[] downloadedBytes = webClient.DownloadData(uri);
    while (downloadedBytes.Length == 0)
    {
        Thread.Sleep(2000);
        downloadedBytes = webClient.DownloadData(uri);
    }
    // Only now is it worth writing anything to disk.
    using (Stream file = File.Open(filePath, FileMode.Create))
    {
        file.Write(downloadedBytes, 0, downloadedBytes.Length);
    }
}
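One caveat: the loop above will spin forever if the server never produces the file. A bounded variant with a retry cap is safer. This is only a sketch, not part of the original answer; the `PollForData` delegate seam and the attempt counts are my own, chosen so the retry logic can be exercised without a network:

```csharp
using System;
using System.IO;
using System.Net;
using System.Threading;

public static class Downloader
{
    // Polls via the fetch delegate until it returns a non-empty body,
    // sleeping between attempts, up to maxAttempts tries.
    public static byte[] PollForData(Func<byte[]> fetch, int maxAttempts, TimeSpan delay)
    {
        for (int attempt = 1; attempt <= maxAttempts; attempt++)
        {
            byte[] data = fetch();
            if (data != null && data.Length > 0)
                return data;
            if (attempt < maxAttempts)
                Thread.Sleep(delay);
        }
        throw new TimeoutException("File was not ready after " + maxAttempts + " attempts.");
    }

    // Wires the poll loop to a real WebClient and writes the result to disk.
    public static void PollAndDownloadFile(Uri uri, string filePath, int maxAttempts = 10)
    {
        using (var webClient = new WebClient())
        {
            byte[] bytes = PollForData(() => webClient.DownloadData(uri),
                                       maxAttempts, TimeSpan.FromSeconds(2));
            File.WriteAllBytes(filePath, bytes);
        }
    }
}
```

The delegate makes the retry policy independent of WebClient, so the same loop works with HttpWebRequest or anything else that returns bytes.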
Related
In my Windows application I am using WebClient DownloadFile method to download several PDF files from a server on local network.
Each file is a report that gets generated when its URL is called. Reports are of different sizes and take different periods of time to get generated.
My code loops through a list of URLs (about 400), and for each URL it calls DownloadFile method, and the corresponding report is generated and downloaded to local machine. URLs are definitely correct.
The problem is that almost every time the application runs, some of the downloaded files are damaged: only 7 KB is downloaded (I think it’s just metadata), and Acrobat Reader gives me a message when I try to open the file:
“…it’s either not a supported file type or because the file has been damaged…”
It’s not always the same files that get damaged; when I re-run the application, those files often succeed and others may fail. It seems random and I can’t figure out the criteria.
Note 1: I don’t want a file to start downloading until the preceding one has finished downloading completely; that’s why I am not using the asynchronous method.
Note 2: All files are Oracle reports and get generated by querying a database.
Note 3: No EXCEPTION is thrown in case of damaged files.
Here is my code :
using (WebClient client = new WebClient())
{
    for (int i = 0; i < URL_List.Length; i++)
    {
        try
        {
            client.DownloadFile(URL_List[i], myLocalPath + fileName + ".pdf");
        }
        catch (Exception x)
        {
            // write exception message to error log...
        }
    }
}
Thanks in advance.
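One hedged way to cope with randomly truncated reports like the ones described above is to validate each download and re-request on failure. This is a sketch, not the asker's code; the `MinPlausibleSize` threshold (based on the ~7 KB damaged files mentioned) and the retry count are assumptions:

```csharp
using System;
using System.Net;

public static class ReportDownloader
{
    // Anything smaller is assumed to be the truncated metadata response
    // described in the question (the 8 KB threshold is a guess).
    const int MinPlausibleSize = 8 * 1024;

    // Every valid PDF starts with the "%PDF" magic bytes.
    public static bool LooksLikePdf(byte[] data)
    {
        return data != null
            && data.Length >= MinPlausibleSize
            && data[0] == (byte)'%' && data[1] == (byte)'P'
            && data[2] == (byte)'D' && data[3] == (byte)'F';
    }

    // Re-requests a report until it validates, up to `retries` attempts.
    public static byte[] DownloadWithRetry(WebClient client, string url, int retries = 3)
    {
        for (int i = 0; i < retries; i++)
        {
            byte[] data = client.DownloadData(url);
            if (LooksLikePdf(data))
                return data;
        }
        throw new WebException("Report was still invalid after " + retries + " attempts: " + url);
    }
}
```

Using DownloadData instead of DownloadFile means nothing is written to disk until the bytes pass the check, so a damaged attempt never overwrites a good file.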
So I'm trying to download a file using the WebClient class, but the problem is that when the download finishes, the file that should have been downloaded is 0 bytes. I tried uploading the same file without an extension and then renaming it after download, but that didn't help. What can I do? This is the code I use:
WebClient updateDownloader = new WebClient();
updateDownloader.DownloadFile(new Uri("http://zazaia.ucoz.com/SomeExeFile.exe"),
Application.StartupPath + "\\SomeFile.EXE");
I also have a DownloadCompleted event handler, which just shows a MessageBox and disposes the WebClient.
There is nothing wrong with the code you have shown, and it should work. The problem is on the server, which is not returning the file properly. Also make sure the site you are querying doesn't require authentication before files can be downloaded. In addition, remember that WebClient will not execute any JavaScript, so if the server relies on it to serve the file, the download will not happen.
Have you checked that your antivirus is not interfering? Sometimes an automatic scan will lock an executable file being downloaded until it passes. The client code itself looks fine however.
What about the server side? If one of your own applications is serving the download, it may not be setting the MIME header, or may not be handling the download correctly at all.
Uploading a file to a web server with the help of a web service is easy. This is the way I generally do it; here is my sample code.
[WebMethod]
public bool UploadFile(string FileName, byte[] buffer, long Offset)
{
    bool retVal = false;
    try
    {
        // Build the save path on the server from the web.config setting.
        string FilePath =
            Path.Combine(ConfigurationManager.AppSettings["upload_path"], FileName);

        if (Offset == 0) // new file, create an empty file
            File.Create(FilePath).Close();

        // Open a file stream and write the buffer.
        // Don't open with FileMode.Append because the transfer may wish to
        // start at a different point.
        using (FileStream fs = new FileStream(FilePath, FileMode.Open,
            FileAccess.ReadWrite, FileShare.Read))
        {
            fs.Seek(Offset, SeekOrigin.Begin);
            fs.Write(buffer, 0, buffer.Length);
        }
        retVal = true;
    }
    catch (Exception ex)
    {
        // send the error to an email id
        common.SendError(ex);
    }
    return retVal;
}
But I want to develop a web service that reports upload progress as a percentage, and when the upload completes, fires an event on the client side with a status message saying whether the file was uploaded completely or not. I also need the routine to handle multiple simultaneous requests and to be thread safe. So please guide me on how to design a routine that satisfies all these requirements. Thanks.
I'd highly recommend that you forget about implementing this from scratch and instead look into one of the existing client file upload solutions that are available - most come with some boilerplate .NET code you can plug into an existing application.
I've used jQuery Upload and plUpload both of which have solid client side upload managers that upload files via HTTP Range headers and provide upload status information in the process. I believe both come with .NET examples.
Implementation of the server side for these types of upload handlers involves receiving HTTP chunks of data identified by a sort of upload session id. The client sends chunks of files, each tagged with this file-related id, along with some progress information like bytes transferred and total file size, and a status value that indicates the status of the request. The status lets you know when the file is completely uploaded.
The incoming data from the POST buffer can then be written to a file or into a database or some other storage mechanism based on the unique ID passed from the client. Because the client application is sending chunks of data to the server it can provide progress information. If you use a client library like plUpload or jQuery-Upload they'll provide the customizable UI.
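As a sketch of that server-side bookkeeping (names like `ChunkStore` and `WriteChunk` are illustrative, not part of plUpload or jQuery Upload; a real handler would write to disk or a database rather than a MemoryStream):

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;

// Tracks in-flight uploads by id: chunks arrive tagged with an upload id and
// an offset, and progress is bytes received over the announced total.
public class ChunkStore
{
    private readonly ConcurrentDictionary<string, MemoryStream> uploads =
        new ConcurrentDictionary<string, MemoryStream>();

    // Writes one chunk and returns progress in the range 0.0 - 1.0.
    public double WriteChunk(string uploadId, long offset, byte[] chunk, long totalSize)
    {
        MemoryStream ms = uploads.GetOrAdd(uploadId, _ => new MemoryStream());
        lock (ms) // chunks for one upload id may arrive on parallel requests
        {
            ms.Seek(offset, SeekOrigin.Begin);
            ms.Write(chunk, 0, chunk.Length);
            return (double)ms.Length / totalSize;
        }
    }

    // True once every byte of the announced total has been received.
    public bool IsComplete(string uploadId, long totalSize)
    {
        MemoryStream ms;
        return uploads.TryGetValue(uploadId, out ms) && ms.Length >= totalSize;
    }
}
```

The per-stream lock plus the ConcurrentDictionary is what makes the routine safe under the simultaneous requests the question asks about; the progress value returned from each write is what the client-side UI would display.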
I'm a newbie developing a Windows application. I need to download a video file from my site, and that's my issue here. I designed a custom downloader through which I can download images and text files from my site, but I wasn't able to download videos. Could anyone please help me out?
WebClient client = new WebClient();
client.DownloadProgressChanged += new DownloadProgressChangedEventHandler(client_DownloadProgressChanged);
client.DownloadFileCompleted += new AsyncCompletedEventHandler(client_DownloadFileCompleted);
client.DownloadFileAsync(new Uri("http://mysitename.com/Videos/vid.mp4"), "c:\\movie.mp4");
I don't want to download by means of response content dispatch, because my client wants me to download through a custom browser. So please let me know your solutions, experts. Thank you.
I have tried to download a video file with WebClient and it works. My setup is as below:
I have a virtual directory (Video) in the Default Web Site (IIS) which contains this video file.
I just use the below code to download the video file to C drive:
var client = new WebClient();
Uri address = new Uri("http://localhost/Video/wildlife.wmv");
client.DownloadFileAsync(address, @"c:\video.wmv");
Also note that since you are downloading asynchronously, wait about a minute for the operation to complete so the full file is downloaded. Initially it shows 0 bytes, but depending on the size it takes some time to finish.
UPDATE: If your server doesn't have the file's MIME type specified, just add it to the collection of MIME types that IIS can serve, and you can download the file without any problem.
When adding MIME type the following values to be used are (for your scenario):
File Extension: .mp4
MIME Type: video/mp4
To add mime types in IIS follow these links:
For IIS 4,5
For IIS 6
For IIS 7
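On IIS 7, for example, the mapping can also be added declaratively in web.config; a minimal sketch using the extension and type above (this fragment assumes the site runs in integrated pipeline mode):

```xml
<system.webServer>
  <staticContent>
    <mimeMap fileExtension=".mp4" mimeType="video/mp4" />
  </staticContent>
</system.webServer>
```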
This sounds more like a server issue, but if you are doubting your code, you may want to try downloading synchronously (I have had some issues in the past downloading async). Another option is to use the WebRequest class. If the server is very remote, try pinging it beforehand. You should also check that the file is actually on the server, and if the file is really big, check whether it has finished uploading.
In my scenario, users are able to upload zip files to a.example.com
I would love to create a "daemon" which, at specified time intervals, will move (transfer) any zip files uploaded by the users from a.example.com to b.example.com.
From the info I have gathered so far:
1. The daemon will be an .ashx generic handler.
2. The daemon will be triggered at the specified time intervals via a Plesk cron job.
3. The daemon (thanks to SLaks) will consist of two FtpWebRequests (one for reading and one for writing).
So the question is: how could I implement step 3?
Do I have to read the whole file into a memory array and try to write that to b.example.com?
How could I write the data I read to b.example.com?
Could I perform reading and writing of the file at the same time?
No, I am not asking for the full code; I just can't figure out how to perform reading and writing on the fly, without user interaction.
I mean, I could download the file locally from a.example.com and upload it to b.example.com, but that is not the point.
Here is another solution:
Let ASP.Net in server A receive the file as a regular file upload and store it in directory XXX
Have a Windows service on server A that checks directory XXX for new files.
Let the Windows service upload the file to server B using HttpWebRequest.
Let server B receive the file using a regular ASP.Net file upload page.
Links:
File upload example (ASP.Net): http://msdn.microsoft.com/en-us/library/aa479405.aspx
Building a windows service: http://www.codeproject.com/KB/system/WindowsService.aspx
Uploading files using HttpWebRequest: Upload files with HTTPWebrequest (multipart/form-data)
Problems you've got to solve:
How to determine which files to upload to server B. I would use Directory.GetFiles in a Timer to find new files instead of using a FileSystemWatcher. You need to be able to check whether a file has been uploaded previously (delete it, rename it, check a DB, or whatever suits your needs).
Authentication on server B, so that only you can upload files to it.
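A minimal sketch of the "which files are new" bookkeeping from the first point above (the in-memory set is illustrative; deleting, renaming, or a DB check works just as well, and `UploadScanner` is a made-up name):

```csharp
using System.Collections.Generic;
using System.IO;
using System.Linq;

// Remembers which files have already been handed off, so each timer tick
// reports only the files that appeared since the last scan.
public class UploadScanner
{
    private readonly HashSet<string> seen = new HashSet<string>();

    public IList<string> FindNewFiles(IEnumerable<string> currentFiles)
    {
        var fresh = currentFiles.Where(f => !seen.Contains(f)).ToList();
        foreach (var f in fresh)
            seen.Add(f);
        return fresh;
    }

    // In the Windows service this would run from a Timer tick:
    public IList<string> ScanDirectory(string path)
    {
        return FindNewFiles(Directory.GetFiles(path));
    }
}
```

Note this in-memory set is lost if the service restarts, which is exactly why the answer suggests delete/rename/DB as the durable alternatives.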
To answer your questions - yes you can read and write the files at the same time.
You can open an FtpWebRequest to server A and another to server B. On the request to server A you would request the file and get the ResponseStream. Once you have the ResponseStream, you would read a chunk of bytes at a time and write that chunk of bytes to the server B RequestStream.
The only memory you would be using would be the byte[] buffer in your read/write loop. Just keep in mind though that the underlying implementation of FTPWebRequest will download the complete FTP file before returning the response stream.
Similarly, you cannot send your FTPWebRequest to upload the new file until all bytes have been written. In effect, the operations will happen synchronously. You will call GetResponse which won't return until the full file is available, and only then can you 'upload' the new file.
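A sketch of that read/write loop; the `Relay` wiring below uses only documented FtpWebRequest calls, but the URLs and credentials are placeholders, and `CopyStream` is the only part that holds data in memory:

```csharp
using System.IO;
using System.Net;

public static class FtpRelay
{
    // The chunked copy loop: only `buffer` is ever held in memory.
    public static long CopyStream(Stream source, Stream destination, int bufferSize = 8192)
    {
        byte[] buffer = new byte[bufferSize];
        long total = 0;
        int read;
        while ((read = source.Read(buffer, 0, buffer.Length)) > 0)
        {
            destination.Write(buffer, 0, read);
            total += read;
        }
        return total;
    }

    // Hypothetical end-to-end wiring between the two servers.
    public static void Relay(string sourceUrl, string targetUrl, ICredentials credentials)
    {
        var download = (FtpWebRequest)WebRequest.Create(sourceUrl);
        download.Method = WebRequestMethods.Ftp.DownloadFile;
        download.Credentials = credentials;

        var upload = (FtpWebRequest)WebRequest.Create(targetUrl);
        upload.Method = WebRequestMethods.Ftp.UploadFile;
        upload.Credentials = credentials;

        using (var response = download.GetResponse())
        using (var source = response.GetResponseStream())
        using (var destination = upload.GetRequestStream())
        {
            CopyStream(source, destination);
        }
        upload.GetResponse().Close(); // completes the upload
    }
}
```

As noted above, GetResponse on the download side may not return until the full file is available, so despite the streaming loop the two transfers effectively run one after the other.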
References:
FTPWebRequest
Something you have to take into consideration is that a long-running web request (your .ashx generic handler) may be killed when the AppDomain recycles. Therefore you have to implement some sort of atomic transaction logic in your code, and you should handle sudden disconnects and incomplete FTP transfers if you go that way.
Did you have a look at Windows Azure before? This cloud platform supports distributed file system, and has built-in atomic transactions. Plus it scales nicely, should your service grow fast.
I would make it pretty simple. The client program uploads the file to server A. This can be done very easily in C# with an FtpWebRequest.
http://msdn.microsoft.com/en-us/library/ms229715.aspx
I would then have a service on server A that monitors the directory where files are uploaded. When a file is uploaded to that directory or on certain intervals it simply copies files over to server B. Again this can be done via Ftp or other means if they're on the same network.
You need some listener on the target domain (an FTP server running there), and on the client side you would use System.Net.WebClient with UploadFile or UploadFileAsync to send the file. Is that what you are asking?
It sounds like you don't really need a webservice or handler. All you need is a program that will, at regular intervals, open an FTP connection to the other server and move the files. This can be done by any .NET program with the System.Net.WebClient class; it doesn't have to be a "web app". This other program could be a service, which could handle its own timing, or a simple app run by your cron job. If you need this to go two ways, for instance if the two servers are mirrors, you simply have the same app on the second box doing the same thing to upload files over to the first.
If both machines are in the same domain, couldn't you just do file replication at the OS level?
DFS
Set up keys if you are using Linux-based systems:
http://compdottech.blogspot.com/2007/10/unix-login-without-password-setting.html
Once you have the keys working, you can copy the file from system A to system B with regular shell scripts that need no user interaction.