How do we know whether a file downloaded from FTP is corrupt? The FTP server does not provide checksums for its files; is there a checksum command? Are there any other ways of checking a downloaded file's integrity? Thanks.
Other than checking to see if the file will actually open in the target application or if it's missing an EOF marker or something like that, it's going to be tough to validate the contents of a downloaded file without some image or checksum to compare it to.
Perhaps a custom MD5 file-verification service would work: schedule a service to scan a selected repository and verify files against their MD5 signatures.
Generally this is done on downloads rather than on uploads, but you may be able to figure out a way to apply it to uploads as well.
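For what it's worth, here is a minimal C# sketch of such a check, assuming you can obtain the expected MD5 out of band (the hash value and file path below are placeholders):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

class Md5Check
{
    static void Main()
    {
        // Placeholder values: the expected hash would come from wherever
        // the file's publisher makes it available.
        string expectedMd5 = "9e107d9d372bb6826bd81d3542a419d6";
        string downloadedFile = @"C:\Downloads\file.zip";

        using (var stream = File.OpenRead(downloadedFile))
        using (var md5 = MD5.Create())
        {
            byte[] hash = md5.ComputeHash(stream);
            string actualMd5 = BitConverter.ToString(hash)
                .Replace("-", "").ToLowerInvariant();
            Console.WriteLine(actualMd5 == expectedMd5
                ? "File is intact."
                : "File is corrupt or incomplete.");
        }
    }
}
```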
I need to read and write (update) a file on a remote machine. I am able to find the remote file using WMI (System.Management), but I am not able to read or update it.
Any help would be appreciated.
Thanks
Himanshu
WMI doesn't have any class (or method) to read or write the contents of files. You can only retrieve file metadata (FileName, Date, Size) using CIM_DataFile, or perform tasks like Copy, Rename, Delete, or Compress on files.
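For illustration, a minimal sketch of reading that metadata with System.Management; the file path is a placeholder, and for a remote machine you would point a ManagementScope at it:

```csharp
using System;
using System.Management; // add a reference to System.Management.dll

class FileMetadata
{
    static void Main()
    {
        // WQL requires doubled backslashes in the path; this path is an example.
        var query = new ObjectQuery(
            @"SELECT * FROM CIM_DataFile WHERE Name = 'C:\\Temp\\report.txt'");
        using (var searcher = new ManagementObjectSearcher(query))
        {
            foreach (ManagementObject file in searcher.Get())
            {
                // Metadata only: WMI cannot read or write the file's contents.
                Console.WriteLine("Name: {0}", file["Name"]);
                Console.WriteLine("Size: {0} bytes", file["FileSize"]);
                Console.WriteLine("Last modified: {0}", file["LastModified"]);
            }
        }
    }
}
```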
RRUZ is correct: WMI cannot copy or create files over a network. This is because it would require credential "hopping":
http://msdn.microsoft.com/en-us/library/windows/desktop/aa389288%28v=vs.85%29.aspx
However, a workaround was recently created by Stackoverflow.com user Frank White in C#, and the WMI logic ports directly to VBS. Here's his solution:
WMI remote process to copy file
I ported it to a fully working VBScript:
https://stackoverflow.com/a/11948096/1569434
First, check your file-access permissions and set the user "Everyone" to Full Control, then try again.
In my scenario, users are able to upload zip files to a.example.com
I would love to create a "daemon" which, at specified time intervals, will move any zip files uploaded by the users from a.example.com to b.example.com.
From the info I have gathered so far:
1. The daemon will be an .ashx generic handler.
2. The daemon will be triggered at the specified time intervals via a Plesk cron job.
3. The daemon (thanks to SLaks) will consist of two FtpWebRequests (one for reading and one for writing).
So the question is: how could I implement step 3?
Do I have to read the whole file into an in-memory array and then try to write that to b.example.com?
How could I write the data I read to b.example.com?
Could I perform reading and writing of the file at the same time?
No, I am not asking for the full code; I just can't figure out how to perform the reading and writing on the fly, without user interaction.
I mean, I could download the file locally from a.example.com and then upload it to b.example.com, but that is not the point.
Here is another solution:
Let ASP.Net on server A receive the file as a regular file upload and store it in directory XXX.
Have a Windows service on server A that checks directory XXX for new files.
Let the Windows service upload the file to server B using HttpWebRequest.
Let server B receive the file using a regular ASP.Net file upload page.
Links:
File upload example (ASP.Net): http://msdn.microsoft.com/en-us/library/aa479405.aspx
Building a windows service: http://www.codeproject.com/KB/system/WindowsService.aspx
Uploading files using HttpWebRequest: Upload files with HTTPWebrequest (multipart/form-data)
Problems you have to solve:
How to determine which files to upload to server B. I would use Directory.GetFiles in a Timer to find new files instead of using a FileSystemWatcher (see the sketch after this list). You need to be able to check whether a file has been uploaded previously (delete it, rename it, check a DB, or whatever suits your needs).
Authentication on server B, so that only you can upload files to it.
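A rough sketch of that polling approach (the directory paths are placeholders, and the actual upload call is left as a stub):

```csharp
using System;
using System.IO;
using System.Timers;

// Polls the upload directory on a timer; in a real Windows service this
// logic would live in OnStart/OnStop rather than Main.
class UploadWatcher
{
    const string UploadDir = @"C:\Uploads";       // placeholder
    const string ProcessedDir = @"C:\Processed";  // placeholder

    static void Main()
    {
        var timer = new Timer(30000); // check every 30 seconds
        timer.Elapsed += (sender, e) => ScanForNewFiles();
        timer.Start();
        Console.ReadLine();
    }

    static void ScanForNewFiles()
    {
        foreach (string path in Directory.GetFiles(UploadDir, "*.zip"))
        {
            UploadToServerB(path);
            // Move the file out of the way so it is never uploaded twice.
            File.Move(path, Path.Combine(ProcessedDir, Path.GetFileName(path)));
        }
    }

    static void UploadToServerB(string path)
    {
        // Your HttpWebRequest (or FTP) upload logic goes here.
    }
}
```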
To answer your questions: yes, you can read and write the files at the same time.
You can open an FTPWebRequest to server A and an FTPWebRequest to server B. On the FTPWebRequest to server A you would request the file and get the ResponseStream. Once you have the ResponseStream, you would read a chunk of bytes at a time and write that chunk of bytes to the server B RequestStream.
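A minimal sketch of that loop; the hosts, paths, and credentials are placeholders:

```csharp
using System;
using System.IO;
using System.Net;

class FtpRelay
{
    static void Main()
    {
        // Request the file from server A.
        var download = (FtpWebRequest)WebRequest.Create(
            "ftp://a.example.com/uploads/file.zip");
        download.Method = WebRequestMethods.Ftp.DownloadFile;
        download.Credentials = new NetworkCredential("userA", "passA");

        // Prepare the upload to server B.
        var upload = (FtpWebRequest)WebRequest.Create(
            "ftp://b.example.com/uploads/file.zip");
        upload.Method = WebRequestMethods.Ftp.UploadFile;
        upload.Credentials = new NetworkCredential("userB", "passB");

        using (var response = (FtpWebResponse)download.GetResponse())
        using (Stream readStream = response.GetResponseStream())
        using (Stream writeStream = upload.GetRequestStream())
        {
            // The only memory used is this buffer.
            var buffer = new byte[8192];
            int read;
            while ((read = readStream.Read(buffer, 0, buffer.Length)) > 0)
            {
                writeStream.Write(buffer, 0, read);
            }
        }

        using (var uploadResponse = (FtpWebResponse)upload.GetResponse())
        {
            Console.WriteLine("Upload status: {0}", uploadResponse.StatusDescription);
        }
    }
}
```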
The only memory you would be using would be the byte[] buffer in your read/write loop. Just keep in mind though that the underlying implementation of FTPWebRequest will download the complete FTP file before returning the response stream.
Similarly, you cannot send your FTPWebRequest to upload the new file until all bytes have been written. In effect, the operations will happen synchronously. You will call GetResponse which won't return until the full file is available, and only then can you 'upload' the new file.
References:
FTPWebRequest
Something you have to take into consideration is that a long-running web request (your .ashx generic handler) may be killed when the AppDomain recycles. Therefore you have to implement some sort of atomic-transaction logic in your code, and you should handle sudden disconnects and incomplete FTP transfers if you go that way.
Did you have a look at Windows Azure before? This cloud platform supports distributed file system, and has built-in atomic transactions. Plus it scales nicely, should your service grow fast.
I would make it pretty simple. The client program uploads the file to server A. This can be done very easily in C# with an FtpWebRequest.
http://msdn.microsoft.com/en-us/library/ms229715.aspx
I would then have a service on server A that monitors the directory where files are uploaded. When a file is uploaded to that directory, or at certain intervals, it simply copies the files over to server B. Again, this can be done via FTP or other means if they're on the same network.
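A minimal sketch of that client-side upload; the URL, credentials, and local path are placeholders:

```csharp
using System;
using System.IO;
using System.Net;

class FtpUpload
{
    static void Main()
    {
        var request = (FtpWebRequest)WebRequest.Create(
            "ftp://a.example.com/uploads/file.zip"); // placeholder URL
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = new NetworkCredential("user", "pass");

        using (Stream fileStream = File.OpenRead(@"C:\local\file.zip"))
        using (Stream requestStream = request.GetRequestStream())
        {
            fileStream.CopyTo(requestStream);
        }

        using (var response = (FtpWebResponse)request.GetResponse())
        {
            Console.WriteLine(response.StatusDescription);
        }
    }
}
```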
You need some listener on the target domain (an FTP server running there), and on the client side you would use System.Net.WebClient with UploadFile or UploadFileAsync to send the file. Is that what you are asking?
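For example, a minimal sketch (the URL and credentials are placeholders):

```csharp
using System.Net;

class WebClientUpload
{
    static void Main()
    {
        using (var client = new WebClient())
        {
            client.Credentials = new NetworkCredential("user", "pass");
            // UploadFile works with ftp:// URIs as well as http:// ones.
            client.UploadFile("ftp://b.example.com/uploads/file.zip",
                              @"C:\Uploads\file.zip");
        }
    }
}
```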
It sounds like you don't really need a web service or handler. All you need is a program that will, at regular intervals, open an FTP connection to the other server and move the files. This can be done by any .NET program with the System.Net.WebClient class; it doesn't have to be a "web app". This other program could be a service, which could handle its own timing, or a simple app run by your cron job. If you need this to go two ways, for instance if the two servers are mirrors, you simply have the same app on the second box doing the same thing to upload files over to the first.
If both machines are in the same domain, couldn't you just do file replication at the OS level?
DFS
Set up SSH keys if you are using Linux-based systems:
http://compdottech.blogspot.com/2007/10/unix-login-without-password-setting.html
Once you have the keys working, you can copy the file from system A to system B with regular shell scripts that don't need any user interaction.
I'm trying to keep track of the email created after I "send" it using SmtpClient.Send.
I have configured my app.config with specifiedPickupDirectory so that the mail is written to a directory.
What I'd like to gain access to is the name of the file that was used, so that I can periodically check and make sure that my mail server has retrieved it and sent it along.
Any suggestions?
Perhaps try initially creating the file in a temporary directory. Make sure that it is the only file in the directory. Use Directory.GetFiles to find the file name and save it to a variable, then move the file to the real pickup directory.
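A rough sketch of that idea; it configures the pickup directory in code rather than app.config, and the real pickup directory path is a placeholder:

```csharp
using System;
using System.IO;
using System.Net.Mail;

class TrackedSend
{
    static void Main()
    {
        // Write the mail to a private temp directory first.
        string tempDir = Path.Combine(Path.GetTempPath(), Guid.NewGuid().ToString());
        Directory.CreateDirectory(tempDir);

        var client = new SmtpClient
        {
            DeliveryMethod = SmtpDeliveryMethod.SpecifiedPickupDirectory,
            PickupDirectoryLocation = tempDir
        };
        client.Send("from@example.com", "to@example.com", "Subject", "Body");

        // The temp directory now contains exactly one .eml file: the one just sent.
        string emlPath = Directory.GetFiles(tempDir, "*.eml")[0];
        string fileName = Path.GetFileName(emlPath);

        // Move it to the real pickup directory and keep the name for later checks.
        string pickupDir = @"C:\inetpub\mailroot\Pickup"; // placeholder
        File.Move(emlPath, Path.Combine(pickupDir, fileName));
        Console.WriteLine("Queued as {0}", fileName);
    }
}
```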
Please refer to question: Resume in upload file control
Now, to find a solution for the above question, I want to work on it and develop a user control which can resume an HTTP file upload.
For this, I'm going to create a temporary file on the server until the upload is complete. Once the upload is completely done, I will just save it to the desired location.
The procedure I have thought of is:
1. Create a temporary file with a unique name (maybe a GUID) on the server.
2. Read a chunk of the file and append it to this temp file on the server.
3. Repeat step 2 until EOF is reached.
4. If the connection is lost or the user clicks the pause button, writing stops and the temporary file is not deleted.
5. Clicking resume, or uploading the same file again, will check whether the file exists on the server and ask the user whether to resume or overwrite. (Not sure how to check if it's the same file. Also not sure how to stop sending the chunks from client to server.)
6. Clicking resume will start from where the upload left off and append to the file on the server. (Again, not sure how to do this; see the sketch after this list.)
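For what it's worth, a minimal sketch of the server-side append step as a generic handler; it assumes the chunk arrives as the raw request body and the temp-file name as a query parameter (both protocol details are assumptions):

```csharp
using System.IO;
using System.Web;

public class AppendChunkHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Path.GetFileName strips any directory components from the client value.
        string fileName = Path.GetFileName(context.Request["filename"] ?? "upload.tmp");
        string tempPath = context.Server.MapPath("~/App_Data/uploads/" + fileName);

        // FileMode.Append creates the file on the first chunk and
        // appends on every subsequent one.
        using (var output = new FileStream(tempPath, FileMode.Append, FileAccess.Write))
        {
            context.Request.InputStream.CopyTo(output);
        }
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
```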
Questions:
Are these steps correct to achieve the goal? Or need some modifications?
I'm not sure how to implement all these steps. :-( Need ideas, links...
Any help appreciated...
What you are trying is not easy, but it is doable. Follow these guidelines:
Write two functions on the client side using AJAX:
getUploadedPortionFromServer(filename) - This asks the server whether the file exists; it should get an XML response from the server with the following info:
boolean (exists/not exists), size (bytes), md5 (string)
The function then runs an MD5 over the same number of bytes of the local file.
If the MD5 matches, you can continue sending from the point where the upload stopped;
if it doesn't match, or size == 0, start over.
uploadFromBytes(x) - Depends on the first function.
On the server you have to write matching functions, which will check the needed state and send the response as XML to the client.
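A minimal sketch of the status function on the server side, as an ASP.NET generic handler; the temp-file location and the XML shape are assumptions:

```csharp
using System;
using System.IO;
using System.Security.Cryptography;
using System.Web;

public class UploadStatusHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        string fileName = Path.GetFileName(context.Request["filename"] ?? "");
        string tempPath = context.Server.MapPath("~/App_Data/uploads/" + fileName);

        context.Response.ContentType = "text/xml";
        if (!File.Exists(tempPath))
        {
            context.Response.Write(
                "<status><exists>false</exists><size>0</size></status>");
            return;
        }

        long size;
        byte[] hash;
        using (var stream = File.OpenRead(tempPath))
        using (var md5 = MD5.Create())
        {
            size = stream.Length;
            hash = md5.ComputeHash(stream);
        }

        string md5Hex = BitConverter.ToString(hash)
            .Replace("-", "").ToLowerInvariant();
        context.Response.Write(string.Format(
            "<status><exists>true</exists><size>{0}</size><md5>{1}</md5></status>",
            size, md5Hex));
    }

    public bool IsReusable
    {
        get { return true; }
    }
}
```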
In order to have distinct filenames, you should use some sort of user tagging.
Are users logged in to your server? If so, use a hash of their username appended to the filename; this will help you distinguish between the files.
If not, use a cookie.
I'm using the asp:FileUpload control to upload a file to the server. Nothing fancy there, just
FileUploadId.SaveAs(Server.MapPath("~/Uploads/" + FileUploadId.FileName));
File gets uploaded successfully, and everything is fine until I try to delete that file on the CLIENT. I get a good-old "File is being used by another person or program" message.
How do I make sure that file is not being accessed on the client after it's been uploaded?
EDIT
Deleting the file has nothing to do with the application. I'm just trying to delete the file manually since I don't need it any more.
EDIT2
Closing the browser fixed the problem... any ideas?
Since the problem happens both in IE and FF: could it be that the file is locked by some AntiVirus software?
The issue might be that the file is kept locked by the ASP.NET process even after uploading. Once you close IE, the process releases the file.
How are you trying to delete the file at the client? Unless you're hosting in WebBrowser, or using something like an ActiveX control, you only have javascript at the client - and that doesn't provide random file access.
So: what is the full setup here?
A thought: it may not be the file upload that is causing the problem. As the surrounding code isn't posted it's difficult to tell, but, for example, do you have a Zip manager object of some kind that you're not disposing of?