How to send folders in C# through TCP?

I'm having trouble finding a way to send whole folders over TCP. My initial idea was that the sender sends a string containing the path of a given file, like C://MyFolder/MySubFolder/MyFile, and the receiver creates the folders and subfolders. The sender then sends each file along with its directory path.
I think it goes without saying that this is not the best way of doing this. Is there a better approach?
EDIT:
Sorry if I was a little vague. I have a file transfer app that sends/receives files, and I want to add a way to send whole folders.

You need some sort of file transfer protocol for that (e.g. FTP). Use an easy-to-set-up C# FTP server library (e.g. this one: http://sourceforge.net/projects/csftpserver/) on the sending side and use FtpWebRequest on the client side to get the whole folder structure.
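A minimal sketch of the client side, assuming a placeholder host and credentials: FtpWebRequest can list a remote directory, and each entry can then be fetched with a DownloadFile request.

// Sketch only (assumed host/credentials): list a remote FTP directory with FtpWebRequest.
using System;
using System.IO;
using System.Net;

class FtpListingExample
{
    static void Main()
    {
        var request = (FtpWebRequest)WebRequest.Create("ftp://example.com/MyFolder/");
        request.Method = WebRequestMethods.Ftp.ListDirectory;   // ListDirectoryDetails gives sizes/flags too
        request.Credentials = new NetworkCredential("user", "password");

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
        {
            string line;
            while ((line = reader.ReadLine()) != null)
                Console.WriteLine(line);   // each entry can then be downloaded with WebRequestMethods.Ftp.DownloadFile
        }
    }
}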

Use a well-known archiving format (zip, rar, ...), transfer the archive, and then extract it at the peer side. This way you save:
Implementing an error-prone recursive pattern.
Your bandwidth.
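A minimal sketch of that approach, assuming .NET 4.5+ and placeholder paths and host: zip the folder with ZipFile.CreateFromDirectory, then stream the single archive over a TcpClient connection.

// Sketch only: zip a folder and stream the archive over TCP (paths and host are assumptions).
using System.IO;
using System.IO.Compression;   // reference System.IO.Compression.FileSystem
using System.Net.Sockets;

class SendZippedFolder
{
    static void Main()
    {
        string sourceFolder = @"C:\MyFolder";
        string zipPath = Path.Combine(Path.GetTempPath(), "MyFolder.zip");

        if (File.Exists(zipPath)) File.Delete(zipPath);
        ZipFile.CreateFromDirectory(sourceFolder, zipPath);

        using (var client = new TcpClient("receiver-host", 9000))
        using (NetworkStream network = client.GetStream())
        using (FileStream archive = File.OpenRead(zipPath))
        {
            archive.CopyTo(network);   // receiver saves the bytes and calls ZipFile.ExtractToDirectory
        }
    }
}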

Have you looked at existing protocols for this purpose? It seems you want to clone FTP, maybe with a streaming mechanism like tar in between.

If you consider zipping/compressing:
You could have a look at the GZipStream class for that.
http://www.geekpedia.com/tutorial190_Zipping-files-using-GZipStream.html
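Note that GZipStream compresses a single stream rather than a folder, so you would typically archive the folder first or compress one file at a time. A minimal sketch for one file (the file names are placeholders):

// Sketch: gzip one file with GZipStream (input/output names are assumptions).
using System.IO;
using System.IO.Compression;

class GzipOneFile
{
    static void Compress(string inputPath, string outputPath)
    {
        using (FileStream input = File.OpenRead(inputPath))
        using (FileStream output = File.Create(outputPath))
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            input.CopyTo(gzip);
        }
    }
}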

Related

Best practice to send an ASP.NET Core Web App project to someone?

I'm a newbie with ASP.NET projects and I am wondering how I can share my finished C# project with someone by email so they can run it (IIS Express) on their machine. It's for a job offer skill test.
Do I need to send every file and folder, or can I just send the source code?
The project folder is 6 MB, so I can't send it by email.
Sorry for my English, and thanks for the help!
A nice way to send it could be by uploading it to GitHub and sharing the repository link. Since it is for a job it will also showcase that you are Version Control savvy.
You can also include a README.md file there that describes the app.
I'm not sure this has anything to do with ASP.NET. It's just a question about how to transfer a large file, isn't it? If so, 6 MB isn't really that large for email nowadays. Most mail servers will handle that fine. But if email is a problem, put it on one of the myriad file-sharing platforms available, such as Dropbox.
As for what specific files you need to send: that would depend on the requirements the company gave you, but I'd imagine they certainly need to see your source code and likely want the entire thing to be runnable. So send whatever is needed so they can easily run it.
Have you tried zipping it up? Source code will generally zip very effectively to a much smaller file as it contains so much repetition.
Upload the zip file to your Google Drive and then send it as a Drive attachment (this is actually Google's recommendation), or just share it with them. I recommend that the attachment be as small as possible in case the user you're sending it to has a mail quota.

Transactional Write of multiple FileStreams in C#

I'm supposed to get multiple files with the same extension (for example, an archive that's split into several archives: rar, r00, r01, etc).
I'm trying to find a solution where if one of the file streams I got fails to be written, all the previously successful files that were created will be deleted.
Just like a transactional file stream writer.
I've bumped into .NET Transactional File Manager project, which seems just like what I need -- except it doesn't work with streams but with file paths.
At this point I only see two options:
Keep a list of successful file writes; if any one of them fails, go over the list and delete them all (see the sketch after this list).
Write all files to %TEMP% or somewhere similar via FileStream, and then, after all files have been written successfully, use the Transactional File Manager (mentioned above) to move the files to the desired location.
Note that I have to work with streams.
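A minimal sketch of option 1, with hypothetical destination paths and a stream-writing callback standing in for the real work: track each file that was written and delete all of them if any write fails.

// Sketch of option 1 (assumed paths/callback): roll back already-written files on the first failure.
using System;
using System.Collections.Generic;
using System.IO;

class AllOrNothingWriter
{
    public static void WriteAll(IEnumerable<string> destinationPaths, Action<string, FileStream> writeOne)
    {
        var written = new List<string>();
        try
        {
            foreach (string path in destinationPaths)
            {
                using (var stream = new FileStream(path, FileMode.Create, FileAccess.Write))
                {
                    writeOne(path, stream);   // copy the incoming stream for this file
                }
                written.Add(path);
            }
        }
        catch
        {
            foreach (string path in written)
            {
                try { File.Delete(path); } catch { /* best-effort cleanup */ }
            }
            throw;   // rethrow so the caller knows the batch failed
        }
    }
}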
Which of the two options is better in your opinion?
Is there any better recommendation or idea for doing this?
Thanks
Edit:
Another option I bumped into is using AlphaFS just like in the following example.
Any thoughts on this?
I ended up using AlphaFS.
Using it just like in this example.
It works perfectly and does exactly what I needed.
Thanks for all the comments.

Update file, not replace or overwrite

This is more of an exploratory question, because I am experimenting with this.
All over the internet I see how you can update a .txt file. Well, that is all well and good, but let's say I have a .docx, or even an .exe or a .dll file.
If we make a minor change to a file, do we really have to replace (overwrite) the whole file?
Is it possible to "update" the file so that we don't use too much data (over the internet)?
What I am trying to achieve is to create an FTP client with a FileSystemWatcher. This will monitor a certain folder on the computer. If anything changes in this folder (even subdirectories) then it uploads, deletes, renames, or changes the file. But at the moment I am wondering whether, if I have, let's say, a 20 MB .exe file or whatever, it is possible to change something in that .exe instead of just overwriting the whole thing... thus sparing some data cap.
In general, it's possible to update the remote file only partially, but not in your case.
What would work:
1) track the file change using a filesystem filter driver, which gives you information about what parts of the file have been updated.
2) use a protocol which allows partial upload or remote modification of the file (e.g. SFTP).
As for your scenario:
Step 1 is not possible with FileSystemWatcher.
Step 2 is not possible with the FTP protocol, which doesn't support modification of file blocks.
Since you are experimenting, I can provide some pointers, but I don't know for sure whether the operations below end up as in-place updates or full replacements in the underlying OS calls.
Have different cases for each file type. Try with basic types first: a txt file, then a binary file, etc.
You should keep an entire copy of the current file somewhere, since you "should" compare against the old file to know what changed.
Then, when a change is made to the file, compare it with the old file. For example, in a 1 MB text file where the change is only 1 KB, you will need to build a format like
[Text][Offset][Operation]
e.g. [Mrs.Y][40][Delete] then [Mr.X][40][Add]
Then your FTP client should be able to implement this format and make changes to the local copy on the client.
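A very rough sketch of that idea for a plain-text file, with a hypothetical method name: find the single changed region between the old and new versions and emit records in the [Text][Offset][Operation] format described above. Real delta transfer (especially for binary files) would need a proper algorithm such as rsync-style rolling checksums.

// Sketch only: naive single-region diff producing [Text][Offset][Operation] records (method name is an assumption).
using System.Collections.Generic;

class ChangeRecordBuilder
{
    public static IEnumerable<string> BuildChangeRecords(string oldText, string newText)
    {
        // Skip the common prefix.
        int start = 0;
        while (start < oldText.Length && start < newText.Length && oldText[start] == newText[start])
            start++;

        // Skip the common suffix.
        int oldEnd = oldText.Length, newEnd = newText.Length;
        while (oldEnd > start && newEnd > start && oldText[oldEnd - 1] == newText[newEnd - 1])
        {
            oldEnd--;
            newEnd--;
        }

        if (oldEnd > start)
            yield return $"[{oldText.Substring(start, oldEnd - start)}][{start}][Delete]";
        if (newEnd > start)
            yield return $"[{newText.Substring(start, newEnd - start)}][{start}][Add]";
    }
}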
No, it is not possible to only upload the changes to an .exe file; we have to overwrite it.
@Frederik - It would be possible if FTP supported updating a resource like HTTP's PUT command does. Try exploring that angle. Let us know if you find something.

FTP Several Files

I have what would seem like a common problem, but I cannot find an appropriate solution on any forums. I need to FTP an entire directory structure using .NET. I have found several code examples, all of which show how you can FTP a single file by creating an FtpWebRequest object. Unfortunately, there is no information on how to deal with several files. Do I simply create an FtpWebRequest object for every single file?
You can always launch a new Process that runs an external tool, for example WinSCP (an open-source FTP client):
https://winscp.net/eng/docs/start
Perhaps call the synchronize operation:
https://winscp.net/eng/docs/scriptcommand_synchronize
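A rough sketch of driving WinSCP's scripting from C#; the host, credentials, and paths are placeholders, and the exact command syntax should be checked against the WinSCP docs linked above.

// Sketch only (assumed host/credentials/paths): run WinSCP's synchronize command from a C# Process.
using System.Diagnostics;

class WinScpSyncExample
{
    static void Main()
    {
        var startInfo = new ProcessStartInfo
        {
            FileName = "winscp.com",
            Arguments = "/command " +
                        "\"open ftp://user:password@example.com/\" " +
                        "\"synchronize remote C:\\LocalFolder /remote/folder\" " +
                        "\"exit\"",
            UseShellExecute = false
        };

        using (Process process = Process.Start(startInfo))
        {
            process.WaitForExit();   // a non-zero exit code indicates a failed transfer
        }
    }
}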
If there's a shorter way, I don't know what it is.
I wrote my code to handle each file one by one. If you are dealing with entire directory structures, that would entail processing each directory one by one as well.
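If you stay with FtpWebRequest, the usual pattern is indeed one request per file, plus one MakeDirectory request per folder. A minimal recursive sketch, where the server URL, credentials, and local path are placeholders:

// Sketch only (assumed server/credentials/paths): upload a directory tree with one FtpWebRequest per item.
using System.IO;
using System.Net;

class FtpDirectoryUploader
{
    static readonly NetworkCredential Credentials = new NetworkCredential("user", "password");

    // Example call: UploadDirectory(@"C:\MyFolder", "ftp://example.com/MyFolder");
    static void UploadDirectory(string localDir, string remoteUri)
    {
        // Create the remote directory (ignore the error if it already exists).
        var mkdir = (FtpWebRequest)WebRequest.Create(remoteUri);
        mkdir.Method = WebRequestMethods.Ftp.MakeDirectory;
        mkdir.Credentials = Credentials;
        try { mkdir.GetResponse().Close(); } catch (WebException) { }

        foreach (string file in Directory.GetFiles(localDir))
            UploadFile(file, remoteUri + "/" + Path.GetFileName(file));

        foreach (string subDir in Directory.GetDirectories(localDir))
            UploadDirectory(subDir, remoteUri + "/" + Path.GetFileName(subDir));
    }

    static void UploadFile(string localPath, string remoteUri)
    {
        var request = (FtpWebRequest)WebRequest.Create(remoteUri);
        request.Method = WebRequestMethods.Ftp.UploadFile;
        request.Credentials = Credentials;

        using (FileStream source = File.OpenRead(localPath))
        using (Stream target = request.GetRequestStream())
        {
            source.CopyTo(target);
        }
    }
}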

Download 3000+ Images Using C#?

I have a list of around 3000 image URLs, which I need to download to my desktop.
I'm a web dev, so naturally I wrote a little ASP.NET C# download method to do this, but the obvious problem happened and the page timed out before I had downloaded hardly any of them.
Was wondering if anyone else knew of a good, quick and robust way of looping through all the image URLs and downloading them to a folder? Open to any suggestions, WinForms, batch file, although I'm a novice at both.
Any help greatly appreciated
What about wget? It can download a list of URLs specified in a file.
wget -i c:\list-of-urls.txt
Write a C# command-line application (or Winforms, if that's your inclination), and use the WebClient class to retrieve the files.
Here are some tutorials:
C# WebClient Tutorial
Using WebClient to Download a File
or, just Google C# WebClient.
You'll either need to provide a list of files to download and loop through the list, issuing a request for each file and saving the result, or issue a request for the index page, parse it using something like HTML Agility Pack to find all of the image tags, and then issue a request for each image, saving the result somewhere on your local drive.
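A rough sketch of the second approach, with a placeholder page URL and output folder, using HTML Agility Pack to collect the image URLs and WebClient to save each one:

// Sketch only (assumed page URL/output folder): scrape img tags with HTML Agility Pack, download with WebClient.
using System;
using System.IO;
using System.Net;
using HtmlAgilityPack;

class ImageScraper
{
    static void Main()
    {
        string pageUrl = "http://example.com/gallery.html";
        string outputFolder = @"C:\DownloadedImages";
        Directory.CreateDirectory(outputFolder);

        HtmlDocument doc = new HtmlWeb().Load(pageUrl);
        var imgNodes = doc.DocumentNode.SelectNodes("//img[@src]");
        if (imgNodes == null) return;   // no images found

        using (var client = new WebClient())
        {
            foreach (HtmlNode img in imgNodes)
            {
                string src = img.GetAttributeValue("src", "");
                Uri imageUri = new Uri(new Uri(pageUrl), src);   // resolve relative URLs

                string fileName = Path.GetFileName(imageUri.LocalPath);
                client.DownloadFile(imageUri, Path.Combine(outputFolder, fileName));
            }
        }
    }
}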
Edit
If you just want to do this once (as in, not as part of an application), mbeckish's answer makes the most sense.
You might want to use an existing download manager like Orbit, rather than writing your own program for the purpose. (blasphemy, I know)
I've been pretty happy with Orbit. It lets you import a list of downloads from a text file. It'll manage the connections, downloading portions of each file in parallel with multiple connections, to increase the speed of each download. It'll take care of retrying if connections time out, etc. It seems like you'd have to go to a lot of effort to build these kind of features from scratch.
If this is just a one-time job, then one easy solution would be to write a HTML page with img tags pointing to the URLs.
Then browse it with Firefox and use an extension to save all of the images to a folder.
Working on the assumption that this is a one-off, run-once project, and as you are a novice with other technologies, I would suggest the following:
Rather than try to download all 3000 images in one web request, do one image per request. When the image download is complete, redirect to the same page, passing the URL of the next image to get as a query string parameter. Download that one and then repeat until all images are downloaded.
Not what I would call a "production" solution, but if my assumption is correct it is a solution that will have you up and running in no time.
Another fairly simple solution would be to create a simple C# console application that uses WebClient to download each of the images. The following pseudo code should give you enough to get going:
List<string> imageUrls = new List<string>();
imageUrls.Add(..... your urls from wherever .....);

foreach (string imageUrl in imageUrls)
{
    using (WebClient client = new WebClient())
    {
        byte[] raw = client.DownloadData(imageUrl);
        // write raw to a file, e.g. File.WriteAllBytes(targetPath, raw)
    }
}
I've written a similar app in WinForms that loops through URLs in an Excel spreadsheet and downloads the image files. I think the problem you're having with implementing this as a web application is that the server will only allow the process to run for a short amount of time before the request from your browser times out. You could either increase this time in the web.config file (change the executionTimeout attribute of the httpRuntime element), or implement this functionality as a WinForms application where the long execution time won't be a problem. If this is more than a throw-away application and you decide to go the WinForms route, you may want to add a progress bar to indicate progress.
