How to know when a download is finished - C#

Hi, I'm creating an online shop. In this shop people buy files with a .zip extension. They pay with their credit cards or other methods, get a key, and download the product. How can I know when they have finished downloading the product?
Thanks

Unfortunately there is no really good way to do this, as some clients might not download the file in one go (e.g. download managers split the download into several parallel partial downloads).
Options are:
If it is very important to you that the file can only be downloaded once, you could simply not support resuming. Then you can log whether the file has been downloaded entirely (as soon as the last byte has been sent). This might work well if the download is small.
Otherwise you could offer some grace data (we usually allow clients to download up to 5 times the size of the real download) and log every download attempt.
You should NOT just count the bytes downloaded (because the download might be disrupted), and you should NOT just check whether every section has been downloaded at least once (also because the download might be disrupted).
Just to clarify: all of this means that you have to write your own download handler (file server).
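For illustration only, here is a minimal sketch of such a handler as an ASP.NET IHttpHandler that streams the file itself and only logs a completed download once the last byte has been written to the response. The file path, query-string key and logging call are placeholders, not anything from the question, and the handler deliberately does not support Range requests (resuming), matching the first option above.

using System.IO;
using System.Web;

// Sketch only: streams the .zip itself so the server knows exactly how much was sent.
public class ProductDownloadHandler : IHttpHandler
{
    public bool IsReusable { get { return true; } }

    public void ProcessRequest(HttpContext context)
    {
        // Placeholder: look the real file up from the buyer's purchase key instead.
        string path = context.Server.MapPath("~/App_Data/product.zip");

        context.Response.ContentType = "application/zip";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=product.zip");
        context.Response.AddHeader("Content-Length", new FileInfo(path).Length.ToString());

        byte[] buffer = new byte[64 * 1024];
        using (FileStream fs = File.OpenRead(path))
        {
            int read;
            while ((read = fs.Read(buffer, 0, buffer.Length)) > 0)
            {
                if (!context.Response.IsClientConnected)
                    return; // client went away: do not count this as a finished download

                context.Response.OutputStream.Write(buffer, 0, read);
                context.Response.Flush();
            }
        }

        // Only reached once the last byte has been sent.
        MarkDownloadComplete(context.Request.QueryString["key"]); // hypothetical logging call
    }

    private void MarkDownloadComplete(string purchaseKey)
    {
        // e.g. flag the order as downloaded in your database (left out here).
    }
}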

You can use a custom file server that works over either HTTP or FTP and have it send a notification once the client has received the last file fragment.
All other options are problematic; the client might download the file using a download manager, so you cannot even register for a browser event, if such an event existed.

A custom server application does indeed seem like a solution for this, or possibly some kind of scripting.
A normal HTTP server does not notify you when a connection ends, but if you generate the output in a CGI/PHP/ASP/* script, you can read the file in that scripting language and send it to the output yourself. When you reach the end of the file, you do the notification and then end the script.
Done that way, it will only detect fully downloaded files; if the connection gets interrupted half-way, the file will not be marked as downloaded.
A 'CGI script' can also be a compiled C program (or any other language, for that matter), that is, compiled code. A compiled program would give better performance than an interpreted script solution.

Related

Best practice to send an ASP.NET Core Web App project to someone?

I'm a newbie with ASP.NET projects and I am wondering how I can share my finished C# project with someone by email so they can run it (IIS Express) on their machine. It's for a job offer skill test.
Do I need to send every file and folder, or can I just send the source code?
The project folder is 6 MB, so I can't send it by email.
Sorry for my English and thanks for the help!
A nice way to send it could be by uploading it to GitHub and sharing the repository link. Since it is for a job, it will also show that you are version-control savvy.
You can also include a README.md file there that describes the app.
I'm not sure this has anything to do with ASP.NET. It's just a question about how to transfer a large file, isn't it? If so, 6 MB isn't really that large for email nowadays. Most mail servers will handle that fine. But if email is a problem, put it on one of the myriad file-sharing platforms available, such as Dropbox.
As for which specific files you need to send: that would depend on the requirements the company gave you, but I'd imagine they certainly need to see your source code and likely want the entire thing to be runnable. So send whatever is needed for them to run it easily.
Have you tried zipping it up? Source code will generally zip very effectively to a much smaller file, as it contains so much repetition.
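If you would rather script that zipping step than do it by hand, the built-in ZipFile class can produce the archive; the paths below are placeholders, and you would normally delete the bin and obj folders first so only source gets included.

using System.IO.Compression; // ZipFile lives in System.IO.Compression.FileSystem on .NET Framework

class ZipProject
{
    static void Main()
    {
        // Placeholder paths -- point these at your own project folder.
        ZipFile.CreateFromDirectory(@"C:\projects\MyWebApp", @"C:\projects\MyWebApp.zip");
    }
}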
Upload the zip file to your Google Drive and then send it as a Drive attachment (this is actually Google's recommendation), or simply share it with them from there. I recommend that the attachment be as small as possible in case the user you're sending it to has a mail quota.

ASP.NET: Create a Torrent from a File

Our current software updates are hosted on our server.
We'd like to offer torrents as an alternative download option from our server. When new releases are published, this should give people better download speeds, provided others seed it.
I've figured out everything except how to create a Torrent file automatically (we'd rather not have to create it manually each time).
Does anyone know how we can create a torrent file from a specified exe file?
Thanks!
MonoTorrent seems like it might be of help. I've previously compiled and run it under .NET, so no worries there.
This looks like the relevant wiki page.
As usual, it's probably best to check license compatibility before integrating with your product, but it looks quite permissive.
Of course, you'll need to host the torrent to ensure at least a single seed!
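If you'd rather not take on a library dependency at all, a single-file .torrent is just a bencoded dictionary: an announce URL plus an info dictionary containing the file name, length, piece length and the concatenated SHA-1 hashes of each piece. Here is a rough sketch that writes one by hand; the paths and tracker URL are placeholders.

using System.IO;
using System.Security.Cryptography;
using System.Text;

class TorrentWriter
{
    const int PieceLength = 256 * 1024; // 256 KiB pieces; adjust to taste

    static void Main()
    {
        // Placeholder paths and tracker URL -- replace with your own.
        CreateSingleFileTorrent(@"C:\releases\Setup.exe",
                                @"C:\releases\Setup.exe.torrent",
                                "http://tracker.example.com/announce");
    }

    static void CreateSingleFileTorrent(string filePath, string torrentPath, string announceUrl)
    {
        byte[] pieceHashes = HashPieces(filePath);
        long length = new FileInfo(filePath).Length;
        string name = Path.GetFileName(filePath);

        using (var w = new BinaryWriter(File.Create(torrentPath)))
        {
            // Top-level dictionary; keys must appear in sorted order.
            Raw(w, "d");
            Str(w, "announce"); Str(w, announceUrl);
            Str(w, "info");
            Raw(w, "d");
            Str(w, "length"); Int(w, length);
            Str(w, "name"); Str(w, name);
            Str(w, "piece length"); Int(w, PieceLength);
            Str(w, "pieces"); Bytes(w, pieceHashes);
            Raw(w, "e"); // end of info dictionary
            Raw(w, "e"); // end of top-level dictionary
        }
    }

    // Concatenated SHA-1 hashes, one per fixed-size piece of the file.
    static byte[] HashPieces(string filePath)
    {
        using (var sha1 = SHA1.Create())
        using (var input = File.OpenRead(filePath))
        using (var hashes = new MemoryStream())
        {
            var piece = new byte[PieceLength];
            int filled;
            while ((filled = FillBuffer(input, piece)) > 0)
                hashes.Write(sha1.ComputeHash(piece, 0, filled), 0, 20);
            return hashes.ToArray();
        }
    }

    // Reads until the buffer is full or the stream ends.
    static int FillBuffer(Stream s, byte[] buffer)
    {
        int total = 0, read;
        while (total < buffer.Length && (read = s.Read(buffer, total, buffer.Length - total)) > 0)
            total += read;
        return total;
    }

    // Bencoding: strings are "<length>:<bytes>", integers are "i<value>e".
    static void Str(BinaryWriter w, string s) { Bytes(w, Encoding.UTF8.GetBytes(s)); }
    static void Bytes(BinaryWriter w, byte[] b) { Raw(w, b.Length + ":"); w.Write(b); }
    static void Int(BinaryWriter w, long v) { Raw(w, "i" + v + "e"); }
    static void Raw(BinaryWriter w, string s) { w.Write(Encoding.ASCII.GetBytes(s)); }
}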

Download 3000+ Images Using C#?

I have a list of around 3000 image URLs that I need to download to my desktop.
I'm a web dev, so I naturally wrote a little ASP.NET C# download method to do this, but the obvious problem happened and the page timed out before I had downloaded hardly any of them.
Does anyone know of a good, quick and robust way of looping through all the image URLs and downloading them to a folder? I'm open to any suggestions (WinForms, a batch file), although I'm a novice at both.
Any help greatly appreciated.
What about wget? It can download a list of URLs specified in a file.
wget -i c:\list-of-urls.txt
Write a C# command-line application (or Winforms, if that's your inclination), and use the WebClient class to retrieve the files.
Here are some tutorials:
C# WebClient Tutorial
Using WebClient to Download a File
or, just Google C# WebClient.
You'll either need to provide a list of files to download and loop through it, issuing a request for each file and saving the result, or issue a request for the index page, parse it using something like HTML Agility Pack to find all of the image tags, and then issue a request for each image, saving the result somewhere on your local drive.
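To illustrate the second approach, here is a rough sketch using HTML Agility Pack; the index URL and target folder are made-up placeholders.

using System;
using System.IO;
using System.Net;
using HtmlAgilityPack;

class ImageScraper
{
    static void Main()
    {
        // Placeholder URL and folder -- substitute your own.
        string indexUrl = "http://example.com/gallery/";
        string targetFolder = @"C:\images";
        Directory.CreateDirectory(targetFolder);

        // Load the index page and pull out every <img src="...">.
        var doc = new HtmlWeb().Load(indexUrl);
        var imgNodes = doc.DocumentNode.SelectNodes("//img[@src]");
        if (imgNodes == null) return;

        using (var client = new WebClient())
        {
            foreach (var img in imgNodes)
            {
                // Resolve relative src attributes against the index URL.
                var imageUri = new Uri(new Uri(indexUrl), img.GetAttributeValue("src", ""));
                string fileName = Path.GetFileName(imageUri.LocalPath);
                client.DownloadFile(imageUri, Path.Combine(targetFolder, fileName));
            }
        }
    }
}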
Edit
If you just want to do this once (as in, not as part of an application), mbeckish's answer makes the most sense.
You might want to use an existing download manager like Orbit, rather than writing your own program for the purpose. (blasphemy, I know)
I've been pretty happy with Orbit. It lets you import a list of downloads from a text file. It'll manage the connections, downloading portions of each file in parallel with multiple connections to increase the speed of each download, and it'll take care of retrying if connections time out, etc. It seems like you'd have to go to a lot of effort to build that kind of feature set from scratch.
If this is just a one-time job, then one easy solution would be to write an HTML page with img tags pointing to the URLs.
Then browse it with Firefox and use an extension to save all of the images to a folder.
Working on the assumption that this is a one-off, run-once project, and as you are a novice with the other technologies, I would suggest the following:
Rather than try to download all 3000 images in one web request, download one image per request. When the image download is complete, redirect to the same page, passing the URL of the next image to fetch as a query string parameter. Download that one and then repeat until all images are downloaded.
Not what I would call a "production" solution, but if my assumption is correct it is a solution that will have you up and running in no time.
Another fairly simple solution would be to create a simple C# console application that uses WebClient to download each of the images. The following pseudocode should give you enough to get going:
using System;
using System.Collections.Generic;
using System.IO;
using System.Net;

List<string> imageUrls = new List<string>();
// imageUrls.Add(..... your urls from wherever .....);
foreach (string imageUrl in imageUrls)
{
    using (WebClient client = new WebClient())
    {
        byte[] raw = client.DownloadData(imageUrl);
        // Write the raw bytes out, named after the last URL segment (target folder is a placeholder).
        string fileName = Path.GetFileName(new Uri(imageUrl).LocalPath);
        File.WriteAllBytes(Path.Combine(@"C:\images", fileName), raw);
    }
}
I've written a similar app in WinForms that loops through URLs in an Excel spreadsheet and downloads the image files. I think the problem you're having with implementing this as a web application is that the server will only allow the process to run for a short amount of time before the request from your browser times out. You could either increase this time in the web.config file (change the executionTimeout attribute of the httpRuntime element), or implement this functionality as a WinForms application where the long execution time won't be a problem. If this is more than a throw-away application and you decide to go the WinForms route, you may want to add a progress bar to indicate how far along the downloads are.
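For reference, the timeout tweak mentioned above lives in web.config; the value is in seconds, the 3600 below is only an example, and executionTimeout is only enforced when debug compilation is turned off.

<system.web>
  <!-- Example only: allow requests to run for up to an hour. -->
  <!-- executionTimeout is ignored while <compilation debug="true" /> is set. -->
  <httpRuntime executionTimeout="3600" />
</system.web>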

C#: Programmatically apply merge/patch to file?

I have a program that requires a few large (~4 or 5 MB) files. Once a week, every week, there are new versions of these files with minor changes, mostly just a few lines added or removed.
When the program starts, if there's an Internet connection, I'd like the program to update these files automatically. Instead of downloading the entire new versions of the files, I'd like to download just a patch, based on the client's version of the files, that updates them.
How might I do this?
I have total control over the server.
That is a tough problem to solve if you don't have any prior knowledge of what is in the file, or if the server doesn't have a facility that allows you to request differences. Any program you write that has no way to determine the differences without looking at both the old and the new file will have to download the new file anyway.
C# doesn't have any built-in facility to do this, but it sounds like your requirements aren't complicated. Look at how diff and ed on Unix can be used to patch a text file based on an easy-to-grok delta. Of course you should check the resulting file against a hash and fall back to a full download if it isn't correct.
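As a small illustration of that last step, here is a sketch that verifies a patched file against a SHA-256 hash published by the server and falls back to downloading the whole file if it doesn't match; the URL and paths are made up.

using System;
using System.IO;
using System.Net;
using System.Security.Cryptography;

class PatchVerifier
{
    // Apply your patching step elsewhere, then verify the result against the hash the
    // server publishes; fall back to a full download if it doesn't match.
    static void VerifyOrRedownload(string localFile, string expectedSha256Hex, string fullDownloadUrl)
    {
        if (!HashMatches(localFile, expectedSha256Hex))
        {
            using (var client = new WebClient())
                client.DownloadFile(fullDownloadUrl, localFile); // fallback: fetch the whole file
        }
    }

    static bool HashMatches(string path, string expectedHex)
    {
        using (var sha256 = SHA256.Create())
        using (var stream = File.OpenRead(path))
        {
            string actualHex = BitConverter.ToString(sha256.ComputeHash(stream)).Replace("-", "");
            return actualHex.Equals(expectedHex, StringComparison.OrdinalIgnoreCase);
        }
    }
}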

Best way to write a polled FTP download in C#

I currently have a manual process where we upload a text file to a business partner. They have an automated process which reads in the file, processes it, and then generates a 'results' log file anywhere from 3-10 minutes (typically) after the initial upload. I need to automate this process via a .NET application.
I already have the upload completed; what I do not have is the download of the result. Since I don't know exactly when the file will be ready to download, I figure that I must need to poll the remote site every so often, get a listing of the files in the results directory and see if one matches what I am expecting.
I have done some reading and found some references to AsyncCallback, but I'm not really sure how to proceed with it. The solution has to be something I can manage without any third-party libraries outside of .NET, since I have a budget of 0 for this little project.
Any help would be greatly appreciated!
Just have a thread (or your main thread) sleep for x milliseconds and attempt the download when it's not sleeping. There's no need to buy a third-party FTP library; FTP support is built into .NET (FtpWebRequest and FtpWebResponse). It isn't very good (very bare bones) but will probably do for what you want.
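A rough sketch of such a polling loop using FtpWebRequest; the host, credentials, expected file name, local folder and polling interval are all placeholders.

using System;
using System.IO;
using System.Net;
using System.Threading;

class ResultPoller
{
    static void Main()
    {
        // Placeholder connection details -- substitute your partner's FTP site.
        string resultsDir = "ftp://ftp.example.com/results/";
        var credentials = new NetworkCredential("user", "password");
        string expectedFile = "results.log"; // whatever name you expect to appear

        while (true)
        {
            if (ListDirectory(resultsDir, credentials).Contains(expectedFile))
            {
                Download(resultsDir + expectedFile, credentials, @"C:\incoming\" + expectedFile);
                break;
            }
            Thread.Sleep(TimeSpan.FromMinutes(1)); // poll once a minute
        }
    }

    static string ListDirectory(string url, NetworkCredential credentials)
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Ftp.ListDirectory;
        request.Credentials = credentials;
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var reader = new StreamReader(response.GetResponseStream()))
            return reader.ReadToEnd();
    }

    static void Download(string url, NetworkCredential credentials, string localPath)
    {
        var request = (FtpWebRequest)WebRequest.Create(url);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.Credentials = credentials;
        using (var response = (FtpWebResponse)request.GetResponse())
        using (var ftpStream = response.GetResponseStream())
        using (var file = File.Create(localPath))
            ftpStream.CopyTo(file);
    }
}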
