I am uploading large files, about 4-5 GB, to the server. Currently I use HTML5 and JavaScript (var reader = new FileReader();) to upload the file payload chunk by chunk to an action method, which writes the chunks to a file.
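The receiving side is basically an action method that appends each incoming chunk to the target file, roughly like this (a simplified sketch, not my exact code; the controller, action, and parameter names are just illustrative):

using System.IO;
using System.Web;
using System.Web.Mvc;

public class UploadController : Controller
{
    // Receives one chunk per request and appends it to the file being assembled on disk.
    [HttpPost]
    public ActionResult UploadChunk(string fileName, HttpPostedFileBase chunk)
    {
        var safeName = Path.GetFileName(fileName); // avoid path traversal
        var path = Path.Combine(Server.MapPath("~/App_Data/Uploads"), safeName);

        using (var target = new FileStream(path, FileMode.Append, FileAccess.Write))
        {
            chunk.InputStream.CopyTo(target);
        }

        return Json(new { done = true });
    }
}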
The problem with the current approach is that if the user navigates to another tab in the application, or accidentally closes the browser, the upload stops, because the chunks are read by JavaScript on that page. My requirement is essentially "upload and forget": some background task should finish the upload for you.
I need some background thread that handles the file upload and uploads the file in the background, like a file upload queue, so that if the user navigates to another page the upload won't stop and will continue in the background.
Is this possible in a web application? I did my research and found that spawning threads on IIS is not a good idea and that a Windows service should be used instead, but I don't know how to do this.
I am not a pro so any idea/example will be helpful. Thanks.
I'm trying to download a file from FTP using JavaScript, for which I created the following topic:
Is it possible to download file from FTP using Javascript?
From there I learned that I can use window.open('ftp://xyz.org/file.zip'); to download the file. It opens a new browser window, but the window closes immediately.
How can I force it to stay open?
Actually, I do all of this in a Silverlight application.
Here is the code:
HtmlPage.Window.Eval("window.open('" + url + "', 'Download', 'height=500,width=800,top=10,left=10');");
I also tried this,
string targetFeatures = "height=500,width=800,top=10,left=10";
HtmlPage.Window.Navigate(new Uri(url), "_blank", targetFeatures);
But both give the same result: they open a window and close it immediately. I see it for just a fraction of a second!
I know this doesn't answer your question, and I'm sure you know all of this. I'm answering more because I don't see this point brought up often. :)
Silverlight has very limited support for client interactions. JavaScript is a shim that, in my opinion, gets overused to try and bypass things that Silverlight was architected against. It would have been very easy for Microsoft to include FTP support in Silverlight, but it was excluded for a reason.
However, Silverlight has great support for webservice interactions. So the recommended way of getting a file would be to call a webservice that would do the FTP transfer for you and then send the contents down to the Silverlight application via the webservice. Possibly even processing it on the webservice side for any business logic etc.
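A rough sketch of what that server-side piece could look like (the service and method names are assumptions, and returning a byte[] is just the simplest way to illustrate it; for large files you would stream instead):

using System.IO;
using System.Net;
using System.Web.Services;

public class FileService : WebService
{
    // Does the FTP transfer on the server and hands the bytes down to the
    // Silverlight client. Names and the byte[] return type are illustrative.
    [WebMethod]
    public byte[] GetFile(string ftpUrl)
    {
        var request = (FtpWebRequest)WebRequest.Create(ftpUrl);
        request.Method = WebRequestMethods.Ftp.DownloadFile;

        using (var response = (FtpWebResponse)request.GetResponse())
        using (var ftpStream = response.GetResponseStream())
        using (var buffer = new MemoryStream())
        {
            ftpStream.CopyTo(buffer);
            return buffer.ToArray();
        }
    }
}

The Silverlight client would then call this through the usual generated service proxy instead of touching FTP itself.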
Like I said, I suspect your requirement is to not use a webservice (most likely to pass the bandwidth cost onto the user). But it'd be interesting to know more about your business problem instead of your technical problem for the solution you've chosen.
It closes because it triggers a file download. You can open two windows - one for a message and one to download the file, but I think the user will know it is downloading...
If I were you, I'd open up a page that has whatever visual/UI stuff you'd want to show the user, and either have a META tag that redirects to the download URL, or has a javascript blurb to fire off said download. That way, your window will stay open, but the download will still start automatically.
To keep it open, use
var test = window.open();
test.location = 'ftp://openbsd.org.ar/pub/OpenBSD/2.0/arc/kernels/bsd.ecoff';
and to not open any window use
window.location = 'ftp://openbsd.org.ar/pub/OpenBSD/2.0/arc/kernels/bsd.ecoff';
or make a normal link
Remember that a browser is not meant to "display" (visually, anyway) the FTP protocol, and not all browsers will support it. If you want to allow the user to download something, consider using the normal http:// protocol, and opening a window normally as others have suggested.
If you really need the download to be hosted via FTP, consider having your backend ingest (and cache) the file and return it to the user via HTTP.
There is nothing to be parsed on the browser's side, hence it closes. If you want to keep the page open, you'll have to do something dirty, like creating an HTML (or PHP) page that serves the content you want the user to see, with a hidden iframe that requests the FTP contents.
This way your user will see the content you want them to see, and the file is being downloaded.
I had the exact same problem: Silverlight opening a new window to download a file would briefly flash a blank window, which would then disappear without the file download occurring.
This seemed to happen in IE 8 (not 9 and up) and could be fixed by going into Tools->Internet Options->Security, clicking Custom level... (for whatever zone your site is in), going to Downloads->Automatic prompting for file downloads, and making sure this is Enabled (I also have File download enabled below that). This Automatic prompting for file downloads setting seems to be absent from IE 9+.
Another workaround is to not open in a new window, if the target url immediately downloads a file it won't change the current window so there's no difference in UX:
HtmlPage.Window.Navigate(new Uri("/download.ashx?fileid=12345", UriKind.Relative));
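download.ashx is just the example URL from above; a minimal sketch of what such a handler might do (how the file is located from fileid is an assumption):

using System.Web;

public class DownloadHandler : IHttpHandler
{
    public void ProcessRequest(HttpContext context)
    {
        // Locate the file from the fileid query-string value (illustrative lookup).
        string fileId = context.Request.QueryString["fileid"];
        string path = context.Server.MapPath("~/App_Data/Files/" + fileId + ".zip");

        // Serving the response as an attachment makes the browser download the
        // file without navigating the current window anywhere.
        context.Response.ContentType = "application/octet-stream";
        context.Response.AddHeader("Content-Disposition", "attachment; filename=" + fileId + ".zip");
        context.Response.WriteFile(path);
    }

    public bool IsReusable { get { return false; } }
}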
I am creating a web browser using C#, and I need to get specific data from the web pages that are loaded in my browser.
The pages I am loading are download scripts. The data I want to get is the number of times the file has been downloaded.
I want to save this value to a text file.
What code can I use for this, or where can I start? Any help is appreciated.
Most web browsers have their own storage. Mozilla uses SQLite for some things.
Whenever your app/browser needs to retrieve a remote resource (URL of any kind), simply log it to a database table.
Perhaps use SQLite yourself for this. A decent start would be to create a history table like this (see the code sketch after the column list):
URL --varchar(max)
LastAccessed --datetime
TotalRequests --int
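A minimal sketch of creating and updating such a table with the System.Data.SQLite library (the library choice, database file name, and schema details are assumptions):

using System;
using System.Data.SQLite;

public static class History
{
    const string ConnStr = "Data Source=history.db;Version=3;";

    // Records one request for a URL: creates the table on first use, inserts
    // the URL if it is new, then bumps its access time and counter.
    public static void Log(string url)
    {
        using (var conn = new SQLiteConnection(ConnStr))
        {
            conn.Open();

            var sql =
                "CREATE TABLE IF NOT EXISTS History (" +
                "  URL TEXT PRIMARY KEY, LastAccessed TEXT, TotalRequests INTEGER);" +
                "INSERT OR IGNORE INTO History (URL, LastAccessed, TotalRequests) VALUES (@url, @now, 0);" +
                "UPDATE History SET LastAccessed = @now, TotalRequests = TotalRequests + 1 WHERE URL = @url;";

            using (var cmd = new SQLiteCommand(sql, conn))
            {
                cmd.Parameters.AddWithValue("@url", url);
                cmd.Parameters.AddWithValue("@now", DateTime.UtcNow.ToString("o"));
                cmd.ExecuteNonQuery();
            }
        }
    }
}

History.Log(url) could then be called wherever the browser fetches a remote resource, which covers the "log it to a database table" step above.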
If it is a file that the users will be downloading, you could add a global static int and increment it every time the file is downloaded.
I have a webapp that dynamically generates a file and stores it on the server. When a user comes along, the file is served and then deleted. Each user gets their own generated file, which becomes useless afterwards.
Is there a way to know whether the user managed to download the file fully, so that it can be deleted without clogging up my server with outdated files?
At the moment the problem is that a user clicks "Download file", the download is cancelled or fails, and when they try again the file has already been deleted.
There's no way to tell for certain whether the file was downloaded completely - even if you counted the bytes that the browser downloaded, what if their browser downloaded every byte of it but then crashed, for instance?
A common solution to this problem is to leave the file there for a predetermined period, then delete it. You tell the user "Your download will be available for the next 48 hours."
(The exact example above happened to me with Amazon's MP3 Store. I had to contact customer services for them to re-enable the download, which to their credit they did very quickly, but getting Customer Services involved for something like that is expensive.)
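A minimal sketch of the cleanup half of that approach (the folder and the 48-hour window are assumptions; run it on a schedule, e.g. from a timer or a scheduled task):

using System;
using System.IO;

public static class DownloadCleanup
{
    // Deletes generated files once they are older than the advertised window,
    // whether or not the download ever completed.
    public static void DeleteExpired(string folder)
    {
        var cutoff = DateTime.UtcNow.AddHours(-48);

        foreach (var file in Directory.GetFiles(folder))
        {
            if (File.GetCreationTimeUtc(file) < cutoff)
                File.Delete(file);
        }
    }
}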
Have the user click a "Continue" button after they download the file, and delete the file at that point.
I'm having an issue within my application Pelotonics. When a user downloads a file, the system seems to block all incoming requests until that file is done downloading. What is the proper technique to open a download dialog box (the standard one from the browser), let the user start downloading the file, and then, while the file is downloading, let the user continue throughout the application?
The way we're getting the file from the server is that we have a separate ASPX page that gets passed a value through the query string and then retrieves the file's stream from the server. I then add the "content-disposition" header to the Response, loop through the file's stream, and write 2 KB chunks to Response.OutputStream. Once that's done I call Response.End.
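Roughly what that page's code-behind looks like, reconstructed from the description above (the query-string key and the way the file path is resolved are assumptions):

using System;
using System.IO;
using System.Web.UI;

public partial class Download : Page
{
    protected void Page_Load(object sender, EventArgs e)
    {
        // The file to send is identified by a query-string value (key is illustrative).
        string fileId = Request.QueryString["id"];
        string path = Server.MapPath("~/App_Data/Files/" + fileId);

        Response.AddHeader("content-disposition", "attachment; filename=" + fileId);

        // Read the file in 2 KB chunks and write each one to the response stream.
        var buffer = new byte[2048];
        using (var stream = File.OpenRead(path))
        {
            int read;
            while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
            {
                Response.OutputStream.Write(buffer, 0, read);
            }
        }

        Response.End();
    }
}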
Watch this for a quick screencast on the issue:
http://www.screencast.com/users/PeloCast/folders/Jing/media/8bb4b1dd-ac66-4f84-a1a3-7fc64cd650c0
by the way, we're in ASP.NET and C#...
Thanks!!!
Daniel
I think ASP.NET allows only one simultaneous page execution per session, and I'm not aware of any way to configure this otherwise.
This is not a very pretty workaround, but it might help if you rewrote the ASP.NET_SESSIONID value in the request cookie in Application_BeginRequest (in global.asax). Of course, you would then need to do the authentication some other way. I haven't tried this, though.
Another way would be to launch a separate thread for the download process, but you would need to find a way to do this without the worker thread closing its resources.
May I ask, is there a reason why you don't just use HttpResponse.TransmitFile?
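For reference, that would replace the manual 2 KB loop with something like this (sketch; the query-string key and path lookup are illustrative):

// Sketch: TransmitFile streams the file to the client without buffering it
// in server memory, replacing the manual chunk read/write loop.
string fileName = Request.QueryString["id"];
Response.AddHeader("content-disposition", "attachment; filename=" + fileName);
Response.TransmitFile(Server.MapPath("~/App_Data/Files/" + fileName));
Response.End();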