A client has a page which, when called, starts a long-running process and intermittently writes out its progress as it goes, in the format:
[dd/MM/yyyy hh:mm:ss] - Process Started
[dd/MM/yyyy hh:mm:ss] - CSV Imported
[dd/MM/yyyy hh:mm:ss] - Process 10% complete
Then 30 seconds later it might write out:
[dd/MM/yyyy hh:mm:ss] - User x Created
[dd/MM/yyyy hh:mm:ss] - User y Created
[dd/MM/yyyy hh:mm:ss] - Process 20% complete
etc... It takes 10-20 minutes to run, and we don't have access to the code for this page.
What I have been asked to do is to call this page from one of our other applications, consume the output, and give a real-time update on our dashboard.
My first thought was to use an HttpClient call to .GetStreamAsync() and have a loop reading the stream intermittently and reporting back on the last thing that was written out.
This was my first attempt:
using (HttpClient httpClient = new HttpClient())
{
    httpClient.Timeout = TimeSpan.FromMilliseconds(Timeout.Infinite);
    var requestUri = "http://localhost:64501/Page1.aspx";
    var stream = httpClient.GetStreamAsync(requestUri).Result;
    using (var reader = new StreamReader(stream))
    {
        while (!reader.EndOfStream)
        {
            var currentLine = reader.ReadLine();
            Thread.Sleep(1000);
        }
    }
}
However
var currentLine = reader.ReadLine();
appears to block and wait for the response to complete before returning anything.
I need to be able to read the stream as it comes in. Is this possible?
The problem lies in ReadLine: the server may not be sending newlines at all (which makes sense, since the output appears to be destined for a web page, where newlines are ignored), so you need to read chunks of data and convert them to strings:
using (HttpClient httpClient = new HttpClient())
{
    httpClient.Timeout = TimeSpan.FromMilliseconds(Timeout.Infinite);
    var requestUri = "http://localhost:64501/Page1.aspx";
    var stream = httpClient.GetStreamAsync(requestUri).Result;
    string read = "";
    byte[] buffer = new byte[1024];
    int bytesRead;
    // Stream has no EndOfStream property; Read returns 0 at the end of the stream.
    while ((bytesRead = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        read += Encoding.UTF8.GetString(buffer, 0, bytesRead);
        // Do whatever you need to do with the string.
    }
}
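If you would rather not block a thread while waiting for data, a fully asynchronous variant is also possible. This is a minimal sketch (assuming the same local URL as above); HttpCompletionOption.ResponseHeadersRead makes HttpClient return as soon as the response headers arrive instead of buffering the whole body:
using System;
using System.Net.Http;
using System.Text;
using System.Threading.Tasks;

class ProgressReader
{
    static async Task Main()
    {
        using (var httpClient = new HttpClient { Timeout = System.Threading.Timeout.InfiniteTimeSpan })
        {
            // Same local URL as in the question.
            var requestUri = "http://localhost:64501/Page1.aspx";
            // ResponseHeadersRead hands back control once headers arrive, leaving the body to stream.
            using (var response = await httpClient.GetAsync(requestUri, HttpCompletionOption.ResponseHeadersRead))
            using (var stream = await response.Content.ReadAsStreamAsync())
            {
                var buffer = new byte[1024];
                int bytesRead;
                // ReadAsync completes as soon as any bytes are available.
                while ((bytesRead = await stream.ReadAsync(buffer, 0, buffer.Length)) > 0)
                {
                    var chunk = Encoding.UTF8.GetString(buffer, 0, bytesRead);
                    Console.Write(chunk); // surface progress to the dashboard here
                }
            }
        }
    }
}
Each progress line can then be pushed to the dashboard as it arrives, without tying up a thread in Sleep calls.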
I'm trying to upload a large video (1 GB+) from my Xamarin app, and it keeps crashing once it reaches about 0.5 GB of the file. The only way I've found to post the videos to my WCF service while sending data along with them is the multipart logic, but I'm not sure if I'm running out of memory or what, because even in debug mode it simply crashes without any real error message.
I'm trying to run it on a native device (not a sim) and it's a Samsung Galaxy S9 with Android 9.
Here's the upload code that I'm using. (P.S. As a test, I tried putting the WriteAsync into a for loop, thinking that maybe trying to write the whole gig at once was the problem, but the result was the same. That's why you'll see the MAXFILESIZEPART constant in there, which is just an int equal to 10000000.)
private async Task<byte[]> GetMultipartFormDataAsync(Dictionary<string, object> postParameters, string boundary)
{
    try
    {
        using (Stream formDataStream = new System.IO.MemoryStream())
        {
            bool needsCLRF = false;
            foreach (var param in postParameters)
            {
                // Thanks to feedback from commenters, add a CRLF to allow multiple parameters to be added.
                // Skip it on the first parameter, add it to subsequent parameters.
                if (needsCLRF)
                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes("\r\n"), 0, Encoding.UTF8.GetByteCount("\r\n"));
                needsCLRF = true;
                if (param.Value is FileParameter)
                {
                    FileParameter fileToUpload = (FileParameter)param.Value;
                    // Add just the first part of this param, since we will write the file data directly to the Stream
                    string header = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"; filename=\"{2}\"\r\nContent-Type: {3}\r\n\r\n",
                        boundary,
                        param.Key,
                        fileToUpload.FileName ?? param.Key,
                        fileToUpload.ContentType ?? "application/octet-stream");
                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(header), 0, Encoding.UTF8.GetByteCount(header));
                    // Write the file data directly to the Stream, rather than serializing it to a string.
                    if (fileToUpload.File.Length > MAXFILESIZEPART)
                    {
                        for (var i = 0; i < fileToUpload.File.Length; i += MAXFILESIZEPART)
                        {
                            var len = i + MAXFILESIZEPART > fileToUpload.File.Length
                                ? fileToUpload.File.Length - i
                                : MAXFILESIZEPART;
                            await formDataStream.WriteAsync(fileToUpload.File, i, len);
                        }
                    }
                    else
                    {
                        await formDataStream.WriteAsync(fileToUpload.File, 0, fileToUpload.File.Length);
                    }
                }
                else
                {
                    string postData = string.Format("--{0}\r\nContent-Disposition: form-data; name=\"{1}\"\r\n\r\n{2}",
                        boundary,
                        param.Key,
                        param.Value);
                    await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(postData), 0, Encoding.UTF8.GetByteCount(postData));
                }
            }
            // Add the end of the request. Start with a newline
            string footer = "\r\n--" + boundary + "--\r\n";
            await formDataStream.WriteAsync(Encoding.UTF8.GetBytes(footer), 0, Encoding.UTF8.GetByteCount(footer));
            // Dump the Stream into a byte[]
            formDataStream.Position = 0;
            byte[] formData = new byte[formDataStream.Length];
            formDataStream.Read(formData, 0, formData.Length);
            return formData;
        }
    }
    catch (Exception e)
    {
        Console.WriteLine(e);
        throw;
    }
}
And it's eventually failing on the following line
await formDataStream.WriteAsync(fileToUpload.File, i, len);
but only after a certain point (about 500 MB), so I'm assuming it's a memory issue, even though no such error is reported. Is there a better way to accomplish this task? I'm doing it this way so that it also records the progress as the upload happens. I'm trying to accomplish something similar to uploading large videos via the Facebook app, so that the upload continues in the background while you keep working. It works great with smaller files (i.e. < 500 MB), but this is the first time I've tried a file almost a gig in size.
NOTE: This happens BEFORE it starts posting anything to the server so it's not IIS or WCF related. This code crashes just writing the bytes to the memory stream.
Any suggestions?
Thanks!
According to your description, the upload stops at a certain point. Because the file you are transferring is about 1 GB, it is likely a SendTimeout issue: if the transfer does not complete within the specified time, an exception is thrown. SendTimeout specifies how long a write operation has to complete before timing out; the default value is 1 minute.
As a test, I set SendTimeout to 15 seconds in my configuration file: if sending the data takes more than 15 seconds, an exception occurs. You can set it to a higher value to avoid the timeout and the exception.
For more information about SendTimeout, please refer to the following link:
https://learn.microsoft.com/en-us/dotnet/api/system.servicemodel.channels.binding.sendtimeout?view=dotnet-plat-ext-3.1
UPDATE
On reflection, it might be a memory overflow problem: a large file buffered entirely in memory can exhaust the available memory before anything is even sent.
You can refer to the following link for solutions:
https://learn.microsoft.com/en-us/archive/blogs/johan/are-you-getting-outofmemoryexceptions-when-uploading-large-files
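Since the crash happens while the entire body is being assembled in a MemoryStream, another option is to avoid buffering altogether and stream the file from disk as it is sent. The following is a minimal sketch, not the original poster's code: it assumes an HttpClient-based upload is acceptable on the service side, and the URL and form field name are placeholders:
using System.IO;
using System.Net.Http;
using System.Net.Http.Headers;
using System.Threading.Tasks;

public static async Task UploadVideoAsync(string filePath)
{
    using (var client = new HttpClient())
    using (var fileStream = File.OpenRead(filePath))
    using (var content = new MultipartFormDataContent())
    {
        // StreamContent reads from the FileStream while the request is sent,
        // so the 1 GB file is never held in memory all at once.
        var filePart = new StreamContent(fileStream);
        filePart.Headers.ContentType = new MediaTypeHeaderValue("application/octet-stream");
        content.Add(filePart, "file", Path.GetFileName(filePath)); // "file" is a placeholder field name

        var response = await client.PostAsync("https://example.com/upload", content); // placeholder URL
        response.EnsureSuccessStatusCode();
    }
}
Progress reporting would then hang off the FileStream position as it is consumed, rather than a manual chunk-writing loop.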
I'm currently facing an issue while creating a file: I'm trying to write text content using the StreamWriter class, but I'm not getting the expected output.
My C# code looks like this:
public void ProcessRequest(HttpContext context)
{
    // Create a connection to the remote server to redirect all requests
    RemoteServer server = new RemoteServer(context);
    // Create a request with the same data as the navigator request
    HttpWebRequest request = server.GetRequest();
    // Send the request to the remote server and return the response
    HttpWebResponse response = server.GetResponse(request);
    context.Response.AddHeader("Content-Disposition", "attachment; filename=playlist.m3u8");
    context.Response.ContentType = response.ContentType;
    Stream receiveStream = response.GetResponseStream();
    var buff = new byte[1024];
    int bytes = 0;
    string token = Guid.NewGuid().ToString();
    while ((bytes = receiveStream.Read(buff, 0, 1024)) > 0)
    {
        // Write the stream directly to the client
        context.Response.OutputStream.Write(buff, 0, bytes);
        context.Response.Write("&token=" + token);
    }
    // Close streams
    response.Close();
    context.Response.End();
}
The output of the above code looks like:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=20776,CODECS="avc1.66.41",RESOLUTION=320x240
chunk.m3u8?nimblesessionid=62
&token=42712adc-f932-43c7-b282-69cf349941da
But my expected output is:
#EXTM3U
#EXT-X-VERSION:3
#EXT-X-STREAM-INF:BANDWIDTH=20776,CODECS="avc1.66.41",RESOLUTION=320x240
chunk.m3u8?nimblesessionid=62&token=42712adc-f932-43c7-b282-69cf349941da
I just want the token param on the same line instead of on a new line.
Thank you.
If you want to simply remove a newline at the end of the received bytes, change the code in your while loop like so:
while ((bytes = receiveStream.Read(buff, 0, 1024)) > 0)
{
    if (buff[bytes - 1] == 0x0a)
        bytes -= 1;
    // Write the stream directly to the client
    context.Response.OutputStream.Write(buff, 0, bytes);
    context.Response.Write("&token=" + token);
}
Several caveats:
It will only work if 0x0a (newline byte, '\n' as a character) is at the end of the bytes you received. If for some reason the message sent by the server is received in several blocks, you will first have to make sure you received everything there is to receive before checking the last byte.
Please also note that this would result in multiple &token=... lines in your current code.
Depending on the server, it might use carriage return (0x0d or '\r') as its line ending byte, or even both. Check what the server sends and adapt the code accordingly.
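A more defensive variant, sketched below under the assumption that receiveStream, context, and token come from the question's code, is to buffer the whole upstream response first and trim any trailing CR/LF bytes once before appending the token. This sidesteps both the multi-block caveat and the line-ending caveat above:
// Buffer the whole upstream response, then trim trailing CR/LF once.
using (var memory = new MemoryStream())
{
    receiveStream.CopyTo(memory);
    byte[] data = memory.ToArray();
    int length = data.Length;
    // Strip any trailing '\n' (0x0a) and '\r' (0x0d) bytes.
    while (length > 0 && (data[length - 1] == 0x0a || data[length - 1] == 0x0d))
        length--;
    context.Response.OutputStream.Write(data, 0, length);
    context.Response.Write("&token=" + token); // appended exactly once
}
The trade-off is that the playlist is no longer streamed through chunk by chunk, which is fine for a small .m3u8 file but would not suit large payloads.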
I'm currently developing for an environment that has poor network connectivity. My application helps to automatically download required Google Drive files for users. It works reasonably well for small files (ranging from 40KB to 2MB), but fails far too often for larger files (9MB). I know these file sizes might seem small, but in terms of my client's network environment, Google Drive API constantly fails with the 9MB file.
I've concluded that I need to download files in smaller byte chunks, but I don't see how I can do that with Google Drive API. I've read this over and over again, and I've tried the following code:
// with the Drive File ID, and the appropriate export MIME type, I create the export request
var request = DriveService.Files.Export(fileId, exportMimeType);
// take the message so I can modify it by hand
var message = request.CreateRequest();
var client = request.Service.HttpClient;
// I change the Range headers of both the client, and message
client.DefaultRequestHeaders.Range =
    message.Headers.Range =
        new System.Net.Http.Headers.RangeHeaderValue(100, 200);
var response = await request.Service.HttpClient.SendAsync(message);
// if status code = 200, copy to local file
if (response.IsSuccessStatusCode)
{
    using (var fileStream = new FileStream(downloadFileName, FileMode.CreateNew, FileAccess.ReadWrite))
    {
        await response.Content.CopyToAsync(fileStream);
    }
}
The resultant local file (from fileStream), however, is still full-length (i.e. a 40KB file for the 40KB Drive file, and a 500 Internal Server Error for the 9MB file). On a side note, I've also experimented with ExportRequest.MediaDownloader.ChunkSize, but from what I observe it only changes the frequency at which the ExportRequest.MediaDownloader.ProgressChanged callback is called (i.e. the callback triggers every 256KB if ChunkSize is set to 256 * 1024).
How can I proceed?
You seem to be heading in the right direction. From your last comment, the request updates progress based on the chunk size, so your observation was accurate.
Looking into the source code for MediaDownloader in the SDK, the following was found (emphasis mine):
The core download logic. We download the media and write it to an output stream ChunkSize bytes at a time, raising the ProgressChanged event after each chunk. The chunking behavior is largely a historical artifact: a previous implementation issued multiple web requests, each for ChunkSize bytes. Now we do everything in one request, but the API and client-visible behavior are retained for compatibility.
Your example code will only download one chunk, from byte 100 to 200. Using that approach, you would have to keep track of an index and download each chunk manually, copying each partial download to the file stream:
const int KB = 0x400;
int ChunkSize = 256 * KB; // 256KB

public async Task ExportFileAsync(string downloadFileName, string fileId, string exportMimeType) {
    var exportRequest = driveService.Files.Export(fileId, exportMimeType);
    var client = exportRequest.Service.HttpClient;
    // you would need to know the file size
    var size = await GetFileSize(fileId);
    using (var file = new FileStream(downloadFileName, FileMode.CreateNew, FileAccess.ReadWrite)) {
        file.SetLength(size);
        var chunks = (size / ChunkSize) + 1;
        for (long index = 0; index < chunks; index++) {
            var request = exportRequest.CreateRequest();
            var from = index * ChunkSize;
            var to = from + ChunkSize - 1;
            request.Headers.Range = new RangeHeaderValue(from, to);
            var response = await client.SendAsync(request);
            if (response.StatusCode == HttpStatusCode.PartialContent || response.IsSuccessStatusCode) {
                using (var stream = await response.Content.ReadAsStreamAsync()) {
                    file.Seek(from, SeekOrigin.Begin);
                    await stream.CopyToAsync(file);
                }
            }
        }
    }
}

private async Task<long> GetFileSize(string fileId) {
    var file = await driveService.Files.Get(fileId).ExecuteAsync();
    // In the Drive v3 client library the size property is a nullable long named Size.
    return file.Size ?? 0;
}
This code makes some assumptions about the Drive API/server:
That the server will allow the multiple requests needed to download the file in chunks. I don't know if requests are throttled.
That the server still accepts the Range header, as stated in the developer documentation.
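If you want to guard against that second assumption failing, one small addition inside the download loop (a sketch, not tested against the Drive API) is to detect when the server ignored the Range header and returned the whole file with 200 OK instead of 206 Partial Content:
// Hypothetical guard inside the for loop: a 200 OK without a Content-Range
// header means the server sent the full file rather than the requested chunk.
if (response.StatusCode == HttpStatusCode.OK && response.Content.Headers.ContentRange == null) {
    using (var stream = await response.Content.ReadAsStreamAsync()) {
        file.Seek(0, SeekOrigin.Begin);
        await stream.CopyToAsync(file);
    }
    break; // the whole file has been written; no more chunks needed
}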
I am using a simple function:
while (br.Read(downbuffer, 0, downbuffer.Length) > 0)
{
    // Write the data in downbuffer to a file.
}
While it's working well for a file (tested multiple times on a zip file download: http://citylan.dl.sourceforge.net/project/cric-scoreslive/v8.5-Live%20Cricket%20Scores%20Desktop%20App.zip), it's not working when the link refers to an HTML file (tested on http://www.mediafire.com/?rc3kj22p1tb4vi9). With the latter link, it downloads and writes only about 1 KB of data, while the page is about 60 KB.
If it has something to do with the stream not being flushed or anything like that, I wonder how it works for the file downloads. The relevant code is this (running in a separate thread):
wreq = (HttpWebRequest)WebRequest.Create(uri);
wres = wreq.GetResponse();
fs = new FileStream(totalpath, FileMode.Create);
bw = new BinaryWriter(fs);
br = new BinaryReader(wres.GetResponseStream());
while ((onecount = br.Read(downbuffer, 0, downbuffer.Length)) > 0)
{
    bw.Write(downbuffer, 0, onecount);
    totalcount += onecount;
}
totalpath leads to a file, nothing special.
downbuffer's size is 20 KB; my internet speed is about 60 kBps (512 kbps).
You could solve the whole problem by avoiding it:
using (var client = new WebClient())
{
    client.DownloadFile(uri, totalpath);
}
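If the manual read loop existed mainly for progress reporting, WebClient can provide that through events as well. A minimal sketch, assuming uri and totalpath from the question and a context where await is available:
using (var client = new WebClient())
{
    client.DownloadProgressChanged += (s, e) =>
        Console.WriteLine(e.ProgressPercentage + "% (" + e.BytesReceived + " bytes)");
    // DownloadFileTaskAsync raises DownloadProgressChanged as data arrives.
    await client.DownloadFileTaskAsync(new Uri(uri), totalpath);
}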
I have a big problem dealing with data I try to download in my application over the internet via HttpWebResponse. My code looks like this:
myWebRequest.Timeout = 10000;
using (HttpWebResponse myWebResponse = (HttpWebResponse)myWebRequest.GetResponse())
{
    using (Stream ReceiveStream = myWebResponse.GetResponseStream())
    {
        Encoding encode = Encoding.GetEncoding("utf-8");
        StreamReader readStream = new StreamReader(ReceiveStream, encode);
        // Read 1024 characters at a time.
        Char[] read = new Char[1024];
        int count = readStream.Read(read, 0, 1024);
        int break_counter = 0;
        while (count > 0 && break_counter < 10000)
        {
            String str = new String(read, 0, count);
            buffer += str;
            count = readStream.Read(read, 0, 1024);
            break_counter++;
        }
    }
}
This code runs in a few instances in separate threads, so it's a little hard to debug. The problem is that this method gets stuck, and I blame it on the poor connection to the data.
As you can see, I already set a timeout and was hoping the code would just terminate after the timeout expired. It does not! At least not all the time: sometimes I get a WebException/Timeout, but a few times it just gets stuck.
What is a timeout exactly? When is it raised?
Let's say the HttpWebResponse starts to receive data but gets stuck somewhere in the middle of the transmission. Do I get a timeout? It looks like I don't, because my application gets stuck too and no timeout exception is raised.
What can I do to patch this up, or how can I get more information about what is going wrong here?
Try setting the HttpWebRequest.ReadWriteTimeout property:
The number of milliseconds before the writing or reading times out. The default value is 300,000 milliseconds (5 minutes).
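Note that HttpWebRequest.Timeout only covers GetResponse() and GetRequestStream(); once you are reading the response stream, ReadWriteTimeout is what applies. A minimal sketch, assuming the myWebRequest from the question:
// Timeout covers obtaining the response; ReadWriteTimeout covers each
// subsequent Read()/Write() call on the response stream.
myWebRequest.Timeout = 10000;          // 10 seconds to get the response
myWebRequest.ReadWriteTimeout = 30000; // 30 seconds per stream read/write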