Can anyone help with a small problem I am having? I have a WCF REST-based service with a method that accepts a stream; this will be used for uploading images/audio/video to the server and then storing them somewhere on the server.
Testing with an image, it appears to work: I select the image in the client, and a few seconds later the image appears on the server in the expected location. But when I try to open the image in Windows Picture Viewer (or any image viewer), I get "No Preview Available" and no image to view.
I am assuming this is because I am not recreating the file correctly from the stream.
This is the method on the WCF REST service:
public void PutFileInFolder(int eid, Stream fileContents)
{
    try
    {
        byte[] buffer = new byte[32768];
        MemoryStream ms = new MemoryStream();
        int bytesRead = 0;
        int totalBytesRead = 0;
        do
        {
            bytesRead = fileContents.Read(buffer, 0, buffer.Length);
            totalBytesRead += bytesRead;
            ms.Write(buffer, 0, bytesRead);
        } while (bytesRead > 0);
        // now have the file in a MemoryStream
        // save the file to the user's folder
        FileStream file = new FileStream(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName, FileMode.Create, System.IO.FileAccess.Write);
        byte[] bytes = new byte[ms.Length];
        ms.Read(bytes, 0, (int)ms.Length);
        file.Write(bytes, 0, bytes.Length);
        file.Close();
        ms.Close();
    }
    catch (Exception ex)
    {
        return;
    }
}
And this is the client function for sending the file/image:
private void PostFile(EvidenceObject eo)
{
    try
    {
        // Create the REST request.
        string url = ConfigurationManager.AppSettings["serviceUrl"];
        string requestUrl = string.Format("{0}/PutFileInFolder/{1}", url, 1001);
        HttpWebRequest request = (HttpWebRequest)HttpWebRequest.Create(requestUrl);
        request.Method = "POST";
        request.ContentType = "text/plain";
        byte[] fileToSend = File.ReadAllBytes(txtFileName.Text);
        request.ContentLength = fileToSend.Length;
        using (Stream requestStream = request.GetRequestStream())
        {
            // Send the file as the request body.
            requestStream.Write(fileToSend, 0, fileToSend.Length);
        }
        using (HttpWebResponse response = (HttpWebResponse)request.GetResponse())
            Console.WriteLine("HTTP/{0} {1} {2}", response.ProtocolVersion, (int)response.StatusCode, response.StatusDescription);
        MessageBox.Show("File successfully uploaded.", "Upload", MessageBoxButton.OK, MessageBoxImage.Information);
        this.DialogResult = true;
    }
    catch (Exception ex)
    {
        MessageBox.Show("Error during file upload: " + ex.Message, "Upload", MessageBoxButton.OK, MessageBoxImage.Error);
    }
}
I also just tested a video file: the original file plays happily, but when I upload it through the service, the file created on the server won't play.
I am sure it is something really dumb I am doing, but any help is really appreciated.
The problem was the way I was writing to the file stream. The MemoryStream's Position was already at its end after the copy loop, so the Read returned nothing and I was writing out a new, all-zero byte array rather than the bytes of the file, producing a file of the same size but with essentially none of the original contents.
This was the change to the code:
//FileStream file = new FileStream(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName, FileMode.Create, System.IO.FileAccess.Write);
//byte[] bytes = new byte[ms.Length];
//ms.Read(bytes, 0, (int)ms.Length);
//file.Write(bytes, 0, bytes.Length);
//file.Close();
//ms.Close();
using (FileStream fs = File.OpenWrite(@"C:\bd_sites\ttgme\wwwroot\Evidence\{" + ed.LearnerID + @"}\" + ed.EvidenceFileName))
{
    // WriteTo copies the MemoryStream's entire contents regardless of its
    // current Position, so the file gets the real bytes.
    ms.WriteTo(fs);
    fs.Close();
    ms.Close();
}
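An equivalent approach, if the service can target .NET 4 or later, is to skip the intermediate MemoryStream entirely and copy the request stream straight to disk with Stream.CopyTo. A minimal sketch, where targetPath is an assumed stand-in for the evidence-folder path built above:

// Hypothetical simplified handler: stream the upload directly to disk.
public void PutFileInFolder(int eid, Stream fileContents)
{
    string targetPath = @"C:\bd_sites\ttgme\wwwroot\Evidence\upload.bin"; // assumed example path

    using (FileStream fs = new FileStream(targetPath, FileMode.Create, FileAccess.Write))
    {
        // CopyTo reads from the source until it ends, so there is no
        // Position bookkeeping to get wrong.
        fileContents.CopyTo(fs);
    }
}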
I'm using .NET 3.5 and I need to transfer some files by FTP.
I don't want to use files on disk, because I manage everything with MemoryStreams and byte arrays.
Reading these articles (article and article), I built my client:
public void Upload(byte[] fileBytes, string remoteFile)
{
    try
    {
        string uri = string.Format("{0}:{1}/{2}", Hostname, Port, remoteFile);
        FtpWebRequest ftp = (FtpWebRequest)WebRequest.Create(uri);
        ftp.Credentials = new NetworkCredential(Username.Normalize(), Password.Normalize());
        ftp.UseBinary = true;
        ftp.UsePassive = true;
        ftp.Method = WebRequestMethods.Ftp.UploadFile;
        using (Stream localFileStream = new MemoryStream(fileBytes))
        {
            using (Stream ftpStream = ftp.GetRequestStream())
            {
                int bufferSize = (int)Math.Min(localFileStream.Length, 2048);
                byte[] buffer = new byte[bufferSize];
                int bytesSent = -1;
                while (bytesSent != 0)
                {
                    bytesSent = localFileStream.Read(buffer, 0, bufferSize);
                    ftpStream.Write(buffer, 0, bufferSize);
                }
            }
        }
    }
    catch (Exception ex)
    {
        LogHelper.WriteLog(logs, "Upload error", ex);
        throw;
    }
}
The FTP client connects, writes, and closes correctly without any error. But the written files are corrupted: PDFs cannot be opened, and for DOC/DOCX Word shows a message about file corruption and tries to repair them.
If I write the same bytes passed to the Upload method to a local file, I get a correct file, so the problem must be in the FTP transfer.
byte[] fileBytes = memoryStream.ToArray();
File.WriteAllBytes(@"C:\test.pdf", fileBytes); // --> file OK!
ftpClient.Upload(fileBytes, remoteFile);       // --> file CORRUPTED in the FTP folder!
You need to use bytesSent in the Write call:
bytesSent = localFileStream.Read(buffer, 0, bufferSize);
ftpStream.Write(buffer, 0, bytesSent);
Otherwise the last iteration writes the full buffer even though Read returned fewer bytes, so stale data from the previous pass gets appended to the end of the file.
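Put together, the transfer loop from the question would look something like this (a sketch of just the copy step, reusing the names from the Upload method above):

using (Stream localFileStream = new MemoryStream(fileBytes))
using (Stream ftpStream = ftp.GetRequestStream())
{
    byte[] buffer = new byte[2048];
    int bytesSent;
    // Read reports how many bytes it actually produced; writing exactly
    // that many keeps the final, partially filled buffer from appending
    // leftover bytes to the uploaded file.
    while ((bytesSent = localFileStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ftpStream.Write(buffer, 0, bytesSent);
    }
}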
I have a site with a list of photos. The user has the option to download each of the photos. The download just writes the result to the output stream.
Here's the code:
[WebMethod]
public static void DownloadPhotoAsset(string assetId)
{
    var photoAsset = GetPhotoAsset(assetId);
    Stream stream = null;
    int bytesToRead = 10000;
    byte[] buffer = new Byte[bytesToRead];
    try
    {
        HttpWebRequest fileReq = (HttpWebRequest)HttpWebRequest.Create(photoAsset.FileAbsoluteUrl);
        HttpWebResponse fileResp = (HttpWebResponse)fileReq.GetResponse();
        if (fileReq.ContentLength > 0)
            fileResp.ContentLength = fileReq.ContentLength;
        stream = fileResp.GetResponseStream();
        var resp = HttpContext.Current.Response;
        resp.ContentType = "application/octet-stream";
        resp.AddHeader("Content-Disposition",
            "attachment; filename=\"" + Path.GetFileName(photoAsset.FileAbsoluteUrl) + "\"");
        resp.AddHeader("Content-Length", fileResp.ContentLength.ToString());
        int length;
        do
        {
            // Verify that the client is connected.
            if (resp.IsClientConnected)
            {
                // Read data into the buffer and write it out
                // to the response's output stream.
                length = stream.Read(buffer, 0, bytesToRead);
                resp.OutputStream.Write(buffer, 0, length);
                // Flush the data and clear the buffer.
                resp.Flush();
                buffer = new Byte[bytesToRead];
            }
            else
                length = -1; // cancel the download if the client has disconnected
        } while (length > 0); // repeat until no data is read
    }
    finally
    {
        if (stream != null)
            stream.Close(); // close the input stream
    }
}
This works fine in every browser on Windows, but I get network connection issues on Macs.
In Safari, the download stops after a second and says "The network connection was lost"
In Chrome, the error says "Failed - Network Error"
In Firefox, the error says "Download Error - image.jpeg.part could not be saved, because the source file could not be read"
I've checked on two different Macs, one with OS X 10.7.4 and one with OS X 10.8.3.
Anyone know what I'm doing wrong here?
protected void downloadFunction(string filename)
{
    string filepath = @"D:\XtraFiles\" + filename;
    string contentType = "application/x-newton-compatible-pkg";
    Stream iStream = null;
    // Buffer to read up to 1024 KB in chunks.
    byte[] buffer = new Byte[1048576];
    // Number of bytes read in the current chunk:
    int length;
    // Total bytes left to read:
    long dataToRead;
    try
    {
        // Open the file.
        iStream = new FileStream(filepath, FileMode.Open, FileAccess.Read, FileShare.Read);
        // Total bytes to read:
        dataToRead = iStream.Length;
        HttpContext.Current.Response.ContentType = contentType;
        HttpContext.Current.Response.AddHeader("Content-Disposition", "attachment; filename=" + HttpUtility.UrlEncode(filename, System.Text.Encoding.UTF8));
        // Read the bytes.
        while (dataToRead > 0)
        {
            // Verify that the client is connected.
            if (HttpContext.Current.Response.IsClientConnected)
            {
                // Read the data into the buffer.
                length = iStream.Read(buffer, 0, 10000);
                // Write the data to the current output stream.
                HttpContext.Current.Response.OutputStream.Write(buffer, 0, length);
                // Flush the data to the HTML output.
                HttpContext.Current.Response.Flush();
                buffer = new Byte[10000];
                dataToRead = dataToRead - length;
            }
            else
            {
                // Prevent an infinite loop if the user disconnects.
                dataToRead = -1;
            }
        }
    }
    catch (Exception ex)
    {
        // Trap the error, if any.
        HttpContext.Current.Response.Write("Error : " + ex.Message + "<br />");
        HttpContext.Current.Response.ContentType = "text/html";
        HttpContext.Current.Response.Write("Error : file not found");
    }
    finally
    {
        if (iStream != null)
        {
            // Close the file.
            iStream.Close();
        }
        HttpContext.Current.Response.End();
        HttpContext.Current.Response.Close();
    }
}
My download function works perfectly, but while users are downloading, the browser can't see the total size of the download.
So the browser says e.g. "Downloading 8 MB of ?" instead of "Downloading 8 MB of 142 MB".
What have I missed?
The Content-Length header seems to be what you are missing.
If you set it, the browser will know how much data to expect; otherwise it just keeps reading until you stop sending and has no way of knowing how far along it is.
Response.AddHeader("Content-Length", iStream.Length.ToString());
You may also be interested in Response.WriteFile, which can provide an easier way to send a file to a client without having to deal with the streams yourself.
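For example, the whole handler could reduce to something like this (a sketch using the filepath, filename and contentType variables from the question; WriteFile lets ASP.NET do the chunking for you):

HttpContext.Current.Response.ContentType = contentType;
HttpContext.Current.Response.AddHeader("Content-Disposition",
    "attachment; filename=" + HttpUtility.UrlEncode(filename, System.Text.Encoding.UTF8));
// The browser can only show "x of y MB" if it knows y up front.
HttpContext.Current.Response.AddHeader("Content-Length", new FileInfo(filepath).Length.ToString());
HttpContext.Current.Response.WriteFile(filepath); // streams the file from disk to the client
HttpContext.Current.Response.End();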
You need to send a Content-Length header:
HttpContext.Current.Response.AddHeader("Content-Length", iStream.Length.ToString());
Problem: When I upload only one file at a time over FTP, the files upload fine. But when I use multiple BackgroundWorkers to upload files to the FTP server, I get this exception:
The remote server returned an error: (550) File unavailable (e.g., file not found, no access). (System.Net.WebException)
And only some of the files get uploaded. I am pretty sure the file exists at that location; in fact, on another run the file it was complaining about downloads fine, and the error shifts to another file.
Code Description:
In the code below I am downloading a file from one FTP server and putting it on another. This code is inside a BackgroundWorker's DoWork method; the BackgroundWorkers are created inside a loop.
void imageDownloadWorker_DoWork(object sender, DoWorkEventArgs e)
{
    string[] ftpInfo = (string[])e.Argument;
    try
    {
        /////////////////////////// Downloading ///////////////////////////
        string uri = String.Format("ftp://{0}/{1}/images/{2}", ftpInfo[1], ftpInfo[2], ftpInfo[5]);
        FtpWebRequest request = (FtpWebRequest)WebRequest.Create(uri);
        request.Method = WebRequestMethods.Ftp.DownloadFile;
        request.UseBinary = true;
        request.Credentials = new NetworkCredential(ftpInfo[3], ftpInfo[4]);
        FtpWebResponse response = (FtpWebResponse)request.GetResponse();
        Stream ftpStream = response.GetResponseStream();
        long cl = response.ContentLength;
        int bufferSize = 4096;
        int readCount = 0;
        byte[] buffer = new byte[bufferSize];
        MemoryStream memStream = new MemoryStream();
        readCount = ftpStream.Read(buffer, 0, bufferSize);
        while (readCount > 0)
        {
            memStream.Write(buffer, 0, readCount);
            readCount = ftpStream.Read(buffer, 0, bufferSize);
        }
        response.Close();
        /////////////////////////// Uploading ///////////////////////////
        string uri1 = String.Format("ftp://{0}/{1}/{2}", "127.0.0.1", string.Empty, ftpInfo[5]);
        FtpWebRequest request1 = (FtpWebRequest)WebRequest.Create(uri1);
        request1.Credentials = new NetworkCredential("user", "password");
        request1.KeepAlive = false;
        request1.Method = WebRequestMethods.Ftp.UploadFile;
        request1.UseBinary = true;
        request1.ContentLength = memStream.Length;
        int buffLength = 4096;
        byte[] buff = new byte[buffLength];
        int contentLen;
        // Stream to which the file to be uploaded is written.
        Stream strm = request1.GetRequestStream();
        memStream.Seek(0, SeekOrigin.Begin);
        contentLen = memStream.Read(buff, 0, buffLength);
        // Until the stream content ends.
        while (contentLen != 0)
        {
            // Write content from the memory stream to the FTP upload stream.
            strm.Write(buff, 0, contentLen);
            contentLen = memStream.Read(buff, 0, buffLength);
        }
        // Close the upload stream, the FTP response stream and the memory stream.
        strm.Close();
        ftpStream.Close();
        memStream.Close();
    }
    catch (Exception ex)
    {
        MessageBox.Show("While downloading file " + ftpInfo[5] + " " + ex.ToString(), "Error", MessageBoxButtons.OK, MessageBoxIcon.Error);
        e.Result = null;
        return;
    }
}
Related Thread
Edit:
In FileZilla Server there is an option under General Settings > Performance Settings > Number of Threads; I set that to 20 and it didn't make any difference.
There may be nothing wrong with your code; that error is a permissions error.
Complete stab in the dark, but does the upload target server have a connection-per-IP limit? If so, you may be falling foul of it by exceeding the concurrent-connection limit from a single IP address.
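If a connection limit is the culprit, one cheap way to test the theory is to cap how many transfers run at once, for example with a semaphore shared by all the workers. A rough sketch (the limit of 2 is an arbitrary assumed value; requires using System.Threading):

// Shared across all BackgroundWorkers: at most 2 concurrent FTP transfers.
private static readonly Semaphore ftpSlots = new Semaphore(2, 2);

void imageDownloadWorker_DoWork(object sender, DoWorkEventArgs e)
{
    ftpSlots.WaitOne(); // block until a transfer slot is free
    try
    {
        // ... existing download/upload code from the question ...
    }
    finally
    {
        ftpSlots.Release(); // always free the slot, even after an exception
    }
}

If the 550 errors disappear with the cap in place, the server's per-IP limit is the thing to tune.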
I think I have a memory issue with a function that should download a file from an FTP server. The reason I think it's a memory thing is that it works fine while debugging (maybe that gives the garbage collector more time), yet I thought the using blocks would take care of this...
Just to be clear: the function works when called once, yet calling it several times in a for loop produces the error message: 550 The specified network name is no longer available.
Please help,
Asaf
private void downloadFile(string sourceFile, string targetFolder)
{
    string remoteFile = sourceFile.Replace("\\", "//");
    string localFolder = targetFolder + "\\" + sourceFile.Substring(sourceFile.LastIndexOf("\\") + 1);
    string filename = "ftp://" + ftpServerIP + "//" + remoteFile;
    FtpWebRequest ftpReq = (FtpWebRequest)WebRequest.Create(filename);
    ftpReq.Method = WebRequestMethods.Ftp.DownloadFile;
    ftpReq.Credentials = new NetworkCredential(ftpUserID, ftpPassword);
    ftpReq.UseBinary = true;
    ftpReq.Proxy = null;
    ftpReq.KeepAlive = false; // 3. settings and action
    try
    {
        using (System.Net.FtpWebResponse response = (System.Net.FtpWebResponse)(ftpReq.GetResponse()))
        {
            using (System.IO.Stream responseStream = response.GetResponseStream())
            {
                using (System.IO.FileStream fs = new System.IO.FileStream(localFolder, System.IO.FileMode.Create))
                {
                    Byte[] buffer = new byte[2047];
                    int read = 0;
                    do
                    {
                        read = responseStream.Read(buffer, 0, buffer.Length);
                        fs.Write(buffer, 0, read);
                    } while (read == 0);
                    responseStream.Close();
                    fs.Flush();
                    fs.Close();
                }
                responseStream.Close();
            }
            response.Close();
        }
    }
    catch (WebException ex)
    {
        FtpWebResponse response = (FtpWebResponse)ex.Response;
        Console.Out.WriteLine(response.StatusDescription);
    }
}
There's a bug in the read loop that causes large files to be truncated: the do/while condition read == 0 exits the loop after the first successful read, so only the first buffer's worth of data is ever written. Use:
int read;
while ((read = responseStream.Read(buffer, 0, buffer.Length)) != 0)
{
    fs.Write(buffer, 0, read);
}
With that change in place I was able to download a number of large files via FTP without encountering exceptions.