I'm trying to download a file over FTP from one of our clients, but in production it fails with the message "Operation timeout exceeded". If I open the FTP location in a browser, it opens without a problem. During development this happened because of the proxy, but back then the proxy blocked the browser as well. In production the proxy has been shut down, yet the download still fails. I need help with this.
This is the code for the download:
using (WebClient vloftpClient = new WebClient())
{
    vloftpClient.Credentials = new System.Net.NetworkCredential(vlcUsuario, vlcClave);
    vloftpClient.DownloadFile(vlcUbicacionOrigen + "/" + vlcNombreArchivoOrigen, pvcUbicacionDestino + "/" + pvcProveedor + "_" + vlcNombreArchivoDestino);
}
FtpWebRequest gives you more control than WebClient, for example over the transfer mode and the timeouts, so use it as below:
FtpWebRequest objFtp = (FtpWebRequest)WebRequest.Create(url);  // url is your ftp:// address
objFtp.Method = WebRequestMethods.Ftp.DownloadFile;
objFtp.Timeout = 30000;           // value in milliseconds
objFtp.ReadWriteTimeout = 30000;  // value in milliseconds
Finally, try to download the file again.
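For reference, here is a minimal sketch of a complete download with FtpWebRequest and explicit timeouts. The URL, credentials, and local path are placeholders, not values from your code:

using System.IO;
using System.Net;

// Sketch only: replace the placeholder URL, credentials and local path.
var request = (FtpWebRequest)WebRequest.Create("ftp://host/folder/file.txt");
request.Method = WebRequestMethods.Ftp.DownloadFile;
request.Credentials = new NetworkCredential("user", "password");
request.UseBinary = true;          // binary transfer mode
request.UsePassive = true;         // passive mode often helps behind firewalls and proxies
request.Timeout = 30000;           // milliseconds
request.ReadWriteTimeout = 30000;  // milliseconds

using (var response = (FtpWebResponse)request.GetResponse())
using (var responseStream = response.GetResponseStream())
using (var fileStream = File.Create(@"C:\temp\file.txt"))
{
    responseStream.CopyTo(fileStream);
}

If the timeout persists even with generous values, toggling UsePassive is a common next step, since active-mode data connections are often blocked by firewalls.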
I am making an ASP.NET Core 3.1 MVC website with Entity Framework on a Microsoft server running IIS 10, with basic user file upload and download functions. I can upload files to the _webHostEnvironment.WebRootPath from the web, but when I try to download them, the app cannot find them. However, if I recycle the app pool I can download the file (I can also download it from the web after accessing the website on localhost on the server).
Here is the code for when I upload the file:
string uniqueFileName = null;
if (model.Data != null)
{
    string uploadsFolder = Path.Combine(_webHostEnvironment.WebRootPath, "images");
    uniqueFileName = Guid.NewGuid().ToString() + "_" + model.Data.FileName;
    string filePath = Path.Combine(uploadsFolder, uniqueFileName);
    model.Data.CopyTo(new FileStream(filePath, FileMode.Create));
}
Here is the code for when I download it:
public FileResult DownloadData(string fileName, string tempAddCode, string patientName)
{
    byte[] fileBytes = System.IO.File.ReadAllBytes(Path.Combine(_webHostEnvironment.WebRootPath, "images") + "\\" + fileName);
    var dataStream = new MemoryStream(fileBytes);
    return new FileStreamResult(dataStream, new MediaTypeHeaderValue("application/pdf"))
    {
        FileDownloadName = patientName.ToLower().Replace(" ", "_") + ".pdf"
    };
}
What is going on? Why would recycling the app pool fix this? Am I saving these user-uploaded files to the wrong location and ASP.NET is doing some sort of caching thing? I'd love any help finding a solution! Thanks.
With this line you open a file stream and never close it:
model.Data.CopyTo(new FileStream(filePath, FileMode.Create));
Make sure you always dispose your streams, so files don't stay locked. Use the code below to copy the stream to the file properly:
using (var fs = new FileStream(filePath, FileMode.Create))
{
    model.Data.CopyTo(fs);
}
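For completeness, here is a minimal sketch of the upload block with that fix applied, using the async copy that ASP.NET Core actions generally prefer. It assumes the same model and _webHostEnvironment as in the question and an async action method:

// Sketch only: assumes an async controller action.
string uniqueFileName = null;
if (model.Data != null)
{
    string uploadsFolder = Path.Combine(_webHostEnvironment.WebRootPath, "images");
    uniqueFileName = Guid.NewGuid().ToString() + "_" + model.Data.FileName;
    string filePath = Path.Combine(uploadsFolder, uniqueFileName);

    // The using block flushes and closes the stream, so the file
    // is not left locked by the worker process.
    using (var fs = new FileStream(filePath, FileMode.Create))
    {
        await model.Data.CopyToAsync(fs);
    }
}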
What happens if you put a breakpoint in the DownloadData function and step through? Does it successfully read the file into fileBytes, or do you get an exception, and if so, which one? Also check the IIS logs to get the error code and post it, please.
Also, make sure the user the application pool runs under has Write access to the folder. By default, it will not.
To add it, right-click the folder, open the Security tab, and add (from the local computer) the identity IIS AppPool\[whatever the app pool name is], for example:
IIS AppPool\MyAppPool
I have a website deployed on Hostgator. This is my first site, so I'm still a little new to all this. Using Visual Studio 2019, this works completely fine in development, but once deployed to my site it throws an error:
.../CreateRateConfirmation?loadNumber=RT222398 500 (Internal Server Error) VM1795 jquery:1
I'm using iText 7 to create the PDF, and I'm thinking it might have to do with the path used when creating it. The actual PDF function is over 1,800 lines of code, so I won't post all of it, but this is the relevant (I think) part:
string fileName = loads[i].Carrier.carrierName + " " + loadNumber + " Rate Confirmation.pdf";
string desktopFolder = "C:";
System.IO.Directory.CreateDirectory(desktopFolder + "\\Rate Confirmations");
string downloadFolder = desktopFolder + "\\Rate Confirmations";
PdfWriter writer = new PdfWriter(downloadFolder + "\\" + fileName);
PdfDocument pdf = new PdfDocument(writer);
Document document = new Document(pdf);
// ... roughly a billion lines of pdf creation ...
document.Close();
System.Diagnostics.Process.Start(downloadFolder + "\\" + fileName);
string timeStamp = getTimeStamp(name);
loads[i].notes = loads[i].notes + timeStamp + ": Rate Confirmation Created^";
db.Entry(loads[i]).CurrentValues.SetValues(loads[i]);
db.SaveChanges();
break;
For what it's worth, the function also writes a timestamp that should show up in the notes on my site, and it doesn't, so the code is clearly not making it all the way through this function.
I'm sure this has an easy solution, but I'm really new to all this and I just can't seem to find it.
Thanks in advance!!!
I am not exactly sure what is causing the problem, but there is one thing that may be making the PDF generation fail: never hardcode a directory manually. Instead, resolve it dynamically according to the server on which your web application is deployed.
You can replace the directory code with:
string desktopFolder = Path.GetFullPath(Directory.GetCurrentDirectory());
string downloadFolder = Path.Combine(desktopFolder, "Rate Confirmations");
if (!Directory.Exists(downloadFolder))
    Directory.CreateDirectory(downloadFolder);
This code automatically gets the application's current directory and resolves it to an absolute path, rather than relying on a hardcoded location.
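As a rough sketch, wiring that folder into the PdfWriter from the question would look something like this (the iText objects and the file name are copied from the question; treat it as illustrative only):

// Sketch: build the output path from the application's own directory
// instead of a hardcoded "C:" drive.
string fileName = loads[i].Carrier.carrierName + " " + loadNumber + " Rate Confirmation.pdf";
string downloadFolder = Path.Combine(Directory.GetCurrentDirectory(), "Rate Confirmations");
if (!Directory.Exists(downloadFolder))
    Directory.CreateDirectory(downloadFolder);

string outputPath = Path.Combine(downloadFolder, fileName);
PdfWriter writer = new PdfWriter(outputPath);
PdfDocument pdf = new PdfDocument(writer);
Document document = new Document(pdf);
// ... PDF content creation ...
document.Close();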
I am trying to upload PDF files. Only sometimes does the uploaded file get corrupted; when I open such a file in Adobe Reader or a browser, it says "Insufficient data for an image".
This does not happen for all uploads. When I upload the same file again, it works perfectly.
I am not able to replicate the issue, so I cannot tell why it is occurring.
I am using the code below to save the file:
FileUpload upload = GridView1.Rows[index].FindControl("FileUpload1") as FileUpload;
if (upload.HasFile)
{
    string nameoffile = upload.FileName;
    Random ran = new Random();
    int forReference = ran.Next();
    string[] strfileArray = nameoffile.Split('.');
    nameoffile = strfileArray[0] + "" + forReference + ".pdf";
    upload.SaveAs(path + "/" + nameoffile);
}
else
{
    upload.SaveAs(path + "/" + nameoffile);
}
Note: we are running this application on an Azure VM. The code worked without any issue on the previous server; the problem started only after we migrated to Azure.
We had hosted the application on the C drive of the Azure VM. After moving the application to secondary storage, i.e. the D drive, the problem was resolved.
I'm trying to get my web server to write logs to a .txt file on the server. It should be simple, and it works fine on my development machine, but it doesn't work on the server.
I have followed the advice in this: IIS7 Permissions Overview - ApplicationPoolIdentity
But it still won't write. The path is correct, and it should have the proper permissions after following the above link.
My code for writing the file is:
private void Logger(String lines)
{
    String fileName = DateTime.Now.Day.ToString() + DateTime.Now.Month.ToString() + DateTime.Now.Year.ToString() + "_Logs.txt";
    try
    {
        System.IO.StreamWriter file = new System.IO.StreamWriter("C:/Web_Srvice_Logs/" + fileName, true);
        file.WriteLine(lines + " " + DateTime.Now.ToString());
        file.Close();
    }
    catch (Exception) { }
}
Any ideas?
Catch the exception and check the value. My guess is that the App Pool user does not have rights to write to that directory (I know you said you checked it) and/or that directory does not exist. The error message you are getting will help diagnose it.
You could always set the app pool to an admin user to test if it is a permission problem.
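As a minimal sketch of how you might surface that error while testing (the log path and file name are taken from the question; writing to the debug output is just one option):

private void Logger(string lines)
{
    string fileName = DateTime.Now.Day.ToString() + DateTime.Now.Month.ToString() + DateTime.Now.Year.ToString() + "_Logs.txt";
    try
    {
        // The using block closes the file even if WriteLine throws.
        using (var file = new System.IO.StreamWriter("C:/Web_Srvice_Logs/" + fileName, true))
        {
            file.WriteLine(lines + " " + DateTime.Now.ToString());
        }
    }
    catch (Exception ex)
    {
        // Don't swallow the exception; inspect it while diagnosing the server.
        System.Diagnostics.Debug.WriteLine(ex.ToString());
        throw;
    }
}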
I am using this code to download a file from a Solaris machine to a Windows machine using C#, and I receive error 550 - File unavailable.
string fileName = FileName();
string remoteUri = "xxxx";
var webClient = new WebClient();
webClient.Proxy = null;
webClient.Credentials = new NetworkCredential(Settings.Default.FtpUser, Settings.Default.FtpPassword);
webClient.BaseAddress = "ftp://"+Settings.Default.FtpHost;
webClient.DownloadFile(remoteUri, fileName);
I have validated that the URI works when I paste it into the Internet Explorer address bar.
The URI looks like this:
ftp://10.99.137.99/opt/scripts/overnight/test.txt
The actual location after logging in on the Unix side is
/opt/scripts/overnight/test.txt
I am able to view the file there after entering my user name and password. What am I doing wrong? What other steps can I take? Is there an easier way to do FTP more manually?
Here's another interesting article with a different answer:
FtpWebRequest Download File
string remoteUri = "xxxx";
Have you posted the actual code? This is the URI of the remote file; it needs to be ftp://10.99.137.99/opt/scripts/overnight/test.txt, not xxxx.
If it isn't the actual code, can you post the code you are really using?
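In other words, the call should look roughly like this, using the URI from the question (fileName is whatever local path FileName() returns):

// Sketch: pass the full FTP URI of the remote file, not a placeholder.
string remoteUri = "ftp://10.99.137.99/opt/scripts/overnight/test.txt";

var webClient = new WebClient();
webClient.Proxy = null;
webClient.Credentials = new NetworkCredential(Settings.Default.FtpUser, Settings.Default.FtpPassword);
webClient.DownloadFile(remoteUri, fileName);  // fileName is the local destination path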