I have a file upload page: the user browses for a file and hits the upload button, I save the file on the server under App_Data/uploads in the application directory, and then I try to read it with a StreamReader and parse it. On my local development environment it works fine, but when I deploy it to a server it does not work at all. Any suggestions? Thank you.
//Save LoadList File:
DateTime uploadDate = DateTime.Now;
string destinationPath = string.Format("{0}\\{1}\\{2}\\{3}\\", Server.MapPath("~/App_Data/uploads"), uploadDate.ToString("yyyy"), uploadDate.ToString("MMM"), uploadDate.ToString("dd"));
if (!Directory.Exists(destinationPath))
Directory.CreateDirectory(destinationPath);
string storedFileName = string.Format("{0}{1}.json", destinationPath, System.Guid.NewGuid());
file.ElementAt(0).SaveAs(storedFileName);
//FileImport is a static class
var Pair = FileImport.CyclesCompleted(storedFileName);
private static string LoadTextFromFile(string fileName)
{
    // Read the whole file into a string; the using block guarantees the reader is disposed.
    using (StreamReader streamReader = new StreamReader(fileName))
    {
        return streamReader.ReadToEnd();
    }
}
Saving a file on the server usually results in permission errors, since most accounts can't write to the default location on the server. You may get away with using Path.GetTempFileName, but even then some accounts (e.g. the account that requests run under for the "anonymous user") will not have permission to read/write to that location.
If you simply need to parse the uploaded file, you can copy its stream to a MemoryStream and create a StreamReader over that memory stream. You may be able to use the uploaded file's Stream directly, but it will not support seeking (this may still work, since StreamReader does not seek).
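For illustration, a minimal sketch of that approach, assuming the upload arrives as an ASP.NET HttpPostedFileBase (the uploadedFile parameter and helper class below are hypothetical, not from the question):

using System.IO;
using System.Web;

public static class UploadParser
{
    public static string ReadUploadedText(HttpPostedFileBase uploadedFile)
    {
        // Copy the request stream into memory so it can be read (and re-read) freely.
        using (var memoryStream = new MemoryStream())
        {
            uploadedFile.InputStream.CopyTo(memoryStream);
            memoryStream.Position = 0;

            // Parse from memory instead of a file saved under App_Data.
            using (var reader = new StreamReader(memoryStream))
            {
                return reader.ReadToEnd();
            }
        }
    }
}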
My app requires copying a file via SFTP from a remote location directly to Azure storage.
Our app uses C# with .NET 4.6, and our WinSCP version is 5.21.1.
My old code works using the Session.GetFileToDirectory() method, but the problem is that it needs to store the file in a temp folder inside our hosting.
using (Session session = new Session())
{
session.Open(sessionOptions);
TransferOptions transferOptions = new TransferOptions();
transferOptions.TransferMode = TransferMode.Binary;
var transfer = session.GetFileToDirectory(FilePath, fullPath);
using (Stream stream = File.OpenRead(transfer.Destination))
{
UploadToAzure(stream, Filename, Foldername);
}
}
As we plan to use Azure storage entirely, I changed my code to this:
using (Session session = new Session())
{
session.Open(sessionOptions);
TransferOptions transferOptions = new TransferOptions();
transferOptions.TransferMode = TransferMode.Binary;
using (Stream stream = session.GetFile(FilePath, transferOptions))
{
UploadToAzure(stream, Filename, Foldername);
}
}
Here is my library code that uploads the file to Azure using a Stream.
This code works fine with my old code, which still saves to the temp folder before sending to Azure.
public static string UploadToAzure(Stream attachment, string Filename, string Foldername)
{
System.Net.ServicePointManager.SecurityProtocol = System.Net.SecurityProtocolType.Tls12;
var connectionString = $"{ConfigurationManager.AppSettings["AzureFileShareConnectionString"]}";
string shareName = $"{ConfigurationManager.AppSettings["AzureFileShareFolderName"]}";
string dirName = $"files\\{Foldername}";
string fileName = Filename;
try
{
ShareClient share = new ShareClient(connectionString, shareName);
share.CreateIfNotExists();
ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
directory.CreateIfNotExists();
// Get a reference to a file and upload it
ShareFileClient file = directory.GetFileClient(fileName);
file.Create(attachment.Length);
file.UploadRange(
new HttpRange(0, attachment.Length), attachment);
}
catch (Exception e)
{
return $"Uploaded {Filename} failed : {e.ToString()}";
}
return $"{Filename} Uploaded";
}
But my new code is currently not working, with this error message:
'((WinSCP.PipeStream)stream).Length' threw an exception of type 'System.NotSupportedException'.
(The post includes a screenshot of the stream object created by the Session.GetFile method, and of the exception stack trace from sending the empty stream to Azure.)
The Stream returned by WinSCP Session.GetFile does not implement the Stream.Length property, because WinSCP cannot guarantee that the size of the file is fixed. The remote file might be changing while you are downloading the file. Not to mention ASCII transfer mode, when the file is converted while being transferred, with unpredictable impact on the final size.
You use the size (Stream.Length) in two places:
When creating the file:
file.Create(attachment.Length);
The parameter of ShareFileClient.Create is maxSize, so it does not look like it has to be the real size. You can possibly just put an arbitrarily large number here.
Or if you prefer (and know that the file is not changing), read the current size of the remote file using Session.GetFileInfo and RemoteFileInfo.Length:
file.Create(session.GetFileInfo(FilePath).Length);
When uploading the contents:
file.UploadRange(new HttpRange(0, attachment.Length), attachment);
The above can be replaced with simple ShareFileClient.Upload:
file.Upload(attachment);
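Putting both suggestions together, a rough sketch of the download-and-upload path might look like the following. It assumes the same sessionOptions, FilePath, Filename, connectionString, shareName and dirName values as in the question's code; this is a sketch of the answer's two changes, not a drop-in replacement:

using (Session session = new Session())
{
    session.Open(sessionOptions);

    TransferOptions transferOptions = new TransferOptions();
    transferOptions.TransferMode = TransferMode.Binary;

    using (Stream stream = session.GetFile(FilePath, transferOptions))
    {
        ShareClient share = new ShareClient(connectionString, shareName);
        share.CreateIfNotExists();

        ShareDirectoryClient directory = share.GetDirectoryClient(dirName);
        directory.CreateIfNotExists();

        ShareFileClient file = directory.GetFileClient(Filename);

        // Size the Azure file from the remote file's metadata instead of Stream.Length.
        file.Create(session.GetFileInfo(FilePath).Length);

        // Upload reads the stream itself, so no Length is needed up front.
        file.Upload(stream);
    }
}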
Hello, I'm a beginner with C# and I want to delete the last character of my file so that I can inject JSON objects into it manually and keep the format valid (I know that's not the best way to do it). I tried several approaches, such as opening the file, manipulating the string (deleting the last character) and then replacing the text in that same file, but I get errors like IOException: The process cannot access the file 'file path' because it is being used by another process, or System.UnauthorizedAccessException: 'Access to the path 'C:\Users\ASUS\Desktop\Root' is denied.'
I'll show you the code :
StoreLogs Log = new StoreLogs()
{
Id = ID,
DateTime = dateT,
TaskName = task,
SrcAddress = srcPath,
DstAddress = path,
FileSize = DirSize(new DirectoryInfo(srcPath)),
DelayTransfer = ts.Milliseconds,
};
// Record JSON data in the variable
string strResultJson = JsonConvert.SerializeObject(Log);
// Show the JSON Data
// Console.WriteLine(strResultJson);
// Write JSON Data in another file
string MyJSON = null;
string strPath = @"C:\Users\ASUS\Desktop\Backup\logs\log.json";
if (File.Exists(strPath))
{
//FileInfo table = new FileInfo(strPath);
//string strTable = table.OpenText().ReadToEnd();
//string erase = strTable.Remove(strTable.LastIndexOf(']'));
//Console.WriteLine(erase);
//StreamReader r1 = new StreamReader(strPath);
//string strTable = r1.OpenText().ReadToEnd();
//string erase = strTable.Remove(strTable.LastIndexOf(']'));
//r1.Close();
using (StreamReader sr = File.OpenText(strPath))
{
string table = sr.ReadToEnd();
string erase = table.Remove(table.LastIndexOf(']'));
sr.Close();
File.WriteAllText(strPath, erase);
}
//MyJSON = "," + strResultJson;
//File.AppendAllText(strPath, MyJSON + "]");
//Console.WriteLine("The file exists.");
}
else if (!File.Exists(strPath))
{
MyJSON = "[" + strResultJson + "]";
File.WriteAllText(strPath, MyJSON);
Console.WriteLine("The file doesn't exists.");
}
else
{
Console.WriteLine("Error");
}
// End
Console.WriteLine("JSON Object generated !");
Console.ReadLine();
And that's the result I want :
[{"Id":"8484","DateTime":"26 novembre 2019 02:33:35 ","TaskName":"dezuhduzhd","SrcAddress":"C:\\Users\\ASUS\\Desktop\\Root","DstAddress":"C:\\Users\\ASUS\\Desktop\\Backup","FileSize":7997832.0,"DelayTransfer":0.0},{"Id":"8484","DateTime":"26 novembre 2019 02:33:35 ","TaskName":"dezuhduzhd","SrcAddress":"C:\\Users\\ASUS\\Desktop\\Root","DstAddress":"C:\\Users\\ASUS\\Desktop\\Backup","FileSize":7997832.0,"DelayTransfer":0.0},{"Id":"8484","DateTime":"26 novembre 2019 02:33:35 ","TaskName":"dezuhduzhd","SrcAddress":"C:\\Users\\ASUS\\Desktop\\Root","DstAddress":"C:\\Users\\ASUS\\Desktop\\Backup","FileSize":7997832.0,"DelayTransfer":0.0}]
Edit:
Thank you all for your advice.
Solution:
FileStream fs = new FileStream(strPath, FileMode.Open, FileAccess.ReadWrite);
fs.SetLength(fs.Length - 1);
fs.Close();
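For completeness, a short sketch of how that fix fits into the append flow the question is aiming for, using the same strPath and strResultJson as above:

if (File.Exists(strPath))
{
    // Trim the trailing ']' so the array can be extended.
    using (FileStream fs = new FileStream(strPath, FileMode.Open, FileAccess.ReadWrite))
    {
        fs.SetLength(fs.Length - 1);
    }

    // Append the new object and restore the closing bracket.
    File.AppendAllText(strPath, "," + strResultJson + "]");
}
else
{
    // First write: create the array with a single element.
    File.WriteAllText(strPath, "[" + strResultJson + "]");
}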
In the code example you have posted you are opening a stream to read the file. A using block only disposes the stream after you exit the block, so you are trying to write to the file while the read stream still exists. You've basically opened the file, read from it, and are trying to write back to it while still holding it open. The reason this is a problem is that you are not writing through that same stream, so your second (write) operation cannot access the file. I see you are closing the reader before the write, but I'm willing to bet the file handle is still being held open.
I would try this method:
How to both read and write a file in C#
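Roughly, the idea is to do the read and the write through a single FileStream, so no second handle ever touches the file. A minimal sketch, reusing the question's strPath and the same last-']' removal (the buffer size and encoding below are arbitrary choices):

using (var fs = new FileStream(strPath, FileMode.Open, FileAccess.ReadWrite))
{
    string table;
    using (var reader = new StreamReader(fs, System.Text.Encoding.UTF8, true, 1024, leaveOpen: true))
    {
        table = reader.ReadToEnd();
    }

    string erase = table.Remove(table.LastIndexOf(']'));

    // Truncate and rewind, then write the shortened content back through the same handle.
    fs.SetLength(0);
    fs.Position = 0;
    using (var writer = new StreamWriter(fs))
    {
        writer.Write(erase);
    }
}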
What it says is that access to the path (C:\Users\ASUS\Desktop\Root) is denied for the user who is running the application. For example, if you are running from Visual Studio under the user1 Windows login, then user1 needs the appropriate rights to that Root folder. If the code is running on its own (as an exe), then check the access rights of the user who invokes that exe.
Based on the errors you posted, it seems that:
Maybe you're leaving some stream open that points to the file you want to edit; use the 'using' statement to avoid this (see this link for more info).
You're trying to access a file without the needed permissions (you aren't a system admin, or the file is read-only); try changing the file's location or making it writable (see this link for more info about the UnauthorizedAccessException exception).
Hope this helps you!
I need to save the file when the OnDestroy method is called and load the same file when the OnCreate method is called. At the moment I can read the JSON file easily from Assets (this works fine):
StreamReader reader = new StreamReader(Assets.Open("reiksmes.json"));
string JSONstring = reader.ReadToEnd();
Daiktai myList = JsonConvert.DeserializeObject<Daiktai>(JSONstring);
items.Add(myList);
, but I have some problems when I try to save (write) the Daiktai class data to the same file I opened above. I tried:
string data = JsonConvert.SerializeObject(items);
File.WriteAllText("Assets\\reiksmes.json", data);
With this attempt I get the error System.UnauthorizedAccessException: Access to the path "/Assets\reiksmes.json" is denied.
also tried:
string data = JsonConvert.SerializeObject(items);
StreamWriter writer = new StreamWriter(Assets.Open("reiksmes.json"));
writer.WriteLine(data);
and with this attempt I get the error System.ArgumentException: Stream was not writable.
Summary:
I think I chose a bad directory (Assets). I need to save and load the data (JSON format). Where should I save it, and how (please give an example)?
You can't save anything to Assets; you can only read from it. You have to save the file to a different folder.
var fileName = "reiksmes.json";
string documentsPath = System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal); // Documents folder
var path = Path.Combine(documentsPath, fileName);
Console.WriteLine(path);
if (!File.Exists(path))
{
var s = AssetManager.Open(fileName);
// create a write stream
FileStream writeStream = new FileStream(path, FileMode.OpenOrCreate, FileAccess.Write);
// write to the stream
ReadWriteStream(s, writeStream);
}
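The ReadWriteStream helper isn't shown above; it just needs to copy the asset stream into the writable file stream. A minimal sketch of what it could look like:

private static void ReadWriteStream(Stream readStream, Stream writeStream)
{
    const int bufferSize = 4096;
    byte[] buffer = new byte[bufferSize];

    // Copy the source stream into the destination in fixed-size chunks.
    int bytesRead;
    while ((bytesRead = readStream.Read(buffer, 0, bufferSize)) > 0)
    {
        writeStream.Write(buffer, 0, bytesRead);
    }

    readStream.Close();
    writeStream.Close();
}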
I am using DotNetZip.
What I need to do is to open up a zip files with files from the server.
The user can then grab the files and store it locally on their machine.
What I did before was the following:
string path = "Q:\\ZipFiles\\zip" + npnum + ".zip";
zip.Save(path);
Process.Start(path);
Note that Q: is a drive on the server. Process.Start simply opens up the zip file so that the user can access all the files. I'd like to do the same, but serve the zip from memory instead of storing it on disk.
So now, instead of storing the zip file on the server, I'd like to build it in a MemoryStream.
I have the following, but it does not seem to work:
var ms = new MemoryStream();
zip.Save(ms);
but I'm not sure how to proceed further in terms of serving the zip file from the memory stream so that the user can access all the files.
Here is a piece of code (copied verbatim) which I wrote to download a series of blog posts as a zipped CSV file. It's live and it works.
public ActionResult L2CSV()
{
var posts = _dataItemService.SelectStuff();
string csv = CSV.IEnumerableToCSV(posts);
// These first two lines simply get our required data as a long csv string
var fileData = Zip.CreateZip("LogPosts.csv", System.Text.Encoding.UTF8.GetBytes(csv));
var cd = new System.Net.Mime.ContentDisposition
{
FileName = "LogPosts.zip",
// always prompt the user for downloading, set to true if you want
// the browser to try to show the file inline
Inline = false,
};
Response.AppendHeader("Content-Disposition", cd.ToString());
return File(fileData, "application/octet-stream");
}
You can use:
zip.Save(ms);
// Set read point to beginning of stream
ms.Position = 0;
ZipFile newZip = ZipFile.Read(ms);
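If the end goal is simply to let the user download the in-memory zip (as in the first answer above), you can also return the buffer straight from an MVC action instead of re-reading it. A hedged sketch, where the action name and download file name are placeholders and the entries are added as in the question:

public ActionResult DownloadZip()
{
    using (ZipFile zip = new ZipFile())
    {
        // Add entries here, e.g. zip.AddFile(...) / zip.AddEntry(...) as needed.
        using (var ms = new MemoryStream())
        {
            zip.Save(ms);
            // Send the bytes to the browser; the user gets the usual download prompt.
            return File(ms.ToArray(), "application/zip", "archive.zip");
        }
    }
}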
See the documentation for Create a zip using content obtained from a stream.
using (ZipFile zip = new ZipFile())
{
ZipEntry e= zip.AddEntry("Content-From-Stream.bin", "basedirectory", StreamToRead);
e.Comment = "The content for entry in the zip file was obtained from a stream";
zip.AddFile("Readme.txt");
zip.Save(zipFileToCreate);
}
After saving it, you can then open it up as normal.
I have two bits of code: one uploads a zip file, and a server which saves the upload to the drive. My problem is that I upload a zip file which opens fine in the Windows 7 default zip program, but when I try to open the copy on the web server it was posted to, it won't open anymore, with the error:
Windows cannot open the folder. The compressed zipped folder 'blah' is invalid.
Note 1: The file opens completely fine in WinRAR and other zip programs.
Note 2: The original file and the file on the server take up the same size on disk, but the one on the server is about 200 bytes bigger.
Here is the code for uploading zips:
public static String UploadFile(String url, String filePath)
{
if (!File.Exists(filePath))
throw new FileNotFoundException();
try
{
using (var client = new WebClient())
{
byte[] result = client.UploadFile(url, filePath);
UTF8Encoding enc = new UTF8Encoding();
string response = enc.GetString(result);
return response;
}
}
catch (WebException webException)
{
HttpWebResponse httpWebResponse = webException.Response as HttpWebResponse;
return (httpWebResponse == null) ? webException.Message : httpWebResponse.StatusCode.ToString();
}
}
Here is the code on the server which saves the incoming file (it lives in the Page_Load of a .NET C# ASPX page):
private void SaveZipFile()
{
string fileName;
string zipPath;
fileName = GenerateFileName();
zipPath = _hhDescriptor.GetDirectory(path => Server.MapPath(("./" + _serviceName + "\\" + path)) + "\\" + fileName + ".zip");
if (!Directory.Exists(zipPath))
{
Directory.CreateDirectory(Path.GetDirectoryName(zipPath));
}
Request.SaveAs(zipPath, false);
logger.Trace(string.Format("ManualUpload: Successfully saved uploaded zip file to {0}", zipPath));
}
Any ideas or suggestions about where this could be breaking would be greatly appreciated! I am probably saving some other random stuff along with the zip file.
UPDATE 1:
When I open the server's zip file in notepad it contains
-----------------------8cd8d0e69a0670b Content-Disposition: form-data;
name="file"; filename="filename.zip"
Content-Type: application/octet-stream
So my question is how to save the zip without capturing the header info.
I believe that the problem is using HttpRequest.SaveAs. I suspect that's saving the entire request, including the HTTP headers. Look at the file in a binary file editor and I suspect you'll find the headers at the start.
Use HttpRequest.Files to get at files uploaded as part of the request, and HttpPostedFile.SaveAs to save the file to disk.
You are writing the entirety of the request which may have some Multipart MIME separators in it. I think you need to use Request.Files.
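For illustration, a minimal sketch of the server-side save using Request.Files instead of Request.SaveAs, assuming the question's GenerateFileName, _hhDescriptor, _serviceName and logger helpers are available; "file" is the form-field name visible in the captured headers above:

private void SaveZipFile()
{
    string fileName = GenerateFileName();
    string zipPath = _hhDescriptor.GetDirectory(path => Server.MapPath(("./" + _serviceName + "\\" + path)) + "\\" + fileName + ".zip");

    Directory.CreateDirectory(Path.GetDirectoryName(zipPath));

    // Save only the uploaded file's content, not the raw request with its multipart headers.
    HttpPostedFile uploadedFile = Request.Files["file"];
    if (uploadedFile != null)
    {
        uploadedFile.SaveAs(zipPath);
        logger.Trace(string.Format("ManualUpload: Successfully saved uploaded zip file to {0}", zipPath));
    }
}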