After looking at many related topics, I decided to ask this.
I have a WCF service that reads a file from the local file system. When the service is tested locally on my computer, it has no problem doing that.
But when I publish the service in IIS 8, I get this error:
The system cannot find the file specified
I have tried creating a new user and a new application pool that runs the service under that identity, and I have also given that user full control over the folder being read, but the problem continues.
I have even tried using the Administrator account as the identity of the new application pool, but that did not solve the problem either.
What am I missing?
Assuming that you have a relative path and that the account running the application has the proper permissions, you're probably not getting the correct path name to your file.
You can try something like this to find the full path of your file:
using System;
using System.IO;
using System.Web.Hosting; // needed for HostingEnvironment.ApplicationPhysicalPath

public FileInfo GetFileInfo(string filename)
{
    if (filename == null)
        throw new ArgumentNullException("filename");

    FileInfo info = new FileInfo(filename);
    if (info.Exists)
    {
        // The path is already resolvable as given (rooted, or relative to the current directory).
        return info;
    }

    if (!Path.IsPathRooted(filename))
    {
        // Probe the likely base directories for a relative file name.
        string[] paths = {
            Environment.CurrentDirectory,
            AppDomain.CurrentDomain.BaseDirectory,
            HostingEnvironment.ApplicationPhysicalPath,
        };

        foreach (var path in paths)
        {
            if (path != null)
            {
                string file = Path.Combine(path, filename);
                if (File.Exists(file))
                {
                    return new FileInfo(file);
                }
            }
        }
    }

    throw new FileNotFoundException("Could not find the requested file", filename);
}
It returns an instance of System.IO.FileInfo, but you can easily adapt it to return a string (the full path name).
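For example, a caller might resolve and read a file like this (the file name is purely illustrative):
// Hypothetical usage: resolve "Settings.xml" against the probed base
// directories and read it, regardless of the process's current directory.
FileInfo settings = GetFileInfo("Settings.xml");
string contents = File.ReadAllText(settings.FullName);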
Related
I am trying to merge some .txt files from a folder in a Xamarin.Forms app.
The path is selectable by the user (I've created a simple folder browser for that).
For example, the source path is
/storage/1EE7-170F/
I collect all the .txt files in that folder, read them using File.ReadAllText, and do some processing.
That is working fine so far.
I also want to write the results to a new file in the same folder using File.WriteAllText. Target file would be:
/storage/1EE7-170F/SomeResultFile.txt
Getting the directory's contents is done in AppName.Android:
var fileContents = Directory.GetFiles(baseDirectory.FullPath, searchPattern)
    .Select(fullPath =>
    {
        var fileName = Path.GetFileName(fullPath);
        var extension = Path.GetExtension(fullPath);
        return new FileContent(fileName, fullPath, extension);
    });
Reading the files is done like this in the common project:
var content = File.ReadAllText(fileContent.FullPath);
The files are written like this (also in the common project):
var baseDir = Path.GetDirectoryName(sourceFiles.First().FullPath);
var targetFilePath = Path.Combine(baseDir, filename);
File.WriteAllLines(targetFilePath, lines);
But I always get an UnauthorizedAccessException saying 'Access to the path "/storage/1EE7-170F/SomeResultFile.txt" is denied.'
I wonder why I can read from that folder, but not write to it.
Writing to System.Environment.GetFolderPath(System.Environment.SpecialFolder.Personal) works fine, but that's not what I need.
I need the result files in the same directory as the source files.
I have both the "READ_EXTERNAL_STORAGE" and "WRITE_EXTERNAL_STORAGE" permissions declared in the manifest.
In MainActivity.OnCreate I have added the following:
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.WriteExternalStorage) != (int)Permission.Granted)
{
    ActivityCompat.RequestPermissions(this, new string[] { Manifest.Permission.WriteExternalStorage }, 0);
}
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.ReadExternalStorage) != (int)Permission.Granted)
{
    ActivityCompat.RequestPermissions(this, new string[] { Manifest.Permission.ReadExternalStorage }, 0);
}
I've tried this both on the Android emulator and on an S10 with the latest Android installed (on the S10 I selected the "Documents" folder). Same result on both systems with different folders...
Any ideas what could be the problem?
Thanks a lot for your help!
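As a side note, the two runtime checks above could be folded into a single request; a minimal sketch, assuming the same MainActivity.OnCreate (the request code 0 is arbitrary):
// Request both storage permissions in one call if either is missing.
string[] storagePermissions =
{
    Manifest.Permission.ReadExternalStorage,
    Manifest.Permission.WriteExternalStorage
};
if (ContextCompat.CheckSelfPermission(this, Manifest.Permission.ReadExternalStorage) != (int)Permission.Granted ||
    ContextCompat.CheckSelfPermission(this, Manifest.Permission.WriteExternalStorage) != (int)Permission.Granted)
{
    ActivityCompat.RequestPermissions(this, storagePermissions, 0);
}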
I have the following lines of code that work for creating a zip using ZipFile.CreateFromDirectory(selectedFolder, zipPath):
if (selectedFolder == string.Empty)
{
    Console.WriteLine("Invalid folder, try again");
}
else
{
    Console.WriteLine("\nSelect zipfile name: ");
    var zipName = Console.ReadLine();
    // Also available: extractToDirectory
    var zipPath = @"C:\Users\User\Documents\Dev\" + zipName + ".zip";
    ZipFile.CreateFromDirectory(selectedFolder, zipPath);
}
However, the following code, which should for all intents and purposes do the same thing except that multiple files are archived into a single zip, refuses to work:
public static void CreateZipFile(string folderToCreateZip, IEnumerable<string> files)
{
    var zipPath = folderToCreateZip + "\\test6.zip";
    // Create a new ZIP in this location
    using (var zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
    {
        foreach (var file in files)
        {
            // Add entry for files
            zip.CreateEntryFromFile(file, zipPath, CompressionLevel.Optimal);
        }
    }
    // Dispose of zip object after files have been zipped
    //zip.Dispose();
}
(Here var zip is a ZipArchive.)
I've tried disabling read-only mode on the folders where the zip should get created, but I don't think this matters, since the prior function with CreateFromDirectory() works fine. I've also tried creating a ZIP on the desktop, but I get the same error.
This is the exception I'm getting:
As a note, I noticed that it does initially create the zip despite this error; it just cannot add anything to it (unlike CreateFromDirectory() can), because the file is either in use, I have no permissions to that area, or it already exists. Is there a way I can get CreateEntryFromFile() working, or an alternative that would work for multiple files?
I had the same problem. The solution was to pass the full path name as the destinationArchiveFileName parameter (and also a path where writing is allowed), for example C:\my apps folder\my app\my temp\zipfile.zip.
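A minimal sketch along those lines, assuming folderToCreateZip is a fully qualified path the process is allowed to write to; note also that CreateEntryFromFile expects the entry name inside the archive (not the archive path) as its second argument:
using System.Collections.Generic;
using System.IO;
using System.IO.Compression;

public static void CreateZipFile(string folderToCreateZip, IEnumerable<string> files)
{
    // folderToCreateZip is assumed to be a full, writable path.
    var zipPath = Path.Combine(folderToCreateZip, "test6.zip");

    using (var zip = ZipFile.Open(zipPath, ZipArchiveMode.Create))
    {
        foreach (var file in files)
        {
            // Second argument is the entry name inside the archive.
            zip.CreateEntryFromFile(file, Path.GetFileName(file), CompressionLevel.Optimal);
        }
    }
}
Passing Path.GetFileName(file) keeps each file at the root of the archive instead of reusing the archive's own path as the entry name.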
I have two network shares that both have dedicated accounts (network credentials) to access them.
For example:
\\RemoteComputer1\Folder - user1
\\RemoteComputer2\Folder - user2
user1 doesn't have access to the local computer or to \\RemoteComputer2\Folder; likewise, user2 doesn't have access to the local computer or to \\RemoteComputer1\Folder.
I need to be able to do three types of operations:
Copy a local file to \\RemoteComputer1\Folder
Copy a file from \\RemoteComputer1\Folder to \\RemoteComputer2\Folder
Copy a file from \\RemoteComputer1\Folder to a local folder
Right now I'm using https://github.com/mj1856/SimpleImpersonation to get a source stream and a target stream, and then I'm copying from source to target using Stream.CopyTo.
Below is my current code:
public static void CopyWithCredentials(string @sourcePath, string destinationPath, NetworkCredential readUser = null, NetworkCredential writeUser = null)
{
    // same user, do normal File.Copy
    if (readUser != null && writeUser != null && readUser == writeUser)
    {
        using (Impersonation.LogonUser(readUser.Domain, readUser.UserName, readUser.Password, LogonType.NewCredentials))
        {
            if (!Directory.Exists(Path.GetDirectoryName(destinationPath)))
            {
                Directory.CreateDirectory(Path.GetDirectoryName(destinationPath));
            }
            File.Copy(@sourcePath, destinationPath);
            return;
        }
    }

    FileStream sourceStream;
    if (readUser != null)
    {
        using (Impersonation.LogonUser(readUser.Domain, readUser.UserName, readUser.Password, LogonType.NewCredentials))
        {
            sourceStream = new FileStream(@sourcePath, FileMode.OpenOrCreate, System.IO.FileAccess.Read);
        }
    }
    else
    {
        sourceStream = new FileStream(@sourcePath, FileMode.OpenOrCreate, System.IO.FileAccess.Read);
    }

    FileStream destinationStream;
    if (writeUser != null)
    {
        using (Impersonation.LogonUser(writeUser.Domain, writeUser.UserName, writeUser.Password, LogonType.NewCredentials))
        {
            if (!Directory.Exists(Path.GetDirectoryName(destinationPath)))
            {
                Directory.CreateDirectory(Path.GetDirectoryName(destinationPath));
            }
            while (File.Exists(destinationPath))
            {
                string fileName = Path.GetFileNameWithoutExtension(destinationPath);
                string newFileName = fileName + "1";
                destinationPath = destinationPath.Replace(fileName, newFileName);
            }
            destinationStream = File.Create(destinationPath);
        }
    }
    else
    {
        if (!Directory.Exists(Path.GetDirectoryName(destinationPath)))
        {
            Directory.CreateDirectory(Path.GetDirectoryName(destinationPath));
        }
        destinationStream = File.Create(destinationPath);
    }

#warning is this enough for closing streams after copying?
    using (sourceStream)
    {
        using (destinationStream)
        {
            sourceStream.CopyTo(destinationStream);
        }
    }
    sourceStream.Dispose();
    destinationStream.Dispose();
    sourceStream = null;
    destinationStream = null;
}
I must admit this looks ugly and over-complicated.
The problem is that I have multiple files in one folder and I want to copy all of them to a second folder. With my approach I call LogonUser twice for each file, so for 1000 files I must call it 2000 times. Ideally I'd like to call LogonUser only twice (once for the first folder and a second time for the second folder) and copy all the files in that "session" (a rough sketch of this idea is shown below).
I would like to use File.Copy because it uses the native kernel32.dll function (https://stackoverflow.com/a/1247092/965722), and as I found out in the same question, File.Copy has improved a lot since Vista SP1.
I also found a question about File.Copy and stream speed, and it looks like they should be about equal, because File.Copy uses streams.
My question is: can this be done more simply, without using streams?
I don't want to create a super-admin account that has access everywhere.
I'd like to avoid WNetUseConnection because it can leave open connections.
I don't want to add per-file permissions, as mentioned in the comments to this question.
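A rough sketch of that two-call structure, not a drop-in solution: it relies on the same assumption the code above already makes (file handles opened while impersonating remain usable after the scope ends) and trades the per-file LogonUser calls for holding the source handles open while the destination scope runs. The method and folder names are illustrative.
using System.Collections.Generic;
using System.IO;
using System.Net;
using SimpleImpersonation;

public static void CopyFolderWithCredentials(string sourceFolder, string destinationFolder,
    NetworkCredential readUser, NetworkCredential writeUser)
{
    var sourceStreams = new List<FileStream>();

    // One impersonation for the whole source share: enumerate and open every file.
    using (Impersonation.LogonUser(readUser.Domain, readUser.UserName, readUser.Password, LogonType.NewCredentials))
    {
        foreach (var file in Directory.GetFiles(sourceFolder))
        {
            sourceStreams.Add(new FileStream(file, FileMode.Open, FileAccess.Read));
        }
    }

    // One impersonation for the whole destination share: create targets and copy.
    using (Impersonation.LogonUser(writeUser.Domain, writeUser.UserName, writeUser.Password, LogonType.NewCredentials))
    {
        Directory.CreateDirectory(destinationFolder);
        foreach (var source in sourceStreams)
        {
            var targetPath = Path.Combine(destinationFolder, Path.GetFileName(source.Name));
            using (source)
            using (var target = File.Create(targetPath))
            {
                source.CopyTo(target);
            }
        }
    }
}
Keeping many source handles open at once is the trade-off; for very large folders the files could be processed in batches instead.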
I have seen so many working examples of file uploading with MVC.
However, I want to follow a different approach and add a little abstraction, as follows:
I want to introduce a FileService and inject it into the controller as a dependency. The service uploads the file and returns an UploadedFile object.
The problem I am having right now is uploading to the correct place/directory in the file system or the application root.
In a controller I have access to the Server object, on which I can call Server.MapPath and it does the magic; in the class below I can't access that object, since it is not a controller.
How can I upload to anywhere in the file system or in the project root from the class below?
public class FileService : IFileService
{
    private const string UploadBase = "/Files";

    public File UploadFile(HttpPostedFileBase file)
    {
        if (file != null)
        {
            string folder = DateTime.Today.Month + "-" + DateTime.Today.Year;
            string finalFolder = Path.Combine(UploadBase, folder);
            if (!Directory.Exists(finalFolder))
            {
                return Directory.CreateDirectory(finalFolder);
            }
            var filename = UploadFile(file, directoryInfo.Name + "/");
            var newFile = new File { ContentType = file.ContentType, FilePath = filename, Filename = file.FileName };
            return newFile;
        }
        return null;
    }
The error I get is:
The SaveAs method is configured to require a rooted path, and the path '9-2013/037a9ddf-7ffe-4131-b223-c4b5435d0fed.JPG' is not rooted.
Restating what was noted in the comments:
If you want to map a virtual path to the physical path outside of a controller, you can always use the HostingEnvironment.MapPath method.
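For instance, the service could root its upload folder like this; a sketch only, where GetUploadFolder is an illustrative helper and "~/Files" is the app-relative form of the UploadBase constant above:
// Requires System.IO and System.Web.Hosting.
public string GetUploadFolder()
{
    string folder = DateTime.Today.Month + "-" + DateTime.Today.Year;

    // HostingEnvironment.MapPath works outside controllers and returns a rooted physical path.
    string physicalBase = HostingEnvironment.MapPath("~/Files");

    string finalFolder = Path.Combine(physicalBase, folder);
    Directory.CreateDirectory(finalFolder); // no-op if the folder already exists
    return finalFolder;
}
file.SaveAs can then be given a path combined from finalFolder and the file name, which satisfies the rooted-path requirement from the error above.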
I have written two methods, FileUpLoad() and FileDownLoad(), to upload and download a single file on my local system.
void FileUpLoad()
{
    string hfBrowsePath = fuplGridDocs.PostedFile.FileName; // fuplGridDocs is a FileUpload control
    if (hfBrowsePath != string.Empty)
    {
        string destfile = string.Empty;
        string FilePath = Path.Combine(@"E:\Documents\");
        FileInfo FP = new FileInfo(hfBrowsePath);
        hfFileNameAutoGen.Value = PONumber + FP.Extension;
        destfile = FilePath + hfFileNameAutoGen.Value; // hfFileNameAutoGen is a hidden field
        fuplGridDocs.PostedFile.SaveAs(destfile);
    }
}

void FileDownLoad(LinkButton lnkFileName)
{
    string filename = lnkFileName.Text;
    string FilePath = Path.Combine(@"E:\Documents", filename);
    fuplGridDocs.SaveAs(FilePath);
    FileInfo fileToDownLoad = new FileInfo(FilePath);
    if (fileToDownLoad.Exists)
    {
        Process.Start(fileToDownLoad.FullName);
    }
    else
    {
        lblMessage.Text = "File Not Saved!";
        return;
    }
}
While running the application before hosting it in IIS, I can upload a file to the desired location and can also retrieve a file from the saved location. But after publishing it on the local host, I can only upload a file; I cannot download the saved file, and there is no exception either. The uploaded file is saved in the desired location, so I don't know why it is not retrieving the file. Why can't I download the file under IIS? I have searched a lot on the internet but couldn't find a solution. How do I solve this? I am using Windows XP and IIS 5.1.
How do you expect your web application to do a Process.Start when you deploy this site to a server? You're just going to be opening the pictures on the server, not on the client PC.
I think this will answer your question: http://www.codeproject.com/Articles/74654/File-Download-in-ASP-NET-and-Tracking-the-Status-o
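In the spirit of the linked article, a minimal sketch of streaming the saved file back through the response instead of calling Process.Start, assuming the same E:\Documents location and that this runs in the page's code-behind:
void FileDownLoad(LinkButton lnkFileName)
{
    string filename = lnkFileName.Text;
    string filePath = Path.Combine(@"E:\Documents", filename);

    if (File.Exists(filePath))
    {
        // Stream the file to the browser instead of opening it on the server.
        Response.Clear();
        Response.ContentType = "application/octet-stream";
        Response.AddHeader("Content-Disposition", "attachment; filename=\"" + filename + "\"");
        Response.TransmitFile(filePath);
        Response.End();
    }
    else
    {
        lblMessage.Text = "File Not Saved!";
    }
}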
Also, the download file path is missing a slash after E:\Documents.
Another option is to add a wildcard to the IIS MIME types.