UnauthorizedAccessException in UWP Directory.CreateDirectory - c#

This is a Unity project building to HoloLens 2. I'm trying to create a new folder within Application.persistentDataPath, but it fails with an UnauthorizedAccessException. The strange thing is that it had been working, and it only stopped recently, after seemingly unrelated changes.
Here's the function that is failing.
static DirectoryInfo EnsureDirectory(string subFolder)
{
    Debug.Log($"starting EnsureDirectory() for {subFolder}.");
    string directoryPath = Path.Combine(Application.persistentDataPath, subFolder.ValidatePath());
    Debug.Log($"About to create directory {directoryPath}");
    var dir = Directory.CreateDirectory(directoryPath);
    Debug.Log($"Successfully created directory {dir.FullName}");
    return dir;
}
At Directory.CreateDirectory I get the error as follows:
[HoloLens screenshot of the debug log]
EXCEPTION: UnauthorizedAccessException: Access to the path "C:\" is denied.
This error only happens when deployed to the device; in the Unity editor it works perfectly. I also don't know why it says "C:\" when that's not the path I'm trying to use.
Any help would be greatly appreciated.

Thank you @aybe for the answer that did the job.
This is probably because it checks every path segment, starting with the first one, which is the root and obviously forbidden. Try new DirectoryInfo(Application.persistentDataPath).CreateSubdirectory(...); instead to see if the error vanishes.
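A minimal sketch of that workaround applied to the helper above, assuming the same custom ValidatePath() extension from the question:

static DirectoryInfo EnsureDirectory(string subFolder)
{
    // CreateSubdirectory resolves relative to persistentDataPath itself,
    // so the sandboxed UWP app never probes the forbidden root segments
    // (C:\ and so on) the way Directory.CreateDirectory apparently does.
    var root = new DirectoryInfo(Application.persistentDataPath);
    return root.CreateSubdirectory(subFolder.ValidatePath());
}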

Related

Access outside project folder with ASP.NET Core application on Linux

I'm having some issues trying to access a folder outside the application root folder with an asp.net application deployed on Linux.
The application is currently deployed (for testing purposes) in /home/pbl/projects/pbl-web/.
I want to access a folder of an external hard drive I've mounted on the system, and it's located in /mnt/ExtDist/Data.
I'm using a PhysicalFileProvider with the following path: /mnt/ExtDist/Data. This path is configured in the app.settings.json file and retrieved through the IConfiguration configuration variable.
Here's a part of the code in the Startup.cs file :
public void ConfigureServices(IServiceCollection services)
{
    ...
    var imagePath = configuration.GetSection("PhotoManagementSettings")["ImagesFolderPath"];
    var rootPath = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
    this.imagePhysicalFileProvider = new PhysicalFileProvider(imagePath);
}
I've tried in different ways with no luck so far:
passing the absolute path
passing the relative path and combining with the rootPath variable (see the code above).
The PhysicalFileProvider is getting me the following error:
Unhandled exception. System.IO.DirectoryNotFoundException: /mnt/ExtDist/Data/
Testing the code on Windows with an absolute path, e.g. "C:\Test", works fine.
So something on Linux is failing, but I cannot understand why. Any clues?
Thanks in advance
Paolo
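For reference, a minimal sketch of wiring up the provider with an explicit existence check; the config key and paths are the ones from the question, while the check itself is an assumption about where the failure surfaces:

using System.IO;
using Microsoft.Extensions.FileProviders;

var imagePath = configuration.GetSection("PhotoManagementSettings")["ImagesFolderPath"];

// PhysicalFileProvider throws DirectoryNotFoundException when the root
// folder does not exist OR is not readable by the process user, so
// failing fast with a clearer message narrows the problem down.
if (!Directory.Exists(imagePath))
    throw new DirectoryNotFoundException(
        $"Image folder missing or not readable by this user: {imagePath}");

this.imagePhysicalFileProvider = new PhysicalFileProvider(imagePath);

On Linux, a mount owned by root with no read/execute permission for the account running the app behaves exactly like a missing directory, so checking the permissions on /mnt/ExtDist/Data as that user is a reasonable first step.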

SSIS package fails on file.delete of script task

I have an SSIS package, zip.dtsx. This package runs successfully on serverA. I copied it to serverB; however, when I try to run zip.dtsx on serverB, it fails.
zip.dtsx just reads a file in a source folder, compresses it, saves the compressed file to a different folder, then deletes the original file in the source folder.
After some investigation, I figured out that if I comment out the part of the C# script task that deletes the file in the source folder, the package runs successfully.
I need to delete the file in the source folder; otherwise, it will just be repeatedly loaded to the database. I've already re-added the script task references as suggested here, but I still cannot get the file delete to run successfully.
public void Main()
{
    String sourcePath = Convert.ToString(Dts.Variables["SourcePath"].Value);
    String namePart = Convert.ToString(Dts.Variables["NamePart"].Value);
    String destinationPath = Convert.ToString(Dts.Variables["DestinationPath"].Value);

    FileStream sourceFile = File.OpenRead(sourcePath + namePart);
    FileStream destFile = File.Create(destinationPath + namePart);
    GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);
    try
    {
        int theByte = sourceFile.ReadByte();
        while (theByte != -1)
        {
            compStream.WriteByte((byte)theByte);
            theByte = sourceFile.ReadByte();
        }
    }
    finally
    {
        compStream.Dispose();
        sourceFile.Close();
        destFile.Close();
        File.Delete(sourcePath + namePart);
    }
    Dts.TaskResult = (int)ScriptResults.Success;
}
UPDATE:
After trying the exact same code here, and finding that it did delete the file in my source folder, I updated my code to delete the file the same way it is deleted in the link. However, it still did not work. Below is my updated code.
String sourcePath = Convert.ToString(Dts.Variables["SourcePath"].Value);
String namePart = Convert.ToString(Dts.Variables["NamePart"].Value);
String destinationPath = Convert.ToString(Dts.Variables["DestinationPath"].Value);

FileStream sourceFile = File.OpenRead(sourcePath + namePart);
FileStream destFile = File.Create(destinationPath + namePart);
GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);
try
{
    int theByte = sourceFile.ReadByte();
    while (theByte != -1)
    {
        compStream.WriteByte((byte)theByte);
        theByte = sourceFile.ReadByte();
    }
}
finally
{
    compStream.Dispose();
    sourceFile.Close();
    destFile.Close();
    FileInfo currFileInfo = new FileInfo(sourcePath + namePart);
    currFileInfo.Delete();
}
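For what it's worth, a sketch of the same compress-then-delete flow using using blocks, so every handle is guaranteed to be released before the delete runs, with Stream.CopyTo (available from .NET 4.0) in place of the byte-by-byte loop; the package variables are assumed to be the same as above:

string sourceFilePath = sourcePath + namePart;
string destFilePath = destinationPath + namePart;

using (FileStream source = File.OpenRead(sourceFilePath))
using (FileStream dest = File.Create(destFilePath))
using (GZipStream comp = new GZipStream(dest, CompressionMode.Compress))
{
    source.CopyTo(comp);  // equivalent to the original ReadByte/WriteByte loop
}

// All streams are closed here, so the delete can no longer fail
// because of our own open handle on the source file.
File.Delete(sourceFilePath);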
I've finally figured this out.
To explain the whole situation, we have a fully running sqlserver, serverA. We want to duplicate this on serverB so that we can have a test environment at serverB. The necessary databases are restored on serverB already, all that's left are the SSIS packages and SQL Server Agent Jobs.
The main package (main.dtsx) that loads the contents of the files to the DB was failing.
From Integration Services Catalog -> Catalog folder -> right click -> Reports -> All Executions, I learned that it was failing on Zip.dtsx, which is called from main.dtsx. Zip.dtsx compresses the accessed file, archives it, and deletes it from the source folder.
After playing around with the script task in Zip.dtsx (an idea I got from Kannan Kandasamy's comments), I figured out that it fails at the File.Delete() part. Instantly, one would think it is a permission issue.
My first mistake was that I kept working on the script task while executing Zip.dtsx via right click -> Execute Task in Visual Studio. I kept getting the runtime error screen-captured in my previous post, without realizing that I was getting it because I was using package variables passed by main.dtsx to Zip.dtsx. I was hung up on this until I figured out why the script task runs successfully when I replace the variables with hardcoded path names.
My second mistake was to replace the package variables of Zip.dtsx with my hardcoded paths. Finally, I realized that the folder accessed by Zip.dtsx is a local folder on serverB, while I was running the SQL Server Agent jobs from my local machine, say machineA. So I shared serverB's local folders with my user account. For some reason, that messed up my package so badly that it no longer saw the files in the folder, and the package then "succeeded" because it found the folder empty.
The main solution:
I reverted my changes to Zip.dtsx, removed the sharing of serverB's local folders with my user account, and instead gave NT Service\SQL$Instance full control over the source folder where the script task deletes the files. See this link for how to add the SQLServer$Instance to the folder permission settings.
Other issues I encountered are:
When you transfer packages between servers and they fail, I think this is also important. I also did this step, but on serverB I couldn't find Microsoft.SQLServer.ManagedDTS.dll, so I copied the dll from serverA to serverB. Also check your system path to see which version of the Microsoft SQL Server tools you are using by default, and make sure that is what you reference in your script task.
At some point while investigating, I encountered an error when viewing the Integration Services Catalog's All Executions report. I solved this by going to the C:\Temp folder. When I double-clicked to open it, a dialog appeared saying I do not currently have access; I clicked Continue, and after that the All Executions report ran again.
The error I got from SSMS is,
TITLE: Microsoft SQL Server Management Studio
------------------------------
An error occurred during local report processing.
(Microsoft.ReportViewer.WinForms)
------------------------------
ADDITIONAL INFORMATION:
The definition of the report '' is invalid.
(Microsoft.ReportViewer.Common)
An unexpected error occurred while compiling expressions. Native
compiler return value: '[BC2001] file 'C:\Windows\TEMP\mpgc21o3.0.vb'
could not be found'. (Microsoft.ReportViewer.Common)
------------------------------
BUTTONS:
OK
I'm not sure how much of this really mattered in my case, but I'm sharing it in case someone runs into the same problem. I am a beginner with MS SQL Server, so I hope this helps other beginners like me.

System.IO.Directory.GetDirectories not working after publish

I have a C# application that gets a list of directories inside a folder. This is done using the call
String[] projects = System.IO.Directory.GetDirectories("path/to/folder", "*", System.IO.SearchOption.TopDirectoryOnly);
This works fine on my machine, but after publishing (resulting in a setup.exe, as well as programName.application + Application Files), I ran the program on a new machine and it threw an unhandled exception.
The error was about being unable to connect to a database, but the interesting part is that it complained about the path not being valid, listing a path that only exists on my machine.
Does System.IO.Directory.GetDirectories not get reinitialized when running on another machine?
I guess the problem is with path/to/folder, as that path might not exist on the new machine. Don't hard-code the path; instead, read it from a config file (app.config via ConfigurationManager).
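A minimal sketch of that suggestion; the ProjectsFolder key is made up for illustration:

using System.Configuration;  // requires a reference to System.Configuration

// App.config:
// <appSettings>
//   <add key="ProjectsFolder" value="D:\data\projects" />
// </appSettings>

string folder = ConfigurationManager.AppSettings["ProjectsFolder"];
String[] projects = System.IO.Directory.GetDirectories(
    folder, "*", System.IO.SearchOption.TopDirectoryOnly);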

Access to the path 'autorun.inf' is denied when File.Copy is executed

I get the following exception when I try to copy a file in my Windows Application...
Access to the path 'autorun.inf' is denied.
The file I am trying to copy is a shortcut file. However, when I move the EXACT line of code into a standalone application, it executes flawlessly. Same user credentials, and no difference in how I launch either application.
This is the line of code...
File.Copy(
    Path.Combine(Path.GetDirectoryName(Application.ExecutablePath), "JEE Startup.lnk"),
    @"C:\Documents and Settings\All Users\Start Menu\Programs\Startup\JEE Startup.lnk",
    true);
I tried deleting the bin folder and rebuilding. No luck.
I have no idea what is wrong.
The application is being executed on an XP machine.
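As an aside (this doesn't explain the access-denied error itself), the all-users Startup folder can be resolved at runtime instead of hard-coding the XP-era path; Environment.SpecialFolder.CommonStartup is available from .NET 4.0:

using System;
using System.IO;
using System.Windows.Forms;

string startupDir = Environment.GetFolderPath(Environment.SpecialFolder.CommonStartup);
string shortcut = Path.Combine(Path.GetDirectoryName(Application.ExecutablePath), "JEE Startup.lnk");
File.Copy(shortcut, Path.Combine(startupDir, "JEE Startup.lnk"), true);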

WmAutoUpdate - anyone used it? Won't roll back

I've built a Compact Framework application and I'm using WmAutoUpdate to deploy new versions to the mobile devices (http://www.sebastianvogelsang.com/2009/09/23/wmautoupdate-a-net-compact-framework-auto-update-library/). Has anyone used this? It's cool but I've got a problem.
If I cause the application to crash half-way through updating, it is supposed to recover by copying the backup version back into the main directory. This doesn't work because the exe file is locked by the operating system while it is in use; I can verify this because I can't delete it using Windows Explorer either. The error details are:
System.IO.IOException was unhandled
  Message="IOException"
  StackTrace:
    at System.IO.__Error.WinIOError(Int32 errorCode, String str)
    at System.IO.File.Move(String sourceFileName, String destFileName)
    at WmAutoUpdate.Updater.assertPreviousUpdate()
    at WmAutoUpdate.Updater..ctor(String url)
The error occurs on this line in Updater.assertPreviousUpdate():
File.Move(f, appPath + "\\" + getFilenameFromPath(f));
The code manages to update the application exe file when it's allowed to run normally (I'm not sure how). The problem is that it doesn't work when rolling back.
Cheers
Mark
I've used WmAutoUpdate and I've hit the same problem. The issue is that you can move the files of the running process, but you cannot overwrite them. If you look at the update part, WmAutoUpdate moves the running application to a backup directory and then writes the updated version to the original directory. I fixed the rollback part this way:
if (Directory.Exists(backupDir))
{
    // Move the (still running) application out of the way first,
    // then restore the backup into the original directory.
    string tmpDir = Path.Combine(Path.GetTempPath(), Path.GetFileNameWithoutExtension(Path.GetTempFileName()));
    Directory.Move(appPath, tmpDir);
    Directory.Move(backupDir, appPath);
}
First we move the running application's files to a random directory in Temp; then we move the backup folder back to the application's original directory. Of course, this leaves a .TMP file in the Temp directory of your device, plus a folder with the files of the still-running process. You will have to delete these temporary folders once in a while in production code, as in the sketch below.
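A sketch of that occasional cleanup, run at some safe point after a successful start. It assumes nothing else you care about lives under the device's Temp path, so treat the blanket delete as hypothetical and narrow it to your own folder names in real code:

// Best-effort removal of leftover rollback folders from earlier runs.
// They are safe to delete now because no process runs from them anymore.
foreach (string dir in Directory.GetDirectories(Path.GetTempPath()))
{
    try
    {
        Directory.Delete(dir, true);
    }
    catch (IOException)
    {
        // Still locked (e.g. a rollback in progress): skip, retry next run.
    }
}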
