I'm probably being blind, but this has me stumped. The following unit test fails on my local machine (but not on our build server) with a System.UnauthorizedAccessException.
[TestMethod()]
public void UserPreferences_LocalApplicationDataPath_PathAlwaysExists()
{
    // setup: get the proposed path, then delete the bottom folder and any existing files
    string logPath = Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData);
    string actual = System.IO.Path.Combine(logPath, @"Autoscribe\Matrix\Gemini");

    // if the folder is there, first delete it
    if (Directory.Exists(actual))
    {
        // log some information about what is going on
        Console.WriteLine("Attempting to delete " + actual + " as user " + Environment.UserName);
        Directory.Delete(actual, true); // <- THROWS EXCEPTION HERE
        Console.WriteLine("Deleted");
    }

    // action
    actual = UserPreferencesManager.LocalApplicationDataPath;

    // assert that getting the path forces it to be created
    Assert.IsTrue(Directory.Exists(actual));
}
The reported Environment.UserName value is the Windows user who 'owns' the local AppData folder. Why can't the folder be deleted? Oddly, I don't have this problem on all of the machines the test runs on (all Windows 7).
You may not be able to delete the folder locally because other applications are still running that access files in it. Maybe you have Notepad open to check a log file in that folder?
The build server does not have this problem because there are no users or processes actually "working" on the machine.
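If you suspect a transient lock (an open editor, an indexer, antivirus), one way to make the setup step more robust is to retry the delete a few times before giving up. This is a minimal sketch, not part of the original test; the retry count and delay are arbitrary choices:

using System;
using System.IO;
using System.Threading;

static void DeleteDirectoryWithRetry(string path, int attempts)
{
    for (int i = 0; i < attempts; i++)
    {
        try
        {
            if (Directory.Exists(path))
                Directory.Delete(path, true);
            return; // deleted, or never existed
        }
        catch (UnauthorizedAccessException)
        {
            if (i == attempts - 1) throw;
            Thread.Sleep(500); // something still holds a handle; wait and retry
        }
        catch (IOException)
        {
            if (i == attempts - 1) throw;
            Thread.Sleep(500);
        }
    }
}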
In my .NET Core web API, I'm using the OpenHtmlToPdf NuGet package for rendering HTML documents to PDF format. The implementation works fine locally, but not in the production Kubernetes environment. I'm getting the following message in the logs.
Permission denied
After several workarounds I was able to find out that the OpenHtmlToPdf library uses Path.GetTempPath() and Guid.NewGuid() to create a temp file, and the permission denied error above seems to occur when it tries to access that temp path. This is the relevant code snippet from the OpenHtmlToPdf lib. OpenHtmlToPdf git repo
// inside TemporaryPdf class
public static string TemporaryFilePath()
{
    return Path.Combine(Path.GetTempPath(), "OpenHtmlToPdf", TemporaryPdf.TemporaryFilename());
}

private static string TemporaryFilename()
{
    return Guid.NewGuid().ToString("N") + ".pdf";
}
I added the following line to my application to determine the temp path, and it returns File path is: /tmp/.
Console.WriteLine("File path is: " + Path.GetTempPath());
But in the Kubernetes pods, the /tmp/ folder has rwx permissions for all users.
My problem is: after the API is deployed, do the NuGet packages use a separate temp path? If so, how exactly do I identify it?
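One way to check, from inside the running pod, is to write a file exactly the way the library does. Below is a minimal sketch (the subfolder name mirrors the library's TemporaryFilePath() above). Note that on Linux, Path.GetTempPath() honors the TMPDIR environment variable and falls back to /tmp/, so the effective path can differ per deployment:

using System;
using System.IO;

// Recreate the library's temp layout and try to write to it.
string dir = Path.Combine(Path.GetTempPath(), "OpenHtmlToPdf");
Console.WriteLine("Temp dir: " + dir);

Directory.CreateDirectory(dir); // throws if the process may not create it
string file = Path.Combine(dir, Guid.NewGuid().ToString("N") + ".pdf");
File.WriteAllBytes(file, new byte[] { 0x25, 0x50, 0x44, 0x46 }); // "%PDF"
Console.WriteLine("Wrote " + file); // if this prints, write access is fine
File.Delete(file);

If this fails with the same permission error, the issue lies in the pod's filesystem permissions (for example a read-only root filesystem) rather than anything NuGet-specific; a NuGet package runs inside your process and does not get a separate temp path of its own.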
I have an SSIS package, zip.dtsx. This package runs successfully on serverA. I copied the package to serverB; however, when I try to run zip.dtsx on serverB, it fails.
zip.dtsx just reads a file in a source folder, compresses it, saves the compressed file to a different folder, then deletes the original file in the source folder.
After some investigation, I figured out that if I comment out the part of the C# script task that deletes the file in the source folder, the package runs successfully.
I need to delete the file in the source folder; otherwise, the file will just be loaded into the database repeatedly. I've already re-added the script task references as suggested here, but I still cannot get File.Delete to run successfully.
public void Main()
{
    String sourcePath = Convert.ToString(Dts.Variables["SourcePath"].Value);
    String namePart = Convert.ToString(Dts.Variables["NamePart"].Value);
    String destinationPath = Convert.ToString(Dts.Variables["DestinationPath"].Value);

    FileStream sourceFile = File.OpenRead(sourcePath + namePart);
    FileStream destFile = File.Create(destinationPath + namePart);
    GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);

    try
    {
        int theByte = sourceFile.ReadByte();
        while (theByte != -1)
        {
            compStream.WriteByte((byte)theByte);
            theByte = sourceFile.ReadByte();
        }
    }
    finally
    {
        compStream.Dispose();
        sourceFile.Close();
        destFile.Close();
        File.Delete(sourcePath + namePart);
    }

    Dts.TaskResult = (int)ScriptResults.Success;
}
UPDATE:
After trying the exact same code here, and finding out that that code did delete the file in my source folder, I updated my code to follow the way the file was deleted in the link. However, it still did not work. Below is how I updated my code.
String sourcePath = Convert.ToString(Dts.Variables["SourcePath"].Value);
String namePart = Convert.ToString(Dts.Variables["NamePart"].Value);
String destinationPath = Convert.ToString(Dts.Variables["DestinationPath"].Value);

FileStream sourceFile = File.OpenRead(sourcePath + namePart);
FileStream destFile = File.Create(destinationPath + namePart);
GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress);

try
{
    int theByte = sourceFile.ReadByte();
    while (theByte != -1)
    {
        compStream.WriteByte((byte)theByte);
        theByte = sourceFile.ReadByte();
    }
}
finally
{
    compStream.Dispose();
    sourceFile.Close();
    destFile.Close();
    FileInfo currFileInfo = new FileInfo(sourcePath + namePart);
    currFileInfo.Delete();
}
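For what it's worth, the same compress-then-delete step can also be written with using blocks, so the streams are guaranteed to be released before the delete, and the delete only runs after a successful compression (the original finally deletes the source even when compression throws). A sketch under the same variable names, assuming the script task targets .NET 4 or later for Stream.CopyTo:

string source = sourcePath + namePart;
string destination = destinationPath + namePart;

using (FileStream sourceFile = File.OpenRead(source))
using (FileStream destFile = File.Create(destination))
using (GZipStream compStream = new GZipStream(destFile, CompressionMode.Compress))
{
    sourceFile.CopyTo(compStream); // replaces the manual ReadByte loop
}

// All handles are released here, so nothing of ours blocks the delete.
File.Delete(source);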
I've finally figured this out.
To explain the whole situation: we have a fully running SQL Server, serverA. We want to duplicate this on serverB so that we can have a test environment on serverB. The necessary databases are already restored on serverB; all that's left are the SSIS packages and SQL Server Agent jobs.
The main package (main.dtsx) that loads the contents of the files to the DB was failing.
From Integration Services Catalog -> Catalog folder -> right click -> Reports -> All Executions, I learned that it is failing in Zip.dtsx, which is called from main.dtsx. Zip.dtsx compresses the file it accesses, archives it, and deletes it from the source folder.
After playing around with the script task in Zip.dtsx (an idea I got from Kannan Kandasamy's comments), I figured out that the script task fails at the File.Delete() part. Instantly, one would think it is a permission issue.
My first mistake was that I kept playing with the script task while executing Zip.dtsx via right click -> Execute Task in Visual Studio. I kept getting the runtime error screencaptured in my previous post, without realizing I was getting it because I was using package variables passed by main.dtsx to Zip.dtsx. I was hung up on this until I figured out why the script task runs successfully when I replace the variables with hardcoded path names.
My second mistake was to replace the package variables of Zip.dtsx with my hardcoded paths. Finally, I realized that the folder accessed by Zip.dtsx is a local folder on serverB, while I was running the SQL Server Agent jobs from my local machine, say machineA. So I shared serverB's local folders with my user account. For some reason that messed up my package badly: it no longer saw the files in the folder, so the package 'succeeded' simply because it found the folder empty.
The main solution:
I reverted the changes I had made to Zip.dtsx, removed the sharing of serverB's local folders with my user account, and instead gave NT Service\SQL$Instance full control over the source folder where the script task deletes the files. See this link for how to add the SQLServer$Instance account to the folder's permission settings.
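If you prefer to grant that permission from code rather than through the folder's Properties dialog, the .NET Framework access-control API can do the same thing. This is a hedged sketch: the folder path and the service account name are placeholders, so substitute your own instance's account:

using System.IO;
using System.Security.AccessControl;

// Grant the SQL Server service account full control over the source folder.
DirectoryInfo folder = new DirectoryInfo(@"D:\SourceFolder"); // placeholder path
DirectorySecurity security = folder.GetAccessControl();
security.AddAccessRule(new FileSystemAccessRule(
    @"NT SERVICE\MSSQL$INSTANCE", // placeholder; use your instance's account
    FileSystemRights.FullControl,
    InheritanceFlags.ContainerInherit | InheritanceFlags.ObjectInherit, // subfolders and files
    PropagationFlags.None,
    AccessControlType.Allow));
folder.SetAccessControl(security);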
Other issues I encountered are:
When you transfer packages between servers and the packages fail, I think this is also important. I did this step as well, but on serverB I couldn't find Microsoft.SQLServer.ManagedDTS.dll, so I copied the DLL from serverA to serverB. Also check your system path to see which version of the Microsoft SQL Server tools is used by default, and make sure that is what you reference in your script task.
At some point while investigating, I encountered an error when viewing the Integration Services Catalog's All Executions report. I solved this by going to the C:\Temp folder. When I double-clicked to open it, a dialog appeared saying I did not currently have access. I clicked Continue, and after that the All Executions report ran again.
The error I got from SSMS is,
TITLE: Microsoft SQL Server Management Studio

An error occurred during local report processing.
(Microsoft.ReportViewer.WinForms)

------------------------------
ADDITIONAL INFORMATION:

The definition of the report '' is invalid.
(Microsoft.ReportViewer.Common)

An unexpected error occurred while compiling expressions. Native compiler return value: ‘[BC2001] file 'C:\Windows\TEMP\mpgc21o3.0.vb' could not be found’. (Microsoft.ReportViewer.Common)

------------------------------
BUTTONS:

OK
I'm not sure how much of this really mattered in my case, but I'm sharing it in case someone goes through the same problem. I am a beginner with MS SQL Server, so I think this might help someone who is also a beginner like me.
I have a C# application that gets a list of directories inside a folder. This is done using the call
String[] projects = System.IO.Directory.GetDirectories("path/to/folder", "*", System.IO.SearchOption.TopDirectoryOnly);
This works fine on my machine, but after publishing (resulting in a setup.exe, as well as programName.application + Application Files) I tried running the program on a new machine and it threw an unhandled exception error.
The error was about being unable to connect to a database, but the interesting part is that it complained about a path not being valid, listing a path that only exists on my machine.
Does System.IO.Directory.GetDirectories not get reinitialized when running on another machine?
I guess the problem is with path/to/folder, as that path might not exist on the new machine. Don't hard-code the path; instead, read it from a config file (app.config, via ConfigurationManager).
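For example, a minimal sketch (the key name ProjectsFolder is made up for illustration; add a reference to System.Configuration.dll):

using System.Configuration;

// app.config:
//   <configuration>
//     <appSettings>
//       <add key="ProjectsFolder" value="C:\Data\Projects" />
//     </appSettings>
//   </configuration>

string folder = ConfigurationManager.AppSettings["ProjectsFolder"];
String[] projects = System.IO.Directory.GetDirectories(
    folder, "*", System.IO.SearchOption.TopDirectoryOnly);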
I am trying to create a folder and copy some images into it using a C# WPF application.
curName = txt_PoemName.Text;

// Specify a "currently active folder"
string activeDir = @"C:\Program Files\Default Company Name\Setup2\Poems";

// Create a new subfolder under the current active folder
string newPath = System.IO.Path.Combine(activeDir, curName);

// Create the subfolder
System.IO.Directory.CreateDirectory(newPath);

foreach (DictionaryEntry entry in curPoem)
{
    string newFilePath = System.IO.Path.Combine(newPath, entry.Key.ToString() + Path.GetExtension(entry.Value.ToString()));
    System.IO.File.Copy(entry.Value.ToString(), newFilePath, true);
}
I have successfully created the folder and images, and I can access them via the application, but I can't see them at that location on my local disk. When I restart the machine, the application can't see them either. How can I solve this?
Sounds like you have encountered UAC Data Redirection
http://blogs.windows.com/windows/b/developers/archive/2009/08/04/user-account-control-data-redirection.aspx
You need to either force the application to run as an administrator:
How do I force my .NET application to run as administrator?
or stop saving your data in a sensitive area. I would recommend saving in a subfolder of
Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData);
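For instance, the poem folders from the question could be created like this. A minimal sketch, with the company and "Poems" subfolder names taken from the question's hard-coded path:

using System;
using System.IO;

// CommonApplicationData (usually C:\ProgramData) is the conventional place
// for machine-wide data and is not subject to UAC virtualization,
// unlike C:\Program Files.
string baseDir = Path.Combine(
    Environment.GetFolderPath(Environment.SpecialFolder.CommonApplicationData),
    @"Default Company Name\Poems");

string newPath = Path.Combine(baseDir, curName); // curName as in the question
Directory.CreateDirectory(newPath);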
I've built a Compact Framework application and I'm using WmAutoUpdate to deploy new versions to the mobile devices (http://www.sebastianvogelsang.com/2009/09/23/wmautoupdate-a-net-compact-framework-auto-update-library/). Has anyone used this? It's cool but I've got a problem.
If I cause the application to crash half-way through updating, it is supposed to recover by copying the backup version back into the main directory. This doesn't work because the exe file is "locked" by the operating system while it is in use. I can verify this is the case because I can't delete it using Windows Explorer either. The error details are:
System.IO.IOException was unhandled
  Message="IOException"
  StackTrace:
       at System.IO.__Error.WinIOError(Int32 errorCode, String str)
       at System.IO.File.Move(String sourceFileName, String destFileName)
       at WmAutoUpdate.Updater.assertPreviousUpdate()
       at WmAutoUpdate.Updater..ctor(String url)
The error occurs on this line in Updater.assertPreviousUpdate():
File.Move(f, appPath + "\\" + getFilenameFromPath(f));
The code manages to update the application exe file when it's allowed to run normally (I'm not sure how). The problem is that it doesn't work when rolling back.
Cheers,
Mark
I've used WmAutoUpdate, and I've hit the same problem. The issue is that you can move the files of the currently running process, but you cannot overwrite them. If you check the update part, WmAutoUpdate moves the running application to a backup directory and then writes the updated version to the original directory. I fixed the rollback part this way:
if (Directory.Exists(backupDir))
{
    string tmpDir = Path.Combine(Path.GetTempPath(), Path.GetFileNameWithoutExtension(Path.GetTempFileName()));
    Directory.Move(appPath, tmpDir);
    Directory.Move(backupDir, appPath);
}
First we move the running application's files to a random directory under Temp; then we move the backup folder back to the application's original directory. Of course, this leaves a .TMP file in the Temp directory of your device, plus a folder holding the files of the running process. You will have to delete these temporary leftovers once in a while in production code.
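A possible cleanup routine for those leftovers, to run on some later startup once the old files are no longer locked. This is a hedged sketch: it assumes the leftover directories sit directly under the Temp directory and carry the "tmp" prefix that GetTempFileName() produces:

using System;
using System.IO;

// Best-effort cleanup of leftovers from earlier rollbacks.
static void CleanTempLeftovers()
{
    string temp = Path.GetTempPath();
    foreach (string dir in Directory.GetDirectories(temp, "tmp*"))
    {
        try
        {
            Directory.Delete(dir, true);
        }
        catch (IOException)
        {
            // still locked (possibly our own moved files); skip this time
        }
    }
}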