I have a C# application that includes various external DLLs as embedded resources. When the application starts, the DLLs are extracted into the .exe folder so that the application can run correctly.
Here is the code:
var executingAssembly = Assembly.GetExecutingAssembly();
string folderName = string.Format("{0}.Resources.DLLs", executingAssembly.GetName().Name);
var list = executingAssembly
    .GetManifestResourceNames()
    .ToArray();
foreach (var item in list)
{
    File.WriteAllBytes(item.Replace("myapp.DLLs.", ""),
        ReadAllBytes(executingAssembly.GetManifestResourceStream(item)));
}
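(The ReadAllBytes call above is a small helper not shown in the question; a typical implementation just drains the stream:)

```csharp
// Sketch of the helper assumed above: copies a manifest resource stream into a byte array.
static byte[] ReadAllBytes(Stream input)
{
    using (var ms = new MemoryStream())
    {
        input.CopyTo(ms);
        return ms.ToArray();
    }
}
```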
When I close the form, I want to delete those files with this code, attached to the FormClosing event:
private void CleanFiles(Object sender, FormClosingEventArgs e)
{
    var executingAssembly = Assembly.GetExecutingAssembly();
    string folderName = string.Format("{0}.Resources.DLLs", executingAssembly.GetName().Name);
    string folder = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetEntryAssembly().Location);
    var list = executingAssembly
        .GetManifestResourceNames()
        .ToArray();
    foreach (var item in list)
    {
        File.Delete(folder + @"\" + item.Replace("myapp.DLLs.", ""));
    }
}
If I open and then close the form, it works perfectly. But if I open the form and do some operations first, an exception is thrown during the closing operations because access to the DLLs is denied.
How can I release all dlls/resources?
I assume that if you copy these DLLs, you load and use them afterwards.
There's a catch with dynamically loading DLLs: you can't simply unload them. Once loaded into your application domain, they're there to stay.
The solution is thus to simply create a new application domain, load the DLL inside of it, and when you're done, unload this new application domain.
Something like:
var dynamicDomain = AppDomain.CreateDomain("YourDomainName");
var dynamicallyLoadedAssembly = dynamicDomain.Load("YourAssembly");
// do stuff with your dynamically loaded assembly
AppDomain.Unload(dynamicDomain);
For more information on the topic: MSDN "How to: Load and unload assemblies".
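Note that calling Load on the new domain from your main domain can still pull the assembly into the caller's load context; a more robust sketch keeps the code behind a MarshalByRefObject proxy (the assembly and type names here are placeholders):

```csharp
// Sketch: assumes Plugins.dll contains a public class Plugins.Worker
// that derives from MarshalByRefObject (both names are hypothetical).
var domain = AppDomain.CreateDomain("PluginDomain");
try
{
    var worker = (Worker)domain.CreateInstanceAndUnwrap("Plugins", "Plugins.Worker");
    worker.DoWork(); // runs inside PluginDomain, not the main domain
}
finally
{
    AppDomain.Unload(domain); // releases the file locks held by PluginDomain
}
```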
You can (and should) implement exception handling around the parts where you manipulate files on the system:
try
{
// Manipulate your files here.
}
catch (Exception ex)
{
// Handle exceptions here. You can also delete the DLLs here if you wish.
// Remember to check in CleanFiles whether the files have already been deleted.
}
finally
{
// You could also get rid of the files here.
// The finally block is executed regardless of whether an exception was thrown.
}
Useful links: Microsoft: Exception Handling, Microsoft: Best Practices for Exceptions
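Applied to the code from the question, the cleanup handler could swallow per-file failures so one locked DLL doesn't abort the rest (a sketch, reusing the names from the question):

```csharp
private void CleanFiles(object sender, FormClosingEventArgs e)
{
    var executingAssembly = Assembly.GetExecutingAssembly();
    string folder = Path.GetDirectoryName(Assembly.GetEntryAssembly().Location);
    foreach (var item in executingAssembly.GetManifestResourceNames())
    {
        string path = Path.Combine(folder, item.Replace("myapp.DLLs.", ""));
        try
        {
            if (File.Exists(path))
                File.Delete(path);
        }
        catch (IOException)
        {
            // Still loaded/locked: skip it here and, for example, clean it up on next startup.
        }
    }
}
```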
Related
I have a question that I haven't been able to solve for a week; I've already checked everything I can think of, but it still doesn't work correctly. I'm writing a scripting system for my game engine and I need to dynamically recompile/reload the .dll file that contains my library and scripts. To do this, I:
Initialize Mono. (I get an appdomain (MonoDomain*) already created for me.)
Open an assembly file (.dll).
Get an image from this assembly.
Use Mono (I register internal calls, create C# objects, call functions, etc.).
But at some point I need to recompile/reload this assembly, because the script code changes. As I understand from the documentation and various sources I've read, I must not free the domain, assemblies, and images while the domain is still active; instead you first need to:
Create a new domain.
Load the image.
Attach the image to an assembly object (thereby making the image status OK).
And only then free the resources from the old domain.
But no matter what I try, either an error occurs inside the Mono code, or (where I am now) there are no errors, but the pointer to the new image obtained from the newly loaded assembly (with the same name) is the same as before, so Mono keeps using the old image and doesn't pick up the updated script values.
This is my Mono init code:
mono_set_dirs(".", ".");
// 0. Load first domain: init Mono + appdomain + load assembly.
const char* domainName = "ForceCS";
const char* assemblyName = "ForceCS.dll";
pMonoDomain = mono_jit_init(domainName);
if (pMonoDomain) {
    pMonoAssembly = mono_domain_assembly_open(pMonoDomain, assemblyName);
    if (pMonoAssembly) {
        pMonoImage = mono_assembly_get_image(pMonoAssembly);
        if (pMonoImage) {
            AddInternalFunctionsAndScriptClasses(); // mono_add_internal_call ...
        }
    }
}
Then I do some stuff with my DLL, like creating objects or calling methods, as I said before. And then I try to reload ForceCS.dll.
// 1. Load new domain (works fine).
char* newDomainName = "ForceCS";
MonoDomain* newDomain = mono_domain_create_appdomain(newDomainName, NULL);
if (!mono_domain_set(newDomain, false)) {
    // Cannot set domain!
}
// 2. Load image.
const char* dllFile = "C:\\Force\\Editor\\ForceCS.dll";
MonoImageOpenStatus status;
MonoImage* newImage = mono_image_open(dllFile, &status); // Problem here: shows the same address as before, which means Mono keeps using the same .dll.
// 3. Load new assembly (works fine).
MonoAssembly* assembly = mono_assembly_load_from(newImage, mono_image_get_name(newImage), &status);
if (status != MonoImageOpenStatus::MONO_IMAGE_OK) {
    // Cannot load assembly!
}
assembly = mono_assembly_open(dllFile, &status);
newImage = mono_assembly_get_image(assembly);
if (newImage) {
    AddInternalFunctionsAndScriptClasses();
}
// 4. Unload old domain.
MonoDomain* domainToUnload = pMonoDomain == nullptr ? mono_domain_get() : pMonoDomain;
if (domainToUnload && domainToUnload != mono_get_root_domain()) {
    mono_domain_unload(domainToUnload);
    mono_gc_collect(mono_gc_max_generation());
    mono_domain_finalize(domainToUnload, 2000);
    mono_gc_collect(mono_gc_max_generation());
    pMonoDomain = newDomain;
    pMonoImage = newImage;
    pMonoAssembly = assembly;
}
The final problem is that the image always stays the same; it only works if I load the assembly under a different name, like ForceCS1.dll, but that's not what I want. Please explain to me:
When do I need to close/free domains, assemblies, and images?
What is the connection between an assembly and an image?
How do I reload my .dll assembly?
I will be grateful for every answer.
After another week of torment, I finally managed to fix it. As I understood it, the problem was that I was unloading my only main domain without creating/setting another domain for Mono, so Mono had nothing to work with on the other thread while the assembly was being reloaded.
I completely rewrote the code and ended up with something like this:
int main() {
    // Init Mono runtime (creates the root domain).
    mono_init();
    if (mono_create_domain_and_open_assembly("Scripting Domain", "MyScript.dll"))
        mono_invoke("Script", "OnCreate", 0, NULL, NULL);
    while (true) {
        bool is_reload_request;
        std::cin >> is_reload_request;
        if (is_reload_request) {
            mono_destroy_domain_with_associated_assembly(domain);
            // Update to the newly compiled assembly.
            mono_update_assembly("C:\\MonoProject\\MyScript.dll", "C:\\MonoProjectC#\\Binary\\Debug-windows-x86_64\\MyScript.dll");
            if (mono_create_domain_and_open_assembly("Scripting Domain", "MyScript.dll"))
                mono_invoke("Script", "OnCreate", 0, NULL, NULL);
        }
    }
    mono_jit_cleanup(root_domain);
    return 0;
}
Now I want to explain in detail what I did, for those who may have run into the same problem and can't figure out how to fix it.
Init Mono & create root domain.
The first thing I did was initialize the Mono runtime as before, thereby creating a root domain so that, in case of a reload, we can switch back to it.
void mono_init() {
    mono_set_dirs(".", ".");
    root_domain = mono_jit_init_version("Root Domain", "v4.0.30319");
}
Creating Scripting domain, loading assembly.
Then I created a new domain, in this case the Scripting Domain, and after creating it I set it as Mono's active domain (mono_domain_set(domain, 0)) so that all subsequent actions are performed from it. Then, just like before, I opened my assembly MyScript.dll through the domain and got pointers to the assembly and the image.
bool mono_create_domain_and_open_assembly(char* domain_name, const char* assembly_file) {
    domain = mono_domain_create_appdomain(domain_name, NULL);
    if (domain) {
        mono_domain_set(domain, 0);
        // Open assembly.
        assembly = mono_domain_assembly_open(mono_domain_get(), assembly_file);
        if (assembly) {
            image = mono_assembly_get_image(assembly);
            if (image)
                return true;
        }
    }
    return false;
}
After this stage, I checked that my assembly was working fine: I just created a Script object and called its OnCreate method. I won't include that code here; I hope those who have already worked with Mono know how to do it.
Reloading
Now the reloading itself. It's actually much simpler than it seems: just before calling mono_domain_unload, you need to call mono_domain_set to switch from our Scripting Domain back to the Root Domain, so that Mono has a valid active domain while we create the new domain (new dll).
bool mono_destroy_domain_with_associated_assembly(MonoDomain* domain_to_destroy) {
    if (domain_to_destroy != root_domain) {
        mono_domain_set(root_domain, 0);
        mono_domain_unload(domain_to_destroy);
        return true;
    }
    return false;
}
Then compile the C# project and move the output to our project (or wherever you pointed Mono's paths). Here is how I did it in C++:
void mono_update_assembly(char* assembly_path, char* assembly_path_from) {
    if (std::filesystem::exists(assembly_path_from)) {
        if (std::filesystem::exists(assembly_path))
            std::filesystem::remove(assembly_path);
        std::filesystem::copy(assembly_path_from, assembly_path);
    }
}
And as the last step, after all of this is done, we just create a new domain and load the new assembly.
My application creates files and directories throughout the year and needs to access the timestamps of those directories to determine if it's time to create another one. So it's vital that when I move a directory I preserve its timestamps. I can do it like this when Directory.Move() isn't an option (e.g. when moving to a different drive).
FileSystem.CopyDirectory(sourcePath, targetPath, overwrite);
Directory.SetCreationTimeUtc (targetPath, Directory.GetCreationTimeUtc (sourcePath));
Directory.SetLastAccessTimeUtc(targetPath, Directory.GetLastAccessTimeUtc(sourcePath));
Directory.SetLastWriteTimeUtc (targetPath, Directory.GetLastWriteTimeUtc (sourcePath));
Directory.Delete(sourcePath, true);
However, all three of these "Directory.Set" methods fail if File Explorer is open, and it seems that it doesn't even matter whether the directory in question is currently visible in File Explorer or not (EDIT: I suspect this has something to do with Quick Access, but the reason isn't particularly important). It throws an IOException that says "The process cannot access the file 'C:\MyFolder' because it is being used by another process."
How should I handle this? Is there an alternative way to modify a timestamp that doesn't throw an error when File Explorer is open? Should I automatically close File Explorer? Or if my application simply needs to fail, then I'd like to fail before any file operations take place. Is there a way to determine ahead of time if Directory.SetCreationTimeUtc() for example will encounter an IOException?
Thanks in advance.
EDIT: I've made a discovery. Here's some sample code you can use to try recreating the problem:
using System;
using System.IO;

namespace CreationTimeTest
{
    class Program
    {
        static void Main( string[] args )
        {
            try
            {
                DirectoryInfo di = new DirectoryInfo( @"C:\Test" );
                di.CreationTimeUtc = DateTime.UtcNow;
                Console.WriteLine( di.FullName + " creation time set to " + di.CreationTimeUtc );
            }
            catch ( Exception ex )
            {
                Console.WriteLine( ex );
                //throw;
            }
            finally
            {
                Console.ReadKey( true );
            }
        }
    }
}
Create C:\Test, build CreationTimeTest.exe, and run it.
I've found that the "used by another process" error doesn't always occur just because File Explorer is open. It occurs if the folder C:\Test had been visible because C:\ was expanded. This means the time stamp can be set just fine if File Explorer is open and C:\ was never expanded. However, once C:\Test becomes visible in File Explorer, it seems to remember that folder and not allow any time stamp modification even after C:\ is collapsed. Can anyone recreate this?
EDIT: I'm now thinking that this is a File Explorer bug.
I have recreated this behavior using CreationTimeTest on multiple Windows 10 devices. There are two ways an attempt to set the creation time can throw the "used by another process" exception. The first is to have C:\Test open in the main pane, but in that case you can navigate away from C:\Test and then the program will run successfully again. But the second way is to have C:\Test visible in the navigation pane, i.e. to have C:\ expanded. And once you've done that, it seems File Explorer keeps a handle open because the program continues to fail even once you collapse C:\ until you close File Explorer.
I was mistaken earlier. Having C:\Test be visible doesn't cause the problem. C:\Test can be visible in the main pane without issue. Its visibility in the navigation pane is what matters.
Try this:
string sourcePath = "";
string targetPath = "";
DirectoryInfo sourceDirectoryInfo = new DirectoryInfo(sourcePath);
FileSystem.CopyDirectory(sourcePath, targetPath, overwrite);
DirectoryInfo targetDirectory = new DirectoryInfo(targetPath);
targetDirectory.CreationTimeUtc = sourceDirectoryInfo.CreationTimeUtc;
targetDirectory.LastAccessTimeUtc = sourceDirectoryInfo.LastAccessTimeUtc;
targetDirectory.LastWriteTimeUtc = sourceDirectoryInfo.LastWriteTimeUtc;
Directory.Delete(sourcePath, true);
This will allow you to set the creation/access/write times for the target directory, so long as the directory itself is not open in explorer (I am assuming it won't be, as it has only just been created).
I suspect FileSystem.CopyDirectory ties into Explorer and somehow locks the directory. Try copying all the files and directories using standard C# methods, like this:
DirectoryCopy(@"C:\SourceDirectory", @"D:\DestinationDirectory", true);
Using these utility methods:
private static void DirectoryCopy(string sourceDirName, string destDirName, bool copySubDirs)
{
    // Get the subdirectories for the specified directory.
    DirectoryInfo dir = new DirectoryInfo(sourceDirName);
    if (!dir.Exists)
    {
        throw new DirectoryNotFoundException("Source directory does not exist or could not be found: " + sourceDirName);
    }
    if ((dir.Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
    {
        // Don't copy symbolic links.
        return;
    }
    var createdDirectory = false;
    // If the destination directory doesn't exist, create it.
    if (!Directory.Exists(destDirName))
    {
        var newdir = Directory.CreateDirectory(destDirName);
        createdDirectory = true;
    }
    // Get the files in the directory and copy them to the new location.
    DirectoryInfo[] dirs = dir.GetDirectories();
    FileInfo[] files = dir.GetFiles();
    foreach (FileInfo file in files)
    {
        if ((file.Attributes & FileAttributes.ReparsePoint) == FileAttributes.ReparsePoint)
            continue; // Don't copy symbolic links.
        string temppath = Path.Combine(destDirName, file.Name);
        file.CopyTo(temppath, false);
        CopyMetaData(file, new FileInfo(temppath));
    }
    // If copying subdirectories, copy them and their contents to the new location.
    if (copySubDirs)
    {
        foreach (DirectoryInfo subdir in dirs)
        {
            string temppath = Path.Combine(destDirName, subdir.Name);
            DirectoryCopy(subdir.FullName, temppath, copySubDirs);
        }
    }
    if (createdDirectory)
    {
        // We must set it AFTER copying all files in the directory - otherwise the timestamp gets updated to Now.
        CopyMetaData(dir, new DirectoryInfo(destDirName));
    }
}

private static void CopyMetaData(FileSystemInfo source, FileSystemInfo dest)
{
    dest.Attributes = source.Attributes;
    dest.CreationTimeUtc = source.CreationTimeUtc;
    dest.LastAccessTimeUtc = source.LastAccessTimeUtc;
    dest.LastWriteTimeUtc = source.LastWriteTimeUtc;
}
I'm currently working on a project where I want to automatically build a showcase of all the styles defined in the styling project of another Visual Studio solution.
To do this, I select the Styling.dll at runtime via an OpenFileDialog from the other solution's bin/Debug folder, add its resources to my application's resources, and then build the views. That works pretty well, but now I have a problem when selecting an assembly that references several other assemblies.
This is how I load the assemblies right now:
public void OpenFile()
{
    var openFileDialog = new OpenFileDialog { Filter = "DLL Dateien (*.dll)|*.dll" };
    if (openFileDialog.ShowDialog() == true)
    {
        var path = Path.GetDirectoryName(openFileDialog.FileName);
        foreach (var dll in Directory.GetFiles(path, "*.dll").Where(dll => dll != openFileDialog.FileName && !dll.ToString().ToLower().Contains("datagrid")))
        {
            this.LoadResources(dll, false);
        }
        this.LoadResources(openFileDialog.FileName);
    }
}
I put in the foreach to first load a Utils.dll that always comes with the Styling.dll, but that obviously doesn't work if there are multiple other assemblies referencing each other.
Then in LoadResources:
public void LoadResources(string assemblypath)
{
    var assembly = Assembly.LoadFile(assemblypath);
    var stream = assembly.GetManifestResourceStream(assembly.GetName().Name + ".g.resources");
    if (stream != null)
    {
        var resourceReader = new ResourceReader(stream);
        foreach (DictionaryEntry resource in resourceReader)
        {
            // Get all the .baml files from the .dll and create URIs for the corresponding .xaml files.
            if (new FileInfo(resource.Key.ToString()).Extension.Equals(".baml"))
            {
                var uri = new Uri("/" + assembly.GetName().Name + ";component/" + resource.Key.ToString().Replace(".baml", ".xaml"), UriKind.Relative);
                // Add the ResourceDictionary to the application resources.
                var rd = Application.LoadComponent(uri) as ResourceDictionary;
                if (rd != null)
                {
                    Application.Current.Resources.MergedDictionaries.Add(rd);
                }
            }
        }
    }
}
Note: the exception is always thrown at the var rd = Application.LoadComponent(uri) as ResourceDictionary; line.
Now, how do I get my application to correctly load all the other assemblies in the directory? Right now I always get an error telling me that the other assemblies could not be found, even though they are in the same directory as the referencing assembly.
After looking for a solution for a while, I understand that the application automatically tries to find the missing assemblies, but it doesn't search in the directory where they actually are.
Since I want to load the assemblies dynamically, I can't just add additional search paths to my app.config beforehand. Can I do it at runtime somehow?
I hope you understand my problem; any help is appreciated, thank you!
Edit:
I have already read through this question, but it didn't help me any further.
When I implement the AssemblyResolve event, I still get FileNotFound / XamlParse exceptions, even when the file at the assembly path exists.
Also, the handler always tries to find xxx.resources.dll files, which on the other hand do not exist in the directory.
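For comparison, a minimal AssemblyResolve handler of the kind mentioned in the edit might look like this (selectedFolder is a placeholder for the directory picked in the OpenFileDialog, and the .resources check sidesteps WPF's probing for satellite assemblies that don't exist):

```csharp
AppDomain.CurrentDomain.AssemblyResolve += (sender, args) =>
{
    // WPF also probes for "*.resources" satellite assemblies; returning
    // null lets those probes fail harmlessly instead of throwing.
    if (args.Name.Contains(".resources"))
        return null;

    string fileName = new AssemblyName(args.Name).Name + ".dll";
    string candidate = Path.Combine(selectedFolder, fileName); // selectedFolder: hypothetical field
    return File.Exists(candidate) ? Assembly.LoadFile(candidate) : null;
};
```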
Okay so I've been looking around and I can't find an answer anywhere.
What I want my program to do is every time I run it, the name that shows up in the task manager is randomized.
There is a program called 'Liberation' that when you run it, it will change the process name to some random characters like AeB4B3wf52.tmp or something. I'm not sure what it is coded in though, so that might be the issue.
Is this possible in C#?
Edit:
I made a sloppy workaround: a separate program that checks if there is a file named 'pb.dat', copies it to a temp folder, renames it to 'randomchars.tmp', and runs it.
Code if anyone was interested:
private void Form1_Load(object sender, EventArgs e)
{
    try
    {
        if (!Directory.Exists(Environment.CurrentDirectory + @"\temp")) // Create a temp directory.
            Directory.CreateDirectory(Environment.CurrentDirectory + @"\temp");
        DirectoryInfo di = new DirectoryInfo(Environment.CurrentDirectory + @"\temp");
        foreach (FileInfo f in di.GetFiles()) // Cleaning old .tmp files.
        {
            if (f.Name.EndsWith(".tmp"))
                f.Delete();
        }
        var r = new Random(); // random source for the name
        string charList = "abcdefghijklmnopqrstuvwxyz1234567890";
        char[] trueList = charList.ToCharArray();
        string newProcName = "";
        for (int i = 0; i < 8; i++) // Build the random name.
            newProcName += trueList[r.Next(0, charList.Length)];
        newProcName += ".tmp";
        if (File.Exists(Environment.CurrentDirectory + @"\pb.dat")) // Just renaming and running.
        {
            File.Copy(Environment.CurrentDirectory + @"\pb.dat", Environment.CurrentDirectory + @"\temp\" + newProcName);
            ProcessStartInfo p = new ProcessStartInfo();
            p.FileName = Environment.CurrentDirectory + @"\temp\" + newProcName;
            p.UseShellExecute = false;
            Process.Start(p);
        }
    }
    catch (Exception ex)
    {
        MessageBox.Show("I caught an exception! This is a bad thing...\n\n" + ex.ToString(), "Exception caught!");
    }
    Environment.Exit(-1); // Close this program anyway.
}
The process name in Task Manager is based on the executable file name without the extension, which you cannot change while the process is running.
Read the documentation:
The ProcessName property holds an executable file name, such as
Outlook, that does not include the .exe extension or the path. It is
helpful for getting and manipulating all the processes that are
associated with the same executable file.
In Visual Studio, go to Project - Properties - Application - Assembly Information and change the Title.
I would implement a host application that simply runs and monitors a sub-process (the other executable). You can rename a file like this:
System.IO.File.Move("oldfilename", "newfilename");
and start the process like this:
Process.Start("newfilename");
This would mean that instead of one process you would have two, but the owner process only needs to stay alive during startup, in order to change the name.
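Putting both pieces together, the owner process could look roughly like this (the file names are placeholders):

```csharp
// Copy the real executable under a random name and launch it; the random
// name is what Task Manager will display as the process name.
string randomName = Path.Combine(Path.GetTempPath(), Path.GetRandomFileName() + ".exe");
File.Copy("RealApp.exe", randomName); // "RealApp.exe" is a hypothetical file name
System.Diagnostics.Process.Start(randomName);
```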
I'm kinda new to working with C# .NET's System.IO namespace. So please forgive me for some basic questions.
I am writing an online interface that will allow a site owner to modify files and directories on the server.
I have gotten inconsistent performance out of System.IO.Directory.Delete(PathToDelete, true);. Sometimes it works great, sometimes it throws an error. My controller looks like this:
public ActionResult FileDelete(List<string> entity = null)
{
    if (entity != null)
    {
        if (entity.Count() > 0)
            foreach (string s in entity)
            {
                string CurrentFile = s.Replace(Path.AltDirectorySeparatorChar, Path.DirectorySeparatorChar);
                string FileToDelete = Server.MapPath(CurrentFile);
                bool isDir = (System.IO.File.GetAttributes(FileToDelete) & FileAttributes.Directory) == FileAttributes.Directory;
                if (isDir)
                {
                    if (System.IO.Directory.Exists(FileToDelete))
                    {
                        //Problem line/////////////////////////////////
                        System.IO.Directory.Delete(FileToDelete, true);
                    }
                }
                else
                {
                    if (System.IO.File.Exists(FileToDelete))
                    {
                        System.IO.File.Delete(FileToDelete);
                        string ThumbConfigDir = ConfigurationManager.AppSettings["ThumbnailSubdirectory"];
                        string ThumbFileToDelete = Path.GetDirectoryName(FileToDelete) + Path.DirectorySeparatorChar + ThumbConfigDir + Path.DirectorySeparatorChar + Path.GetFileName(FileToDelete);
                        if (System.IO.File.Exists(ThumbFileToDelete))
                        {
                            System.IO.File.Delete(ThumbFileToDelete);
                        }
                    }
                }
            }
    }
    return Redirect(HttpContext.Request.UrlReferrer.AbsoluteUri.ToString());
}
Sometimes, I get an error when trying to delete directories that says:
The directory is not empty.
Description: An unhandled exception occurred during the execution of the current
web request. Please review the stack trace for more information about the error
and where it originated in the code.
Exception Details: System.IO.IOException: The directory is not empty.
Source Error:
Line 137: if (System.IO.Directory.Exists(FileToDelete))
Line 138: {
Line 139: System.IO.Directory.Delete(FileToDelete, true);
Line 140: }
Line 141: }
I'm not sure what kind of defensive coding I can implement to avoid errors like these. Any thoughts? Am I misunderstanding what it means to set recursive to true in System.IO.Directory.Delete(FileToDelete, true);?
If there's a file that's in use, Delete won't empty the directory, and will then fail when it tries to delete the directory itself.
Try using FileInfo instead of the static methods, and call Refresh after any action on the file (or DirectoryInfo for directories).
Similar problem
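If transient locks (antivirus, indexing, Explorer) are the cause, a small retry wrapper sometimes absorbs them (the attempt count and delay here are arbitrary):

```csharp
static void DeleteDirectoryWithRetry(string path, int attempts = 3)
{
    for (int i = 0; ; i++)
    {
        try
        {
            Directory.Delete(path, true);
            return;
        }
        catch (IOException) when (i < attempts - 1)
        {
            // Another process may still hold a handle; wait and try again.
            System.Threading.Thread.Sleep(500);
        }
    }
}
```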
In general, you just have to expect this sort of exception from file/folder manipulation code. There is a large number of reasons why it can happen: some file is in use, some process has its working folder set to the directory, some files are not visible to your process due to permissions, and so on.
Process Monitor (http://technet.microsoft.com/en-us/sysinternals/bb896645.aspx) will likely show what causes the problem.
One common reason, if you create a folder for your temporary files yourself and then try to delete it, is forgetting to dispose Stream objects tied to files in that folder (they can be held indirectly by Reader and Writer objects, or by XmlDocument).
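As an illustration of that last point, disposing the writer before deleting the folder is what releases the handle (the folder name below is hypothetical):

```csharp
string temp = Path.Combine(Path.GetTempPath(), "mytempwork"); // hypothetical temp folder
Directory.CreateDirectory(temp);
using (var writer = new StreamWriter(Path.Combine(temp, "log.txt")))
{
    writer.WriteLine("temporary data");
} // the using block closes the file handle here
Directory.Delete(temp, true); // would throw IOException if the writer were still open
```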