I have a dll library with unmanaged C++ API code I need to use in my .NET 4.0 application. But every time I try to load my dll I get the error:
Unable to load DLL 'MyOwn.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E)
I have read and tried several solutions I have found on the internet. Nothing works.
I have tried using the following method:
[DllImport("MyOwn.dll", CallingConvention = CallingConvention.Cdecl)]
[return: MarshalAs((UnmanagedType.I4))]
public static extern Int32 MyProIni(string DBname, string DBuser_pass,
string WorkDirectory, ref StringBuilder ErrorMessage);
I tried following this article, and when I run its example (from the downloaded code) it works without a problem (the dll it uses is in the bin/debug folder).
I have copied my dll (along with all the files it depends on) into my bin folder.
I also tried this approach but got the same error:
[DllImportAttribute(MyOwnLibDllPath, EntryPoint="TMproIni")]
[return: MarshalAs(UnmanagedType.I4)]
public static extern int MyproIni(string DBname, string DBuser_pass,
string WorkDirectory, ref StringBuilder ErrorMessage);
Any suggestions?
From what I remember on Windows the search order for a dll is:
Current Directory
System folder, C:\windows\system32 or c:\windows\SysWOW64 (for 32-bit process on 64-bit box).
Directories listed in the Path environment variable
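If the native dll ends up somewhere else, one workaround is to extend the native search path before the first P/Invoke call. This is a minimal hedged sketch, not from the original answers; the NativeLibs folder name is just an assumed example.

using System;
using System.IO;
using System.Runtime.InteropServices;

static class NativeLoader
{
    // Adds a directory to the process-wide native DLL search path.
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern bool SetDllDirectory(string lpPathName);

    public static void UseNativeLibFolder()
    {
        // "NativeLibs" is a hypothetical subfolder next to the executable.
        string folder = Path.Combine(AppDomain.CurrentDomain.BaseDirectory, "NativeLibs");
        if (!SetDllDirectory(folder))
            throw new System.ComponentModel.Win32Exception(Marshal.GetLastWin32Error());
    }
}

Call NativeLoader.UseNativeLibFolder() once, before the first call to any [DllImport] method from the library.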
In addition, I'd check the dependencies of the DLL; the Dependency Walker tool provided with Visual Studio can help you out here, and it can also be downloaded for free: http://www.dependencywalker.com
You can use the dumpbin tool to find out the required DLL dependencies:
dumpbin /DEPENDENTS my.dll
This will tell you which DLLs your DLL needs to load. Particularly look out for MSVCR*.dll. I have seen your error code occur when the correct Visual C++ Redistributable is not installed.
You can get the "Visual C++ Redistributable Packages for Visual Studio 2013" from the Microsoft website. It installs c:\windows\system32\MSVCR120.dll
In the file name, 120 = 12.0 = Visual Studio 2013.
Be careful that you have the right Visual Studio version (10.0 = VS 2010, 11.0 = VS 2012, 12.0 = VS 2013, ...) and the right architecture (x64 or x86) for your DLL's target platform, and also be careful with debug builds. The debug build of a DLL depends on MSVCR120d.dll, which is the debug version of the library; it is installed with Visual Studio but not by the Redistributable Package.
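As a quick hedged check (not part of the original answer), you can verify from managed code whether the runtime DLL that dumpbin pointed you at is actually present; msvcr120.dll here is just the example from above.

using System;
using System.IO;

class CrtCheck
{
    static void Main()
    {
        // Environment.SystemDirectory is redirected to SysWOW64 for a 32-bit
        // process on 64-bit Windows, which is where the 32-bit CRT lives.
        string crt = Path.Combine(Environment.SystemDirectory, "msvcr120.dll");
        Console.WriteLine(File.Exists(crt)
            ? "msvcr120.dll found in " + Environment.SystemDirectory
            : "msvcr120.dll is missing - install the VC++ 2013 redistributable");
    }
}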
The DLL has to be in the bin folder.
In Visual Studio, I add the dll to my project, NOT under References, but via "Add existing item". Then I set the "Copy to Output Directory" property for the dll to "Copy if newer".
This is a 'kludge' but you could at least use it to sanity-test:
Try hard-coding the path to the DLL in your code
[DllImport(@"C:\mycompany\MyDLL.dll")]
Having said that, in my case running dumpbin /DEPENDENTS as suggested by @anthony-hayward, and copying the 32-bit versions of the DLLs listed there into my working directory, solved this problem for me.
The message is just a bit misleading, because it isn't "my" dll that can't be loaded - it's its dependencies.
Try entering the full path of the dll.
If that doesn't work, try copying the dll into the system32 folder.
"Unable to load DLL 'xxx.dll': The specified module could not be found. (Exception from HRESULT: 0x8007007E)" means the file CAN be found BUT it's not able to load it. Try to copy the DLL file to the root folder of your application, some DLL libraries need to be available in the root folder of the application in order for it to work. Or check if there are any other depending DLL files required by it.
"Cannot find DLL 'xxx.dll': ..." means the file CANNOT be found. Try to check the path. For example, [DllImport(#"\Libraries\Folder\xxx.dll")]
Ensure that all dependencies of your own dll are present near the dll, or in System32.
Turn on fusion logging; see this question for lots of advice on how to do that. Debugging loading problems in mixed-mode apps can be a right royal pain, and the fusion logging can be a big help.
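For reference, one way to turn fusion logging on is via the registry (the usual tool is the Fusion Log Viewer, fuslogvw.exe). The sketch below is a hedged illustration using the commonly documented value names; writing to HKLM requires admin rights, and these exact values are an assumption rather than something stated in the answer above.

using Microsoft.Win32;

class FusionLog
{
    static void Enable(string logDirectory)
    {
        const string key = @"HKEY_LOCAL_MACHINE\SOFTWARE\Microsoft\Fusion";
        // Log all binds, including failures, to the given directory.
        Registry.SetValue(key, "EnableLog", 1, RegistryValueKind.DWord);
        Registry.SetValue(key, "ForceLog", 1, RegistryValueKind.DWord);
        Registry.SetValue(key, "LogFailures", 1, RegistryValueKind.DWord);
        Registry.SetValue(key, "LogPath", logDirectory, RegistryValueKind.String);
    }
}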
Make sure you set the Build Platform Target to x86 or x64 so that it is compatible with your DLL - which might be compiled for a 32 bit platform.
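As a hedged sanity check (not from the original answer), you can log at startup whether the process actually runs as 32-bit or 64-bit, which must match the native DLL.

using System;

class BitnessCheck
{
    static void Main()
    {
        // A 32-bit process cannot load a 64-bit native DLL and vice versa.
        Console.WriteLine("Process: {0}-bit, OS: {1}-bit",
            Environment.Is64BitProcess ? 64 : 32,
            Environment.Is64BitOperatingSystem ? 64 : 32);
    }
}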
There is one very funny thing (with technical relevance) which might waste hours of your time, so I thought of sharing it here -
I created a console application project ConsoleApplication1 and a class library project ClassLibrary1.
All the code making the P/Invoke calls was in ClassLibrary1.dll. So before debugging the application from Visual Studio, I simply copied the unmanaged C++ assembly (myUnmanagedFunctions.dll) into the \bin\debug\ directory of the ClassLibrary1 project so that it could be loaded at run-time by the CLR.
I kept getting the
Unable to load DLL
error for hours. Later on I realized that all such unmanaged assemblies need to be copied into the \bin\debug directory of the start-up project, ConsoleApplication1, which is usually a WinForms, console, or web application.
So please be cautious: the Current Directory in the accepted answer actually means the Current Directory of the main executable from which your application process is started. It looks like an obvious thing, but might not be so at times.
Lesson learnt - always place the unmanaged dlls in the same directory as the start-up executable to ensure that they can be found.
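A small hedged sketch of how to confirm where the start-up process is actually looking; the two values can differ, which is exactly the trap described above.

using System;

class WhereAmI
{
    static void Main()
    {
        // The directory of the start-up executable - native DLLs should sit here.
        Console.WriteLine("BaseDirectory:    " + AppDomain.CurrentDomain.BaseDirectory);
        // The current working directory, which is not necessarily the same.
        Console.WriteLine("CurrentDirectory: " + Environment.CurrentDirectory);
    }
}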
I had the same problem when I deployed my application to a test PC. The problem was that the development PC had msvcp110d.dll and msvcr110d.dll but the test PC did not.
I added the "Visual Studio C++ 11.0 DebugCRT (x86)" merge module in InstallShield and it worked. Hope this will be helpful for someone else.
In my case one unmanaged dll depended on another that was missing. In that case the error points at the existing dll instead of the missing one, which can be really confusing.
That is exactly what happened in my case. Hope this helps someone else.
If the DLL and the .NET projects are in the same solution and you want to compile and run both every time, open the .NET project's properties, go to Build Events, and add something like the following to the Post-build event command line:
copy $(SolutionDir)Debug\MyOwn.dll .
It's basically a DOS command line, and you can tweak it based on where your DLL is being built.
I think your unmanaged library needs a manifest.
Here is how to add it to your binary, and here is why.
In summary, several Redistributable library versions can be installed on your box, but only one of them will satisfy your app, and it might not be the default one, so you need to tell the system which version your library needs; that is what the manifest is for.
Setup: 32-bit Windows 7
Context: Installed a PCI-GPIB driver that I was unable to communicate through due to the aforementioned issue.
Short Answer: Reinstall the driver.
Long Answer:
I also used Dependency Walker, which identified several missing dependency modules. Immediately, I thought that it must have been a botched driver installation. I didn't want to check and restore each missing file.
The fact that I was unable to find the uninstaller under Programs and Features in the Control Panel was another indicator of a bad installation. I had to manually delete a couple of *.dll files in \system32 and some registry keys to allow the driver to be re-installed.
Issue fixed.
The unexpected part was that not all dependency modules were resolved. Nevertheless, the *.dll of interest can now be referenced.
I have come across the same problem. In my case I had two 32-bit PCs:
one with .NET 4.5 installed, and the other a fresh PC.
My 32-bit C++ dll (Release-mode build) was working fine on the PC with .NET installed, but not on the fresh PC, where I got the error below:
Unable to load DLL 'PrinterSettings.dll': The specified module could not be
found. (Exception from HRESULT: 0x8007007E)
Finally, I just built my project in the Debug configuration, and this time my C++ dll worked fine.
I also faced the same problem when using an unmanaged C/C++ dll file in a C# environment.
1. Checked the compatibility of the dll with a 32-bit or 64-bit CPU.
2. Checked for the correct path to the DLL: the bin folder, system32/sysWOW64, or the given path.
3. Checked whether PDB (Program Database) files are missing. This video gives you the best understanding of pdb files.
When running 32-bit C/C++ binary code on a 64-bit system, this can arise because of platform incompatibility. You can change the platform from Build > Configuration Manager.
I faced the same problem when importing a C++ DLL in .NET Framework 4+. Unchecking Project -> Properties -> Build -> Prefer 32-bit solved it for me.
If you have checked all the dependencies and you know you have them all, then it has nothing to do with dependencies, nor with the file being in the wrong directory or incorrect arguments passed to the dll; the DLL simply fails to load via LoadLibrary itself. You can check that the handle returned from LoadLibrary is always 0x00000000 (not loaded).
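To see the underlying Win32 error yourself, here is a minimal hedged sketch (the DLL name is just the placeholder from the question) that loads the library manually and reports the last error.

using System;
using System.Runtime.InteropServices;

class LoadProbe
{
    [DllImport("kernel32.dll", CharSet = CharSet.Unicode, SetLastError = true)]
    static extern IntPtr LoadLibrary(string lpFileName);

    static void Main()
    {
        // "MyOwn.dll" is the DLL from the question; substitute your own path.
        IntPtr handle = LoadLibrary("MyOwn.dll");
        if (handle == IntPtr.Zero)
        {
            // 126 = ERROR_MOD_NOT_FOUND, 193 = ERROR_BAD_EXE_FORMAT (wrong bitness).
            Console.WriteLine("LoadLibrary failed, Win32 error " + Marshal.GetLastWin32Error());
        }
        else
        {
            Console.WriteLine("Loaded at 0x" + handle.ToString("X"));
        }
    }
}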
I couldn't figure this error out either: it worked fine on Windows 7, but on Windows 10 it didn't. I fixed the problem, and it had nothing to do with missing dependencies or runtime redistributable packs.
The fix was to pack the DLL with UPX, after which it started working again.
Something about the file having been built on the old Windows XP operating system left it with a bad PE header or bad file format, but packing it with UPX did the trick: it works fine now, and the DLL got 3x smaller.
I got this error for one C++ project in our solution, and only on our buildmaster's machine. The rest of us could build it with no problem.
In our case it was because that particular project had <WindowsTargetPlatformVersion> in the .vcxproj file set to "10.0" vs. "10.0.18362.0" as in all our other C++ projects.
Not specifying the entire SDK version number seems to have allowed MSBuild to choose the newest(?) SDK and associated build tools.
Our buildmaster likely had the remnants of a newer SDK on his machine, and MSBuild was trying to use it (and thus RC.exe was not found).
In any case, bringing up the project's property page and changing Configuration Properties > General > Windows SDK Version to "10.0.18362.0" (or whichever specific version of the SDK you have installed) for all of the project's configurations/platforms did the trick.
I have a solution that contains C# and managed C++ projects.
It compiles under the x64 and x86 solution platforms. Since it is managed C++, I wanted to create an 'Any CPU' solution and get rid of the old ones.
I changed the C++ project linker settings to Force Safe IL Image for both x64 and x86.
Next, using the Configuration Manager, I created a new solution platform called 'Any CPU'. Next I added a project platform also called 'Any CPU'.
I proceeded to set all the C# projects to 'Any CPU', but for the C++ project I can't do that. The 'Any CPU' project platform is not in the drop-down, and there is also no 'New...' option.
VS is adamant about it, so I kept it as it was and started a build. To my surprise, the resulting DLL (from the C++ project) was MSIL even though the platform for C++ was x64. The same happens when compiling for x86: the resulting DLL is MSIL.
What gives?
Why can't I set the C++ project to 'Any CPU'?
As far as I know, you cannot create an "AnyCPU" project type in Visual Studio for a C++/CLI project. However, you can configure your C++/CLI project (under the "Win32" project type) so that it compiles as pure, safe MSIL, without a target platform. Doing so will allow your C++/CLI DLL assembly to be used with an "AnyCPU" C# project. I.e. it's effectively "AnyCPU", even though that's not its actual name in the Configuration Manager.
In the "C/C++" project settings:
Common Language RunTime Support: Safe MSIL Common Language RunTime Support (/clr:safe)
In the "Linker" project settings:
CLR Image Type: just make sure this isn't set explicitly to IJW or PURE
Notes:
By using the "safe" project type, a few of the compiler and linker options which appear to affect platform type will be ignored. I.e. you don't have to go through and set everything to a non-specific platform type. Just the above. But you may set the other options to something appropriate, if it makes you feel better. :)
"Safe" will prevent the use of pointers. If this is an important issue, it is apparently possible to do albeit with a more complicated process. See Creating a pure MSIL assembly from a C++/CLI project? for details.
Don't forget that by default, Visual Studio will create C# projects that even though they are "AnyCPU" and even though they are executed on a 64-bit OS, will start up as a 32-bit process. This can hide platform-mismatch issues, if a dependency is x86 instead of pure/safe MSIL as intended. Just something be aware of (you can control this by unchecking the "Prefer 32-bit" option in the C# project's "Build" project properties page).
In order for the C++ functionality to be consumed by an AnyCPU C# dll, the C++ project must produce both x86 and x64 versions of its dll. It is not possible to reference just an x86 or just an x64 dll from a C# dll compiled with the AnyCPU setting.
The trick to getting the AnyCPU dll to play with the C++ dll is, at runtime, to make sure the assembly cannot load the C++ dll automatically, and to subscribe to the AppDomain AssemblyResolve event. When the assembly tries to load the dll and fails, your code then has the opportunity to determine which dll needs to be loaded.
Subscribing to the event looks something like this:
System.AppDomain.CurrentDomain.AssemblyResolve += Resolver;
The event handler looks something like this:
System.Reflection.Assembly Resolver(object sender, System.ResolveEventArgs args)
{
    string assembly_dll = new AssemblyName(args.Name).Name + ".dll";
    string assembly_directory = "Parent directory of the C++ dlls";
    Assembly assembly = null;

    // Load the x64 or x86 copy depending on the bitness of the running process.
    if (Environment.Is64BitProcess)
    {
        assembly = Assembly.LoadFile(assembly_directory + @"\x64\" + assembly_dll);
    }
    else
    {
        assembly = Assembly.LoadFile(assembly_directory + @"\x86\" + assembly_dll);
    }
    return assembly;
}
I have created a simple project demonstrating how to access C++ functionality from an AnyCPU dll.
https://github.com/kevin-marshall/Managed.AnyCPU
I have a problem when compiling a managed DLL project. The solution consists of two projects: the first is a .NET DLL written in C# and the other is a Managed C++ DLL that directly references the C# project.
Both projects/DLLs are strongly named with an snk file on disk. The C# dll has a target platform of "AnyCPU" while the Managed C++ project is compiled twice, once for an x86 target and once for x64.
My problem is that when I compile the Managed C++ project to target the x86 platform, the resulting DLL has PublicKeyToken = null, as reported by ILSpy. When compiling to target the x64 platform, the DLL has the correct PublicKeyToken. I have checked my project properties: the snk file is referenced correctly for both platform targets under Configuration Properties -> Linker -> Advanced -> Key File, with no delay signing; the Target Machine option is also set correctly based on the desired compilation target.
Here is the information shown by ILSpy when I load my DLL.
For the x64 dll:
// MyDll.x64, Version=1.1.1000.1, Culture=neutral, PublicKeyToken=XXXXXXXXX
// Architecture: x64
// This assembly contains unmanaged code.
// Runtime: .NET 2.0
For the x86 dll:
// MyDll.x86, Version=1.1.1000.1, Culture=neutral, PublicKeyToken=null
// Architecture: AnyCPU (64-bit preferred)
// This assembly contains unmanaged code.
// Runtime: .NET 2.0
What concerns me is the Architecture description for the x86 assembly: AnyCPU (64-bit preferred)
I am not sure why it's using the AnyCPU configuration, or what the 64-bit preferred annotation means exactly.
I would also like to mention that the C# project is built against .NET Framework 2.0, while the Managed C++ project is built against the v90 Platform Toolset. I am using Visual Studio 2010 running on a Windows 7 64-bit machine.
Can someone tell me why this is happening and how can I solve this issue?
It is simply a consequence of how the COR header in the assembly can indicate what processor architecture is desired. You can see the declarations in the CorHdr.h SDK header file; you'll find it in the Windows SDK directory on your machine. You can use the CorFlags.exe utility to display the values.
The only flag available is COMIMAGE_FLAGS_32BITREQUIRED. When set, it indicates to the CLR that you want to run the program in 32-bit mode, even on a 64-bit operating system. An additional flag was added in .NET 4.5, COMIMAGE_FLAGS_32BITPREFERRED; it resolves an ambiguity on ARM cores. There are too many assemblies around where 32BITREQUIRED actually means "x86 required" instead of "32-bit required".
So there is nothing like a "64-bit required" flag; an assembly can only indicate "32-bit" or "doesn't matter". The jitter provides the "doesn't matter" glue: it generates the architecture-dependent machine code at runtime. Since the 32BITREQUIRED option isn't turned on in your assembly, the disassembler cannot display anything else but AnyCPU.
The next detail is the IMAGE_FILE_HEADER.Machine field in the PE header of the executable file; it indicates what kind of machine the executable can run on. That's a weak signal for .NET assemblies, since they don't normally contain any executable code, just MSIL. And it is readily ignored by the Windows loader; .NET assemblies normally have this field set to IMAGE_FILE_MACHINE_I386 to indicate x86. You still get a 64-bit process out of such an EXE assembly; some pretty heroic loader structure patching occurs when such an EXE is loaded. That is the job of mscoree.dll, the "loader shim". More about that in this post.
Since you targeted x64 in your C++/CLI project, the IMAGE_FILE_HEADER.Machine was set to IMAGE_FILE_MACHINE_AMD64 by the linker. The disassembler saw that, thus producing the "64-bit preferred" annotation.
Don't be fooled by the word "preferred" here. The disassembler didn't look deep enough to see that your assembly actually contains machine code, generated by the C++/CLI compiler. They don't like to, there isn't any disassembler that will decompile the machine code back to C++/CLI source code. The assembly isn't ever going to run on a 32-bit operating system. Kaboom on a 32-bit OS, the program fails with error 11, ERROR_BAD_FORMAT, "An attempt was made to load a program with an incorrect format".
This answers your question, it doesn't otherwise have anything to do with a strong name.
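If you want to inspect those flags programmatically rather than with CorFlags.exe or a disassembler, here is a hedged sketch using the reflection API; reflection-only loading is used on the assumption that it tolerates an architecture mismatch with the running process.

using System;
using System.Reflection;

class PeKindDump
{
    static void Main(string[] args)
    {
        // args[0]: path to the assembly to inspect, e.g. MyDll.x86.dll or MyDll.x64.dll.
        Assembly asm = Assembly.ReflectionOnlyLoadFrom(args[0]);

        PortableExecutableKinds peKind;
        ImageFileMachine machine;
        asm.ManifestModule.GetPEKind(out peKind, out machine);

        // peKind shows ILOnly / Required32Bit / PE32Plus etc.;
        // machine shows I386 vs AMD64 from the PE header.
        Console.WriteLine("PE kind: {0}, machine: {1}", peKind, machine);
    }
}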
I have a program which needs to function in both an x86 and an x64 environment. It is using Oracle's ODBC drivers. I have a reference to Oracle.DataAccess.DLL. This DLL is different depending on whether the system is x64 or x86, though.
Currently, I have two separate solutions and I am maintaining the code on both. This is atrocious. I was wondering what the proper solution is?
I have my platform set to "Any CPU", and it is my understanding that VS should compile the DLL to an intermediate language such that it should not matter whether I use the x86 or x64 version. Yet, if I attempt to use the x64 DLL I receive the error "Could not load file or assembly 'Oracle.DataAccess, Version=2.102.3.2, Culture=neutral, PublicKeyToken=89b483f429c47342' or one of its dependencies. An attempt was made to load a program with an incorrect format."
I am running on a 32 bit machine, so the error message makes sense, but it leaves me wondering how I am supposed to efficiently develop this program when it needs to work on x64.
Thanks.
This is purely a deployment problem; you should never have to maintain different projects. It is an awkward one though, and boo on Oracle for not taking care of this themselves. Another consideration is that this assembly really should be ngen-ed on the target machine. Some options:
Create two installers, one for x64 and one for x86. The customer picks the right one, based on the operating system she uses. Simple enough, you just copy the right file.
Deploy both assemblies to the GAC. Now it is automatic, .NET picks the right one on either type of machine. Big companies should almost always use the GAC so they can deploy security updates, not sure why Oracle doesn't do this.
Deploy the assemblies to a x86 and x64 subdirectory of the install directory. You'll need to write an AppDomain.AssemblyResolve event handler that, based on the value of IntPtr.Size, picks the right directory.
Change the target platform on your EXE project to x86. Given that your code needs to work on a 32-bit machine as well as on a 64-bit machine, there isn't/shouldn't be a reason to build for AnyCPU.
This is a working solution for your problem:
Add the two DLLs (x86 and x64) to your solution in a subfolder. Make them "Copy if newer".
Reference the DLL you use for development and debugging from the two DLLs you added. Set Copy Local = false on that reference.
What this does is that when your app starts, the DLL is not auto-loaded. It will not be loaded until you use a type from that assembly. Once that happens, an event is raised in .NET that asks where it can find your assembly.
So sometime before the first use of that assembly make sure you attach yourself to that event.
AppDomain.CurrentDomain.AssemblyResolve += CurrentDomain_AssemblyResolve;
In the handler, make sure you load the correct DLL (x86 or x64) when it asks for it.
static System.Reflection.Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    if (args.Name.Equals("MyFullAssemblyName"))
    {
        var path = System.IO.Path.GetDirectoryName(System.Reflection.Assembly.GetExecutingAssembly().Location);

        // IntPtr.Size is 8 in a 64-bit process and 4 in a 32-bit process.
        if (IntPtr.Size > 4)
        {
            var dll = System.IO.Path.Combine(path, @"MySubDir\MyDLL_x64.dll");
            return System.Reflection.Assembly.LoadFile(dll);
        }
        else
        {
            var dll = System.IO.Path.Combine(path, @"MySubDir\MyDLL.dll");
            return System.Reflection.Assembly.LoadFile(dll);
        }
    }
    return null;
}
Voila. You can now run your app as both 32 bit and 64 bit.
Alternatively to adding the DLLs in a subfolder, you can make them as Embedded Resources, and then load them like this:
static System.Reflection.Assembly CurrentDomain_AssemblyResolve(object sender, ResolveEventArgs args)
{
    if (args.Name.Equals("MyFullAssemblyName"))
    {
        var ass = Assembly.GetExecutingAssembly();

        // Pick the embedded resource that matches the bitness of the process.
        if (IntPtr.Size > 4)
        {
            var strm = ass.GetManifestResourceStream("the.resource.name.for.MyDLL_x64.dll");
            var data = new byte[strm.Length];
            strm.Read(data, 0, data.Length);
            return Assembly.Load(data);
        }
        else
        {
            var strm = ass.GetManifestResourceStream("the.resource.name.for.MyDLL.dll");
            var data = new byte[strm.Length];
            strm.Read(data, 0, data.Length);
            return Assembly.Load(data);
        }
    }
    return null;
}
This does not work for all assemblies. Some "hybrid" assemblies tend to fail unless they are loaded from disk (which can be solved by writing them to disk just before loading).
If you're running on a 32-bit machine, then you have to load the 32-bit version of the Oracle DLL. A 32-bit program can't reference a 64-bit DLL. And, a 64-bit program can't reference a 32-bit DLL.
"Any CPU" is the correct target if you have multiple versions of the external DLL. The trick is making sure that the proper Oracle DLL is located and loaded. Your best bet is to locate the 64-bit version of the DLL on your 32-bit system and rename it so that the runtime can't find it.
Using AnyCPU with native early binding is just not going to work; for that you need two separate solutions and builds, as you saw. You will have to get hold of a 64-bit system to develop on, or at least to test x64-compiled dlls on.
However, with late binding you can use AnyCPU and the System properties to figure out which architecture you're running as and bind to the correct dll, if you keep them named like Oracle.DataAccess.x86.dll. If they're installed into the GAC, it's even easier: you can bind without even bothering to test for the architecture first, but I believe you still have to late bind.
Note that VMware can run a 64-bit guest on a 32-bit host, if you really can't be bothered to reinstall Windows.
You should be able to configure the same solution to build x86 and x64 versions separately. You may also need to add post-build steps to copy the correct version of the DLL to the corresponding output folders...
At least if you have to build 2 solutions, use the same source (add the files as references to the second solution, don't copy them into it).