i've got a question, is it possible to identify the creator of a .NET assembly, just with traces from VisualStudio within the assembly ?
Or can you even get a kind of unique ID of the creator out of it?
I don't mean the application information like company or description, they can be edited too easily.
The answer, given that the code is not strong named or signed, is no. Ultimately the only way would be some kind of code signing approach using a certificate issued by a public authority. Even that says unequivocally (theft aside) that a particular certificate owner signed the code, not that a particular person wrote it.
Into the realms of more conjecture: if the code was written via a unique compiler, then one could possibly work this out. However, I cannot see even this being unequivocal, since it says nothing about who ran the compiler.
Related
I am building a custom installer fed from an xml document...
I know most programmers do not build their own anymore but this is specifically what I was assigned so work with me please. The installer will need to uninstall old versions of the program before it can do its job.
I can obtain registry uninstall strings, no problem. The problem is that the users building the instruction files are not always going to have an exact display name.
So...
I am using Levenshtein distance to obtain possible matches above 70%, which leaves me open to mistakes. To correct them, I was hoping I could deserialize the GUID to obtain the name and make sure I had the right one, or something along those lines.
Can someone let me know where to look, or give any recommendations on how to make a redundant check in the event the likeness is less than 100% based on Levenshtein distance (and confirmed with Hamming distance when words/phrases are of equal length)?
Note:
Versions may not be known; it is a remove-all-old-versions operation
Publisher will be identical on all
Install location should be on the network, but that's not guaranteed; users love to copy locally
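A minimal sketch of the fuzzy match described in the question, assuming the 70% threshold mentioned above (the sample strings below are made up, not real DisplayName values):

```csharp
using System;

// Classic dynamic-programming Levenshtein distance between two strings.
int Levenshtein(string a, string b)
{
    var d = new int[a.Length + 1, b.Length + 1];
    for (int i = 0; i <= a.Length; i++) d[i, 0] = i;
    for (int j = 0; j <= b.Length; j++) d[0, j] = j;
    for (int i = 1; i <= a.Length; i++)
        for (int j = 1; j <= b.Length; j++)
        {
            int cost = a[i - 1] == b[j - 1] ? 0 : 1;
            d[i, j] = Math.Min(Math.Min(d[i - 1, j] + 1, d[i, j - 1] + 1),
                               d[i - 1, j - 1] + cost);
        }
    return d[a.Length, b.Length];
}

// Similarity ratio in [0, 1]; 1.0 means identical. The question's 70%
// threshold would be applied to this value.
double Similarity(string a, string b)
{
    int max = Math.Max(a.Length, b.Length);
    return max == 0 ? 1.0 : 1.0 - (double)Levenshtein(a, b) / max;
}

Console.WriteLine(Similarity("Acme App 2.1", "Acme App 2.0"));
```

Anything scoring at or above 0.7 would then go into the candidate list for the redundant check.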
GUIDs do not (or at least SHOULDN'T!) contain any information from the domain they were generated from. These are randomly generated numbers, with a keyspace large enough that they are supposedly guaranteed to be unique.
Unless you have a database or some form of repository to search for the GUID's associated information, a bare GUID is no more useful than an integer ID on a random database table. It's only an identifier.
GUIDs
EDIT
I found a VBS script that may do what you are looking for. It will uninstall an application by its registry ID. If your program is written in another language, you can still launch VBS scripts using the System.Diagnostics namespace:
System.Diagnostics.Process.Start("path to script here");
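Launching such a script with arguments might look like the sketch below; the script path and product GUID are placeholders, not values from the original script:

```csharp
using System.Diagnostics;

// Build the cscript command line; scriptPath and productGuid come from
// wherever your instruction file supplies them (placeholders here).
ProcessStartInfo BuildUninstallCommand(string scriptPath, string productGuid) =>
    new ProcessStartInfo
    {
        FileName = "cscript.exe",
        Arguments = $"//nologo \"{scriptPath}\" \"{productGuid}\"",
        UseShellExecute = false,
        CreateNoWindow = true
    };

// To actually run it (Windows only):
//   using (var p = Process.Start(BuildUninstallCommand(path, guid)))
//       p.WaitForExit();
```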
I'm looking for the right approach to verify a currently running executable from within that executable.
I've already found a way to compute a (SHA256) hash for the file that is currently running.
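For concreteness, the hash computation described might look something like this minimal sketch (the asker's actual code may differ):

```csharp
using System;
using System.IO;
using System.Security.Cryptography;

// Hash any file on disk; for the running executable you would pass
// System.Reflection.Assembly.GetEntryAssembly().Location.
string Sha256Of(string path)
{
    using (var sha = SHA256.Create())
    using (var stream = File.OpenRead(path))
        return BitConverter.ToString(sha.ComputeHash(stream)).Replace("-", "");
}
```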
The problem is: Where do I safely store this hash? If I store it in a config file, a malicious user can just calculate his own hash and replace it. If I store it in the executable itself, it can be overridden with a hex editor probably.
A suggestion I read was to do an asymmetrical en- (or was it de-) cryption, but how would I go about this?
A requirement is that the executable code hashes and en/decrypts exactly the same on different computers, otherwise I can't verify correctly. The computers will all be running the same OS which is Windows XP (Embedded).
I'm already signing all of my assemblies, but I need some added security to successfully pass our Security Target.
For those who know, it concerns FPT_TST.1.3: The TSF shall provide authorised users with the capability to verify the integrity of stored TSF executable code.
All the comments, especially the one from Marc, are valid.
I think your best bet is to look at authenticode signatures - that's kind of what they're meant for. The point being that the exe or dll is signed with a certificate (stamping your organisation's information into it, much like an SSL request) and a modified version cannot (in theory plus with all the normal security caveats) be re-signed with the same certificate.
Depending upon the requirement (I say this because this 'security target' is a bit woolly - the ability to verify the integrity of the code can just as easily be a walkthrough on how to check a file in windows explorer), this is either enough in itself (Windows has built-in capability to display the publisher information from the certificate) or you can write a routine to verify the authenticode certificate.
See this SO question, Verify whether an executable is signed or not (signtool used to sign that exe); the top answer links to an (admittedly old) article about how to programmatically check the Authenticode certificate.
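As a rough illustration of the programmatic route (this is not the linked article's code), X509Certificate.CreateFromSignedFile can extract the Authenticode certificate from a signed file on Windows. Note it only extracts the certificate; validating the chain and trust requires WinVerifyTrust:

```csharp
using System;
using System.Security.Cryptography.X509Certificates;

// Returns the signer's subject if the file carries an Authenticode
// certificate, or null otherwise. Extraction only; this does NOT prove
// the certificate chains to a trusted root.
string SignerOf(string path)
{
    try
    {
        return X509Certificate.CreateFromSignedFile(path).Subject;
    }
    catch (Exception)
    {
        return null; // unsigned, not a PE file, or platform not supported
    }
}
```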
Update
To follow on from what Marc suggested: even this won't be enough if a self-programmatic check is required. The executable can be modified to remove the check and then deployed without the certificate, thus killing the check.
To be honest, the host application/environment really should have its own checks in place (for example, requiring a valid authenticode certificate); if it's so important that code isn't modified, then the host should have its own steps for verifying it. I think you might actually be on a wild goose chase.
Just put in whatever check takes the least amount of effort on your part, without worrying too much about the actual security it apparently provides, because I think you're starting from an impossible position. If there is actually any genuine reason why someone would want to hack the code you've written, then it won't just be a schoolboy who tries to hack it, and any solution available to you (those mentioned in the comments etc.) will be subverted easily.
Rent-a-quote final sentence explaining my 'wild goose chase' comment
Following the weakest link principle - the integrity of an executable file is only as valid as the security requirements of the host that runs that executable.
Thus, on a modern Windows machine that has UAC switched on and all security features switched on; it's quite difficult to install or run code that isn't signed, for example. The user must really want to run it. If you turn all that stuff down to zero, then it's relatively simple. On a rooted Android phone it's easy to run stuff that can kill your phone. There are many other examples of this.
So if the XP Embedded environment your code will be deployed into has no runtime security checks on what it actually runs in the first place (e.g. a policy requiring authenticode certs for all applications), then you're starting from a point where you've inherited a lower level of security than you are actually supposed to be providing. No amount of security primitives and routines can restore that.
Since .NET 3.5 SP1, the runtime no longer checks the strong name signature when an assembly is loaded. If your assemblies are strong named, I therefore suggest checking the signature in code.
Use the native mscoree.dll with P/Invoke:
private static class NativeMethods
{
    // StrongNameSignatureVerificationEx returns a one-byte native BOOLEAN,
    // so the managed bool must be marshalled as U1.
    [DllImport("mscoree.dll")]
    [return: MarshalAs(UnmanagedType.U1)]
    public static extern bool StrongNameSignatureVerificationEx([MarshalAs(UnmanagedType.LPWStr)] string wszFilePath, byte dwInFlags, ref byte pdwOutFlags);
}
Then you can subscribe to the assembly-load event and check every assembly that is loaded into your (current) app domain:
AppDomain.CurrentDomain.AssemblyLoad += CurrentDomain_AssemblyLoad;
private static void CurrentDomain_AssemblyLoad(object sender, AssemblyLoadEventArgs args)
{
    Assembly loadedAssembly = args.LoadedAssembly;
    if (!VerifyStrongNameSignature(loadedAssembly))
    {
        // Do whatever you want when the signature is broken.
    }
}

private static bool VerifyStrongNameSignature(Assembly assembly)
{
    byte wasVerified = 0;
    // Second argument (1) forces verification even when the runtime would skip it.
    return NativeMethods.StrongNameSignatureVerificationEx(assembly.Location, 1, ref wasVerified);
}
Of course, someone with enough experience can patch the check out of your assembly, or simply strip the strong name from your assembly.
Is it possible to create an app in C++ or C# so I can patch an exe file for copy-protection purposes?
So if a user has an account on my website with the software tied to it, I can require them to enter a key which is checked with the database and then execute or show an error.
When I say "patch", I mean applying to an already built/compiled exe. Thanks for the help. :)
It's easily possible; many packers and protection systems like Themida do this. However, things like this can be easily cracked, so you need to evaluate the effort vs. reward required for someone to crack your program.
However, to directly answer your question: your best bet is to hook the code entry point defined in the PE header and have it redirect to your checker (OS dependent). UPX is an open-source executable packer and should provide a good base to use, or at least a point of reference, as it hooks the entry point of the executable to run the unpacking engine. You can also find a few articles on packers and protectors here.
Depending on how complicated your copy protection is, "patching" may in the simplest case boil down to writing a few bytes at selected offsets in the protected EXE file. This project may be interesting.
It is good practice to always sign executable files (exe, dll, ocx, etc.). On the other hand, with an open source project it may be considered disregarding the contributions to the project from all the other developers.
This is quite an ethical dilemma for me and I would like to hear more opinions on this from either people who have been in a similar situation or people who contributed to an open source project.
I would like to note that this question is about an open-source project written in C# using .NET 4, so when a user clicks the executable, he or she will be prompted with a warning stating that the file is from an untrusted publisher if it is not digitally signed.
By the way, the assemblies all have strong-naming (signature) already, but they are not digitally signed yet (i.e. using a Verisign Code signing certificate).
.NET is a different beast, as many features (especially for libraries) require the file to be signed with a strong name key; but those can be self-signed with no complaint from the final product (it uses the program's cert, not the libraries', to pop up the message box you refer to in your original question).
However, in the general case I see nothing wrong with a group signing the official distro with a private key. If you change the source and recompile, then technically "the file is from an untrusted publisher": I may trust Canonical, but I do not trust you. As long as the executable not being signed by a specific publisher does not stop it from being used in the manner it was intended (the tivoization clause in the GPL), I see no reason NOT to sign your executables.
Saying that this is "quite an ethical dilemma" is probably blowing it out of proportion. You definitely want to code sign your executables, and I don't really see the problem with you signing it. For example, TortoiseSVN is signed by "Stefan Kueng, Open Source Developer".
That said, it is probably a good idea to form some kind of legal entity for your project, and then get the code-signing certificate in the name of your project's entity. That way, rather than you personally signing the executable (and thus "taking all the credit"), your project's name shows up as the publisher.
If you were in the US, I would suggest either forming a LLC or possibly a 501(c)(3) organization, which is exempt from income tax and allows individuals to make tax-deductible donations to the project. (Many open source projects organize as 501(c)(3) entities, including WordPress and jQuery.) I see you're in Turkey, so you'll have to research your local requirements for forming some kind of legal entity; once formed, you'll be able to get a certificate from a CA in the name of your project's entity rather than your own.
This question already has answers here:
How can I protect my .NET assemblies from decompilation?
(13 answers)
Closed 6 years ago.
I am writing a .NET application (a Windows class library) for implementing the licensing our product.
The problem is that the DLL can be easily disassembled by the MSIL disassembler or any other third-party tools and one can easily break the code.
I have even tried signing the assembly, but still it can be disassembled.
So how do I prevent this?
Is there any tool to available for this?
Check out the answers for this question.
You cannot achieve complete protection, but you can hinder disassembly in ways that make it more difficult for people to succeed at it. There are several ways to do this, some of them detailed in the answers to the question linked above:
Obfuscate your code.
Use public/private key or asymmetric encryption to generate product license keys.
Use a packer to pack your .NET executable into an encrypted w32 wrapper application.
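The second item in the list above can be sketched with RSA signatures. This is an illustrative design, not a drop-in licensing system, and the payload format is made up:

```csharp
using System;
using System.Security.Cryptography;
using System.Text;

// Vendor side: sign the license payload (e.g. "user@example.com;2025-12-31")
// with the private key, which never leaves your server.
byte[] SignLicense(string payload, RSA privateKey) =>
    privateKey.SignData(Encoding.UTF8.GetBytes(payload),
                        HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);

// Client side: only the public key ships inside the application, so an
// attacker can verify existing licenses but cannot forge new ones.
bool VerifyLicense(string payload, byte[] signature, RSA publicKey) =>
    publicKey.VerifyData(Encoding.UTF8.GetBytes(payload), signature,
                         HashAlgorithmName.SHA256, RSASignaturePadding.Pkcs1);
```

The design choice here is that validity comes from a signature rather than a secret algorithm, so even full disassembly of the client reveals nothing that lets a cracker mint keys.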
What I would add is incremental updating, both for your core functionality and for the protection code, so your users constantly benefit from new features while crackers lag behind you in breaking your software. If you can release faster than they can break and distribute your software, you come out ahead. Legitimate users will always have technical support and a say regarding new features. They are your best market, as the ones who crack your software wouldn't have paid for it anyway.
You can't, but you can use an obfuscator to make it much harder to make sense of your code.
For example, have a look at Dotfuscator.
The community edition of this program is included with Visual Studio.
.NET has an attribute called SuppressIldasmAttribute which prevents the ILDASM tool from disassembling the code (other decompilers ignore it). For example, consider the following code:
using System;
using System.Text;
using System.Runtime.CompilerServices;
[assembly: SuppressIldasmAttribute()]
namespace HelloWorld
{
class Program
{
static void Main(string[] args)
{
Console.WriteLine("Hello world...");
}
}
}
As you can see, there are just two differences:
We have added the System.Runtime.CompilerServices namespace declaration.
We have added [assembly: SuppressIldasmAttribute()] attribute.
After building the application in Visual Studio, when we try to open the resulting EXE file in ILDASM, we now get an error message instead of the disassembly.
Anything written for the .NET framework is subject to disassembly. You cannot prevent it.
There are obfuscation tools available that will change variable names and insert other 'noise' into your IL, for instance Dotfuscator.
You might want to consider taking another approach with your licensing library, that is, using something else other than .NET, if licensing your product is absolutely necessary.
As mentioned above in the selected answer, there is no truly foolproof way to secure your code.
Even in something like C++, if your application's code is in your customers' hands, i.e. they have physical access to the binary, that application could potentially be disassembled.
The solution would be to keep the functionality that you are licensing, out of reach of people who may want to disassemble it, let them use it, but don't let them hold it.
Consider using :
Web Services
Keep your marketable content server side and beyond the reach of your clients. They can use it, but not examine the code.
Encrypted Binaries and Online Binaries
Maybe even stream assemblies in an encrypted format to a wrapper application. Keep your decryption keys server side to prevent offline disassembly. This might be circumvented, however, if someone found a way of exporting the assembly from the application's app domain once it has been loaded. (You cannot load an encrypted binary, so an end user might wait until your application has done the work of decryption, then exploit that to export the finished binary. I was investigating a way to accomplish this exporting; I didn't quite get it working, but that doesn't mean someone else won't.)
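A minimal sketch of the decrypt-then-load idea, assuming AES for the transport encryption and an in-memory Assembly.Load; key handling is deliberately simplified here, since in practice the key would be negotiated at runtime rather than stored anywhere locally:

```csharp
using System;
using System.Security.Cryptography;

// Decrypt an assembly image that was streamed from the server.
byte[] DecryptAssembly(byte[] blob, byte[] key, byte[] iv)
{
    using (var aes = Aes.Create())
    using (var decryptor = aes.CreateDecryptor(key, iv))
        return decryptor.TransformFinalBlock(blob, 0, blob.Length);
}

// Then load it directly from memory, never touching disk:
//   var asm = System.Reflection.Assembly.Load(DecryptAssembly(blob, key, iv));
```

As the answer notes, this only moves the attack to the moment after decryption; the plaintext assembly exists in memory and can potentially be dumped from there.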
The only thing to remember is that ANY code, no matter how well coded it may be, is vulnerable if it is on a user's system. You have to assume the worst when you put binaries on their system. A really talented engineer can disassemble any DLL, be it C++ or C#.