How can we obfuscate a .NET DLL (MVC web application)?
I am trying to obfuscate an MVC website DLL using Dotfuscator. When I run the application with the obfuscated DLL, it doesn't work properly: it only shows the website's HTML content or structure.
Why would you? Web application binaries reside on the server, and with sane configuration settings they're unretrievable through HTTP requests. If that is not the case, you'd be better off spending your time configuring your web server instead.
As to why you can't really obfuscate MVC code: the architecture relies heavily on reflection. Because controllers and actions aren't actually called from code, the obfuscator doesn't know to update the "caller" side of those reflection lookups, and the application will just crash.
The reason your MVC app is blowing up is that by default Dotfuscator uses a type of obfuscation where it renames classes, methods and pretty much everything else that gets a name. It's probably more "secure" and all, but renaming obfuscation kills anything that uses reflection.
The good news is that you can configure Dotfuscator to use another approach called control flow obfuscation. Enable that, disable the renaming obfuscation, and you should be OK.
If this wasn't enough info and you think I'm on the right track for you, reply to me and I can post a Dotfuscator config example in the morning. (It's 1:30 AM where I am.)
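In the meantime, here's a minimal sketch of the attribute-based alternative, assuming your obfuscator honors System.Reflection.ObfuscationAttribute (Dotfuscator does for renaming; the exact feature strings supported vary by tool and edition). It marks a controller so renaming skips it while other transforms, like control flow obfuscation, can still run:

```csharp
using System.Reflection;
using System.Web.Mvc;

// Keep the type and member names intact so MVC's reflection-based routing
// can still find the controller and its actions after obfuscation.
[Obfuscation(Feature = "renaming", Exclude = true, ApplyToMembers = true)]
public class HomeController : Controller
{
    public ActionResult Index()
    {
        return View();
    }
}
```

You'd apply the same attribute (or equivalent project-level excludes) to every controller, model bound by name, and anything else that gets looked up via reflection.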
Related
I've been asked to look into a live client site which currently isn't working. I've been told that an IIS recycle will fix the issue for about 3 months, after which it reappears. The issue seems to be in a 3rd party CMS, but I can't currently provide any debugging information for them to try and reproduce it.
That got me wondering if it would be possible to do the following: put together a simple ASP.NET page with a text editor which can accept arbitrary input, take the input and compile/execute it in the current App Domain using the Roslyn service, and print any output to another text area on the page.
Can anyone give me an indication of whether this is achievable? The bits I'm not sure I can achieve are:
Running the Roslyn code in the context of the current page/app domain.
Tracing output to the page without turning on global tracing.
When you load a dynamic assembly, by default it loads into the same AppDomain that you're running in, whether you use Roslyn or not.
However there are some considerations:
You do not need the Roslyn service (which is used by the script engine); you only need the Roslyn client DLLs if you're just going to build and run the code.
You don't really need Roslyn to compile dynamic code and run it. .NET already has the ability to load, compile and run an arbitrary lump of code. This article is a little old but still valid. Use reflection in your page to load and run the dynamic DLL.
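As a rough sketch of that non-Roslyn route: compile the submitted source with CodeDOM and invoke it via reflection. The convention that the snippet defines a Probe class with a static Run() method is just an illustrative assumption, not something the article prescribes.

```csharp
using System;
using System.CodeDom.Compiler;
using Microsoft.CSharp;

public static class DynamicRunner
{
    // Compiles the submitted C# source in memory and invokes a known entry
    // point via reflection; the compiled assembly loads into the current
    // AppDomain, as described above.
    public static string CompileAndRun(string source)
    {
        var parameters = new CompilerParameters { GenerateInMemory = true };
        parameters.ReferencedAssemblies.Add("System.dll");

        using (var provider = new CSharpCodeProvider())
        {
            CompilerResults results = provider.CompileAssemblyFromSource(parameters, source);
            if (results.Errors.HasErrors)
            {
                return results.Errors[0].ToString();
            }

            // Hypothetical convention: the snippet defines Probe.Run() returning a string.
            Type type = results.CompiledAssembly.GetType("Probe");
            object output = type.GetMethod("Run").Invoke(null, null);
            return (string)output;
        }
    }
}
```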
If it is the Roslyn script engine you need, that requires the Roslyn service (to act as a host). The service needs Visual Studio 2012. As Daniel Mann points out in the comment, that's not licensed for production.
Memory leaks via IIS are often caused by a given thread/request, not by actions on the App Domain (or shared resources). Your dynamic code (be it Roslyn Script Engine or plain old dynamic assembly) will be running outside where the leaks are happening, so I doubt you will see any problems.
This scares me. Running dynamic code straight onto live sounds amazingly dangerous. Protect yourself!
Can you reproduce it locally, or on a test machine similar to live? If it only happens over an extended period of time, you can simulate usage with automation like Selenium.
Once you have it reproduced, use something like ANTS Memory Profiler to see where the problems are.
I have a weird scenario in which the website seems to randomly run out of memory from time to time: it works for weeks, then suddenly everything throws an out of memory exception, and it stays that way until the server is rebooted. It may happen after weeks or after days; we weren't able to identify a regular pattern.
Here is a list of the tech used for this site:
.NET Framework 3.5
MVC 2.0 with C#
IIS 6.0 on a dedicated server (no policy restrictions, etc.)
3-layer architecture (UI - BLL - DAL)
Automapper 1.1.0.118
Elmah 1.1
FluentValidation 2.0
MvcContrib 2.0.95.0
MvcSiteMapProvider 3.0.0.1
Castle 2.5.2
NHibernate 3.0.0.4
FluentNHibernate 1.1.0.0
PdfSharp 1.31.1789.0
MarkdownSharp
Other than this, the site includes (via iframes) some old legacy ASP forms. Those forms are the same ones that were on the old version of the site (which was entirely in classic ASP); they have some problems, but the old site never ran out of memory.
I've already checked the common stuff, like making sure all IDisposable-implementing classes are used inside using statements, that there are no infinite loops, etc.
The site doesn't do anything strange: it pulls some data from the DB like news, generates some PDFs on the fly after certain form submissions, and allows users to subscribe to a newsletter. The usual stuff.
I'm really clueless, I've developed many sites, used the mentioned libraries almost everywhere, but this is the first time I experience this kind of problem.
I know this information isn't enough to "find" the problem, but if anyone can think of something I might have overlooked, or anything, it will be very welcome :)
EDIT: A detail that might be important. We have another website running on the same server (made with old ASP) and it runs just fine while the other is stuck. So it seems the overall server memory isn't depleted, otherwise that site wouldn't work either.
Install DebugDiag. Trigger it to take dumps of the process as it breaches memory thresholds (say at 300 MB and then at every 100 MB after that).
Comparing the dump files should give you a clue as to what is suddenly occupying all that memory.
I would have a look at how Castle is configured and used. Do you use Castle to resolve your controller dependencies, via the ControllerBuilder.Current.SetControllerFactory method? If you do, you also have to remember to release the controller instances.
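For illustration, a minimal Windsor-backed controller factory with the release hook in place (this assumes the usual DefaultControllerFactory-derived pattern; your registration code may differ):

```csharp
using System;
using System.Web.Mvc;
using System.Web.Routing;
using Castle.Windsor;

public class WindsorControllerFactory : DefaultControllerFactory
{
    private readonly IWindsorContainer _container;

    public WindsorControllerFactory(IWindsorContainer container)
    {
        _container = container;
    }

    protected override IController GetControllerInstance(RequestContext requestContext, Type controllerType)
    {
        if (controllerType == null)
        {
            return base.GetControllerInstance(requestContext, controllerType);
        }
        return (IController)_container.Resolve(controllerType);
    }

    // Without this override, controllers resolved from Windsor are never
    // released, and any dependencies they hold can pile up in memory
    // until the app pool is recycled.
    public override void ReleaseController(IController controller)
    {
        _container.Release(controller);
    }
}
```

You'd still register it with ControllerBuilder.Current.SetControllerFactory(new WindsorControllerFactory(container)); the important part is that ReleaseController hands each instance back to the container.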
If you're hosting multiple application pools on IIS 7/7.5 under high load, try changing the GC mode in aspnet.config.
See the [<gcServer> Element](http://msdn.microsoft.com/en-us/library/ms229357.aspx) documentation.
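Roughly, the switch in aspnet.config looks like this (this is the standard <gcServer> element from the documentation linked above; whether server GC actually helps depends on your workload):

```xml
<configuration>
  <runtime>
    <!-- Server GC trades memory footprint for throughput on multi-core machines. -->
    <gcServer enabled="true" />
  </runtime>
</configuration>
```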
I'm inheriting a web application and the previous programmer compiled all his code into a .dll. The .cs files are not present on the server.
Working on previous projects, I've always uploaded the .aspx file and the corresponding .cs file. It's never been a problem for me and I always thought it was standard procedure. Am I wrong or just paranoid?
Will, I think it is quite common to keep code precompiled into a DLL. The code is then less exposed to potential security holes. It also provides many advantages, including faster initial response time, error checking, source-code protection, and efficient deployment. This is particularly important on large sites where there are frequent changes to Web pages and code files.
Leaving source code as a part of the project isn't necessarily the best source code management process. There are tools for that.
Also, precompiling source code isn't out of the ordinary (this is a Web Application project rather than a Web Site project in Visual Studio), and has many benefits.
Note that this doesn't make you wrong or paranoid.
There are good reasons for both strategies; you just have to figure out what is going to work best for your environment and for the application.
In some ways it is good to have it precompiled if you worry about someone accidentally making a change on the server without checking the change into source control. With non-precompiled code, if you don't have change control on your server, it can be hard to figure out who "accidentally" made a change, and why, without it being checked in.
On the other hand, not precompiling can make deployment more straightforward.
Just do a little research behind both strategies and decide what is going to work best in your situation.
As Nader pointed out, in a Web Application you don't need the CS files at all. There is not a huge risk of the source files being served accidentally, as protecting these files is a core function of IIS request management. Still, it is generally good practice not to deploy them to a production web server.
In any case, source files should at the bare minimum always be backed up in a location that is not the web server and should be source controlled whenever possible. I have seen too many websites where the source files were lost and the site was useless as a result.
Like everyone above has said, compiling source code into DLLs is considered best practice.
If you'd like to see the code of the DLLs you've been left with, there's a handy (and free!) tool called Reflector (apologies if you've already got it):
http://www.red-gate.com/products/reflector/
Just load up the DLL and then disassemble to view the source.
Web Application Projects compile into .dlls and leave no source on the server.
Web Site Projects deploy all the source to the server.
It's a religious war as to which is best. Google will present you with many varied opinions, so I won't press my own opinions on you.
How do you precompile WCF code so that it can't be seen by anyone who has access to the deployed service?
It's possible to do this with ASP.NET code by using the "precompilation" feature. Basically, what the precompilation feature does is enable the developer to deploy "binaries" to IIS instead of a folder containing source code.
Can this be done with WCF too?
I think you may be confused about something.
A WCF project is always "pre-compiled", unless you go out of your way to be unusual. The source files do not need to be deployed in order for the service to operate.
What leads you to believe that your source code needs to be deployed?
You're looking for an obfuscator.
I'm not sure why people are downvoting your post, as this is a legitimate question. However, what you are actually looking for is source code protection, not precompilation.
Source code protection has to be many-faceted to be effective; however, I routinely state it's generally worthless, as you will spend a lot of time and money for a very intangible benefit. If your product is based on a brand-new algorithm that has true commercial value, C# is definitely not the optimal language to develop that module in.
There are posts on here about this topic, https://stackoverflow.com/questions/402430/how-do-i-use-net-without-loss-of-control-over-intellectual-property being one of them. It's also possible this question will be closed as a duplicate.
We're coming up to a big release in a web project I work on, and I'm starting to think more and more about JavaScript/CSS performance and versioning. I've got a couple of high-level ideas so far:
1. Write (or customize) an HTTP handler to do this for me. This would obviously have to handle caching as well to justify the on-the-fly I/O that would occur.
2. Add these steps into a custom MSBuild script that would be run for deployment only.
I'm also looking at automatically generating config files for each of the servers I deploy to, which lends itself to the second idea. The major advantage I see with the first idea is that I could dynamically handle versioning (at least that's what one of my links at the bottom says; I've yet to convince myself that this would actually work).
Anyway, I’m curious if any of these problems have already been solved. I’d love any feedback. Thanks!
Here are some resources that I've been looking at so far:
http://madskristensen.net/post/Combine-multiple-stylesheets-at-runtime.aspx
http://madskristensen.net/post/Remove-whitespace-from-stylesheets-and-JavaScript-files.aspx
http://www.west-wind.com/WebLog/posts/413878.aspx
http://svn.offwhite.net/trac/SmallSharpTools.Packer/wiki
Do this as part of your continuous integration build process.
Compare all JS files to the previously checked-in versions. For each one that has changed, call out to the YUI Compressor on that JS file and name the output with the current revision number. Add that file to your repository, and change a config file to hold the latest revision number for that JS file. Then write a custom control that imports a JS file. This control will either use the uncompressed JS when running on a development machine, or the compressed file with the revision number from the config file when run on a deployed setup.
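A small sketch of what that custom control's decision logic might look like (the helper name, config key, and URL layout are all hypothetical):

```csharp
using System.Configuration;
using System.Web;

public static class ScriptHelper
{
    // Returns the markup for a <script> tag: the raw file on a development
    // machine, or the revision-stamped, compressed copy recorded in the
    // config file on a deployed setup.
    public static string Script(string name)
    {
        string src;
        if (HttpContext.Current.IsDebuggingEnabled)
        {
            src = "/scripts/" + name + ".js";
        }
        else
        {
            // Hypothetical appSettings key, e.g. <add key="scriptRevision:site" value="1234" />
            string revision = ConfigurationManager.AppSettings["scriptRevision:" + name];
            src = "/scripts/" + name + "." + revision + ".min.js";
        }

        return "<script type=\"text/javascript\" src=\"" + src + "\"></script>";
    }
}
```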
In addition, for idea 1), Microsoft has built-in support for embedding resource files into a DLL. These always get updated when your project is changed and recompiled.
The problem is, you don't have control over caching and file names. When debugging, it's hard to pick which file to debug when everything is called "WebResource.axd" something. That was hell.
Would love to read how others do it, too.
Personally, I'd rather have it done as part of the build process, so as to avoid the performance cost of dynamically doing this on each request. I guess you can lessen the hit by implementing proper caching, but why bother... IIS can handle that for you already (unless you are not running on top of IIS, I guess).
As a general recommendation, the things that Steve Souders talks about are also great if you want to speed up browser rendering. If you have not already, take a look at this.
My team recently moved away from keeping scripts as embedded resources and we are very happy with the results. Yes, you can combine and minify them using handlers, but it's a bit of a hassle, especially when you want to host them from a separate domain.
What we do now is keep all of our control script files separate and then use a tool like js-builder during the build process to combine and minify them. We actually output two files from the tool: one simply combined, for debugging, and the combined and minified one for production use.