As far as you know, are there any problems with running a C# application from a shared exe file? This comes from a customer who wants their 20 clients to run the same exe file from a shared path.
First tests didn't show any problems, but I don't know about the long term. I personally don't like this and don't think the framework was developed with this in mind, but they do it so the exe can be upgraded quickly when needed.
Any points to discourage this?
Thanks
Sav
The first consideration is deployment concerns. Prior to .NET 3.5 SP1, this was not allowed by default because the shipped security policy treated network locations as less trusted. With .NET 3.5 SP1 and later, this is no longer the case. If you are working with earlier versions of the framework, you could, of course, use caspol to modify the security policy to allow it. Additionally, some more recent versions of Windows may have additional security policies outside of .NET that can prevent execution from remote locations.
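For those older framework versions, the caspol call to grant full trust to a UNC share typically looks something like the line below (run it from an elevated prompt with the caspol.exe that matches your framework version; the share path is of course a placeholder):

    caspol -m -ag 1.2 -url "file://\\server\share\*" FullTrust

Here -m targets machine policy, -ag adds a code group under group 1.2 (the LocalIntranet zone group in the default policy), and the URL membership condition covers everything under that share.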
The second consideration is making sure the application is designed in a way that it is aware of its environment: it should not assume that paths are relative to the local machine when they are not, which could affect the resolution of external resources and, depending on the situation, could result in resource contention or users overwriting each other's data.
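As a small illustration of what "aware of its environment" means in practice, here is a sketch that resolves read-only resources against the executable's own folder and keeps writable per-user data on the local machine (the helper and file names are made up for the example):

    using System;
    using System.IO;

    static class AppPaths
    {
        // Read-only resources that ship next to the exe: resolve them against the exe's own
        // folder (which may be a UNC path), not the current working directory.
        public static string ResourcePath(string fileName)
        {
            return Path.Combine(AppDomain.CurrentDomain.BaseDirectory, fileName);
        }

        // Per-user, writable data: keep it on the local machine so 20 users sharing one exe
        // on a network path don't overwrite each other's files.
        public static string UserDataPath(string fileName)
        {
            string dir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.ApplicationData), "MyApp");
            Directory.CreateDirectory(dir);
            return Path.Combine(dir, fileName);
        }
    }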
The third is availability. What if the server hosting that executable becomes unavailable (is powered off by accident, crashes, experiences networking issues, is renamed, etc.)? Is that acceptable? How large is the executable? If it is large, that can increase network traffic and at any rate result in the executable being slow to start as it is invoked over the network.
I suppose for trivial applications, these issues may be negligible. However, there are lots of ways of installing applications on client computers in a way that they are installed and updated quickly and easily, such as ClickOnce deployment.
We currently run software designed in house. It runs off a central SQL database. Each computer is set up with a batch program that runs at Windows start-up and downloads the current program files from the central server. The .exe is therefore run from the individual's computer and not from the server. In our case at least, this has been found to be the most efficient method.
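In C# terms the start-up step boils down to something like the sketch below (our actual implementation is a batch file, and the server and folder names here are invented for the example):

    using System;
    using System.Diagnostics;
    using System.IO;

    class LauncherStub
    {
        static void Main()
        {
            // Invented locations: copy the current build down from the server, then run it locally.
            const string serverDir = @"\\appserver\deploy\MyApp";
            string localDir = Path.Combine(
                Environment.GetFolderPath(Environment.SpecialFolder.LocalApplicationData), "MyApp");

            Directory.CreateDirectory(localDir);
            foreach (string source in Directory.GetFiles(serverDir))
            {
                string target = Path.Combine(localDir, Path.GetFileName(source));
                File.Copy(source, target, true); // overwrite stale copies
            }

            // The exe now runs from the individual machine, not from the share.
            Process.Start(Path.Combine(localDir, "MyApp.exe"));
        }
    }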
Related
Please help me understand which one is better for logging, performance-wise.
Logging to SQL vs. files vs. AWS: which is faster in C# applications?
If I understand it correctly, you want to log useful information from your application (C#) somewhere to be able to refer to it (presumably when something goes wrong or to extract information for analytics).
As a rule of thumb, in interprocess communication most of the time is spent sending data over the network. If you apply this knowledge, you will be able to order your choices (and other options) from a performance point of view.
As an indication, the order in terms of performance for a few cases would be:
Log file on the same drive as your program and being written from within the same process
Log file on a mounted drive on the same machine that runs your program and being written from within the same process
Log written in a database that resides on the same machine (localhost) as program
Log written in a database that resides on a different machine but in a local network
Log written on AWS which obviously will not be within your local network.
...
That said, there are other considerations as well. For example, a DB on a powerful machine in a local high-bandwidth network may write faster than a low-end machine (e.g. an ordinary laptop) hosting both the DB and the program. Similarly, using Direct Connect or a fibre line between AWS and the local network boosts performance many times over.
Thus, the answer is not straightforward; many factors can change the order. The safest bet is to use log files on the same machine. You can always run a separate process to read asynchronously from the file and write wherever you wish.
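To illustrate the last point, here is a bare-bones sketch of logging to a local file from a background queue so callers never wait on I/O (in a real application you would more likely use an existing library such as log4net or NLog, which already do this):

    using System;
    using System.Collections.Concurrent;
    using System.IO;
    using System.Threading.Tasks;

    // Messages go into an in-memory queue; a background task appends them to a local file,
    // so the calling code never blocks on disk (or network) I/O.
    sealed class SimpleFileLogger : IDisposable
    {
        private readonly BlockingCollection<string> _queue = new BlockingCollection<string>();
        private readonly Task _writer;

        public SimpleFileLogger(string path)
        {
            _writer = Task.Run(() =>
            {
                using (var stream = new StreamWriter(path, true))
                {
                    foreach (string line in _queue.GetConsumingEnumerable())
                        stream.WriteLine(line);
                }
            });
        }

        public void Log(string message)
        {
            _queue.Add(string.Format("{0:o} {1}", DateTime.UtcNow, message));
        }

        public void Dispose()
        {
            _queue.CompleteAdding();
            _writer.Wait();
        }
    }

    // Usage:
    // using (var log = new SimpleFileLogger("app.log"))
    //     log.Log("something happened");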
I recently ran into a problem, when deploying an application on a customer machine. I asked him to drop a number of (updated) assemblies into the program folder. Because he downloaded them via the Dropbox website, the OS marked them as blocked and the assemblies couldn't be loaded via reflection. It took me some time to figure out the issue, with the help of this post:
.net local assembly load failed with CAS policy
I am now wondering whether it is a good idea to load the assemblies with Assembly.UnsafeLoadFrom(...) instead of Assembly.LoadFrom(...), just to avoid this kind of issue in the future. (I am aware that sending assemblies over the internet and letting the customer drop them into the program files folder isn't the golden path of software deployment, but in reality you sometimes need to improvise...).
From what I have read, the method requires the calling application to run in a full-trust environment, which is usually the case for the application I am talking about.
My question is: apart from that requirement of running in full trust, are there any side effects of this method? Are there scenarios where the application will throw an exception because the Windows user account lacks privileges, etc.?
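For context, the pattern I have in mind is roughly this (a sketch; the fallback-on-exception shape is just one option, I could also call UnsafeLoadFrom unconditionally):

    using System.IO;
    using System.Reflection;

    static class PluginLoader
    {
        public static Assembly LoadPlugin(string path)
        {
            try
            {
                // Normal, zone-aware load.
                return Assembly.LoadFrom(path);
            }
            catch (FileLoadException)
            {
                // Typically thrown when the file carries the "downloaded from the internet"
                // zone marker; UnsafeLoadFrom ignores that marker but requires full trust.
                return Assembly.UnsafeLoadFrom(path);
            }
        }
    }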
I have read here how to set the LARGEADDRESSAWARE flag, and this is done for my Windows Service. This Windows Service, however, hosts a WCF service based on another project, and that service uses libraries and so on from other projects.
I need the entire application to use LARGEADDRESSAWARE. Is it enough to set it on the Windows Service project (ServiceBase), or do I need to set it on all projects?
At this point I can't switch to 64 bits, so this will have to do.
It is not an option that's exposed by the IDE; you'll have to turn it on by running editbin.exe in a post-build event. This answer shows the commands you need to use.
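The post-build event typically boils down to something like the two lines below (the vsvars32.bat path is the classic location for older Visual Studio versions; newer versions ship VsDevCmd.bat instead, so adjust accordingly):

    call "$(DevEnvDir)..\tools\vsvars32.bat"
    editbin.exe /largeaddressaware "$(TargetPath)"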
Do note however that it is fairly likely that you are wasting energy on this. It only has an effect when the operating system can provide an execution environment that supports "large addresses". That used to be possible many years ago with the /3GB boot option, but it stopped being useful a while ago and is also very detrimental on servers, which really need the kernel address space. The flag is still useful when your machine boots a 64-bit version of Windows: any 32-bit process can get a 4 GB address space if it is linked with /LARGEADDRESSAWARE. But if you have such an operating system, then changing the project's Target platform to AnyCPU is certainly the much more productive way to take advantage of the much larger address space you get in a 64-bit process. Maybe that doesn't apply in your specific case, but it is otherwise the best general advice.
We are seeing a very high amount of CPU and memory usage from one of our .NET MVC apps and can't seem to track down the cause. Our group does not have access to the web server itself but instead gets notified automatically when certain limits are hit (90+% of CPU or memory). Running locally, we can't seem to find the problem. Some items we think might be the culprit:
The app has a number of threads running in the background when users take certain actions
We are using memcached (on a different machine than the web server)
We are using web sockets
Other than that the app is pretty standard as far as web applications go. Couple of forms here, login/logout there, some admin capabilities to manage users and data; nothing super fancy.
I'm looking at two different solutions and wondering what would be best.
Create a page inside the app itself (available only to app admins) that shows information about memory and CPU being used. Are there examples of this or is it even possible?
Use some type of 3rd-party profiling service or application that gets installed on the web servers and allows us to drill down to find what is causing the high CPU and memory usage in the app.
I recommend the ASP.NET MVC MiniProfiler: http://miniprofiler.com/
It is simple to implement and to extend, can run in production mode, and can store its results to SQL Server. I have used it many times to find difficult performance issues.
Another possibility is to use http://getglimpse.com/ in combination with the MiniProfiler Glimpse plugin: https://github.com/mcliment/miniprofiler-glimpse-plugin
Both tools are open source and don't require admin access to the server.
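For a feel of the amount of code involved, a MiniProfiler step around a suspect block looks roughly like this (based on the classic ASP.NET MVC package; initialization in Global.asax differs between MiniProfiler versions, so check the docs for yours):

    using System.Web.Mvc;
    using StackExchange.Profiling;

    public class DashboardController : Controller
    {
        public ActionResult Index()
        {
            // MiniProfiler.Current is null when profiling is not active for this request;
            // the Step extension method is written to tolerate that, so no null check is needed.
            using (MiniProfiler.Current.Step("Load dashboard data"))
            {
                // ... the code you suspect of being slow ...
            }
            return View();
        }
    }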
You can hook up PreEmptive's Runtime Intelligence to it: http://www.preemptive.com/
Otherwise a profiler, or load test could help find the problem. Do you have anything monitoring the actual machine health? (Processor usage, memory usage, disk queue lengths, etc..).
http://blogs.msdn.com/b/visualstudioalm/archive/2012/06/04/getting-started-with-load-testing-in-visual-studio-2012.aspx
Visual Studio has a built-in profiler (depending on version and edition). You may be able to run WMI queries against the web server that has the issues, or write/provide diagnostic recording/monitoring tools to hand over to someone who does have access.
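If you end up writing your own little monitor, the machine-level numbers are easy to get at with performance counters, for example (counter and category names assume an English Windows install; an overload of PerformanceCounter also takes a machine name if you are granted remote access):

    using System;
    using System.Diagnostics;
    using System.Threading;

    class HealthProbe
    {
        static void Main()
        {
            // Machine-wide counters for CPU load and available memory.
            var cpu = new PerformanceCounter("Processor", "% Processor Time", "_Total");
            var freeMem = new PerformanceCounter("Memory", "Available MBytes");

            while (true)
            {
                // The first CPU reading is always 0, so sample twice with a delay.
                cpu.NextValue();
                Thread.Sleep(1000);
                Console.WriteLine("CPU: {0:F1}%  Free memory: {1} MB",
                    cpu.NextValue(), freeMem.NextValue());
            }
        }
    }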
Do you have any output caching? What version of IIS? Is the 90% processor usage you are getting alerted about actually coming from your web process? (Perhaps it's not your app if the alert is improperly configured.)
I had a similar situation, and I created a system monitor for my app admins based on this project.
I am writing a client-server launcher application.
An administrator will, from the server side, select executable files ('.exes') from a list and add these to a short-list of the apps that standard users can run on the client.
To compile this list, my client app will recursively search through all the folders on the system for exes and send this list over to the server via WCF.
To save search time, and keep the list short, I would like to avoid searching through folders that are not LIKELY to contain '.exes' that human users are intended to run directly.
Examples (I think):
%windir%\WinSxS - Windows Side-by-Side - used to store versions of Windows components that are built to reduce configuration problems with Dynamic Link Libraries.
%windir%\installer - used to store installation information for installed programs
C:\MSOCache - MS Office local install source
Most hidden folders
What other folders should I avoid searching through and what are they likely to contain?
I am interested in WinXP/WinVista/Win7.
EDIT:
Search time is not the most important factor.
It is very important not to exclude exes that the user may need to run AND to exclude exes like:
c:\Windows\winsxs\x86_microsoft-windows-x..rtificateenrollment_31bf3856ad364e35_6.1.7600.20520_none_f43289dd08ebec20\CertEnrollCtrl.exe
that were never meant to be directly launched by the user.
Since any heuristic approach dealing in "likely" will need the ability to add further items to catch cases deemed "unlikely" but which were actually important, you can't be "wrong" as such, just not "perfect".
With this in mind, I would take the opposite approach, and concentrate on those that are likely to have executables.
Recurse through the directories contained in the directory %ProgramFiles% points to.
Look in all other directories mentioned by the %Path% system variable (semicolon-delimited), but do not recurse into them.
Look for shortcuts on desktops, start menus and quicklaunch folders.
As a rule, one would expect the first to find executables used by people selecting them from the start menu and icons, and the second to find executables used from the command line and by the system. Between the two you'll find the large majority of executables on a system with relatively few wasted directory examinations.
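A rough C# sketch of the first two steps (shortcut scanning is left out, and the directory-access handling is deliberately naive; Directory.GetFiles with AllDirectories gives up on the first inaccessible folder, so a production version would walk the tree itself):

    using System;
    using System.Collections.Generic;
    using System.IO;
    using System.Linq;

    static class ExeFinder
    {
        public static IEnumerable<string> FindLikelyUserExecutables()
        {
            var results = new List<string>();

            // 1. Recurse through Program Files (and the x86 variant on 64-bit Windows).
            foreach (var root in new[]
            {
                Environment.GetFolderPath(Environment.SpecialFolder.ProgramFiles),
                Environment.GetFolderPath(Environment.SpecialFolder.ProgramFilesX86)
            })
            {
                if (!string.IsNullOrEmpty(root) && Directory.Exists(root))
                    results.AddRange(SafeGetFiles(root, true));
            }

            // 2. Look at each %Path% directory, but do not recurse into it.
            string pathVar = Environment.GetEnvironmentVariable("Path") ?? "";
            foreach (string dir in pathVar.Split(';').Where(Directory.Exists))
                results.AddRange(SafeGetFiles(dir, false));

            return results.Distinct(StringComparer.OrdinalIgnoreCase);
        }

        private static IEnumerable<string> SafeGetFiles(string dir, bool recurse)
        {
            try
            {
                return Directory.GetFiles(dir, "*.exe",
                    recurse ? SearchOption.AllDirectories : SearchOption.TopDirectoryOnly);
            }
            catch (UnauthorizedAccessException) { return Enumerable.Empty<string>(); }
            catch (DirectoryNotFoundException) { return Enumerable.Empty<string>(); }
        }
    }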
How often are you going to run this program? It doesn't seem like it would take very long at all to just scan the whole drive for all the executables. I have a one-gigabyte drive on my development machine and "dir /b/s *.exe" only takes a minute or two. And it's very likely that my development machine has a whole lot more files on it than the typical user's computer.
Come to think of it, your client program could execute that command, capture the output, and send it to the server. Perhaps after a little pre-processing.
My point is that it shouldn't matter if the process were to take a full five minutes, if it's done only once (or very rarely) on each machine. The benefit is that you won't miss any executables that way.
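For example, the capture-the-output idea could look something like this on the client (a sketch; a real version would add a timeout and error handling before shipping the list to the server):

    using System.Diagnostics;

    static class ExeLister
    {
        public static string ListAllExecutables(string root)
        {
            var psi = new ProcessStartInfo("cmd.exe", "/c dir /b /s \"" + root + "*.exe\"")
            {
                RedirectStandardOutput = true,
                UseShellExecute = false,
                CreateNoWindow = true
            };

            using (var process = Process.Start(psi))
            {
                // Read everything the command prints, then wait for it to finish.
                string output = process.StandardOutput.ReadToEnd();
                process.WaitForExit();
                return output;
            }
        }
    }

    // Usage: string allExes = ExeLister.ListAllExecutables(@"C:\");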