I have developed a fairly complex console app that does the following:
I install the console app as a Windows service.
Inside this console app I create a self-hosted Web API site.
In addition, in the background, the console app opens a socket connection to another computer.
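Roughly the shape of the setup, heavily simplified (the URL, port, host name and class names here are placeholders, and it assumes the OWIN self-host packages; it is not the real code):

using System;
using System.Net.Sockets;
using System.Threading.Tasks;
using System.Web.Http;
using Microsoft.Owin.Hosting;
using Owin;

class Startup
{
    public void Configuration(IAppBuilder app)
    {
        // Self-hosted Web API configuration.
        var config = new HttpConfiguration();
        config.MapHttpAttributeRoutes();
        app.UseWebApi(config);
    }
}

class Program
{
    static void Main()
    {
        // Start the self-hosted Web API site.
        using (WebApp.Start<Startup>("http://localhost:9000/"))
        {
            // Background socket connection to the other computer.
            Task.Run(() =>
            {
                using (var client = new TcpClient("other-computer", 5000))
                using (var stream = client.GetStream())
                {
                    var buffer = new byte[4096];
                    int read;
                    // Blocking read loop; a loop that polls without
                    // blocking or waiting would burn CPU.
                    while ((read = stream.Read(buffer, 0, buffer.Length)) > 0)
                    {
                        // handle the received bytes
                    }
                }
            });

            Console.ReadLine();
        }
    }
}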
When I run this app in Visual Studio (not as a Windows service), everything works fine and it uses less than 2% CPU.
When I install it on the server, it uses at least 20% CPU and sometimes even 30%.
I can't work out where the problem is.
Can anyone suggest how I can check what exactly is "eating" so much CPU?
I have a C# web app developed under VS2015 / IIS Express and a console app that runs on the same box. I set up a named mutex to mitigate file access contention, and everything worked great.
The console app created the mutex "IPC" and the web app opened it as an existing mutex.
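Roughly, the arrangement looks like this (a simplified sketch; only the "IPC" name comes from the real code):

using System.Threading;

class ConsoleAppSide
{
    // Console app: creates the named mutex.
    // Note: without a "Global\" prefix, a named mutex belongs to the
    // session namespace of the process that created it.
    static readonly Mutex FileLock = new Mutex(false, "IPC");
}

class WebAppSide
{
    // Web app: opens the mutex the console app created.
    static Mutex OpenSharedMutex()
    {
        return Mutex.OpenExisting("IPC");
    }
}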
Once I deployed the web app and console app to the production server (2008 R2 / IIS), the two apps can't see each other's mutexes and the whole scheme comes crashing down.
Is there a way to get around this?
I have just finished a project: in short, an Instagram robot for following and other tasks, written in C# with Selenium. To make it more practical and to prevent my code from being cracked or stolen, I would like to run it on a virtual server, because it needs a single machine running continuously.
As I don't have any experience working with a VPS, I need some hints. First, is my approach correct? Can I run my application on a virtual server just like on a personal computer?
Moreover, can I use a VPS as a host? For example, some parts of my application run on the VPS and some parts run on another host; could they be connected together? Something like a panel (running on the host) where users can change settings on their robots (running on the VPS).
Thanks in advance for any tips, articles or advice!
All you need to do is connect to the VPS using the Windows Remote Desktop application.
From there it's basically like a personal computer: you can run your robot, close the Remote Desktop application, and it keeps running.
I have an IIS server, version 8.5, with a web site and a number of web services hosted on it. A number of Windows services and desktop apps work with this IIS instance. Everything is fine for a while, but some time later IIS begins to use 100% of the CPU. I could assume that my code is the problem, but first I take the following steps:
I switch off all Windows services and desktop apps.
I kill the w3wp process from the process list.
I restart the app pool, IIS and the site several times.
But after I start IIS, the pool and the site again, with nothing else running (nothing is using IIS), I can see the IIS worker process using about 20% of the CPU, and the situation above repeats again after some time. To me this suggests the problem can't be in my code.
What could cause IIS to be under such high load when it has only just started, and why does it climb to 100% CPU?
It happens; we've all struggled with high CPU in a worker process before. In almost all cases it is the code.
If you're doing your own threading, that's probably your answer right there.
But here's what you need to do.
Right-click the process consuming the CPU and click "Dump Process"; this will create a debug file.
Then use the Debug Diagnostic Tool from Microsoft to open the file; it has a wealth of information in it. That's your starting point, unless you're willing to share the code.
On other Windows operating systems, I could create a Windows service that listens over TCP/IP for any communication (from a PBX, for example) through sockets.
But how would I handle this in Windows 8? Applications can't run all the time there, and when Windows 8 Metro starts up, applications for the normal desktop aren't running.
I need this service to start in every situation, as soon as the computer starts up.
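For illustration, the kind of thing I mean is a plain Windows service wrapping a TCP listener - a minimal sketch only, with a made-up service name and port:

using System.Net;
using System.Net.Sockets;
using System.ServiceProcess;
using System.Threading.Tasks;

public class PbxListenerService : ServiceBase
{
    private TcpListener _listener;

    protected override void OnStart(string[] args)
    {
        // Listen for incoming connections (port number is a placeholder).
        _listener = new TcpListener(IPAddress.Any, 5060);
        _listener.Start();
        Task.Run(() => AcceptLoop());
    }

    private void AcceptLoop()
    {
        try
        {
            while (true)
            {
                using (var client = _listener.AcceptTcpClient())
                {
                    // handle the incoming traffic here
                }
            }
        }
        catch (SocketException)
        {
            // The listener was stopped in OnStop(); exit the loop.
        }
    }

    protected override void OnStop()
    {
        _listener.Stop();
    }
}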
I also need http://pcapdotnet.codeplex.com, an open source project, to analyze the network.
What's the best future-proof practice in an enterprise environment?
I don't have any Windows 8 experience yet.
Try this: make your Windows service depend on something like the Computer Browser service, or perhaps the Workstation service; the network connection should be up by then.
You have to code this dependency in yourself when the service is installed.
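A rough sketch of how that dependency can be declared in a standard ServiceInstaller class (the service name and account here are assumptions; "LanmanWorkstation" is the Workstation service and "Browser" is the Computer Browser service):

using System.ComponentModel;
using System.Configuration.Install;
using System.ServiceProcess;

[RunInstaller(true)]
public class MyServiceInstaller : Installer
{
    public MyServiceInstaller()
    {
        var processInstaller = new ServiceProcessInstaller
        {
            Account = ServiceAccount.LocalSystem
        };

        var serviceInstaller = new ServiceInstaller
        {
            ServiceName = "MyTcpListenerService",
            StartType = ServiceStartMode.Automatic,
            // Windows will only start this service once these are running.
            ServicesDependedOn = new[] { "LanmanWorkstation", "Browser" }
        };

        Installers.Add(processInstaller);
        Installers.Add(serviceInstaller);
    }
}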
I have a C# library that does some file processing. I created console and desktop applications that use the library and process a 256 MB file in about 1 minute. I then created a WCF service, hosted in a Windows service, which uses the same file-processing library, yet it takes 10x longer to process the same 256 MB file when called from a website. The Windows service is running under a domain account with administrator privileges.
The overhead of calling the WCF service is very small, yet the LoadFile method takes much, much longer. I tried increasing the process priority during startup via
Process.GetCurrentProcess().PriorityClass = ProcessPriorityClass.High;
to no avail. I've run this service on a Win7 64-bit desktop (6 GB), a 2003 XP 32-bit server (4 GB) and a 2008 R2 32-bit server (4 GB), all with similar results. The console and desktop apps each process the file in about 1 minute on the systems above. The process does not appear to be memory-constrained or swapping.
Are Windows services somehow constrained as processes? Would I get better results running the WCF service under IIS?
EDIT: I tried calling the library directly from the website, and that too takes 10x longer than the console or desktop application.
UPDATE: It turns out it was Log4PostSharp. The console and desktop apps didn't have any trace of log4net in their configuration files, yet the website and Windows service did. There was a log4net TraceAppender silently eating up precious CPU cycles.
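For context, a log4net TraceAppender is wired up in configuration roughly like this (a generic sketch of the shape, not the actual file that was found):

<log4net>
  <appender name="TraceAppender" type="log4net.Appender.TraceAppender">
    <layout type="log4net.Layout.PatternLayout">
      <conversionPattern value="%date %-5level %logger - %message%newline" />
    </layout>
  </appender>
  <root>
    <level value="ALL" />
    <appender-ref ref="TraceAppender" />
  </root>
</log4net>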
I cannot think why the behaviour you describe is happening - it does seem very strange. Since you are processing a relatively large file in memory though, the garbage collector may be affecting it. You could try changing the mode the garbage collector runs in to see if it has any effect.
The garbage collector has three modes - workstation, server and concurrent. Each behaves differently and is optimised for different types of applications. Workstation mode is the default, and is what all processes use unless configured otherwise. More info about the modes can be found here.
Try explicitly setting the garbage collector to use server mode (it will only have an effect on a multi-processor machine though). To do this, put the following in your app.config file:
<configuration>
  <runtime>
    <gcServer enabled="true" />
  </runtime>
</configuration>