Extremely high CPU load in render loop - C#

I am still working on my video game and my own game engine, and I'm making good progress. This weekend, I wanted to take some time to improve the performance of my game and check for memory leaks and so on. While most things look fine, I have an incredibly high CPU load of around 45% (on an Intel i5 with four cores). At first I thought I had some very bad design in one of my modules, but even after removing all parts from the render process I still had a CPU load of around 40%!
This was my render loop after removing all my modules' render calls:
public void Run()
{
    _logger.BeginFunction(this.ToString(), "Run");
    RenderLoop.Run(Form, () =>
    {
        _deviceContext.ClearDepthStencilView(_depthView, SharpDX.Direct3D11.DepthStencilClearFlags.Depth, 1.0f, 0);
        _deviceContext.ClearRenderTargetView(_renderTargetView, Color.Black);
        _swapChain.Present(0, SharpDX.DXGI.PresentFlags.None);
    });
    OnApplicationClosing();
    _logger.EndFunction();
}
So, as you can see, almost nothing happens and I still got that 40% CPU load. I checked whether anything was running in the background by printing the current stack trace every 5 seconds. Nothing else was running in this process after I disabled all my game engine's modules.
Then I remembered I had a similar issue many years ago during my studies, when a calculation thread ran in an endless loop. Since the loop was not slowed down, it executed as fast as possible, which caused a high CPU load. Remembering this, I added an ugly line at the end of the render loop above:
System.Threading.Thread.Sleep(10);
Et voilà, my CPU load went down to around 10%, even with all my game engine's modules activated.
However, this is not an acceptable final solution for my game engine. I took some time looking online for SharpDX render loop examples, but I was not able to figure out how other people handle this problem.
Is there any way to avoid the high CPU load without slowing the render loop down with a thread sleep?
I'd appreciate any hints and help from you! :-)

Simply replace:
_swapChain.Present(0, SharpDX.DXGI.PresentFlags.None);
with
_swapChain.Present(1, SharpDX.DXGI.PresentFlags.None);
As stated in the MSDN SwapChain.Present documentation:
0 - The presentation occurs immediately, there is no synchronization.
1 through 4 - Synchronize presentation after the nth vertical blank.
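Applied to the loop from the question, that is the only change needed: with a sync interval of 1, Present() blocks until the next vertical blank (roughly 16 ms on a 60 Hz display), so the loop no longer spins flat out. A minimal sketch, reusing the field names from the question:

RenderLoop.Run(Form, () =>
{
    _deviceContext.ClearDepthStencilView(_depthView, SharpDX.Direct3D11.DepthStencilClearFlags.Depth, 1.0f, 0);
    _deviceContext.ClearRenderTargetView(_renderTargetView, Color.Black);
    // Sync interval 1: wait for the next vertical blank instead of returning immediately,
    // so the CPU is idle for most of each frame instead of busy-looping.
    _swapChain.Present(1, SharpDX.DXGI.PresentFlags.None);
});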

Related

C# 80% CPU at once. Can't find where the bug is (on a game server with 250 online)

Hello, I made a game server in C# for a big online game.
The only problem is that I sometimes get 90% CPU usage out of nowhere.
It also gets stuck at that 90%: when the bug is there, it stays at 90% forever.
So my question is: how can I find the faulty code in a simple way, given that the server is huge?
Are there any good .NET profilers for this, or something like that?
"It also gets stuck at that 90%: when the bug is there, it stays at 90% forever"
That's awesome! You just got handed the holy grail right there, my friend!
Simply attach a debugger when it gets into that state and break. Chances are you'll break in the buggy code; if not, keep running and breaking. You should get there really quickly if it's using 90% of your CPU.
Alternatively, run it through VS's profiler and zoom the graph in on just the zone with extremely high sustained CPU usage; you'll get a list of the functions that use the most time (assuming it's a CPU-bound issue - if it's I/O, I don't think it will show).

Swapchain.Present() taking far too long, causing lag

I've recently been getting a bit of lag since I moved all of my C# SlimDX DX11 rendering code from my Form (yes, I'm a lazy developer) to bespoke classes. I whacked my program into EQATEC Profiler and got this as the major contributor to my lag:
Now it's clear here that whatever's in postRender() is really hogging the precious milliseconds. In fact, whatever crazy, convoluted code I have in there is effectively reducing my frame rate to ~15 FPS on its own.
So what's in postRender()? Just one line of code:
swapChain.Present(0, PresentFlags.None);
I just have no idea what's caused it to slow down so much; I've not made any changes to the swapchain code at all. All I've altered is the screen resolution (1680x1050), but that should be absolutely fine (for reference, this machine can run Crysis 2 at maximum settings at that resolution without breaking a sweat).
Does anybody have any idea what might cause a swapchain to take so long on presenting or where I should look next for problems?
EDIT:
Looking at the structure of my code, my RenderFrame() function is as follows:
preRender();
DeferredRender(preShader);
//Composite scene to output image
CompositeScene(compositeShader);
//Post Process
PostProcess(postProcShader);
//Depth of Field
DoF(dofShader);
//Present the swapchain
postRender();
The results of some of these functions are based on the functions before (for example, DeferredRender uses four render targets to capture Diffuse lighting, Normals, Positions and Color in a per-pixel manner. CompositeScene then puts them all together. This would require the GPU to have computed the previous step before it can continue. This whole process continues along, with DoF requiring the results of PostProcess, etc. Therefore the only shader that could possibly be holding Swapchain.Present() up must be the shader which runs in the function DoF, as all the other shaders cause the CPU to lock until they're finished. Correct?
There are a few reasons why you might find Present() taking up that much time in your frame. The Present call is the primary method of synchronization between the CPU and the GPU; if the CPU is generating frames much faster than the GPU can process them, they'll get queued up. Once the buffer gets full, Present() turns into a glorified Sleep() call while it waits for it to empty out.
Of course, it's pretty much impossible to say with the little information that you've given here. Just because a machine runs Crysis doesn't mean you can get away with throwing anything you want at the card. Double check that you're not expecting to render crazy amounts of geometry and that your shaders aren't abnormally long and complex.
Also take a look at your app using one of the available GPU profilers; PIX is good as a base point, while NVIDIA and AMD each have their own more specific offerings for their own products. Finally, make sure your drivers are updated. If there's a bug in the driver, any chance you have at reasoning about the issue goes out the window.
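If you want to confirm that Present() really is acting as a wait rather than doing real work of its own, one rough sketch (not from the original answer; swapChain and PresentFlags are the SlimDX names already used in the question) is to time the call each frame and log anything suspicious:

// Field of the rendering class: reused every frame, no per-frame allocation.
System.Diagnostics.Stopwatch presentTimer = new System.Diagnostics.Stopwatch();

void postRender()
{
    presentTimer.Restart();
    swapChain.Present(0, PresentFlags.None);   // the same call as before
    presentTimer.Stop();

    // If the CPU-side work is cheap but Present() regularly eats most of the
    // ~16 ms frame budget, the GPU (or a full present queue) is the bottleneck.
    if (presentTimer.Elapsed.TotalMilliseconds > 10.0)
        System.Diagnostics.Debug.WriteLine("Present() blocked for " + presentTimer.Elapsed.TotalMilliseconds + " ms");
}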

C# Loading screen threading loading and animation

I'm making a loading screen for a game in c#. Do I need to create a thread for drawing the spinning animation as well as a thread for loading the level?
I'm a bit confused as to how it works. I've spent quite a few hours messing with it to no avail. Any help would be appreciated.
Short of anything that XNA may provide for you, anytime you require doing multiple units of work at once, multiple threads are usually required - and almost certainly if you want to benefit from multiple CPUs. Depending upon exactly what you're looking to do, you're already in one thread (for your main method / program execution) - so you don't likely need to create 2 additional threads - but just one additional for either the loading of your level, or for the animation.
Alternatively, as was probably more common-place in older development when developers weren't concerned with multi-core CPUs, etc., you could use tricks such as doing both the level loading and the animation in the same thread - but at the expense of additional complexity for combining both concerns into the same unit of processing. (In every x # of lines of processing for loading the level, add code to update the loading animation.) However, given today's technology, you are almost certainly better off using multiple threads for this.
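As a rough sketch of that second thread (plain .NET threading rather than anything XNA-specific; LoadLevel() and DrawLoadingAnimation() are hypothetical placeholders for your own code):

// Field of the game/loading-screen class.
volatile bool levelLoaded = false;

void ShowLoadingScreen()
{
    // Do the slow level loading on a background thread...
    System.Threading.Thread loader = new System.Threading.Thread(() =>
    {
        LoadLevel();          // hypothetical long-running load
        levelLoaded = true;   // signal the main thread that we are done
    });
    loader.IsBackground = true;
    loader.Start();

    // ...while the main thread keeps drawing the spinner each frame.
    while (!levelLoaded)
    {
        DrawLoadingAnimation();   // hypothetical draw call for the spinner
    }
}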
Loading takes time because it involves long calculations, and long calculations are usually done in a different thread so that the program won't freeze.
So the answer is yes.

Game Jitters on Xbox 360

Ok, I've got my game nearing completion. Everything is working perfectly in it and it runs fairly well on my computer (using around 99 MB of RAM). But when I run it on my Xbox, I tend to get occasional jitters of the main player character. I do have explosions and billboard effects inside my game; however, I'm doing all that rendering on the Xbox GPU (that was originally causing the jitters when explosions occurred, but not anymore).
The jitters are random as well, not when I'm spawning large amounts of units or performing lots of actions. I'm at a loss as to why this is happening. Any ideas?
P.S. The game does have multi-threading integrated into it: update on one thread, rendering on another thread.
It sounds like the garbage collector is causing your jitters. On Xbox it kicks in every time 1MB of data is allocated. The amount of time it takes depends on how many references are in use in your program.
Use the XNA Framework Remote Performance Monitor for Xbox 360 to tell if you have a garbage collection problem. Read How to tell if your garbage collection is too slow by Shawn Hargreaves.
If you do have a garbage collection problem, you'll need a profiler that can determine which objects are generating the garbage. The CLR Profiler for the .NET Framework 2.0 is one option. It's not supported in XNA 4.0 by default, but Dave on Crappy Coding has a workaround. For strategies to solve your garbage collection problem read Twin paths to garbage collection Nirvana by Shawn Hargreaves.
Update: The CLR Profiler for the .NET Framework 4.0 is now available.
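A common way to attack this in XNA code is simply to stop allocating in the per-frame path, so the 1 MB threshold is hit far less often. A simplified sketch (the Enemy type and pool size are made up for illustration):

// Allocate everything up front and reuse it; never call "new" inside Update/Draw.
Enemy[] enemyPool = new Enemy[64];   // hypothetical pooled type and size
int activeEnemies = 0;

void InitializePool()
{
    for (int i = 0; i < enemyPool.Length; i++)
        enemyPool[i] = new Enemy();   // one-time allocation at load time
}

Enemy SpawnEnemy()
{
    // Hand out an existing instance instead of allocating a new one each spawn,
    // so per-frame garbage stays near zero and collections become rare.
    return activeEnemies < enemyPool.Length ? enemyPool[activeEnemies++] : null;
}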
Sounds like your rendering thread is just sitting there waiting for the update thread to finish what it's doing, which causes the "jitter". Perhaps put in some code that logs how long a thread has to wait before being allowed access to the other thread's data, to find out if this really is the issue.
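A crude way to do that logging (a sketch only; updateLock stands in for whatever object your two threads already synchronize on, and DrawScene() for your actual rendering):

// Field of the rendering class, reused every frame.
System.Diagnostics.Stopwatch waitTimer = new System.Diagnostics.Stopwatch();

void RenderFrame()
{
    waitTimer.Restart();
    lock (updateLock)   // hypothetical lock shared with the update thread
    {
        waitTimer.Stop();
        // A long wait here means the render thread was stalled by the update thread,
        // which would show up on screen as a jitter.
        if (waitTimer.ElapsedMilliseconds > 5)
            System.Diagnostics.Debug.WriteLine("Render thread waited " + waitTimer.ElapsedMilliseconds + " ms");

        DrawScene();   // hypothetical placeholder for the real rendering work
    }
}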

How does XNA timing work?

How does XNA maintain a consistent and precise 60 FPS frame rate? Additionally, how does it maintain such precise timing without pegging the CPU at 100%?
While luke's code (below) is theoretically right, the methods and properties used are not the best choices:
As the precision of DateTime.Now is only about 30 ms (see "C# DateTime.Now precision", give or take 20 ms), its use for high-performance timing is not advisable (60 FPS leaves only 16 ms per frame). System.Diagnostics.Stopwatch is the timer of choice for real-time .NET.
Thread.Sleep suffers from the same precision/resolution problem and is not guaranteed to sleep for the specified time only.
The current XNA FX seems to hook into the Windows message loop, execute its internal pre-update each step, and call Game.Update only if the elapsed time since the last update matches the specified frame rate (e.g. every 16 ms for the default settings). If you really want to know how the XNA FX does the job, Reflector is your friend :)
Random tidbit: Back in the XNA GameStudio 1.0 Alpha/Beta time frame there were quite a few blog posts about the “perfect WinForms game loop”, albeit I fail to find them now…
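For comparison, here is a minimal Stopwatch-based fixed-step loop (a sketch only, not the actual XNA implementation; UpdateAndDraw() is a placeholder for one game update plus render):

TimeSpan frameInterval = TimeSpan.FromSeconds(1.0 / 60.0);
Stopwatch timer = Stopwatch.StartNew();   // System.Diagnostics.Stopwatch
TimeSpan lastFrame = TimeSpan.Zero;

while (true)
{
    TimeSpan now = timer.Elapsed;
    if (now - lastFrame >= frameInterval)
    {
        lastFrame = now;
        UpdateAndDraw();   // placeholder for one game update + render
    }
    else
    {
        // Not due yet: give the time slice back so the CPU stays mostly idle.
        // Sleep has coarse resolution, which is why XNA does its own pacing.
        Thread.Sleep(1);
    }
}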
I don't know specifically how XNA does it, but when playing around with OpenGL a few years ago I accomplished the same thing using some very simple code.
At the core of it, I assume XNA has some sort of rendering loop; it may or may not be integrated with a standard event-processing loop, but for the sake of example let's assume it isn't. In that case you could write it something like this:
TimeSpan FrameInterval = TimeSpan.FromMilliseconds(1000.0 / 60.0);
DateTime PrevFrameTime = DateTime.MinValue;
while (true)
{
    DateTime CurrentFrameTime = DateTime.Now;
    TimeSpan diff = CurrentFrameTime - PrevFrameTime;
    if (diff >= FrameInterval)
    {
        // A full frame interval has passed: draw and remember when we drew.
        DrawScene();
        PrevFrameTime = CurrentFrameTime;
    }
    else
    {
        // Not time for the next frame yet; sleep off the remainder.
        Thread.Sleep(FrameInterval - diff);
    }
}
In reality you would probably use something like Environment.TickCount instead of DateTime (it would be more accurate), but I think this illustrates the point. This should only call DrawScene about 60 times a second, and the rest of the time the thread will be sleeping, so it incurs hardly any CPU time.
Games should run at 60 FPS, but that doesn't mean they will. That's actually an upper limit for a released game.
If you run a game in debug mode you could get much higher frames per second - for example, a blank starter template on my laptop in debug mode runs at well over 1,000 FPS.
That being said, the XNA framework in a released game will do its best to run at 60 FPS - but the code you include in your project can lower that performance. For example, if something constantly triggers the garbage collector you will normally see a dip in the game's FPS, as you will if you throw complex math into the Update or Draw methods - since they fire every frame, that would usually be a bit excessive. There are a number of things to keep in mind to keep your game as streamlined as possible.
If you are asking how the XNA framework makes that ceiling happen, I can't really explain that - but I can say that how you lay out your code and what you do in it can definitely impact this number negatively, and it doesn't always have to be CPU related. In the case of garbage collection it's just the cleaning up of RAM, which may not show a spike in CPU usage at all but can still impact your FPS, depending on the amount of garbage and the interval at which it has to run.
You can read all about how the XNA timer is implemented in "Game timing in XNA Game Studio", but basically it tries to wait 1/60 of a second before continuing the loop again. Also note that Update can be called multiple times before a render if XNA needs to "catch up".
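That "catch up" behaviour is essentially a fixed-timestep accumulator. A hedged sketch of the idea (not XNA's actual source; Update() and Draw() are placeholders for the Game methods):

TimeSpan targetElapsed = TimeSpan.FromTicks(TimeSpan.TicksPerSecond / 60);   // 1/60 s
Stopwatch clock = Stopwatch.StartNew();   // System.Diagnostics.Stopwatch
TimeSpan accumulator = TimeSpan.Zero;
TimeSpan last = TimeSpan.Zero;

while (true)
{
    TimeSpan now = clock.Elapsed;
    accumulator += now - last;
    last = now;

    // Run as many fixed-size updates as we owe; this is the "catch up":
    // if a frame took too long, Update runs several times before the next Draw.
    while (accumulator >= targetElapsed)
    {
        Update(targetElapsed);
        accumulator -= targetElapsed;
    }

    Draw();   // one draw per loop pass, however many updates just ran
}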
