When I run my application, I get this exception:
(screenshot of the exception: http://img21.imageshack.us/img21/5619/bugxt.jpg)
I understand that the program is out of memory. Are there any other possible meanings for that exception?
Note that I am calling DLL files (deployed from MATLAB).
Thank you all.
It's absolutely possible. Just use Process Explorer to look at your process's working set.
On 32-bit Windows systems, the maximum available memory for a .NET process is around 2 GB, but it can be less depending on your configuration. Here is the SO link on the subject.
Considering that you use MATLAB, and so probably perform massive or complex calculations, you probably create a lot of objects/values to pass to the DLL functions, which can be one possible source of the bottleneck. But this is only a guess; you need to measure your program to find the real problem.
Regards.
Note: check your old questions and accept the answer you prefer among the responses you got for each question; your acceptance rate is too low!
What I'm doing is passing an array from C# to a C++ DLL. Then I do some calculations in CUDA, and I need to copy the data into the mentioned array. After that I will use it in later steps of the C# project.
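For illustration, here is a minimal sketch of the kind of call I mean (the DLL name, entry point, and signature are placeholders, not my real code):

using System.Runtime.InteropServices;

static class NativeCompute
{
    // Hypothetical native export: the C++/CUDA side fills 'data' with its results.
    [DllImport("CudaKernels.dll", CallingConvention = CallingConvention.Cdecl)]
    private static extern void RunKernels(float[] data, int length);

    public static float[] Compute(int length)
    {
        var data = new float[length];
        RunKernels(data, length); // the runtime pins and marshals the array for the call
        return data;
    }
}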
The problem is that after I run some calculations with many kernels and then use cudaMemcpy() to copy data from device to host, the first cudaMemcpy() takes much more time than similar cudaMemcpy() operations later.
People said the cudaMemcpy() from device to host (not pinned memory) is synchronous. Is that true? Is there any workaround for this?
Many thanks in advance.
People said the cudaMemcpy() from device to host (not pinned memory) is synchronous. Is that true?
Yes, this is true. cudaMemcpy() blocks until the copy has finished, and because kernel launches are asynchronous, that first cudaMemcpy() also waits for all the preceding kernels to complete; their execution time gets billed to the copy, which is why it looks so much slower than the later ones.
Is there any workaround for this?
The one thing that could help you is using CUDA streams, so that you overlap kernel execution with the copying. Note that to actually overlap, the copy has to be issued with cudaMemcpyAsync() from page-locked (pinned) host memory.
EDIT:
If you cannot use streams, then there are no workarounds that I know of.
One thing you can consider, as I mentioned in the comment, is to build the whole application on the GPU so that you avoid the memory transfer completely (or perhaps copy only very few bytes back to the CPU). This really depends on the type of application; it may or may not be possible in your case.
I want something like a static class variable, except that when different applications load my assembly, I want them all to share the same variable.
I know I could write to disk or to a database, but this is for a process that's used with SQL queries, and that would probably slow it down too much. (I am actually going to test those options, but I'm asking this question in the meantime because I don't think they will prove to be an acceptable solution.)
I would prefer the solution that incurs the least deployment overhead, and I don't mind if the solution isn't easy to create, so long as it's easy to use once I'm done.
I'm aware that there are some persistent memory frameworks out there. I haven't checked any of them out yet, and maybe one of them would be perfect, so feel free to recommend one. I am also perfectly content to write something myself, particularly if doing so makes deployment easier.
Thanks in advance for any and all suggestions!
Edit: It looks like I was overlooking a really easy solution. My problem involved SQL Server providing only 8000 bytes of space to serialize data between calls to a SQL aggregate function I wrote. I had read an article on how to compress your data and get the most out of those 8000 bytes, and assumed there was nothing more I could do. As it turns out, I can set MaxByteSize = -1, instead of a value between 0 and 8000, to get up to 2 GB of space. I believe this was added in the 3.5 framework, because there are various articles out there discussing the 8000-byte limitation.
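For anyone who hits the same wall, here is a minimal sketch of the attribute change (the aggregate itself is a trivial placeholder; the point is MaxByteSize = -1, which applies to Format.UserDefined and its custom serialization):

using System;
using System.Data.SqlTypes;
using System.IO;
using Microsoft.SqlServer.Server;

[Serializable]
[SqlUserDefinedAggregate(Format.UserDefined, MaxByteSize = -1)] // -1 instead of 0-8000 allows up to 2 GB
public struct ConcatAggregate : IBinarySerialize
{
    private string accumulated;

    public void Init() { accumulated = string.Empty; }
    public void Accumulate(SqlString value) { if (!value.IsNull) accumulated += value.Value; }
    public void Merge(ConcatAggregate other) { accumulated += other.accumulated; }
    public SqlString Terminate() { return new SqlString(accumulated); }

    // Format.UserDefined means we serialize the state between calls ourselves.
    public void Read(BinaryReader r) { accumulated = r.ReadString(); }
    public void Write(BinaryWriter w) { w.Write(accumulated); }
}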
Thank you all for your answers though, as this is a problem I've wanted to solve for other reasons in the past, and now I know what to do if I ever need a really easy and fast way to communicate between apps.
You can't store this as in-memory data and have it shared between processes, since each process has its own isolated memory address space.
One option, however, would be to use the .NET Memory-mapped file support to "store" the shared data. This would allow you to write a file that contained the information in a place that every process could access.
Each process has its own address space; you cannot simply share a variable the way you intend.
You can use shared memory, though.
If you are on .NET 4, you can simply use Memory-Mapped Files, as sketched below.
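A minimal sketch of the idea (the map name "MySharedData" and the four-byte layout are my own illustrative choices):

using System;
using System.IO.MemoryMappedFiles;

class SharedCounter
{
    static void Main()
    {
        // Every process that opens the same map name sees the same bytes.
        using (var mmf = MemoryMappedFile.CreateOrOpen("MySharedData", 4))
        using (var view = mmf.CreateViewAccessor())
        {
            int value = view.ReadInt32(0);  // read the shared 32-bit value
            view.Write(0, value + 1);       // bump it for every process to see
            Console.WriteLine("Shared value is now {0}", value + 1);
        }
    }
}

Note that a non-persisted map like this disappears once the last process closes its handle; back it with a real file if the value has to outlive the processes, and guard concurrent access with a named mutex.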
If you want some sort of machine-wide count or locking, you can look into using named synchronization objects such as semaphores (http://msdn.microsoft.com/en-us/library/z6zx288a.aspx) or mutexes (http://msdn.microsoft.com/en-us/library/hw29w7t1.aspx). When a name is specified, such objects are machine-wide instead of process-wide.
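For instance, a minimal sketch of machine-wide locking with a named mutex (the name "Global\MyAppSharedLock" is illustrative):

using System;
using System.Threading;

class Program
{
    static void Main()
    {
        // The "Global\" prefix makes the mutex visible across all sessions on the machine.
        using (var mutex = new Mutex(false, @"Global\MyAppSharedLock"))
        {
            mutex.WaitOne();              // blocks until no other process holds the lock
            try
            {
                // ... touch the shared resource here ...
            }
            finally
            {
                mutex.ReleaseMutex();     // always release, or other waiters hang
            }
        }
    }
}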
Good afternoon,
I inherited some C# code from years ago. I have refactored it a bit to be asynchronous.
To evaluate the impact of my changes on CPU performance, I used Process Explorer to watch, roughly, what my app was doing.
To my surprise, it appears to be doing what Process Explorer reports as I/O. In general, this is related to disk I/O or network I/O.
Based on what I can see in the code, I can't find an explicit call to either of those two I/O sources.
My question is: what is the best way to identify which section of code is causing the I/O? We use dotTrace from JetBrains to profile our application, but, from what I can tell, it only covers CPU and memory performance.
Thanks in advance for any pointers.
Regards,
Eric.
Process Monitor may be your answer. Refer to the following StackOverflow question for more information.
How can I profile file I/O?
Building on that answer, you may be able to search your solution for the filename of any commonly read or written files found with Process Monitor.
The stackshot method, also called random pausing, will find it, if it takes significant time.
If the I/O code is managed, you can load the symbols for the .net framework and set breakpoints in crucial functions (e.g. FileStream constructors etc.)
It involves some guess work but can be informative if you succeed.
In addition to Process Monitor, I find the Resource Monitor on Windows 7 (also available on Vista, I think under 'Performance and Reliability') very useful for diagnosing I/O-related slowdowns. Switch to the disk view and sort by Read/Write or Total (Windows 7 only). Also keep an eye on the list of files that appear.
How exactly do RAM test applications work, and is it possible to write such an application in C#? (Example)
Most use low-level hardware access to write various bit patterns to memory, then read them back to ensure they are identical to the pattern written. If not, the RAM is probably faulty.
They are generally written in low-level languages (assembler) to access the RAM directly - this way, any caching (that could possibly affect the result of the test) is avoided.
It's certainly possible to write such an application in C# - but that would almost certainly prevent you from getting direct bit-level access to the memory, and hence could never be as thorough or reliable as low-level memory testers.
You basically write to the RAM, read it back, and compare with the expected result, along the lines of the sketch below. You might want to test various patterns to detect different errors (always-0, always-1), and run multiple iterations to detect spurious errors.
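Here is a naive managed sketch of that loop; note that it only exercises memory the CLR happens to allocate for the buffer, so it is illustrative rather than a real RAM tester:

using System;

class MemoryPatternTest
{
    static void Main()
    {
        const int size = 64 * 1024 * 1024;            // 64 MB test buffer
        var buffer = new byte[size];
        byte[] patterns = { 0x00, 0xFF, 0xAA, 0x55 }; // always-0, always-1, alternating bits

        foreach (byte pattern in patterns)
        {
            for (int i = 0; i < size; i++) buffer[i] = pattern;
            for (int i = 0; i < size; i++)
                if (buffer[i] != pattern)
                    Console.WriteLine("Mismatch at offset {0}: wrote 0x{1:X2}, read 0x{2:X2}",
                        i, pattern, buffer[i]);
        }
        Console.WriteLine("Test pass complete.");
    }
}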
You can do this in any language you like, as long as you have direct access to the memory you want to test. If you want to test physical RAM, you could use P/Invoke to reach out of the CLR.
However, this won't solve one specific problem if your computer is based on the Von Neumann architecture: the program that tests the memory is located in the very same memory it is testing. You would have to relocate the program to test all of it. The German magazine c't found a way around this issue for their Ramtest: they ran the test from video memory. In practice, this is impossible in C#.
As discovered by some Linux guru trying to write a memtest program in C, any such program must be compiled to run on either bare hardware or an MMU-less OS to be effective.
I don't think any compiler for C# can do that.
You probably can't do as good a job testing memory from a C# program in Windows as you could from a C or assembly-language program running with no OS, but you could still make something useful.
You're going to need to use the native Windows API (via DllImport and P/Invoke) to allocate some memory and lock it into RAM; a sketch follows below. Once you've done that, reading and writing patterns to the memory is pretty easy.
At the end of the test, you can tell the user how much of their memory you were able to test.
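Here is a rough sketch of that approach. The kernel32 signatures are standard Win32 exports, but the region size and pattern are illustrative, and VirtualLock is limited by the process working-set quota, which is why the sketch raises it first:

using System;
using System.Diagnostics;
using System.Runtime.InteropServices;

class LockedMemoryTest
{
    [DllImport("kernel32.dll", SetLastError = true)]
    static extern IntPtr VirtualAlloc(IntPtr lpAddress, UIntPtr dwSize, uint flAllocationType, uint flProtect);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool VirtualLock(IntPtr lpAddress, UIntPtr dwSize);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool VirtualFree(IntPtr lpAddress, UIntPtr dwSize, uint dwFreeType);

    [DllImport("kernel32.dll", SetLastError = true)]
    static extern bool SetProcessWorkingSetSize(IntPtr hProcess, IntPtr dwMin, IntPtr dwMax);

    const uint MEM_COMMIT = 0x1000, MEM_RELEASE = 0x8000, PAGE_READWRITE = 0x04;
    const int SIZE = 16 * 1024 * 1024; // 16 MB test region

    static void Main()
    {
        // Raise the working-set quota so VirtualLock can pin the whole region.
        SetProcessWorkingSetSize(Process.GetCurrentProcess().Handle,
            (IntPtr)(SIZE * 2), (IntPtr)(SIZE * 4));

        IntPtr mem = VirtualAlloc(IntPtr.Zero, (UIntPtr)(uint)SIZE, MEM_COMMIT, PAGE_READWRITE);
        if (mem == IntPtr.Zero || !VirtualLock(mem, (UIntPtr)(uint)SIZE))
            throw new InvalidOperationException("Alloc/lock failed, error " + Marshal.GetLastWin32Error());

        // Write a pattern through the raw pointer, then read it back and compare.
        for (int i = 0; i < SIZE; i++) Marshal.WriteByte(mem, i, 0xAA);
        for (int i = 0; i < SIZE; i++)
            if (Marshal.ReadByte(mem, i) != 0xAA)
                Console.WriteLine("Mismatch at offset {0}", i);

        Console.WriteLine("Tested {0} bytes.", SIZE);
        VirtualFree(mem, UIntPtr.Zero, MEM_RELEASE);
    }
}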
When I try to compile an assembly in VS 2008, I occasionally (usually after 2-3 hours of work on the project) get the following error:
Metadata file '[name].dll' could not be opened --
'Not enough storage is available to process this command.
Usually, to get rid of it, I need to restart Visual Studio.
The assembly I need to use in my project is quite big (> 70 MB), and probably this is the reason for the bug; I've never seen anything like this in my previous projects. OK, if this is the reason, my question is why this happens and what I can do to stop it.
I have enough free space on my drives and 2 GB of RAM (only ~1.2 GB is utilized when the exception happens).
I googled for answers to questions like this.
Suggestions usually relate to:
the number of user handles, which is limited in WinXP...
the physical limit of memory available per process
I don't think either could explain my case.
As for user handles and other GUI resources: I don't think this could be the problem. The big 70 MB assembly is actually GUI-less code that operates with sockets and implements parsers for proprietary protocols. In my current project I have only 3 GUI forms, with a total number of GUI controls < 100.
I suppose my case is closer to the fact that in Windows XP the process address space is limited to 2 GB (and, taking memory segmentation into account, it is possible that I don't have a free segment large enough for the allocation).
However, it is hard to believe that segmentation could get so bad after just 2-3 hours of working with the project in Visual Studio. Task Manager shows that VS consumes about 400-500 MB (OM + VM). During compilation, VS needs to load only metadata.
Well, there are a lot of classes and interfaces in that library, but I would still expect 1-2 MB to be more than enough to allocate the metadata the compiler uses to find all public classes and interfaces (though this is only my guess; I don't know what exactly happens inside the CLR when it loads assembly metadata).
In addition, I would say that the entire assembly is so big only because it is a C++/CLI library that has other unmanaged libraries statically linked into one DLL. I estimated (using Reflector) that the .NET (managed) code is approx. 5-10% of this assembly.
Any ideas how to determine the real reason for this bug? Are there any restrictions on, or recommendations regarding, .NET assembly size? (Yes, I know it's worth thinking about refactoring and splitting a big assembly into several smaller pieces, but it is a 3rd-party component and I can't rebuild it.)
The error is misleading. It really should say "A large enough contiguous space in virtual memory could not be found to perform the operation". Over time, allocations and deallocations of virtual memory space lead to it becoming fragmented. This can lead to situations where a large allocation cannot be satisfied despite there being plenty of total space available.
I think this is what your "segmentation" is referring to. Without knowing all the details of everything else that needs to load, and the other activity that occupies the 2-3 hour period, it's difficult to say whether this really is the cause. However, I would not put it in the unlikely category; in fact, it is the most likely cause.
In my case the following fix helped:
http://confluence.jetbrains.net/display/ReSharper/OutOfMemoryException+Fix
As Anthony pointed out, the error message is a bit misleading. The issue is less about how big your assembly is and more about how much contiguous memory is available.
The problem is likely not really the size of your assembly. It's much more likely that something inside Visual Studio is fragmenting memory to the point that a build cannot complete. The usual suspects for this type of problem are:
Too many projects in the solution.
Third-party add-ins.
If you have more than, say, 10 projects in the solution, try breaking up the solution and see if that helps.
If you have any third-party add-ins, try disabling them one at a time and seeing if the problem goes away.
I am getting this error on one of my machines and, surprisingly, the problem is not seen on other dev machines. Maybe something is wrong with the VS installation.
But I found an easier solution.
If I delete the solution's .suo file and re-open the solution, it starts working smoothly again.
Hope this will be useful for somebody in distress.
If you are just interested in making it work, then restart your computer and it will work like a charm. I had the same kind of error in my application, and after reading all of the answers here at Stack Overflow, I decided to restart my computer before trying any other modifications. It saved me a lot of time.
Another cause of this problem can be using too many typed datasets via the designer, or other types that can be instantiated via a designer, like lots of data-bound controls on lots of forms.
I imagine you're the sort of hardcore programmer, though, who wouldn't drag 'n' drop a DS! :D
In relation to your problem, Bogdan, have you tried to reproduce it without your C++ component loaded? If you can't, then maybe it's the component. How are you loading the component? Have you tried other techniques, like late binding, etc.? Any difference?
Additional:
Yes, you are right; the other culprit is lots of controls on forms. I once saw this same issue with a dev who had ported a very large VB6 app over to .NET. He literally had hundreds of forms. He would get periodic crashes of the IDE after a couple of hours. I'm pretty sure it was thread exhaustion. It might be worth setting up a vanilla box with no add-ins loaded just to rule add-ins out, but my guess is you are simply hitting the wall in terms of the combined limitations of VS and your box specs. Try running 64-bit Windows Vista and installing some extra RAM modules.
If memory usage and VM size are small for devenv,
explicitly kill ALL instances of devenv.exe.
I had 3 devenv.exe processes running, whereas only two instances of Visual Studio were open in front of me.
That was the solution in my case.
I know it has been a long time since this was commented on, but I ran into this exact issue today with a Telerik DLL in VS 2010. I had never seen this issue until today, when I was making some settings changes in IE.
There is a setting in Tools/Folder Options/View, in the Files and Folders section, called "Launch folder windows in a separate process".
I am not sure how much memory is used for each window with this setting, but until today I had never had it checked. After checking this option, for miscellaneous reasons, I started getting the "not enough storage is available to process this command" error. The Telerik DLL is an 18 MB DLL that we reference from the library folder in our project.
Unchecking this resolved the problem.
Just passing this along as another possible solution.
I also faced the same problem.
Make sure the Windows OS is 64-bit.
I switched from 32-bit Windows to 64-bit Windows, and the problem was solved.
I had this same issue, and in my case the exception name was very misleading. The actual problem was that the DLL couldn't be loaded at all, due to an invalid path. The exception I was getting was the same "Not enough storage is available to process this command" error.
I used the DllImport attribute in a C# ASP.NET application, with a declaration like the one below, which was causing the exception:
[DllImport(#"Calculation/lib/supplier/SupplierModule.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi, EntryPoint = "FunctionName")]
Below is the working code snippet:
[DllImport(#"Calculation\lib\supplier\SupplierModule.dll", CallingConvention = CallingConvention.StdCall, CharSet = CharSet.Ansi, EntryPoint = "FunctionName")]
The actual problem was using forward slashes in the path instead of backslashes. This cost me way too much time to figure out; I hope it will help others.