I am getting an error on this command:
Dictionary<UInt64, int> myIntDict = new Dictionary<UInt64, int>(89478458);
The error is:
System.OutOfMemoryException was unhandled HResult=-2147024882
Message=Array dimensions exceeded supported range.
Source=mscorlib
StackTrace:
at System.Collections.Generic.Dictionary`2.Initialize(Int32 capacity)
at System.Collections.Generic.Dictionary`2..ctor(Int32 capacity, IEqualityComparer`1 comparer)
With capacity 89478457 there is no error. Here is the source of Initialize in Dictionary.cs:
private void Initialize(int capacity)
{
    int size = HashHelpers.GetPrime(capacity);
    ...
    entries = new Entry[size];
    ...
}
When I reproduce this, the error occurs on the array creation. Entry is a struct with a size of 24 bytes in this case. If we take the max Int32 value (0x80000000 - 1) and divide it by 24, we get 89478485, and this number lies between the primes 89478457 and 89478503.
Does this mean that an array of structs cannot be larger than maxInt32 / sizeOfThisStruct elements?
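A quick back-of-the-envelope check of that arithmetic (just a sketch; the 24-byte Entry size is what I observed in the debugger, not a documented constant):

using System;

const long entrySize = 24;                     // observed size of Dictionary<ulong, int>.Entry
const long maxObjectBytes = int.MaxValue;      // ~2 GB, the default object size limit
Console.WriteLine(maxObjectBytes / entrySize); // 89478485, which lies between the primes 89478457 and 89478503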
EDIT:
Yes. I actually go over 2 GB. This happens when the dictionary creates the internal array of struct Entry, in which the (key, value) pairs are stored. In my case sizeof(Entry) is 24 bytes, and as a value type it is allocated inline.
The solution is to use the gcAllowVeryLargeObjects flag (thank you Evk). In .NET Core the flag is the environment variable COMPlus_gcAllowVeryLargeObjects (thank you svick).
And yes, Paparazzi is right. I have to think about how not to waste memory.
Thank you all.
There is a known limitation of the .NET runtime: the maximum object size allowed on the heap is 2 GB, even on the 64-bit version of the runtime. However, starting with .NET 4.5 there is a configuration option that allows you to relax this limit (still only on the 64-bit runtime) and create larger arrays. An example configuration to enable that is:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
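With that enabled in a 64-bit process, a quick smoke test (hypothetical; just re-running the allocation from the question) would be:

using System;
using System.Collections.Generic;

// With gcAllowVeryLargeObjects enabled and a 64-bit process, the internal
// Entry[] array is allowed to exceed 2 GB, so this capacity no longer throws.
var myIntDict = new Dictionary<ulong, int>(89478458);
Console.WriteLine("Allocated without OutOfMemoryException.");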
On the surface a Dictionary does not make sense here.
You can only have int unique values.
Do you really have that many duplicates?
UInt32 goes up to 4,294,967,295.
Why are you wasting 4 bytes?
89,478,458 rows
Currently a row is 12 bytes.
You hit 1 GB at about 83,333,333 rows.
Since an object needs contiguous memory, 1 GB is more of a practical limit.
If the value is really a struct of 24 bytes,
then 1 GB is about 31,250,000 rows.
That is just a really big collection.
You can split it up into more than one collection, as sketched below.
Or use a class, as then it is just a reference, which I think is 4 bytes.
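A minimal sketch of the splitting idea (the partition count and key scheme here are assumptions, not from this answer):

using System.Collections.Generic;

// Splits one huge Dictionary<ulong, int> into several smaller ones so that
// no single internal array has to approach the 2 GB object limit.
class PartitionedDictionary
{
    private const int PartitionCount = 16;               // assumed; tune to your data size
    private readonly Dictionary<ulong, int>[] parts;

    public PartitionedDictionary(int capacityPerPart)
    {
        parts = new Dictionary<ulong, int>[PartitionCount];
        for (int i = 0; i < PartitionCount; i++)
            parts[i] = new Dictionary<ulong, int>(capacityPerPart);
    }

    private Dictionary<ulong, int> PartFor(ulong key)
    {
        return parts[(int)(key % (ulong)PartitionCount)];
    }

    public int this[ulong key]
    {
        get { return PartFor(key)[key]; }
        set { PartFor(key)[key] = value; }
    }

    public bool TryGetValue(ulong key, out int value)
    {
        return PartFor(key).TryGetValue(key, out value);
    }
}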
This question already has answers here:
Can't create huge arrays
(2 answers)
What is the maximum length of an array in .NET on 64-bit Windows
(3 answers)
Closed 1 year ago.
Even though this post says it should work, if you create an int array of size Int32.MaxValue, it throws an OutOfMemoryException: Array dimensions exceeded supported range.
From my testing, it seems like the maximum size that an array can be initialized to is Int32.MaxValue - 1048576 (2,146,435,071). 1048576 is 2^20. So only this works:
var maxSizeOfIntArray = Int32.MaxValue - 1048576;
var array = new int[maxSizeOfIntArray];
Does anyone know why? Is there a way to create a larger integer array?
PS: I need to use arrays instead of lists because of a Math.Net library that only returns arrays for sets of random numbers from a cryptographically secure pseudo-random number generator.
Yes, I have looked at the other questions linked, but they are not correct, as those questions say the largest size is Int32.MaxValue, which is not the same as what my computer lets me do.
Yes, I do know the size of the array will be 8 GB. I need to generate a data set of billions of rows in order to test the randomness with the dieharder suite of tests.
I also tried the option of creating a BigArray<T> but that doesn't seem to be supported in C# anymore. I found one implementation of it, but that throws an IndexOutOfRangeException at index 524287, even though I set the array size to 3 million.
An Int32 is 32 bits, or 4 bytes. The max value of an Int32 is 2,147,483,647. So if you could create an array of 2,147,483,647 elements, where each element is 4 bytes, you would need a contiguous piece of memory 8 GB in size. That is enormous, and even if your machine had 128 GB of RAM (and you were running in a 64-bit process), it would be outside of realistic proportions. If you really need to use that much memory (and your system has it), I would recommend going to native code (i.e., C++).
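A quick check of that arithmetic:

using System;

// Memory needed for an int[] with Int32.MaxValue elements:
long bytesNeeded = (long)int.MaxValue * sizeof(int);
Console.WriteLine(bytesNeeded);                          // 8589934588
Console.WriteLine(bytesNeeded / (1024.0 * 1024 * 1024)); // ~8 GB of contiguous memory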
I'm looking for some documentation on the differences between an array's .Length and .LongLength properties.
Specifically, if the array's length is larger than Int32.MaxValue, will .Length throw an exception, return Int32.MaxValue, go negative, or return 0?
(to clear up "possible duplicate" concerns: I'm not asking about the maximum length of an array, or the maximum size of a .NET CLR object. Assume a 64 bit system and a CLR version which supports large objects)
It is not possible to create a one-dimensional array having more than 2,147,483,591 elements (for comparison, int.MaxValue is 2,147,483,647). An OutOfMemoryException is thrown if an attempt is made to create an array with more elements. This means the LongLength property is effectively useless and you can use the Length property instead.
I've tested it on the x64 platform using .NET 4.5. In order to create the array with 2,147,483,591 elements I've modified the configuration file and added:
<configuration>
  <runtime>
    <gcAllowVeryLargeObjects enabled="true" />
  </runtime>
</configuration>
Basically, I used this MSDN page to enable arrays that are greater than 2 GB in total size. The real limit for arrays:
The maximum index in any single dimension is 2,147,483,591
(0x7FFFFFC7) for byte arrays and arrays of single-byte structures, and
2,146,435,071 (0X7FEFFFFF) for other types.
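A small sketch of what this looks like in practice (assumes the configuration above and a 64-bit process):

using System;

// 2,147,483,591 is the largest allowed single dimension for byte arrays.
var huge = new byte[2147483591];
Console.WriteLine(huge.Length);       // 2147483591 - still fits in an Int32
Console.WriteLine(huge.LongLength);   // the same value, just returned as a long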
I have the following code below:
List<long> numbers = new List<long>();
for (long i = 1; i <= 300000000; i++)
{
    numbers.Add(i);
}
What I wanted to do is populate the list with 1 to 300 million. But when it hits 67,108,865, it throws an exception on line 4: Exception of type 'System.OutOfMemoryException' was thrown.
I tried using ulong but still no luck.
I believe the maximum range for long data type is 9,223,372,036,854,775,807 but why am I having an error here?
Thanks in advance!
EDIT
Thanks for all the answers. They helped me realize that my design was not good. I ended up changing my code design.
Well, it is not that your numbers are too large, but your list is...
Let's calculate its size:
300,000,000 * 64 bits (size of a long) = 19,200,000,000 bits
19,200,000,000 / 8 (bits per byte) = 2,400,000,000 bytes
2,400,000,000 / 2^10 = 2,343,750 KB
2,343,750 / 2^10 = ~2,288 MB
2,288 / 2^10 = ~2.24 GB
You wanted a list of about 2.24 GB.
The current CLR limitation is 2 GB (see this or this SO thread)
If you need a list with 300,000,000 elements, split it into two lists (either in place, or using a wrapping object that will handle managing the lists, as sketched below).
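A minimal sketch of such a wrapping object (hypothetical names; the chunk size is chosen so each inner list stays well under 2 GB):

using System.Collections.Generic;

// Presents one logical list of longs, but stores the data in several smaller
// List<long> instances so that no single backing array approaches 2 GB.
class ChunkedLongList
{
    private const int ChunkSize = 50000000;   // assumed; ~400 MB of longs per chunk
    private readonly List<List<long>> chunks = new List<List<long>>();

    public long Count { get; private set; }

    public void Add(long value)
    {
        if (chunks.Count == 0 || chunks[chunks.Count - 1].Count == ChunkSize)
            chunks.Add(new List<long>());
        chunks[chunks.Count - 1].Add(value);
        Count++;
    }

    public long this[long index]
    {
        get { return chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)]; }
        set { chunks[(int)(index / ChunkSize)][(int)(index % ChunkSize)] = value; }
    }
}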
First, note that System.OutOfMemoryException is not thrown when the limit of a variable's type is reached.
It is thrown because there is not enough memory available to continue the execution of the program.
Sadly, you cannot configure this; the .NET runtime makes all the decisions about heap size and memory.
One option is to switch to a 64-bit machine.
For info, on a 32-bit machine you can increase the available memory by using the /3GB boot switch option in Boot.ini.
EDIT: While searching I found this in the MSDN documentation, under the Remarks section:
By default, the maximum size of an Array is 2 gigabytes (GB). In a 64-bit environment, you can avoid the size restriction by setting the enabled attribute of the gcAllowVeryLargeObjects configuration element to true in the run-time environment. However, the array will still be limited to a total of 4 billion elements, and to a maximum index of 0X7FEFFFFF in any given dimension (0X7FFFFFC7 for byte arrays and arrays of single-byte structures).
A List<long> is backed by a long[]. You will fail as soon as the backing array cannot be allocated; during reallocation, there has to be enough total memory for both the old and the new arrays.
But if you want a collection with more than 2^31 elements, you have to write your own implementation using multiple arrays or Lists and managing them yourself.
Your numbers are not too large; your list is too long. It is using all the available memory.
Reduce 300,000,000 to 3,000,000, and it will (probably) work fine.
Ideally it should hold the number of elements you tell it to.
But as per the current CLR implementation, each object can be at most 2 GB in size. Since you are storing long values (8 bytes each), while populating the List it eventually tries to exceed the 2 GB limit. That's why you are getting a System.OutOfMemoryException.
Why does the following code throw the exception "Arithmetic operation resulted in an overflow."?
UInt64[] arr=new UInt64[UInt64.MaxValue];
I guess it is because a total of 8 * UInt64.MaxValue bytes is requested to be allocated, and this multiplication obviously overflows a 64-bit register.
Because indexers only take Int32 values. You can do
UInt64[] arr=new UInt64[Int32.MaxValue];
But that's the limit.
EDIT: Technically you can index an array with value types whose range theoretically goes above Int32.MaxValue (you can index an array with a long or a uint, for example); however, you will run into that runtime error when the value exceeds Int32.MaxValue.
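A small sketch of what that edit means (array indices written as long or uint still go through the runtime's Int32-based checks):

var arr = new byte[100];
long i = 5;      // a long index compiles; it is range-checked at run time
arr[i] = 42;
uint j = 6;      // a uint index works the same way
arr[j] = 43;
// An index outside the array (or above Int32.MaxValue) fails at run time, not at compile time.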
Because
a) all objects are limited to 2 GB in .NET, and
b) you don't have the memory such an array would need (8 bytes * UInt64.MaxValue is roughly 147 exabytes).
According to Microsoft's documentation, with the .NET Framework 4.5 these limitations apply:
The maximum number of elements in an array is UInt32.MaxValue.
The maximum index in any single dimension is 2,147,483,591 (0x7FFFFFC7) for byte arrays and arrays of single-byte structures, and 2,146,435,071 (0X7FEFFFFF) for other types.
The maximum size for strings and other non-array objects is unchanged
I would like to estimate the amount of memory used by each session on my server in my ASP.NET web application. A few key questions:
How much memory is allocated just to have each Session instance?
Is the memory usage of each variable equal to its given address space (e.g. 32 bits for an Int32)?
What about variables with variable address space (e.g. String, Array[]s)?
What about custom object instances (e.g. MyCustomObject which holds various other things)?
Is anything added for each variable (e.g. the address of an Int32 variable to tie it to the session instance), adding per-variable overhead?
I would appreciate some help in figuring out how I can predict exactly how much memory each session will consume. Thank you!
The HttpSessionStateContainer class has ten local variables, so that is roughly 40 bytes, plus 8 bytes for the object overhead. It has a session id string and an items collection, so that is something like 50 more bytes when the items collection is empty. It has some more references, but I believe those are references to objects shared by all session objects. So, all in all that would make about 100 bytes per session object.
If you put a value type like Int32 in the items collection of a session, it has to be boxed. With the object overhead of 8 bytes it comes to 12 bytes, but due to limitations in the memory manager it can't allocate less than 16 bytes for the object. With the four bytes for the reference to the object, an Int32 needs 20 bytes.
If you put a reference type in the items collection, you only store the reference, so that is just four bytes. If it's a literal string, it's already created so it won't use any more memory. A created string will use (8 + 8 + 2 * Length) bytes.
An array of value types will use (Length * sizeof(type)) plus a few more bytes. An array of reference types will use (Length * 4) plus a few more bytes for the references, and each object is allocated separately.
A custom object uses roughly the sum of the size of its members, plus some extra padding in some cases, plus the 8 bytes of object overhead. An object containing an Int32 and a Boolean (= 5 bytes) will be padded to 8 bytes, plus the 8 bytes overhead.
So, if you put a string with 20 characters and three integers in a session object, that will use about (100 + (8 + 8 + 20 * 2) + 3 * 20) = 216 bytes. (However, the session items collection will probably allocate a capacity of 16 items, so that's 64 bytes of which you are using 16, so the size would be 264 bytes.)
(All the sizes are on a 32 bit system. On a 64 bit system each reference is 8 bytes instead of 4 bytes.)
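These figures can be sanity-checked empirically; a rough sketch (a hypothetical test, run in an otherwise quiet process so unrelated allocations don't skew the numbers):

using System;

// Approximate allocation cost of boxing Int32 values, as the session items collection would.
// The result also includes the object[] of references holding them.
long before = GC.GetTotalMemory(true);   // force a full collection first
var items = new object[100000];
for (int i = 0; i < items.Length; i++)
    items[i] = i;                        // boxes each Int32
long after = GC.GetTotalMemory(true);
Console.WriteLine((after - before) / (double)items.Length + " bytes per item (approx.)");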
.NET Memory Profiler is your friend:
http://memprofiler.com/
You can download the trial version for free and run it. Although these things sometimes get complicated to install and run, I found it surprisingly simple to connect to a running web server and inspect all the objects it was holding in memory.
You can probably get some of these figures using Performance Counters and Custom Performance Counters. I have never tested these with ASP.NET, but otherwise they're quite nice for measuring performance.
This old, old article from Microsoft containing performance tips for ASP (not ASP.NET) states that each session has an overhead of about 10 kilobytes. I have no idea whether this also applies to ASP.NET, but it sure is a lot more than the 100 bytes of overhead that Guffa mentions.
Planning a large-scale application, there are some other things you probably need to consider besides rough memory usage.
It depends on the session state provider you choose, and the default in-process sessions are probably not what you will end up using at all.
In the case of out-of-process session storage (which may be preferred for scalable applications) the picture will be completely different and depends on how session objects are serialized and stored.
With SQL session storage, there will not be linear RAM consumption.
I would recommend integration testing with an out-of-process flavour of session state provider from the very beginning for a large-scale application.