I really hope someone can help me.
I have a single byte that has to indicate the number of bytes in the byte[] that follows. Now my value is above 255. Is there a way to represent/enter a larger number?
A byte holds a value from 0 to 255. To represent 299, you either have to use 2 bytes, or use a scheme (which the receiver will have to use as well) where the value in the byte is interpreted as more than its nominal value in order to expand the possible range of values. For instance, the value could be the length / 2. This would allow lengths of 0 - 510, but would allow only even lengths (odd length arrays would need a pad byte).
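To make the length / 2 idea concrete, here is a minimal sketch (the 299-byte payload is just an example, and the receiver has to apply the same convention):

byte[] payload = new byte[299];                            // example payload with an odd length
int paddedLength = payload.Length + (payload.Length % 2);  // pad odd lengths up to even
byte lengthByte = (byte)(paddedLength / 2);                // 300 / 2 = 150, which fits in one byte

// Receiver side: recover the padded length (any pad byte is ignored by convention)
int receivedLength = lengthByte * 2;                       // 300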
You can use two (or more) bytes to represent a number larger than 255. Is that what you want?
short value = 2451;
byte[] data = BitConverter.GetBytes(value);
If this is needed in order to exchange data with some external system, remember to read about Endianness.
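For example, if the receiving system expects the length prefix in network (big-endian) order, a sketch might look like this:

short length = 2451;                             // 0x0993
byte[] prefix = BitConverter.GetBytes(length);   // machine order, usually little-endian
if (BitConverter.IsLittleEndian)
    Array.Reverse(prefix);                       // now { 0x09, 0x93 }, big-endian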
That depends on what you consider a good approach. You can use some form of encoding that lets you store values larger than a single byte can hold. For example, treating a first byte of 0xFF as an escape meaning that the next byte is part of the same value.
[0x01,0x0A,0xFF,0x0A]
Would be interpreted as 3 values of [1,10,265]
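One possible reading of that scheme, as a sketch (the decoding rule here is an assumption: 0xFF means 'add the next byte to 255'):

byte[] encoded = { 0x01, 0x0A, 0xFF, 0x0A };
var values = new List<int>();                   // needs using System.Collections.Generic;
for (int i = 0; i < encoded.Length; i++)
{
    if (encoded[i] == 0xFF && i + 1 < encoded.Length)
        values.Add(0xFF + encoded[++i]);        // 255 + 10 = 265
    else
        values.Add(encoded[i]);                 // 1, then 10
}
// values is now [1, 10, 265]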
Just doing this for fun. I was reading the pseudo-code on Wikipedia and it says that in pre-processing you append the bit '1' to the message, then append enough '0' bits so that the resulting message length modulo 512 is 448, and then append the length of the message in bits as a 64-bit big-endian integer.
Okay. I'm not sure how to append just a '1' bit, but I figure I could append the byte 128 (1000 0000). That wouldn't work, though, on the off chance the message length modulo 512 was already 448 without all those extra '0's; in that case I'm not sure how to append just a single '1' bit, because I'd need to deal in whole bytes at least. Is it possible in C#?
Also, is there a built-in way to append a big-endian integer? I believe my system is little-endian by default.
It's defined in such a way that you only ever need to deal with whole bytes, as long as the message itself is a whole number of bytes. If the message length (mod 64) is 56, append one byte of 0b10000000, followed by 63 zero bytes, followed by the length. Otherwise, append one byte of 0b10000000, followed by 0 to 62 zero bytes, followed by the length.
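Here is a minimal sketch of that padding step, assuming the message is a whole number of bytes (the method name is just for illustration):

static byte[] PadMessage(byte[] message)
{
    ulong bitLength = (ulong)message.Length * 8;

    // We need (message length + 1 + zeroBytes) mod 64 == 56, leaving the
    // last 8 bytes for the big-endian bit length.
    int zeroBytes = (55 - message.Length % 64 + 64) % 64;

    var padded = new byte[message.Length + 1 + zeroBytes + 8];
    Buffer.BlockCopy(message, 0, padded, 0, message.Length);
    padded[message.Length] = 0x80;               // the single '1' bit followed by seven '0' bits

    byte[] lengthBytes = BitConverter.GetBytes(bitLength);
    if (BitConverter.IsLittleEndian)
        Array.Reverse(lengthBytes);              // the spec wants big-endian
    Buffer.BlockCopy(lengthBytes, 0, padded, padded.Length - 8, 8);

    return padded;
}

As for the second question: BitConverter itself doesn't let you choose the byte order, so the usual trick is to check BitConverter.IsLittleEndian and reverse, as above.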
You might check out the BitArray class in System.Collections. One of the ctor overloads takes an array of bytes, etc.
I have an application that expects 5 bytes derived from a number. Right now I am using UInt32, which is 4 bytes. I've been told to make a class that uses a byte and a UInt32. However, I'm not sure how to combine them, since they are numbers.
I figured the best way might be to use a UInt64 and convert it down to 5 bytes. However, I am not sure how that can be done. I need to be able to convert it to 5 bytes, and have a separate function in the class to convert it back to a UInt64.
Does anyone have any ideas on the best way to do this?
Thank you
Use BitConverter.GetBytes and then just remove the three bytes you don't need.
To convert it back, use BitConverter.ToUInt64, remembering that you'll first have to pad your byte array with three extra (zero) bytes.
Just watch out for endianness. You may need to reverse your array if the application you are sending these bytes to expects the opposite endianness. The endianness also dictates whether you need to add/remove bytes at the start or the end of the array (check BitConverter.IsLittleEndian).
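Putting that together, a sketch might look like this (helper names are just for illustration, and it assumes the receiver wants little-endian order; reverse the five bytes if it expects the opposite):

static byte[] ToFiveBytes(ulong value)
{
    // Note: values of 2^40 or more won't fit in 5 bytes and will be truncated.
    byte[] eight = BitConverter.GetBytes(value);   // 8 bytes in machine order
    if (!BitConverter.IsLittleEndian)
        Array.Reverse(eight);                      // normalise to little-endian
    byte[] five = new byte[5];
    Array.Copy(eight, 0, five, 0, 5);              // drop the three high-order bytes
    return five;
}

static ulong FromFiveBytes(byte[] five)
{
    byte[] eight = new byte[8];                    // pad back up with three zero bytes
    Array.Copy(five, 0, eight, 0, 5);
    if (!BitConverter.IsLittleEndian)
        Array.Reverse(eight);
    return BitConverter.ToUInt64(eight, 0);
}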
I am using a C# struct as a pseudo-union (by using the LayoutKind.Explicit attribute) to pass network messages around my program. I understand how to use the layout with the primitive types, as they are of a known size.
However, how would I do this with one of the fields being a char array? I know a char is 2 bytes of data (UTF-16), but how big is a char[]? Am I correct in believing that this is a reference type, so its size is not just the number of items * 2?
How would I lay out the struct for this? Is it even possible?
The size is the width of a reference; so 4 bytes on x86 or 8 bytes on x64. The size of the array is irrelevant, as the array is stored separately on the heap. If you want to serialize that data to a byte stream, then it probably depends on which encoding you use for the char data. UTF16 would indeed be 2 * number of characters, but UTF8 or UTF32 will be different.
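If you need the character data itself to live inline in the struct (rather than just the reference), one option is an unsafe fixed-size buffer. A minimal sketch, with illustrative names and sizes:

// Sketch only: field names and sizes are assumptions; requires compiling with /unsafe.
using System.Runtime.InteropServices;

[StructLayout(LayoutKind.Explicit)]
unsafe struct NetworkMessage
{
    [FieldOffset(0)] public int MessageId;        // 4 bytes, known size
    [FieldOffset(4)] public fixed char Body[16];  // 16 chars * 2 bytes = 32 bytes, stored inline
}

The fixed buffer has a fixed inline size (32 bytes here), so it can be placed at an explicit offset like a primitive field; a regular char[] field cannot, because only the reference would live in the struct.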
That is strange; shouldn't it be equal to the length times the number of bytes per character?
Just wondering if someone could explain why the two following lines of code return "different" results? What causes the reversed values? Is this something to do with endianness?
int.MaxValue.ToString("X") //Result: 7FFFFFFF
BitConverter.ToString(BitConverter.GetBytes(int.MaxValue)) //Result: FF-FF-FF-7F
int.MaxValue.ToString("X") outputs 7FFFFFFF, that is, the number 2147483647 as a whole.
On the other hand, BitConverter.GetBytes returns an array of bytes representing 2147483647 in memory. On your machine, this number is stored in little-endian order (highest byte last), and BitConverter.ToString operates on each byte separately, so it preserves the memory order rather than reordering the output to match the string above.
However, the two values are the same: 7F-FF-FF-FF for int.MaxValue, shown in big-endian order, and FF-FF-FF-7F from BitConverter, in little-endian order. Same number.
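To see that it really is the same number, a quick sketch: reversing the array on a little-endian machine reproduces the "logical" hex string.

byte[] bytes = BitConverter.GetBytes(int.MaxValue);  // FF-FF-FF-7F on a little-endian machine
if (BitConverter.IsLittleEndian)
    Array.Reverse(bytes);
Console.WriteLine(BitConverter.ToString(bytes));     // 7F-FF-FF-FF, same digits as ToString("X")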
I would guess it's because GetBytes returns an array of bytes, which BitConverter.ToString formats (in my opinion) rather nicely.
Also keep in mind that the bitwise representation may be different from the value! This depends on where the most significant byte sits!
hth
Is it possible to get strings, ints, etc in binary format? What I mean is that assume I have the string:
"Hello" and I want to store it in binary format, so assume "Hello" is
11110000110011001111111100000000 in binary (I know it's not; I just typed something quickly).
Can I store the above binary not as a string, but in the actual format, as bits?
In addition to this, is it actually possible to store less than 8 bits? What I am getting at is: if the letter A is the most frequent letter used in a text, can I use 1 bit to store it (for compression purposes) instead of building a binary tree?
Is it possible to get strings, ints, etc in binary format?
Yes. There are several different methods for doing so. One common method is to make a MemoryStream out of an array of bytes, and then make a BinaryWriter on top of that memory stream, and then write ints, bools, chars, strings, whatever, to the BinaryWriter. That will fill the array with the bytes that represent the data you wrote. There are other ways to do this too.
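For example, a minimal sketch (the values written are arbitrary):

using System.IO;

byte[] buffer;
using (var stream = new MemoryStream())
using (var writer = new BinaryWriter(stream))
{
    writer.Write(42);          // 4 bytes (int)
    writer.Write(true);        // 1 byte (bool)
    writer.Write("Hello");     // length-prefixed, UTF-8 by default
    writer.Flush();
    buffer = stream.ToArray(); // the raw bytes representing everything written
}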
Can I store the above binary not as a string, but in the actual format, as bits?
Sure, you can store an array of bytes.
is it actually possible to store less than 8 bits?
No. The smallest unit of storage in C# is a byte. However, there are classes that will let you treat an array of bytes as an array of bits. You should read about the BitArray class.
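A small sketch of that, using the ASCII byte for 'A' as an example:

using System.Collections;

var bits = new BitArray(new byte[] { 0x41 });  // 'A' in ASCII (0100 0001)
for (int i = 0; i < bits.Length; i++)
    Console.Write(bits[i] ? 1 : 0);            // prints 10000010: least significant bit first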
What encoding would you be assuming?
What you are looking for is something like Huffman coding; it's used to represent more common values with shorter bit patterns.
How you store the bit codes is still limited to whole bytes. There is no data type that uses less than a byte. The way that you store variable width bit values is to pack them end to end in a byte array. That way you have a stream of bit values, but that also means that you can only read the stream from start to end, there is no random access to the values like you have with the byte values in a byte array.
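A sketch of that packing, with a hypothetical prefix-code table (not a real Huffman code, just an illustration of squeezing variable-width codes into bytes):

using System.Collections.Generic;
using System.Linq;

var codes = new Dictionary<char, string> { ['A'] = "1", ['B'] = "01", ['C'] = "001" };
string text = "AABAC";

var packed = new List<byte>();
byte current = 0;
int bitCount = 0;
foreach (char bit in string.Concat(text.Select(c => codes[c])))   // "1" "1" "01" "1" "001"
{
    current = (byte)((current << 1) | (bit == '1' ? 1 : 0));
    if (++bitCount == 8) { packed.Add(current); current = 0; bitCount = 0; }
}
if (bitCount > 0)
    packed.Add((byte)(current << (8 - bitCount)));                 // pad the last byte with zeros

// "AABAC" becomes the single byte 0xD9 (1101 1001) instead of five bytes.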
What I am getting at is: if the letter A is the most frequent letter used in a text, can I use 1 bit to store it (for compression purposes) instead of building a binary tree?
The algorithm you're describing is known as Huffman coding. To relate to your example, if 'A' appears frequently in the data, then the algorithm will represent 'A' as simply 1. If 'B' also appears frequently (but less frequently than A), the algorithm usually would represent 'B' as 01. Then, the rest of the characters would be 00xxxxx... etc.
In essence, the algorithm performs statistical analysis on the data and generates a code that will give you the most compression.
You can use things like:
BitConverter.GetBytes(1);
Encoding.ASCII.GetBytes("text");
Encoding.Unicode.GetBytes("text");
Once you have the bytes, you can do all the bit twiddling you want. You would need an algorithm of some sort before we can give you much more useful information.
The string is actually stored in binary format, as are all strings.
The difference between a string and another data type is that when your program displays the string, it retrieves the binary and shows the corresponding (ASCII) characters.
If you were to store data in a compressed format, you would need to assign more than 1 bit per character. How else would you identify which character is the most frequent? If 1 represents an 'A', what does 0 mean? All the other characters?