Fastest way to serialize C# object array into string - c#

I am looking for the fastest way to serialize and deserialize a C# array of objects to and from a string...
Why a string and not a byte array? Well, I am working with a networking system (the Unity3d networking system, to be specific) which has placed a rather annoying restriction that does not allow the sending of byte arrays or custom types, two things I need (hard to explain my situation).
The simplest solution I have come up with for this is to serialize my custom types into a string, and then transmit that string as opposed to directly sending the object array.
So, that is the question! What is the fastest way to serialize an object array into a string? I would preferably like to avoid using voodoo characters (invisible/special characters), as I am not sure whether Unity3d will cull them, but base64 encoding doesn't take full advantage of the allowed character spectrum. I am also worried about the efficiency of using base64.
Obviously, since this is networking related, having the serialized data be as small as possible is a plus.
EDIT:
One possible way to do this would be to serialize to a byte array, and then pretend that that byte array is a string. The problem is, I am afraid that .NET 2.0 or Unity's networking system will end up culling some of the special or invisible characters created using this method... something which very much needs to be avoided. I am hoping for a solution that has near or equal speed to this, but does not use any of the characters that are likely to be culled. (I have no idea exactly which characters these are, but I have had bad experiences with Unity when it came to direct conversions from byte arrays to strings.)

Json.Net is what I always use; it's simple and gets the job done in a human-readable way. JSON is about as lightweight as it gets and is widely used for sending data over the wire.
I'll give you this answer as accepted, but I suggest adding base64 encoding to your answer!
–Georges Oates Larsen
Thank you, and yes that is also a great option if readability is not an issue.
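A minimal sketch of that combination with Json.NET (payloadArray and the MyPayload type are made up for illustration):
// Serialize the object array to a JSON string...
string json = JsonConvert.SerializeObject(payloadArray);
// ...and optionally wrap it in Base64 if special characters being culled is a concern
string safe = Convert.ToBase64String(Encoding.UTF8.GetBytes(json));
// Receiving side: unwrap and deserialize back into a typed array
string received = Encoding.UTF8.GetString(Convert.FromBase64String(safe));
MyPayload[] items = JsonConvert.DeserializeObject<MyPayload[]>(received);
Keep in mind the Base64 step adds roughly a third to the size, so skip it if the transport passes plain JSON through untouched.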

We use SoapFormatter so that the object can be embedded in JavaScript variables and otherwise be "safe" to pass around:
using (MemoryStream oStream = new MemoryStream())
{
    // SoapFormatter lives in System.Runtime.Serialization.Formatters.Soap;
    // the output is SOAP XML, which is plain text and safe to treat as a string
    new SoapFormatter().Serialize(oStream, theObject);
    return Encoding.Default.GetString(oStream.ToArray());
}
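And a minimal sketch of the reverse trip, assuming soapString holds the text produced above:
using (MemoryStream iStream = new MemoryStream(Encoding.Default.GetBytes(soapString)))
{
    object theObject = new SoapFormatter().Deserialize(iStream);
    // cast theObject back to its original type as needed
}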

// Serialize with BinaryFormatter, then Base64-encode so the result is a plain string
using (MemoryStream s = new MemoryStream())
{
    new BinaryFormatter().Serialize(s, obj);
    return Convert.ToBase64String(s.ToArray());
}
and
// The reverse: decode the Base64 string and deserialize
using (MemoryStream s = new MemoryStream(Convert.FromBase64String(str)))
{
    return new BinaryFormatter().Deserialize(s);
}

Related

How to serialize very large files to a byte array?

I have a custom object. One of the properties on the object is a byte array with the contents of a file. This file can be VERY large (800+ MB in some instances). Since using the JsonSerializer and XmlSerializer is out of the question (the resulting string is too large), I think going with a byte array is the next best option.
I've looked through some other solutions (like this) but so far have had no luck with what I need. We're working in .NET 5, and things like the BinaryFormatter are a no-go.
Is it somehow possible to take my object and write it to a stream so I can deal with it in a byte array?
Thanks
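A minimal sketch of the "write it to a stream" idea, assuming System.Text.Json (built into .NET 5) and a hypothetical MyObject type; note that a byte[] property is serialized as Base64, so the output is still roughly a third larger than the raw file contents:
using System.IO;
using System.Text.Json;
using System.Threading.Tasks;

public class MyObject                      // hypothetical shape of the object in question
{
    public string Name { get; set; }
    public byte[] FileContents { get; set; }
}

public static class MyObjectStreaming
{
    // Serialize straight onto a stream (file, network, ...) so no giant string is ever built.
    public static async Task WriteAsync(Stream destination, MyObject value)
    {
        await JsonSerializer.SerializeAsync(destination, value);
    }

    public static async Task<MyObject> ReadAsync(Stream source)
    {
        return await JsonSerializer.DeserializeAsync<MyObject>(source);
    }
}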

C# export / write multidimensional array to file (csv or whatever)

Hi, I'm designing a program and I just wanted advice on writing a multi-dimensional array to a file.
I am using XNA and have a multi-dimensional array with Vector3(x, y, z) values in it.
There are hundreds of thousands, if not millions, of values, and I want to be able to save them in a file (saving the game level). I have no bias towards any one idea, I just need to store data... that's it! For all the other game data like player stats etc. I am using XmlSerializer and it's working wonders.
Now, I was playing with the XML serializer a lot and have learnt that you cannot export multi-dimensional arrays... so frustrating (but I am sure there is a good reason why - hopefully). I played with jagged arrays with no luck.
I used System.IO.File.WriteAllText, then quickly realised that is only for strings... daahhh.
Basically I think I need to go down the BinaryWriter route, re-write my own serializer, or even try running a SQL server to host the masses of data... stupid idea? Please tell me, and can you point me in the right direction? As I primarily have a web (PHP) background, the thought of running a server that syncs data / level data is attractive to me... but might not be applicable here.
Thanks for anything,
Mal
You can just serialise the lot with the built-in .NET serialisers provided the objects in the array are serialisable (and IIRC Vector3s are).
void SerializeVector3Array(string filename, Vector3[,] array)
{
    // BinaryFormatter handles multi-dimensional arrays of serialisable types
    BinaryFormatter bf = new BinaryFormatter();
    using (Stream s = File.Open(filename, FileMode.Create))
    {
        bf.Serialize(s, array);
    }
}

Vector3[,] DeserializeVector3Array(string filename)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (Stream s = File.Open(filename, FileMode.Open))
    {
        return (Vector3[,])bf.Deserialize(s);
    }
}
That should be a rough template of what you're after.
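A quick usage sketch, calling the two methods above (width and height stand in for whatever your level uses):
// Save the level, then load it back
Vector3[,] level = new Vector3[width, height];
// ... fill the array ...
SerializeVector3Array("level1.dat", level);
Vector3[,] loaded = DeserializeVector3Array("level1.dat");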
Why don't you try JSON serialization? JSON has less noise than XML and occupies less space when written to a file, especially if you write it without indenting. It does not have trouble with arrays, dictionaries, dates and other objects, as far as my experience with it goes.
I recommend using JSON.NET, and if not, then look at this thread.
If Json.NET finds it difficult to serialize a library class with many private and static members, it is trivial to write a POCO class, map the library class's essential properties onto your POCO, and serialize and map the POCO back and forth.
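A minimal sketch of that POCO approach for the XNA Vector3 case from the question (the names below are made up, and level is assumed to be the Vector3[,] being saved); a jagged array of POCOs sidesteps any doubt about multi-dimensional array support in a given Json.NET version:
// Plain data class that Json.NET serializes without any trouble
public class Vector3Poco
{
    public float X { get; set; }
    public float Y { get; set; }
    public float Z { get; set; }

    public static Vector3Poco From(Vector3 v) { return new Vector3Poco { X = v.X, Y = v.Y, Z = v.Z }; }
    public Vector3 ToVector3() { return new Vector3(X, Y, Z); }
}

// Copy the multi-dimensional array into a jagged array of POCOs before serializing
Vector3Poco[][] dto = new Vector3Poco[level.GetLength(0)][];
for (int x = 0; x < level.GetLength(0); x++)
{
    dto[x] = new Vector3Poco[level.GetLength(1)];
    for (int y = 0; y < level.GetLength(1); y++)
        dto[x][y] = Vector3Poco.From(level[x, y]);
}
string json = JsonConvert.SerializeObject(dto);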

Compress Guids by hashing in small data sets

I'm working on a mobile app and I want to optimise the data that it's receiving from the server (as JSON).
There are 3 lists returned (each containing its own class of objects, the approximate list sizes are 50, 100 and 170). Each object has a Guid id and there is some relation data for each object. E.g.:
o = { Id = "8f088552-5b24-4ba4-a6e5-8958c4353581",
RelatedIds = ["19d2e562-0874-473f-8e05-7052e8defd9a", "615b4c47-199a-4f7d-8268-08ed43d9c891", ... ] }
Is there a way to compress these Guids to something shorter without storing an identity map? Perhaps using a hash function?
You can convert the 16-byte representation of a GUID into a Base64 string. However, you didn't mention a programming language, so we can't help further.
A hash function is not recommended here because hash functions are generally lossy.
No. One of the attributes of (non-cryptographic) hashes is that they collide: hash(a) == hash(b) but a != b. They are a performance optimization in the case where you are doing a lot of equality checks and you expect many false results (because if hash(a) != hash(b) then a != b). A GUID->counter map is probably the best way to get smaller ids here.
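A tiny sketch of that counter-map idea (names invented), at the cost of keeping the map around, which the question hoped to avoid:
// Assign each Guid a small integer the first time it is seen
Dictionary<Guid, int> map = new Dictionary<Guid, int>();

int GetShortId(Guid id)
{
    int shortId;
    if (!map.TryGetValue(id, out shortId))
    {
        shortId = map.Count;
        map[id] = shortId;
    }
    return shortId;
}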
You can convert the hex (base16) form to base64 and remove all the punctuation. You should save 25% by using base64, and another 4 characters by dropping the dashes.
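A minimal C# sketch of that conversion (the 24-character result includes "==" padding, which can be trimmed to 22 characters if you re-append it before decoding):
Guid id = new Guid("8f088552-5b24-4ba4-a6e5-8958c4353581");

// 36-char hex-with-dashes -> 24-char Base64
string shortId = Convert.ToBase64String(id.ToByteArray());

// and back again
Guid roundTripped = new Guid(Convert.FromBase64String(shortId));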
Thinking about it some more, I've realized that HTTP compression (if enabled) is probably going to compress that data well enough anyway, so it's not really worth the effort to compress the data manually.

How do you write a byte[] array using log4net

I have a byte[] with some data in it, and I would like to write this byte array AS-IS to the log file using log4net. The problems that I am facing are:
There is no overload for byte[] in TextWriter, so even implementing an IObjectRenderer is of no use.
I don't have access to the underlying Stream object of log4net.
I also tried converting the byte[] into a char[], but when I write it, it adds an extra byte.
Is this even possible with log4net?
Thanks in advance.
Log files are usually plain text files. It's probably best to log your byte array represented as a string.
Have a look at BitConverter.ToString or Convert.ToBase64String.
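A small sketch of that, assuming a log4net ILog obtained the usual way (MyClass and payload are placeholders):
ILog log = LogManager.GetLogger(typeof(MyClass));

byte[] payload = GetPayload();   // however the bytes are produced
log.Info("payload (base64): " + Convert.ToBase64String(payload));
log.Info("payload (hex): " + BitConverter.ToString(payload));   // e.g. "0A-FF-3C-..."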
Nope. Have you thought about writing it out as a hex string (see this post)?
I also think that logging any larger amount of data is kind of useless; however, I guess this is what you are looking for - this converts your bytes to a string.
System.Text.Encoding.ASCII.GetString(byteArray)
I believe you can figure out how to use that for logging.
Pz, the TaskConnect developer
If you are logging into a DB, then use a binary type with maximum size.

Parsing a binary file in C#

I have a binary file which I have stored in a byte array. The file size can be 20 MB or more. I then want to parse the file, or find a particular value in it. I am doing it in two ways:
1. By converting the full file into a char array.
2. By converting the full file into a hex string (I also have the hex values).
What is the best way to parse the full file, or should I work on it in binary form? I am using VS 2005.
From the aspect of memory consumption, it would be best if you could parse it directly, on the fly.
Converting it to a char array in C# effectively doubles its size in memory (presuming you convert each byte to a char), while a hex string will take at least 4 times the size, since C# chars are 16-bit Unicode characters (a 20 MB file becomes roughly 40 MB as a char array and 80 MB as a hex string).
On the other hand, if you need to run many searches and parsing passes over an existing set of data repeatedly, you may benefit from storing it in whatever form suits those needs better.
What's stopping you from searching in the byte[]?
IMHO, if you're simply searching for a byte of a specified value, or several contiguous bytes, this is the easiest and most efficient way to do it.
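A minimal sketch of searching a byte[] directly for a byte pattern (a naive scan; the names are made up):
// Returns the index of the first occurrence of pattern in data, or -1 if not found.
static int IndexOf(byte[] data, byte[] pattern)
{
    for (int i = 0; i <= data.Length - pattern.Length; i++)
    {
        int j = 0;
        while (j < pattern.Length && data[i + j] == pattern[j])
            j++;
        if (j == pattern.Length)
            return i;
    }
    return -1;
}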
If I understood your question correctly, you need to find strings which can contain any characters in a large binary file. Does the binary file contain text? If so, do you know the encoding? If so, you can use the StreamReader class like so:
using (StreamReader sr = new StreamReader(@"C:\test.dat", System.Text.Encoding.UTF8))
{
    string s = sr.ReadLine();   // read line by line rather than loading the whole file
}
In any case, I think it's much more efficient to use some kind of stream access to the file, instead of loading it all into memory.
You could load it into memory in chunks, and then use a pattern-matching algorithm (like Knuth-Morris-Pratt or Rabin-Karp).
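A rough sketch of that chunked approach, reusing a naive scan for clarity (a real implementation could swap in KMP or Rabin-Karp); the 1 MB chunk size and the overlap handling are assumptions:
// Scans the file in 1 MB chunks, carrying a (pattern.Length - 1) byte overlap between
// chunks so that matches straddling a chunk boundary are not missed.
static long FindInFile(string path, byte[] pattern)
{
    const int chunkSize = 1024 * 1024;
    byte[] buffer = new byte[chunkSize + pattern.Length - 1];
    using (FileStream fs = File.OpenRead(path))
    {
        long basePos = 0;   // file offset corresponding to buffer[0]
        int carried = 0;    // bytes carried over from the previous chunk
        int read;
        while ((read = fs.Read(buffer, carried, chunkSize)) > 0)
        {
            int available = carried + read;
            int hit = IndexOf(buffer, available, pattern);
            if (hit >= 0)
                return basePos + hit;

            // keep the last pattern.Length - 1 bytes for the next iteration
            carried = Math.Min(pattern.Length - 1, available);
            Array.Copy(buffer, available - carried, buffer, 0, carried);
            basePos += available - carried;
        }
    }
    return -1;
}

// Naive scan over the first 'count' bytes of data.
static int IndexOf(byte[] data, int count, byte[] pattern)
{
    for (int i = 0; i <= count - pattern.Length; i++)
    {
        int j = 0;
        while (j < pattern.Length && data[i + j] == pattern[j])
            j++;
        if (j == pattern.Length)
            return i;
    }
    return -1;
}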
