I have a Java backend that sends protobuf messages: it writes delimited message objects into one big byte-array blob and publishes it over TIB. I can deserialize them fine in Java using parseDelimitedFrom(yourStreamHere), but on the C# side we are having some issues, and I couldn't find any examples. I may just be missing something obvious here.
We are doing something like this in C#:
using (MemoryStream mem = new MemoryStream())
{
    // Copy the blob into the stream and rewind before deserializing.
    mem.Write(byteArray, 0, byteArray.Length);
    mem.Position = 0;
    return Serializer.Deserialize<List<OrderState>>(mem);
}
Note: I saw an older post on this, but it looked pretty dated, and I think protobuf-net has changed since then; correct me if I'm wrong there.
At one point yesterday the developer was using field number 0 and PrefixStyle.Base128, like so:
IEnumerable<SomeObject> list = Serializer.DeserializeItems<SomeObject>(memoryStream, PrefixStyle.Base128, 0);
but we were still getting an error. When we called GetProto on the C# side today, it appeared to be emitting our properties declared as double with the fixed64 type; on the Java side we had specified double, so I think this mismatch was causing the errors we were seeing. We temporarily changed those fields to string, and now the snippet above works. Of course, ideally we don't want to send strings when we don't have to.
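For reference, Java's writeDelimitedTo/parseDelimitedFrom pair prefixes each message with a plain varint length, which is what protobuf-net calls PrefixStyle.Base128 with field number 0. A minimal sketch of the reading side under that assumption, reusing the OrderState contract from the snippet above:

using System.Collections.Generic;
using System.IO;
using ProtoBuf;

// Read a blob of varint-delimited messages, as produced by Java's
// writeDelimitedTo, into a list of OrderState objects.
static List<OrderState> ReadDelimited(byte[] blob)
{
    using (var mem = new MemoryStream(blob))
    {
        var result = new List<OrderState>();
        // Field number 0 means "bare varint length, no field header",
        // which matches the Java delimited format.
        foreach (var item in Serializer.DeserializeItems<OrderState>(mem, PrefixStyle.Base128, 0))
        {
            result.Add(item);
        }
        return result;
    }
}

On the double/fixed64 point: on the wire, proto doubles are encoded with the fixed64 wire type anyway, so a fixed64 label in generated schema output is not necessarily an encoding mismatch in itself; it is still worth double-checking that the C# contract really declares the member as double.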
Related
I have a Windows service that publishes data to a .NET client and a web client over SignalR. I've recently run into some odd issues, but I can't quite pin down a consistent behavior.
The problem lies in serializing the degrees sign, i.e. "°C". Most of the time it is serialized correctly, but I've had a few times where I see the following in my debugger:
See how the first time the "°" is serialized correctly, but the second time we see the question marks in the diamonds?
I've read this means it is an invalid UTF-8 character. But then why do all the other properties serialize correctly? The screenshot shows one correct and one incorrect value, but the entire JSON contains hundreds of these "°C" strings that look correct.
So why this one exception? It's not always the same position/property, and it doesn't always happen. This makes me think it must be a combination with preceding/succeeding characters, no?
Any ideas how to fix this or at least how to investigate this further?
Update
This is how I do serialization. I set it up on startup:
var serializerSettings = new JsonSerializerSettings();
serializerSettings.Converters.Add(new StructuredAmountJsonConverter());
var serializer = JsonSerializer.Create(serializerSettings);
GlobalHost.DependencyResolver = new AutofacDependencyResolver(_lifetimeScope);
GlobalHost.DependencyResolver.Register(typeof(JsonSerializer), () => serializer);
What's happening here is I'm telling SignalR to use Autofac to resolve dependencies. Then I register my JSON.NET serializer. The JSON.NET serializer has one custom converter, which converts my Amount class to the structure you see above (with a value and a unit property).
So you might think the problem lies in the converter, but then why does it work 95% of the time? Or should I specify the encoding in my converter?
Update 2
I've been using Fiddler to capture my network traffic and I can't see the wrong characters there. So I'm guessing the encoding problem is at the client side. I will investigate further.
Update 3
I've managed to capture the traffic in Fiddler and while it looks good in the Text view, when I select the HexView I do see something weird:
Notice how it says "Â°C" instead of "°C". So maybe it is being sent from the server in the wrong way?
Also, keep in mind my client is a .NET (WPF) client. This is my code to connect on the client side (simplified):
var url = "myUrl...";
_hubConnection = new HubConnection(url);
var hubProxy = _hubConnection.CreateHubProxy("MyHub");
hubProxy.On<object>("receive", OnDataReceived);
await _hubConnection.Start();
And when receiving data:
var message = JsonConvert.DeserializeObject<MyDataContract>(obj.ToString(), new StructuredAmountJsonConverter());
Update 4
This post makes me think this is happening:
The server/SignalR is sending my data as UTF-8, but the client is expecting Latin-1 or Windows-1252 (probably the latter). So now I need to find out how I can make it use UTF-8.
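That theory is easy to sanity-check locally. A minimal sketch of the mechanism, assuming exactly this UTF-8/Windows-1252 mismatch:

using System;
using System.Text;

class MojibakeDemo
{
    static void Main()
    {
        // "°C" encoded as UTF-8 is the bytes 0xC2 0xB0 0x43.
        byte[] utf8Bytes = Encoding.UTF8.GetBytes("°C");

        // Decoding those same bytes as Windows-1252 maps 0xC2 to 'Â'
        // and 0xB0 to '°', producing the garbled text from the hex view.
        string garbled = Encoding.GetEncoding(1252).GetString(utf8Bytes);

        Console.WriteLine(garbled); // prints "Â°C"
    }
}

If the output matches what you see in Fiddler's HexView, the bytes on the wire are valid UTF-8 and the problem lies purely in how the client (or the viewer) decodes them.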
May I know how you are serializing the objects? I think there is a way to specify the encoding type when serializing special characters like this. Here is a link I found: Special characters in object making JSON invalid (DataContractJsonSerializer).
I'm looking to send bitmaps (or images) over Ethernet in order to perform some image processing, then send them back.
The 'client' is running C# code, extracting frames from a video using Emgu CV. The 'server' will be running C/C++ on an ARM CPU, although at the moment it is x86 on my laptop running elementary OS. So I need to avoid using things like OpenCV for the image processing itself, but that's another point.
I looked into sockets etc. and can send some data to/from the server, just text typed into a console at the moment.
From initial research, it seems I'll need to convert the bitmap into a byte array in order to send it, which I've done (I think) using the following code:
Stream stm = tcpclnt.GetStream();
int l;
using (MemoryStream ms = new MemoryStream())
{
    // Encode the frame as a BMP in memory, then push the raw bytes
    // down the socket in one write.
    bmpFrame.Save(ms, ImageFormat.Bmp);
    byte[] byteFrame = ms.ToArray();
    l = byteFrame.Length;
    stm.Write(byteFrame, 0, byteFrame.Length);
    stm.Flush();
}
Then on the server side I'm trying to read it using:
char buff[10000];
int n;

/* A single read() returns whatever has arrived so far, which may be
   only part of the frame - it will not necessarily fill the buffer. */
n = read(conn_desc, buff, sizeof(buff)-1);
if (n > 0)
{
    //MemoryStream ms = new MemoryStream(receivedBytes);
    //Bitmap bmpReceived = new Bitmap(ms, System.Drawing.Imaging.ImageFormat.Bmp);

    /* %s expects NUL-terminated text; BMP data is binary and can
       contain embedded zero bytes, so printing it this way is unreliable. */
    printf("Received %s\n", buff);
}
else
{
    printf("Failed receiving\n");
}
You can see the commented-out code where I thought I'd be able to convert it back into a bitmap, but I'm not sure I want/need to anymore if I can just edit the image by accessing the bytes directly; I also don't know how/if bitmaps work in C rather than C#.
Am I going along the right lines? Ideally I want to send all the data for a single frame, do stuff to it, then send it back, but I've no idea if that's what I'm doing. I'm finding it more difficult than usual because on the server I'm just writing in Scratch/gedit and compiling with gcc, and having never coded on Linux before I'm missing things like IntelliSense and debugging.
Any help/recommendations would be appreciated.
Yes, in short, I would say your approach for doing what you say you want to do is correct.
I would ask, though: why are you sending it to the server? I'm guessing it has to do with parallel processing or something? Unless you're doing very heavy processing, working with a single frame locally is probably faster than transferring it to the server (as far as I know, Emgu CV in C# isn't considerably slower than OpenCV in C/C++).
The System.Drawing.Bitmap class is part of the .NET Framework, and you might not find a direct correspondence in C. You might want to consider first converting the image to some known format, such as a .png file, and then sending the raw bytes of that file. On the server side, you receive the raw byte array and use whatever C library you like to load that data as a .png image. The point is to remove some abstraction and have more precise knowledge of exactly what is being sent.
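If you go the raw-bytes route, a length prefix makes the receiving side much simpler, because TCP gives you a byte stream with no message boundaries. A minimal sketch of the sending side under that assumption (SendFrame is a hypothetical helper; the server would loop on read() until it has consumed exactly that many bytes):

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.IO;
using System.Net.Sockets;

// Send one frame as a 4-byte big-endian length prefix followed by
// the PNG bytes, so the server knows how much to read per image.
static void SendFrame(NetworkStream stream, Bitmap frame)
{
    using (var ms = new MemoryStream())
    {
        frame.Save(ms, ImageFormat.Png);
        byte[] png = ms.ToArray();

        // Length prefix first, in network byte order.
        byte[] prefix = BitConverter.GetBytes(png.Length);
        if (BitConverter.IsLittleEndian)
            Array.Reverse(prefix);
        stream.Write(prefix, 0, prefix.Length);

        // Then the image data itself.
        stream.Write(png, 0, png.Length);
        stream.Flush();
    }
}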
Perhaps it'd be easier to use something like the System.Net.WebRequest class? Again, it depends a bit on exactly what problem you're trying to solve.
I hope this helps at all - this response is a bit vague (your question is, too :P), so feel free to ask for clarification on specific parts :)
I'm passing messages between a Windows C# client and a Linux C++ server via TCP sockets. The C# code uses protobuf-net v2; Linux uses Google's version of protobuf. The small test object that I pass has 6 fields (enum, int, string). I need help with two problems:
1. The C# portion is unable to deserialize data sent from Linux unless the MemoryStream used as storage for the data is constructed directly from the binary array, and the array cannot be larger than the data sent from Linux (9 bytes in my case). Code sample: byte[] data = new byte[9]; copy the data from the socket into the array; MemoryStream myStream = new MemoryStream(data); pass myStream to Serializer.Deserialize.... If I initialize the MemoryStream without a binary buffer, or with an array of 1024 bytes, Deserialize creates an empty object without processing the data.
2. When I serialize the same object with the same values in C#, the size of the data is 11 bytes vs. 9 on Linux. I checked the byte array in the debugger: the C# version has the same 9 bytes as the Linux data at indexes 2-10 of the array, but index 0 is 8 and index 1 is 9. I can try to work around the problem by modifying the Linux deserialization code; I just need to know whether I always have to deal with two extra bytes at the beginning of the message. Alternatively, I can add two extra bytes to the messages generated on Linux if that will fix my deserialization in C#; I just need to know how to generate values for those two bytes.
Thanks.
Alex.
Simply put, protobuf data is not self-terminating. You can, however, create either a MemoryStream or a ProtoReader that takes a larger payload but is limited to a virtual length. If you are sending multiple messages, you will need to know the length of each payload; that is inescapable. This is often achieved via a length prefix. I would expect your current code to throw random errors, most likely "invalid field header 0", and I wonder if you are swallowing that exception.
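A minimal sketch of the length-prefix pattern on the protobuf-net side, assuming a [ProtoContract] type called MyMessage (myMessage and receivedBytes are placeholders); the Linux side would need to write and read the same varint prefix, e.g. via the C++ library's CodedOutputStream/CodedInputStream:

using System.IO;
using ProtoBuf;

// Writing: each message is preceded by a Base128 (varint) length,
// so the reader knows where one message ends.
byte[] payload;
using (var stream = new MemoryStream())
{
    Serializer.SerializeWithLengthPrefix(stream, myMessage, PrefixStyle.Base128);
    payload = stream.ToArray(); // send this over the socket
}

// Reading: the prefix tells the deserializer exactly how many bytes
// belong to the message, so an oversized buffer is no longer a problem.
using (var stream = new MemoryStream(receivedBytes))
{
    var message = Serializer.DeserializeWithLengthPrefix<MyMessage>(stream, PrefixStyle.Base128);
}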
Impossible to comment without a specific example; most likely it is something to do with the default value of field number 1 (a header byte of 8 means "field 1, varint").
I am looking for the fastest way to serialize and deserialize a C# array of objects into a string...
Why a string and not a byte array? Well, I am working with a networking system (The Unity3d networking system to be specific) and they have placed a rather annoying restriction which does not allow the sending of byte arrays or custom types, two things I need (hard to explain my situation).
The simplest solution I have come up with for this is to serialize my custom types into a string, and then transmit that string as opposed to directly sending the object array.
So, that is the question! What is the fastest way to serialize an object array into a string? I would prefer to avoid using voodoo characters (invisible/special characters), as I am not sure whether Unity3d will cull them, but base64 encoding doesn't take full advantage of the allowed character spectrum. I am also worried about the efficiency of using base64.
Obviously, since this is networking related, having the serialized data be as small as possible is a plus.
EDIT:
One possible way to do this would be to serialize to a byte array and then pretend that that byte array is a string. The problem is, I am afraid that .NET 2.0 or Unity's networking system will end up culling some of the special or invisible characters created by this method... something that very much needs to be avoided. I am hoping for a solution with near or equal speed to this that does not use any of the characters that are likely to be culled. (I have no idea which characters those are, but I have had bad experiences with Unity when it came to direct conversions from byte arrays to strings.)
Json.NET is what I always use; it's simple and gets the job done in a human-readable way. JSON is about as lightweight as it gets and is widely used for sending data over the wire.
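For example, a round trip is just two calls (Player here is a hypothetical stand-in for whatever your array holds):

using Newtonsoft.Json;

public class Player
{
    public string Name { get; set; }
    public int Score { get; set; }
}

// Serialize the whole array to one JSON string...
Player[] players = { new Player { Name = "A", Score = 1 } };
string json = JsonConvert.SerializeObject(players);

// ...and turn it back into an array on the receiving side.
Player[] roundTripped = JsonConvert.DeserializeObject<Player[]>(json);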
I'll give you this answer as accepted, but I suggest adding base64 encoding to your answer!
–Georges Oates Larsen
Thank you, and yes that is also a great option if readability is not an issue.
We use SoapFormatter so that the object can be embedded in JavaScript variables and otherwise be "safe" to pass around:
using (MemoryStream oStream = new MemoryStream())
{
    (new SoapFormatter()).Serialize(oStream, theObject);
    return Encoding.Default.GetString(oStream.ToArray());
}
using (MemoryStream s = new MemoryStream())
{
    new BinaryFormatter().Serialize(s, obj);
    return Convert.ToBase64String(s.ToArray());
}
and
using (MemoryStream s = new MemoryStream(Convert.FromBase64String(str)))
{
    return new BinaryFormatter().Deserialize(s);
}
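For completeness, a quick round trip under the assumption that the serialized type is marked [Serializable]; the base64 output uses only safe ASCII characters (at roughly 33% size overhead), which addresses the culling concern:

using System;
using System.Collections.Generic;
using System.IO;
using System.Runtime.Serialization.Formatters.Binary;

var numbers = new List<int> { 1, 2, 3 };

// Object -> base64 string.
string wire;
using (var s = new MemoryStream())
{
    new BinaryFormatter().Serialize(s, numbers);
    wire = Convert.ToBase64String(s.ToArray());
}

// Base64 string -> object.
List<int> restored;
using (var s = new MemoryStream(Convert.FromBase64String(wire)))
{
    restored = (List<int>)new BinaryFormatter().Deserialize(s);
}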
I don't understand exactly how to send data over a C# socket with socket.Send(byte[]).
I mean, they say I need to send an 0800 (Network Management Request) message for an echo test; how do I convert it?
Please, I've been programming for a while, but I don't understand the instructions.
Thanks
First of all, you need an understanding of the ISO 8583 message format. For echo test messages in the '87 revision, your MTI should be 0800 and field 70, the Network Management Information Code, should be set to 301, indicating an echo test.
Building an ISO message is quite tricky. (Shameless plug coming up.) I have released OpenIso8583.Net, a message parser/builder for .NET which can be extended to the particular flavor of ISO you are using.
You should first understand the spec you're working to; I expect you have something more specific than the bare ISO 8583 message spec, something that spells out the required fields and their content. The important thing is the way you build and deblock the ISO 8583 fields to and from the message, based on the bitmap that specifies which fields are present.
When I've built ISO 8583 test clients in C# in the past, I first put together a set of classes that could build and deblock a message bitmap. Once you have that, you need some code to build and deblock your messages; it will set (or test) bits in the bitmap and then insert the expected fields into, or extract them from, a byte buffer.
Once you have this working, the actual sending and receiving of the byte-buffer messages is trivial.
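To make that concrete, here is a minimal sketch of the bitmap-building half (primary bitmap only, i.e. fields 1-64; an echo test's field 70 would additionally require a secondary bitmap covering fields 65-128):

using System;

public class Iso8583Bitmap
{
    // 1-based field flags; index 0 is unused.
    private readonly bool[] _fields = new bool[65];

    public void Set(int field) => _fields[field] = true;

    public bool IsSet(int field) => _fields[field];

    // Pack the 64 flags into 8 bytes, with field 1 as the most
    // significant bit of the first byte.
    public byte[] ToBytes()
    {
        var bytes = new byte[8];
        for (int field = 1; field <= 64; field++)
        {
            if (_fields[field])
                bytes[(field - 1) / 8] |= (byte)(0x80 >> ((field - 1) % 8));
        }
        return bytes;
    }
}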
Looking at the spec, it would be impossible to provide a full answer here, but to get you started: you basically need to create the various messages and send them down the pipe. As the Socket class requires an array of bytes rather than a string, you can use one of the Encoding classes to get at the raw bytes of a string. If I am reading the info correctly from Wikipedia:
// MTI 0800 = network management request, used for the echo test
byte[] echo = System.Text.Encoding.ASCII.GetBytes("0800");
socket.Send(echo);
Disclaimer: I have never had to implement ISO 8583, and the reference I looked at wasn't clear whether the codes are in fact simple ASCII characters (though I am betting they are). Hopefully someone more familiar with the standard will clarify or confirm that assumption.
//*************** additional encoders ***************
// Any of these Encoding instances can replace ASCIIEncoding below
// if the spec calls for a different character encoding.
UnicodeEncoding encoderUnicode = new UnicodeEncoding();
UTF32Encoding encoder32 = new UTF32Encoding();
UTF7Encoding encoder7 = new UTF7Encoding();
UTF8Encoding encoder8 = new UTF8Encoding();
//************* end of additionals *************

ASCIIEncoding encoder = new ASCIIEncoding();
As for the parser, do some googling on ISO 8583, preferably the 2003 revision.