I am trying to create a messenger using ASP.NET Core and WebSockets. I want the client to convert a Message object to byte[] and send it to the server over a WebSocket, but on the server side I can't deserialize the object that the client serialized. After some debugging, I found differences between the serialized buffers produced from the same object on both sides.
In other words, the byte[] serialized from the msg object on the server side is not the same as the byte[] serialized from the msg object on the client side.
Here is my code on the server-side:
Message msg = new Message()
{
    Text = "Test"
};

var test = ObjectToByteArray(msg);

public static byte[] ObjectToByteArray(Object obj)
{
    BinaryFormatter bf = new BinaryFormatter();
    using (var ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}

[Serializable]
public class Message
{
    public string Text { get; set; }
}
And exactly the same code on the client side (my server side is an ASP.NET Core 5 API and my client side is Avalonia on .NET 6).
But here is a piece of var test on the server side (note that it has 170 bytes):
And here is a piece of var test on the client side (note that it has 160 bytes):
Why are they different, and how can I fix this bug?
I have a protobuf object that I am sending from a C# application (using clrZmq) to a C++ service (using the zmq C++ bindings) on a local machine (for testing). I attempt to send my object from C# using the following
Taurus.Odds odds = Util.GetFakeOdds();
using (var context = ZmqContext.Create())
using (var socket = context.CreateSocket(SocketType.REQ))
{
    byte[] buffer = null;
    socket.Connect(TARGET); // TARGET = "tcp://127.0.0.1:6500"
    Taurus.FeedMux mux = new Taurus.FeedMux();
    mux.type = Taurus.FeedMux.Type.ODDS;
    mux.odds = odds;
    SendStatus status = socket.Send(mux.ToByteArray());
    if (status == SendStatus.Sent)
    {
        int i;
        byte[] arr = socket.Receive(buffer, SocketFlags.None, out i);
        Taurus.Bet bet = buffer.ToObject<Taurus.Bet>();
    }
    ...
}
Where I am serializing my Taurus.Odds object to byte[] via the extension method
public static byte[] ToByteArray(this object o)
{
    if (o == null)
        return null;

    BinaryFormatter bf = new BinaryFormatter();
    using (MemoryStream ms = new MemoryStream())
    {
        bf.Serialize(ms, o);
        return ms.ToArray();
    }
}
I see in my C++ application that the code receives the message, but the C++ ZMQ classes fail to deserialize it correctly. I have some Java code that sends to the C++ code in the same way without issue. My question is: am I sending my object via ZMQ correctly in the above, and if not, what am I doing wrong?
Thanks for your time.
Here's your error:
I am serializing to my Taurus.Odds object to byte[] via the extension method
...
BinaryFormatter bf = new BinaryFormatter();
...
You seem to be unaware of what BinaryFormatter is. It is in no way related to ProtoBuf. The docs say the following:
Serializes and deserializes an object, or an entire graph of connected objects, in binary format.
This binary format is a .NET-specific implementation detail. And it's very rigid at that, with poor versioning support. It was mainly used in the .NET remoting days, and it's generally considered a bad idea to use it today, as there are much better serializers around.
As you can see, there's no way your C++ app could read that, as it's not in protobuf format.
So throw this method away and replace it with some proper protobuf serializing code, as explained in the protobuf-net docs. You'll need to add [ProtoContract] and [ProtoMember] attributes in your objects. Then you could write something like:
public static byte[] ToByteArray<T>(this T o)
{
    if (o == null)
        return null;

    using (MemoryStream ms = new MemoryStream())
    {
        ProtoBuf.Serializer.Serialize(ms, o);
        return ms.ToArray();
    }
}
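For completeness, here is a minimal sketch of what a protobuf-net annotated DTO and a matching read helper might look like; the field names and the FromByteArray helper are purely illustrative and not part of the code above:

[ProtoContract]
public class Odds
{
    // Placeholder members; use whatever your real Taurus.Odds actually contains.
    [ProtoMember(1)]
    public string Market { get; set; }

    [ProtoMember(2)]
    public double Value { get; set; }
}

// Hypothetical counterpart to ToByteArray<T> for reading a reply.
public static T FromByteArray<T>(this byte[] data)
{
    using (MemoryStream ms = new MemoryStream(data))
    {
        return ProtoBuf.Serializer.Deserialize<T>(ms);
    }
}

With that in place, the reply from the C++ service could be read with something like buffer.FromByteArray<Taurus.Bet>(), mirroring the ToObject<Taurus.Bet>() call in the question.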
In my client project (ASP.NET MVC4) I have a class file:
public class Person
{
    public string Name { get; set; }
    public int Phone { get; set; }
}
And from the controller I try to call my web service method. The web service is in a different project.
Person per = new Person();
per.Name = "Vibin";
per.Phone = 123456789;
FirstService.WebService service = new FirstService.WebService();
service.TakeList(per);
And my web service method is
[WebMethod]
public void TakeList(Person theList)
{
    // stuff
}
The problem is that I am unable to pass the value to the web method. I have searched a lot for a solution but failed. Please help me fix this issue, and please also provide an example of sending array values to a web service in ASP.NET C#. Thanks in advance.
I am not certain, but this looks like an issue of type ambiguity.
The Person class you are using is not the same type as the one the web service defines, even though both have the same properties.
When you add a web reference, it generates its own entity, so you have to map your MVC model to the web service model.
Person per = new Person(); // your MVC4 app's local model
per.Name = "Vibin";
per.Phone = 123456789;

FirstService.WebService service = new FirstService.WebService();

// Web service Person generated during proxy generation when you added the web reference.
mvcEmpty.FirstService.Person p = new mvcEmpty.FirstService.Person();
p.Name = per.Name;
p.Phone = per.Phone;

service.TakeList(p);
You can do that by serializing and deserializing the object:
Serialize the object on the client side (it will be sent as a byte array).
Deserialize the received byte array back into your object on the web service side.
Use the code below:
// Convert an object to a byte array
private byte[] ObjectToByteArray(Object obj)
{
    if (obj == null)
        return null;

    BinaryFormatter bf = new BinaryFormatter();
    using (MemoryStream ms = new MemoryStream())
    {
        bf.Serialize(ms, obj);
        return ms.ToArray();
    }
}

// Convert a byte array to an Object
private Object ByteArrayToObject(byte[] arrBytes)
{
    BinaryFormatter binForm = new BinaryFormatter();
    using (MemoryStream memStream = new MemoryStream())
    {
        memStream.Write(arrBytes, 0, arrBytes.Length);
        memStream.Seek(0, SeekOrigin.Begin);
        Object obj = binForm.Deserialize(memStream);
        return obj;
    }
}
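A minimal round-trip sketch of how these helpers could be used with the Person class from the question; note that BinaryFormatter requires the type to be marked [Serializable], and both projects must share the exact same Person type (see the type-ambiguity point above):

[Serializable]
public class Person
{
    public string Name { get; set; }
    public int Phone { get; set; }
}

// Client side: serialize to a byte array before calling the web method.
Person per = new Person { Name = "Vibin", Phone = 123456789 };
byte[] payload = ObjectToByteArray(per);

// Web service side: deserialize the received bytes back into a Person.
Person received = (Person)ByteArrayToObject(payload);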
I have been sending data as bytes using TcpClient, and I want to send my own class instead of raw bytes.
By bytes of data, I mean that I am currently converting the data into bytes and sending it like this:
using (MemoryStream bufferStream = new MemoryStream())
{
    using (BinaryWriter bufferData = new BinaryWriter(bufferStream))
    {
        // Simple PONG Action
        bufferData.Write((byte)10);
    }
    _logger.Info("Received PING request, Sending PONG");
    return bufferStream.ToArray();
}
Instead, I would like to send it like this, without having to declare its size and so on:
public class MyCommunicationData
{
    public ActionType Action { get; set; }
    public Profile User { get; set; }
    ...
}
Normally, when I send my data as bytes, I use the first 5 bytes to indicate the action and the message size.
But if I migrate to serializing all the data as a single class, do I still need to send the action and the size, or will the client and server know what to read from the serialized message itself? Is there a way to send it without having to specify anything outside the serialized object?
Not sure if this matters here, but I am using AsyncCallback to read and write to the network stream:
_networkStream = _client.tcpClient.GetStream();
_callbackRead = new AsyncCallback(_OnReadComplete);
_callbackWrite = new AsyncCallback(_OnWriteComplete);
Let me know if you need me to post any other functions.
If you use a text-based serializer (for example, JSON), you can use StreamReader's ReadLine and StreamWriter's WriteLine (created from tcpClient.GetStream()).
Your code would be something like
writer.WriteLine(JsonConvert.SerializeObject(commData))
and to get the data on the other end
var myobj = JsonConvert.DeserializeObject<MyCommunicationData>(reader.ReadLine())
--EDIT--
// **Server**
Task.Factory.StartNew(() =>
{
    var reader = new StreamReader(tcpClient.GetStream());
    var writer = new StreamWriter(tcpClient.GetStream()) { AutoFlush = true }; // must be a StreamWriter; AutoFlush pushes each line to the socket
    while (true)
    {
        var myobj = JsonConvert.DeserializeObject<MyCommunicationData>(reader.ReadLine());
        // do work with myobj
        // write response to client
        writer.WriteLine(JsonConvert.SerializeObject(commData));
    }
},
TaskCreationOptions.LongRunning);
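A matching client-side sketch under the same assumptions (Json.NET, one JSON document per line); the ActionType.Ping value is only an illustration based on the PING/PONG action mentioned in the question:

// **Client**
var stream = tcpClient.GetStream();
var reader = new StreamReader(stream);
var writer = new StreamWriter(stream) { AutoFlush = true };

// Send a request as a single JSON line.
var request = new MyCommunicationData { Action = ActionType.Ping }; // ActionType.Ping is assumed to exist
writer.WriteLine(JsonConvert.SerializeObject(request));

// Read the server's one-line JSON response.
var response = JsonConvert.DeserializeObject<MyCommunicationData>(reader.ReadLine());

Because each message is terminated by a newline, neither side needs to send the payload size separately, which addresses the framing part of the question.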
I am trying to learn UDP sockets. I created two programs, a server and a client. The client sends a packet to the server, and the server bounces it back.
This is the code I use in both programs for converting the data to and from a byte[], but I am getting an error when converting from byte[]:
public static Packet Open(byte[] b)
{
    MemoryStream memStream = new MemoryStream();
    BinaryFormatter binForm = new BinaryFormatter();
    memStream.Write(b, 0, b.Length);
    memStream.Seek(0, SeekOrigin.Begin);
    object obj = new object();
    try
    {
        // this line here is where the error is occurring
        obj = binForm.Deserialize(memStream);
    }
    catch (Exception er)
    {
        MessageBox.Show(er.Message);
    }
    if (obj is Packet)
        return (Packet)obj;
    else
        return null;
}

public byte[] Bundle()
{
    BinaryFormatter bf = new BinaryFormatter();
    MemoryStream ms = new MemoryStream();
    bf.Serialize(ms, this);
    return ms.ToArray();
}
If I do this all from one program, it works:
Packet p = new Packet();
p.Message = "hello";
byte[] data = p.Bundle();
Packet p2 = Packet.Open(data);
MessageBox.Show(p2.Message);
The error I am receiving is "unable to find assembly", followed by the name of my client program.
Any ideas?
It sounds to me like you are serializing a type that is not shared via a reference between both ends. Note: it is not sufficient to have the same class compiled into both, since BinaryFormatter includes the full type name including the assembly, so: it will still count as an unrelated type. The common fix there (and I use the word "fix" entirely incorrectly) is to write an assembly for the DTO and reference that assembly from both client and server. This approach still has many issues, though.
For info, there are other serializers that are compatible with just having a similar class at each end. I'm biased, but I would suggest having a look at protobuf-net; the output is usually significantly smaller, and it isn't tied to the type, meaning the class just has to be broadly similar at each end (it is very version tolerant). Plus it is faster (CPU-wise) too!
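To make that concrete, here is a minimal sketch (an illustration, not the asker's code) of the Packet type annotated for protobuf-net, reusing the Message property and the Bundle/Open names from the question:

[ProtoContract]
public class Packet
{
    [ProtoMember(1)]
    public string Message { get; set; }

    // Serialize on one end...
    public byte[] Bundle()
    {
        using (var ms = new MemoryStream())
        {
            ProtoBuf.Serializer.Serialize(ms, this);
            return ms.ToArray();
        }
    }

    // ...and deserialize on the other.
    public static Packet Open(byte[] b)
    {
        using (var ms = new MemoryStream(b))
        {
            return ProtoBuf.Serializer.Deserialize<Packet>(ms);
        }
    }
}

Because protobuf-net does not embed assembly-qualified type names in the payload, the client and server only need broadly similar Packet classes rather than a shared assembly.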
What I am doing is attempting to send an IPEndPoint through protobuf-net, and what I observed is that when deserializing the 4-byte array into the IPv4 address, the setter receives a value of 8 bytes: four bytes containing the original address, followed by 4 more bytes containing the address that was serialized. By stepping through the code I have been able to confirm that when Deserialize is called, it first reads the bytes and then sets the bytes.
After doing some reading I learned about OverwriteList, and as can be seen in the example below, I have set it to true. However, the setter is still given an 8-byte value.
Does anyone have a clue what I am doing wrong?
This sample code should throw an exception when used with protobuf-net r480 in Visual Studio 2010 as a .NET 4.0 console application.
using ProtoBuf;
using System.Net;
using System.IO;

namespace ConsoleApplication1
{
    [ProtoContract]
    class AddressOWner
    {
        private IPEndPoint endpoint;

        public AddressOWner()
        { endpoint = new IPEndPoint(new IPAddress(new byte[] { 8, 8, 8, 8 }), 0); }

        public AddressOWner(IPEndPoint newendpoint)
        { this.endpoint = newendpoint; }

        [ProtoMember(1, OverwriteList = true)]
        public byte[] AddressBytes
        {
            get { return endpoint.Address.GetAddressBytes(); }
            set { endpoint.Address = new IPAddress(value); }
        }
    }

    class Program
    {
        static void Main(string[] args)
        {
            AddressOWner ao = new AddressOWner(new IPEndPoint(new IPAddress(new byte[] { 192, 168, 1, 1 }), 80));
            MemoryStream ms = new MemoryStream();
            Serializer.Serialize(ms, ao);
            byte[] messageData = ms.GetBuffer();
            ms = new MemoryStream(messageData);
            AddressOWner aoCopy = Serializer.Deserialize<AddressOWner>(ms);
        }
    }
}
It looks like this is actually a bug, specific to byte[], which is handled as a particular protobuf primitive. Other arrays/lists are mapped to repeated (in protobuf terms), and handle the OverwriteList option correctly. I will tweak the byte[] handling to support this option.
Edit: this is fixed in r484, with a supporting integration test.
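For context, a minimal sketch (not taken from the question) of what OverwriteList controls on an ordinary repeated field, which is the behaviour the fix extends to byte[]:

using ProtoBuf;
using System.Collections.Generic;

[ProtoContract]
class Holder
{
    public Holder() { Values = new List<int> { -1 }; } // pre-populated before deserialization

    // Without OverwriteList, incoming items are appended to the existing list;
    // with OverwriteList = true the deserialized data replaces its contents.
    [ProtoMember(1, OverwriteList = true)]
    public List<int> Values { get; set; }
}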