Here is my method for sending data:
// this method serializes an object to a Base64-encoded string and sends it
public void Send(Packet packetData)
{
try
{
StreamWriter w = new StreamWriter(cli.GetStream());
string s = SerializeObject(packetData);
w.WriteLine(s + "\n");
w.Flush();
}
catch (ObjectDisposedException) { ShutdownClient(-2); }
}
cli is a TcpClient object.
Packet is a serializable object.
Here's my receive method:
private string GetMessage(StreamReader r)
{
try
{
string s = r.ReadLine();
s = s.Replace(" ", "");
// this string is returned and then deserialized
return s;
}
catch (Exception e) { System.Windows.Forms.MessageBox.Show(e.Message); return null; }
}
When I use this, it works about 50% of the time. When it fails, it is because of this:
"The input stream is not a valid binary format. The starting contents (in bytes) are: 6D-2E-44-72-61-77-69-6E-67-2E-43-6F-6C-6F-72-0F-00 ..."
I have tried using Encoding.Default.GetString/GetBytes instead of Base64, but then I get this:
"Binary stream '0' does not contain a valid BinaryHeader. Possible causes are invalid stream or object version change between serialization and deserialization."
If I print out the length of this (Default-encoded) string, it is 183. But if I print out the actual string, nothing is printed.
How else can I send a byte[] as a string over StreamWriter?
How else can I send a byte[] as a string
Not the way you do it now; the byte[] content will get corrupted when the string is normalized. A string should only ever be used to store Unicode characters, and not every arbitrary byte sequence maps to valid Unicode codepoints.
If using a string is paramount, then you should use Convert.ToBase64String() at the transmitter and Convert.FromBase64String() at the receiving end.
Do keep in mind that TCP is entirely capable of transferring binary data. You possibly fell into this hole because TCP implements a stream; it doesn't do anything to help you transmit messages. The simple way to transfer a binary 'message' is to first write the length of the byte[]. The receiver first reads that length, and then knows what it should pass to the Read() call to recover the byte[] from the TCP stream.
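If you stay with the string route, a minimal sketch might look like the following (WriteMessage/ReadMessage are hypothetical helper names; it assumes one StreamWriter and one StreamReader are kept per connection):
using System;
using System.IO;

static class Base64LineProtocol
{
    // One Base64 line per message: Convert.ToBase64String never emits spaces
    // or line breaks (unless explicitly asked to), so ReadLine() on the other
    // side recovers exactly one message.
    public static void WriteMessage(StreamWriter writer, byte[] payload)
    {
        writer.WriteLine(Convert.ToBase64String(payload));
        writer.Flush();
    }

    public static byte[] ReadMessage(StreamReader reader)
    {
        string line = reader.ReadLine();
        if (line == null)
            throw new EndOfStreamException("Connection closed before a full line arrived.");
        return Convert.FromBase64String(line.Trim());
    }
}
With this framing the Replace(" ", "") call in the receive method becomes unnecessary, and the extra "\n" appended in Send() should be dropped so each message occupies exactly one line.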
Related
I am using C# and .NET to encode and decode base64 strings. The following are snippets of my code:
Base64 encoding:
using (var stream = new MemoryStream())
…...
return Convert.ToBase64String(stream.ToArray());
}
Base64 decoding
byte[] bytes = Convert.FromBase64String(messageBody);
My code fails 99% of the time, with only a 1% chance of succeeding. The stack trace is as follows:
5xx Error Returned: System.FormatException: The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters. at System.Convert.FromBase64_ComputeResultLength(Char* inputPtr, Int32 inputLength) at System.Convert.FromBase64CharPtr(Char* inputPtr, Int32 inputLength) at System.Convert.FromBase64String(String s)
Does anyone know what can cause Base64 decoding to fail? My encoding and decoding methods are symmetric, so I am really confused about what the root cause of this issue could be.
Thanks for all your replies.
It turned out there were still some old messages in JSON format that had previously failed to be delivered and kept being retried in our system. Meanwhile, a code change on our receiving side was deployed, so the receiver started expecting messages in protobuf format, which caused deserialization failures whenever one of the old JSON-format messages arrived.
In order to debug an issue like this I usually write some tests or create a console app to watch the variables as they change from function to function.
One of the possible scenarios for Base64 decoding to fail is when the decoder input has been URL-encoded. This is common when you pass an encrypted string in a URL, for example: it gets URL-encoded automatically, and then it sometimes can and sometimes can't be decoded, depending on which characters the encoded output contains.
Here's a simple console app to demonstrate this.
using System;
using System.IO;
using System.Net;
using System.Text;

class Program
{
    static void Main(string[] args)
    {
        TestEncodeDecode("test");
        TestEncodeDecode("testa");
        TestEncodeDecode("testaa");
        Console.ReadLine();
    }

    private static void TestEncodeDecode(string input)
    {
        string encoded = Encode(input);
        Console.WriteLine($"Encoded: {encoded}");

        string htmlEncoded = WebUtility.UrlEncode(encoded);
        Console.WriteLine($"htmlEncoded: {htmlEncoded}");

        string decodedString = Decode(htmlEncoded);
        Console.WriteLine($"Decoded: {decodedString}");
        Console.WriteLine();
    }

    private static string Decode(string htmlEncoded)
    {
        try
        {
            byte[] decoded = Convert.FromBase64String(htmlEncoded);
            return Encoding.ASCII.GetString(decoded);
        }
        catch (Exception)
        {
            return "Decoding failed";
        }
    }

    private static string Encode(string input)
    {
        byte[] bytes = Encoding.ASCII.GetBytes(input);
        using (var stream = new MemoryStream())
        {
            stream.Write(bytes, 0, bytes.Length);
            return Convert.ToBase64String(stream.ToArray());
        }
    }
}
You'll see that the first two arguments ("test" and "testa") fail to decode, but the third ("testaa") will succeed.
In order to "fix" this, change the Decode method as follows:
private static string Decode(string htmlEncoded)
{
    try
    {
        string regularEncodedString = WebUtility.UrlDecode(htmlEncoded);
        byte[] decoded = Convert.FromBase64String(regularEncodedString);
        return Encoding.ASCII.GetString(decoded);
    }
    catch (Exception)
    {
        return "Decoding failed";
    }
}
I have a problem with my school project. I use the Protobuf library, but I get the following error:
Google.Protobuf.InvalidProtocolBufferException: Protocol message contained an invalid tag (zero).
My protocol message wrapper is:
syntax = "proto3";
package CardGameGUI.Network.Protocol.Message;
message WrapperMessage {
enum MessageType {
HELLO_MESSAGE = 0;
JOIN_ROOM_MESSAGE = 1;
JOIN_ROOM_RESPONSE_MESSAGE = 2;
}
MessageType type = 1;
bytes payload = 2;
}
I use this to send a message:
public void SendObject<T>(Protocol.Message.WrapperMessage.Types.MessageType type, T messageObject)
{
byte[] message;
// Serialize message
using (var stream = new MemoryStream())
{
((IMessage)messageObject).WriteTo(stream);
message = stream.GetBuffer();
}
byte[] wrapper = new Protocol.Message.WrapperMessage{Type = type, Payload = Google.Protobuf.ByteString.CopyFrom(message)}.ToByteArray();
Connection.SendObject<byte[]>("ByteMessage", wrapper);
}
And my server handler:
private void IncommingMessageHandler(PacketHeader header, Connection connection, byte[] message)
{
Protocol.Message.WrapperMessage wrapper = Protocol.Message.WrapperMessage.Parser.ParseFrom(message);
switch (wrapper.Type)
{
case Protocol.Message.WrapperMessage.Types.MessageType.HelloMessage:
GetClient(connection.ConnectionInfo.NetworkIdentifier).MessageHandler(Protocol.Message.HelloMessage.Parser.ParseFrom(wrapper.Payload.ToByteArray()));
break;
}
}
The wrapper message is deserialized perfectly, and the type matches correctly, but when I process the Payload, the exception pops up.
Am I doing something wrong?
Edit: a small screenshot of the message Payload
The problem is probably that you used GetBuffer without making use of the known length. GetBuffer returns the oversized backing array. The data after the stream's .Length is garbage and should not be consumed - it will typically (but not always) be zeros, which is what you are seeing.
Either use ToArray() instead of GetBuffer(), or track the .Length of the stream and only consume that much of the oversized buffer.
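Applied to the SendObject method from the question, the serialization step could be reduced to a sketch like this (WriteTo(Stream) is the same Google.Protobuf extension the question already uses):
using System.IO;
using Google.Protobuf;

static class MessageSerializer
{
    // Returns exactly the bytes that were written, with no trailing garbage
    // from an oversized backing buffer.
    public static byte[] SerializeMessage(IMessage messageObject)
    {
        using (var stream = new MemoryStream())
        {
            messageObject.WriteTo(stream);
            return stream.ToArray(); // stream.Length bytes only, unlike GetBuffer()
        }
    }
}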
Another possibility is "framing" - it looks like you're handling packets, but if this is TCP there is no guarantee that the chunks you receive are the same sizes as the chunks you send. If you are sending multiple messages over TCP you need to implement your own framing (typically via a length prefix, since you're talking binary data).
Incidentally, this isn't protobuf-net.
If neither of those is the problem: check the data you receive is exactly (byte for byte) the data you send (including lengths). It is easy for data to get corrupted or mis-chunked by IO code.
I encountered this problem in a case where my serialized byte stream was missing the varint length prefix.
For example, if I serialize a "Person.proto" message that is 672 bytes and then try to deserialize those 672 bytes directly, I hit this error.
The fix was to prepend the varint-encoded length to the 672 bytes, giving a 674-byte stream.
The extra data is the varint encoding of 672, which is the two bytes 160, 5.
You can get the varint bytes with this function:
public static byte[] VarInt(int value)
{
    // Encode the value as a base-128 varint: 7 bits per byte,
    // with the high bit set on every byte except the last.
    List<byte> varIntBuffer = new List<byte>();
    while (true)
    {
        if ((value & ~0x7f) == 0)
        {
            varIntBuffer.Add((byte)(value & 0x7f));
            break;
        }
        else
        {
            varIntBuffer.Add((byte)((value & 0x7f) | 0x80));
            value = value >> 7;
        }
    }
    return varIntBuffer.ToArray();
}
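As a sketch of how that prefix is used (PrependLength is a hypothetical helper, paired with the VarInt function above), the framed message is simply the varint followed by the original payload; a length-delimited reader can then recover the message:
// Hypothetical helper: prepend the varint-encoded length to a serialized message.
public static byte[] PrependLength(byte[] payload)
{
    byte[] prefix = VarInt(payload.Length);          // e.g. { 160, 5 } for 672 bytes
    byte[] framed = new byte[prefix.Length + payload.Length];
    Buffer.BlockCopy(prefix, 0, framed, 0, prefix.Length);
    Buffer.BlockCopy(payload, 0, framed, prefix.Length, payload.Length);
    return framed;
}
Note that this prefix is only needed if the receiving side does a length-delimited parse; a plain parse of a complete message does not expect any prefix.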
I had this same issue when attempting to deserialize a byte array that had been initialized to a fixed size, but a bug meant I was not actually populating it with proto bytes (so the array still contained only zeros when I attempted to deserialize).
It turns out that I was reading bytes from a JMS BytesMessage twice in a test case but was not calling BytesMessage.reset() before the second read.
I'm guessing you could get a similar bug if attempting to read from an InputStream twice without calling reset()
I'm trying to send a string containing special characters through a TcpClient (as a byte[]). Here's an example:
Client enters "amé" in a textbox
Client converts string to byte[] using a certain encoding (I've tried all the predefined ones plus some like "iso-8859-1")
Client sends byte[] through TCP
Server receives and outputs the string reconverted with the same encoding (to a listbox)
Edit:
I forgot to mention that the resulting string was "am?".
Edit-2 (as requested by @DJKRAZE, here's a bit of code):
byte[] buffer = Encoding.ASCII.GetBytes("amé");
server.Client.Send(buffer); // server is a TcpClient
On the server side:
byte[] buffer = new byte[1024];
Client.Receive(buffer);
string message = Encoding.ASCII.GetString(buffer);
ListBox1.Items.Add(message);
The string that appears in the listbox is "am?"
=== Solution ===
Encoding encoding = Encoding.GetEncoding("iso-8859-1");
byte[] message = encoding.GetBytes("babé");
Update:
Simply using Encoding.UTF8.GetBytes("ééé"); works like a charm.
Never too late to answer a question, I think; I hope someone will find answers here.
C# uses 16-bit chars, and ASCII truncates them to 8 bits to fit in a byte. After some research, I found UTF-8 to be the best encoding for special characters.
// data to send via TCP or any stream/file
byte[] bytes_to_send = UTF8Encoding.UTF8.GetBytes("amé");
// when receiving, pass the received byte[] into this to get the string back
// (shown here round-tripping the same array)
string received_string = UTF8Encoding.UTF8.GetString(bytes_to_send);
Your problem appears to be the Encoding.ASCII.GetBytes("amé"); and Encoding.ASCII.GetString(buffer); calls, as hinted at by '500 - Internal Server Error' in his comments.
The é character is a multi-byte character, encoded in UTF-8 as the byte sequence C3 A9. When you use the Encoding.ASCII class to encode and decode, the é character is converted to a question mark, since it has no direct ASCII encoding. The same is true of any character that has no direct encoding in ASCII.
Change your code to use Encoding.UTF8.GetBytes() and Encoding.UTF8.GetString() and it should work for you.
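A minimal sketch of both sides, assuming a connected Socket (for example tcpClient.Client) and hypothetical helper names, making sure to decode only the bytes that Receive actually returned:
using System.Net.Sockets;
using System.Text;

static class Utf8SocketHelpers
{
    // Sender side: UTF-8 round-trips é and other non-ASCII characters.
    public static void SendText(Socket socket, string text)
    {
        socket.Send(Encoding.UTF8.GetBytes(text));
    }

    // Receiver side: decode only the bytes actually received,
    // not the entire 1024-byte buffer.
    public static string ReceiveText(Socket socket)
    {
        byte[] buffer = new byte[1024];
        int received = socket.Receive(buffer);
        return Encoding.UTF8.GetString(buffer, 0, received);
    }
}
Keep in mind that TCP may still split one logical message across several Receive calls, so anything beyond a quick test should add a length prefix as discussed in the other threads here.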
Your question and your error are not clear to me, but using a Base64 string may solve the problem.
Something like this:
static public string EncodeTo64(string toEncode)
{
    byte[] toEncodeAsBytes = System.Text.ASCIIEncoding.ASCII.GetBytes(toEncode);
    string returnValue = System.Convert.ToBase64String(toEncodeAsBytes);
    return returnValue;
}

static public string DecodeFrom64(string encodedData)
{
    byte[] encodedDataAsBytes = System.Convert.FromBase64String(encodedData);
    string returnValue = System.Text.ASCIIEncoding.ASCII.GetString(encodedDataAsBytes);
    return returnValue;
}
I'm trying to send an image via a network stream. I have SendData and GetData functions,
and I always get an "invalid parameter" error when using the Image.FromStream function.
This is my code:
I am grabbing the picture from the screen, converting it to a byte[],
and inserting it into a MemoryStream that I send via a NetworkStream.
private void SendData()
{
StreamWriter swWriter = new StreamWriter(this._nsClient);
// BinaryFormatter bfFormater = new BinaryFormatter();
// this method
lock (this._secLocker)
{
while (this._bShareScreen)
{
// Check if you need to send the screen
if (this._bShareScreen)
{
MemoryStream msStream = new MemoryStream();
this._imgScreenSend = new Bitmap(this._imgScreenSend.Width, this._imgScreenSend.Height);
// Send an image code
swWriter.WriteLine(General.IMAGE);
swWriter.Flush();
// Copy image from screen
this._grGraphics.CopyFromScreen(0, 0, 0, 0, this._sizScreenSize);
this._imgScreenSend.Save(msStream, System.Drawing.Imaging.ImageFormat.Jpeg);
msStream.Seek(0, SeekOrigin.Begin);
// Create the package
byte[] btPackage = msStream.ToArray();
// Send its length
swWriter.WriteLine(btPackage.Length.ToString());
swWriter.Flush();
// Send the package
_nsClient.Write(btPackage, 0, btPackage.Length);
_nsClient.Flush();
}
}
}
}
private void ReciveData()
{
StreamReader srReader = new StreamReader(this._nsClient);
string strMsgCode = String.Empty;
bool bContinue = true;
//BinaryFormatter bfFormater = new BinaryFormatter();
DataContractSerializer x = new DataContractSerializer(typeof(Image));
// Lock this method
lock (this._objLocker)
{
while (bContinue)
{
// Get the next msg
strMsgCode = srReader.ReadLine();
// Check code
switch (strMsgCode)
{
case (General.IMAGE):
{
// Read bytearray
int nSize = int.Parse(srReader.ReadLine().ToString());
byte[] btImageStream = new byte[nSize];
this._nsClient.Read(btImageStream, 0, nSize);
// Get the Stream
MemoryStream msImageStream = new MemoryStream(btImageStream, 0, btImageStream.Length);
// Set seek, so we read the image from the beginning of the stream
msImageStream.Position = 0;
// Build the image from the stream
this._imgScreenImg = Image.FromStream(msImageStream); // Error Here
Part of the problem is that you're using WriteLine() which adds Environment.NewLine at the end of the write. When you just call Read() on the other end, you're not dealing with that newline properly.
What you want to do is just Write() to the stream and then read it back on the other end.
The conversion to a string is strange.
What you're doing, when transferring an image, is sending an array of bytes. All you need to do is send the length of the expected stream and then the image itself, and then read the length and the byte array on the other side.
The most basic and naive way of transferring a byte array over the wire is to first send an integer that represents the length of the array, and read that length on the receiving end.
Once you now know how much data to send/receive, you then send the array as a raw array of bytes on the wire and read the length that you previously determined on the other side.
Now that you have the raw bytes and a size, you can reconstruct the array from your buffer into a valid image object (or whatever other binary format you've just sent).
Also, I'm not sure why that DataContractSerializer is there. It's raw binary data, and you're already manually serializing it to bytes anyway, so that thing isn't useful.
One of the fundamental problems of network programming using sockets and streams is defining your protocol, because the receiving end can't otherwise know what to expect or when the stream will end. That's why every common protocol out there either has a very strictly defined packet size and layout or else does something like sending length/data pairs, so that the receiving end knows what to do.
If you implement a very simple protocol such as sending an integer which represents array length and reading an integer on the receiving end, you've accomplished half the goal. Then, both sender and receiver are in agreement as to what happens next. Then, the sender sends exactly that number of bytes on the wire and the receiver reads exactly that number of bytes on the wire and considers the read to be finished. What you now have is an exact copy of the original byte array on the receiving side and you can then do with it as you please, since you know what that data was in the first place.
If you need a code example, I can provide a simple one or else there are numerous examples available on the net.
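Here is a minimal sketch of that length-prefixed exchange (SendFrame/ReceiveFrame are hypothetical names, and a connected NetworkStream is assumed):
using System;
using System.IO;
using System.Net.Sockets;

static class LengthPrefixedFraming
{
    public static void SendFrame(NetworkStream stream, byte[] payload)
    {
        // BitConverter uses the machine's native endianness; fine while both
        // ends run on the same kind of platform.
        byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
        stream.Write(lengthPrefix, 0, lengthPrefix.Length);
        stream.Write(payload, 0, payload.Length);
    }

    public static byte[] ReceiveFrame(NetworkStream stream)
    {
        int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
        return ReadExactly(stream, length);
    }

    // Read() may return fewer bytes than asked for, so loop until the
    // requested count has arrived or the connection is closed.
    private static byte[] ReadExactly(NetworkStream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read == 0)
                throw new EndOfStreamException("Connection closed mid-frame.");
            offset += read;
        }
        return buffer;
    }
}
On the sending side you could pass msStream.ToArray() to SendFrame instead of writing the length as text; on the receiving side, wrap the array returned by ReceiveFrame in a MemoryStream and hand that to Image.FromStream.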
Trying to keep it short:
The Stream.Read function (which you use) returns an int that states how many bytes were actually read; this is returned to you so you can verify that all the bytes you need have been received.
something like:
int byteCount=0;
while(byteCount < nSize)
{
int read = this._nsClient.Read(btImageStream, byteCount, nSize-byteCount);
byteCount += read;
}
this is not the best code for the job
In my Blackberry application I am loading JSON using the following method.
private static Object loadJson(String uriStr){
Object _json = null;
Message response = null;
BlockingSenderDestination bsd = null;
try
{
bsd = (BlockingSenderDestination)
DestinationFactory.getSenderDestination
("CommAPISample", URI.create(uriStr));
if(bsd == null)
{
bsd =
DestinationFactory.createBlockingSenderDestination
(new Context("CommAPISample"),
URI.create(uriStr), new JSONMessageProcessor()
);
}
response = bsd.sendReceive();
_json = response.getObjectPayload();
}
catch(Exception e)
{
System.out.println(e.toString());
}
finally
{
if(bsd != null)
{
bsd.release();
}
}
return _json;
}
This is working fine. But the problem is that when I get the JSON, Arabic characters show up as junk
(الرئيس التنÙ). I submitted this issue to the BlackBerry support forum:
Arabic shows corrupted in the JSON output
As per the discussion there, I encoded the Arabic characters into \uxxxx format (in my server-side application) and it worked. But now I have to consume JSON from somebody else, where I can't change the server-side code.
They are using ASP.NET C#; as per them, they are sending the data like the following:
JsonResult result = new JsonResult();
result.ContentEncoding = System.Text.Encoding.UTF8;
result.JsonRequestBehavior = JsonRequestBehavior.AllowGet;
result.Data = "Data Object (Contains Arabic) comes here";
return result;
So my question is: if the server provides the data in the above manner, can the BlockingSenderDestination.sendReceive method receive UTF-8 data? Or does it expect only \uxxxx-encoded data for non-ASCII characters? Or do I have to do something else (like sending some header to the server) so that I can use the UTF-8 data directly?
In debug mode I checked the value of 'response'; it already shows junk characters.
Apart from this JSON, I am able to handle Arabic everywhere else.
Yesterday I posted this issue on the BlackBerry forum, but so far there has been no reply.
I am new to BlackBerry and Java, so I am sorry if this is a silly question.
Thanks in advance.
What is the content type in the response? Is the server explicitly defining the UTF-8 character encoding in the HTTP header? e.g.:
Content-Type: text/json; charset=UTF-8
If the API is ignoring the charset in the HTTP content type, an easier way to do the String conversion is by determining whether the Message received is a ByteMessage or a StreamMessage. Retrieve the message as a byte array and then convert to a string using the UTF-8 encoding
i.e.:
Message msg = bsd.sendReceive();
byte[] msgBytes = null;
if (msg instanceof ByteMessage) {
msgBytes = ((ByteMessage) msg).getBytePayload();
}
else { /* StreamMessage */
// TODO read the bytes from the stream into a byte array
}
return new String(msgBytes,"UTF-8");
At last I found the solution myself.
The data sent from the server was UTF-8, which can use more than one byte per character, but BlockingSenderDestination.sendReceive() did not detect that, so it created one character for each byte. The solution was to take each character of the returned string, extract its byte value into a byte array, and then create a new String from that byte array using the UTF-8 encoding.
If anyone knows how to make BlockingSenderDestination.sendReceive() handle UTF-8 directly, please post it here so this extra conversion step can be avoided.