I am trying to read the server's response when attempting to log on, using NetworkStream.Read() with the following code:
if (connectionStream.DataAvailable && connectionStream.CanRead)
{
    byte[] myReadBuffer = new byte[64];
    string responseMessage = string.Empty;
    int numberOfBytesRead = 0;
    do
    {
        numberOfBytesRead = connectionStream.Read(myReadBuffer, 0, myReadBuffer.Length);
        responseMessage += Encoding.ASCII.GetString(myReadBuffer, 0, numberOfBytesRead); // append this chunk
    } while (connectionStream.DataAvailable);
    Debug.Log("Message:" + responseMessage);
    // breakpoint set here
    if (responseMessage.Contains("OK"))
    {
        Debug.Log("logon successful");
    }
    else
    {
        Debug.LogError("Logon denied!");
    }
}
By inspecting my local variables at the breakpoint I know the Read() is executed without problems, as numberOfBytesRead is set to 32 and myReadBuffer is filled with 32 bytes (all bytes in myReadBuffer match the bytes sent by the server). However, after extracting the string from myReadBuffer using Encoding.ASCII.GetString(), the string is still empty (Visual Studio also says it is empty at the breakpoint), even though myReadBuffer isn't.
The bytes in myReadBuffer read:
32 0 0 0
1 0 0 0
0 0 0 0
76 79 71 79
78 58 32 48
59 79 75 59
32 83 83 61
54 66 67 0
which translates to: _ _ _ _ _ _ _ _ _ _ _ _ L O G O N : 0 ; O K ; S S = 6 B C _ (underscores marking the leading header/null bytes and the trailing null; the readable part is "LOGON: 0;OK; SS=6BC")
Any suggestions as to what can cause this?
The response from the server contains some null ('\0') characters. Despite what the docs on Strings (C#) say about null termination in C#:
There is no null-terminating character at the end of a C# string; therefore a C# string can contain any number of embedded null characters ('\0').
Unity does not seem to comply with this, and actually does terminate a string once a null character is encountered, though I couldn't find any reference to this in the Unity docs.
The fix I ended up going with was replacing the null characters with spaces (I could also remove the null characters completely, but I want to know the characters were there at some point), like so: responseMessage = responseMessage.Replace('\x0', '\x0020');
While creating this post I figured it out, but I couldn't find any other posts on SO describing my problem, so I'm answering it myself for future reference. If anyone has any other/better solutions or additional information, I'd still be glad to hear/accept it.
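For completeness, here is a minimal verification sketch (my addition, not from the original post, reusing the buffer and count from the code above) showing that the string really does contain every byte and that only the display stops at the first '\0':
string responseMessage = Encoding.ASCII.GetString(myReadBuffer, 0, numberOfBytesRead);
var codes = new System.Text.StringBuilder();
foreach (char c in responseMessage)
    codes.Append((int)c).Append(' '); // logs "32 0 0 0 1 0 ... 76 79 71 79 ..."
Debug.Log(codes.ToString());
responseMessage = responseMessage.Replace('\0', ' '); // make the embedded nulls visible
Debug.Log("Message:" + responseMessage);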
Related
I'm trying to figure out the byte order after conversion from BitArray to byte[].
Firstly, here is the BitArray content:
BitArray encoded = huffmanTree.Encode(input);
foreach (bool bit in encoded)
{
Console.Write((bit ? 1 : 0));
}
Console.WriteLine();
Output:
Encoded: 000001010110101011111111
Okay, so if we convert this binary to hex manually we will get: 05 6A FF
However, when I do the conversion in C#, here is what I get:
BitArray encoded = huffmanTree.Encode(input);
byte[] bytes = new byte[encoded.Length / 8 + (encoded.Length % 8 == 0 ? 0 : 1)];
encoded.CopyTo(bytes, 0);
string StringByte = BitConverter.ToString(bytes);
Console.WriteLine(StringByte); // just to check the Hex
Output:
A0-56-FF
Nevertheless, as I have mentioned, it should be 05 6A FF. Please help me understand why that is so.
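For future readers: the mismatch comes from bit ordering. BitArray.CopyTo packs index 0 of the array into the least-significant bit of the first byte, while the loop above prints index 0 first, i.e. most-significant-first. A minimal fix sketch (my addition, not from the original post) that reverses the bit order inside each byte after CopyTo:
// Reverse the bit order of one byte, so BitArray's LSB-first packing
// matches the MSB-first printed bit string.
static byte ReverseBits(byte b)
{
    byte r = 0;
    for (int i = 0; i < 8; i++)
    {
        r = (byte)((r << 1) | (b & 1)); // shift result left, take b's lowest bit
        b >>= 1;
    }
    return r;
}
// Usage after encoded.CopyTo(bytes, 0):
//   for (int i = 0; i < bytes.Length; i++) bytes[i] = ReverseBits(bytes[i]);
//   BitConverter.ToString(bytes) then yields "05-6A-FF" for the example above.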
I am calling an API that returns a string with the following information: "abc \\u0000\\u0000 fjkdshf". I have tried removing this using the following code but it doesn't seem to work.
string res = str.Replace("\\", string.Empty)
.Replace("u0000", string.Empty)
.Trim();
I have read in a couple of articles that these characters don't actually display when you are not debugging in Visual Studio, so I don't know how to fix this problem. Please help!
You're mistaking the double backslash for actual characters. It's displayed that way in the debugger because the display is escaped: the first backslash acts as an escape character for the second. If you want to replace the character, simply use "\u0000".
Here is an example program that prints the character code of each character:
void Main() // requires using System.Linq for Select()
{
string s = "abc \u0000\u0000 fjkdshf";
Console.WriteLine(string.Join(" ", s.Select(x => Convert.ToInt32(x))));
string res = s.Replace("\u0000", string.Empty);
Console.WriteLine(string.Join(" ", res.Select(x => Convert.ToInt32(x))));
}
Output:
97 98 99 32 0 0 32 102 106 107 100 115 104 102
97 98 99 32 32 102 106 107 100 115 104 102
As you can see in the output, after the replacement the zeros are gone!
More information about literal string escaping is available in the C# documentation.
I've modified an example to send & receive from serial, and that works fine.
The device I'm connecting to has three commands I need to work with.
My experience is with C.
MAP - returns a list of field_names, (decimal) values & (hex) addresses
I can keep track of which values are returned as decimal or hex.
Each line is terminated with CR
Example:
MEMBERS:10 - number of (decimal) member names
NAME_LENGTH:15 - (decimal) length of each name string
NAME_BASE:0A34 - 10 c-strings of (15) characters each starting at address (0x0A34) (may have junk following each null terminator)
etc.
GET hexaddr hexbytecount - returns a list of 2-char hex values starting from (hexaddr).
The returned bytes are a mix of bytes/ints/longs and null-terminated C-strings; the response is terminated with CR.
Example:
get 0a34 10 -- will return
0A34< 54 65 73 74 20 4D 65 20 4F 75 74 00 40 D3 23 0B
This happens to be 'Test Me Out'(00) followed by junk
etc.
PUT hexaddr hexbytevalue {{value...} {value...}} - sends multiple hex byte values, separated by spaces, starting at the hex address; terminated by CR/LF.
These bytes are a mix of bytes/ints/longs and null-terminated C-strings. Example:
put 0a34 50 75 73 68 - (ASCII "Push")
Will replace the first 4 chars at 0x0A34, so the string becomes 'Push Me Out'.
SAVED OK
See my previous answer about serial handling, which might be useful: Serial Port Polling and Data handling
To convert your response to actual text:
var s = "0A34 < 54 65 73 74 20 4D 65 20 4F 75 74 00 40 D3 23 0B";
var hex = s.Substring(s.IndexOf("<") + 1).Trim().Split(new char[] {' '});
var numbers = hex.Select(h => Convert.ToInt32(h, 16)).ToList();
var toText = String.Join("",numbers.TakeWhile(n => n!=0)
.Select(n => Char.ConvertFromUtf32(n)).ToArray());
Console.WriteLine(toText);
which:
skips through the string until after the < character, then splits the rest into hex strings
then converts each hex string into an int (base 16)
then takes each number until it finds a 0, converting each number to text (via its UTF-32 code point)
then joins all the converted strings together to recreate the original text
alternatively, more condensed
var hex = s.Substring(s.IndexOf("<") + 1).Trim().Split(new char[] {' '});
var bytes = hex.Select(h => (byte) Convert.ToInt32(h, 16)).TakeWhile(n => n != 0);
var toText = Encoding.ASCII.GetString(bytes.ToArray());
for converting to hex from a number :-
Console.WriteLine(123.ToString("X"));  // 7B
Console.WriteLine(123.ToString("X4")); // 007B
Console.WriteLine(123.ToString("X8")); // 0000007B
Console.WriteLine(123.ToString("x4")); // 007b
Also, you will find that working with hex data is well documented at https://msdn.microsoft.com/en-us/library/bb311038.aspx
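Going the other way, building the hex payload for a PUT command from an ASCII string could look like this (a sketch of mine, not from the original answer):
var text = "Push"; // hypothetical input
var payload = string.Join(" ",
    Encoding.ASCII.GetBytes(text).Select(b => b.ToString("X2")));
Console.WriteLine("put 0a34 " + payload); // put 0a34 50 75 73 68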
My input is this: 12 13 13 AF 3F 5f.
I need the output to be the same.
I pass the input from client to server:
byte[] output = System.Text.Encoding.ASCII.GetBytes(input);
and receive it at the server side:
string some = System.Text.Encoding.ASCII.GetString(output);
but I get around a thousand excess 0's at the end of the string.
How do I trim these 0's without changing my byte array size?
Various options here:
some = some.Substring(0, some.IndexOf('\0')); Or
some = some.Remove(some.IndexOf('\0')); Or
some = some.TrimEnd('\0');
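Note that the first two options assume the string actually contains a '\0'; IndexOf returns -1 otherwise, and Substring/Remove would throw. A defensive variant (my addition, not from the original answer):
int firstNul = some.IndexOf('\0');
if (firstNul >= 0)
    some = some.Remove(firstNul); // drop everything from the first '\0' on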
ASCII is 7-bit. Your value AF exceeds 7 bits; try using UTF8 encoding.
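To illustrate that 7-bit limit (a small sketch of mine, not from the original answer): bytes above 0x7F do not survive an ASCII round-trip, because .NET's ASCII encoding replaces them with '?' (0x3F):
byte[] data = { 0x12, 0x13, 0x13, 0xAF, 0x3F, 0x5F };
string s = Encoding.ASCII.GetString(data);      // 0xAF becomes '?'
byte[] back = Encoding.ASCII.GetBytes(s);
Console.WriteLine(BitConverter.ToString(back)); // 12-13-13-3F-3F-5F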
I have been using C# to write a concrete provider implementation of our product for different databases. Without getting into details, one of the columns is of byte array type (bytea in Postgres; bytea was preferred over blob). The only problem is that it does not return the same value that was inserted. When I insert an Int32 ("0") I get nine bytes back [92 followed by eight 48s] instead of [0,0,0,0]. I need a performant solution that returns the pure bytes I inserted, instead of an ASCII representation of the value "0" spread over 8 bytes.
I am using Npgsql to retrieve the data. If someone knows a solution for C#, I will be happy to learn it as well.
Edit:
Postgres 9.0, .Net 3.5
Simplification
Command query - inside, it only does an insert statement:
select InsertOrUpdateEntry(:nodeId, :timeStamp, :data)
Data parameter:
byte [] value = BitConverter.GetBytes((int)someValue);
Parameter is assigned as below
command.Parameters.Add(new NpgsqlParameter("data", NpgsqlDbType.Bytea)
{ Value = value });
Select statement:
select * from Entries
I want to get back the same byte array I entered. I would really appreciate your help.
Input: 0 0 0 0
Current Output: 92 48 48 48 48 48 48 48 48
Expected Output: 0 0 0 0
In Npgsql there is the NpgsqlDataReader class to retrieve inserted rows, e.g.:
NpgsqlConnection conn = new NpgsqlConnection(connStr);
conn.Open();
NpgsqlCommand insertCmd =
new NpgsqlCommand("INSERT INTO binaryData (data) VALUES(:dataParam)", conn);
NpgsqlParameter param = new NpgsqlParameter("dataParam", NpgsqlDbType.Bytea);
byte[] inputBytes = BitConverter.GetBytes((int)0);
Console.Write("Input:");
foreach (byte b in inputBytes)
Console.Write(" {0}", b);
Console.WriteLine();
param.Value = inputBytes;
insertCmd.Parameters.Add(param);
insertCmd.ExecuteNonQuery();
NpgsqlCommand selectCmd = new NpgsqlCommand("SELECT data FROM binaryData", conn);
NpgsqlDataReader dr = selectCmd.ExecuteReader();
if(dr.Read())
{
Console.Write("Output:");
byte[] result = (byte[])dr[0];
foreach(byte b in result)
Console.Write(" {0}", b);
Console.WriteLine();
}
conn.Close();
Result from C# app:
Input: 0 0 0 0
Output: 0 0 0 0
Result from pgAdmin:
"\000\000\000\000"
EDIT:
I found an explanation for why you are getting:
92 48 48 48 48 48 48 48 48
I checked my code with the previous version (Npgsql2.0.10-bin-ms.net3.5sp1.zip) and got the above result (of course pgAdmin returns \000\000\000\000), so I think the best you can do is use another version without this bug.
ANSWER: Use a version of Npgsql higher than 2.0.10
I ran into the same problem, but managed to solve it without having to resort to changing drivers.
The PHP documentation has a good description of what's happening: Postgres is returning escaped data. Check your output against an ASCII table; when you see 92 48 ..., that's the textual lead-in to an octal escape sequence, \0xx, just like PHP describes.
Postgres's binary data type documentation explains the escaped octet output. Fret not, there are code examples.
The solution is to tell Postgres how bytea output should be escaped, which can be either escape or hex. In this case, issue the following to Postgres via psql to match your data:
ALTER DATABASE yourdb SET BYTEA_OUTPUT TO 'escape';
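If altering the database isn't an option, the same setting can be applied per session from C# (a minimal sketch, assuming an open NpgsqlConnection named conn):
using (var cmd = new NpgsqlCommand("SET bytea_output = 'escape'", conn))
{
    cmd.ExecuteNonQuery(); // affects only the current connection
}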