Received byte never over 127 in serial port - C#

I have a program that sends a stream of bytes to another PC. The values range from 0 to 255. I set up my SerialPort like this:
sp.BaudRate = 115200;
sp.PortName = "COM53";
sp.DataBits = 8;
sp.StopBits = System.IO.Ports.StopBits.One;
sp.Parity = System.IO.Ports.Parity.None;
sp.ReadTimeout = 0;
sp.Open();
sp.DataReceived += new System.IO.Ports.SerialDataReceivedEventHandler(sp_DataReceived);
and then I have this:
void sp_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
    string Mystring = sp.ReadExisting();
    byte testbyte = 254;
    // Gather all the bytes until 102 is reached
    foreach (byte c in Mystring)
    {
        if (pixelcount < 102)
            pixel[pixelcount] = c;
        pixelcount++;
        if (c > 126)
            Console.WriteLine("big number {0}", c); // biggest number ever printed is 127
    }
    // got all the bytes, now draw them
    if (pixelcount == 102)
    {
        Console.WriteLine("testbyte = {0}", testbyte);
        oldx = 0;
        pixelcount = 0;
        pictureBox_rawData.Invalidate();
    }
}
My problem is that "c" is never over 127.
What am I missing here?
I've tried every encoding but I cannot solve this problem. Please help.
Thanks
int91h

If you want the raw bytes, you should use SerialPort.Read to read them into a byte array. Using SerialPort.ReadExisting to read the data into a string forces a conversion of some kind (i.e. the encoding will convert bytes to chars).
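For example, a minimal sketch of the handler reading raw bytes (reusing the question's sp, pixel, and pixelcount names):
void sp_DataReceived(object sender, System.IO.Ports.SerialDataReceivedEventArgs e)
{
    int count = sp.BytesToRead;      // how many bytes have arrived so far
    byte[] buffer = new byte[count];
    sp.Read(buffer, 0, count);       // raw bytes, no encoding involved
    foreach (byte b in buffer)
    {
        if (pixelcount < 102)
            pixel[pixelcount] = b;   // values 0-255 arrive intact
        pixelcount++;
    }
}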

In the documentation for SerialPort.Write (Remarks section):
By default, SerialPort uses ASCIIEncoding to encode the characters. ASCIIEncoding encodes all characters greater than 127 as (char)63 or '?'. To support additional characters in that range, set Encoding to UTF8Encoding, UTF32Encoding, or UnicodeEncoding.
Maybe ReadExisting behaves similarly and converts every byte greater than 127 to 63.
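If you do need string-based reads for some reason, one workaround (my assumption, not from the documentation quoted above) is an encoding that maps all 256 byte values one-to-one, such as Latin-1:
sp.Encoding = System.Text.Encoding.GetEncoding(28591); // ISO-8859-1 maps bytes 0-255 straight to chars 0-255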

You are not reading bytes, you are reading text. Which is produced by converting the bytes that the port receives according to the SerialPort.Encoding property value. Which defaults to Encoding.ASCII, an encoding that only has characters for byte values 0 through 127. Byte values out of that range are replaced by the "?" character.
Which explains what you see. Choosing another Encoding is an unlikely solution in your case; use SerialPort.Read() instead. The equivalent of ReadExisting is calling Read() with a sufficiently large count argument. You'll get back whatever fits; the actual number of bytes copied into the buffer is the method's return value. It blocks when the input buffer is empty, which can only happen in the DataReceived event handler when e.EventType is not equal to SerialData.Chars. Not usually a problem.
Beware that your call to pictureBox_rawData.Invalidate() is invalid. DataReceived runs on a threadpool thread. You can only touch control members on the UI thread. You'll need to use Control.BeginInvoke().
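For the threading part, a minimal sketch (assuming a standard Windows Forms form, as the question's pictureBox_rawData suggests):
// DataReceived runs on a threadpool thread, so hand the repaint back to the UI thread
pictureBox_rawData.BeginInvoke(new Action(() => pictureBox_rawData.Invalidate()));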

As Hans Passant said, you need to use SerialPort.Read().
Something like this would work (VB.NET):
'retrieve the number of bytes in the buffer
Dim bytes1 As Integer = ComPort.BytesToRead
'create a byte array to hold the awaiting data
Dim comBuffer As Byte() = New Byte(bytes1 - 1) {}
'read the data and store it in comBuffer
ComPort.Read(comBuffer, 0, bytes1)

Related

Indicating the end of a raw data chunk in an RLE algorithm that can contain all byte values

I'm writing an RLE algorithm in C# that can work on any file as input. The approach to encoding I'm taking is as follows:
An RLE packet contains 1 byte for the length and 1 byte for the value. For example, if the byte 0xFF appeared 3 times in a row, 0x03 0xFF would be written to the file.
If representing the data as raw data would be more efficient, I use 0x00 as a terminator. This works because the length of a packet can never be zero. If I wanted to add the bytes 0x53 0x2C 0x01 to my compressed file it would look like this:
0x03 0xFF 0x00 0x53 0x2C 0x01
However, a problem arises when trying to switch back to RLE packets. I can't use a byte as a marker like I did for switching to raw data, because any byte value from 0x00 to 0xFF can appear in the input data, so the decoder would misinterpret that byte as a terminator and ruin everything.
What can I do to indicate that I have to switch back to RLE packets when it can't be written as data in the file?
Here is my code if it helps:
private static void RunLengthEncode(ref byte[] bytes)
{
    // Create a list to store the bytes
    List<byte> output = new List<byte>();
    byte runLengthByte;
    int runLengthCounter = 0;
    // Set the RLE byte to the first byte in the array
    runLengthByte = bytes[0];
    // For each byte in the input array...
    for (int i = 0; i < bytes.Length; i++)
    {
        if (runLengthByte == bytes[i] || runLengthCounter == 255)
        {
            runLengthCounter++;
        }
        else
        {
            // RLE packets under 3 should be written as raw data to avoid increasing the file size
            if (runLengthCounter < 3)
            {
                // Add a 0x00 to indicate raw data
                output.Add(0x00);
                // Add the bytes that were skipped while counting the run length
                for (int j = i - runLengthCounter; j < i; j++)
                {
                    output.Add(bytes[j]);
                }
            }
            else
            {
                // Add 2 bytes, one for the length and one for the value
                output.Add((byte)runLengthCounter);
                output.Add(runLengthByte);
            }
            runLengthCounter = 1;
            runLengthByte = bytes[i];
        }
        // Add the last bytes to the list when finishing
        if (i == bytes.Length - 1)
        {
            // Add 2 bytes, one for the length and one for the value
            output.Add((byte)runLengthCounter);
            output.Add(runLengthByte);
        }
    }
    // Set the bytes to the RLE encoded data
    bytes = output.ToArray();
}
Also, if you want to comment and say that RLE isn't very efficient for binary data: I know it isn't. This is a project I'm doing to implement many kinds of compression and learn about them, not an actual product.
Any help would be appreciated! Thanks!
There are many ways to unambiguously encode run-lengths. One simple way, when decoding: if you see two equal bytes in a row, then the next byte is a count of additional repeats of that byte after those first two, i.e. 0..255 additional repeats, encoding runs of 2..257. (There's no point in encoding runs of 0 or 1.)
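A minimal decoder sketch for that scheme (my illustrative names, not code from the question):
using System.Collections.Generic;

static List<byte> RunLengthDecode(byte[] input)
{
    var output = new List<byte>();
    for (int i = 0; i < input.Length; i++)
    {
        byte value = input[i];
        output.Add(value);                   // a lone byte is always literal
        if (i + 2 < input.Length && input[i + 1] == value)
        {
            int extra = input[i + 2];        // 0..255 additional repeats
            for (int r = 0; r <= extra; r++) // the pair's second byte plus the extras
                output.Add(value);
            i += 2;                          // skip the pair's second byte and the count
        }
    }
    return output;
}
The encoder side of the bargain is that any run of exactly two equal bytes must still be followed by a count byte of 0x00, otherwise the decoder would swallow the next data byte as a count.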

Decimal to hex in the middle of a hex command

I am trying to print stored bitmap images on some printers.
The program is a Windows Forms application.
The command to print the stored logo bitmap (if there is one) is:
port.Write("\x1C\x70\x01\x00");
('port' being my name for the serial port object).
There can be from 0 to 255 DEC (00 to FF HEX) different locations in the printer's memory.
I need a for or while loop that increments the third byte of the line above, so that
port.Write("\x1C\x70\x01\x00"); would become
port.Write("\x1C\x70\x02\x00");
port.Write("\x1C\x70\x03\x00");
and so on, up to
port.Write("\x1C\x70\xFF\x00");
I looked on MSDN and searched Stack Overflow:
https://msdn.microsoft.com/en-us/library/bb311038.aspx
int to hex string in C#
Also, as an alternative to Corith's solution: the SerialPort object lets you write a byte array directly, rather than converting your bytes to a string that the SerialPort then converts back into bytes again.
for (int i = 0; i <= 0xFF; i++) // int, so the loop can include 255 without the counter wrapping
{
    var bytes = new byte[] { 0x1C, 0x70, (byte)i, 0x00 };
    port.Write(bytes, 0, 4);
}
This loop should work for you. You can always use 0x to write hexadecimal numbers in your loops. Note that \x escapes are resolved at compile time, so the variable byte has to be appended as a char rather than embedded in the escape:
for (var c = 0x01; c <= 0xFF; c++)
{
    port.Write("\x1C\x70" + (char)c + "\x00");
}
(As noted above, bytes over 0x7F will be mangled by the port's default ASCII encoding, so the byte-array overload is the safer route.)

Identifying socket messages

I have a code snippet below that processes a socket message, and I would like to know what message should be sent so that it does not return false.
SocketPacket is a class which stores the received socket, DataLength is the length of the received message, and dataBuffer stores the message.
int num3;
byte num6 = 0;
SocketPacket workSocket;
int DataLength;

if (workSocket.dataBuffer[0] == 0x33)
{
    if (DataLength < 0xbb)
    {
        return false;
    }
    for (num3 = 0; num3 < 0xba; num3++)
    {
        num6 = (byte)(num6 + workSocket.dataBuffer[num3]);
    }
    // how to get past this if condition??
    if (num6 != workSocket.dataBuffer[0xba])
    {
        return false;
    }
}
So,
What message should I send to the server to get past the last if condition? (According to my understanding, the message should be at least 187 in length and the first digit should be "3:.........................")
What are the 0xba, 0x33, 0xbb etc.? Hexadecimals? How should I reconstruct the input message? Convert these to ASCII? Or to decimal? It doesn't make any sense to me.
I tried to convert workSocket.dataBuffer[0 or 1 or any int] to a readable string. Convert.ToChar(workSocket.dataBuffer[0]) and workSocket.dataBuffer[0].ToString() give different results. Why is that?
Well, what you have there is a fixed-length message (a 187-byte message). The first byte is a mark identifying the beginning of the message; if it is not 0x33, your code doesn't process the bytes in the buffer.
Next, in the for statement you have a checksum. It adds the first 186 bytes and compares the result with the last byte (the precalculated checksum) to verify the message is intact (arguably redundant, by the way, since the transport protocol already guarantees the stream/datagram arrives intact).
So, about your questions:
What would be the message to send to the server such to get pass the last if condition?
Well, you need to send a 187-byte message (simply a byte[187]): the first byte has to be 0x33, then the content, and the last byte has to be the checksum (calculated the same way your snippet shows):
[ 0x33 | THE CONTENT | CHKSUM ]
    0    1  ...  185     186
For example, the following buffer holds a valid message (one that will pass the if condition). It begins with the mark byte (0x33) and the next 185 bytes are zero (I didn't assign values), so the checksum is 0x33 + 0 + 0 + ... + 0 = 0x33:
var buffer = new byte[187];
buffer[0] = 0x33;
buffer[186] = 0x33;
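To build a message with real content, the checksum is just the wrapping byte sum of the first 186 bytes; a short sketch mirroring the snippet's arithmetic:
var buffer = new byte[187];
buffer[0] = 0x33;                  // the mark byte
// ... fill buffer[1] through buffer[185] with the payload ...
byte checksum = 0;
for (int i = 0; i < 0xba; i++)     // sum the first 186 bytes; the (byte) cast wraps at 256
    checksum = (byte)(checksum + buffer[i]);
buffer[186] = checksum;            // slot 0xba, the byte the server compares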
What are the 0xba, 0x33, 0xbb etc....? Hexadecimals?
Yes, they are just numbers in hexadecimal.
I tried to convert (sic) gives different results. Why is that?
Sockets send and receive bytes (just numbers), but the real question is: why do you assume they have to be text? Probably they are text, yes, but who knows. That is part of the agreement (the protocol) between the two endpoints that allows them to exchange data. So, you have to know what those 185 bytes (187 minus 1 byte for the mark and 1 byte for the checksum) mean in order to process them.
Now, what you are doing is reverse engineering a protocol, because it is clear you don't know the message format, and I guess you don't know what the content means either. Even if you are right and the content is just text, you don't know the encoding used. Those are the things you need to focus on.
I hope this helps you.

How to reduce the size of a specific format string?

I have designed a two-pass assembler for my project. The output is in hexadecimal form, i.e. 15 is 0F.
I am working with a COM port, and to send "0F" over the line it has to be sent as a string.
But the problem is that I can only receive 1 byte on the other end, and sizeof("0F") > 1 byte.
There is no way of decompressing the data on the other end; I need to do all the work on my end and still receive "0F" on the other end.
Can I do this? If yes, how?
I did this to get the hexadecimal string:
String.Format("{0:X2}", 15);
In addition,
using System.IO.Ports;
private SerialPort comPort = new SerialPort();
comPort.Write("0F");
On the receiving end I have an 8-bit processor which has 256 blocks of 1 byte each, i.e. 256 bytes. "0F" is received as 2 bytes and cannot be stored in a single 1-byte block. So I want "0F" to arrive as 1 byte.
Looks like you need something like this:
// create buffer
byte[] buffer = new byte[256];
// put values you need to send to buffer
buffer[0] = 0x0f;
// ... add another bytes if you need...
// send them
var comPort = new SerialPort();
comPort.Write(buffer, 0, 1); // 0 is buffer offset, 1 is number of bytes to write
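And if the value is already in string form, such as the "0F" produced by String.Format, a sketch of one way to parse it back into the single byte it represents before writing (Convert.ToByte with base 16 is standard .NET):
string hex = String.Format("{0:X2}", 15);   // "0F", as produced in the question
byte value = Convert.ToByte(hex, 16);       // parse back to the single byte 0x0F
comPort.Write(new byte[] { value }, 0, 1);  // exactly one byte goes over the wire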

C# Can't generate initialization vector IV

I get the following error when I try to create an IV (initialization vector) for a TripleDES encryptor.
Please see the code example:
TripleDESCryptoServiceProvider tripDES = new TripleDESCryptoServiceProvider();
byte[] key = Encoding.ASCII.GetBytes("SomeKey132123ABC");
byte[] v4 = key;
byte[] connectionString = Encoding.ASCII.GetBytes("SomeConnectionStringValue");
byte[] encryptedConnectionString = Encoding.ASCII.GetBytes("");
// Read the key and convert it to byte stream
tripDES.Key = key;
tripDES.IV = v4;
This is the exception that I get in Visual Studio:
Specified initialization vector (IV) does not match the block size for this algorithm.
Where am I going wrong?
Thank you
MSDN explicitly states that:
...The size of the IV property must be the same as the BlockSize property.
For Triple DES it is 64 bits.
The size of the initialization vector must match the block size - 64 bits in the case of TripleDES. Your initialization vector is 16 bytes ("SomeKey132123ABC" is 16 ASCII characters), twice as long as the required eight bytes.
Further, you should really use a key derivation function like PBKDF2 to create strong keys and initialization vectors from password phrases.
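A minimal sketch of that approach (the passphrase, salt, and iteration count are illustrative):
using System.Security.Cryptography;
using System.Text;

byte[] salt = Encoding.ASCII.GetBytes("an application-specific salt");
using (var kdf = new Rfc2898DeriveBytes("SomePassphrase", salt, 10000))
using (var tripDES = new TripleDESCryptoServiceProvider())
{
    tripDES.Key = kdf.GetBytes(24); // 24 bytes = a 192-bit TripleDES key
    tripDES.IV = kdf.GetBytes(8);   // 8 bytes = the 64-bit block size
}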
Key should be 24 bytes and IV should be 8 bytes.
tripDES.Key = Encoding.ASCII.GetBytes("123456789012345678901234");
tripDES.IV = Encoding.ASCII.GetBytes("12345678");
The IV must be the same length (in bits) as tripDES.BlockSize. This will be 8 bytes (64 bits) for TripleDES.
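A minimal sketch of letting the algorithm size and fill the IV for you, instead of reusing the key bytes:
var tripDES = new TripleDESCryptoServiceProvider();
tripDES.GenerateIV();                 // random IV, automatically BlockSize / 8 = 8 bytes long
Console.WriteLine(tripDES.IV.Length); // prints 8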
I've upvoted every answer (well, the ones that were here before mine!) as they're all correct.
However there's a bigger mistake you're making (one which I also made very early on) - DO NOT USE A STRING TO SEED THE IV OR KEY!!!
A compile-time string literal is a Unicode string, and it will not give you either a random or a wide-enough spread of byte values (even a random string contains lots of repeating bytes, because of the narrow byte range of printable characters). Worse, it's very easy to pick a character that actually requires 2 bytes instead of 1 - try using 8 of the more exotic characters on the keyboard and you'll see what I mean: when converted to bytes, you can end up with more than 8 bytes.
Okay - so you're using ASCII encoding, but that doesn't solve the non-randomness problem.
Instead you should use RNGCryptoServiceProvider to initialise your IV and key and, if you need to capture a constant value for future use, you should still use that class - but capture the result as a hex string or a Base64-encoded value (I prefer hex, though).
To achieve this simply, I've written a macro that I use in VS (bound to the keyboard shortcut CTRL+SHIFT+G, CTRL+SHIFT+H) which uses the .NET cryptographic RNG to produce a hex string:
Public Sub GenerateHexKey()
    Dim result As String = InputBox("How many bits?", "Key Generator", 128)
    Dim len As Int32 = 128
    If String.IsNullOrEmpty(result) Then Return
    If System.Int32.TryParse(result, len) = False Then
        Return
    End If
    Dim oldCursor As Cursor = Cursor.Current
    Cursor.Current = Cursors.WaitCursor
    Dim buff((len / 8) - 1) As Byte
    Dim rng As New System.Security.Cryptography.RNGCryptoServiceProvider()
    rng.GetBytes(buff)
    Dim sb As New StringBuilder(CType((len / 8) * 2, Integer))
    For Each b In buff
        sb.AppendFormat("{0:X2}", b)
    Next
    Dim selection As EnvDTE.TextSelection = DTE.ActiveDocument.Selection
    Dim editPoint As EnvDTE.EditPoint
    selection.Insert(sb.ToString())
    Cursor.Current = oldCursor
End Sub
Now all you need to do is to turn your hex string literal into a byte array - I do this with a helpful extension method:
public static byte[] FromHexString(this string str)
{
    // a null check would be a good idea here
    int NumberChars = str.Length;
    byte[] bytes = new byte[NumberChars / 2];
    for (int i = 0; i < NumberChars; i += 2)
        bytes[i / 2] = Convert.ToByte(str.Substring(i, 2), 16);
    return bytes;
}
There are probably better ways of doing that bit - but it works for me.
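Usage is then a one-liner (the hex literal here is illustrative only - generate your own):
// 16 hex digits -> 8 bytes, the right size for a TripleDES IV
tripDES.IV = "0123456789ABCDEF".FromHexString();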
I do it like this:
var derivedForIv = new Rfc2898DeriveBytes(passwordBytes, _saltBytes, 3);
_encryptionAlgorithm.IV = derivedForIv.GetBytes(_encryptionAlgorithm.LegalBlockSizes[0].MaxSize / 8);
The IV gets its bytes from the derive-bytes 'smusher', using the block size reported by the algorithm itself via the LegalBlockSizes property.
