I'm a C# programmer and I know nothing about hardware protocol things.
Today I received a document with some protocols for a lock hardware board, like this:
lock command
start 0x80
board address 0x01-0xf
lock address 0x00-18
command 0x33
verify code XX
sample:
machine sends 0x80 0x01 0x01 0x33 0xB2
if it receives 0x80 0x01 0x01 0x01 0x91 (means unlocked)
if it receives 0x80 0x01 0x01 0x00 0x80 (means locked)
All I want to know is: can C# handle these commands? If so, where can I get a quick start, or what should I search for on Google?
Thanks.
Yes, C# can handle this. This is called polling. Basically, the idea is: send a command, receive the reply, pull out the part you need (in your case, most probably the last two bytes) and act on it. I'm not sure if this is clear, but I'll give you an example of something I've done previously; note that this one is event-driven, i.e. data is sent to the machine whenever an event is triggered.
public enum Transaction
{
    LOCK = 0x01,
    UNLOCK
};

private static string getTransactionDescription(Transaction transaction, string data = "")
{
    string result = "";
    switch (transaction)
    {
        case Transaction.UNLOCK:
        case Transaction.LOCK:
            // The reply carries a bit mask of slot states; decode it into "Slot NNN" entries.
            var slot = ByteOperation.reverse4ByteBitPattern(data.Substring(32, 64));
            for (int i = 8 - 1; i >= 0; i--)
            {
                for (int j = 0; j < 8; j++)
                {
                    if ((Convert.ToInt32(ByteOperation.ToggleEndian_4Bytes(slot.Substring(i * 8, 8)), 16) & (1 << j)) > 0)
                    {
                        if (!string.IsNullOrWhiteSpace(result))
                        {
                            result += ", ";
                        }
                        result += "Slot " + (((7 - i) * 8) + j + 1).ToString("D3");
                    }
                }
            }
            break;
    }
    return result;
}
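For the lock board in your question, a rough polling sketch over System.IO.Ports.SerialPort might look like the following. The port name, baud rate and serial settings are assumptions (they should be in the hardware document), and the reply layout is taken from the sample above (start, board address, lock address, status, verify code):

using System;
using System.IO.Ports;

class LockPoller
{
    static void Main()
    {
        // Assumed serial settings: check the hardware document for the real values.
        using (var port = new SerialPort("COM1", 9600, Parity.None, 8, StopBits.One))
        {
            port.ReadTimeout = 1000;
            port.Open();

            // Sample command from the document: start, board 0x01, lock 0x01, command 0x33, verify code.
            byte[] command = { 0x80, 0x01, 0x01, 0x33, 0xB2 };
            port.Write(command, 0, command.Length);

            // The sample replies are 5 bytes; the 4th byte carries the lock state.
            byte[] reply = new byte[5];
            int read = 0;
            while (read < reply.Length)
                read += port.Read(reply, read, reply.Length - read);

            Console.WriteLine(reply[3] == 0x01 ? "unlocked" : "locked");
        }
    }
}

The verify code 0xB2 is copied straight from the sample; the document presumably explains how it is derived, so don't hard-code it for other board or lock addresses.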
I have C# server-side WebSocket code, and I am sending test data from Chrome (JavaScript WebSocket client).
Following this answer: https://stackoverflow.com/a/8125509/2508439, and some other links, I wrote my C# server-side decode function as follows:
public String DecodeMessage(Byte[] bytes)
{
    Console.WriteLine("+DecodeMessage+");
    String incomingData = String.Empty;
    Byte secondByte = bytes[1];
    bool masked = (bytes[1] & 128) != 0;
    //long dataLength = secondByte & 127;
    dataLength = secondByte & 127;
    //Int32 indexFirstMask = 2;
    indexFirstMask = 2;
    if (masked)
    {
        Console.WriteLine("Masked bit SET");
    }
    if (dataLength == 126)
    {
        indexFirstMask = 4;
        dataLength = bytes[3] | bytes[2] << 8;
    }
    else if (dataLength == 127)
    {
        indexFirstMask = 10;
        // Cast to long so the upper shifts are not truncated to 32 bits.
        dataLength = bytes[9] | (long)bytes[8] << 8 | (long)bytes[7] << 16 | (long)bytes[6] << 24 |
                     (long)bytes[5] << 32 | (long)bytes[4] << 40 | (long)bytes[3] << 48 | (long)bytes[2] << 56;
    }
    //IEnumerable<Byte> keys = bytes.Skip(indexFirstMask).Take(4);
    keys = bytes.Skip(indexFirstMask).Take(4);
    Int32 indexFirstDataByte = indexFirstMask + 4;
    Byte[] decoded = new Byte[bytes.Length - indexFirstDataByte];
    Console.WriteLine("dataLength : " + dataLength + " ; bytes.Length : " + bytes.Length);
    Int32 j = 0;
    for (Int32 i = indexFirstDataByte; i < bytes.Length; i++)
    {
        decoded[j] = (Byte)(bytes[i] ^ keys.ElementAt(j % 4));
        j++;
    }
    Console.WriteLine("-DecodeMessage-");
    return incomingData = Encoding.UTF8.GetString(decoded, 0, decoded.Length);
}
public String DecodeRemainingMessage(Byte[] bytes, long bytesAlreadyRead)
{
    Console.WriteLine("+DecodeRemainingMessage+");
    String incomingData = String.Empty;
    Int32 indexFirstDataByte = 0;
    if (indexFirstMask == 10) //special case, what to do here?
    {
        indexFirstDataByte = 10;
    }
    Byte[] decoded = new Byte[bytes.Length - indexFirstDataByte];
    //Byte[] decoded = new Byte[bytes.Length];
    Int32 j = 0;
    for (Int32 i = indexFirstDataByte; i < bytes.Length; i++)
    {
        decoded[j] = (Byte)(bytes[i] ^ keys.ElementAt(j % 4));
        j++;
    }
    Console.WriteLine("-DecodeRemainingMessage-");
    return incomingData = Encoding.UTF8.GetString(decoded, 0, decoded.Length);
}
Simple packets (size 125 or less) arrive just fine.
Packets above 125 and less than 65535 bytes also arrive fine (more or less: there is a detail, but I'm not going into that right now [*]).
For packets above 65535 bytes, the whole decode function goes crazy:
only the first packet is decoded properly; after that, whatever data I receive is totally binary (unreadable), and in the consecutive packets:
if (dataLength == 126)
{
...
}
else if (dataLength == 127) ...
both conditions are never fulfilled, and dataLength is always less than 126, which is then decoded as a (small) packet and hence never reconstructed properly.
Can anyone highlight what I may be doing wrong?
Thanks
[*] => data below 65535 bytes sometimes arrives in more than two packets, which then behaves the same way as the larger packets: packets after the first time this function is hit are never reconstructed properly again.
edit 1:
@Marc
Based on your comment, I have put the 'masked bit check' into the above function, and I can see it is always set to '1' (as expected, since this is only server-side code for now).
I am also parsing the control frame in a different function, and in this function, provided my code is correct, I may be getting lots of junk data.
To elaborate, please see these functions below:
whole logical code:
The enum:
public enum ControlFrame { NA=0, CloseConnection=1, Ping=2, Pong=4, Text=8, Binary=16, ContinueFrame =32, FinalFrame=64 };
The parse control frame function:
private int ParseControlFrame(byte controlFrame)
{
    int rv = (int)ControlFrame.NA;
    bool isFinalFrame = (controlFrame & 0x80) == 0x80;
    byte opCode = (byte)((controlFrame & 0x0F));
    if (opCode >= 0x3 && opCode <= 0x7 ||
        opCode >= 0xB && opCode <= 0xF) //special frame, ignore it
    {
        Console.WriteLine("Reserved Frame received");
        return rv;
    }
    if (opCode == 0x8 || opCode == 0x0 || opCode == 0x1 || opCode == 0x2 || opCode == 0x9 || opCode == 0xA) //proceed further
    {
        if (opCode == 0x0) //continue frame
        {
            rv |= (int)ControlFrame.ContinueFrame;
            Console.WriteLine("Continue Frame received");
        }
        if (opCode == 0x1) //text frame
        {
            rv |= (int)ControlFrame.Text;
            Console.WriteLine("Text Frame received");
        }
        if (opCode == 0x2) //binary frame
        {
            rv |= (int)ControlFrame.Binary;
            Console.WriteLine("Binary frame received");
        }
        if (opCode == 0x8) //connection closed
        {
            rv |= (int)ControlFrame.CloseConnection;
            Console.WriteLine("CloseConnection Frame received");
        }
        if (opCode == 0x9) //ping
        {
            rv |= (int)ControlFrame.Ping;
            Console.WriteLine("PING received");
        }
        if (opCode == 0xA) //pong
        {
            rv |= (int)ControlFrame.Pong;
            Console.WriteLine("PONG received");
        }
    }
    else // invalid control bit, must close the connection
    {
        Console.WriteLine("invalid control frame received, must close connection");
        rv = (int)ControlFrame.CloseConnection;
    }
    if (isFinalFrame) //Final frame ...
    {
        rv |= (int)ControlFrame.FinalFrame;
        Console.WriteLine("Final frame received");
    }
    //else
    //{
    //    rv |= (int)ControlFrame.ContinueFrame;
    //    Console.WriteLine("Continue frame received");
    //}
    return rv;
}
Logical flow (a snippet from the actual code):
if (stream.DataAvailable)
{
    long bytesAlreadyRead = 0;
    bool breakMain = false;
    while (client.Available > 0)
    {
        byte[] bytes = new byte[client.Available];
        stream.Read(bytes, 0, bytes.Length);
        Console.WriteLine("if (stream.DataAvailable):\nclient.Available : " + client.Available +
                          " ; bytes.Length : " + bytes.Length);
        //translate bytes of request to string
        String data = Encoding.UTF8.GetString(bytes);
        Console.WriteLine("Message received on: " + DateTime.Now);
        if (bytesAlreadyRead == 0)
        {
            int controlFrame = ParseControlFrame(bytes[0]);
            if (controlFrame == (int)ControlFrame.NA ||
                (int)(controlFrame & (int)ControlFrame.Ping) > 0 ||
                (int)(controlFrame & (int)ControlFrame.Pong) > 0) //ignore it
            {
            }
            else
            {
                if ((int)(controlFrame & (int)ControlFrame.CloseConnection) > 0)
                {
                    Console.WriteLine("Connection #" + c.Key + " Closed on: " + DateTime.Now);
                    tcpClients.Remove(c.Key);
                    breakMain = true;
                    break;
                }
                else
                {
                    string result = c.Value.DecodeMessage(bytes);
                    File.WriteAllText("recvfile.txt", result);
                }
            }
        }
        else
        {
            string result = c.Value.DecodeRemainingMessage(bytes, bytesAlreadyRead);
            File.AppendAllText("recvfile.txt", "\n");
            File.AppendAllText("recvfile.txt", result);
        }
        bytesAlreadyRead += bytes.Length;
    }
    if (breakMain == true)
    {
        break;
    }
}
I don't get garbage, but data is missed. If I don't put this check in, then I start receiving garbage.
Based on the Console.WriteLine output, I see something like this for data less than 65535 bytes:
Message received on: 12/29/2017 12:59:00 PM
Text Frame received
Final frame received
Masked bit SET
Message received on: 12/29/2017 12:59:12 PM
Text Frame received
Final frame received
Masked bit SET
For data above 65535:
Message received on: 12/29/2017 1:02:51 PM
Text Frame received
Final frame received
Masked bit SET
Message received on: 12/29/2017 1:02:51 PM
Reserved Frame received
i.e. below 65535 bytes I am OK (most of the time).
Above 65535, things get strange.
When you mentioned:
I wonder if what is happening is that you're getting multiple messages in a single Read call (perfectly normal in TCP), consuming the first message, and incorrectly treating the entire bytes as consumed.
I never thought of this before, maybe I need to handle this somehow as well?
edit 2:
Based on your comment, I have modified the 'if (stream.DataAvailable)' logic so that it keeps reading data in the while loop until all locally buffered data is actually flushed out.
So I may be close to solving it (thanks for your feedback), but the second time the DecodeMessage() function is called, it still decodes into garbage (binary) data.
I am working on figuring it out!
Thanks
edit 3:
OK, based on your suggestion, I have sorted out most of the logic. However, the special case in the 'DecodeRemainingMessage' function still remains a mystery. [I shifted some variables from local to class scope, hence they are commented out in the functions]...
If I got it correct, I should not need to place any special condition here, but in that case I still receive garbage.
Any pointers?
[sorry for the messy code, will update it once I get the right picture!]
Thanks
edit 4:
Following all your advice in the comments/chat helped me get to the point where I updated the decode logic greatly, but I was still unable to get the right data for messages above 65535 bytes. When I tried the final logic with Firefox, though, I got all the data properly! So many thanks to you; now I still need to figure out how to handle the buggy Chrome client. Thanks!!
Edit: your code assumes that the incoming frames are always masked. This is probably OK if your code is only ever a server, but you might want to check whether bytes[1] & 128 is set (masked) or clear (not masked). If it isn't masked: the header is 4 bytes shorter. You should be OK if this is only ever a server, as (from 5.2 in RFC6455):
All frames sent from client to server have this bit set to 1.
but: it would be good to double-check. You'd be amazed how many buggy clients there are in the wild that violate the specification.
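A minimal sketch of that check, reusing the names and header layout from the question's DecodeMessage (the surrounding size-field parsing stays as it is):

bool masked = (bytes[1] & 128) != 0;                 // MASK bit of the second header byte
int maskLength = masked ? 4 : 0;                     // the 4-byte masking key is only present when the bit is set
Int32 indexFirstDataByte = indexFirstMask + maskLength;
Byte[] decoded = new Byte[bytes.Length - indexFirstDataByte];
for (Int32 i = indexFirstDataByte, j = 0; i < bytes.Length; i++, j++)
{
    // XOR with the key only when the frame is actually masked.
    decoded[j] = masked ? (Byte)(bytes[i] ^ keys.ElementAt(j % 4)) : bytes[i];
}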
--
The overall code looks fine, and is broadly comparable to what I have here. I can't see anything immediately wrong. This makes me suspect that the issue here is TCP streaming; it isn't obvious that your method is doing anything to report back what quantity of bytes should be logically consumed by this frame - i.e. the total header length plus the payload length. For comparison to my code, this would be out int headerLength and frame.PayloadLength combined.
I wonder if what is happening is that you're getting multiple messages in a single Read call (perfectly normal in TCP), consuming the first message, and incorrectly treating the entire bytes as consumed. This would mean that you start reading in the wrong place for the next frame header. A single Read invoke can return a fragment of one frame, exactly one frame, or more than one frame - the only thing it can't return is 0 bytes, unless the socket has closed.
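As an illustration of that point (not Marc's actual library code), a sketch of frame-by-frame consumption could look like this; TryGetFrameLength and HandleFrame are hypothetical helpers standing in for the header parsing and the existing decode logic:

// Sketch only: accumulate bytes, work out the full frame length from the header,
// and consume exactly that many bytes per frame before decoding the next one.
List<byte> frameBuffer = new List<byte>();

void OnBytesReceived(byte[] chunk, int count)
{
    frameBuffer.AddRange(chunk.Take(count));

    int frameLength;
    while (TryGetFrameLength(frameBuffer, out frameLength) && frameBuffer.Count >= frameLength)
    {
        byte[] frame = frameBuffer.Take(frameLength).ToArray(); // exactly one frame (header + mask + payload)
        frameBuffer.RemoveRange(0, frameLength);                // consume only what this frame used
        HandleFrame(frame);                                     // decode/dispatch as before
    }
    // Anything left in frameBuffer is the start of the next, still incomplete, frame.
}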
What I'm doing is taking a user entered string, creating a packet with the data, then sending the string out to a serial port. I am then reading the data I send via a loopback connector. My send is working flawlessly, however my receive is randomly throwing an arithmetic overflow exception.
I say randomly because it is not happening consistently. For example, I send the message "hello" twice. The first time works fine, the second time outputs nothing and throws an exception. I restart my program, run the code again, and send hello only to receive "hell" and then an exception. On rare occasion, I'll receive the packet 3 or 4 times in a row without error before the exception.
Here is my relevant code:
public void receivePacket(object sender, SerialDataReceivedEventArgs e)
{
    byte[] tempByte = new byte[2];
    int byteCount = 0;
    while (serialPort1.BytesToRead > 0)
    {
        if (byteCount <= 1)
        {
            tempByte[byteCount] = (byte)serialPort1.ReadByte();
        }
        if (byteCount == 1)
        {
            receivedString = new byte[tempByte[byteCount]];
            receivedString[0] = tempByte[0];
            receivedString[1] = tempByte[1];
        }
        else if (byteCount > 1)
        {
            byte b = (byte)serialPort1.ReadByte();
            receivedString[byteCount] = b;
        }
        byteCount++;
    }
    int strLen = (byteCount - 3);
    tempByte = new byte[strLen];
    int newBit = 0;
    for (int i = 2; i <= strLen + 1; i++)
    {
        tempByte[newBit] = receivedString[i];
        newBit++;
    }
    string receivedText = encoder.GetString(tempByte);
    SetText(receivedText.ToString());
}
I'm well aware that my implementation using byteCount (which I use to traverse the byte array) is rather sloppy. When I step through the code, I find that when I get the error, byteCount == 1, which makes strLen a negative number (since strLen is byteCount - 3, which is done because the packet contains a header, length, and CRC, i.e. byteCount - 3 == the number of actual data bytes received). This leads to tempByte having a size of -2, which throws my exception. I, however, am having a very hard time figuring out why byteCount is being set to 1.
The code after this basically just traverses the data section of the array, copies it into tempByte, and then it is sent off to a function that appends the text in another thread.
I am guessing that byteCount is 1 because you only received one byte - or rather, you processed the first byte before the second one arrived in the buffer.
The ReadByte function will wait for a certain amount of time for a byte to arrive if there isn't one waiting.
Maybe if instead of checking BytesToRead, you did something more like this:
// ReadByte() returns an int, so cast it to byte.
byte headerByte = (byte)serialPort1.ReadByte();
byte length = (byte)serialPort1.ReadByte();
receivedString = new byte[length];
receivedString[0] = headerByte;
receivedString[1] = length;
for (int i = 2; i < length; i++) {
    receivedString[i] = (byte)serialPort1.ReadByte();
}
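A fuller, hedged version of that idea could look like the sketch below. It assumes the length byte counts only the data bytes and that a single CRC byte follows (adjust to the real packet layout), and it reuses the question's serialPort1, SetText and ASCII encoding:

private void ReceivePacket(object sender, SerialDataReceivedEventArgs e)
{
    // Assumed layout: header (1 byte) + length (1 byte) + data (length bytes) + CRC (1 byte).
    byte header = (byte)serialPort1.ReadByte();   // ReadByte blocks until a byte arrives (or ReadTimeout elapses)
    byte length = (byte)serialPort1.ReadByte();

    byte[] data = new byte[length];
    for (int i = 0; i < length; i++)
        data[i] = (byte)serialPort1.ReadByte();

    byte crc = (byte)serialPort1.ReadByte();      // read but not verified in this sketch

    SetText(Encoding.ASCII.GetString(data));
}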
I need help with a checksum calculation.
This is not my code; it comes from the specification:
http://www.leupamed.at/?wpdmact=process&did=NC5ob3RsaW5r
private void CalcCheckSum(string msg, out byte checksum1, out byte checksum2)
{
    byte cs1 = 0;
    byte cs2 = 0;
    // Always use "\n" as line break when calculating the checksum.
    msg = msg.Replace("\r\n", "\n"); // Find and replace CR LF with LF
    msg = msg.Replace("\r", "\n");   // Find and replace CR with LF.
    for (int i = 0; i < msg.Length; i++)
    {
        cs1 += (byte)msg[i];
        cs2 += cs1;
    }
    checksum1 = cs1;
    checksum2 = cs2;
}
I must create a packet like this:
<!--:Begin:Chksum:1:--><!--:Ack:Msg:3:0:--><!--:End:Chksum:1:184:62:-->
The string <!--:Ack:Msg:3:0:--> is the actual data; I must calculate two checksum bytes (184 and 62) and insert them into the final packet (as seen above).
But my result from the calculation is 10 and 62:
var msg = "<!--:Ack:Msg:3:0:-->";
byte checksum1 = 0;
byte checksum2 = 0;
CalcCheckSum(msg, out checksum1, out checksum2);
I don't know how to calculate the correct checksum values.
This is the checksum for the response, not for validating the request.
I can't upload an image due to low reputation, so look at the last line in the specification: https://drive.google.com/file/d/0B_Gs9q9SJteadVRwSVc1a2FmUTg/edit?usp=sharing
This acknowledge message is independent of the request; it only has to be a response to request message ID 3.
Solution?
After calculating the checksum, the values that actually work are:
checksum1 = 256 - (10 + 62) = 184
checksum2 = 62
The device is communicating without problems now.
Probably this question is too specific and no one has experience with this type of checksum calculation. The adjustment above (checksum1 = 256 - (10 + 62) = 184, checksum2 = 62) turned out to be the fix, and the device is communicating without problems now.
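For completeness, a hedged sketch of how the final packet could be assembled with that adjustment, reusing CalcCheckSum from the specification; the wrapper strings are copied from the example above and BuildAckPacket is just an illustrative name:

// Sketch: build the acknowledge packet from the message body, applying the
// 256 - (cs1 + cs2) correction described in the solution above.
private string BuildAckPacket(string msg)
{
    byte cs1, cs2;
    CalcCheckSum(msg, out cs1, out cs2);

    // 184 = 256 - (10 + 62) for "<!--:Ack:Msg:3:0:-->".
    byte checksum1 = (byte)(256 - (cs1 + cs2));
    byte checksum2 = cs2;

    return "<!--:Begin:Chksum:1:-->" + msg +
           "<!--:End:Chksum:1:" + checksum1 + ":" + checksum2 + ":-->";
}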
I am writing a C# application to read from several serial COM ports at the same time to analyze the data communication of an IPOD. The data being sent needs to be interpreted as HEX bytes. For example,
0xFF 0x55 0x01 0x00 0x04 0xC3 0xFF 0x55 ...
I want to be able to read this and display it in a rich textbox, for example
0xFF 0x55 0x01 0x00 0x04 0xC3
0xFF 0x55 ...
The start of a command includes a header (0xFF 0x55) and the rest is the command + parameters + checksum.
What is the best way to go about this?
I currently have:
private delegate void SetTextDeleg(string text);
void sp_DataReceivedRx(object sender, SerialDataReceivedEventArgs e)
{
Thread.Sleep(500);
try
{
string data = IPODRxPort.ReadExisting(); // Is this appropriate??
// Invokes the delegate on the UI thread, and sends the data that was received to the invoked method.
// ---- The "si_DataReceived" method will be executed on the UI thread which allows populating of the textbox.
this.BeginInvoke(new SetTextDeleg(si_DataReceivedRx), new object[] { data });
}
catch
{ }
}
private void si_DataReceivedRx(string data)
{
int dataLength = data.Length*2;
double numLines = dataLength / 16.0;
for (int i = 0; i < numLines; ++i)
IPODTx_rtxtBox.Text += "\n";
IPODRx_rtxtBox.Text += SpliceText(convertAsciiTextToHex(data), 32) + "\n";
}
I can read data, but it is not in the appropriate format.
I'm just not sure of the best way to get the hex data from the COM port and display it line by line, command by command, based on the command header (0xFF 0x55).
Any suggestions?
Alex Farber's method works. Below is my code example:
SerialPort sp = (SerialPort) sender;
// string s = sp.ReadExisting();
// labelSerialMessage.Invoke(this.showSerialPortDelegate, new object[] { s });
int length = sp.BytesToRead;
byte[] buf = new byte[length];
sp.Read(buf, 0, length);
// BitConverter.ToString gives a readable hex dump; concatenating buf directly would only print the type name.
System.Diagnostics.Debug.WriteLine("Received Data: " + BitConverter.ToString(buf));
labelSerialMessage.Invoke(this.showSerialPortDelegate, new object[] {
    System.Text.Encoding.Default.GetString(buf, 0, buf.Length) });
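To get the line-by-line display the question asks about, one possible extension (only a sketch: the buffer handling and the reuse of the question's IPODRxPort, SetTextDeleg and si_DataReceivedRx names are assumptions) is to read raw bytes, format them as hex, and start a new line whenever the 0xFF 0x55 header appears:

// using System.Text; for StringBuilder
void sp_DataReceivedRx(object sender, SerialDataReceivedEventArgs e)
{
    int count = IPODRxPort.BytesToRead;
    byte[] chunk = new byte[count];
    int read = IPODRxPort.Read(chunk, 0, count);        // raw bytes instead of ReadExisting()

    var sb = new StringBuilder();
    for (int i = 0; i < read; i++)
    {
        // Start a new line whenever a 0xFF 0x55 header begins a new command.
        if (sb.Length > 0 && i + 1 < read && chunk[i] == 0xFF && chunk[i + 1] == 0x55)
            sb.AppendLine();
        sb.AppendFormat("0x{0:X2} ", chunk[i]);
    }

    // Hand the already-formatted text to the UI thread.
    this.BeginInvoke(new SetTextDeleg(si_DataReceivedRx), new object[] { sb.ToString() });
}

With this approach si_DataReceivedRx would only append the string to the textbox instead of converting it again. A header split across two Read calls will not be detected by this sketch, so a real implementation should buffer bytes across calls.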
I'm receiving packets over a COM port. Each packet begins with {0xFF, 0xFF} and ends with {0xFE, 0xFE}. All received bytes are queued in a Queue<byte>, and after each void port_DataReceived(object sender, SerialDataReceivedEventArgs e) I process that queue.
If any 0xFF or 0xFE shows up inside a packet, the device adds 0x00 after it.
How do I extract each packet?
How do I delete the unnecessary 0x00 bytes inside each packet that contains a header byte?
For the first issue I have:
void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    byte[] data = new byte[port.BytesToRead];
    try
    {
        port.Read(data, 0, data.Length);
    }
    catch (Exception ex)
    {
        Debug.WriteLine(ex.Message);
    }
    data.ToList().ForEach(newByte => receivedData.Enqueue(newByte));
    processData();
}

private void processData()
{
    // Determine if we have a "packet" in the queue
    if (Enumerable.SequenceEqual(receivedData.Take(2), new List<byte> { 0xFF, 0xFF }))
    {
        // Beginning of new packet in the front of queue is ready!
        if (Enumerable.SequenceEqual(receivedData.Skip(Math.Max(0, receivedData.Count() - 2)).Take(2), new List<byte> { 0xFE, 0xFE }))
        {
            List<byte> tempPacket = new List<byte>();
            // Whole packet in the queue
            while (receivedData.Count > 0)
                tempPacket.Add(receivedData.Dequeue());
            tempPacket.TrimExcess();
            Packet pack = new Packet(tempPacket, PacketOrigin.Serial);
        }
    }
}
I'm trying to remove every 0x00 that comes after any 0xFE or 0xFF found inside the Queue<byte>; so far I came up with:
List<byte> unconvertedPacket = new List<byte> { 0xFF, 0xFF, 0x00, 0x00, 0x4D, 0xFA, 0xFE, 0x00, 0x01, 0x00, 0x03, 0xFE, 0xFE };
int index = 0;
while (index != null)
{
    unconvertedPacket.RemoveAt(index + 1);
    index = unconvertedPacket.IndexOf(0xFE);
}
while (index != null)
{
    unconvertedPacket.RemoveAt(index + 1);
    index = unconvertedPacket.IndexOf(0xFF);
}
Does anyone have any other solution or advice for doing this?
Try the following approach:
In the DataReceived event handler keep reading the incoming data and append it to a buffer (byte[]).
First you need to find the start marker ({0xFF, 0xFF}) in the buffer of received data. You need to determine the index of this marker within the buffer.
Once you have the start index, you need to keep appending incoming data to the buffer and check if the end marker (0xFE, 0xFE) has arrived. Capture the index of the end marker within the buffer.
Once you have the start and end index you can extract the packet between them. Nevermind about the extra 0x00 byte that gets added after it. You know the index of the start and end marker and their length (2). Just extract the array of bytes between them.
You need to create a search algorithm to suit this purpose. Both the needle and the haystack are an array of bytes (byte[]). You can use the Boyer-Moore string search algorithm for this purpose.
Here's a simple C# implementation of the Boyer-Moore algorithm which only implements the bad character rule. Read up on Wikipedia if you also want to implement the good suffix rule.
The algorithm is normally intended for strings, but I modified it to work with byte arrays. Tested this locally with an IP camera to extract the received JPEG images.
Check out the Wikipedia article for more information about it. It contains a full Java implementation which you can easily translate to C#.
public class BoyerMoore
{
    public static int IndexOf(byte[] needle, byte[] haystack)
    {
        if (needle == null || needle.Length == 0)
            return -1;
        int[] charTable = CreateCharTable(needle);
        for (int i = needle.Length - 1, j; i < haystack.Length;)
        {
            for (j = needle.Length - 1; needle[j] == haystack[i]; i--, j--)
            {
                if (j == 0)
                    return i;
            }
            i += charTable[haystack[i]];
        }
        return -1;
    }

    private static int[] CreateCharTable(byte[] needle)
    {
        const int ALPHABET_SIZE = 256;
        var table = new int[ALPHABET_SIZE];
        for (int i = 0; i < table.Length; i++)
        {
            table[i] = needle.Length;
        }
        for (int i = 0; i < needle.Length - 1; i++)
        {
            table[needle[i]] = needle.Length - 1 - i;
        }
        return table;
    }
}
Example usage:
var haystack = new byte[]
    {0xFF, 0xFF, 0x00, 0x00, 0x4D, 0xFA, 0xFE, 0x00, 0x01, 0x00, 0x03, 0xFE, 0xFE};
var startIndexOf = BoyerMoore.IndexOf(new byte[] {0xFF, 0xFF}, haystack);
var endIndexOf = BoyerMoore.IndexOf(new byte[] {0xFE, 0xFE}, haystack);
var packet = new byte[endIndexOf - 2 - startIndexOf];
// Copy everything between the end of the start marker and the start of the end marker.
for (int i = startIndexOf + 2, j = 0; i < endIndexOf; i++, j++)
{
    packet[j] = haystack[i];
}
Voila: in this example the packet byte array contains 9 bytes, and only the bytes between the start and end markers. You can now trigger an event and pass the packet as an event arg, for example.
Remark: receiving data from the COM port is a continuous process. You need to keep monitoring it: keep appending the received data, keep checking for the start and end markers, extract the packets, and so on. Watch out that your buffer does not overflow; you need to implement some housekeeping there.
Hope it helps. Check out AForge's implementation of MJPEGStream for an example of continuously reading incoming data.
To recapitulate:
Declare an instance variable to store the received data (e.g. _buffer = new byte[4096]).
Append the incoming data to the buffer in the DataReceived event handler.
Search for the start marker. If found remember the start index in an instance variable.
If you already know the position of the start marker, then search for the index of the end marker.
When you find the end marker, extract the packet and fire an event. Use the packet as part of the event's EventArgs.
Wash, rinse, repeat.
You need to implement some housekeeping to make sure the buffer won't overflow (> 4096 bytes). For instance once you find a packet you can clean up the buffer up until the last received end marker.
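For the second question (stripping the stuffed 0x00 bytes), a small sketch that walks the extracted packet once and drops any 0x00 immediately following 0xFF or 0xFE; the helper name Unstuff is made up for illustration:

// Removes the 0x00 the device stuffs after every 0xFF or 0xFE inside a packet.
static byte[] Unstuff(byte[] packet)
{
    var result = new List<byte>(packet.Length);
    for (int i = 0; i < packet.Length; i++)
    {
        result.Add(packet[i]);
        // Skip the stuffed 0x00 that follows a header-valued byte.
        if ((packet[i] == 0xFF || packet[i] == 0xFE) && i + 1 < packet.Length && packet[i + 1] == 0x00)
            i++;
    }
    return result.ToArray();
}

Usage: byte[] clean = Unstuff(packet);. Depending on where the device starts stuffing (the example haystack shows 0x00 bytes right after the 0xFF 0xFF header too), it may be simpler to run this over the raw buffer before extracting the packet.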