I have a Windows application that sends logs in binary format.
The C# code that converts that data to strings is:
public static CounterSampleCollection Deserialize(BinaryReader binaryReader)
{
    string name = binaryReader.ReadString();      // counter name
    short valueCount = binaryReader.ReadInt16();  // number of counter values
    var sampleCollection = new CounterSampleCollection(name);
    for (int i = 0; i < valueCount; i++)
    {
        // each counter value consists of a timestamp + the actual value
        long binaryTimeStamp = binaryReader.ReadInt64();
        DateTime timeStamp = DateTime.FromBinary(binaryTimeStamp);
        float value = binaryReader.ReadSingle();
        sampleCollection.Add(new CounterSample(timeStamp, value));
    }
    return sampleCollection;
}
I have a Python UDP socket listening on that port, but I don't know how to convert the binary data I am receiving into strings so that I can parse it further.
Can any Python expert help me convert that C# function into a Python function, so that I can parse the data I receive in Python?
My code so far:
import socket

UDP_IP = "0.0.0.0"
UDP_PORT = 40001

sock = socket.socket(socket.AF_INET,     # Internet
                     socket.SOCK_DGRAM)  # UDP
sock.bind((UDP_IP, UDP_PORT))

while True:
    data, addr = sock.recvfrom(8192)  # buffer size is 8192 bytes
    print "[+] : ", data
    # this prints the binary data
    # how do I convert the data to strings?
I use struct to unpack binary data.
https://docs.python.org/2/library/struct.html
Here's an example I use to unpack data from a static file:
import struct

comp = open(traceFile, 'rb')
aData = comp.read()

s = struct.Struct('>' + ' i i i f f f d i H H')
sSize = s.size

for n in range(0, len(aData), sSize):
    print s.unpack(aData[n:n+sSize])
An example of reading from sockets is covered in the following:
http://www.binarytides.com/receive-full-data-with-the-recv-socket-function-in-python/
A snippet from that reference gives you some tools for writing the Python code you want. The snippet uses a try ... except clause and the sleep() function. The reference contains other nice tips. But the key to your question is that the binary data naturally converts to a Python string.
while 1:
    # recv something (total_data and begin are initialized before this loop in the reference)
    try:
        data = the_socket.recv(8192)
        if data:
            total_data.append(data)
            # change the beginning time for measurement
            begin = time.time()
        else:
            # sleep for some time to indicate a gap
            time.sleep(0.1)
    except:
        pass

# join all parts to make the final string
s = ''.join(total_data)  # join accepts type str, so the binary string is converted
After you have the string s, you need to parse it based on (1) the separator between data pairs, (2) the separator between the date and the value, and (3) the format of the date field. I do not know what your binary string looks like, so I will just sketch some code that you might use:
from datetime import datetime

results = []
pairs = s.split('\n')  # assume that the pairs are linefeed-separated
for pair in pairs:
    sdate, scount = pair.split(',')  # assume that a pair is separated by a comma
    timestamp = datetime.strptime(sdate, "%Y-%m-%d %H:%M:%S.%f")  # format must match sdate
    count = int(scount)
    results.append((timestamp, count))
# results now holds (timestamp, count) tuples
I received an output string from a camera which has a default format: <Ticket><length> CR LF <Ticket><content> CR LF
Example of this format: <ticket=0010>L<length> CR LF <ticket=0010><unique message ID>:<JSON content> CR LF
Result: 0010L000000045\r\n0010000500000:{"ID": 1034160761,"Index":1,"Name": "Pos 1"}\r\n
So basically, my output string has some preliminary objects and then a JSON string.
After I start my client side and connect it to the camera, which runs the server side of the TCP/IP connection, I received this output from the camera:
0000, 20)
(24.380417;13.144794;62.600601;51.364979;+0.491;;59.664135;77.126488;97.884323;115.346687;+0.464;;
, 99)
Here I always get two strings. The first string, if compared with the default format, has:
<Ticket><length>CR LF <Ticket>
and the second string, which is 99 bytes, should be JSON content according to the default format.
The second string contains the x, y, and z coordinates of two different regions of interest, which I want in integer format because the x and y coordinates are of a corner and I want to find those of the center. So for my application only the second string is of interest, and I used this code to eliminate the first string:
if (readByCount > 30)
{
    var output = (new string(buff).TrimEnd('\u0000'), readByCount);
    Debug.WriteLine(output);
}
Since the first string was less than 30 bytes, it was not displayed. I then tried splitting the second string first by ";;" and then by ";" and converting the parts to integers using this code:
public async void ReadDataAsync(TcpClient mClient)
{
    try
    {
        StreamReader clientStreamReader = new StreamReader(mClient.GetStream());
        char[] buff = new char[1024];
        int readByCount = 0;
        while (true)
        {
            readByCount = await clientStreamReader.ReadAsync(buff, 0, buff.Length);
            Directory.CreateDirectory("Camera o3D1");
            if (readByCount <= 0)
            {
                Console.WriteLine("Disconnected from Server.");
                mClient.Close();
                break;
            }
            if (readByCount > 30)
            {
                var output = new string(buff).TrimEnd('\u0000');
                Debug.WriteLine(output);
                var output1 = output.Split(new[] { ";;" }, StringSplitOptions.RemoveEmptyEntries)
                                    .Select(s => s.Split(';').Select(i => int.Parse(i)).ToArray())
                                    .ToArray();
                Console.WriteLine(output1);
            }
            Array.Clear(buff, 0, buff.Length);
        }
    }
    catch (Exception excp)
    {
        Console.WriteLine(excp.ToString());
    }
}
But I am getting an error saying: Input string was not in correct format.
I tried the above code to split a normal string like
string text = "3;4;5;6;7;;3;4;5;6;7;;3;4;5;6;7;;3;4;5;6;7;;3;4;5;6;7;;";
and I was successful, but it's not working with the JSON content. I believe the method that works for a normal string won't work with JSON content. What do I have to do to split each component of the JSON and change the data type to an integer so that I can do some maths with it?
"Input string was not in correct format" is the error generated by int.Parse(i) when the input string cannot be parsed as a valid integer.
You are parsing every string returned by the split. One of those strings is "0000L000000114", which cannot be parsed into an integer, hence your error.
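Note also that the values in the second string are decimals such as 24.380417, which int.Parse will reject as well. Below is a minimal sketch of a more forgiving parser; ParseRegions is a hypothetical helper name, it assumes the ";;"/";" layout shown in the question, and it uses double.TryParse to skip anything that is not a number:
using System;
using System.Globalization;
using System.Linq;

static class CameraPayloadParser
{
    // Split the ";;"-separated regions, then keep only the tokens that parse as
    // numbers, silently skipping headers such as "0000L000000114".
    public static double[][] ParseRegions(string payload)
    {
        return payload
            .Split(new[] { ";;" }, StringSplitOptions.RemoveEmptyEntries)
            .Select(region => region
                .Split(new[] { ';' }, StringSplitOptions.RemoveEmptyEntries)
                .Where(token => double.TryParse(token, NumberStyles.Float,
                                                CultureInfo.InvariantCulture, out _))
                .Select(token => double.Parse(token, CultureInfo.InvariantCulture))
                .ToArray())
            .Where(values => values.Length > 0)
            .ToArray();
    }
}
Once you have the doubles you can compute the center coordinates and round or cast them to integers at the end, if you really need integer values.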
I need to receive and transmit data with a serial port. I have no problem receiving and transmitting, but I do not see the received data correctly.
If I use another program, ComTestSerial, I see the correct data:
{STX}1H|\^&|||cobas6000^1|||||host|RSUPL^BATCH|P|1 P|1|||||||U||||||^
O|1| IANNETTA M
BIS|0^5016^1^^S1^SC|^^^480^|R||||||N||||1|||||||20191018113556|||F
C|1|I| ^ ^^^|G
R|1|^^^480/|11|U/L{ETB}A6
But if I use my C# program, with a RichTextBox or TextBox, I see this wrong data:
2497212492943812412412499111989711554484848944912412412412412410411111511612482838580769466658467721248012449138012449124124124124124124124851241241241241241249413791244912432323232323232327365787869848465327732667383124489453484954944994948349948367124949494525648941248212412412412412412478124124124124491241241241241241241245048495749484956494951535354124124124701367124491247312432323232323232323232323232323232323232323232323232323232323294323232323232323232323232323232323232323232323232329494941247113821244912494949452564847124494912485477623655413104.
I use this simple code (written by a colleague) to receive:
string cMsg = "";
while (this.ComPort.BytesToRead > 0)
{
    int nChar = this.ComPort.ReadChar();
    cMsg += nChar.ToString();
}
Thread.Sleep(100);
return cMsg;
It reads data from a serial connection that otherwise works perfectly.
What could be the problem?
You're converting a number to a string, so, say, when nChar is 2, the output will be a string "2", and when nChar is 49, the output will be "49".
So, the message begins with {STX}1. {STX} is an ASCII control code 2, and 1 is ASCII code 49. Thus the "wrong data" begins with "249".
Thus the data isn't wrong, and the code does exactly what you told it to; it just doesn't do what you intended :)
Instead of converting ASCII codes to strings, convert them to characters, and also use a StringBuilder to minimize the number of times the string is resized.
var message = new StringBuilder(ComPort.BytesToRead);
while (ComPort.BytesToRead > 0)
{
    message.Append((char)ComPort.ReadChar());
}
return message.ToString();
But you don't need to do any of it! SerialPort.ReadExisting does what you want:
return ComPort.ReadExisting();
Stylistic note: C# is not Java, and littering the code with this. is neither idiomatic nor necessary. Don't do it unless there's a good reason to.
From your code, it seems that you are getting an integer from your port, and when you call ToString() you just write that number as a string:
int nChar = this.ComPort.ReadChar();
cMsg += nChar.ToString();
This integer should be a 21-bit Unicode code point, so you can just use the Char.ConvertFromUtf32(Int32) method, which converts the integer to the actual character:
https://learn.microsoft.com/en-us/dotnet/api/system.char.convertfromutf32?view=netframework-4.8
Your full code should then look like this:
string cMsg = "";
while (this.ComPort.BytesToRead > 0)
{
    int nChar = this.ComPort.ReadChar();
    cMsg += Char.ConvertFromUtf32(nChar);
}
Thread.Sleep(100);
return cMsg;
I'm getting a string from the client side like this:
This is a face :grin:
And I need to convert the :grin: part to its Unicode character in order to send it to another service.
Any clue how to do that?
Here is a link to a quite good JSON file with the relevant information. It contains a huge array (about 1500 entries) of emojis, and we are interested in two properties: "short_name", which holds a name like "grin", and "unified", which contains the Unicode representation like "1F601".
I built a helper class to replace short names like ":grin:" with their unicode equivalent:
public static class EmojiParser {
    static readonly Dictionary<string, string> _colonedEmojis;
    static readonly Regex _colonedRegex;

    static EmojiParser() {
        // load the mentioned json from somewhere
        var data = JArray.Parse(File.ReadAllText(@"C:\path\to\emoji.json"));
        _colonedEmojis = data.OfType<JObject>().ToDictionary(
            // key the dictionary by coloned short names
            c => ":" + ((JValue)c["short_name"]).Value.ToString() + ":",
            c => {
                var unicodeRaw = ((JValue)c["unified"]).Value.ToString();
                var chars = new List<char>();
                // some characters are multi-byte in UTF32, split them
                foreach (var point in unicodeRaw.Split('-'))
                {
                    // parse hex to a 32-bit unsigned integer (UTF32 code point)
                    uint unicodeInt = uint.Parse(point, System.Globalization.NumberStyles.HexNumber);
                    // convert to bytes and get chars with the UTF32 encoding
                    chars.AddRange(Encoding.UTF32.GetChars(BitConverter.GetBytes(unicodeInt)));
                }
                // this is the resulting emoji
                return new string(chars.ToArray());
            });
        // build a huge regex (all 1500 emojis combined) by joining all names with OR ("|")
        _colonedRegex = new Regex(String.Join("|", _colonedEmojis.Keys.Select(Regex.Escape)));
    }

    public static string ReplaceColonNames(string input) {
        // replace each match using the dictionary
        return _colonedRegex.Replace(input, match => _colonedEmojis[match.Value]);
    }
}
Usage is obvious:
var target = "This is a face :grin: :hash:";
target = EmojiParser.ReplaceColonNames(target);
It's quite fast (except for the first run, because of the static constructor initialization). On your string it takes less than 1 ms (I was not able to measure it with a stopwatch; it always shows 0 ms). On a huge string that you will never meet in practice (1 MB of text) it takes 300 ms on my machine.
I'm trying to convert a huge hex string to a binary string, but an OverflowException keeps getting thrown. This is my code to convert an image file to a hex string (which, when used with a FlowDocument, works perfectly!):
string h = new System.Runtime.Remoting.Metadata.W3cXsd2001.SoapHexBinary(System.IO.File.ReadAllBytes(Path)).ToString();
Now, however, I want to take this hex string and convert it to a binary string so that it may also be displayed in the FlowDocument. First, I tried writing it to a temp text file and then attempting to read it into a byte array:
string TempPath = System.IO.Path.Combine(System.IO.Path.GetTempPath(), "Text.txt");
using (System.IO.StreamWriter sw = new System.IO.StreamWriter(TempPath))
{
    sw.WriteLine(Convert.ToString(Convert.ToInt64(h, 16), 2).PadLeft(12, '0'));
}
byte[] c = System.IO.File.ReadAllBytes(TempPath);
When that didn't work, I tried reading it into a string:
string c = System.IO.File.ReadAllText(TempPath);
Neither worked; both still throw an OverflowException. I have also tried just doing this, skipping writing to a file altogether:
string s = Convert.ToString(Convert.ToInt64(h, 16), 2).PadLeft(12, '0');
No matter what approach I take, I still get the exception. How are large strings like this normally handled?
Update
I've modified my algorithm to convert one character at a time, so now it looks like this:
string NewBinary = "";
try
{
    int i = 0;
    foreach (char c in h)
    {
        if (i == 100) break;
        NewBinary = string.Concat(NewBinary, Convert.ToString(Convert.ToInt64(c.ToString(), 16), 2).PadLeft(12, '0'));
        i++;
    }
}
The problem with this is that the string is always going to be super long and the code above takes a LONG time to generate the binary string. I limited the length to 100 to test conversion, so the conversion itself is not an issue.
An Int64 is represented by at most a 16-character hex string, which is why attempting to convert a "huge string" causes an OverflowException: the value is more than an Int64 can hold. You will need to break the string up into groups of at most 16 characters, convert those to binary, and concatenate them (a sketch of this appears after the lookup-table example below).
You could convert a nibble at a time using a lookup array, for example:
public static string HexStringToBinaryString(string hexString)
{
    var result = new StringBuilder();
    string[] lookup =
    {
        "0000", "0001", "0010", "0011",
        "0100", "0101", "0110", "0111",
        "1000", "1001", "1010", "1011",
        "1100", "1101", "1110", "1111"
    };
    foreach (char nibble in hexString.Select(char.ToUpper))
        result.Append((nibble > '9') ? lookup[10 + nibble - 'A'] : lookup[nibble - '0']);
    return result.ToString();
}
Convert each hex character of the string into its corresponding binary pattern (e.g. A becomes 1010, etc.).
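For the chunking suggestion above (groups of at most 16 hex digits), a minimal sketch could look like this; HexToBinaryInChunks is a hypothetical name and it needs using System and using System.Text:
public static string HexToBinaryInChunks(string hexString)
{
    var result = new StringBuilder();
    for (int i = 0; i < hexString.Length; i += 16)
    {
        // take at most 16 hex digits so the value fits in 64 bits
        string chunk = hexString.Substring(i, Math.Min(16, hexString.Length - i));
        ulong value = Convert.ToUInt64(chunk, 16);
        // pad to 4 bits per hex digit so leading zeros are not lost
        result.Append(Convert.ToString(unchecked((long)value), 2).PadLeft(chunk.Length * 4, '0'));
    }
    return result.ToString();
}
Both approaches produce the same bit string (4 bits per hex digit, leading zeros preserved); the lookup-table version simply avoids the 64-bit conversions.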
I'm trying to recreate the functionality of
slappasswd -h {md5}
on .NET.
I have this code in Perl:
use Digest::MD5;
use MIME::Base64;
$ctx = Digest::MD5->new;
$ctx->add('fredy');
print "Line $.: ", $ctx->clone->hexdigest, "\n";
print "Line $.: ", $ctx->digest, "\n";
$hashedPasswd = '{MD5}' . encode_base64($ctx->digest,'');
print $hashedPasswd . "\n";
I've tried to do the same in VB.NET, C#, etc., but I can only get the
$ctx->clone->hexdigest # result : b89845d7eb5f8388e090fcc151d618c8
part working, in C#, using the MSDN sample:
static string GetMd5Hash(MD5 md5Hash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = md5Hash.ComputeHash(Encoding.UTF8.GetBytes(input));

    // Create a new StringBuilder to collect the bytes
    // and create a string.
    StringBuilder sBuilder = new StringBuilder();

    // Loop through each byte of the hashed data
    // and format each one as a hexadecimal string.
    for (int i = 0; i < data.Length; i++)
    {
        sBuilder.Append(data[i].ToString("x2"));
    }

    // Return the hexadecimal string.
    return sBuilder.ToString();
}
With this code in a console app:
string source = "fredy";
using (MD5 md5Hash = MD5.Create())
{
    string hash = GetMd5Hash(md5Hash, source);
    Console.WriteLine("The MD5 hash of " + source + " is: " + hash + ".");
}
it outputs: The MD5 hash of fredy is: b89845d7eb5f8388e090fcc151d618c8.
But I need to implement the $ctx->digest part, which outputs some binary data like
¸˜E×ë_ƒˆàüÁQÖÈ
This output is the same on Linux and Windows with Perl.
Any ideas?
Thanks
As I already said in my comment above, you are mixing some things up. What the digest in Perl creates is a set of bytes. When those are printed, Perl automatically converts them to a string representation, because (simplified) it assumes that if you print something, it goes to a screen and you want to be able to read it. C# does not do that. That doesn't mean the Perl digest and the C# digest are not the same; just their representation is different.
You have already established that they are equal if you convert both of them to a hexadecimal representation.
Now what you need to do to get output in C# that looks like the string that Perl prints when you do this:
print $ctx->digest; # output: ¸˜E×ë_ƒˆàüÁQÖÈ
... is to convert the C# byte[] data to a string of characters.
That has been answered before, for example here: How to convert byte[] to string?
Using that technique, I believe your function to get it would look like this. Please note I am a Perl developer and I have no means of testing this. Consider it C#-like pseudo-code.
static string GetMd5PerlishString(MD5 md5Hash, string input)
{
    // Convert the input string to a byte array and compute the hash.
    byte[] data = md5Hash.ComputeHash(Encoding.UTF8.GetBytes(input));

    // Interpret the digest bytes as text. Note that digest bytes are arbitrary, so
    // with UTF-8 any invalid sequences become U+FFFD; a single-byte encoding such as
    // Encoding.GetEncoding("ISO-8859-1") may match Perl's raw-byte output more closely.
    string result = System.Text.Encoding.UTF8.GetString(data);
    return result;
}
Now it should look the same.
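If what you ultimately need is the '{MD5}' . encode_base64($ctx->digest, '') value from your Perl script rather than the printable bytes, that is Base64 rather than hex. A minimal sketch (GetLdapMd5Hash is a hypothetical name; it reuses the same UTF-8 input encoding as the samples above):
using System;
using System.Security.Cryptography;
using System.Text;

static class LdapMd5
{
    // "{MD5}" prefix plus the Base64 of the raw digest bytes, which is what the
    // Perl encode_base64($ctx->digest, '') line builds.
    public static string GetLdapMd5Hash(string input)
    {
        using (MD5 md5 = MD5.Create())
        {
            byte[] digest = md5.ComputeHash(Encoding.UTF8.GetBytes(input));
            return "{MD5}" + Convert.ToBase64String(digest);
        }
    }
}
For the example input fredy (hex digest b89845d7eb5f8388e090fcc151d618c8 as shown above), this should produce {MD5}uJhF1+tfg4jgkPzBUdYYyA==.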
Please also note that MD5 is no longer a secure hashing algorithm for passwords. Please do not use it to store user passwords!