I want to read data from a weighing scale via RS232, and I have tried several approaches.
My weighing scale is model YH-T7E (datasheet).
The output of the scale in the AccessPort program is the value below.
Output in AccessPort:
The weight on the scale = 3.900 kg
In the picture: =009.300
Baud rate: 1200
When I use this code
string s = "";
int num8 = 0;
string RST = "";
while (this.serialPort1.BytesToRead > 0)
{
    string data = serialPort1.ReadExisting();
    if (data != null)
    {
        if (data.ToString() != "")
        {
            if (data.Length > 6)
            {
                RST = data.Substring(6, 1) + data.Substring(5, 1) + data.Substring(4, 1) + data.Substring(3, 1) + data.Substring(2, 1);
                this.textBox4110.Text = RST.ToString();
            }
        }
    }
}
Output in my program:
When I use the above code in the program, it sometimes displays the weight and sometimes does not; I have to open and close the program several times.
Also, when I change the weight on the scale, the number in the program does not change and the displayed value stays fixed.
And when I use this code
while (this.serialPort1.BytesToRead > 0)
{
    int data = serialPort1.ReadByte();
    this.textBox4110.Text = data.ToString();
}
in my program it displays the number 48.
What should I do?
Thanks and regards.
I don't know why your serial port sometimes responds and sometimes doesn't.
I worked with RS232 years ago and never had this problem.
About your other questions:
You're working with bytes, so you can't just call ToString on them: that converts the numeric byte value to a string ("48"), not the character it represents ("0").
If you have to reverse the byte order (4 - 3 - 2 - 1), you can call the Array.Reverse method.
Just to give an example of what I mean, I took your code:
while (this.serialPort1.BytesToRead > 0)
{
    int data = serialPort1.ReadByte();
    this.textBox4110.Text = data.ToString();
}
your "data" variable contains a byte with value 48 that is the 0 char in ASCII table.
So, if you want the char, you have to convert it using the right encoding.
Suppose you are working with UTF8:
while (this.serialPort1.BytesToRead > 0)
{
    var dataLen = this.serialPort1.BytesToRead;
    var byteArray = new byte[dataLen];
    this.serialPort1.Read(byteArray, 0, dataLen);
    var txt = Encoding.UTF8.GetString(byteArray);
    this.textBox4110.Text = txt;
}
Honestly, I only know that Encoding.UTF8.GetString accepts a byte array; I haven't tried it with just a single byte, but it should work the same way...
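Putting the two points together, here is a minimal sketch of how the weight could be parsed, assuming the scale sends its frame with the digits reversed (as in the AccessPort capture, "=009.300" for 3.900 kg) and that a complete frame is available when DataReceived fires; the encoding and the trailing '=' handling are assumptions you may need to adjust:

// Minimal sketch; requires using System, System.Text and System.IO.Ports.
private void serialPort1_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
    int len = serialPort1.BytesToRead;
    var buffer = new byte[len];
    serialPort1.Read(buffer, 0, len);

    // The scale sends plain digit characters, so decode the bytes as ASCII text.
    string frame = Encoding.ASCII.GetString(buffer).Trim();

    // Reverse the character order with Array.Reverse: "=009.300" -> "003.900="
    char[] chars = frame.ToCharArray();
    Array.Reverse(chars);
    string weightText = new string(chars).Trim('=');

    // The TextBox must be updated on the UI thread, because DataReceived
    // runs on a background thread.
    this.BeginInvoke((Action)(() => this.textBox4110.Text = weightText));
}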
I have some code which does a binary search over a file with sorted hex values (SHA1 hashes) on each line. This is used to search the HaveIBeenPwned database. The latest version contains a count of the number of times each password hash was found, so some lines have extra characters at the end, in the format ':###'.
The length of this additional count isn't fixed, and it isn't always present. This causes the buffer to read incorrect values and fail to find hashes that actually exist.
Current code:
static bool Check(string asHex, string filename)
{
    const int LINELENGTH = 40; //SHA1 hash length
    var buffer = new byte[LINELENGTH];
    using (var sr = File.OpenRead(filename))
    {
        //Number of lines
        var high = (sr.Length / (LINELENGTH + 2)) - 1;
        var low = 0L;
        while (low <= high)
        {
            var middle = (low + high + 1) / 2;
            sr.Seek((LINELENGTH + 2) * ((long)middle), SeekOrigin.Begin);
            sr.Read(buffer, 0, LINELENGTH);
            var readLine = Encoding.ASCII.GetString(buffer);
            switch (readLine.CompareTo(asHex))
            {
                case 0:
                    return true;
                case 1:
                    high = middle - 1;
                    break;
                case -1:
                    low = middle + 1;
                    break;
                default:
                    break;
            }
        }
    }
    return false;
}
My idea is to seek forward from the middle until a newline character is found, then seek backwards from that point, which should give me a complete line that I can split on the ':' delimiter. I then compare the first part of the split string array, which should be just a SHA1 hash.
I think this should still centre on the correct value; however, I am wondering if there is a neater way to do this. If the midpoint isn't the actual midpoint between the end-of-line characters, should it be adjusted before the high and low values are?
I THINK this may be a simpler (and faster) solution, without the backtracking to the beginning of the line. I think you can just use byte file indexes instead of trying to work with a full record/line. Because the middle index will not always be at the start of a line/record, the first ReadLine can return a partial line/record. If you then immediately do a second ReadLine, you get a full line/record. It isn't quite optimal, because you are actually comparing a little ahead of the middle index.
I downloaded pwned-passwords-update-1 and pulled out about 30 records at the start, end, and middle, and it seemed to find them all. What do you think?
const int HASHLENGTH = 40;

static bool Check(string asHex, string filename)
{
    using (var fs = File.OpenRead(filename))
    {
        var low = 0L;
        // We don't need to start at the very end
        var high = fs.Length - (HASHLENGTH - 1); // EOF - 1 HASHLENGTH
        StreamReader sr = new StreamReader(fs);
        while (low <= high)
        {
            var middle = (low + high + 1) / 2;
            fs.Seek(middle, SeekOrigin.Begin);
            // Resync with base stream after seek
            sr.DiscardBufferedData();
            var readLine = sr.ReadLine();
            // 1) If we are NOT at the beginning of the file, we may have only read a partial line,
            //    so read again to make sure we get a full line.
            // 2) No sense reading again if we are at the EOF.
            if ((middle > 0) && (!sr.EndOfStream)) readLine = sr.ReadLine() ?? "";
            string[] parts = readLine.Split(':');
            string hash = parts[0];
            // By default string comparison is culture-sensitive, which may not be what we want,
            // so do an ordinal compare (0-9 < A-Z < a-z).
            int compare = String.Compare(asHex, hash, StringComparison.Ordinal);
            if (compare < 0)
            {
                high = middle - 1;
            }
            else if (compare > 0)
            {
                low = middle + 1;
            }
            else
            {
                return true;
            }
        }
    }
    return false;
}
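A hypothetical call, assuming the uppercase-hex convention of the HaveIBeenPwned files (the hash shown is the SHA-1 of "password"; the file path is made up):

bool pwned = Check("5BAA61E4C9B93F3F0682250B6CF8331B7EE68FD8",
                   @"C:\data\pwned-passwords-update-1.txt");
Console.WriteLine(pwned ? "Found" : "Not found");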
My way of solving this was to create a new binary file containing the hashes only: 16 bytes per hash and a faster binary search. (I don't have the 50 rep needed to leave this as a comment.)
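A minimal sketch of that idea, written as my own guess at what the comment describes rather than the commenter's actual code: it keeps the full 20 bytes of each SHA-1 (the comment mentions 16 bytes per hash, which would be a truncated variant), and byte-wise comparison preserves the ordinal order of the uppercase hex lines, so the records stay sorted.

// Sketch only: convert the text file to fixed-size binary records,
// then binary-search those records. Requires using System and System.IO.
const int RECORDSIZE = 20; // SHA-1 = 160 bits = 20 bytes

static void ConvertToBinary(string textFile, string binFile)
{
    using (var output = File.Create(binFile))
    {
        foreach (var line in File.ReadLines(textFile))
        {
            var hex = line.Split(':')[0]; // drop the ":count" suffix
            for (int i = 0; i < RECORDSIZE; i++)
                output.WriteByte(Convert.ToByte(hex.Substring(i * 2, 2), 16));
        }
    }
}

static bool CheckBinary(string asHex, string binFile)
{
    var target = new byte[RECORDSIZE];
    for (int i = 0; i < RECORDSIZE; i++)
        target[i] = Convert.ToByte(asHex.Substring(i * 2, 2), 16);

    var buffer = new byte[RECORDSIZE];
    using (var fs = File.OpenRead(binFile))
    {
        long low = 0, high = fs.Length / RECORDSIZE - 1;
        while (low <= high)
        {
            long middle = low + (high - low) / 2;
            fs.Seek(middle * RECORDSIZE, SeekOrigin.Begin);
            fs.Read(buffer, 0, RECORDSIZE);

            // Compare records byte by byte; they are sorted because the source file is.
            int cmp = 0;
            for (int i = 0; i < RECORDSIZE && cmp == 0; i++)
                cmp = target[i].CompareTo(buffer[i]);

            if (cmp == 0) return true;
            if (cmp < 0) high = middle - 1; else low = middle + 1;
        }
    }
    return false;
}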
I have an application which programs firmware via an ST-LINK and then produces a report on the buffer size, data bits, etc.
I have a text file which stores these results. However, there may be a lot of circuit boards in one batch, and each board needs to increase the product number by one. So what I want to do is get the program to look at the last board report, see if it has the same batch number (as a batch will all be done at once), and if it does, increase the product number by 1. If it doesn't have the same batch number, then it must be a new batch and the product number will be 1.
At the moment the product number is not updating: every time it returns 1.
Here is my code:
public int previousNumber()
{
    int pNumber = 0;
    string line; //set string
    int counter = 0; //create int
    int numberOfLines = File.ReadLines("report.txt").Count();
    System.IO.StreamReader file = new System.IO.StreamReader("report.txt"); //create streamreader
    while ((line = file.ReadLine()) != null) //until no empty lines
    {
        string[] allLines = File.ReadAllLines("report.txt"); //read in report file
        if (allLines[numberOfLines - 9] == batchNumberTextBox.Text)
        {
            pNumber = int.Parse(allLines[numberOfLines - 7]);
        }
        else
        {
            pNumber = 0;
        }
    }
    file.Close();
    pNumber = pNumber + 1;
    return pNumber;
}
private void saveReport()
{
    getValues();
    int number = previousNumber();
    BatchNumber = batchNumberTextBox.Text;
    SerialNumber = serialNumberTextBox.Text;
    ProductNumber = number;
    string ProductNumberString = ProductNumber.ToString();
    string inDate = DateTime.Now.ToString("f",
        CultureInfo.CreateSpecificCulture("en-UK")); //set date in that format
    try
    {
        board newBoard = new board(BatchNumber, SerialNumber, ProductNumberString, BufferSize, StopBits, Parity, DataBits, baudRate, inDate);
        newBoard.Save("report.txt");
        File.AppendAllText("batches.txt", "BATCH NUMBER: " + BatchNumber + " - DATE: " + inDate + Environment.NewLine);
        System.Windows.MessageBox.Show("Report Saved");
    }
    catch
    {
        System.Windows.MessageBox.Show("Save failed"); //tell user save failed
    }
}
And here is the text file for the reports:
Can you see why it may not be working? I feel like I may have gone about this in a weird way, so if you can think of a better approach it would be much appreciated!
Thank you in advance,
Lucy
I take it the batch number is 1234 in your data sample. If so, its offset from the last line is -10, not -9. This is the primary source of your error.
There is also a lot of redundancy in your code: you read the whole file several times, while reading it once and storing all its lines in an array would be quite enough.
A simplified (but still correct) version of previousNumber() may look like this:
public int previousNumber()
{
    var allLines = File.ReadAllLines("report.txt");
    int pNumber = 0;
    if (allLines.Length > 10 && allLines[allLines.Length - 10] == batchNumberTextBox.Text)
    {
        // Note: if the desired value is "1" and not "4096", then the offset is "-8".
        int.TryParse(allLines[allLines.Length - 8], out pNumber);
    }
    return pNumber + 1;
}
My problem is as follows:
Even after doing the LSB replacement after the quantization step, I still get errors and changes on the detection side. For strings, letters get changed, and for bitmaps the image isn't readable, as deduced from getting "Parameter is not valid". I've tried a lot of debugging and I just can't figure it out.
My goal is pretty simple: insert a set of bits (from either a string or a Bitmap) into a JPEG image, save it, and be able to detect and extract that set of bits back to its original form. I've been successful with BMP and PNG, as there is no compression there, but JPEG is another story. By the way, I'm doing LSB replacement.
I understand what I need to do: apply the LSB replacement after the DCT coefficients have been quantized. For that purpose I have been using a JPEG encoder and modified what I needed in the appropriate spot.
I modified the method EncodeImageBufferToJpg to convert a string or bitmap into a bit array (int[]) and then do LSB replacement on one coefficient per block for each channel Y, Cb, Cr.
Here is my modified method for EncodeImageBufferToJpg, plus the Detection+Process method I use to reconstruct the message: Link Here.
For the Y channel, for example, in encoding:
Int16[] DCT_Quant_Y = Do_FDCT_Quantization_And_ZigZag(Y_Data, Tables.FDCT_Y_Quantization_Table);

if (!StegoEncodeDone)
{
    // We clear the LSB to 0
    DCT_Quant_Y[DCIndex] -= Convert.ToInt16(DCT_Quant_Y[DCIndex] % 2);
    // We add the bit to the LSB
    DCT_Quant_Y[DCIndex] += Convert.ToInt16(MsgBits[MsgIndx]);
    // Ys for debug print
    Ys.Add(DCT_Quant_Y[DCIndex]);

    MsgIndx++;
    if (MsgIndx >= MsgBits.Length) StegoEncodeDone = true;
}

DoHuffmanEncoding(DCT_Quant_Y, ref prev_DC_Y, Tables.Y_DC_Huffman_Table, Tables.Y_AC_Huffman_Table, OutputStream);
and in detection:
Int16[] DCT_Quant_Y = Do_FDCT_Quantization_And_ZigZag(Y_Data, Tables.FDCT_Y_Quantization_Table);

// SteganoDecode *********************************************
if (!StegoDecodeDone)
{
    int Dtt = Math.Abs(DCT_Quant_Y[DCIndex] % 2);
    int DYY = Y_Data[DCIndex];
    int DDCTYYB = DCT_Quant_Y[DCIndex];
    Ys.Add(DCT_Quant_Y[DCIndex]);
    // If the DCT coefficient is negative, % would return -1, but the bit must be 0 or 1, so take the absolute value
    charValue = charValue * 2 + Math.Abs(DCT_Quant_Y[DCIndex] % 2);
    ProcessStaganoDecode();
}
// End *********************************************************

DCT_Quant_Y.CopyTo(Y, index);
public void ProcessStaganoDecode()
{
    Counter++;
    cc++;
    if (IDFound) MsgBits.Add(charValue % 2);
    else IDBits.Add(charValue % 2);

    if (Counter == 8)
    {
        // If we find a '-' we increment, otherwise we reset to 0, because there have to be 3 consecutive '-' characters ("---")
        char ccs = (char)reverseBits(charValue);
        if (((char)reverseBits(charValue)) == '-')
        {
            SepCounter++;
        }
        else SepCounter = 0;

        if (SepCounter >= 3)
        {
            if (IDFound)
            {
                MsgBits.RemoveRange(MsgBits.Count - 3 * 8, 3 * 8);
                StegoDecodeDone = MarqueFound = true;
            }
            else
            {
                IDFound = true;
                IDBits.RemoveRange(IDBits.Count - 3 * 8, 3 * 8);
                string ID = BitToString(IDBits);
                IDNum = Convert.ToInt16(BitToString(IDBits));
                Console.WriteLine("ID Found : " + IDNum);
            }
            SepCounter = 0;
        }
        charValue = 0;
        Counter = 0;
    }
}
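As an aside on the embed/extract arithmetic above: the clear-then-add in the encoder just forces the parity of the chosen coefficient to equal the message bit, and Math.Abs(coefficient % 2) in the decoder reads it back. A hypothetical helper restating that idea (EmbedBit is a made-up name, not part of the encoder):

// Restates the clear-then-add arithmetic: force the coefficient's parity to
// match the message bit. It also works for negative coefficients, because
// x % 2 is -1 for negative odd x and the subtraction still lands on an even value.
static Int16 EmbedBit(Int16 coefficient, int bit)
{
    coefficient -= Convert.ToInt16(coefficient % 2); // clear the LSB (make the value even)
    coefficient += Convert.ToInt16(bit);             // add the message bit (0 or 1)
    return coefficient;
}
// On the detection side the bit is then recovered as Math.Abs(coefficient % 2).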
All the code is in the class BaseJPEGEncoder.
Here's the VS 2015 C# project so you can check the rest of the classes, etc. I can only put 2 links, so sorry I couldn't include the original: Here. I got the original encoder from "A simple JPEG encoder in C#" on CodeProject.
I've read some answers to other questions from two people, Sneftel and Reti43, and I would love to get their attention so they can give me some help if they can; I couldn't find a way to contact them.
I have a file laid out like this:
10 NDI 27 2477 6358 4197 -67 0 VVFAˆ ÿÿÿÿ
The last column is binary.
I have to read this file. The problem is that I cannot read it as text, because in some lines the last column contains a newline character, and so I wouldn't read the entire line.
So I should read it as a binary file, but then how can I retrieve the first and the third columns?
I tried reading the bytes this way:
byte[] lines1 = System.IO.File.ReadAllBytes("D:\\dynamic\\ap1_dynamic\\AP_1.txt");
And then converting them into strings with
for (i = 0; i < lines1.Length; i++) {
    Convert.ToString(lines1[i], 2);
}
but then it reads everything as 0s and 1s. I would like to read the first 8 columns as text and the last one as binary.
I am using Visual Studio 2013, C#.
Reading the file as binary is correct, as you can convert part of the binary data to text. In this context, binary means bytes.
Converting the bytes to binary is not what you want to do. In that context, binary means a text representation in base 2, but you don't want a text representation of the data.
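To illustrate the difference with a single made-up byte value:

byte b = 0x41;                                            // the byte for the character 'A'
string base2 = Convert.ToString(b, 2);                    // "1000001" -> base-2 text, not what you want
string text = Encoding.UTF8.GetString(new byte[] { b });  // "A" -> the character itself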
If the lines are fixed length, you can do something like this to read the values:
int lineLen = 70;   // correct this with the actual length
int firstPos = 0;
int firstLen = 3;   // correct with actual length
int thirdPos = 15;  // correct with actual position
int thirdLen = 3;   // correct with actual length
int lastPos = 60;   // correct with actual position
int lastLen = 10;   // correct with actual length

int lineCount = lines1.Length / lineLen;

for (int i = 0; i < lineCount; i++) {
    int first = Int32.Parse(Encoding.UTF8.GetString(lines1, i * lineLen + firstPos, firstLen).Trim());
    int third = Int32.Parse(Encoding.UTF8.GetString(lines1, i * lineLen + thirdPos, thirdLen).Trim());
    byte[] last = new byte[lastLen];
    Array.Copy(lines1, i * lineLen + lastPos, last, 0, lastLen);
    // do something with the data in first, third and last
}
I have a base64 string on the view side. If I pass the whole base64 string at once, I can convert it to bytes like this:
byte[] myBinary = Convert.FromBase64String(data);
where data represents the data coming from the view page. But I have a huge amount of data, so I am splitting it on the view page like this:
var arr = [];
for (var i = 0; i < data.length - 1; i += 1000000) {
    arr.push(data.substr(i, 1000000));
}
And now I am passing the data to the controller
for (var x = 0; x < arr.length; x++) {
    if (x = 0) {
        r = "first";
    }
    else if (x = arr.length - 1) {
        r = "last";
    }
    else {
        r = "next";
    }
    $.post('/Home/Content', { content: e, data: r }, function (d) {
    });
}
And on the controller side I have written this code:
public JsonResult Content(string content, string data)
{
    datavalueincont += content;
    if (data == "last")
    {
        byte[] myBinary = Convert.FromBase64String(datavalueincont);
        var fname = "D://sri//data.mp4";
        FileStream stream = new FileStream(fname, FileMode.Create, FileAccess.Write);
        System.IO.BinaryWriter br = new System.IO.BinaryWriter(stream);
        br.Write(myBinary);
        br.Close();
        read.Close();
        stream.Close();
    }
    return Json("suc", JsonRequestBehavior.AllowGet);
}
But I am getting an error at:
byte[] myBinary = Convert.FromBase64String(datavalueincont);
and the error is:
The input is not a valid Base-64 string as it contains a non-base 64 character, more than two padding characters, or an illegal character among the padding characters.
How can I rectify this? If I pass all the data at once, I am able to get the bytes into the myBinary array. I hope you understand my question.
I have an idea.
As you are sending your data using Ajax, nothing guarantees that your chunks will be sent (or received) sequentially.
So maybe when you aggregate your data, the chunks are not in the right order.
Try making your Ajax calls sequentially to confirm this point.
[Edit]
something like this (not tested):
var data = []; // your data
var sendMoreData = function (firstTime) {
    if (data.length == 0)
        return; // no more data to send
    var content = data.shift();
    var r = firstTime ? "first" :
            data.length == 0 ? "last" :
            "next";
    $.post('/Home/Content', { content: content, data: r }, function (d) {
        sendMoreData();
    });
};
sendMoreData(true);
You can't use byte[] myBinary = Convert.FromBase64String(datavalueincont); until you have the complete base64 string.
The problem is that you're splitting the base64 data into chunks and sending those chunks to the server -> on the server you're trying to convert back from base64 on each individual chunk rather than on the whole collection of chunks.
The way I see it, you have 2 options:
Encode each individually split chunk of data to base64 (rather than the whole thing beforehand) and decode it on the server.
Encode the whole thing, then split it into pieces (like you're doing now) -> send it to the server -> cache each piece (any way you want -> session, db, etc.) until you get the last one -> decode all of it at once (see the sketch below).
As a side note:
if (x = 0) {
    r = "first";
}
else if (x = arr.length - 1) {
    r = "last";
}
should really be:
if (x == 0) {
    r = "first";
}
else if (x == arr.length - 1) {
    r = "last";
}
Not sure if typo, just sayin'.
I think your concept is fine... from what I understand you are doing the following:
View converts binary data to a Base64 string
View splits the string into chunks and sends them to the controller
Controller waits for all chunks and concatenates them
Controller converts back from the Base64 string
The problem is in how you are splitting your data in the view... I am assuming the splitting code leaves some extra padding characters on the end, maybe?
var arr = [];
for (var i = 0; i < data.length - 1; i += 1000000) {
    arr.push(data.substr(i, 1000000));
}
I can't build a test rig to check the code, but certainly on your last chunk of text you can't get 1000000 characters from .substr, because there aren't that many characters left in the string. I don't know exactly what .substr will return there, but I would troubleshoot the splitting section of the code to find the problem.
Are you sure that datavalueincont += content; is really aggregating all your data? How do you store datavalueincont between HTTP requests?
Maybe that is all you are missing.
Have you debugged when data == "last" to see if you have all your data in datavalueincont?