.NET SerialPort Write() adding characters and moving bytes - c#

I am developing a firmware upgrade app in C# using VisualStudio 2015. The app uses a XAML-based form and C# code. The target of the firmware upgrade is a custom board running an STM32F417 processor (ARM Cortex M4). The C# code connects with the 417, causes the 417 to jump to its bootloader (factory installed) and uses the bootloader API to load the new firmware.
After every transfer of 256 bytes (the maximum transfer allowed), a checksum of the data and the number of bytes is sent, and an ACK byte is received from the 417 to show that everything checked out for that transfer. The transfer would fail after a couple of 256-byte blocks, so to test the transfer I set up the C# code to just send the .hex file without any of the other checks.
I set the code to transfer the updated firmware as a .hex file, line by line. It writes the data out over a serial line, and the output is captured as text in HyperTerminal.
I implemented it as a write of 256 bytes:
byte[] buffer256 = new byte[256];
// Read the file and process one line at a time
System.IO.StreamReader file = new System.IO.StreamReader(path);
while ((line = file.ReadLine()) != null)
{
    line = line.Substring(1, line.Length - 1);
    // Create byte array from string
    var bytes = GetBytesFromByteString(line).ToArray();
    // Assign values to variables from byte array
    type = bytes[3];
    // BLOCK WRITE TO MEMORY
    if (type == 0) // data type
    {
        length = bytes[0];
        if (byteCounter >= 255)
        {
            _serialPort.Write(buffer256, 0, 256);
            Array.Clear(buffer256, 0, buffer256.Length);
            byteCounter = 0;
        }
        for (int i = 0; i < length; i++)
        {
            buffer256[byteCounter++] = bytes[4 + i];
        }
    } // end BLOCK WRITE TO MEMORY
    counter++;
} // end WHILE loop for loading hex file
file.Close();
And also as a write of single bytes:
byte[] buffer256 = new byte[256];
// Read the file and process one line at a time
System.IO.StreamReader file = new System.IO.StreamReader(path);
while ((line = file.ReadLine()) != null)
{
    line = line.Substring(1, line.Length - 1);
    // Create byte array from string
    var bytes = GetBytesFromByteString(line).ToArray();
    // Assign values to variables from byte array
    type = bytes[3];
    if (type == 0)
    {
        length = bytes[0];
        for (int i = 0; i < length; i++)
        {
            _serialPort.Write(bytes, 4 + i, 1);
        } // end FOR loop
    } // end SINGLE BYTE WRITING
    counter++;
} // end WHILE loop for loading hex file
file.Close();
In both cases, the hex file arrives altered when it is written using the SerialPort class. I tried various other methods, even WriteAsync on the BaseStream (the underlying Stream on which the SerialPort class is built).
Here is a sample (8 words) of the original .hex file (data only) compared to the received data:
Original:
00F8012B 002AF9D1 18467047 70B50646
2DF0FEFC 04680546 0A220021 304600F0
Received:
00F8012B 002AF9D1 18464670 4770B506
462DF0FE FC046805 460A2200 21304600
These are lines 49 and 50 out of 13391. The checksums for some of the blocks were also checked and were wrong. I'm pretty sure that is why the firmware upgrades are failing — the bootloader is calculating a checksum based on what it is receiving and fails when it compares it to the checksum the C# app is calculating and sending over.
My question is: if the SerialPort (and underlying Stream) is unreliable, what should I replace it with and how? Is it possible to just replace the SerialPort class, maybe with a C++ class? Or would it be better to rewrite the entire app in C++ or something else?

The SerialPort class itself isn't unreliable. I've used it successfully on the desktop and numerous CE devices. The problem could be the hardware on either end, one of the OSes, some fiddly configuration value, or some combination of any of those.
It could also be your code. I'm not sure you're doing anything wrong, but reading binary data into a string using ReadLine seems pretty weird to me. Have you verified bytes[] is correct in a debugger?
PS--If one of the hardware issues is related to timing, switching to C++ can be helpful. Interop is pretty painless for something like this, so I wouldn't bother rewriting the whole app, but I don't know your situation so couldn't say for sure.
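Before reaching for C++, one cheap way to act on the "verify bytes[] in a debugger" suggestion is to mirror every block to a local file at the moment it is handed to the port, then diff that dump against the HyperTerminal capture. That shows whether the shift happens before SerialPort.Write is ever called or somewhere downstream. A minimal sketch, assuming the _serialPort field from the question; the WriteBlock helper and the block.bin path are illustrative, not part of the original app:
// Sketch only: logs the exact bytes handed to SerialPort.Write so they can be
// compared byte-for-byte against the HyperTerminal capture.
private void WriteBlock(byte[] buffer, int count)
{
    using (var log = new System.IO.FileStream(@"block.bin", System.IO.FileMode.Append))
    {
        log.Write(buffer, 0, count);      // what the app thinks it is sending
    }
    _serialPort.Write(buffer, 0, count);  // what actually goes out the port
}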

Related

Convert.ToBase64String throws 'System.OutOfMemoryException' for byte [] (file: large size)

I am trying to convert a byte[] to Base64 string format so that I can send that information to a third party. My code is as below:
byte[] ByteArray = System.IO.File.ReadAllBytes(path);
string base64Encoded = System.Convert.ToBase64String(ByteArray);
I am getting the below error:
Exception of type 'System.OutOfMemoryException' was thrown.
Can you help me please?
Update
I just spotted @PanagiotisKanavos' comment pointing to Is there a Base64Stream for .NET?. This does essentially the same thing as my code below attempts to achieve (i.e. it allows you to process the file without having to hold the whole thing in memory in one go), but without the overhead/risk of self-rolled code, instead using a standard .NET library method for the job.
Original
The below code will create a new temporary file containing the Base64 encoded version of your input file.
This should have a lower memory footprint, since rather than doing all data at once, we handle it several bytes at a time.
To avoid holding the output in memory, I've pushed that back to a temp file, which is returned. When you later need to use that data for some other process, you'd need to stream it (i.e. so that again you're not consuming all of this data at once).
You'll also notice that I've used WriteLine instead of Write, which will introduce non-Base64-encoded characters (i.e. the line breaks). That's deliberate, so that if you consume the temp file with a text reader you can easily process it line by line.
However, you can amend per your needs.
void Main()
{
    var inputFilePath = @"c:\temp\bigfile.zip";
    var convertedDataPath = ConvertToBase64TempFile(inputFilePath);
    Console.WriteLine($"Take a look in {convertedDataPath} for your converted data");
}

// inputFilePath = where your source file can be found. This is not impacted by the below code.
// bufferSizeInBytesDiv3 = how many bytes to read at a time (divided by 3); the larger this value,
// the more memory is required, but the better the performance. The Div3 part is because we later
// multiply this by 3; this ensures we never have to deal with remainders (since 3 bytes = 4 base64 chars).
public string ConvertToBase64TempFile(string inputFilePath, int bufferSizeInBytesDiv3 = 1024)
{
    var tempFilePath = System.IO.Path.GetTempFileName();
    using (var fileStream = File.Open(inputFilePath, FileMode.Open))
    {
        using (var reader = new BinaryReader(fileStream))
        {
            using (var writer = new StreamWriter(tempFilePath))
            {
                byte[] data;
                while ((data = reader.ReadBytes(bufferSizeInBytesDiv3 * 3)).Length > 0)
                {
                    // NB: using WriteLine rather than Write; so when consuming this content, consider
                    // removing line breaks (this makes it easy to stream the data in chunks later).
                    writer.WriteLine(System.Convert.ToBase64String(data));
                }
            }
        }
    }
    return tempFilePath;
}
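For reference, the standard-library route mentioned in the update can be sketched with ToBase64Transform wrapped in a CryptoStream, so the data is encoded as it streams through rather than being held in memory. The file paths and method name below are placeholders:
using System.IO;
using System.Security.Cryptography;

// Sketch: streams inputFilePath through a Base64 transform into outputFilePath,
// so neither the raw file nor the encoded result is ever held in memory in full.
public static void EncodeFileToBase64(string inputFilePath, string outputFilePath)
{
    using (var input = File.OpenRead(inputFilePath))
    using (var output = File.Create(outputFilePath))
    using (var base64 = new CryptoStream(output, new ToBase64Transform(), CryptoStreamMode.Write))
    {
        input.CopyTo(base64); // bytes are Base64-encoded as they are copied
    }
}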

NetworkStream.Read() not correctly reading after first iteration of loop

I'm having a problem where my code works as expected when I'm stepping through it but reads incorrect data when running normally. I thought the problem could have been timing; however, NetworkStream.Read() should be blocking, and I also tested this by putting the thread to sleep for 1000 ms (more than enough time, and more time than I was giving it while stepping through).
The purpose of the code (and what it does when stepping through) is to read a bitmap image into a buffer; the image is preceded by a string containing the image size in bytes, followed by a carriage return and a newline. I believe the problem lies in the read statements, but I can't be sure. The following code is contained within a larger loop that also contains Telnet reads, however I have not had a problem with those and they are only reading ASCII strings, not binary data.
List<byte> len = new List<byte>();
byte[] b = new byte[2];
while (!Encoding.ASCII.GetString(b).Equals("\r\n"))
{
    len.Add(b[0]);
    b[0] = b[1];
    b[1] = (byte)stream.ReadByte();
}
len = len.FindAll(x => x != 0);
len.Add((byte)0);
string lenStr = Encoding.ASCII.GetString(len.ToArray());
int imageSize = int.Parse(lenStr);
byte[] imageIn = new byte[imageSize];
stream.Read(imageIn, 0, imageSize);
using (MemoryStream g = new MemoryStream(imageIn))
{
    g.Position = 0;
    bmp = (Bitmap)Image.FromStream(g);
}
The actual problem is that the first time the code runs it correctly receives the length and the image; on consecutive reads, however, it does not seem to recognize the \r\n, though this may only be a symptom and not the problem itself.
Thanks in advance!
EDIT:
So I did narrow the problem down and managed to fix it by adding an artificial delay between the Telnet call using NetworkStream.Write() that requests the image and the NetworkStream.Read() that retrieves it. However, this solution is messy and I would still like to know why this issue is happening.
The Read() operation returns the number of bytes actually read. It only blocks when there is no data to read, and it can return fewer bytes than specified by the count parameter.
You can easily fix this by putting this inside a loop:
byte[] imageIn = new byte[imageSize];
int remaining = imageSize;
int offset = 0;
while (remaining > 0)
{
    int read = stream.Read(imageIn, offset, remaining);
    if (read == 0) throw new Exception("Connection closed before expected data was read");
    offset += read;
    remaining -= read;
}
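The same partial-read issue applies to the length header at the top of the question, so if the pattern is needed in more than one place it can be wrapped in a small helper. A sketch; the ReadExactly name is just chosen here for illustration:
using System.IO;

// Illustrative helper: keeps calling Read until exactly count bytes have arrived,
// or throws if the remote side closes the connection first.
static void ReadExactly(Stream stream, byte[] buffer, int offset, int count)
{
    while (count > 0)
    {
        int read = stream.Read(buffer, offset, count);
        if (read == 0) throw new EndOfStreamException("Connection closed before expected data was read");
        offset += read;
        count -= read;
    }
}

// Usage for the image body:
// byte[] imageIn = new byte[imageSize];
// ReadExactly(stream, imageIn, 0, imageSize);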

How to read the content of file from port

I need to use the text data of a file in another program when someone prints the file.
I have a basic idea of TCP/IP client and listener programming.
I can already send and receive .txt files between two machines.
But how do I receive the file contents if the files are in docx, xlsx, pdf or any other format?
My requirement is:
I want to use the contents (text) of a file in another program when someone prints a file.
Please suggest alternative ways to do this if there are any.
Thanks in advance.
Since you haven't posted any code, I'll write the code part "my way", but you should have a decent understanding after reading this.
First, on both ends (client and server) you should apply a unified protocol which describes what data you're sending. An example could be:
[3Bytes - ASCII extension][4Bytes - lengthOfTheFile][XBytes - fileContents]
Then in your receiver you can read data according to the protocol, which means you first read 3 bytes to determine the file format, then 4 bytes which tell you how large the incoming file is. Lastly you read the content and write it directly to a file. An example receiver could look like this:
byte[] extensionBuffer = new byte[3];
if (3 != networkStream.Read(extensionBuffer, 0, 3))
    return;
string extension = Encoding.ASCII.GetString(extensionBuffer);

byte[] lengthBuffer = new byte[sizeof(int)];
if (sizeof(int) != networkStream.Read(lengthBuffer, 0, sizeof(int)))
    return;
int length = BitConverter.ToInt32(lengthBuffer, 0);

int recv = 0;
using (FileStream stream = File.Create(nameOfTheFile + "." + extension))
{
    // Copy exactly 'length' bytes from the network into the file
    while (recv < length)
    {
        int current = networkStream.ReadByte();
        if (current == -1) // connection closed early
            break;
        stream.WriteByte((byte)current);
        recv++;
    }
    stream.Flush();
}
On the sender side you can read the file extension, open the file stream, get the length of the stream, send the extension and the length to the client, and then "redirect" each byte from the FileStream into the NetworkStream. This can look something like:
FileInfo meFile = //.. get the file
byte[] extBytes = Encoding.ASCII.GetBytes(meFile.Extension.TrimStart('.'));
using (FileStream stream = meFile.OpenRead())
{
    networkStream.Write(extBytes, 0, extBytes.Length);

    byte[] lengthBytes = BitConverter.GetBytes((int)stream.Length);
    networkStream.Write(lengthBytes, 0, lengthBytes.Length);

    while (stream.Position < stream.Length)
    {
        networkStream.WriteByte((byte)stream.ReadByte());
    }
}
This approach is fairly easy to implement and doesn't require big changes if you want to send different file types. It lacks validation, but I don't think you require that functionality.

StreamReader.BaseStream issue after using EndOfStream property

First of all, I understand that I can solve this issue in different ways. I guess the issue exists only because I'm using different methods in an incorrect way, but I want to find out what exactly happened in my example.
I was using a StreamReader to read a file. In order to get bytes from it I decided to use BaseStream.Read:
int length = (int)reader.BaseStream.Length;
byte[] file = new byte[length];
while (!reader.EndOfStream)
{
    int readBytes = reader.BaseStream.Read(file, 0,
        (length - offset) > bufferSize ? bufferSize : (length - offset));
    for (int i = 0; i < readBytes; i++)
    {
        ...
    }
    offset += readBytes;
}
BaseStream.Read refuses to get the last 1024 bytes when the StreamReader.EndOfStream property is used before reading. Later I found information that EndOfStream tries to read 1 byte, but in fact it reads 1024 bytes for performance reasons. Apparently this 1 KB becomes impossible to reach.
EDIT: If I remove the reader.EndOfStream check from the code, reader.BaseStream.Read works correctly. That was the main point of the question.
Again, I understand that this code example is absolutely inefficient. I'm just trying to understand how streams work in this example and whether this issue exists only because of bad code (or whether StreamReader.BaseStream has some issues). Thanks in advance.
It is not that StreamReader.BaseStream has issues; the problem is in your code, because you work directly with the Stream wrapped inside the StreamReader.
From MSDN about StreamReader.DiscardBufferedData:
You need to call this method only when the position of the internal buffer and the BaseStream do not match. These positions can become mismatched when you read data into the buffer and then seek a new position in the underlying stream.
That means, in your case, that when the Stream has already reached its end position, the position of the StreamReader's internal buffer still holds the value from before you read the underlying stream directly, therefore reader.EndOfStream is still false. That is why you cannot finish the loop.
Edit:
I think you are missing something. Here is some code to prove that the file is successfully read to the end. Run it and you will see that your app repeatedly says: I'm at the end of the file!
static void Main()
{
    using (StreamReader reader = new StreamReader(@"yourFile"))
    {
        int offset = 0;
        int bufferSize = 102400;
        int length = (int)reader.BaseStream.Length;
        byte[] file = new byte[length];
        while (!reader.EndOfStream)
        {
            // Add this line:
            Console.WriteLine(reader.BaseStream.Position);
            Console.ReadLine();
            int readBytes = reader.BaseStream.Read(file, 0,
                (length - offset) > bufferSize ? bufferSize : (length - offset));
            string str = Encoding.UTF8.GetString(file, 0, readBytes);
            offset += readBytes;
            if (reader.BaseStream.Position == length)
            {
                Console.WriteLine("I'm at the end of the file! Current Tickcount: " + Environment.TickCount);
                Thread.Sleep(100);
            }
        }
    }
}
Edit 2
But still, offset and length should be equal; in my case length - offset = 1024 (for files bigger than 1 KB). Maybe I'm doing something wrong, but if I use files smaller than 1 KB, readBytes always equals 0.
That is because on your first call to while (!reader.EndOfStream), the reader has to read the file (in this case 1024 bytes are read into the internal buffer) to determine whether the file has ended (see the two lines of code I added above). After that read, the file has been seeked forward 1024 bytes, which is why length - offset = 1024; and if your file is smaller than 1 KB, this first call already seeks to the end of the file. This is where you lose data.
On the second call it doesn't seek, because you haven't sent any read request to the reader, so it considers its buffer unchanged and doesn't need to read the file again to check whether it is at the end; that is why the second call doesn't lose data.
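One way to avoid the mismatch entirely is to skip StreamReader for this job and read the file through a FileStream, looping on the return value of Read rather than on EndOfStream. A minimal sketch; the file path, bufferSize value and per-byte processing are placeholders, and System.IO is assumed to be imported:
// Sketch: a single buffering layer, so there is no StreamReader buffer whose
// position can drift away from the underlying stream's position.
byte[] buffer = new byte[bufferSize];
using (var stream = File.OpenRead(@"yourFile"))
{
    int readBytes;
    while ((readBytes = stream.Read(buffer, 0, buffer.Length)) > 0)
    {
        for (int i = 0; i < readBytes; i++)
        {
            // ... process buffer[i] ...
        }
    }
}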

c# SslStream.Read Loop problem

I've been learning C# by creating an app and I've hit a snag I'm really struggling with.
Basically, I have the code below, which is what I'm using to read from a network stream I have set up. It works, but it only reads one packet each time sslStream.Read() unblocks, which causes a big backlog of messages.
What I'm trying to do is: if the part of the stream that was read contains multiple packets, read them all.
I've tried multiple times to work it out, but I just ended up with a big mess of code.
If anyone could help out I'd appreciate it!
(The first 4 bytes of each packet are the size of the packet; packets range between 8 bytes and 28,000 bytes.)
SslStream _sslStream = (SslStream)_sslconnection;
int bytes = -1;
int nextread = 0;
int byteslefttoread = -1;
byte[] tmpMessage;
byte[] buffer = new byte[3000000];
do
{
    bytes = _sslStream.Read(buffer, nextread, 8192);
    int packetSize = BitConverter.ToInt32(buffer, 0);
    nextread += bytes;
    byteslefttoread = packetSize - nextread;
    if (byteslefttoread <= 0)
    {
        int leftover = Math.Abs(byteslefttoread);
        do
        {
            tmpMessage = new byte[packetSize];
            Buffer.BlockCopy(buffer, 0, tmpMessage, 0, packetSize);
            PacketHolder tmpPacketHolder = new PacketHolder(tmpMessage, "in");
            lock (StaticMessageBuffers.MsglockerIn)
            {
                // puts message into the message queue.. not very oop... :S
                MessageInQueue.Enqueue(tmpPacketHolder);
            }
        }
        while (leftover > 0);
        Buffer.BlockCopy(buffer, packetSize, buffer, 0, leftover);
        byteslefttoread = 0;
        nextread = leftover;
    }
} while (bytes != 0);
If you are using .NET 3.5 or later, I would highly suggest you look into Windows Communication Foundation (WCF). It will simplify anything you are trying to do over a network.
On the other hand, if you are doing this purely for educational purposes:
Take a look at this link. Your best bet is to read from the stream in somewhat smaller increments and feed that data into another stream. Once you can identify the length of data you need for a message, you can cut the second stream off into a message. You can set up an outer loop that checks the available byte count and waits until it is > 0 to start the next message. Also note that any network code should run on its own thread, so as not to block the UI thread.
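A minimal sketch of that buffering idea, assuming (as the question's code appears to) that the 4-byte little-endian prefix counts the whole packet including the prefix itself. PacketBuffer is an illustrative name, not a library type; PacketHolder and MessageInQueue come from the question:
using System;
using System.Collections.Generic;

// Illustrative frame splitter: feed it whatever Read() returns and it yields every
// complete length-prefixed packet; incomplete tails stay buffered for the next read.
class PacketBuffer
{
    private readonly List<byte> _pending = new List<byte>();

    public IEnumerable<byte[]> Add(byte[] data, int count)
    {
        for (int i = 0; i < count; i++)
            _pending.Add(data[i]);

        // Slice off packets for as long as a complete one is buffered.
        while (_pending.Count >= 4)
        {
            int packetSize = BitConverter.ToInt32(_pending.GetRange(0, 4).ToArray(), 0);
            if (_pending.Count < packetSize)
                break; // wait for more data

            byte[] packet = _pending.GetRange(0, packetSize).ToArray();
            _pending.RemoveRange(0, packetSize);
            yield return packet;
        }
    }
}

// Usage inside the read loop (sketch):
//   var packets = new PacketBuffer();
//   byte[] buffer = new byte[8192];
//   int bytes;
//   while ((bytes = _sslStream.Read(buffer, 0, buffer.Length)) > 0)
//       foreach (byte[] packet in packets.Add(buffer, bytes))
//           MessageInQueue.Enqueue(new PacketHolder(packet, "in"));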

Categories

Resources