Picture Doesn't Fully Send Over TCP - C#

I have a server & client model. The client is supposed to take a screenshot and then send it to the server. The problem I'm having is that part of the screenshot is missing: roughly 3/4 of the screen is black when the file is opened in Paint or another app. When I send the screenshot command a second time, the file doesn't open at all; it's corrupt.
Here is my client side
if (plainText.Contains("screenshot"))
{
    Bitmap bitmap = new Bitmap(Screen.PrimaryScreen.Bounds.Width, Screen.PrimaryScreen.Bounds.Height);
    Graphics graphics = Graphics.FromImage(bitmap);
    graphics.CopyFromScreen(0, 0, 0, 0, bitmap.Size);
    bitmap.Save("test.bmp");
    writebuffer = File.ReadAllBytes("test.bmp");
    stream.Write(writebuffer, 0, writebuffer.Length);
}
As you can see, it takes a screenshot, saves the image to a bitmap file, then reads the bytes into a buffer and sends them.
Here is my server side
foreach (user c in clientList)
{
    if (c.RemoteIPAddress == currentRow)
    {
        NetworkStream stream = c.clientTCP.GetStream();
        writeBuffer = Encoding.ASCII.GetBytes("screenshot");
        stream.Write(writeBuffer, 0, writeBuffer.Length);
        byte[] readBuffer = new byte[c.clientTCP.ReceiveBufferSize];
        int data = stream.Read(readBuffer, 0, readBuffer.Length);
        string x = new Random().Next().ToString();
        FileStream f = new FileStream(x + ".bmp", FileMode.Create, FileAccess.Write);
        f.Write(readBuffer, 0, data);
        f.Close();
        Process.Start(x + ".bmp");
    }
}
Here, I send the command telling the client to send a screenshot, then receive the screenshot and write it to a bitmap file.
I cannot seem to figure out what is causing the aforementioned issues in this code.

David's "answer" is useless. "hurr muh protocol". The reason your code isn't working is because the max size of a packet is 65535, and you're assuming the image isn't going to be any bigger than that - i.e you're only reading one packet. To read all the packets, use this loop.
while (stream.DataAvailable)
{
    data = stream.Read(readBuffer, 0, readBuffer.Length);
    f.Write(readBuffer, 0, data);
}
It checks whether data is available in the stream and keeps reading until there's nothing left.

I cannot seem to figure out what is causing the aforementioned issues in this code.
That's because you have nothing to compare it to. There are lots of correct ways to send an image and lots of correct ways to receive an image. But if you want your code to work, the sender and receiver have to agree on how an image will be sent. This is called a "protocol", and protocols should always be documented.
If you had a document for this protocol, it would specify how the sender indicates the size of the image. And it would specify how the receiver determines when it has the entire image. You could then check to make sure both the sender and receiver do what the protocol says. If they do, then it would be the protocol that was broken.
When you're using a network connection or file and you're not using an existing protocol or format, document the protocol or format you plan to use at the byte level. That way, it is possible to tell if the sender or receiver is broken by comparing its behavior to the behavior the protocol specifies.
Without a protocol, there's no way to say who's at fault, since there's no standard of correct behavior to compare them to.
Do not ever think that your protocol is too simple to document. If it's so simple, then documenting it should be simple. And if you think it's too complex to document, then you have no prayer of implementing it correctly.
By the way, it's possible to have a protocol for which your sending code is correct. But it's very difficult. It's impossible to have a protocol for which your receiving code is correct since it has literally no possible way to know if it has the whole file or not.
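For illustration only, here is a minimal sketch of what the sending side might look like under a simple, documented protocol: a 4-byte length prefix followed by the image bytes. It reuses the bitmap and stream variables from the question; the length-prefix format itself is an assumption of this sketch, not something the original code defines.
// Sketch of a sender under a hypothetical length-prefixed protocol:
// write a 4-byte length, then exactly that many image bytes.
using (MemoryStream ms = new MemoryStream())
{
    bitmap.Save(ms, System.Drawing.Imaging.ImageFormat.Bmp); // in-memory, avoids the temp file
    byte[] imageBytes = ms.ToArray();
    byte[] lengthPrefix = BitConverter.GetBytes(imageBytes.Length); // 4 bytes (little-endian on most machines)
    stream.Write(lengthPrefix, 0, lengthPrefix.Length);
    stream.Write(imageBytes, 0, imageBytes.Length);
}
The receiver would then read the 4-byte prefix first and keep calling Read in a loop until it has accumulated exactly that many bytes; a sketch of that receive loop appears under the NetworkStream question further down.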

Related

Send Pillow/numpy image over ZeroMQ socket

I need to take a Pillow image and either convert it to a byte array or pack it somehow to send it over a ZeroMQ socket. Since the other end of the socket connection isn't in Python (it's C#), I can't use pickle, and I'm not sure JSON would work since it's just the image (the dimensions are sent separately). I'm working with an image created from a processed numpy array out of OpenCV, so I don't have an image file. Sending the numpy array over didn't work, either.
Since I can't read the image bytes from a file, I've tried Image.getdata() and Image.tobytes(), but I'm not sure either was in the right form to send over the socket. What I really need is a bytestream that I can reform into an array after crossing the socket.
UPDATE: So now I'm specifically looking for anything easily undone in C#. I've looked into struct.pack but I'm not sure there's an equivalent to unpack it. Turning the image into an encoded string would work as long as I could decode it.
UPDATE 2: I'm thinking that I could use JSON to send a multidimensional array, then grab the JSON on the other side and turn it into a bitmap to display. I'm using clrzmq for the C# side, though, and I'm not sure how that handles JSON objects.
In case anyone was wondering, here's what I ended up doing:
On the python side:
_, buffer = cv2.imencode('.jpg', cFrame )
jpg_enc = base64.b64encode(buffer).decode('utf-8')
self.frameSender.send_string(jpg_enc)
self.frameSender.send_string(str(height))
self.frameSender.send_string(str(width))
On the C# side:
byte[] bytebuffer = Convert.FromBase64String(frameListener.ReceiveFrame().ReadString()); // JPEG bytes
int height = int.Parse(frameListener.ReceiveFrame().ReadString());
int width = int.Parse(frameListener.ReceiveFrame().ReadString());
using (MemoryStream ms = new MemoryStream(bytebuffer)) // image data
{
    Bitmap img = new Bitmap(Image.FromStream(ms), new Size(width, height)); // Size takes width, then height
    pictureboxVideo.BeginInvoke((MethodInvoker)delegate { pictureboxVideo.Image = img; });
}

C# image compression for serial transmission

This is my first question, so I hope to provide what it needs to get a decent answer.
I want to send an image received by a webcam over a serial link.
The Image is converted into a byte array and then written to the serial port.
The first issue I ran into was that when I tried to send the image, it led to a TimeoutException. Looking at the length of the byte array, it showed around 1 MB of data that needed to be transmitted. Shrinking the actual size of the image resulted in a much faster transmission, but afterwards the image was way too small.
The second issue was when I tried to compress the image. Using different methods, the size of the transmission was always exactly the same.
I hope you can help me find a way to improve my implementation, so that the transmission only takes a few seconds while still maintaining reasonable resolution of the image. Thanks.
Specific Information
Webcam Image
The image from the webcam is received by the AForge library
The image is handled as a Bitmap
(Obviously) it doesn't transmit every frame, only on the click of a button
Serial Port
The port uses a baud rate of 57600 bps (defined by hardware beneath)
The WriteTimeout-value is set to 30s, as it would be unacceptable to wait longer than that
Text transmission works with default values on the SerialPort-item in a WinForm
Image Manipulation
I used different approaches to compress the image:
A simple method like
public static byte[] getBytes(Bitmap img)
{
    MemoryStream ms = new MemoryStream();
    img.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
    byte[] output = ms.ToArray();
    ms.Dispose();
    return output;
}
as well as more advanced methods like the one posted here. Not only with Encoder.Quality but also with Encoder.Compression.
My Application
private void btn_Send(...)
{
    Bitmap currentFrame = getImageFromWebcam();
    //Bitmap sendFrame = new Bitmap(currentFrame, new Size(currentFrame.Width/10, currentFrame.Height/10));
    Bitmap sendFrame = compressImage(currentFrame);
    byte[] data = getBytes(sendFrame);
    serialPort.Write(data, 0, data.Length);
}
Changing the timeout property of the serial port would solve the timeout issue, as shown in this link: https://msdn.microsoft.com/en-us/library/system.io.ports.serialport.writetimeout(v=vs.110).aspx. File compression works by looking at blocks of data and associating similar blocks with each other for a given segment of blocks. If your image data has little redundancy, it will not compress much, depending on the compression software being used.
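As a rough sketch of both suggestions combined (a longer WriteTimeout plus an explicit JPEG quality level), something like the following could be tried. The 60-second timeout, quality value of 30, and the btn_SendCompressed handler name are arbitrary examples; getImageFromWebcam and serialPort come from the question's code, and the snippet assumes using System.Drawing, System.Drawing.Imaging, System.IO and System.Linq. For context, at 57600 bps the link moves only roughly 5-6 KB per second, so only about 150-170 KB can be written inside a 30-second timeout; a ~1 MB raw image has no chance without compression.
// Sketch: ask the JPEG encoder for an explicit quality level (0-100).
public static byte[] GetJpegBytes(Bitmap img, long quality)
{
    ImageCodecInfo jpegCodec = ImageCodecInfo.GetImageEncoders()
        .First(c => c.FormatID == ImageFormat.Jpeg.Guid);
    using (EncoderParameters parameters = new EncoderParameters(1))
    using (MemoryStream ms = new MemoryStream())
    {
        parameters.Param[0] = new EncoderParameter(System.Drawing.Imaging.Encoder.Quality, quality);
        img.Save(ms, jpegCodec, parameters);
        return ms.ToArray();
    }
}

private void btn_SendCompressed(object sender, EventArgs e)
{
    serialPort.WriteTimeout = 60000; // milliseconds; only if a longer wait is acceptable
    Bitmap currentFrame = getImageFromWebcam();
    byte[] data = GetJpegBytes(currentFrame, 30L); // lower quality => fewer bytes over the slow link
    serialPort.Write(data, 0, data.Length);
}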

C# Networkstream BeginRead How to obtain buffer length/size?

I have a problem obtaining the right buffer size in my application.
From what I've read, the buffer size is normally declared before reading:
byte[] buffer = new byte[2000];
It is then used when reading to get the result.
However, with this method the data seems to stop once the received bytes contain '00', but my return data looks something like 5300000002000000EF0000000A00, and the length is not fixed; it can be this short or up to 400 bytes.
So the problem is: if I define a fixed length like above, e.g. 2000, the return value is
5300000002000000EF0000000A000000000000000000000000000000000000000000000000000..........
which makes it impossible to split the bytes into the correct amount.
Can anyone show me how to obtain the actual received data size from the NetworkStream, or any method/trick to get what I need?
Thanks in advance.
Network streams have no length.
Unfortunately, your question is light on detail, so it's hard to offer specific advice. But you have a couple of options:
If the high-level protocol being used here offers a way to know the length of the data that will be sent, use that. This could be as simple as the remote host sending the byte count before the rest of the data, or some command you could send to the remote host to query the length of the data. Without knowing what high-level protocol you're using, it's not possible to say whether this is even an option or not.
Write the incoming data into a MemoryStream object. This would always work, whether or not the high-level protocol offers a way to know in advance how much data to expect. Note that if it doesn't, then you will simply have to receive data until the end of the network stream.
The latter option looks something like this:
MemoryStream outputStream = new MemoryStream();
int readByteCount;
byte[] rgb = new byte[1024]; // can be any size
while ((readByteCount = inputStream.Read(rgb, 0, rgb.Length)) > 0)
{
    outputStream.Write(rgb, 0, readByteCount);
}
return outputStream.ToArray();
This assumes you have a network stream named "inputStream".
I show the above mainly because it illustrates the more general practice of reading from a network stream in pieces and then storing the result elsewhere. Also, it is easily adapted to directly reading from a socket instance (you didn't mention what you're actually using for network I/O).
However, if you are actually using a Stream object for your network I/O, then as of .NET 4.0 there is a more convenient way to write the above:
MemoryStream outputStream = new MemoryStream();
inputStream.CopyTo(outputStream);
return outputStream.ToArray();
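If the remote host does send a byte count first (the first option above), the receiving side could look roughly like the sketch below. The 4-byte little-endian length prefix is an assumption here, since the question doesn't describe the actual protocol; inputStream is the same network stream as above.
// Sketch: read a 4-byte length prefix, then loop until exactly that many bytes have arrived.
static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed before the full message arrived.");
        offset += read;
    }
    return buffer;
}

byte[] lengthBytes = ReadExactly(inputStream, 4);
int messageLength = BitConverter.ToInt32(lengthBytes, 0); // assumes a little-endian sender
byte[] message = ReadExactly(inputStream, messageLength);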

Writing my own Screen Sharing Server and Protocol

I'm building a client/server solution which needs to have screen sharing functionality. I have something already "working", but the problem is that it only works over an internal network, because my approach is not fast enough.
What I am basically doing is that the client sends a request to the server asking for a screen image every 5 seconds (for example). This is the code that runs when one of these requests is received:
private void GetImage(object networkstream)
{
    NetworkStream network = (NetworkStream)networkstream;
    Bitmap bitmap = new Bitmap(
        SystemInformation.PrimaryMonitorSize.Width,
        SystemInformation.PrimaryMonitorSize.Height);
    Graphics g = Graphics.FromImage(bitmap);
    g.CopyFromScreen(new Point(0, 0), new Point(0, 0), bitmap.Size);
    g.Flush();
    g.Dispose();
    MemoryStream ms = new MemoryStream();
    bitmap.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
    bitmap.Dispose();
    byte[] array = ms.ToArray();
    network.Write(array, 0, array.Length);
    network.Flush();
    ms.Dispose();
}
What are the best methods to do what I'm trying to do? I need to get at least 0.2 FPS (a refresh every 5 seconds). Note: I'm using Windows Forms and it is being done over sockets.
How do TeamViewer and .rdp files work?
You can send only the difference between the present and the last image. Look here: Calculate image differences in C#
If that isn't fast enough, you can divide your screen into smaller areas, like 100x100 or 50x50 bitmaps, check whether each area has changed, and if so send just that area to the client.
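A very rough sketch of that tile-based idea, assuming two same-sized Bitmap frames and a hypothetical SendTile helper for whatever transport is used (GetPixel is used only to keep the sketch short; LockBits would be the faster way to compare pixels):
// Sketch: compare the current and previous frames tile by tile and send only the tiles that changed.
const int TileSize = 100;

void SendChangedTiles(Bitmap current, Bitmap previous)
{
    for (int y = 0; y < current.Height; y += TileSize)
    {
        for (int x = 0; x < current.Width; x += TileSize)
        {
            int w = Math.Min(TileSize, current.Width - x);
            int h = Math.Min(TileSize, current.Height - y);
            Rectangle tile = new Rectangle(x, y, w, h);
            if (!TileEquals(current, previous, tile))
            {
                using (Bitmap part = current.Clone(tile, current.PixelFormat))
                {
                    SendTile(part, tile); // hypothetical helper: JPEG-encode and send x, y, length, bytes
                }
            }
        }
    }
}

bool TileEquals(Bitmap a, Bitmap b, Rectangle r)
{
    for (int y = r.Top; y < r.Bottom; y++)
        for (int x = r.Left; x < r.Right; x++)
            if (a.GetPixel(x, y) != b.GetPixel(x, y))
                return false;
    return true;
}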
You need to optimize your protocol, here are some suggestions:
break the input image in segments, send segments instead of full screen
only send a segment if it's different from the previously sent version
use HTTP long-polling style communication, where your viewer sends a request but only gets a response once the server has received new segments from the presenter, possibly several grouped together.
compress the image data, don't transmit raw
give users the option to choose the level of compression to speed things up or to get a better image
I doubt this would be in your budget, but you can also encode the stream as streaming video.
What about using an existing implementation? Or learning from it?
http://cdot.senecac.on.ca/projects/vncsharp/

send inkcanvas strokes via sockets

So I am trying to create something like a synchronized paint program using sockets. I have a server side and a client side, and I am trying to send the InkCanvas stroke collection from the server to the client. This works for simple text, but I can't seem to send the stroke collection. It would be even cooler if you could help me send just the last stroke, so that the data transfers faster. Here is some code of what I've been trying:
sending strokes:
byte[] data;
using (MemoryStream ms = new MemoryStream())
{
    inkcanvas.Strokes.Save(ms);
    data = ms.ToArray();
}
svr.SendToAll("u getting some strokes");
svr.SendStrokes(data);
svr.SendStrokes(byte[] data):
public void SendStrokes(byte[] data)
{
    // "no" appears to be the number of accepted client connections
    for (int i = 0; i < no; i++)
    {
        byte[] dt = data;
        accepted[i].Send(dt);
    }
    MessageBox.Show("dONE");
}
and this is on the client side:
byte[] buffer = new byte[1024];
MessageBox.Show("receiving strokes");
int rec = conn.Receive(buffer, 0, buffer.Length, 0);
if (rec <= 0)
    throw new SocketException();
MessageBox.Show("strokes received");
//Array.Resize(ref buffer, rec);
using (MemoryStream ms = new MemoryStream(buffer))
{
    inkcanvas.Strokes = new System.Windows.Ink.StrokeCollection(ms);
    ms.Close();
}
MessageBox.Show("strokes added");
These exact same methods work perfectly for strings, but when I try to do it with the StrokeCollection, it fails. Nothing shows up on the client and I get the following SocketException on the server side: An existing connection was forcibly closed by the remote host.
But if you guys have a better way of doing this, that would be great... is it something I am missing? I mean, if it works for text transformed into a byte array, why wouldn't it work for this StrokeCollection?
Thanks!
EDIT: Do you think you could help me out with some sample code? I can't really seem to implement it.
You forgot to either design or implement a protocol! You can't just send a bunch of bytes over TCP and assume the receiver will be able to make sense out of it.
You have an application message that consists of a bunch of strokes which you are trying to send over TCP. But TCP is a byte stream service, not an application message service. So you need some kind of application message protocol to package the message for transport and unpackage it on receipt. But you have not written any such code. So you're basically expecting it to work by magic.
Define a protocol. For example, it might say:
Canvas strokes shall be sent by a '1' byte indicating canvas strokes, followed by 4 bytes indicating the number of bytes in the strokes object in network byte order, followed by the stroke data. The receiver will read the first byte and identify that it's a canvas strokes object. Then the receiver will read the next four bytes to determine the length. The receiver shall accumulate that number of bytes (using multiple reads if necessary) and then process the reconstructed canvas strokes object.
Do not skip the step of creating a written protocol definition.
Then, when you have problems, follow this handy troubleshooting guide:
1. Does the sender follow the specification? If not, stop, the sender is broken.
2. Does the receiver follow the specification? If not, stop, the receiver is broken.
3. Stop, the specification is broken.
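For illustration, a minimal sketch of the sending side of a protocol like the one described above (a '1' type byte, a 4-byte length in network byte order, then the payload). The WriteStrokesMessage name and the use of a Stream here are assumptions of this sketch, not part of the question's code; it assumes using System.IO and System.Net.
// Sketch: frame one strokes message as [type byte][4-byte length, network byte order][payload bytes].
void WriteStrokesMessage(Stream stream, byte[] strokeBytes)
{
    stream.WriteByte(1); // 1 = canvas strokes
    int networkOrderLength = IPAddress.HostToNetworkOrder(strokeBytes.Length);
    stream.Write(BitConverter.GetBytes(networkOrderLength), 0, 4);
    stream.Write(strokeBytes, 0, strokeBytes.Length);
}
The receiver would read the type byte, then the 4-byte length (converting it back with IPAddress.NetworkToHostOrder), then keep reading until it has accumulated exactly that many payload bytes before handing them to new StrokeCollection(new MemoryStream(payload)).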
If you want simple, you can convert the data into base64 and encode each message as a line of text. That will allow you to use a ReadLine function to grab exactly one message. Then you can just use a message format like an "S" (for "strokes") followed by the data in base64 format. Use a WriteLine function to send the text message followed by a newline.
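For example, a sketch of that text-based framing, assuming writer and reader are a StreamWriter and StreamReader wrapped around the connection's NetworkStream:
// Sender: one message per line, "S" followed by the base64-encoded stroke bytes.
writer.WriteLine("S" + Convert.ToBase64String(strokeBytes));
writer.Flush();

// Receiver: ReadLine returns exactly one message.
string line = reader.ReadLine();
if (line != null && line.StartsWith("S"))
{
    byte[] payload = Convert.FromBase64String(line.Substring(1));
    using (MemoryStream ms = new MemoryStream(payload))
    {
        inkcanvas.Strokes = new System.Windows.Ink.StrokeCollection(ms);
    }
}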
I think you forgot to set the MemoryStream position. You should set the memory stream position to 0 before you send out the stream, because after Strokes.Save the position of the stream is at the end.
