Send Pillow/numpy image over ZeroMQ socket - C#

I need to take a Pillow image and either convert it to a byte array or pack it somehow to send it over a ZeroMQ socket. Since the other end of the socket isn't in Python (it's C#), I can't use pickle, and I'm not sure JSON would work since it's just the image (dimensions are sent separately). I'm working with an image created from a processed numpy array out of OpenCV, so I don't have an image file. Sending the numpy array over didn't work, either.
Since I can't read the image bytes from a file, I've tried Image.getdata() and Image.tobytes(), but I'm not sure either was in the right form to send over the socket. What I really need is a byte stream that I can reform into an array after it crosses the socket.
UPDATE: I'm now specifically looking for anything that's easily undone in C#. I've looked into struct.pack, but I'm not sure C# has an equivalent to unpack it. Turning the image into an encoded string would work as long as I could decode it.
UPDATE 2: I'm thinking I could use JSON to send a multidimensional array, then grab the JSON on the other side and turn it into a bitmap to display. I'm using clrzmq on the C# side, though, and I'm not sure how it handles JSON objects.

In case anyone was wondering, here's what I ended up doing:
On the python side:
# JPEG-encode the OpenCV frame, then base64 it so it survives send_string
_, buffer = cv2.imencode('.jpg', cFrame)
jpg_enc = base64.b64encode(buffer).decode('utf-8')
self.frameSender.send_string(jpg_enc)
self.frameSender.send_string(str(height))
self.frameSender.send_string(str(width))
On the C# side:
byte[] bytebuffer = Convert.FromBase64String(frameListener.ReceiveFrame().ReadString());
int height = int.Parse(frameListener.ReceiveFrame().ReadString());
int width = int.Parse(frameListener.ReceiveFrame().ReadString());
using (MemoryStream ms = new MemoryStream(bytebuffer)) // image data
{
    // Note: Size takes (width, height), in that order
    Bitmap img = new Bitmap(Image.FromStream(ms), new Size(width, height));
    pictureboxVideo.BeginInvoke((MethodInvoker)delegate { pictureboxVideo.Image = img; });
}
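On the struct.pack question from the first update: C# does have equivalents (BitConverter, BinaryReader, IPAddress.NetworkToHostOrder), so a fixed binary header is another workable option that avoids the base64 overhead. A minimal sketch of packing the dimensions ahead of the JPEG payload in a single message (the function names here are illustrative, not from the original code):

```python
import struct

def pack_frame(jpeg_bytes, width, height):
    # ">II" = two unsigned 32-bit ints, big-endian (network byte order);
    # C# can read these with BinaryReader plus IPAddress.NetworkToHostOrder
    header = struct.pack(">II", width, height)
    return header + jpeg_bytes

def unpack_frame(payload):
    width, height = struct.unpack(">II", payload[:8])
    return payload[8:], width, height

# Round-trip check
msg = pack_frame(b"\xff\xd8\xff\xe0 fake jpeg", 640, 480)
body, w, h = unpack_frame(msg)
assert (w, h, body) == (640, 480, b"\xff\xd8\xff\xe0 fake jpeg")
```

With this layout the whole frame goes out as one ZeroMQ message instead of three separate send_string calls, so the dimensions can never get separated from their image.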

Related

Convert ByteArray to Image File received in Python from SoftwareBitmap frame on C# Windows.Graphic.Imaging

I have a requirement to get the VideoFrame.SoftwareBitmap from the Windows.Media.VideoFrame in C#, convert this SoftwareBitmap to a ByteArray, and then send the ByteArray using ZeroMQ to Python (using ZMQ) over a TCP socket connection.
I am receiving the ByteArray from C# in Python, but I am not able to convert this ByteArray to an image in Python. I have tried the below code in Python -
message = socket.recv_multipart()
print('message2 received' ,message[2])
img = cv2.imdecode(message[2], -1)
message[2] is the entire bytearray.
I receive the below error on the cv2.imdecode line -
TypeError: buf is not a numpy array, neither a scalar
Please advise; what am I doing wrong?
Example of the ByteArray (some part) received -
14\x1d\xff\n\x13\x1c\xff\n\x14\x1b\xff\t\x13\x1a\xff\x06\x11\x15\xff\x08\x13\x17\xff\x07\x13\x15\xff\x0c\x15\x18\xff\x0e\x17\x1a\xff\r\x14\x17\xff\x0e\x15\x18\xff\x12\x17\x1a\xff\x10\x14\x19\xff\x12\x14\x1c\xff\x12\x15\x1d\xff\x11\x16\x1f\xff\x0e\x17!\xff\x0c\x18$\xff\x0c\x18$\xff\x0c\x18"\xff\x0c\x19!\xff\x15\x1e\'\xff\x18 \'\xff\x19\x1f$\xff\x1f%,\xff\x1f$-\xff\x1e%.\xff!*4\xff\x1c%3\xff\x1e\';\xff#+B\xff&.E\xff&.E\xff#.D\xff!,B\xff\x1f*#\xff\x1e)?\xff (?\xff\x1e\';\xff %:\xff!\':\xff#%7\xff!#5\xff""2\xff$$2\xff\x1f *\xff\x17\x1a\x1f\xff\x17\x18\x1c\xff\x13\x15\x16\xff\x0e\r\x0f\xff\x07\x07\x07\xff\x00\x00\x00\xff\x01\x03\x03\xff\x04\x03\x05\xff\x01\x00\x02\xff\x04\x00\x03\xff\x06\x00\x02\xff\r\x00\x02\xff\x11\x01\x02\xff\x17\x03\x02\xff\x1a\x05\x03\xff\x18\x05\x00\xff\x19\x06\x00\xff\x1f\x08\x00\xff$\x08\x01\xff)\n\x01\xff.\n\x02\xff0\x0c\x02\xff0\r\x03\xff-\x0b\x05\xff%\t\x02\xff\x1b\x05\x00\xff\x10\x01\x00\xff\t\x00\x00\xff\x04\x00\x00\xff\x00\x00\x01\xff\x00\x00\x01\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x01\x01\x01\xff\x01\x01\x01\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\xff\x00\x00\x00\
Many Thanks,
Prakhar
Save it as binary:
outimg = open('out.jpg', 'wb')  # mode must be the string 'wb' (write binary)
outimg.write(message[2])
outimg.close()
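On the imdecode error itself: cv2.imdecode expects a numpy array, not a raw bytes object, so the received frame has to be wrapped first. A sketch, assuming numpy is available (and note that imdecode only helps if message[2] holds an encoded image such as JPEG/PNG; if it holds raw 4-byte-per-pixel data, as the B-G-R-\xff pattern in the dump above suggests, reshape it instead):

```python
import numpy as np

raw = b"\x0a\x14\x1d\xff" * 12  # stand-in for message[2]; real data comes off the socket

# For an encoded image (JPEG/PNG bytes):
buf = np.frombuffer(raw, dtype=np.uint8)   # 1-D uint8 view over the bytes
# img = cv2.imdecode(buf, -1)              # now buf is a numpy array, as imdecode requires

# For raw 4-byte-per-pixel data (B, G, R, A), reshape with the known dimensions:
height, width = 2, 6                        # hypothetical; use the real frame size
pixels = np.frombuffer(raw, dtype=np.uint8).reshape(height, width, 4)
assert pixels.shape == (2, 6, 4)
```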

C# image compression for serial transmission

This is my first question, so I hope to provide what it needs to get a decent answer.
I want to send an image received by a webcam over a serial link.
The Image is converted into a byte array and then written to the serial port.
The first issue I ran into was that when I tried to send the image, it led to a TimeoutException. Looking at the length of the byte array, it showed around 1 MB of data that needed to be transmitted. Shrinking the actual size of the image resulted in a much faster transmission, but afterwards the image was way too small.
The second issue was when I tried to compress the image. Using different methods, the size of the transmission was always exactly the same.
I hope you can help me find a way to improve my implementation, so that the transmission only takes a few seconds while still maintaining reasonable resolution of the image. Thanks.
Specific Information
Webcam Image
The image from the webcam is received by the AForge library
The image is handled as a Bitmap
(Obviously) it doesn't transmit every frame, only on the click of a button
Serial Port
The port uses a baud rate of 57600 bps (defined by hardware beneath)
The WriteTimeout-value is set to 30s, as it would be unacceptable to wait longer than that
Text transmission works with default values on the SerialPort-item in a WinForm
Image Manipulation
I used different approaches to compress the image:
Simple method like
public static byte[] getBytes(Bitmap img)
{
    using (MemoryStream ms = new MemoryStream())
    {
        img.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
        return ms.ToArray();  // ToArray, capital T
    }
}
as well as more advanced methods like the one posted here. Not only with Encoder.Quality but also with Encoder.Compression.
My Application
private void btn_Send(...)
{
Bitmap currentFrame = getImageFromWebcam();
//Bitmap sendFrame = new Bitmap(currentFrame, new Size(currentFrame.Width/10, currentFrame.Height/10));
Bitmap sendFrame = compressImage(currentFrame);
byte[] data = getBytes(sendFrame);
serialPort.Write(data, 0, data.Length);
}
Changing the timeout property of the serial port would solve the timeout issue, as shown in this link: https://msdn.microsoft.com/en-us/library/system.io.ports.serialport.writetimeout(v=vs.110).aspx. File compression works by looking at blocks of data and associating similar blocks with each other within a given segment of blocks. If your image data is too unique, it will not compress well, depending on the compression software being used.
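To put numbers on the timeout: at 57600 baud with the usual 8N1 framing, each byte costs 10 bits on the wire, so throughput is about 5.76 KB/s. A quick back-of-the-envelope estimate (plain arithmetic, not tied to any particular library):

```python
BAUD = 57600
BITS_PER_BYTE = 10                        # 8 data bits + start bit + stop bit (8N1)
bytes_per_second = BAUD / BITS_PER_BYTE   # 5760 bytes/s

def seconds_to_send(num_bytes):
    return num_bytes / bytes_per_second

# A 1 MB uncompressed frame blows straight past a 30 s WriteTimeout...
assert seconds_to_send(1_048_576) > 180
# ...so the encoded image needs to come in under ~170 KB to fit inside 30 s.
assert seconds_to_send(170_000) < 30
```

That is why resizing "worked": it cut bytes, not the time per byte. Lowering Encoder.Quality on the JPEG encoder until the encoded length fits the time budget is the more controlled version of the same trade-off.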

Sending object from Android Client App to C# Server App

Java Code:
public class EMessage implements Serializable
{
private Bitmap image;
private String type;
EMessage()
{}
}
...
EMessage eMessage=new EMessage();
outToServer = new DataOutputStream(clientSocket.getOutputStream());
objectOutputStream=new ObjectOutputStream(outToServer);
objectOutputStream.writeObject(eMessage);
C# Code:
[Serializable]
class EMessage
{
private Bitmap image;
private String type;
EMessage()
{ }
}
client = server.AcceptTcpClient();
Connected = client.Connected;
ns = client.GetStream();
IFormatter formatter = new
System.Runtime.Serialization.Formatters.Binary.BinaryFormatter();
EMessage recievedmsg = (EMessage)formatter.Deserialize(ns);
When I send an object from the Android client app (Java code) and receive the object in the C# server app, I get an exception:
"The Input Stream is not a valid binary format. The Starting Content(in bytes) are:
00-05-73-72-00-1D-63-6F-6D-2E etc";
Please suggest a simple solution. My project isn't that complex; I just need to send an EMessage object.
Serialization formats are specific to the platforms, and Java and .NET serialization aren't compatible with each other. Use JSON instead (and it's easier to debug as well).
Why not use SOAP? Here's an article on exactly what you're doing (Android to .NET):
http://www.codeproject.com/Articles/29305/Consuming-NET-Web-Services-via-the-kSOAP-library
I suggest you drop the Serialization for the above mentioned reasons (Java serialization being different from C# serialization), and transfer your data between your Java and C# applications in plain byte arrays.
You can convert your Bitmap image to a byte array like so (taken from this post on SO):
Bitmap bmp = (Bitmap) intent.getExtras().get("data");
ByteArrayOutputStream stream = new ByteArrayOutputStream();
bmp.compress(Bitmap.CompressFormat.PNG, 100, stream);
byte[] byteArray = stream.toByteArray();
Of course you could change the CompressFormat if circumstances so require. After that, you could convert your type string to a byte array too, and add a null-terminator to the end of it.
Once you're there, you can send your type string first, and add the byte array of the bitmap after it. On the C# end, you could read the incoming data until you reach the 0 terminator, at which point you'll know you've read the string portion of your EMessage object, and then read the rest of the bytes you've sent over and parse them into a Bitmap object.
That way you'll be sure that between your Java and C# implementations, you won't run into any compatibility issues. It may require a bit more code and a little more understanding to do, but it's far more reliable than serializing between two languages.
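The framing described above (type string, a zero terminator, then the image bytes) is easy to sketch. Python stands in for both ends here just to show the byte layout; the same logic maps to DataOutputStream on the Java side and NetworkStream on the C# side:

```python
def frame_message(type_str, image_bytes):
    # type string as UTF-8, then the 0 terminator, then the raw bitmap payload
    return type_str.encode("utf-8") + b"\x00" + image_bytes

def parse_message(data):
    # everything up to the first 0 byte is the type; everything after is the image
    sep = data.index(b"\x00")
    return data[:sep].decode("utf-8"), data[sep + 1:]

msg = frame_message("photo", b"\x89PNG...fake")
kind, img = parse_message(msg)
assert (kind, img) == ("photo", b"\x89PNG...fake")
```

One caveat: over a real socket you still need to keep reading until the sender closes the connection (or add a length prefix), since TCP read boundaries don't line up with message boundaries.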

Trying to move a file from a python client to a c# server

I am trying to copy a jpg image from a c# server in one PC to a python client in another.
The idea is simply to read the image content:
string text = File.ReadAllText(newPath);
//or
byte[] text = File.ReadAllBytes(newPath);
and to send the text with:
Byte[] sendBytes = text
networkStream.Write(sendBytes, 0, sendBytes.Length);
networkStream.Flush();
and the Python client receives the text and saves it into a jpg file right away.
I know it sounds crazy, but it worked! I saw it in another server and wanted to know how they did it.
I looked for a solution for days, but I still receive only part of the data every time (if the file is 7.78 MB, I receive only 7.74 MB).
I already checked for duplicate posts here, and all I found was transferring files between a server and a client written in the same language.
I tried using StreamReader and BitConverter, but I still get only part of the image, not all of it.
The Python code that saves the received image is:
rcvdD = socketPCP.recv(512000000) # I thought the recv size was causing the problem
try:
filename = "image.jpg"
print "NAME:",filename
print "\n\r\n\rNEW FILE RECIEVED!\n\r\n\r"
f=open ('D:/Files/'+filename , 'w')
f.write(rcvdD)
except Exception,e:
print e
Thank You!
I'm not too good at Python, but as I recall you should keep recv()-ing, appending each chunk to a final buffer, until you receive zero-length data or an error, and then write what you got in that buffer to disk.
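That loop looks like the sketch below (a stub class stands in for the real connection, and names like recv_all are illustrative). Two fixes matter: accumulate recv() chunks until the peer closes, and open the output file in binary mode ('wb', not 'w') so Windows doesn't mangle the JPEG bytes on write:

```python
def recv_all(sock):
    # keep reading until recv() returns b"", meaning the sender closed the connection
    chunks = []
    while True:
        chunk = sock.recv(65536)
        if not chunk:
            break
        chunks.append(chunk)
    return b"".join(chunks)

# Stub that hands out the payload in pieces, the way a real TCP socket may:
class FakeSocket:
    def __init__(self, payload):
        self._data, self._pos = payload, 0
    def recv(self, size):
        chunk = self._data[self._pos:self._pos + size]
        self._pos += len(chunk)
        return chunk

payload = b"\xff\xd8" + b"x" * 100000 + b"\xff\xd9"   # fake JPEG body
received = recv_all(FakeSocket(payload))
assert received == payload
# then: open('D:/Files/image.jpg', 'wb').write(received)  -- note 'wb'
```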

send inkcanvas strokes via sockets

So I am trying to create something like a synchronized paint program using sockets. I have a server side and a client side, and I am trying to send the InkCanvas stroke collection from the server to the client. This works for simple text, but I can't seem to send the stroke collection. It would be even cooler if you could help me send only the last stroke, so that the data transfers faster. Here is some code of what I've been trying:
sending strokes:
byte[] data;
using (MemoryStream ms = new MemoryStream())
{
inkcanvas.Strokes.Save(ms);
data = ms.ToArray();
}
svr.SendToAll("u getting some strokes");
svr.SendStrokes(data);
svr.SendStrokes(byte[] data):
public void SendStrokes(byte[] data)
{
for (int i = 0; i < no; i++)
{
byte[] dt = data;
accepted[i].Send(dt);
}
MessageBox.Show("dONE");
}
and this is on the clientside:
byte[] buffer=new byte[1024];
MessageBox.Show("receiving strokes");
int rec = conn.Receive(buffer, 0, buffer.Length, 0);
if (rec <= 0)
throw new SocketException();
MessageBox.Show("strokes received");
//Array.Resize(ref buffer, rec);
using (MemoryStream ms = new MemoryStream(buffer))
{
inkcanvas.Strokes = new System.Windows.Ink.StrokeCollection(ms);
ms.Close();
}
MessageBox.Show("strokes added");
These exact same methods work perfectly for strings, but when I try it with the StrokeCollection it fails. Nothing shows up on the client, and I get the following SocketException on the server side: An existing connection was forcibly closed by the remote host.
But if you guys have a better way to do this, that would be great. Is there something I am missing? I mean, if it works for text transformed into a byte array, why wouldn't it work for this StrokeCollection?
Thanks!
EDIT: Do you think you could help me out with some sample code? I can't really seem to implement it.
You forgot to either design or implement a protocol! You can't just send a bunch of bytes over TCP and assume the receiver will be able to make sense out of it.
You have an application message that consists of a bunch of strokes which you are trying to send over TCP. But TCP is a byte stream service, not an application message service. So you need some kind of application message protocol to package the message for transport and unpackage it on receipt. But you have not written any such code. So you're basically expecting it to work by magic.
Define a protocol. For example, it might say:
Canvas strokes shall be sent by a '1' byte indicating canvas strokes, followed by 4 bytes indicating the number of bytes in the strokes object in network byte order, followed by the stroke data. The receiver will read the first byte and identify that it's a canvas strokes object. Then the receiver will read the next four bytes to determine the length. The receiver shall accumulate that number of bytes (using multiple reads if necessary) and then process the reconstructed canvas strokes object.
Do not skip the step of creating a written protocol definition.
Then, when you have problems, follow this handy troubleshooting guide:
Does the sender follow the specification? If not, stop, the sender is broken.
Does the receiver follow the specification? If not, stop, the receiver is broken.
Stop, the specification is broken.
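As a concrete illustration of that spec (sketched in Python for brevity; the C# side would mirror it with BinaryWriter/BinaryReader and a read loop, and the STROKES value is a hypothetical type byte):

```python
import struct

STROKES = 1  # hypothetical message-type byte for canvas strokes

def encode_message(msg_type, payload):
    # 1 type byte + 4-byte length in network (big-endian) byte order + payload
    return struct.pack(">BI", msg_type, len(payload)) + payload

def decode_message(data):
    msg_type, length = struct.unpack(">BI", data[:5])
    body = data[5:5 + length]
    assert len(body) == length, "keep reading: not all bytes have arrived yet"
    return msg_type, body

wire = encode_message(STROKES, b"stroke-bytes")
assert decode_message(wire) == (STROKES, b"stroke-bytes")
```

The length prefix is what lets the receiver accumulate multiple reads until it has a complete strokes object, instead of hoping a single 1024-byte Receive() caught everything.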
If you want simple, you can convert the data into base64 and encode each message as a line of text. That will allow you to use a ReadLine function to grab exactly one message. Then you can just use a message format like an "S" (for "strokes") followed by the data in base64 format. Use a WriteLine function to send the text message followed by a newline.
I think you forgot to reset the MemoryStream position. You should set the stream position to 0 before you read from it, because after Strokes.Save the position is at the end of the stream.
