I'm trying to send an image using a TCP socket. The client connects to the server without any problems and starts to receive the data. The problem is that when I try to convert the stream to an image using the FromStream() method, I get an OutOfMemoryException. Can anyone help me out? It's really important! Here is the code:
Client snippet:
private void btnConnect_Click(object sender, EventArgs e)
{
IPAddress ipAddress = IPAddress.Parse("127.0.0.1");
TcpClient client = new TcpClient();
client.Connect(ipAddress, 9500);
NetworkStream nNetStream = client.GetStream();
while (client.Connected)
{
lblStatus.Text = "Connected...";
byte[] bytes = new byte[client.ReceiveBufferSize];
int i;
if (nNetStream.CanRead)
{
nNetStream.Read(bytes, 0, bytes.Length);
Image returnImage = Image.FromStream(nNetStream); //exception occurs here
pictureBox1.Image = returnImage;
}
else
{
client.Close();
nNetStream.Close();
}
}
client.Close();
}
Server snippet:
try
{
IPAddress ipAddress = Dns.Resolve("localhost").AddressList[0];
TcpListener server = new TcpListener(ipAddress, 9500);
server.Start();
Console.WriteLine("Waiting for client to connect...");
while (true)
{
if (server.Pending())
{
Bitmap tImage = new Bitmap(Image URL goes here);
byte[] bStream = ImageToByte(tImage);
while (true)
{
TcpClient client = server.AcceptTcpClient();
Console.WriteLine("Connected");
while (client.Connected)
{
NetworkStream nStream = client.GetStream();
nStream.Write(bStream, 0, bStream.Length);
}
}
}
}
}
catch (SocketException e1)
{
Console.WriteLine("SocketException: " + e1);
}
}
static byte[] ImageToByte(System.Drawing.Image iImage)
{
MemoryStream mMemoryStream = new MemoryStream();
iImage.Save(mMemoryStream, System.Drawing.Imaging.ImageFormat.Gif);
return mMemoryStream.ToArray();
}
Thanks a lot in advance.
There are a couple of things wrong, possibly including the protocol you are using. First, the client:
If you expect a single image, there is no need for the while loop.
Your client first does a Read, which consumes some of the data from the server into the buffer, and then it calls Image.FromStream(nNetStream), which will see only the remaining, incomplete data.
Whenever you read from a stream, keep in mind that a single Read call is not guaranteed to fill your buffer. It can return any number of bytes between 0 and your buffer size. If it returns 0, it means there is no more to read. In your case this also means that your client currently has no way of knowing how much to read from the server. A solution here is to have the server send the length of the image as the first piece of information. Another solution would be to have the server disconnect the client after it has sent the information. This may be acceptable in your case, but it will not work if you need persistent connections (e.g. pooled connections on client side).
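For the length-prefix option, the receive side could look roughly like this (a sketch, assuming the server writes a 4-byte length before the image bytes; it is not part of the original code):
// Sketch: read a 4-byte length prefix, then exactly that many bytes.
// Assumes the server sends BitConverter.GetBytes(imageBytes.Length) first.
static byte[] ReadLengthPrefixed(Stream stream)
{
    byte[] lengthBytes = ReadExactly(stream, 4);
    int length = BitConverter.ToInt32(lengthBytes, 0);
    return ReadExactly(stream, length);
}

static byte[] ReadExactly(Stream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed before all data arrived.");
        offset += read;
    }
    return buffer;
}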
The client should look something like this (assuming the server will disconnect it after it sends the data):
IPAddress ipAddress = IPAddress.Parse("127.0.0.1");
using (TcpClient client = new TcpClient())
{
client.Connect(ipAddress, 9500);
lblStatus.Text = "Connected...";
NetworkStream nNetStream = client.GetStream();
Image returnImage = Image.FromStream(nNetStream);
pictureBox1.Image = returnImage;
}
Next, the server:
Instead of Pending, you can simply accept the client
The server sends the stream over and over again to the same client, until they disconnect. Instead, send it only once.
The server loop should look something like this:
Bitmap tImage = new Bitmap(Image URL goes here);
byte[] bStream = ImageToByte(tImage);
while (true)
{
// The 'using' here will call Dispose on the client after data is sent.
// This will disconnect the client
using (TcpClient client = server.AcceptTcpClient())
{
Console.WriteLine("Connected");
NetworkStream nStream = client.GetStream();
try
{
nStream.Write(bStream, 0, bStream.Length);
}
catch (SocketException e1)
{
Console.WriteLine("SocketException: " + e1);
}
}
}
This part looks funky to me:
byte[] bytes = new byte[client.ReceiveBufferSize];
int i;
if (nNetStream.CanRead)
{
nNetStream.Read(bytes, 0, bytes.Length);
Image returnImage = Image.FromStream(nNetStream); //exception occurs here
First you read client.ReceiveBufferSize bytes into the "bytes" array, and then you proceed to construct the image from what's left on the stream. What about the bytes you just read into "bytes"?
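One way to fix it, if the server closes the connection after it sends (as suggested in the answer above), is to copy everything from the network stream into a MemoryStream first and decode the image from that. A rough sketch, not the poster's code:
// Sketch: accumulate every byte the server sends, then decode the image once.
MemoryStream ms = new MemoryStream();
byte[] buffer = new byte[8192];
using (NetworkStream nNetStream = client.GetStream())
{
    int read;
    while ((read = nNetStream.Read(buffer, 0, buffer.Length)) > 0)
    {
        ms.Write(buffer, 0, read); // only the bytes actually read, not the whole buffer
    }
}
ms.Position = 0;
pictureBox1.Image = Image.FromStream(ms); // keep ms alive as long as the Image is in use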
I recommend you use this code (I created it myself and tested it, and it works perfectly):
public Bitmap ConvertByteArrayToBitmap(byte[] receivedBytes)
{
    MemoryStream ms = new MemoryStream(receivedBytes);
    return new Bitmap(ms); // I recommend using the PNG format for the image data
}
Use this to convert received byteArray to an image.
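For example (assuming receivedBytes already holds the complete image; this usage is my own sketch, not from the answer):
// Hypothetical usage: receivedBytes must contain the complete image data.
Bitmap bmp = ConvertByteArrayToBitmap(receivedBytes);
pictureBox1.Image = bmp;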
It seems like your server is sending the image over and over again:
while (client.Connected)
{
NetworkStream nStream = client.GetStream();
nStream.Write(bStream, 0, bStream.Length);
}
If the server can send data fast enough, the client will probably keep receiving it.
Here's a solution:
Server side:
tImage.Save(new NetworkStream(client), System.Drawing.Imaging.ImageFormat.Png);
Client side:
byte[] b = new byte[client.ReceiveBufferSize];
client.Receive(b);
MemoryStream ms = new MemoryStream(b);
Image receivedImag = Image.FromStream(ms);
or :
Image receivedImag = Image.FromStream(new NetworkStream(client));
Related
I have the following client code that writes to my server pipe. I was able to read it on the server side, but before I could reply, the client would already try to read the still-empty pipe. How do you wait on the NamedPipeClientStream?
using (NamedPipeClientStream pipe = new NamedPipeClientStream(".", pipename, PipeDirection.InOut))
{
pipe.Connect(5000);
pipe.ReadMode = PipeTransmissionMode.Byte;
byte[] ba = Encoding.Default.GetBytes("hello world");
pipe.Write(ba, 0, ba.Length);
var result = await Task.Run(() => {
// this would return as soon as Server finished reading
// but then server hasn't wrote anything back yet
pipe.WaitForPipeDrain();
// sample code on how i am planning to read, not tested,
// since the pipe is still empty at this point
using (StreamReader reader = new StreamReader(pipe))
using (MemoryStream ms = new MemoryStream())
{
reader.BaseStream.CopyTo(ms);
return Encoding.Default.GetString(ms.ToArray());
}
});
return result;
}
I don't think I should be using WaitForPipeDrain, but then there are no other options for me to wait, or to know when it is ready to read. There are many examples out there, but none of them shows the proper way for the client to wait for a response.
The example from Microsoft seems to use ReadLine() and leverage the EOL character when you send string data, but I'm dealing with byte[] data ("hello world" is just there to get some bytes).
You don't need to wait for data. NamedPipeClientStream represents a stream of bytes (it derives from System.IO.Stream), and if no data is currently available, reading from the pipe (or from a StreamReader that wraps it) will simply block until data arrives.
For transferring textual data, reading with StreamReader.ReadLine() and writing with StreamWriter.WriteLine() will work fine. To transfer binary data, you can either encode it into textual form (for example using base64 encoding) and keep using StreamReader.ReadLine() / StreamWriter.WriteLine() (a sketch of that approach follows the code below), or you can set the server and client pipes' transmission mode to PipeTransmissionMode.Message and transfer each byte array as a single message, as follows (error checks omitted for brevity):
Client:
class Client
{
static async Task Main(string[] args)
{
using (NamedPipeClientStream pipe = new NamedPipeClientStream(".", "testpipe", PipeDirection.InOut))
{
pipe.Connect(5000);
pipe.ReadMode = PipeTransmissionMode.Message;
byte[] ba = Encoding.Default.GetBytes("hello world");
pipe.Write(ba, 0, ba.Length);
var result = await Task.Run(() => {
return ReadMessage(pipe);
});
Console.WriteLine("Response received from server: " + Encoding.UTF8.GetString(result));
Console.ReadLine();
}
}
private static byte[] ReadMessage(PipeStream pipe)
{
byte[] buffer = new byte[1024];
using (var ms = new MemoryStream())
{
do
{
var readBytes = pipe.Read(buffer, 0, buffer.Length);
ms.Write(buffer, 0, readBytes);
}
while (!pipe.IsMessageComplete);
return ms.ToArray();
}
}
}
Server:
class Server
{
static void Main(string[] args)
{
using (NamedPipeServerStream pipeServer = new NamedPipeServerStream(
"testpipe",
PipeDirection.InOut,
NamedPipeServerStream.MaxAllowedServerInstances,
PipeTransmissionMode.Message))//Set TransmissionMode to Message
{
// Wait for a client to connect
Console.Write("Waiting for client connection...");
pipeServer.WaitForConnection();
Console.WriteLine("Client connected.");
//receive message from client
var messageBytes = ReadMessage(pipeServer);
Console.WriteLine("Message received from client: " + Encoding.UTF8.GetString(messageBytes));
//prepare some response
var response = Encoding.UTF8.GetBytes("Hallo from server!");
//send response to a client
pipeServer.Write(response, 0, response.Length);
Console.ReadLine();
}
}
private static byte[] ReadMessage(PipeStream pipe)
{
byte[] buffer = new byte[1024];
using (var ms = new MemoryStream())
{
do
{
var readBytes = pipe.Read(buffer, 0, buffer.Length);
ms.Write(buffer, 0, readBytes);
}
while (!pipe.IsMessageComplete);
return ms.ToArray();
}
}
}
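For completeness, the base64 alternative mentioned above could look roughly like this (my sketch, not part of the original answer; pipe stands for either the client or server pipe stream, and myBytes is a placeholder):
// Sketch of the base64 alternative: keep the pipe in the default Byte mode and
// frame each binary payload as one base64-encoded text line.
// Create the reader/writer once and reuse them; disposing them would close the pipe.
var writer = new StreamWriter(pipe) { AutoFlush = true };
var reader = new StreamReader(pipe);

// Send a byte[] as a single line:
writer.WriteLine(Convert.ToBase64String(myBytes));

// Receive a byte[] sent the same way from the other side:
string line = reader.ReadLine();
byte[] received = line != null ? Convert.FromBase64String(line) : new byte[0];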
I have a C# desktop app. It connects to another PC on my network, which runs a UWP C# app.
I am trying to send an image or two to my listening socket, and to test this I get the listening socket to send the image back to me.
The trouble is that even though my server receives all the bytes that were originally sent, the image sent back to the client is not the same size.
To make this even weirder, sometimes the returned bytes are correct and I get the whole image, and when I attempt to send two images the first one is OK and the second one is not.
Then it can revert back to no images being sent back correctly.
I think it may be to do with the async/await parts, but I am not sure how.
This is my server code:
using (IInputStream input = args.Socket.InputStream)
{
byte[] data = new byte[BufferSize];
IBuffer buffer = data.AsBuffer();
uint dataRead = BufferSize;
while (dataRead == BufferSize)
{
await input.ReadAsync(buffer, BufferSize, InputStreamOptions.Partial);
requestInBytes.AddRange(data.Take((int) buffer.Length));
dataRead = buffer.Length;
}
}
var ct = requestInBytes.Count;
I then strip out the header info:
int counter = 0;
counter = requestCommand[0].Length;
counter = counter + requestCommand[1].Length;
counter = counter + requestCommand[2].Length;
counter = counter + requestCommand[3].Length;
counter = counter + requestCommand[4].Length;
counter = counter + requestCommand[5].Length;
counter = counter + 6;
Now I extract the image:
var imgBody = new byte[totalBytes.Length- counter];
System.Buffer.BlockCopy(totalBytes, counter, imgBody, 0, imgBody.Length);
byteArray = imgBody;
And send just the image back:
using (IOutputStream output = args.Socket.OutputStream)
{
using (Stream response = output.AsStreamForWrite())
{
MemoryStream stream = new MemoryStream(byteArray);
await response.WriteAsync(byteArray, 0, byteArray.Length);
await response.FlushAsync();
}
}
This is my client code:
StringBuilder sb = new StringBuilder();
foreach (var gallery in Shared.CurrentJobGallery)
{
try
{
sb.Clear();
sb.Append(GeneralTags.ACTION_ADD);
sb.Append(Shared.DLE);
sb.Append("GALLERY");
sb.Append(Shared.DLE);
sb.Append(Shared.CurrentClientId);
sb.Append(Shared.DLE);
sb.Append(gallery.Title);
sb.Append(Shared.DLE);
sb.Append(gallery.Description);
sb.Append(Shared.DLE);
sb.Append(jobRef);
sb.Append(Shared.DLE);
byte[] galleryHdr = Encoding.UTF8.GetBytes(sb.ToString());
byte[] byteArray = new byte[galleryHdr.Length + gallery.ImageData.Length];
Buffer.BlockCopy(galleryHdr, 0, byteArray, 0, galleryHdr.Length);
Buffer.BlockCopy(gallery.ImageData, 0, byteArray, galleryHdr.Length, gallery.ImageData.Length);
List<byte> requestInBytes2 = new List<byte>();
System.Diagnostics.Debug.WriteLine("SENT: " + gallery.ImageData.Length.ToString());
using (TcpClient clientSocket = new TcpClient())
{
await clientSocket.ConnectAsync(GeneralTags.RASPBERRY_PI_IP_ADDRESS, GeneralTags.RASPBERRY_PI_PORT);
using (NetworkStream serverStream = clientSocket.GetStream())
{
List<byte> requestInBytes = new List<byte>();
serverStream.Write(byteArray, 0, byteArray.Length);
serverStream.Flush();
int i;
Byte[] bytes = new Byte[1024];
do
{
i = serverStream.Read(bytes, 0, bytes.Length);
byte[] receivedBuffer = new byte[i];
Array.Copy(bytes, receivedBuffer, i);
requestInBytes2.AddRange(receivedBuffer);
} while (serverStream.DataAvailable);
}
}
using (MemoryStream ms = new MemoryStream())
{
System.Diagnostics.Debug.WriteLine("BACK: " + requestInBytes2.Count.ToString());
ms.Write(requestInBytes2.ToArray(), 0, requestInBytes2.ToArray().Length);
Shared.ViewImage(Image.FromStream(ms, true));
}
}
catch (Exception ex)
{
System.Diagnostics.Debug.WriteLine(ex.ToString());
}
}
Your problem is that TCP sockets are based around streams, not packets. It's true that "on the wire" everything is a packet, but when you're using TCP, you have no control over how the data is split up into packets or is reassembled into a stream.
In particular, this line of code is incorrect:
await input.ReadAsync(buffer, BufferSize, InputStreamOptions.Partial);
According to the docs, you must use the buffer returned from ReadAsync. Also note that this buffer may be a partial image, and it's up to your code to detect that situation, read more if necessary, and append those blocks together. Also, the buffer may contain part of one image and part of the next image; it's also up to your code to detect that and handle it correctly.
For this reason, most TCP applications use some form of message framing (described in more detail on my blog). Note that getting this right is surprisingly hard.
I strongly recommend that you use SignalR instead of raw TCP sockets. SignalR handles message framing for you, and it is capable of self-hosting (i.e., it does not require ASP.NET).
Here's the problem: I've successfully set up an app that will pull individual frames of video from a phone and move them to the server. But as you know, the inherent latency won't allow for a smooth transition on the server side. I'm guessing I need to package a number of images in a single byte array and unpack them on the server side. But is there a clean/simple way to do this? Could I use a multi-dimensional array? Something else?
Here's what I got client side (payload is an incoming image converted to a single byte array):
if (_socket != null)
{
SocketAsyncEventArgs socketEventArg = new SocketAsyncEventArgs();
socketEventArg.RemoteEndPoint = _socket.RemoteEndPoint;
socketEventArg.UserToken = null;
socketEventArg.Completed += new EventHandler<SocketAsyncEventArgs>(delegate(object s, SocketAsyncEventArgs e)
{
response = e.SocketError.ToString();
// Unblock the UI thread
_clientDone.Set();
});
socketEventArg.SetBuffer(payload, 0, payload.Length);
_clientDone.Reset();
_socket.SendAsync(socketEventArg);
_clientDone.WaitOne(TIMEOUT_MILLISECONDS);
}
And here's what I'm doing on the server side:
const int portNo = portNumber;
TcpListener listener = new TcpListener(portNo);
listener.Start();
Console.WriteLine("Listening...");
TcpClient tcpClient = listener.AcceptTcpClient();
NetworkStream NWStream = tcpClient.GetStream();
byte[] bytesToRead = new byte[tcpClient.ReceiveBufferSize + 1];
FrameCount += 1;
int numBytesRead = NWStream.Read(bytesToRead, 0, Convert.ToInt32(tcpClient.ReceiveBufferSize));
string FILE_NAME = "c:\\frames\\" + FrameCount.ToString() + ".gif";
System.IO.FileStream fs = null;
fs = new FileStream(FILE_NAME, FileMode.CreateNew, FileAccess.Write);
fs.Write(bytesToRead, 0, numBytesRead);
fs.Close();
tcpClient.Close();
listener.Stop();
StartImage();
As long as your images are larger than the TCP packet size (I think it's 1024 by default on Windows), you won't gain performance by sending multiple images in one Send/Receive.
However, sending multiple images per socket connection is useful because you will avoid the overhead of multiple connection handshakes.
Before you optimize your packets, try sending multiple images one after another on the same socket connection, as in the sketch below.
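For example, the sender could keep one connection open and prefix each image with its length so the receiver can tell where one frame ends and the next begins (a rough sketch; frames, serverAddress and portNumber are placeholders, not from the original code):
// Sketch: one TCP connection, many frames, each prefixed with its 4-byte length.
using (TcpClient client = new TcpClient())
{
    client.Connect(serverAddress, portNumber);
    NetworkStream stream = client.GetStream();
    foreach (byte[] frame in frames) // frames: your images already encoded as byte arrays
    {
        byte[] prefix = BitConverter.GetBytes(frame.Length);
        stream.Write(prefix, 0, prefix.Length);
        stream.Write(frame, 0, frame.Length);
    }
}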
I'm trying C# sockets to send images. It works, but it's unstable. The images sent through are quite large and are updated very quickly, which causes it to flicker every now and then. I'm looking for a way to compress the data sent, if possible. I'm using this code:
Server side:
System.IO.MemoryStream stream = new System.IO.MemoryStream();
// !! Code here that captures the screen !!
bitmap.Save(stream, myImageCodecInfo, myEncoderParameters);
byte[] imageBytes = stream.ToArray();
stream.Dispose();
// Send the image
clientSocket.Send(imageBytes);
// Empty the byte array?
for (int i = 0; i < imageBytes.Length; i++)
{
imageBytes[i] = 0;
}
Client side:
private void OnConnect(IAsyncResult ar)
{
try
{
MessageBox.Show("Connected");
//Start listening to the data asynchronously
clientSocket.BeginReceive(byteData,
0,
byteData.Length,
SocketFlags.None,
new AsyncCallback(OnReceive),
null);
}
catch (Exception ex)
{
MessageBox.Show(ex.Message, "Stream Error",MessageBoxButtons.OK, MessageBoxIcon.Error);
}
}
private void OnReceive(IAsyncResult ar)
{
try
{
int byteCount = clientSocket.EndReceive(ar);
// Display the image on the pictureBox
MemoryStream ms = new MemoryStream(byteData);
pictureBox1.Image = Image.FromStream(ms);
}
catch (ArgumentException e)
{
//MessageBox.Show(e.Message);
}
clientSocket.BeginReceive(byteData,0,byteData.Length,SocketFlags.None,new AsyncCallback(OnReceive),null);
}
I ended up using gzip.
Turns out the flicker wasn't due to it being updated very quickly, but was because of the way I had sockets set up. It wasn't sending the whole image.
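For reference, compressing the image bytes with GZipStream before sending and decompressing them on the other end can look roughly like this; it's a sketch of the general idea, not the poster's actual code:
// Requires System.IO and System.IO.Compression.
static byte[] Compress(byte[] raw)
{
    using (var output = new MemoryStream())
    {
        using (var gzip = new GZipStream(output, CompressionMode.Compress))
        {
            gzip.Write(raw, 0, raw.Length);
        }
        // Closing the GZipStream flushes the final compressed block;
        // MemoryStream.ToArray still works after the streams are closed.
        return output.ToArray();
    }
}

static byte[] Decompress(byte[] compressed)
{
    using (var input = new MemoryStream(compressed))
    using (var gzip = new GZipStream(input, CompressionMode.Decompress))
    using (var output = new MemoryStream())
    {
        gzip.CopyTo(output);
        return output.ToArray();
    }
}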
I am trying to set up two programs in C#. Basically, a simple client server set up where I want the server to listen for an image from the client. Then, upon receiving the image, will display it in a PictureBox.
I keep running into the following error:
A first chance exception of type 'System.ArgumentException' occurred in System.Drawing.dll
The error is happening on the server code that is listening at this line:
Image bmp = Image.FromStream(ms);
Any ideas?
The Server code that listens:
using System;
using System.Collections.Generic;
using System.ComponentModel;
using System.Data;
using System.Drawing;
using System.Linq;
using System.Text;
using System.Windows.Forms;
using System.IO;
using System.Net;
using System.Net.Sockets;
namespace NetView
{
public partial class Form1 : Form
{
public Form1()
{
InitializeComponent();
startListening();
}
private void startListening()
{
////////////////////////////////////////////
Console.WriteLine("Server is starting...");
byte[] data = new byte[1024];
IPEndPoint ipep = new IPEndPoint(IPAddress.Any, 9050);
Socket newsock = new Socket(AddressFamily.InterNetwork,
SocketType.Stream, ProtocolType.Tcp);
newsock.Bind(ipep);
newsock.Listen(10);
Console.WriteLine("Waiting for a client...");
Socket client = newsock.Accept();
IPEndPoint newclient = (IPEndPoint)client.RemoteEndPoint;
Console.WriteLine("Connected with {0} at port {1}",
newclient.Address, newclient.Port);
while (true)
{
data = ReceiveVarData(client);
MemoryStream ms = new MemoryStream(data);
try
{
Image bmp = Image.FromStream(ms);
pictureBox1.Image = bmp;
}
catch (ArgumentException e)
{
Console.WriteLine("something broke");
}
if (data.Length == 0)
newsock.Listen(10);
}
//Console.WriteLine("Disconnected from {0}", newclient.Address);
client.Close();
newsock.Close();
/////////////////////////////////////////////
}
private static byte[] ReceiveVarData(Socket s)
{
int total = 0;
int recv;
byte[] datasize = new byte[4];
recv = s.Receive(datasize, 0, 4, 0);
int size = BitConverter.ToInt32(datasize, 0);
int dataleft = size;
byte[] data = new byte[size];
while (total < size)
{
recv = s.Receive(data, total, dataleft, 0);
if (recv == 0)
{
break;
}
total += recv;
dataleft -= recv;
}
return data;
}
private void Form1_Load(object sender, EventArgs e)
{
}
}
}
The Client Code:
using System;
using System.Net;
using System.Net.Sockets;
using System.Text;
using System.Drawing;
using System.IO;
namespace ConsoleApplication2
{
class Program
{
static void Main(string[] args)
{
byte[] data = new byte[1024];
int sent;
IPEndPoint ipep = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 9050);
Socket server = new Socket(AddressFamily.InterNetwork,
SocketType.Stream, ProtocolType.Tcp);
try
{
server.Connect(ipep);
}
catch (SocketException e)
{
Console.WriteLine("Unable to connect to server.");
Console.WriteLine(e.ToString());
Console.ReadLine();
}
Bitmap bmp = new Bitmap("c:\\eek256.jpg");
MemoryStream ms = new MemoryStream();
// Save to memory using the Jpeg format
bmp.Save(ms, System.Drawing.Imaging.ImageFormat.Jpeg);
// read to end
byte[] bmpBytes = ms.GetBuffer();
bmp.Dispose();
ms.Close();
sent = SendVarData(server, bmpBytes);
Console.WriteLine("Disconnecting from server...");
server.Shutdown(SocketShutdown.Both);
server.Close();
Console.ReadLine();
}
private static int SendVarData(Socket s, byte[] data)
{
int total = 0;
int size = data.Length;
int dataleft = size;
int sent;
byte[] datasize = new byte[4];
datasize = BitConverter.GetBytes(size);
sent = s.Send(datasize);
while (total < size)
{
sent = s.Send(data, total, dataleft, SocketFlags.None);
total += sent;
dataleft -= sent;
}
return total;
}
}
}
ArgumentException tells you that the image format in the stream is invalid, which is probably caused by the client application closing the memory stream before the data was sent.
Try replacing byte[] bmpBytes = ms.GetBuffer(); with
byte[] bmpBytes = ms.ToArray();
Or close the stream after the data has been sent.
Remember that .GetBuffer() returns the underlying array, not a copy of it (.ToArray() returns a copy), and that array is only valid as long as the parent stream is.
If you have access to the JPG file itself (as in the example), you should send the file bytes and not use the Image/Bitmap classes. By reading a JPG file and re-encoding into JPG you are decreasing the image quality and incurring unnecessary overhead. You can use File.ReadAllBytes() to quickly get the complete byte[] or read/send it in pieces if your memory space is limited.
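For example, reusing the question's SendVarData helper (the path is just the one from the question):
// Send the JPG file bytes as-is, without decoding and re-encoding the image.
byte[] fileBytes = File.ReadAllBytes("c:\\eek256.jpg");
sent = SendVarData(server, fileBytes);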
A better way to send the image would be to use BinaryFormatter.
e.g., some snippets from my own code to send an image every second:
sending:
TcpClient client = new TcpClient();
try
{
client.Connect(address, port);
// Retrieve the network stream.
NetworkStream stream = client.GetStream();
MessageData data = new MessageData(imageToSend);
IFormatter formatter = new BinaryFormatter();
while(true)
{
formatter.Serialize(stream, data);
Thread.Sleep(1000);
data.GetNewImage();
}
}
receiving:
TcpListener listener = new TcpListener(address, port);
listener.Start();
try
{
using (TcpClient client = listener.AcceptTcpClient())
{
stream = client.GetStream();
IFormatter formatter = new BinaryFormatter();
while (true)
{
MessageData data = (MessageData)formatter.Deserialize(stream);
if (ImageReceivedEvent != null) ImageReceivedEvent(data.Picture);
}
}
}
and the MessageData class simply holds the image and has the [Serializable] attribute.
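The MessageData class itself isn't shown; a minimal version consistent with how it's used above might look like this (my own sketch, not the answerer's class; the GetNewImage body is a placeholder):
// Hypothetical MessageData: holds the image to send and can refresh it.
[Serializable]
class MessageData
{
    public Image Picture { get; private set; }

    public MessageData(Image imageToSend)
    {
        Picture = imageToSend;
    }

    public void GetNewImage()
    {
        // Placeholder: capture or load the next frame here.
        Picture = new Bitmap(640, 480);
    }
}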
Use Arul's code to get the data to send correctly -- you want .ToArray(), not .GetBuffer(). Then, you'll want to run the server's 'startListening' method on a background thread or you won't actually see anything (as the form thread will be busy running the server code). Try:
var t = new Thread(startListening);
t.IsBackground = true;
t.Start();
In your Form1_Load method instead of directly calling startListening in your constructor.
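One more thing to watch for (my note, not part of the original answer): once startListening runs on a background thread, the pictureBox1.Image assignment has to be marshalled back to the UI thread, for example:
// Marshal the UI update back to the form's thread.
pictureBox1.Invoke((MethodInvoker)(() => pictureBox1.Image = bmp));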
Here is code that works for me. The user starts the server with a button and the client selects a photo through the computer's file dialog, then sends the image. Be careful about the photo size, because a single UDP datagram cannot carry very large data.
Server:
delegate void showMessageInThread(string message);
UdpClient listener = null;
IPEndPoint endpoint = null;
Thread listenThread = null;
public Form1()
{
InitializeComponent();
}
private void Form1_Load(object sender, EventArgs e)
{
}
private void button1_Click(object sender, EventArgs e)
{
startServer();
}
private void startServer()
{
endpoint = new IPEndPoint(IPAddress.Any, 1234);
listener = new UdpClient(endpoint);
ShowMsg("Waiting for a client!");
listenThread = new Thread(new ThreadStart(Listening));
listenThread.Start();
}
private void Listening()
{
while (true)
{
//take the coming data
byte[] comingDataFromClient = listener.Receive(ref endpoint);
ImageConverter convertData = new ImageConverter();
Image image = (Image)convertData.ConvertFrom(comingDataFromClient);
// Marshal the assignment to the UI thread, since Listening runs on a worker thread.
pictureBox1.Invoke((MethodInvoker)(() => pictureBox1.Image = image));
}
}
private void ShowMsg(string msg)
{
this.richTextBox1.Text += msg + "\r\n";
}
Client:
Socket server = null;
MemoryStream ms;
IPEndPoint endpoint = null;
public Form1()
{
InitializeComponent();
}
private void Form1_Load(object sender, EventArgs e)
{
server = new Socket(AddressFamily.InterNetwork,
SocketType.Dgram, ProtocolType.Udp);
endpoint = new IPEndPoint(IPAddress.Parse("127.0.0.1"), 1234);
}
private void btn_browse_Click(object sender, EventArgs e)
{
openFileDialog1.ShowDialog();
string path = openFileDialog1.FileName;
pictureBox1.Image = Image.FromFile(path);
textPath.Text = path;
}
private void btn_send_Click(object sender, EventArgs e)
{
try
{
ms = new MemoryStream();
Bitmap bmp = new Bitmap(this.openFileDialog1.FileName);
bmp.Save(ms, ImageFormat.Jpeg);
byte[] byteArray = ms.ToArray();
server.Connect(endpoint);
server.SendTo(byteArray, endpoint);
    }
    catch (Exception ex)
    {
    }
}
data = ReceiveVarData(client);
MemoryStream ms = new MemoryStream(data);
Image bmp = Image.FromStream(ms);
pictureBox1.Image = bmp;
The error may be due to a corrupted or incomplete bmp image received in the MemoryStream.
It worked fine for me after increasing the socket send/receive buffer values.
Adjust the sender's Socket.SendBufferSize and the receiver's Socket.ReceiveBufferSize to large values, for example 1024 * 2048 * 10.
This will help send the entire image at once.
Another solution is to check whether the received data size (data.Length) matches the sent image data size before the line that builds the received image stream.
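For the size check, something like this inside ReceiveVarData after the receive loop would do (a sketch; note that in the posted code data.Length always equals size because the array is allocated up front, so the useful comparison is total against size):
// Sketch: warn when fewer bytes arrived than the length prefix promised.
if (total < size)
{
    Console.WriteLine("Incomplete image: expected {0} bytes, received {1}", size, total);
}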