I have the following code to send a picture to a receiving application
public static void sendFile(string file, string ip)
{
    using (TcpClient client = new TcpClient())
    {
        client.Connect(IPAddress.Parse(ip), 44451);
        //Console.WriteLine(ip);
        NetworkStream nwStream = client.GetStream();
        MemoryStream ms = new MemoryStream();
        Image x = Image.FromFile(file);
        x.Save(ms, x.RawFormat);
        byte[] bytesToSend = ms.ToArray();
        nwStream.Write(bytesToSend, 0, bytesToSend.Length);
        nwStream.Flush();
        client.Close();
    }
}
and I'm receiving the file on the other end with this
NetworkStream nwStream = clientCopy.GetStream();
byte[] buffer = new byte[clientCopy.ReceiveBufferSize];

//---read incoming stream---
int bytesRead = nwStream.Read(buffer, 0, clientCopy.ReceiveBufferSize);
MemoryStream ms = new MemoryStream(buffer);
Image returnImage = Image.FromStream(ms);
//ms.Flush();
//ms.Close();

String path;
if (!Directory.Exists(path = @"C:\Users\acer\AppData\Roaming\test"))
{
    Directory.CreateDirectory(@"C:\Users\acer\AppData\Roaming\test");
}

string format;
if (ImageFormat.Jpeg.Equals(returnImage.RawFormat))
{
    format = ".jpg";
}
else if (ImageFormat.Png.Equals(returnImage.RawFormat))
{
    format = ".png";
}
else
{
    format = ".jpg";
}

returnImage.Save(@"C:\Users\acer\AppData\Roaming\test\default_pic" + format, returnImage.RawFormat);
If I'm sending a small picture (under about 20 KB), the file is received 100% on the other end, but if I send a file of around 100 KB or more, the picture arrives with only half of the image loaded. I'm aware of the approach of reading the stream until all the data has been read, but I don't know how to implement it correctly.
Thank you
You're only calling Read once, which certainly isn't guaranteed to read all the bytes. You could either loop, calling Read and copying the relevant number of bytes on each iteration, or you could use Stream.CopyTo:
var imageStream = new MemoryStream();
nwStream.CopyTo(imageStream);
// Rewind so that anything reading the data will read from the start
imageStream.Position = 0;
... or you could just read the image straight from the network stream:
// No need for another stream...
Image returnImage = Image.FromStream(nwStream);
(It's possible that would fail due to the stream being non-seekable... in which case using CopyTo as above would be the simplest option.)
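The manual loop option mentioned above could look roughly like this (a sketch reusing nwStream from the receiving code in the question; it assumes the sender closes the connection when it has finished writing, so that Read eventually returns 0):

// Accumulate the whole stream into a MemoryStream one chunk at a time.
var imageStream = new MemoryStream();
byte[] chunk = new byte[8192];
int read;
while ((read = nwStream.Read(chunk, 0, chunk.Length)) > 0)
{
    imageStream.Write(chunk, 0, read);
}
imageStream.Position = 0; // rewind before handing the data to Image.FromStream
Image returnImage = Image.FromStream(imageStream);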
The TCP protocol (like any other stream protocol) can't be used to transfer data as-is. Most of the time it is impossible to know whether all the data has arrived, or whether an unrelated chunk of data has been received together with the expected one. Therefore you almost always need to define an underlying protocol, for example by sending a message header (as in HTTP) or defining a message separator (like the line break in Telnet; however, separators are impractical for large messages). In the simplest case it is enough to define a very small header that contains only the length of the message.
Thus, in your case you can send a 4-byte image length followed by the image itself. On the server side you read the 4-byte size first and then call Read in a loop until the complete message has been received.
Please note that you can receive more bytes than expected; in that case the last chunk contains the beginning of the next message.
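A minimal sketch of that idea, written as helper methods (the names SendWithLength, ReceiveWithLength, and ReadExactly are mine, not part of any existing API):

// Write a 4-byte little-endian length prefix, then the payload itself.
static void SendWithLength(NetworkStream stream, byte[] payload)
{
    byte[] lengthPrefix = BitConverter.GetBytes(payload.Length);
    stream.Write(lengthPrefix, 0, lengthPrefix.Length);
    stream.Write(payload, 0, payload.Length);
}

// Read the 4-byte length prefix, then exactly that many payload bytes.
static byte[] ReceiveWithLength(NetworkStream stream)
{
    int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
    return ReadExactly(stream, length);
}

// Loop until exactly 'count' bytes have been read, since Read may return fewer.
static byte[] ReadExactly(NetworkStream stream, int count)
{
    byte[] buffer = new byte[count];
    int offset = 0;
    while (offset < count)
    {
        int read = stream.Read(buffer, offset, count - offset);
        if (read == 0)
            throw new EndOfStreamException("Connection closed before the expected data arrived.");
        offset += read;
    }
    return buffer;
}

With these, the sender would call SendWithLength(nwStream, bytesToSend) instead of the bare Write, and the receiver would call Image.FromStream(new MemoryStream(ReceiveWithLength(nwStream))).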
Related
The goal of the project is to stream video captured from a Python host to a C# client via TCP sockets.
Relevant Python 2 server script:
import cv2
import numpy as np
import socket
from threading import Thread
_continue = True
def imageStreamer4():
    global _continue
    cam = cv2.VideoCapture(0)
    camSocket = socket.socket(socket.AF_INET, socket.SOCK_STREAM)
    camSocket.bind(("", 8081))
    camSocket.listen(1)
    # set flip image to false if you don't want the image to be flipped
    flipImage = True
    while _continue:
        try:
            client, address = camSocket.accept()
            print("client connected")
            ret, camImage = cam.read()
            if flipImage:
                camImage = cv2.flip(camImage, 1)
            # uncomment the below code to view the webcam stream locally
            """
            cv2.imshow('image', camImage)
            if cv2.waitKey(1) == 27:
                break  # esc to quit
            """
            byteString = bytes(cv2.imencode('.jpg', camImage)[1].tostring())
            fileSize = len(byteString)
            totalSent = 0
            client.send(str(fileSize).encode())
            sizeConfirmation = client.recv(1024)
            totalSent = 0
            while totalSent < fileSize:
                totalSent += client.send(byteString[totalSent:])
            print(str(fileSize), str(totalSent), sizeConfirmation.decode('utf-8'))
        except Exception as e:
            print(e)
            print("shutting down video stream")
            _continue = False
    print("video stream exited.")
Relevant C# client code:
using System.Collections;
using UnityEngine;
using System;
using System.Net;
using System.Text;
using System.Net.Sockets;
using System.Threading;
using UnityEngine.UI;
using System.IO;
void getVideoStream()
{
    byte[] header;
    int recieved;
    int fileSize;
    NetworkStream dataStream;
    MemoryStream ms;
    while (connectCam)
    {
        fileSize = 0;
        recieved = 0;
        camClient = new TcpClient(camIP, camPort);
        //get header
        dataStream = camClient.GetStream();
        while (!dataStream.DataAvailable)
        {
            //waste time
        }
        header = new byte[1024];
        dataStream.Read(header, 0, header.Length);
        fileSize = Int32.Parse(Encoding.Default.GetString(bytesReducer(header)));
        byte[] result = Encoding.ASCII.GetBytes(fileSize.ToString());
        //send response
        dataStream.Write(result, 0, result.Length);
        ms = new MemoryStream();
        while (!dataStream.DataAvailable)
        {
            //waste time
        }
        while (recieved < fileSize)
        {
            byte[] data = new byte[camClient.ReceiveBufferSize];
            recieved += dataStream.Read(data, 0, data.Length);
            ms.Write(data, 0, data.Length);
        }
        //the below class simply sends function calls from the secondary thread back to the main thread
        UnityMainThreadDispatcher.Instance().Enqueue(convertBytesToTexture(ms.ToArray()));
        dataStream.Close();
        camClient.Close();
    }
}
void convertBytesToTexture(byte[] byteArray)
{
    try
    {
        camTexture.LoadImage(byteArray); //Texture2D object
        camImage.texture = camTexture;   //RawImage object
    }
    catch (Exception e)
    {
        print(e);
    }
}
The byte counts sent and received match as they should. I'm admittedly new to working with sockets, but I'm pretty certain the data is arriving whole and intact. Unfortunately I really have no idea why the image is splitting the way it is (as shown in the image above). If it's relevant at all, both the server and client functions are run on their own separate threads.
I've run the scripts on separate hosts and clients and the results remain the same. If any different information is required to help, just ask. I'll be happy to update as required.
I was able to bring the picture together properly by removing empty space inserted by the following code:
while (recieved < fileSize)
{
    byte[] data = new byte[camClient.ReceiveBufferSize];
    recieved += dataStream.Read(data, 0, data.Length);
    ms.Write(data, 0, data.Length);
}
replacing it with this:
int increment = 0;
while (recieved < fileSize)
{
    byte[] data = new byte[camClient.ReceiveBufferSize];
    increment = dataStream.Read(data, 0, data.Length);
    recieved += increment;
    ms.Write(data.Take(increment).ToArray(), 0, increment);
}
So instead of writing an array the size of the client's buffer into the memory stream (even if it wasn't full), only the number of bytes actually returned by the Read call is written. This effectively removed all the blank space in the received image.
Because TCP is a stream, you should think of it as a flow of bytes, so you need some mechanism to identify each image in that flow. You can apply:
A delimiter:
Use a set of bytes to separate each image (this byte sequence must not appear anywhere inside your stream). For example:
1- send the image bytes
2- send the delimiter
On the client you keep reading until you reach the delimiter.
A length header:
Consider defining a protocol where you send the image size first and then the image data. For example:
1- the first 8 bytes are always the image size
2- then come the image bytes
When you read data on the client, you always read the 8 bytes first, and based on that size you read the rest of the data.
Fixed-size messages:
Another way is to use fixed-size data (which is a bit awkward in your case): the server fills a fixed-size byte array, padding with zeros if the image data is smaller, and the client always reads exactly as many bytes as the server's array size.
For example, the server sends a byte[2048] and the client reads until the number of bytes read reaches 2048 (see the sketch below).
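A rough sketch of that fixed-size variant (the names FrameSize, imageBytes, and stream are illustrative, not taken from the question's code):

const int FrameSize = 2048;

// Server: always send exactly FrameSize bytes, zero-padded.
byte[] frame = new byte[FrameSize];
Array.Copy(imageBytes, frame, imageBytes.Length); // imageBytes.Length must be <= FrameSize
stream.Write(frame, 0, FrameSize);

// Client: keep reading until exactly FrameSize bytes have arrived.
byte[] received = new byte[FrameSize];
int offset = 0;
while (offset < FrameSize)
{
    int read = stream.Read(received, offset, FrameSize - offset);
    if (read == 0) break; // connection closed early
    offset += read;
}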
I have a client application that uses a TcpClient to send a file to a server application that uses a TcpListener to receive it. Sometimes, the file transfers fine. But at other times, the transfer starts but does not finish. After I have read in a varying number of bytes on the server, I get an IOException with the message: "An existing connection was forcibly closed by the remote host."
On the client side, I create a header byte array containing data about my file, including the total size in bytes and a few other bits of data. I combine this byte array with that of the file, and send it to the server.
TcpClient fileClient = new TcpClient();
fileClient.Connect("mydomain.com", 7728);
NetworkStream clientStream = fileClient.GetStream();
ASCIIEncoding encoder = new ASCIIEncoding();
FileStream fs = File.Open(filePath, FileMode.Open);
string header = filePath + "|" + fs.Length.ToString() + "|" + this.ID; //ID = short string of mine
header = header.PadRight(512, '*');
byte[] sizeArray = encoder.GetBytes(header);
byte[] fileBuffer = new byte[fs.Length];
fs.Read(fileBuffer, 0, Convert.ToInt32(fs.Length));
fs.Close();
byte[] buffer = Combine(sizeArray, fileBuffer); //Combine = method I use to combine byte arrays
clientStream.Write(buffer, 0, buffer.Length);
clientStream.Flush();
On the server side, I create a TcpListener and start listening. When I get a connection, I handle it, read my header array (the first 512 bytes), determine the size of the new file, and then use this code to read the file / the rest of the bytes:
int newSize = ... //The size that I sent in my header array
byte[] fileArray = new byte[newSize];
int off = 0;
while (true)
{
    try
    {
        off += clientStream.Read(fileArray, off, fileArray.Length - off);
    }
    catch (Exception ex)
    {
        off = newSize; // Enables partial receive on error, not total loss
    }
    if (off >= newSize)
    {
        break;
    }
}
This is where it sometimes - about 25% of the time - goes wrong. I will get the exception at the off += clientStream.Read(fileArray, off, fileArray.Length - off) line. I surrounded it with a try catch, which enables the application to still get part of the file even when this error occurs. However, I need to be able to get the full file all the time. What am I doing wrong with this code?
The files I am transferring are JPEG images around 100KB in size, and generally take no more than two seconds to transfer on even the slowest of connections. Changing the timeout values for the clients and the listener does nothing to help.
I have a simplistic file server/client application I've written in C#, but I commonly run into the problem that my stream combines two different writes into a single read buffer. I have a synchronized stream, but that still isn't helping. Any suggestions? Thanks!
System.Threading.Thread.Sleep(25);
receive_fspos = new byte[30];
int bytesread = stream_1.Read(receive_fspos, 0, receive_fspos.Length); //this is where it gets combined
if (bytesread == 0)
{
    finished = true;
    System.Threading.Thread.Sleep(25);
}
string string_1 = utf.GetString(receive_fspos).TrimEnd(new char[] { (char)0 });
int fsposition = (int)Convert.ToInt64(string_1);
bytestosend = fsposition;
filestream.Position = fsposition;
byte[] buffer_1 = new byte[bufsize];
int bytesreadfromfs = filestream.Read(buffer_1, 0, buffer_1.Length);
stream_1.Write(buffer_1, 0, buffer_1.Length);
Console.Write("\rSent " + fsposition + " / " + length + " bytes");
finished = true;
I would not recommend writing your own stream method if you do not fully understand it.
The problem you are having is that the incoming data is a stream of bytes, which gives you no way of knowing how long each message is.
In the code below you are stating that you would like to read receive_fspos.Length bytes from the stream. Since receive_fspos.Length is 30, the number of bytes that will be read can be anywhere from 0 to 30.
If only 15 bytes have been received on the connection, it will give you 15 bytes. If the message was 20 bytes long, the message is now split across separate reads.
If the first message was 4 bytes and the second message is 12 bytes, you now have 2 messages and 14 blank bytes at the end. Even worse, those 14 "blank" bytes could actually be the beginning of a third message coming into the stream.
If the message is 50 bytes long, you will only receive part of it. You would then need to add the bytes that were read to a separate buffer, read from the stream again, and repeat until you have determined that you have read exactly the number of bytes needed for the entire message, then concatenate all of the read bytes back into a single byte[].
receive_fspos = new byte[30];
int bytesread = stream_1.Read(receive_fspos, 0, receive_fspos.Length);//this is where it gets combined
Instead of rolling your own loop, please use the BCL methods. It sounded like you are using strings, so this would be the preferred method. I would suggest the following:
using (NetworkStream networkStream = tcpClient.GetStream())
using (StreamReader streamReader = new StreamReader(networkStream))
using (StreamWriter streamWriter = new StreamWriter(networkStream))
{
    networkStream.ReadTimeout = timeout; //Set a timeout to stop the stream from reading indefinitely

    //To receive a string
    string incomingString = streamReader.ReadLine();

    //To send a string
    streamWriter.WriteLine(messageToSend);
    streamWriter.Flush();
}
Your answer clarified that you are trying to send a file. For this I would recommend sending a byte[]. Using this method you can send anything that can be serialized, including a file. Please note that the size of the file is limited, since it must be kept in memory; to handle a larger file you would want to save the data in chunks as it is being streamed in (see the sketch after the code below).
//Please note that if the file size is large enough, it may be preferred to use a stream instead of holding the entire file in memory.
byte[] fileAsBytes = File.ReadAllBytes(fileName);

using (NetworkStream networkStream = tcpClient.GetStream())
using (BinaryReader binaryReader = new BinaryReader(networkStream))
using (BinaryWriter binaryWriter = new BinaryWriter(networkStream))
{
    networkStream.ReadTimeout = timeout; //Set a timeout to stop the stream from reading indefinitely

    //To receive a byte array
    int incomingBytesLength = binaryReader.ReadInt32(); //The header is 4 bytes that lets us know how large the incoming byte[] is.
    byte[] incomingBytes = binaryReader.ReadBytes(incomingBytesLength);

    //To send a byte array
    binaryWriter.Write(fileAsBytes.Length); //Send a header of 4 bytes that lets the listener know how large the incoming byte[] is.
    binaryWriter.Write(fileAsBytes);
}
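If the file is too large to hold in memory, the same 4-byte length header can be used to stream the data straight to disk instead; a rough sketch (destinationPath is a hypothetical variable, and binaryReader is the reader from the block above):

int remaining = binaryReader.ReadInt32(); // 4-byte length header, as above
byte[] chunk = new byte[8192];
using (FileStream output = File.Create(destinationPath))
{
    while (remaining > 0)
    {
        // Read at most one chunk, never past the end of this message.
        int read = binaryReader.Read(chunk, 0, Math.Min(chunk.Length, remaining));
        if (read == 0)
            throw new EndOfStreamException("Connection closed before the full file arrived.");
        output.Write(chunk, 0, read);
        remaining -= read;
    }
}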
got it working, code > 30000 chars :\
it is a little messy but hey, it's functional.
server : https://www.dropbox.com/s/2wyccxpjbja10z3/Program.cs?m
client : https://www.dropbox.com/s/yp78nx4ubacsz6f/Program.cs?m
I'm trying to send an image via a network stream. I have a SendData and a GetData function,
and I always get an "invalid parameter" error when using the Image.FromStream function.
This is my code:
I am getting the picture from the screen, converting it to a byte[],
inserting it into a MemoryStream, and sending it via a NetworkStream.
private void SendData()
{
    StreamWriter swWriter = new StreamWriter(this._nsClient);
    // BinaryFormatter bfFormater = new BinaryFormatter();

    // Lock this method
    lock (this._secLocker)
    {
        while (this._bShareScreen)
        {
            // Check if you need to send the screen
            if (this._bShareScreen)
            {
                MemoryStream msStream = new MemoryStream();
                this._imgScreenSend = new Bitmap(this._imgScreenSend.Width, this._imgScreenSend.Height);

                // Send an image code
                swWriter.WriteLine(General.IMAGE);
                swWriter.Flush();

                // Copy image from screen
                this._grGraphics.CopyFromScreen(0, 0, 0, 0, this._sizScreenSize);
                this._imgScreenSend.Save(msStream, System.Drawing.Imaging.ImageFormat.Jpeg);
                msStream.Seek(0, SeekOrigin.Begin);

                // Create the package
                byte[] btPackage = msStream.ToArray();

                // Send its length
                swWriter.WriteLine(btPackage.Length.ToString());
                swWriter.Flush();

                // Send the package
                _nsClient.Write(btPackage, 0, btPackage.Length);
                _nsClient.Flush();
            }
        }
    }
}
private void ReciveData()
{
    StreamReader srReader = new StreamReader(this._nsClient);
    string strMsgCode = String.Empty;
    bool bContinue = true;
    //BinaryFormatter bfFormater = new BinaryFormatter();
    DataContractSerializer x = new DataContractSerializer(typeof(Image));

    // Lock this method
    lock (this._objLocker)
    {
        while (bContinue)
        {
            // Get the next msg
            strMsgCode = srReader.ReadLine();

            // Check code
            switch (strMsgCode)
            {
                case (General.IMAGE):
                {
                    // Read bytearray
                    int nSize = int.Parse(srReader.ReadLine().ToString());
                    byte[] btImageStream = new byte[nSize];
                    this._nsClient.Read(btImageStream, 0, nSize);

                    // Get the Stream
                    MemoryStream msImageStream = new MemoryStream(btImageStream, 0, btImageStream.Length);

                    // Set seek, so we read the image from the beginning of the stream
                    msImageStream.Position = 0;

                    // Build the image from the stream
                    this._imgScreenImg = Image.FromStream(msImageStream); // Error Here
Part of the problem is that you're using WriteLine() which adds Environment.NewLine at the end of the write. When you just call Read() on the other end, you're not dealing with that newline properly.
What you want to do is just Write() to the stream and then read it back on the other end.
The conversion to a string is strange.
What you're doing, when transferring an image, is sending an array of bytes. All you need to do is send the length of the expected stream and then the image itself, and then read the length and the byte array on the other side.
The most basic and naive way of transferring a byte array over the wire is to first send an integer that represents the length of the array, and read that length on the receiving end.
Once you now know how much data to send/receive, you then send the array as a raw array of bytes on the wire and read the length that you previously determined on the other side.
Now that you have the raw bytes and a size, you can reconstruct the array from your buffer into a valid image object (or whatever other binary format you've just sent).
Also, I'm not sure why that DataContractSerializer is there. It's raw binary data, and you're already manually serializing it to bytes anyway, so that thing isn't useful.
One of the fundamental problems of network programming using sockets and streams is defining your protocol, because the receiving end can't otherwise know what to expect or when the stream will end. That's why every common protocol out there either has a very strictly defined packet size and layout or else does something like sending length/data pairs, so that the receiving end knows what to do.
If you implement a very simple protocol such as sending an integer which represents array length and reading an integer on the receiving end, you've accomplished half the goal. Then, both sender and receiver are in agreement as to what happens next. Then, the sender sends exactly that number of bytes on the wire and the receiver reads exactly that number of bytes on the wire and considers the read to be finished. What you now have is an exact copy of the original byte array on the receiving side and you can then do with it as you please, since you know what that data was in the first place.
If you need a code example, I can provide a simple one or else there are numerous examples available on the net.
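For illustration, a rough sketch of that protocol applied to the code in the question (it reuses the _nsClient stream and msStream from above, drops the StreamWriter/StreamReader, and ignores the General.IMAGE message code for brevity; it is not a drop-in replacement):

// Sender: write a 4-byte length header, then the JPEG bytes.
byte[] btPackage = msStream.ToArray();
byte[] lengthHeader = BitConverter.GetBytes(btPackage.Length);
_nsClient.Write(lengthHeader, 0, lengthHeader.Length);
_nsClient.Write(btPackage, 0, btPackage.Length);

// Receiver: read the 4-byte length, then loop until that many bytes have arrived.
byte[] header = new byte[4];
int got = 0;
while (got < header.Length)
    got += _nsClient.Read(header, got, header.Length - got);
int nSize = BitConverter.ToInt32(header, 0);

byte[] btImageStream = new byte[nSize];
got = 0;
while (got < nSize)
    got += _nsClient.Read(btImageStream, got, nSize - got);

Image image = Image.FromStream(new MemoryStream(btImageStream));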
Trying to keep it short:
The Stream.Read function (which you use) returns an int that states how many bytes were read; this is returned to you so you can verify that all the bytes you need have been received.
Something like:
int byteCount = 0;
while (byteCount < nSize)
{
    int read = this._nsClient.Read(btImageStream, byteCount, nSize - byteCount);
    byteCount += read;
}
This is not the best code for the job.
I am sending JPEG-encoded images as a serialized complex object over a UDP socket. Since a UDP datagram supports a maximum length of about 52 KB to 54 KB, I am writing the arriving datagrams to a memory stream so that I can deserialize it all at once.
Receiver End Code:
while (AccumulatingBytes <= TotalSizeOfComplexObject) // Size of the complex object after serialization, which I get through TCP
{
    byte[] Recievedbytes = UdpListener.Receive(ref RemoteEndPoint); // I'm sending a fixed size of 204 bytes
    ImageStream = new MemoryStream();
    ImageStream.Position = (long)AccumulatingBytes;
    ImageStream.Write(Recievedbytes, 0, Recievedbytes.Length);
    AccumulatingBytes += 204;
}
When I deserialize this memory stream, an exception is thrown.
Some obvious observations that may help...
Why a new MemoryStream each time in the loop? This is the most immediate problem.
Why not AccumulatingBytes += Recievedbytes.Length;?
Also: if you aren't handling errors and missing data yourself, use TCP.
So something like:
ImageStream = new MemoryStream();
while (AccumulatingBytes <= TotalSizeOfComplexObject)
{
    byte[] Recievedbytes = UdpListener.Receive(ref RemoteEndPoint);
    ImageStream.Write(Recievedbytes, 0, Recievedbytes.Length);
    AccumulatingBytes += Recievedbytes.Length;
}
then set ImageStream.Position = 0 before deserializing. You should also probably check that UdpListener isn't reporting EOFs.
Are you writing the byte stream at offset 0 every time?
ImageStream.Write(Recievedbytes, 0, Recievedbytes.Length);
Maybe instead of zero you need the variable AccumulatingBytes there?