I'm currently working on a project that involves taking a picture with my laptop camera on a given command. I'm using my webcam as a security camera for now because I haven't had time to buy a real one yet, so my laptop will have to do.
The project is structured like this:
I have my Server (this runs on my laptop) and I have my Client (which runs on my PC and will later run on my phone - but that's not relevant). I send a command to the server using my client (in this case it's "Webcam"), the server receives it, takes a picture with the webcam, gets the bytes, and then sends the bytes over a network stream to the client.
However, when I download the stream with my client, it downloads 0 bytes. To clarify, it does save the image to my folder, but I can't open it because it's 0 bytes.
Server
while (true)
{
if (nwStream.DataAvailable)
{
//Create a byte array (a buffer). This will hold the bytes we receive from the client in memory.
byte[] buffer = new byte[client.ReceiveBufferSize];
//Now we read the bytes and store the number of bytes we read in an int, using nwStream.Read with the correct parameters.
//1. [Buffer] - the byte array we declared above [buffer] <- This is what the data is read into.
//2. [Offset] - Where in the buffer we want to start writing; since it's an array we start at 0.
//3. [Size] - The maximum number of bytes we want to read from the NetworkStream.
int bytesRead = nwStream.Read(buffer, 0, client.ReceiveBufferSize);
//Now we decode the message we received using Encoding.Default.GetString with the correct parameters.
//1. [Bytes] - The byte array we want to decode.
//2. [Index] - The first index of the array to decode, so it knows where to start; 0 since it's an array.
//3. [Count] - The number of bytes we want to decode; we stored that number in bytesRead above, so pass it here.
string dataRecieved = Encoding.Default.GetString(buffer, 0, bytesRead);
if (dataRecieved == "webcam")
{
Console.WriteLine("Starting the webcam feature..");
CameraFeature();
}
}
}
}
private static void CameraFeature()
{
VideoCaptureDevices = new FilterInfoCollection(FilterCategory.VideoInputDevice);
foreach (FilterInfo Device in VideoCaptureDevices)
{
Devices.Add(Device.Name);
Console.WriteLine("Device: " + Device.Name);
}
FinalVideo = new VideoCaptureDevice(VideoCaptureDevices[0].MonikerString);
FinalVideo.NewFrame += new NewFrameEventHandler(FinalVideo_NewFrame);
FinalVideo.Start();
}
private static void exitcamera()
{
FinalVideo.SignalToStop();
// FinalVideo.WaitForStop(); << commenting that one out solved it
FinalVideo.NewFrame -= new NewFrameEventHandler(FinalVideo_NewFrame); // as suggested
FinalVideo = null;
}
static void FinalVideo_NewFrame(object sender, NewFrameEventArgs eventArgs)
{
Bitmap video = (Bitmap)eventArgs.Frame.Clone();
//video.Save($"image{imgCount}.png");
Console.WriteLine("Picture taken!");
Connection(video);
exitcamera();
imgCount++;
}
public static void Connection(Bitmap tImage)
{
Console.WriteLine("Starting the transfer..");
byte[] bStream = ImageToByte(tImage);
try
{
nwStream.Write(bStream, 0, bStream.Length);
Console.WriteLine("Done..");
}
catch (SocketException e1)
{
Console.WriteLine("SocketException: " + e1);
}
}
static byte[] ImageToByte(Bitmap iImage)
{
MemoryStream mMemoryStream = new MemoryStream();
iImage.Save(mMemoryStream, System.Drawing.Imaging.ImageFormat.Png);
return mMemoryStream.ToArray();
}
}
Client
private static void SendCommand()
{
while (true)
{
Console.WriteLine("Please enter a command: ");
string userInput = Console.ReadLine();
//Convert our string message to a byte array because we will send it as a buffer later.
byte[] bytesToSend = Encoding.Default.GetBytes(userInput);
//Write out to the console what we are sending.
Console.WriteLine("Sending: " + userInput);
//Use the NetworkStream to send the byte array we just declared above, starting at offset zero; the size we send is the message's length.
nwStream.Write(bytesToSend, 0, bytesToSend.Length);
//RecieveBuffer();
//Receive the bytes coming from the other end (the server) and store them in an array.
byte[] bytesToRead = new byte[client.ReceiveBufferSize];
//byte[] bitmap = GetYourImage();
//Read the bytes, starting from offset 0, up to the client's receive buffer size.
int bytesRead = nwStream.Read(bytesToRead, 0, client.ReceiveBufferSize);
//Build a Bitmap from the bytes we just received and save it to disk.
Bitmap bmp;
using (var ms = new MemoryStream(bytesToRead))
{
bmp = new Bitmap(ms);
bmp.Save("Image.png");
}
//Console.WriteLine("Recieved: " + Encoding.ASCII.GetString(bytesToRead, 0, bytesRead));
}
}
Try MemoryStream.Seek(0, SeekOrigin.Begin) before sending the stream to the client. You may be downloading 0 bytes because the stream is already at its end when you read from it.
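For example, a minimal sketch of that idea on the server side, assuming nwStream is the connected NetworkStream and tImage is the Bitmap from the Connection method above:
//Save the image into a MemoryStream, rewind it, then copy it to the network.
//Without the Seek, the stream's position is at its end and 0 bytes get copied.
using (var ms = new MemoryStream())
{
    tImage.Save(ms, System.Drawing.Imaging.ImageFormat.Png);
    ms.Seek(0, SeekOrigin.Begin);
    ms.CopyTo(nwStream);
}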
I recently started to work with TCP in C#. I'm now at the point where I want the client to receive the data sent by the server.
I know that there is no guarantee for the client to receive all data at once. If the size of the data sent is bigger than the buffer's size at the client's side, then the data will be sent in parts. So my question is: how can I store all my received data in a byte array, and then convert it to the actual message when all is received?
I've set the buffer size to 1, so I can see what happens when all sent data doesn't fit in the buffer. Here are my methods in which I call stream.BeginRead() in Client.cs:
// Deliberately setting the buffer size to 1, to simulate what happens when the message doesn't fit in the buffer.
int bufferSize = 1;
byte[] receiveBuffer;
private void ConnectCallback(IAsyncResult result)
{
client.EndConnect(result);
Console.WriteLine("Connected to server.");
stream = client.GetStream();
// At this point, the client is connected, and we're expecting a message: "Welcome!"
receiveBuffer = new byte[bufferSize];
stream.BeginRead(receiveBuffer, 0, receiveBuffer.Length, new AsyncCallback(ReadCallback), stream);
}
private void ReadCallback(IAsyncResult result)
{
int bytesLength = stream.EndRead(result);
// Should be "Welcome!". But of course it's "W", because the bufferSize is 1.
string message = Encoding.UTF8.GetString(receiveBuffer, 0, bytesLength);
Console.WriteLine("Received message: {0}", receivedMessage);
// Reset the buffer and begin reading a new message, which will be "e".
// However, I want the whole message ("Welcome!") in one byte array.
receiveBuffer = new byte[bufferSize];
stream.BeginRead(receiveBuffer, 0, receiveBuffer.Length, ReadCallback, null);
}
This is the output when sending the message "Welcome!":
Connected to server.
Received message: W
Received message: e
Received message: l
Received message: c
Received message: o
Received message: m
Received message: e
Received message: !
Should I temporarily store the data until the whole message has arrived, and then convert that to a string?
Follow up question: What if 2 messages are sent closely after each other, for example Welcome! and then What's your name? How do I distinguish the two messages then?
Should I temporarily store the data until the whole message has arrived, and then convert that to a string?
Yes, exactly.
Follow up question: What if 2 messages are sent closely after each other, for example Welcome! and then What's your name? How do I distinguish the two messages then?
The general approach is to send the length of the message before the message itself. That way the receiving end will know when it has received a complete package.
As 500 - Internal Server Error already pointed out, you use a buffer for it. Here is a code example:
Receiving:
while (true) //you should use a bool variable here to stop this on disconnect.
{
byte[] bytes;
bytes = ReadNBytes(ns, 4);
//read out the length field we know is there, because the server always sends it.
int msgLength = BitConverter.ToInt32(bytes, 0);
bytes = ReadNBytes(ns, msgLength);
//working with the buffer...
if (bytes.Length > 0)
{
try
{
//do stuff here. bytes contains your complete message.
}
catch (Exception e) { Log(e.Message); }
}
}
public static byte[] ReadNBytes(NetworkStream stream, int n)
{
byte[] buffer = new byte[n];
try
{
int bytesRead = 0;
int chunk;
while (bytesRead < n)
{
chunk = stream.Read(buffer, bytesRead, buffer.Length - bytesRead);
if (chunk == 0)
{
// The other side disconnected before we received all n bytes - bail out.
Log("Unexpected disconnect");
stream.Close();
return buffer;
}
bytesRead += chunk;
}
}
catch (Exception e) { Log(e.Message); }
return buffer;
}
To send stuff use something like this:
public static void SendObject(NetworkStream ns, byte[] data)
{
byte[] lengthBuffer = BitConverter.GetBytes(data.Length);
ns.Write(lengthBuffer, 0, lengthBuffer.Length);
ns.Write(data, 0, data.Length);
}
I hope that helped!
I am trying to receive data of any length at my server through a TCP connection. First my client sends the length of the data to the server through stream.Write, then it sends the actual data.
At the server I receive the length and loop until all of the data has been received successfully.
The problem is: I receive a size of 0 on the server no matter what the length of the data is. I tried to figure out the issue but could not work out where the problem is. Any kind of help/hint would be appreciated.
Server Side Code:
byte[] lengthOfData = new byte[2048];
byte[] buffer;
try
{
stream = client.GetStream();
eventLog1.WriteEntry("Size of 1st = "+stream.Read(lengthOfData,0,lengthOfData.Length));
int numBytesToRead = ByteArraySerializer.BytesArrayToInt(lengthOfData);
eventLog1.WriteEntry("number of bytes to read= "+numBytesToRead);
buffer = new byte[numBytesToRead+10];
int numBytesRead = 0;
do
{
int n = stream.Read(buffer, numBytesRead, 10);
numBytesRead += n;
numBytesToRead -= n;
eventLog1.WriteEntry("number of bytes read= " + numBytesRead);
} while (numBytesToRead > 0);
}
catch (Exception e) // Called automatically when Client Disposes or disconnects unexpectedly
{
eventLog1.WriteEntry("Connection Closes: "+e.ToString());
lock (connectedClients)
{
connectedClients.Remove(client);
}
client.Close();
break;
}
Client Side Code
byte[] command = ByteArraySerializer.Serialize<Command>(cmd);
byte[] sizeOfData = ByteArraySerializer.IntToBytesArray(command.Length);
stream.Write(sizeOfData, 0, sizeOfData.Length);
Console.WriteLine("Size of Data = "+command.Length);
stream.Write(command, 0, command.Length);
Change the following line at server
byte[] lengthOfData = new byte[2048];
to
byte[] lengthOfData = new byte[sizeof(int)];
The issue was that the read at the server was supposed to read only the 4-byte integer, but because the buffer was 2048 bytes it was also reading data that the client wrote after the length. We should read only 4 bytes if we want to get the length of the data (an integer).
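A minimal sketch of the receiving side under that fix; this swaps in BitConverter for the poster's ByteArraySerializer (which isn't shown) and uses a hypothetical ReadExactly helper that loops until the array is full (System, System.IO and System.Net.Sockets are assumed to be in scope):
//Read exactly the 4-byte length header, then exactly that many payload bytes.
byte[] lengthOfData = new byte[sizeof(int)];
ReadExactly(stream, lengthOfData);
int numBytesToRead = BitConverter.ToInt32(lengthOfData, 0);
byte[] buffer = new byte[numBytesToRead];
ReadExactly(stream, buffer);

static void ReadExactly(NetworkStream stream, byte[] dst)
{
    int offset = 0;
    while (offset < dst.Length)
    {
        //Read can return fewer bytes than requested, so keep going until dst is full.
        int n = stream.Read(dst, offset, dst.Length - offset);
        if (n == 0) throw new IOException("Unexpected disconnect");
        offset += n;
    }
}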
Need help with sending and receiving compressed data over TCP socket.
The code works perfectly fine if I don't use compression, but something very strange happens when I do use compression. Basically, the problem is that the stream.Read() operation gets skipped and I don't know why.
My code:
using (var client = new TcpClient())
{
client.Connect("xxx.xxx.xx.xx", 6100);
using (var stream = client.GetStream())
{
// SEND REQUEST
byte[] bytesSent = Encoding.UTF8.GetBytes(xml);
// send compressed bytes (if this is used, then stream.Read() below doesn't work)
//var compressedBytes = bytesSent.ToStream().GZipCompress();
//stream.Write(compressedBytes, 0, compressedBytes.Length);
// send normal bytes (uncompressed)
stream.Write(bytesSent, 0, bytesSent.Length);
// GET RESPONSE
byte[] bytesReceived = new byte[client.ReceiveBufferSize];
// PROBLEM HERE: when using compression, this line just gets skipped over very quickly
stream.Read(bytesReceived, 0, client.ReceiveBufferSize);
//var decompressedBytes = bytesReceived.ToStream().GZipDecompress();
//string response = Encoding.UTF8.GetString(decompressedBytes);
string response = Encoding.UTF8.GetString(bytesReceived);
Console.WriteLine(response);
}
}
You will notice some extension methods above. Here is the code in case you are wondering if something is wrong there.
public static MemoryStream ToStream(this byte[] bytes)
{
return new MemoryStream(bytes);
}
public static byte[] GZipCompress(this Stream stream)
{
using (var memoryStream = new MemoryStream())
{
using (var gZipStream = new GZipStream(memoryStream, CompressionMode.Compress))
{
stream.CopyTo(gZipStream);
}
return memoryStream.ToArray();
}
}
public static byte[] GZipDecompress(this Stream stream)
{
using (var memoryStream = new MemoryStream())
{
using (var gZipStream = new GZipStream(stream, CompressionMode.Decompress))
{
gZipStream.CopyTo(memoryStream);
}
return memoryStream.ToArray();
}
}
The extensions work quite well in the following, so I'm sure they're not the problem:
string original = "the quick brown fox jumped over the lazy dog";
byte[] compressedBytes = Encoding.UTF8.GetBytes(original).ToStream().GZipCompress();
byte[] decompressedBytes = compressedBytes.ToStream().GZipDecompress();
string result = Encoding.UTF8.GetString(decompressedBytes);
Console.WriteLine(result);
Does anyone have any idea why the Read() operation is being skipped when the bytes being sent are compressed?
EDIT
I received a message from the API provider after showing them the above sample code. They had this to say:
at a first glance I guess the header is missing. The input must start
with a 'c' followed by the length of the input
(sprintf(cLength,"c%09d",hres) in our example). We need this because
we can't read until we find a binary 0 to recognize the end.
They previously provided some sample code in C, which I don't fully understand 100%, as follows:
example in C:
#include <zlib.h>
uLongf hres;
char cLength[COMPRESS_HEADER_LEN + 1] = {'\0'};
n = read(socket,buffer,10);
// check if input is compressed
if(msg[0]=='c') {
compressed = 1;
}
n = atoi(msg+1);
read.....
hres = 64000;
res = uncompress((Bytef *)msg, &hres, (const Bytef*)
buffer/*compressed*/, n);
if(res == Z_OK && hres > 0 ){
msg[hres]=0; //original
}
else // errorhandling
hres = 64000;
if (compressed){
res = compress((Bytef *)buffer, &hres, (const Bytef *)msg, strlen(msg));
if(res == Z_OK && hres > 0 ) {
sprintf(cLength,"c%09d",hres);
write(socket,cLength,10);
write(socket, buffer, hres);
}
else // errorhandling
makefile: add "-lz" to the libs
They're using zlib. I don't expect that to make any difference, but I did try using zlib.net and I still get no response.
Can someone give me an example of how exactly I'm supposed to send this input length in C#?
EDIT 2
In response to quantdev, here is what I am trying now for the length prefix:
using (var client = new TcpClient())
{
client.Connect("xxx.xxx.xx.xx", 6100);
using (var stream = client.GetStream())
{
// SEND REQUEST
byte[] bytes = Encoding.UTF8.GetBytes(xml);
byte[] compressedBytes = ZLibCompressor.Compress(bytes);
byte[] prefix = Encoding.UTF8.GetBytes("c" + compressedBytes.Length);
byte[] bytesToSend = new byte[prefix.Length + compressedBytes.Length];
Array.Copy(prefix, bytesToSend, prefix.Length);
Array.Copy(compressedBytes, 0, bytesToSend, prefix.Length, compressedBytes.Length);
stream.Write(bytesToSend, 0, bytesToSend.Length);
// WAIT
while (client.Available == 0)
{
Thread.Sleep(1000);
}
// GET RESPONSE
byte[] bytesReceived = new byte[client.ReceiveBufferSize];
stream.Read(bytesReceived, 0, client.ReceiveBufferSize);
byte[] decompressedBytes = ZLibCompressor.DeCompress(bytesReceived);
string response = Encoding.UTF8.GetString(decompressedBytes);
Console.WriteLine(response);
}
}
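One difference I notice compared to the provider's C example: sprintf(cLength,"c%09d",hres) zero-pads the length to nine digits, so the header is always ten bytes, while "c" + compressedBytes.Length does not. Assuming the server expects that fixed-width format, the prefix would be built something like this:
//Build the 10-byte header the C example describes: 'c' followed by a zero-padded 9-digit length.
byte[] prefix = Encoding.ASCII.GetBytes("c" + compressedBytes.Length.ToString("D9"));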
You need to check the return value of the Read() calls you are making on the TCP stream: it is the number of bytes actually read.
MSDN says :
Return Value
The total number of bytes read into the buffer. This can be less than the number of bytes requested if that many bytes are not
currently available, or zero (0) if the end of the stream has been
reached.
If the socket is closed, the call will immediately return 0 (which is what might be happening here).
If it is not 0, then you must check how many bytes you actually received; if it is less than client.ReceiveBufferSize, you will need additional calls to Read to retrieve the remaining bytes.
Prior to your call to Read, check that some data is actually available on the socket:
while(client.Available == 0)
// wait ...
http://msdn.microsoft.com/en-us/library/system.net.sockets.tcpclient.available%28v=vs.110%29.aspx
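Putting both suggestions together, a rough sketch (names are illustrative, and System.IO/System.Threading are assumed to be in scope):
//Wait until the server has sent something, then drain what is currently available,
//checking the return value of every Read call.
while (client.Available == 0)
    Thread.Sleep(50);

using (var ms = new MemoryStream())
{
    byte[] chunk = new byte[4096];
    while (client.Available > 0)
    {
        int n = stream.Read(chunk, 0, chunk.Length);
        if (n == 0) break; //the remote side closed the connection
        ms.Write(chunk, 0, n);
    }
    byte[] bytesReceived = ms.ToArray(); //only the bytes actually received
}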
I think you may have reached the end of the stream. Can you try setting the stream position before reading it?
stream.Position = 0;
http://msdn.microsoft.com/en-us/library/vstudio/system.io.stream.read
Encoding.UTF8.GetString shouldn't be used on an arbitrary byte array.
For example, the compressed bytes may contain a NULL character, which is not allowed in UTF-8 encoded text except as a terminator.
If you want to print the received bytes for debugging, maybe you should just print them as integers.
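For example, a quick way to dump them (as hex rather than decimal), assuming bytesRead holds the count returned by Read:
//Print the raw bytes as hex instead of decoding them as UTF-8 text.
Console.WriteLine(BitConverter.ToString(bytesReceived, 0, bytesRead));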
I am trying to create an RS232 application that reads incoming data and stores it in a buffer. I found the following code in an RS232 example, but I am not sure how to use it.
*RS232 Example port_DataReceived*
private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
if (!comport.IsOpen) return;
if (CurrentDataMode == DataMode.Text)
{
string data = comport.ReadExisting();
LogIncoming(LogMsgType.Incoming, data + "\n");
}
else
{
int bytes = comport.BytesToRead;
byte[] buffer = new byte[bytes];
comport.Read(buffer, 0, bytes);
LogIncoming(LogMsgType.Incoming, ByteArrayToHexString(buffer) + "\n");
}
}
I am trying to write another method that takes an incoming byte array and combines it with another array ... see below:
private void ReadStoreArray()
{
//Read response and store in buffer
int bytes = comport.BytesToRead;
byte[] respBuffer = new byte[bytes];
comport.Read(respBuffer, 0, bytes);
//I want to take what is in the buffer and combine it with another array
byte AddOn = {0x01, 0x02}
byte Combo = {AddOn[1], AddOn[2], respBuffer[0], ...};
}
I currently have both methods in my code as I am confused how to read and store the incoming bytes to the port. Can I use the "port_DataReceived" method in my "ReadStoreArray" method? Do I need to modify my "ReadStoreArray" method? Or should I just start over?
When you create your SerialPort:
SerialPort comport = new SerialPort("COM1");
comport.DataReceived += new SerialDataReceivedEventHandler(port_DataReceived);
private void port_DataReceived(object sender, SerialDataReceivedEventArgs e)
{
// Shortened and error checking removed for brevity...
if (!comport.IsOpen) return;
int bytes = comport.BytesToRead;
byte[] buffer = new byte[bytes];
comport.Read(buffer, 0, bytes);
HandleSerialData(buffer);
}
//private void ReadStoreArray(byte[] respBuffer)
private void HandleSerialData(byte[] respBuffer)
{
//I want to take what is in the buffer and combine it with another array
byte[] AddOn = {0x01, 0x02};
byte[] Combo = {AddOn[0], AddOn[1], respBuffer[0], ...};
}
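If the response length isn't known up front, a hedged sketch of building the combined array without listing every element, using Buffer.BlockCopy (names are illustrative):
//Prepend the fixed AddOn bytes to whatever came in from the serial port.
byte[] addOn = { 0x01, 0x02 };
byte[] combo = new byte[addOn.Length + respBuffer.Length];
Buffer.BlockCopy(addOn, 0, combo, 0, addOn.Length);
Buffer.BlockCopy(respBuffer, 0, combo, addOn.Length, respBuffer.Length);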
Don't bother with the DataReceived handler, it's horribly inaccurate; you're better off starting a thread which constantly reads the serial port and grabs every byte that comes across it.
You can't read the same data from the port twice. You'll need to read it once into a buffer, then either share the buffer (pass as a function parameter) or clone it to give each function its own copy.
I am sending a jpg via TCP.
On the sender side we have:
private void sendResponse() {
BitmapImage bmi = new BitmapImage(new Uri(@"C:\a.jpg", UriKind.Absolute));
byte[] data;
JpegBitmapEncoder encoder = new JpegBitmapEncoder();
encoder.Frames.Add(BitmapFrame.Create(bmi));
using (MemoryStream ms = new MemoryStream()) {
encoder.Save(ms);
data = ms.ToArray();
}
clientStream.Write(data, 0, data.Length);
}
but how should I set the buffer size on the other side (the receiver side)?
I have the buffer set to 4096 bytes, which is obviously wrong:
public string Receive() {
string response = "Operation Timeout";
if (connection != null) {
connectArgs.RemoteEndPoint = connection.RemoteEndPoint;
connectArgs.SetBuffer(new Byte[4096], 0, 4096);
connectArgs.Completed += new EventHandler<SocketAsyncEventArgs>(delegate(object s, SocketAsyncEventArgs e) {
if (e.SocketError == SocketError.Success) {
BITMAP DECODING
} else {
response = e.SocketError.ToString();
}
_clientDone.Set();
});
_clientDone.Reset();
connection.ReceiveAsync(connectArgs);
_clientDone.WaitOne(3000);
} else {
response = "Socket is not initialized";
}
return response;
}
If you know the exact size of the data, set the buffer to that size. If not, set it to something reasonable and read from the stream until it's empty.
See this question as an example.
In addition to Sebastian Negraszus's answer: you can add your own data formatting (an application-level protocol).
For example, you can first send the size of the data you're trying to transmit. Say it is 4 bytes (the size of the int type). The other end then knows it has to read 4 bytes from the network first to learn how much data to expect from the stream.
Once the data size is known at the other end, memory allocation should not be a problem.
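A minimal sketch of that framing, assuming the 4-byte length is written with BitConverter.GetBytes on the sender and ReadAll is a hypothetical helper that loops on Read until the array is full (like the ReadNBytes helper shown earlier on this page):
//Sender: write a 4-byte length header, then the JPEG bytes.
byte[] header = BitConverter.GetBytes(data.Length);
clientStream.Write(header, 0, header.Length);
clientStream.Write(data, 0, data.Length);

//Receiver: read the 4-byte header first, then exactly that many image bytes.
byte[] lengthBytes = new byte[sizeof(int)];
ReadAll(stream, lengthBytes);
byte[] imageBytes = new byte[BitConverter.ToInt32(lengthBytes, 0)];
ReadAll(stream, imageBytes);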