C# TCP is losing packets?

I wrote C# chat software that uses a new (at least to me) system that I call the request system. I don't know if it has been created before, but for now I think of it as my creation :P
Anyhow, this system works like this:
soc receives a signal
it checks the signal
if the data it just received is the number 2, the client knows that the server is about to send a chat message; if the number is 3, the client knows that the server is about to send the member list, and so on.
The problem is this: when I step through the code in VS2012 it works fine and the chat behaves properly. When I run it in debug mode or just run it on my desktop, data seems to go missing, even though the code looks like it should be working just fine...
Example of the code for sending and receiving messages on the client:
public void RecieveSystem()
{
    while (true)
    {
        byte[] req = new byte[1];
        soc.Receive(req);
        int requestID = int.Parse(Encoding.UTF8.GetString(req));
        if (requestID == 3)
        {
            byte[] textSize = new byte[5];
            soc.Receive(textSize);
            byte[] text = new byte[int.Parse(Encoding.UTF8.GetString(textSize))];
            soc.Receive(text);
            Dispatcher.Invoke(() => { ChatBox.Text += Encoding.UTF8.GetString(text) + "\r\n"; });
        }
    }
}

public void OutSystem(string inputText)
{
    byte[] req = Encoding.UTF8.GetBytes("3");
    soc.Send(req);
    byte[] textSize = Encoding.UTF8.GetBytes(Encoding.UTF8.GetByteCount(inputText).ToString());
    soc.Send(textSize);
    byte[] text = Encoding.UTF8.GetBytes(inputText);
    soc.Send(text);
    Thread.CurrentThread.Abort();
}
and on the server:
public void UpdateChat(string text)
{
    byte[] req = Encoding.UTF8.GetBytes("3");
    foreach (User user in onlineUsers)
        user.UserSocket.Send(req);
    byte[] textSize = Encoding.UTF8.GetBytes(Encoding.UTF8.GetByteCount(text).ToString());
    foreach (User user in onlineUsers)
        user.UserSocket.Send(textSize);
    byte[] data = Encoding.UTF8.GetBytes(text);
    foreach (User user in onlineUsers)
        user.UserSocket.Send(data);
}

public void RequestSystem(Socket soc)
{
    ~~~
            }
            else if (request == 3)
            {
                byte[] dataSize = new byte[5];
                soc.Receive(dataSize);
                byte[] data = new byte[int.Parse(Encoding.UTF8.GetString(dataSize))];
                soc.Receive(data);
                UpdateChat(Encoding.UTF8.GetString(data));
            }
        }
        catch
        {
            if (!soc.Connected)
            {
                Dispatcher.Invoke(() => { OnlineMembers.Items.Remove(decodedName + " - " + soc.RemoteEndPoint); Status.Text += soc.RemoteEndPoint + " Has disconnected"; });
                onlineUsers.Remove(user);
                Thread.CurrentThread.Abort();
            }
        }
    }
}
What could be the problem?

You're assuming that you'll have one packet for each Send call. That's not stream-oriented - that's packet-oriented. You're sending multiple pieces of data which I suspect are coalesced into a single packet, and then you'll get them all in a single Receive call. (Even if there are multiple packets involved, a single Receive call could still receive all the data.)
If you're using TCP/IP, you should be thinking in a more stream-oriented fashion. I'd also encourage you to change the design of your protocol, which is odd to say the least. It's fine to use a length prefix before each message, but why would you want to encode it as text when you've got a perfectly good binary connection between the two computers?
I suggest you look at BinaryReader and BinaryWriter: use TcpClient and TcpListener rather than Socket (or at least use NetworkStream), and use the reader/writer pair to make it easier to read and write pieces of data (either payloads or primitives such as the length of messages). (BinaryWriter.Write(string) even performs the length-prefixing for you, which makes things a lot easier.)
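For illustration, here is a minimal sketch of that approach, assuming an already-connected TcpClient and a simple "request ID byte followed by one length-prefixed string" protocol; the method names are invented for the example:

using System.IO;
using System.Net.Sockets;
using System.Text;

// Sketch: one binary request-ID byte followed by a length-prefixed UTF-8 string.
// BinaryWriter.Write(string) writes the length prefix for you, and
// BinaryReader.ReadString() reads the whole string back, looping internally
// until all of its bytes have arrived.
public static void SendChatMessage(TcpClient client, string message)
{
    var writer = new BinaryWriter(client.GetStream());
    writer.Write((byte)3);   // request ID as a single binary byte, not text
    writer.Write(message);   // length-prefixed string payload
    writer.Flush();
}

public static string ReceiveChatMessage(TcpClient client)
{
    var reader = new BinaryReader(client.GetStream());
    byte requestId = reader.ReadByte();   // blocks until the byte is available
    return reader.ReadString();           // blocks until the full string is available
}

In real code you would create one reader/writer pair per connection and keep them for its lifetime; the point here is only that the framing is handled for you instead of relying on one Receive per Send.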

Related

Unable to create same TCP message with C# and C++

I want to create the same message and send it with C# as I do with C++, where everything works. Note that I have a C# client where I have trouble, a C++ client where everything works fine, and a C++ server that should read messages from both the C# and C++ clients.
Here is how I send the message from C++:
void ConnectAuthserverCommand::SendLogin(tcp::socket &s, const flatbuffers::FlatBufferBuilder &builder) const {
    ClientOpcode opc = CLIENT_LOGIN_REQUEST;
    flatbuffers::FlatBufferBuilder builder2;
    auto email = builder2.CreateString("test@abv.bg");
    auto password = builder2.CreateString("test");
    auto loginRequest = Vibranium::CreateLoginRequest(builder2, email, password);
    builder2.FinishSizePrefixed(loginRequest);
    size_t size2 = builder2.GetSize();
    uint8_t *buf2 = builder2.GetBufferPointer();
    uint8_t *actualBuffer2 = new uint8_t[size2 + 2];
    actualBuffer2[1] = (opc >> 8);
    actualBuffer2[0] = (opc & 0xFF);
    memcpy(actualBuffer2 + 2, buf2, size2);
    boost::asio::write(s, boost::asio::buffer(actualBuffer2, size2 + 2));
}
ClientOpcode is as follows:
enum ClientOpcode : uint16_t {
    CLIENT_AUTH_CONNECTION = 0x001,
    CLIENT_LOGIN_REQUEST   = 0x002,
    CLIENT_NUM_MSG_TYPES   = 0x003,
};
What I do is the following: I take a ClientOpcode, which I want to put in front of the FlatBuffers message. So I create an array of uint8_t which I extend by exactly 2 bytes (because the size of uint16_t is 2 bytes). Then on the server I read the first 2 bytes in order to get the header, and here is how I do that:
void Vibranium::Client::read_header() {
    auto self(shared_from_this());
    _packet.header_buffer.resize(_packet.header_size);
    boost::asio::async_read(socket,
        boost::asio::buffer(_packet.header_buffer.data(), _packet.header_size),
        [this, self](boost::system::error_code ec, std::size_t bytes_transferred)
        {
            if ((boost::asio::error::eof == ec) || (boost::asio::error::connection_reset == ec))
            {
                Disconnect();
            }
            else
            {
                assert(_packet.header_buffer.size() >= sizeof(_packet.headerCode));
                std::memcpy(&_packet.headerCode, &_packet.header_buffer[0], sizeof(_packet.headerCode));
                if (_packet.headerCode)
                    read_size();
                else
                    Logger::Log("UNKNOWN HEADER CODE", Logger::FatalError);
            }
        });
}
So far so good; however, I am not able to send the same, correctly formatted message from the C# client. Note that I send exactly the same data; take a look:
Client authClient = GameObject.Find("Client").GetComponent<AuthClient>().client; // This is how I get the Client class instance.
ClientOpcode clientOpcode = ClientOpcode.CLIENT_LOGIN_REQUEST;
var builder = new FlatBuffers.FlatBufferBuilder(1);
var email = builder.CreateString("test@abv.bg");
var password = builder.CreateString("test");
var loginRequest = LoginRequest.CreateLoginRequest(builder, email, password);
builder.FinishSizePrefixed(loginRequest.Value);
authClient.Send(builder, clientOpcode);
And here is how I actually prepend the header and send the data in C#:
public static Byte[] PrependClientOpcode(FlatBufferBuilder byteBuffer, ClientOpcode code)
{
    var originalArray = byteBuffer.SizedByteArray();
    byte[] buffer = new byte[originalArray.Length + 2];
    buffer[1] = (byte)((ushort)code / 0x0100);
    buffer[0] = (byte)code;
    Array.Copy(originalArray, 0, buffer, 2, originalArray.Length);
    return buffer;
}

public void Send(FlatBufferBuilder builder, ClientOpcode opcode)
{
    byte[] buffer = builder.SizedByteArray();
    var bufferToSend = PrependClientOpcode(builder, opcode);
    if (bufferToSend.Length > MaxMessageSize)
    {
        Logger.LogError("Client.Send: message too big: " + bufferToSend.Length + ". Limit: " + MaxMessageSize);
        return;
    }
    if (Connected)
    {
        // respect max message size to avoid allocation attacks.
        if (bufferToSend.Length <= MaxMessageSize)
        {
            // add to send queue and return immediately.
            // calling Send here would be blocking (sometimes for long times
            // if other side lags or wire was disconnected)
            sendQueue.Enqueue(bufferToSend);
            sendPending.Set(); // interrupt SendThread WaitOne()
        }
    }
    else
    {
        Logger.LogWarning("Client.Send: not connected!");
    }
}
ClientOpcode enum on C# is as follows:
public enum ClientOpcode : ushort
{
    CLIENT_AUTH_CONNECTION = 0x001,
    CLIENT_LOGIN_REQUEST   = 0x002,
    CLIENT_NUM_MSG_TYPES   = 0x003,
}
I think I can use ushort as a replacement of uint16_t in C#. That is why ClientOpcode is ushort.
When I send the message I get an error on the server saying UNKNOWN HEADER CODE. If you take a look at the C++ server code that reads the header, you'll see that this message is displayed when the server is unable to read the header code. So somehow I am unable to place the ClientOpcode header correctly in front of the TCP message sent from the C# client.
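As a quick sanity check (not from the original post), the intended 2-byte little-endian opcode header can be reproduced on its own, with no FlatBuffers involved; the helper name below is only for illustration:

// Hypothetical helper: builds the 2-byte, little-endian opcode header that both
// the C++ and C# clients above are trying to produce.
public static byte[] OpcodeHeader(ushort opcode)
{
    byte[] header = new byte[2];
    header[0] = (byte)(opcode & 0xFF);   // low byte first
    header[1] = (byte)(opcode >> 8);     // high byte second
    return header;
}

OpcodeHeader((ushort)ClientOpcode.CLIENT_LOGIN_REQUEST) yields { 0x02, 0x00 }, which matches the first two bytes of the working C++ dump below, so the opcode encoding itself is the same on both sides.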
In order to find out what the differences are, I installed Wireshark on the host to track both messages. Here they are:
This one is from the correctly working C++ client:
And this one is the dump from the C# client:
As you can see in the second TCP dump, the length is bigger: the C++ message has a length of 58, while the C# message's length is 62. Why?
The C++ client is sending data:
0200340000000c00000008000c00040008000800000014000000040000000400000074657374000000000b00000074657374406162762e626700
while the C# client is sending:
0000003a0200340000000c00000008000c00040008000800000014000000040000000400000074657374000000000b00000074657374406162762e626700
The C# client is adding 0000003a to the front of its message. If I remove that, the messages should be the same and everything should work.
Why is my C# client adding this extra data in front, and how can I fix it?

C# - Using static variables in different projects

I have a solution with two projects which act as Server and Client respectively. The Client is a simple console application which sends data to the server. The server is a WPF application which receives the data and displays it in a datagrid. The MVVM approach is used here.
In the Server UI there are three textboxes into which the user can type:
IP Address: ("127.0.0.1")
Port: (some port)
Delimiter: (some char like '#', for example)
The challenge for me in this one is that whatever delimiter the user provides should be used in the client project, to be put between the pieces of data to be sent. For example, the client sends:
Name + Delimiter + Surname + Delimiter + Age
What I have tried:
I added a Utils class with static fields for IP address, port and delimiter like this:
public class Utils
{
    public static string IP_ADDRESS = " ";
    public static int PORT = 0;
    public static char DELIMETER = '\0';
}
I then tried to change these values in my ViewModel, where the respective properties bound to the UI live, by assigning them:
private void storeData()
{
    Utils.IP_ADDRESS = IP;
    Utils.PORT = Port;
    Utils.DELIMETER = Delimeter;
}
In the client program:
static void Main(string[] args)
{
    Client client = new Client(Utils.IP_ADDRESS, Utils.PORT);
    while (true)
    {
        client.SendData("some Name" + Utils.DELIMETER + "some Surname" + Utils.DELIMETER + "some Age" + Utils.DELIMETER + "something else");
        Thread.Sleep(3000);
    }
}
The problem here is that whenever I start a new Client instance, the values in the Utils class are still the default ones (null).
Any help is appreciated.
Let's break down your problem:
The server can change IP or ports at will and the clients will somehow guess the new port and connect.
The server changes the delimiter at will and the clients adapt to the new delimiter.
Problem 1 is impossible. Information cannot magically get transferred to clients before the client connects to the server, and the client needs the IP and port to connect to the server. Whatever technique you use to transfer the IP and port to the client is a better communication channel than your client/server, so you don't need a client/server.
Problem 2 has been solved by WCF already. Use WCF and SOAP or REST (which is just HTTP).
Here is a sample of what the code would look like for the clients to determine the delimiter before sending the main request:
class Server
{
    private TcpListener _listener = new TcpListener(12312);

    public void Start()
    {
        _listener.Start();
        while (true)
        {
            var client = _listener.AcceptTcpClient();
            var stream = client.GetStream();
            var request = getRequest(stream);
            if (request == "GetDelimiter")
            {
                SendResponse(Utils.DELIMITER, stream);
            }
            else
            {
                ProcessNameSurnameAge(request);
            }
        }
    }
}

class Client
{
    private TcpClient _client = new TcpClient();

    public void DoTheThing()
    {
        _client.Connect("127.0.0.1", 12312);
        var stream = _client.GetStream();
        SendRequest("GetDelimiter", stream);
        var delimiter = GetResponse(stream);
        var newRequest = "some Name" + delimiter + "some Surname" + delimiter + "some Age" + delimiter + "something else";
        SendRequest(newRequest, stream);
    }
}
Note that I skip over the encoding details of sending data over TCP because it seems like you've already got a handle on that.
I was able to solve this in a rather simple manner. The steps I used are as follows (a rough sketch of the idea follows the list):
In the server:
Created a text file in my solution.
When the server starts, in my view model I saved the properties IP, port and delimiter in a string array.
Next I used the System.IO File class to write the contents of the array to the text file.
In the client:
First I read from the file.
Next I created the client instance and passed the IP and port as parameters to its constructor.
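For illustration only, here is a minimal sketch of that file-based hand-off; the file name settings.txt, the line order and the method names are assumptions rather than the poster's actual code:

using System;
using System.IO;

static class SharedSettings
{
    // Assumed file name and line order, shared by both projects.
    private const string FileName = "settings.txt";

    // Server side (view model): persist the user-entered settings.
    public static void Store(string ip, int port, char delimiter)
    {
        File.WriteAllLines(FileName, new[] { ip, port.ToString(), delimiter.ToString() });
    }

    // Client side: read the settings back before constructing the Client.
    public static void Load(out string ip, out int port, out char delimiter)
    {
        string[] lines = File.ReadAllLines(FileName);
        ip = lines[0];
        port = int.Parse(lines[1]);
        delimiter = lines[2][0];
    }
}

Note that this only works while both projects run on the same machine and agree on the file location; it side-steps the shared-static-state problem described above rather than solving it over the network.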
Thank you D Stanley and Damian Galletini for your suggestions. Also thank you everybody else who tried to help.

Socket receives the correct bytes, bytes translate to empty string

I've been breaking my head over a bug in this system I've been building. Basically, I use sockets to communicate between two C# applications. Or rather a Unity C# script server and a C# client application.
With manual tests, the system works perfectly fine, no anomalies whatsoever.
In order to test performance and multi-user functionality, I wrote up a tester class which launches multiple threads (clients) and has each fire X messages at the server. Here's where my problem occurs... sometimes.
When a Socket sends or receives, it returns an integer containing the number of bytes that were sent/received. When the problem occurs, I can see that the correct number of bytes arrived at the server. However, after converting the bytes to a string, I'm suddenly left with an empty string instead of the message I'd normally see.
I'm at a loss as to what's causing this problem. I'm using Encoding.Default.GetString() to translate the bytes into a string.
Any help is appreciated!
David
public void ReceiveFromClient (Socket handlerSocket)
{
    serverBuffer = new byte[iBufferSize]; // iBufferSize = 8192;
    int i = handlerSocket.Receive (serverBuffer);
    Debug.Log ("Bytes received: " + i);
    string message = Encoding.UTF8.GetString (serverBuffer, 0, i);
    Debug.Log ("Message received: " + message);
    // Do stuff with the message
}
bool SendMessageToUnity(string input)
{
    // returns a bool saying whether the message was sent or not
    if (clientSocket != null)
    {
        if (clientSocket.Connected)
        {
            byte[] bytes = Encoding.UTF8.GetBytes(input + "|");
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending message: " + Encoding.UTF8.GetString(bytes) + Environment.NewLine)));
            int i = clientSocket.Send(bytes);
            txtOutput.BeginInvoke(new Action(() => txtOutput.AppendText("Sending " + i + " bytes. " + Environment.NewLine)));
            return true;
        }
    }
    return false;
}
Look for a zero value ('\0') in your array of bytes before converting it to a string.
private string GetString(byte[] data)
{
    data = data.Where(b => b != 0).ToArray();
    return Encoding.UTF8.GetString(data);
}
If you get the byte array correctly, then the problem is in the encoding.
Check the sending encoding; it is usually UTF8, but you have to verify it.
Then use var inputStr = Encoding.UTF8.GetString(InputByteArray);

TCP Client - loop in communication

I am writing a program whose goal is to communicate with a weighing terminal using a TCP client.
I'm sending specific messages (e.g. checking the status) and, depending on the replies, performing further processing.
First, some code.
Connection:
public static void PolaczZWaga(string IP, int port)
{
    IP = IP.Replace(" ", "");
    KlientTCP = new TcpClient();
    KlientTCP.Connect(IPAddress.Parse(IP), port);
}
Sending a message (e.g. checking the status):
public static string OdczytDanychZWagi(byte[] WysylaneZapytanie)
{
    // Send the message to the connected TCP server
    byte[] GotoweZapytanie = KomunikatyWspolne.PoczatekKomunikacji.Concat(WysylaneZapytanie).Concat(KomunikatyWspolne.KoniecKumunikacji).ToArray();
    NetworkStream stream = KlientTCP.GetStream();
    stream.Write(GotoweZapytanie, 0, GotoweZapytanie.Length);

    // Receive the response
    // Buffer for the response
    byte[] odpowiedz = new Byte[256];
    // String to store the response as ASCII
    String responseData = String.Empty;

    // Read the data from the server
    Int32 bytes = stream.Read(odpowiedz, 0, odpowiedz.Length);
    responseData = System.Text.Encoding.ASCII.GetString(odpowiedz, 0, bytes);
    return responseData;
}
After Form1 opens, I make a connection and check the status:
string odp = KomunikacjaSieciowa.OdczytDanychZWagi(OdczytZWagi.Kom_RejestrStatusu);
char status = odp[0];
switch (status)
{
    case 'B':
        KomunikacjaSieciowa.WysylkaDoWyswietlaczaWagi_4linie(WysylkaDoWyswietlacza_Komunikaty.LogWitaj, WysylkaDoWyswietlacza_Komunikaty.LogZaloguj, WysylkaDoWyswietlacza_Komunikaty.PustaLinia, WysylkaDoWyswietlacza_Komunikaty.LogNrOperatora);
        string NrOperatora = KomunikacjaSieciowa.OdczytDanychZWagi(OdczytZWagi.Kom_ZatwierdzoneF1);
        //int NrOperatora_int = Convert.ToInt32(NrOperatora);
        break;
    // here goes the next case, etc.
Here my problem starts: communication takes place only once, but the operation requires data to be entered on the terminal. Before the operator enters the data, the program ends.
How can I change the code (a loop / a timer) so that the communication is repeated until a certain status is reached?
More specifically, as in this passage:
case 'B':
    KomunikacjaSieciowa.WysylkaDoWyswietlaczaWagi_4linie(WysylkaDoWyswietlacza_Komunikaty.LogWitaj, WysylkaDoWyswietlacza_Komunikaty.LogZaloguj, WysylkaDoWyswietlacza_Komunikaty.PustaLinia, WysylkaDoWyswietlacza_Komunikaty.LogNrOperatora);
    string NrOperatora = KomunikacjaSieciowa.OdczytDanychZWagi(OdczytZWagi.Kom_ZatwierdzoneF1);
repeat "string NrOperatora" depending on the returned data?
Where's the best place to make loop?? Maybe I should use thread??
I think using stream.BeginRead and checking the status when each read completes is the best way: if the status is not OK, you can call stream.BeginRead again from the same callback, so it keeps looping back into itself until the status is OK.
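Here is a minimal sketch of that idea, assuming the same NetworkStream obtained from KlientTCP.GetStream() and ASCII replies as in the code above; the class, the method names and the 'B' status check are only illustrative:

using System;
using System.Net.Sockets;
using System.Text;

// Sketch of a self-rescheduling BeginRead loop: each completed read checks the
// status and, if it is not the one we want, starts another read.
static class StatusPoller
{
    private static readonly byte[] odpowiedz = new byte[256];

    public static void StartReading(NetworkStream stream)
    {
        stream.BeginRead(odpowiedz, 0, odpowiedz.Length, OnDataReceived, stream);
    }

    private static void OnDataReceived(IAsyncResult ar)
    {
        var stream = (NetworkStream)ar.AsyncState;
        int bytes = stream.EndRead(ar);
        string responseData = Encoding.ASCII.GetString(odpowiedz, 0, bytes);

        if (bytes > 0 && responseData[0] == 'B')
        {
            // Desired status reached: continue with the next step here,
            // e.g. read the operator number.
        }
        else
        {
            // Not there yet: schedule another read, which keeps the loop
            // going without blocking the UI thread.
            StartReading(stream);
        }
    }
}

Because the callback runs on a thread-pool thread, any UI work done once the desired status arrives still has to be marshalled back to the UI thread (for example with Invoke).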

Incomplete data received across network from C# to Arduino

I am trying to send a word over to an Arduino running as a server, from a WPF C# application. Every now and again the complete word is not sent.
C# Code
public void send(String message)
{
    TcpClient tcpclnt = new TcpClient();
    ConState.Content = "Connecting.....";
    try
    {
        tcpclnt.Connect("192.168.0.177", 23);
        ConState.Content = "Connected";
        String str = message;
        Stream stm = tcpclnt.GetStream();
        ASCIIEncoding asen = new ASCIIEncoding();
        byte[] ba = asen.GetBytes(str);
        stm.Write(ba, 0, ba.Length);
        tcpclnt.Close();
    }
    catch (Exception)
    {
        ConState.Content = "Not Connected";
        return;
    }
}
How it is sent to the method:
String mes = "back;";
send(mes);
Arduino code:
if (client.available() > 0) {
    // Read the bytes incoming from the client:
    char thisChar = client.read();
    if (thisChar == ';')
    {
        // End the line
        Serial.println("");
    }
    else {
        // Print it, because it's not the terminator
        Serial.write(thisChar);
    }
}
The Arduino is using the chat server example. I am sending "back;" and "forward;" across. The results on the serial monitor:
back
forwaback
forward
back
forwaforwar
The problem seems to be with this code:
if (client.available() > 0) {
    // read the bytes incoming from the client:
    char thisChar = client.read();
    ...
}
What it does is:
Check if we have received data from the client
Read a single byte from the client buffer
Exit, and go on to do other things
As the OP pointed out, this comes straight from the Arduino chat server example. In that example, this working correctly in loop() depends on the alreadyConnected flag being set right after a new connection is made: if it isn't, the buffer is flushed before any data is read. That's one possible landmine.
Nonetheless, there is no reason not to change the if block into a while loop in the OP's case; in other words, instead of
if (client.available() > 0) {
have
while (client.available() > 0) {
The only reason to have an if statement there is to make sure that you still regularly do other processing in loop() when you have clients that send a lot of data: if the reading of client data is done inside a while, that loop will not exit until there is no more data from the client. Since this doesn't seem to be an issue in the asked-about case, the if-to-while change makes sense.
