I'm trying to build a login feature using TcpClient. I have two forms: a client side and a server side.
The client side handles user input, while the server side connects to the database.
The problem is the reader result, which always combines both inputs into one long string, like this:
myusernamemypassword
Here's part of the sender of client-side:
byte[] byteUsername = Encoding.Unicode.GetBytes(username);
byte[] bytePassword = Encoding.Unicode.GetBytes(password);
NetworkStream stream = client.GetStream();
stream.Write(byteUsername, 0, byteUsername.Length);
stream.Write(bytePassword, 0, bytePassword.Length);
// if offset != 0, the code always throws ArgumentOutOfRangeException
And the reader in server-side:
return Encoding.Unicode.GetString(buffer, 0, buffer.Length);
After a long search I found a solution, but it can only handle two strings; the third and subsequent strings get combined with the second. I need to send at least four strings for another feature.
Here's the updated reader code:
List<string> list = new List<string>();
int totalRead = 0;
do
{
    // Read() may return fewer bytes than requested; its return value says how many it actually read
    int read = client.GetStream().Read(buffer, totalRead, buffer.Length - totalRead);
    totalRead += read;
    // note: this decodes everything received so far, not just the newly read chunk
    list.Add(Encoding.Unicode.GetString(buffer, 0, totalRead));
} while (client.GetStream().DataAvailable);
I don't quite understand this code. How does it know which bytes are part of the first string? The count passed to Read() is buffer.Length - totalRead, which on the first pass is buffer.Length - 0, so shouldn't it read the whole buffer?
Any solutions, guys?
Thanks in advance
You should prefix each string with its length (in bytes, not characters) as a 4-byte integer.
This way, the server will know how many bytes to read into each string.
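A minimal sketch of that length-prefix idea (the class and method names here are illustrative, not from the question; it works over any Stream, including NetworkStream):

```csharp
using System;
using System.IO;
using System.Text;

static class Framing
{
    // Write a 4-byte length prefix, then the UTF-16 payload.
    public static void WriteString(Stream stream, string value)
    {
        byte[] payload = Encoding.Unicode.GetBytes(value);
        byte[] prefix = BitConverter.GetBytes(payload.Length);
        stream.Write(prefix, 0, prefix.Length);
        stream.Write(payload, 0, payload.Length);
    }

    // Read exactly count bytes, looping because Read may return fewer.
    static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read <= 0) throw new EndOfStreamException();
            offset += read;
        }
        return buffer;
    }

    // Read one length-prefixed string: the prefix first, then exactly that many bytes.
    public static string ReadString(Stream stream)
    {
        int length = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
        byte[] payload = ReadExactly(stream, length);
        return Encoding.Unicode.GetString(payload, 0, payload.Length);
    }
}
```

With this, the client calls WriteString once per string and the server calls ReadString the same number of times, so four strings work exactly like two — "myusername" and "mypassword" can no longer run together.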
Related
I'm trying to send the Base64 string of a screenshot to the server via NetworkStream, and it appears I'm receiving the full string; the problem is that it's scrambled...
I assume this has something to do with it being fragmented and put back together? What would be the appropriate way to go about this...
Client Code
byte[] ImageBytes = Generics.Imaging.ImageToByte(Generics.Imaging.Get_ScreenShot_In_Bitmap());
string StreamData = "REMOTEDATA|***|" + Convert.ToBase64String(ImageBytes);
SW.WriteLine(StreamData);
SW.Flush();
Server Code
char[] ByteData = new char[350208];
SR.Read(ByteData, 0, 350208);
string Data = new string(ByteData);
File.WriteAllText("C:\\RecievedText", Data);
Also, the size of the sent message and the char array are exactly the same.
EDIT:
After messing around with it some more, I realized the text isn't scrambled; the proper text is trailing the previous stream. How can I ensure the stream is clear or gets the entire text?
It's likely that you're not reading all of the previous response. You have to read in a loop until you get no data, like this:
char[] ByteData = new char[350208];
int totalChars = 0;
int charsRead;
while ((charsRead = SR.Read(ByteData, totalChars, ByteData.Length - totalChars)) != 0)
{
    totalChars += charsRead;
}
string Data = new string(ByteData, 0, totalChars);
File.WriteAllText("C:\\RecievedText", Data);
The key here is that StreamReader.Read reads up to the maximum number of characters you told it to. If there aren't that many characters immediately available, it will read what's available and return those. The return value tells you how many it read. You have to keep reading until you get the number of characters you want, or until Read returns 0.
I'm sending a message over a socket.
On the client side I'm assembling the message using a StringBuilder:
StringBuilder sb = new StringBuilder(message);
sb.Insert(0, (char)11);
sb.Append((char)28);
sb.Append((char)13);
Sending it from client to server
byte[] data = Encoding.ASCII.GetBytes(sb.ToString());
NetworkStream stream = client.GetStream();
stream.Write(data, 0, data.Length);
Server Side
StringBuilder message = new StringBuilder(Encoding.ASCII.GetString(bytesReceived, 0, bytesReceived.Length));
I then want to check to see if my message is contained within the correct container but for some reason the last 2 characters equal 0 in the check instead of the correct 28 and 13.
if (((int)message[message.Length - 2] == 28) && ((int)message[message.Length - 1] == 13))
Thanks in advance for any help.
Here's the additional data that was asked for:
byte[] bytes = new byte[1024];
NetworkStream stream = tcpClient.GetStream();
stream.Read(bytes, 0, bytes.Length);
Stream.Read will read up to bytes.Length bytes; its return value tells you how many bytes it actually read.
If that is not enough, then you will need to call Stream.Read() again.
Also, bytes.Length will always return the length of the array, not the number of bytes read.
Looking at your sending code, you probably want to read as much as you can from the stream, append what was read to the string builder, then check to see if the last 2 characters are 28 and 13, and if they are then you have your complete message.
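That receive loop could look like the following sketch (the stream would be tcpClient.GetStream() in your setup; the method name is illustrative):

```csharp
using System;
using System.IO;
using System.Text;

static class TerminatedReader
{
    // Keep appending whatever Read returns until the accumulated text ends with
    // the (char)28, (char)13 terminator the sender appends.
    public static string ReadUntilTerminator(Stream stream)
    {
        StringBuilder message = new StringBuilder();
        byte[] bytes = new byte[1024];
        int read;
        while ((read = stream.Read(bytes, 0, bytes.Length)) > 0)
        {
            // decode only the bytes actually read, not the whole buffer
            message.Append(Encoding.ASCII.GetString(bytes, 0, read));
            int len = message.Length;
            if (len >= 2 && message[len - 2] == (char)28 && message[len - 1] == (char)13)
                break; // complete message received
        }
        return message.ToString();
    }
}
```

Because only `read` bytes are decoded per pass, the trailing zeros of the 1024-byte buffer never end up in the string, which is why the terminator check then succeeds.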
Was just wondering if anybody can look at the following code and explain to me why the length variable returns 0:
textToBeEncrypted = Encoding.ASCII.GetString(buffer);
txtEncryptedText = AESEncryption(textToBeEncrypted, key, true);
byte[] encText = Encoding.ASCII.GetBytes(txtEncryptedText);
NetworkStream stream = s.GetStream();
stream.Write(encText, 0, PACKET_SIZE);
s.ReceiveTimeout = Timeout;
int length = stream.Read(buffer, 0, PACKET_SIZE);
if (length == PACKET_SIZE)
{
    string decText = Encoding.ASCII.GetString(encText);
    txtDecryptedText = AESDecryption(decText, key, true);
    buffer = Encoding.ASCII.GetBytes(txtDecryptedText);
    retval = Decode();
}
After I've encoded everything using AES, I'm writing out 1366 bytes of data in encText (PACKET_SIZE is 1036). I get no complaints regarding the Send; the data is sent out happily. When it tries to read it back in, however, length is always 0, meaning I don't get to enter the decode statement bracket. Any ideas? (retval is a string, before anyone asks)
If the length is zero from this:
int length = stream.Read(buffer, 0, PACKET_SIZE);
it means the other machine has closed their outbound socket (your inbound socket), and no more data is ever going to be available.
You should also be very careful about this:
if (length == PACKET_SIZE)
{...}
There is absolutely no guarantee about what you read. What you should do here is buffer the data until you have an entire message (frame), and then process what you buffered. In particular, if the other end sends less than PACKET_SIZE bytes, your code is guaranteed to do nothing. Even if the other end sent exactly PACKET_SIZE bytes, it would be pretty unlikely to arrive in exactly a single chunk of PACKET_SIZE bytes.
For example:
int length;
MemoryStream ms = new MemoryStream();
while ((length = stream.Read(buffer, 0, buffer.Length)) > 0) {
    ms.Write(buffer, 0, length); // append what we just received
    // now: could check `ms` to see if we have a "frame" here...
}
// ...or you could just process the entire received data here
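If each message carries a length prefix, that "check `ms` for a frame" step could look like this sketch (the 4-byte little-endian prefix is an assumption for illustration, not something the answer specifies):

```csharp
using System;
using System.IO;

static class FrameCheck
{
    // Returns true once ms holds a complete frame: a 4-byte length prefix
    // followed by that many payload bytes.
    public static bool HaveFrame(MemoryStream ms)
    {
        if (ms.Length < 4) return false;               // prefix not complete yet
        byte[] buf = ms.GetBuffer();                   // ok for a plain new MemoryStream()
        int bodyLength = BitConverter.ToInt32(buf, 0); // little-endian length prefix
        return ms.Length >= 4 + bodyLength;            // whole body buffered?
    }
}
```

The receive loop would call this after each `ms.Write`, process the frame when it returns true, and carry any leftover bytes forward as the start of the next frame.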
Server side code:
byte[] size = new byte[4];
size = BitConverter.GetBytes(fileData.Length);
stream.Write(size, 0, 4);
Client side code:
byte[] size = new byte[4];
ReadWholeArray(s, size);
int fileSize = BitConverter.ToInt32(size, 0);
Definition of ReadWholeArray method:
public static void ReadWholeArray(Stream stream, byte[] data)
{
    int offset = 0;
    int remaining = data.Length;
    while (remaining > 0)
    {
        int read = stream.Read(data, offset, remaining);
        if (read <= 0)
            throw new EndOfStreamException(
                String.Format("End of stream reached with {0} bytes left to read", remaining));
        remaining -= read;
        offset += read;
    }
}
The program sends fileData.Length (2422 in this instance) from the server. On the client side, the received value is -772097985.
Why is the sent data not received unaltered? What is the problem?
Okay, simple diagnostics to start with: log the individual contents of the byte array at both ends, so you can see what's going on there.
That way you can see if it's the binary data which is getting corrupted in your communication protocol, or whether BitConverter is causing your problem. For example, you could have a big-endian BitConverter at one end, and a little-endian BitConverter at the other. That seems unlikely, but it's possible - particularly if one of your server or client is running Mono rather than .NET itself.
If that does turn out to be the problem, you might want to use my EndianBitConverter class from MiscUtil, which lets you specify the endianness.
If the problem is in the communications layer, you quite possibly want to install Wireshark to see what's happening at the network level. Are you sure you've read all the data you're meant to have read so far, for example? (If you've only read 15 bytes before this, and the size is written at offset 16, then obviously you'll get the "extra" byte first.)
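As a concrete check for that logging step: 2422 is 0x00000976, so a little-endian BitConverter should produce the bytes 118, 9, 0, 0 at both ends. Anything else on the client side means the reader is misaligned in the stream (or the endianness differs):

```csharp
using System;

// Log the prefix bytes; for 2422 a little-endian BitConverter gives 118, 9, 0, 0.
byte[] size = BitConverter.GetBytes(2422);
Console.WriteLine(string.Join(", ", size)); // 118, 9, 0, 0
```

Comparing this line's output on the server against the four bytes the client actually passes to BitConverter.ToInt32 pins down which layer is corrupting the value.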
This works fine:
private void button2_Click(object sender, RoutedEventArgs e)
{
    MemoryStream ms = new MemoryStream();
    byte[] original = BitConverter.GetBytes((int)2224); // 176, 8, 0, 0
    ms.Write(original, 0, original.Length);
    ms.Seek(0, SeekOrigin.Begin);
    byte[] data = new byte[4];
    int count = ms.Read(data, 0, 4); // count is 4, data is 176, 8, 0, 0
    int fileSize = BitConverter.ToInt32(data, 0); // is 2224
    return;
}
Can you use Wireshark or something to intercept the bytes? What kind of connection are you using? Could more data be being sent (i.e. telnet control characters at the start of the stream)? Can you debug each end and verify these values, or write the byte array contents to a log file? By the way, writing "byte[] size = new byte[4];" is wasteful because BitConverter.GetBytes() returns a new array anyway.
When I write two separate byte arrays to a network stream, I sometimes don't get the first byte array.
Why is that?
For example, this fails; the header is sometimes not received by Read() on the other side:
byte[] header = msg.getByteHeader();
byte[] data = msg.getByteData();
clientStream.Write(header, 0, header.Length);
clientStream.Write(data, 0, data.Length);
clientStream.Flush();
however this succeeds
NetworkStream clientStream = tcpClient.GetStream();
byte[] header = msg.getByteHeader();
byte[] data = msg.getByteData();
byte[] message = new byte[header.Length + data.Length];
int pos = 0;
Array.Copy(header, 0, message, pos, header.Length);
pos += header.Length;
Array.Copy(data, 0, message, pos, data.Length);
clientStream.Write(message, 0, message.Length);
This is how my Read() looks
try
{
    // blocks until a client sends a message
    bytesRead = clientStream.Read(message, 0, 4);
    //string stringData = Encoding.ASCII.GetString(message, 0, bytesRead);
    len = BitConverter.ToInt32(message, 0);
    //MessageBox.Show(len.ToString());
    bytesRead = clientStream.Read(message, 0, 5 + len);
}
I believe this is a timing issue. There's a lag between when you first open socket communication and when you can read the first data from the buffer; it's not instantaneous. You can query the DataAvailable property of the network stream before attempting to read. If there's no data available, sleep the thread for, say, 100 ms and then try reading again.
You could troubleshoot it by commenting out the second Write and seeing whether your server receives any data at all.
Your reading mechanism does look very, very fragile, and I'd agree with Simon Fox that it does not look correct. Why does the second read ask for len + 5 bytes? I would have thought it would be only len bytes, since the first read was for the 4 header bytes.
If I were you, I'd add a delimiter to the start of your header transmission. That will allow your receivers to scan for it to determine the start of a packet. With TCP you will often get fragmented transmissions, or multiple transmissions bundled into the same packet. Things will go wrong once you deploy to real networks such as the internet if you always rely on getting exactly the number of bytes you request.
Either that, or switch to UDP, where you can rely on having one transmission per packet.
Aren't you overwriting what you read in the first call to Read with the second call? The second argument to Read is the offset at which to start storing the data read; both calls use 0, so the second overwrites the first...
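Putting these answers together, a fixed receive path would read the 4-byte header into its own buffer and then loop until exactly len payload bytes arrive (a sketch; the helper names are illustrative, not from the question):

```csharp
using System;
using System.IO;

static class MessageReader
{
    // Loop until exactly count bytes have arrived; a single Read may return fewer.
    static byte[] ReadExactly(Stream stream, int count)
    {
        byte[] buffer = new byte[count];
        int offset = 0;
        while (offset < count)
        {
            int read = stream.Read(buffer, offset, count - offset);
            if (read <= 0) throw new EndOfStreamException("connection closed mid-message");
            offset += read;
        }
        return buffer;
    }

    // Header first into its own buffer, then exactly len payload bytes;
    // nothing gets overwritten, and no byte count is ever assumed.
    public static byte[] ReadMessage(Stream stream)
    {
        int len = BitConverter.ToInt32(ReadExactly(stream, 4), 0);
        return ReadExactly(stream, len);
    }
}
```

With exact reads like this, it no longer matters whether the header and data were written as two Write calls or one combined buffer: the byte stream is identical either way, and the reader consumes it by count rather than by arrival chunk.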