How can I use C# to retrieve Java data from a socket? - c#

I want to send BigInteger data in socket and my friend wants to retrieve the data.
I am using Java, but my friend uses C#.
String str = "Hello";
BigInteger big = new BigInteger(str.getBytes());
byte[] byteToBeSent = big.toByteArray();
I am sending this byte array (byteToBeSent) through a socket, and my friend wants to retrieve "Hello".
How is this possible?

From Java, send your string using String.getBytes(encoding) and specify the encoding to match how your friend will read it (e.g. UTF-8).
This will translate your string into a byte stream that the C# end can decode, because you're both agreeing on the encoding mechanism.
I'm not sure what your BigInteger mechanism is doing, but I don't believe it'll be portable, nor that it will handle sizable strings.

Honestly, your best bet would be to use the built-in Encoding classes in C#.
string str = "Hello";
byte[] data = System.Text.Encoding.UTF8.GetBytes(str);
And then send that through the socket.

Why do you use BigInteger to send String data? Or is that just an example?
If you want to send String data, use String.getBytes(String encoding), send the result and decode it using System.Text.Encoding.

You'll need to get a custom BigInteger class for C#.
But parsing "Hello" as a BigInteger isn't going to work. If you want to send text, you're better off using Navaar's method.

On the C# end, you can use System.Text.Encoding.ASCII.GetString(bytesToConvert) to convert the received byte array back to a string.
I had thought that was some sort of Java idiom for converting a string to a byte array. It does correctly convert the string into ASCII bytes, and since a BigInteger's length is arbitrary, the length of the string should not be an issue.
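Putting the two halves of that advice together, here is a minimal Java sketch of the sending side with an explicit charset (the C# receiver would call the matching Encoding.UTF8.GetString; the helper names here are illustrative, not part of any API):

```java
import java.nio.charset.StandardCharsets;

public class StringOverSocket {
    // Encode with an explicit charset so both ends agree on the byte layout.
    public static byte[] toBytes(String s) {
        return s.getBytes(StandardCharsets.UTF_8);
    }

    // What the C# side does with Encoding.UTF8.GetString, shown here in Java.
    public static String fromBytes(byte[] data) {
        return new String(data, StandardCharsets.UTF_8);
    }

    public static void main(String[] args) {
        byte[] wire = toBytes("Hello");      // bytes to write to the socket
        System.out.println(fromBytes(wire)); // prints "Hello"
    }
}
```

No BigInteger detour is needed: the byte array round-trips as long as both sides name the same encoding.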

Related

How should a real packet be constructed?

Probably this question will anger all the Stack Overflow gods, but I just can't get my head around what a packet should look like.
I mean, how do I make a packet following these guidelines?
byte[] data0 = Encoding.Unicode.GetBytes("1");
byte[] data1 = Encoding.Unicode.GetBytes(0x03);
string data2 = "127.0.0.0:80";
string data3 = "";
Do I just build the pieces like this and then make one byte list/array out of them?
Or does each string have to be converted to bytes first and then packed into the array/list?
Or maybe someone has a simple explanation of how it's done?
You appear to be trying to implement a client to use Valve's "Master Server Query Protocol". If you click on the "String Zero" link, you'll find it simply describes a null-terminated string. Presumably the encoding is ASCII, but I don't see anything in the documentation that makes that clear.
You will need to construct a datagram with the bytes formatted as described in the document. For characters in the ASCII range, you can simply cast char to byte: char uses UTF-16, which shares its code-point values with ASCII for that range (non-ASCII characters would produce values outside the valid ASCII range). However, none of this really matters in this particular example, because the only part of the protocol described as a specific character is one whose byte value you already know, so you can just specify it explicitly.
To build up the byte[] for the datagram, I'd recommend using MemoryStream (you could also use BinaryWriter for more complex types of data, but here you're really only dealing with bytes…BinaryWriter uses its own length-prefixed format for strings, so you'd have to convert to byte[] for the strings anyway). Something like the following ought to work:
byte[] GetMasterServerQueryDatagram(byte regionCode, string address, string filter)
{
    using (MemoryStream stream = new MemoryStream())
    {
        stream.WriteByte(0x31); // '1': the query header byte
        stream.WriteByte(regionCode);
        byte[] stringZero = Encoding.ASCII.GetBytes(address + "\0");
        stream.Write(stringZero, 0, stringZero.Length);
        stringZero = Encoding.ASCII.GetBytes(filter + "\0");
        stream.Write(stringZero, 0, stringZero.Length);
        return stream.ToArray();
    }
}
Notes:
You might want to declare an enum based on the table in the Valve documentation to represent the regionCode value, so that your other code can refer to the region by name.
Pay close attention to the requirement in the documentation that you pass "0.0.0.0:0" as the first IP:port value, but then pass the last value returned by their servers in your subsequent queries.
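For comparison, the same construction on the Java side can be sketched with ByteArrayOutputStream playing the role of MemoryStream (the method name is illustrative, and the protocol details should still be checked against Valve's documentation):

```java
import java.io.ByteArrayOutputStream;
import java.nio.charset.StandardCharsets;

public class MasterServerQuery {
    // Builds the query datagram: the header byte 0x31 ('1'), the region code,
    // then two null-terminated ASCII strings.
    public static byte[] build(byte regionCode, String address, String filter) {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        out.write(0x31);
        out.write(regionCode);
        byte[] s = (address + "\0").getBytes(StandardCharsets.US_ASCII);
        out.write(s, 0, s.length);
        s = (filter + "\0").getBytes(StandardCharsets.US_ASCII);
        out.write(s, 0, s.length);
        return out.toByteArray();
    }
}
```

As with the C# version, the trailing "\0" on each string produces the null-terminated "String Zero" layout the protocol asks for.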

Socket message header building

I'm working on a protocol which will transfer blocks of XML data via a TCP socket. Say I need to read all the bytes from an XML file and build a memory buffer. Then, before sending the actual data bytes, I need to send a header to the other peer. Say my protocol uses the header format below.
MessageID=100,Size=232,CRC=190
string strHeader = "100,232,190";
Now I would like to know how I can make this header length fixed (a fixed header length is required for the other peer to identify it as a header) for any amount of XML data. Currently, say I have an XML file of 283637 bytes, so the message header will look like:
string strHeader = "100,283637,190";
How can I make it generic for any size of data? The code is being written in both C++ and C#.
There are a number of ways to do it.
Fixed Length
You can pad the numbers with leading zeroes so you know exactly what length of text you need to work with: 000100,000232,000190
Use Bytes instead of strings
If you are using integers, you can read the bytes as integers instead of manipulating the string; look into the BinaryReader class. If you need to do this on the C++ side, the concept is still the same: there are many ways to convert 4 bytes into an int.
Specify the length at the beginning
Usually when working with dynamic-length strings, there is an indicator of how many bytes need to be read in order to get the entire string. You could specify the first 4 bytes of your message as the length of your string and then read up to that point.
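The length-prefix idea can be sketched as follows (shown here in Java with DataOutputStream; the 4-byte big-endian prefix is an assumption that both peers would have to agree on):

```java
import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.DataInputStream;
import java.io.DataOutputStream;
import java.io.IOException;
import java.io.InputStream;

public class LengthPrefixed {
    // Frame a payload as: 4-byte big-endian length, then the payload itself.
    public static byte[] frame(byte[] payload) throws IOException {
        ByteArrayOutputStream buf = new ByteArrayOutputStream();
        DataOutputStream out = new DataOutputStream(buf);
        out.writeInt(payload.length);
        out.write(payload);
        return buf.toByteArray();
    }

    // Read one framed message back: the prefix tells us exactly how far to read.
    public static byte[] unframe(InputStream in) throws IOException {
        DataInputStream data = new DataInputStream(in);
        byte[] payload = new byte[data.readInt()];
        data.readFully(payload);
        return payload;
    }
}
```

On the C++ side the receiver would read 4 bytes, interpret them with the agreed byte order, and then read that many payload bytes.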
The best approach for you is to implement this as a struct, like:
typedef struct _msg_hdr {
    int messageID;
    int size;
    int crc;
} msg_hdr;
This will always be 12 bytes long (assuming 4-byte ints). Now, when sending your message, first send the header to the receiver. The receiver should read it into the same structure. This is the best and easiest way.
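One caveat with sending a raw struct is that it assumes both ends agree on endianness and padding. Packing the three fields explicitly avoids that; here is a sketch of the same fixed 12-byte header in Java using ByteBuffer (the class name and the choice of big-endian network order are assumptions):

```java
import java.nio.ByteBuffer;
import java.nio.ByteOrder;

public class MessageHeader {
    public static final int SIZE = 12; // three 4-byte ints

    // Pack messageID, size and crc into a fixed 12-byte header.
    public static byte[] pack(int messageID, int size, int crc) {
        return ByteBuffer.allocate(SIZE)
                .order(ByteOrder.BIG_ENDIAN) // network byte order, by convention
                .putInt(messageID)
                .putInt(size)
                .putInt(crc)
                .array();
    }

    // Unpack a received header into { messageID, size, crc }.
    public static int[] unpack(byte[] header) {
        ByteBuffer buf = ByteBuffer.wrap(header).order(ByteOrder.BIG_ENDIAN);
        return new int[] { buf.getInt(), buf.getInt(), buf.getInt() };
    }
}
```

Because the header is always exactly 12 bytes, the receiver can read a fixed-size chunk before deciding how much payload follows.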

In .NET, why does SerialPort.ReadExisting() return a String instead of a byte array?

In the .NET SerialPort class the ReadExisting() method returns a String instead of an array of bytes. This seems like an odd choice considering that RS232 is typically used to move 7 or 8 bit values, which may or may not be printable characters. Is there a reason for this choice?
Currently I end up using System.Text.Encoding.ASCII.GetBytes(recvd_data) to convert the String to a byte array. Is there a more efficient method?
SerialPort has a Read overload that reads into the specified Byte[].
http://msdn.microsoft.com/en-us/library/ms143549(v=vs.100).aspx
I have used SerialPort extensively, and the best way I've found to read a series of bytes is making multiple calls to ReadByte(). Yes, you read one byte at a time, but I've found that keeping it simple has avoided problems.
At best, this method will save you from having to do a convert (since you'll read into a byte array).

Convert the int from c# gethashcode() back to string?

A really simple question:
I am doing a simple thing. I have a few strings like:
string A = "usd";
And I want to get the hashcode in C#:
int audusdReqId = Convert.ToInt32("usd".GetHashCode());
Now how could I convert the 32-bit integer audusdReqId back to the string "usd"?
You cannot. A hash code does not contain all of the necessary information to convert it back to a string. More importantly, you should be careful what you use GetHashCode for. This article explains what you should use it for. You can't even guarantee that GetHashCode will return the same thing in different environments. So you should not be using GetHashCode for any cryptographic purposes.
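A concrete way to see why inversion is impossible: hash codes collide, so the mapping from string to hash code loses information. C#'s GetHashCode and Java's hashCode differ in detail, but the principle is the same; in Java, for instance, "Aa" and "BB" produce the identical hash code:

```java
public class HashCollision {
    public static void main(String[] args) {
        // Two different strings, one hash code: 31*'A'+'a' == 31*'B'+'B' == 2112.
        System.out.println("Aa".hashCode()); // prints 2112
        System.out.println("BB".hashCode()); // prints 2112
    }
}
```

Since two inputs map to the same output, no function could take 2112 back to one original string.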
If you are trying to create a binary representation of a string, you can use
var bytes = System.Text.Encoding.UTF8.GetBytes(someString);
This will give you a byte[] for that particular string. You can use GetString on Encoding to convert it back to a string:
var originalString = System.Text.Encoding.UTF8.GetString(bytes);
If you are trying to cryptographically secure the string, you can use a symmetric encryption algorithm like AES as Robert Harvey pointed out in the comments.
The purpose of a hash code / hash function is that it is one-way: it cannot be converted back to the original value.
There are good reasons why this is so. A common usage is password storage in a database, for example: you shouldn't be able to recover the original value (the plain-text password), which is why you would normally store it as a one-way hash.
There are also other usages, such as storing values in hash sets, etc.

C# analog for getBytes in java

Java's String class has a wonderful method called getBytes.
In C#, the equivalent is implemented in another class, Encoding, but unfortunately it returns an array of unsigned bytes, which is a problem.
How is it possible to get an array of signed bytes in C# from a string?
Just use Encoding.GetBytes, then convert the byte[] to an sbyte[] using something like Buffer.BlockCopy. However, I'd strongly encourage you to use the unsigned bytes instead: work around whatever problem you're having with them rather than moving to signed bytes, which were frankly a mistake in Java to start with. The reason there's no built-in way of converting a string to a signed byte array is that it's rarely something you really want to be doing.
If you can tell us a bit about why the unsigned bytes are causing you a problem, we may well be able to help you with that instead.
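To illustrate the signed/unsigned mismatch the answer refers to: in Java, any byte value above 127 reads back as negative, and you mask with 0xFF to recover the unsigned value. A quick Java sketch:

```java
public class SignedBytes {
    public static void main(String[] args) {
        byte b = (byte) 0xC8;         // the unsigned value 200 on the wire
        System.out.println(b);        // prints -56: same bits, signed view
        System.out.println(b & 0xFF); // prints 200: mask back to unsigned
    }
}
```

The bits on the wire are identical either way; only the interpretation differs, which is why staying with C#'s unsigned byte is usually the simpler path.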
