Converting C OpenSSL TripleDES encryption to .NET - C#

I have been trying to replicate an encryption process for a third-party integration. The third party uses OpenSSL and has given me the C code they use to perform the process. I have been trying to port this process to C# for weeks, but I appear to be missing something I cannot work out for the life of me; most likely it is something in transposing the C code and the OpenSSL library calls.
The main port of OpenSSL to .NET (https://github.com/openssl-net/openssl-net) unfortunately does not support TripleDES, so it cannot be used.
Here is the example C code:
#include <stdlib.h>
#include <stdio.h>
#include <errno.h>
#include <string.h>
#include <ctype.h>
#include <openssl/dh.h>
#include <openssl/pem.h>
#include <openssl/engine.h>
#include <openssl/bn.h>
#include <openssl/des.h>
#include <openssl/rand.h>
static void encrypt_ean(int argc, char **argv)
{
    size_t i;
    if (argc != 3)
        usage(); /* usage(), eanlen and g_verbose are defined elsewhere in the program */
    char *mwkstr = argv[1];
    char *ean = argv[2];
    unsigned char mwk[24];
    hex2bin(mwkstr, 48, mwk);
    DES_key_schedule keysched[3];
    set_key_checked(mwk, keysched);
    unsigned char idata[16];
    unsigned char odata[16];
    DES_cblock zero_iv;
    memset(&zero_iv, 0, sizeof(zero_iv));
    if (RAND_bytes(idata+0, 8) != 1) {
        fprintf(stderr, "RAND_bytes failed.\n");
        exit(1);
    }
    for (i=0; i<8; i++)
        idata[8+i] = (i >= eanlen) ? ' ' : ean[i];
    idata[7] = chksum((char *)idata+8, 8);
    if (g_verbose) {
        printf("ean = %s\n", ean);
        printf("idata = ");
        for (i=0; i<sizeof(idata); i++)
            printf("%d %02X\n", (int)i, idata[i]);
        printf("\n");
    }
    DES_ede3_cbc_encrypt(idata, odata, sizeof(odata),
        &keysched[0], &keysched[1], &keysched[2], &zero_iv, DES_ENCRYPT);
    for (i=0; i<sizeof(odata); i++)
        printf("%02X", odata[i]);
    printf("\n");
}
static unsigned char chksum(char *data, size_t datalen)
{
    size_t i;
    unsigned char sum=0;
    for (i=0; i<datalen; i++)
        sum += data[i];
    return sum;
}
static void hex2bin(const char *str, int len, unsigned char *bin)
{
    int i, j, x;
    for (i=0, j=0; i<len; i+=2) {
        char tmpstr[3];
        tmpstr[0] = str[i+0];
        tmpstr[1] = str[i+1];
        tmpstr[2] = '\0';
        sscanf(tmpstr, "%02X", &x);
        bin[j++] = x;
    }
}
static int set_key_checked(unsigned char *key, DES_key_schedule *keysched)
{
    if (DES_set_key_checked((const_DES_cblock *)(key+0), &keysched[0]) < 0) {
set_key_err:
        fprintf(stderr, "DES_set_key_checked failed.\n");
        exit(1);
    }
    if (DES_set_key_checked((const_DES_cblock *)(key+8), &keysched[1]) < 0)
        goto set_key_err;
    if (DES_set_key_checked((const_DES_cblock *)(key+16), &keysched[2]) < 0)
        goto set_key_err;
    return 0;
}
And here is my C# code (consider ean = pin for easier transposing):
internal static class PINEncoding
{
    internal static string EncodePIN(string unencodedPIN, string decryptedWorkingKey)
    {
        var bytes = GenerateRandomBytes();
        var asciiPin = ConvertPINToASCIIBytes(unencodedPIN);
        var checksum = new byte[1];
        checksum[0] = ComputeChecksum(asciiPin);
        var pinBlock = ObtainPinBlock(bytes, checksum, asciiPin);
        return EncryptPIN(pinBlock, decryptedWorkingKey);
    }
    private static byte[] GenerateRandomBytes()
    {
        // Note: System.Random is not cryptographically secure;
        // the C code uses RAND_bytes for these 7 bytes.
        Random rnd = new Random();
        byte[] b = new byte[7];
        rnd.NextBytes(b);
        return b;
    }
    private static byte[] ConvertPINToASCIIBytes(string pin)
    {
        return ASCIIEncoding.ASCII.GetBytes(pin);
    }
    private static byte ComputeChecksum(byte[] data)
    {
        long longSum = data.Sum(x => (long)x);
        return unchecked((byte)longSum);
    }
    private static byte[] ObtainPinBlock(byte[] random, byte[] checksum, byte[] asciiPin)
    {
        var result = new byte[random.Length + checksum.Length + asciiPin.Length];
        Buffer.BlockCopy(random, 0, result, 0, random.Length);
        Buffer.BlockCopy(checksum, 0, result, random.Length, checksum.Length);
        Buffer.BlockCopy(asciiPin, 0, result, random.Length + checksum.Length, asciiPin.Length);
        return result;
    }
    private static string EncryptPIN(byte[] eanBlock, string decryptedWorkingKey)
    {
        var keyAsBytes = HexStringBytesConverter.ConvertHexStringToByteArray(decryptedWorkingKey);
        var byteResult = TripleDESEncryption.Encrypt(eanBlock, keyAsBytes);
        return BitConverter.ToString(byteResult).Replace("-", "");
    }
}
public static class TripleDESEncryption
{
    public static byte[] Encrypt(byte[] toEncrypt, byte[] key)
    {
        using (var tdes = new TripleDESCryptoServiceProvider
        {
            Key = key,
            IV = new byte[8] { 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00, 0x00 },
            Mode = CipherMode.CBC,
            Padding = PaddingMode.None
        })
        {
            var cTransform = tdes.CreateEncryptor();
            return cTransform.TransformFinalBlock(toEncrypt, 0, toEncrypt.Length);
        }
    }
}
One of my example inputs with its expected output:
Unencoded PIN: 71548715
Decrypted Working Key: A7E5A86DB6F41FBA0DE99DE5BC3246ABA7E5A86DB6F41FBA
Expected Encryption Result: C097280EC13B486AE5DA57DB8F779184
Result obtained by the above: C909165718FCE9A432AD432E7A104DCD

The C/C++ code performs a TripleDES encryption in CBC mode without padding. The input parameters are the hex-encoded key (argv[1]) and the EAN/PIN (argv[2]). Before encryption, the EAN/PIN is preceded by an 8-byte value whose first 7 bytes are randomly generated with RAND_bytes() and whose 8th byte is a checksum generated with chksum(). A zero IV is used.
The C# code does the same! Of course, because of the leading 7 random bytes, this cannot be verified by simply comparing the ciphertexts as you did, but only by comparing ciphertexts produced with identical leading 7 bytes in both codes.
The leading 7 bytes for this test can be determined beforehand by decrypting the posted expected ciphertext with the posted key and a zero IV, using a tool or other code. This decryption returns the hex-encoded value 51174b043d6274a63731353438373135 (performed e.g. with http://tripledes.online-domain-tools.com/), of which the last 8 bytes are the ASCII encoding of 71548715, thus corresponding to the posted EAN/PIN. The leading 7 bytes are hex encoded 51174b043d6274.
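The same decryption can be done in C# in a few lines (a minimal sketch reusing the question's HexStringBytesConverter helper):
using System;
using System.Security.Cryptography;

static class RecoverPrefix
{
    static void Main()
    {
        var key = HexStringBytesConverter.ConvertHexStringToByteArray(
            "A7E5A86DB6F41FBA0DE99DE5BC3246ABA7E5A86DB6F41FBA");
        var ciphertext = HexStringBytesConverter.ConvertHexStringToByteArray(
            "C097280EC13B486AE5DA57DB8F779184");
        using (var tdes = new TripleDESCryptoServiceProvider
        {
            Key = key,
            IV = new byte[8], // zero IV, as in the C code
            Mode = CipherMode.CBC,
            Padding = PaddingMode.None
        })
        {
            var plain = tdes.CreateDecryptor()
                            .TransformFinalBlock(ciphertext, 0, ciphertext.Length);
            // Prints 51174B043D6274A63731353438373135: 7 random prefix bytes,
            // one checksum byte (0xA6), then the ASCII bytes of "71548715".
            Console.WriteLine(BitConverter.ToString(plain).Replace("-", ""));
        }
    }
}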
If for the test of the C# code in EncodePIN() the line
var bytes = GenerateRandomBytes();
is replaced by
var bytes = HexStringBytesConverter.ConvertHexStringToByteArray("51174b043d6274");
the call
Console.WriteLine(PINEncoding.EncodePIN("71548715", "A7E5A86DB6F41FBA0DE99DE5BC3246ABA7E5A86DB6F41FBA"));
returns
C097280EC13B486AE5DA57DB8F779184
in accordance with the expected ciphertext, proving that the C/C++ and C# code are functionally identical.
Note that the C/C++ code actually has a bit more functionality under the hood, e.g. checking the key (see DES_set_key_checked), which however has no effect on the result if the key is valid (odd parity, not weak or semi-weak).
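If you want to mirror that check in C#, here is a minimal sketch of the parity part only (the weak/semi-weak key tables are omitted; HasOddParity is a hypothetical helper name):
static bool HasOddParity(byte[] key)
{
    // Every DES key byte must have odd parity, as enforced by
    // OpenSSL's DES_set_key_checked.
    foreach (byte b in key)
    {
        int ones = 0;
        for (int bit = 0; bit < 8; bit++)
            if (((b >> bit) & 1) != 0) ones++;
        if (ones % 2 == 0) return false;
    }
    return true;
}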

Related

SHA1 hash difference between c# windows store app and Objective-c iOS

My Windows Store app on the Surface Pro uses SHA1 to send a hashed password to the server. I wanted to do the same thing in the iOS app, but I get different results and I just don't know why.
I tried different encodings (NSASCII, UTF8, Unicode) when converting NSStrings to C strings, but to no avail.
C# Windows Store app (Surface Pro):
/// <summary>
/// Literal copy of the values from the web.config machineKey
/// </summary>
const String validationKey = "6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
public static string HmacSha1(string baseString)
{
    var crypt = MacAlgorithmProvider.OpenAlgorithm("HMAC_SHA1");
    var buffer = CryptographicBuffer.CreateFromByteArray(Encoding.Unicode.GetBytes(baseString));
    var keyBuffer = Windows.Security.Cryptography.CryptographicBuffer.CreateFromByteArray(HexToByte(validationKey));
    var key = crypt.CreateKey(keyBuffer);
    var sigBuffer = CryptographicEngine.Sign(key, buffer);
    string signature = CryptographicBuffer.EncodeToBase64String(sigBuffer);
    return signature;
}
//
// HexToByte
// Converts a hexadecimal string to a byte array. Used to convert encryption
// key values from the configuration.
//
private static byte[] HexToByte(string hexString)
{
    byte[] returnBytes = new byte[hexString.Length / 2];
    for (int i = 0; i < returnBytes.Length; i++)
        returnBytes[i] = Convert.ToByte(hexString.Substring(i * 2, 2), 16);
    return returnBytes;
}
Objective-C iOS:
// Hash password
NSString *secret = @"6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
NSString *data = password;
const char *cKey = [secret cStringUsingEncoding:NSASCIIStringEncoding];
const char *cData = [data cStringUsingEncoding:NSASCIIStringEncoding];
unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
CCHmac(kCCHmacAlgSHA1, cKey, strlen(cKey), cData, strlen(cData), cHMAC);
NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
NSString *signature = [HMAC base64EncodedStringWithOptions:0];
NSString *clientPassword = signature;
So some genius helped me figure it out. In the C# code the password is encoded as Unicode (UTF-16LE), as can be seen in this line:
Encoding.Unicode.GetBytes(baseString)
That means I should have changed the NSASCII encoding line in Objective-C to:
const char *cPassword = [password cStringUsingEncoding:NSUnicodeStringEncoding];
Now comes something tricky. Unicode encodes a string like 'Welkom123' like this: W\0e\0l\0k\0o\0m\01\02\03\0. That means I can't use strlen(cData) to determine the length of cData, because strlen stops counting at the first \0, so the length had been wrong the whole time. I had to replace strlen(cData) with (password.length * 2).
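A quick C# sketch makes that doubling visible:
using System;
using System.Text;

class UnicodeDemo
{
    static void Main()
    {
        byte[] utf16 = Encoding.Unicode.GetBytes("Welkom123");
        // 57-00-65-00-6C-00-6B-00-6F-00-6D-00-31-00-32-00-33-00
        Console.WriteLine(BitConverter.ToString(utf16));
        Console.WriteLine(utf16.Length); // 18 == "Welkom123".Length * 2
    }
}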
But even then the outcome was still not right. The key had to be converted to hexadecimal data, just like in the C# code, and that NSData should then be used to create the CCHmac. This code turns the NSString into NSData with the hexadecimal values:
int len = (int)([inputString length] / 2); // Target length
unsigned char *buf = malloc(len);
unsigned char *whole_byte = buf;
char byte_chars[3] = {'\0','\0','\0'};
int i;
for (i = 0; i < [inputString length] / 2; i++) {
    byte_chars[0] = [inputString characterAtIndex:i*2];
    byte_chars[1] = [inputString characterAtIndex:i*2+1];
    *whole_byte = strtol(byte_chars, NULL, 16);
    whole_byte++;
}
NSData *data = [NSData dataWithBytes:buf length:len];
free(buf);
return data;
In the end the total code would look like this:
-(NSDictionary *)authenticateWithUsername:(NSString *)userName andPassword:(NSString *)password andSyncUrl:(NSString *)syncUrl
{
    NSString *key = @"6DB51F17C529AD3CABEC50B3C89CB21F4F1422F58A5B42D0E8DB8CB5CDA146511891C1BAF47F8D29401E3400267682B202B7DA146511891C1BAF47F8D29401E3";
    NSData *data = [self stringToHexData:key];
    const char *cPassword = [password cStringUsingEncoding:NSUnicodeStringEncoding];
    unsigned char cHMAC[CC_SHA1_DIGEST_LENGTH];
    CCHmac(kCCHmacAlgSHA1, data.bytes, data.length, cPassword, (password.length * 2), cHMAC);
    NSData *HMAC = [[NSData alloc] initWithBytes:cHMAC length:sizeof(cHMAC)];
    NSString *hashedPassword = [HMAC base64EncodedStringWithOptions:0];
    NSString *clientPassword = hashedPassword;
    // some code here left out for readability's sake
    return dict;
}
- (NSData *)stringToHexData:(NSString *)inputString
{
    int len = (int)([inputString length] / 2); // Target length
    unsigned char *buf = malloc(len);
    unsigned char *whole_byte = buf;
    char byte_chars[3] = {'\0','\0','\0'};
    int i;
    for (i = 0; i < [inputString length] / 2; i++) {
        byte_chars[0] = [inputString characterAtIndex:i*2];
        byte_chars[1] = [inputString characterAtIndex:i*2+1];
        *whole_byte = strtol(byte_chars, NULL, 16);
        whole_byte++;
    }
    NSData *data = [NSData dataWithBytes:buf length:len];
    free(buf);
    return data;
}

How to encrypt a string using public key cryptography

I am trying to implement my own RSA encryption engine. Given these RSA algorithm values:
p = 61. // A prime number.
q = 53. // Also a prime number.
n = 3233. // p * q.
totient = 3120. // (p - 1) * (q - 1)
e = 991. // Co-prime to the totient (co-prime to 3120).
d = 1231. // d * e = 1219921 = 1 + 391 * 3120, i.e. d * e ≡ 1 (mod totient).
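These values can be sanity-checked quickly; d * e mod totient must be 1 (a minimal sketch using System.Numerics):
using System;
using System.Numerics;

class RsaParamCheck
{
    static void Main()
    {
        BigInteger e = 991, d = 1231, totient = 3120;
        // 991 * 1231 = 1219921 = 1 + 391 * 3120
        Console.WriteLine(d * e % totient); // prints 1
    }
}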
I am trying to write a method to encrypt each byte in a string and return an encrypted string:
public string Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = (byte)BigInteger.ModPow(bytes[i], e, n);
    }
    string encryptedString = encoding.GetString(bytes);
    Console.WriteLine("Encrypted {0} as {1}.", m, encryptedString);
    return encryptedString;
}
The obvious issue here is that BigInteger.ModPow(bytes[i], e, n) may be too large to fit into a byte; it can produce values that need more than 8 bits. How do you get around this issue while still being able to decrypt an encrypted string of bytes back into a regular string?
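To make the problem concrete, here is a minimal sketch; for the byte 'H' (72) and these parameters the ciphertext is 1453, well outside the range a byte can hold:
using System;
using System.Numerics;

class OverflowDemo
{
    static void Main()
    {
        // 'H' is byte 72; e = 991, n = 3233.
        Console.WriteLine(BigInteger.ModPow(72, 991, 3233)); // 1453 > 255
    }
}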
Update: Even encrypting from byte[] to byte[], you reach a case where encrypting that byte using the RSA algorithm goes beyond the size limit of a byte:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    for (int i = 0; i < bytes.Length; i++)
    {
        bytes[i] = (byte)BigInteger.ModPow(bytes[i], e, n);
    }
    return bytes;
}
Update: My issue is that encryption produces more bytes than the initial input string had:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < bytes.Length; i++)
    {
        byte[] result = BigInteger.ModPow(bytes[i], (BigInteger)e, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length);
        result.CopyTo(returnBytes, preSize);
    }
    return returnBytes;
}
public string Decrypt(byte[] c, Encoding encoding)
{
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < c.Length; i++)
    {
        byte[] result = BigInteger.ModPow(c[i], d, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length);
        result.CopyTo(returnBytes, preSize);
    }
    string decryptedString = encoding.GetString(returnBytes);
    return decryptedString;
}
If you ran this code like this:
byte[] encryptedBytes = engine.Encrypt("Hello, world.", Encoding.UTF8);
Console.WriteLine(engine.Decrypt(encryptedBytes, Encoding.UTF8));
The output would be this:
?♥D
?♥→☻►♦→☻►♦oD♦8? ?♠oj?♠→☻►♦;♂?♠♂♠?♠
Obviously, the output is not the original string, because I can't just decrypt one byte at a time: sometimes two or more bytes of the ciphertext represent the value of one integer that I need to decrypt back to one byte of the original string. So I want to know what the standard mechanism for handling this is.
Your basic code for encrypting and decrypting each byte - the call to ModPow - is working, but you're going about the "splitting the message up and encrypting each piece" inappropriately.
To show that the ModPow part - i.e. the maths - is fine, here's code based on yours, which encrypts a string to a BigInteger[] and back:
using System;
using System.Linq;
using System.Numerics;
using System.Text;

class Test
{
    const int p = 61;
    const int q = 53;
    const int n = 3233;
    const int totient = 3120;
    const int e = 991;
    const int d = 1231;

    static void Main()
    {
        var encrypted = Encrypt("Hello, world.", Encoding.UTF8);
        var decrypted = Decrypt(encrypted, Encoding.UTF8);
        Console.WriteLine(decrypted);
    }

    static BigInteger[] Encrypt(string text, Encoding encoding)
    {
        byte[] bytes = encoding.GetBytes(text);
        return bytes.Select(b => BigInteger.ModPow(b, (BigInteger)e, n))
                    .ToArray();
    }

    static string Decrypt(BigInteger[] encrypted, Encoding encoding)
    {
        byte[] bytes = encrypted.Select(bi => (byte)BigInteger.ModPow(bi, d, n))
                                .ToArray();
        return encoding.GetString(bytes);
    }
}
Next you need to read more about how a byte[] is encrypted into another byte[] using RSA, including all the different padding schemes etc. There's a lot more to it than just calling ModPow on each byte.
But to reiterate, you should not be doing this to end up with a production RSA implementation. The chances of you doing that without any security flaws are very slim indeed. It's fine to do this for academic interest, to learn more about the principles of cryptography, but leave the real implementations to experts. (I'm far from an expert in this field - there's no way I'd start implementing my own encryption...)
Note: I updated this answer. Please scroll down to the update for how it should actually be implemented because this first way of doing it is not the correct way of doing RSA encryption.
One way I can think of to do it is like this (though it may not be standards-compliant); also note that this does not pad:
public byte[] Encrypt(string m, Encoding encoding)
{
    byte[] bytes = encoding.GetBytes(m);
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < bytes.Length; i++)
    {
        byte[] result = BigInteger.ModPow(bytes[i], (BigInteger)e, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + result.Length + 1);
        (new byte[] { (byte)(result.Length) }).CopyTo(returnBytes, preSize);
        result.CopyTo(returnBytes, preSize + 1);
    }
    return returnBytes;
}
public string Decrypt(byte[] c, Encoding encoding)
{
    byte[] returnBytes = new byte[0];
    for (int i = 0; i < c.Length; i++)
    {
        int dataLength = (int)c[i];
        byte[] result = new byte[dataLength];
        for (int j = 0; j < dataLength; j++)
        {
            i++;
            result[j] = c[i];
        }
        BigInteger integer = new BigInteger(result);
        byte[] integerResult = BigInteger.ModPow(integer, d, n).ToByteArray();
        int preSize = returnBytes.Length;
        Array.Resize(ref returnBytes, returnBytes.Length + integerResult.Length);
        integerResult.CopyTo(returnBytes, preSize);
    }
    string decryptedString = encoding.GetString(returnBytes);
    return decryptedString;
}
This has the potential to be cross-platform, because you have the option of using a different datatype to represent e or n and passing it to a C# back-end service. Here is a test:
string stringToEncrypt = "Mary had a little lamb.";
Console.WriteLine("Encrypting the string: {0}", stringToEncrypt);
byte[] encryptedBytes = engine.Encrypt(stringToEncrypt, Encoding.UTF8);
Console.WriteLine("Encrypted text: {0}", Encoding.UTF8.GetString(encryptedBytes));
Console.WriteLine("Decrypted text: {0}", engine.Decrypt(encryptedBytes, Encoding.UTF8));
Output:
Encrypting the string: Mary had a little lamb.
Encrypted text: ☻6☻1♦☻j☻☻&♀☻g♦☻t☻☻1♦☻? ☻g♦☻1♦☻g♦☻?♥☻?☻☻7☺☻7☺☻?♥☻?♂☻g♦☻?♥☻1♦☻$☺☻
c ☻?☻
Decrypted text: Mary had a little lamb.
Update: Everything I said earlier is completely wrong as an implementation of RSA. Wrong, wrong, wrong! This is the correct way to do RSA encryption:
Convert your string to a BigInteger datatype.
Make sure your integer is smaller than the value of n that you've calculated for your algorithm, otherwise you won't be able to decipher it.
Encrypt the integer. RSA works on integer encryption only.
Decrypt it from the encrypted integer.
I can't help but wonder whether the BigInteger class was mostly created for cryptography.
As an example:
using System;
using System.Collections.Generic;
using System.Linq;
using System.Numerics;
using System.Security.Cryptography;
using System.Text;
using System.Threading.Tasks;

namespace BytePadder
{
    class Program
    {
        const int p = 61;
        const int q = 53;
        const int n = 3233;
        const int totient = 3120;
        const int e = 991;
        const int d = 1231;

        static void Main(string[] args)
        {
            // ---------------------- RSA Example I ----------------------
            // Shows how an integer gets encrypted and decrypted.
            BigInteger integer = 1000;
            BigInteger encryptedInteger = Encrypt(integer);
            Console.WriteLine("Encrypted Integer: {0}", encryptedInteger);
            BigInteger decryptedInteger = Decrypt(encryptedInteger);
            Console.WriteLine("Decrypted Integer: {0}", decryptedInteger);
            // --------------------- RSA Example II ----------------------
            // Shows how a string gets encrypted and decrypted.
            string unencryptedString = "A";
            BigInteger integer2 = new BigInteger(Encoding.UTF8.GetBytes(unencryptedString));
            Console.WriteLine("String as Integer: {0}", integer2);
            BigInteger encryptedInteger2 = Encrypt(integer2);
            Console.WriteLine("String as Encrypted Integer: {0}", encryptedInteger2);
            BigInteger decryptedInteger2 = Decrypt(encryptedInteger2);
            Console.WriteLine("String as Decrypted Integer: {0}", decryptedInteger2);
            string decryptedIntegerAsString = Encoding.UTF8.GetString(decryptedInteger2.ToByteArray());
            Console.WriteLine("Decrypted Integer as String: {0}", decryptedIntegerAsString);
            Console.ReadLine();
        }

        static BigInteger Encrypt(BigInteger integer)
        {
            if (integer < n)
            {
                return BigInteger.ModPow(integer, e, n);
            }
            throw new Exception("The integer must be less than the value of n in order to be decipherable!");
        }

        static BigInteger Decrypt(BigInteger integer)
        {
            return BigInteger.ModPow(integer, d, n);
        }
    }
}
Example output:
Encrypted Integer: 1989
Decrypted Integer: 1000
String as Integer: 65
String as Encrypted Integer: 1834
String as Decrypted Integer: 65
Decrypted Integer as String: A
If you are looking to use RSA encryption in C# then you should not be attempting to build your own. For starters, the prime numbers you have chosen are probably too small: p and q are supposed to be large primes.
You should check out some other question/answers:
how to use RSA to encrypt files (huge data) in C#
RSA Encryption of large data in C#
And other references:
http://msdn.microsoft.com/en-us/library/system.security.cryptography.rsacryptoserviceprovider.encrypt(v=vs.110).aspx
http://msdn.microsoft.com/en-us/library/system.security.cryptography.rsacryptoserviceprovider.aspx

passing c++ char* to c# via shared-memory

Sorry for the probably simple question, but I'm a newbie with shared memory and trying to learn by example how to do things.
On the C++ side I receive this pair: const unsigned char * value, size_t length.
On the C# side I need to end up with a regular C# string. What is the best way to do that using shared memory?
It's not that easy to use the string directly.
If it were me, I'd try these ways:
1. Simply get a copy of the string. System.Text.Encoding.Default.GetString can convert a byte array to a string. In an unsafe code block (so that you can use pointer types), you could:
(1) create a byte array whose size is your "length":
byte[] buf = new byte[length];
(2) copy your data into the array:
for (int i = 0; i < length; ++i) buf[i] = value[i];
(3) get the string:
string what_you_want = System.Text.Encoding.Default.GetString(buf);
2. Write a class with a property "string what_you_want" whose getter performs the steps above each time it is accessed.
Before all of that, you would first use P/Invoke to get the values of that pair.
edit: here is an example.
C++ code:
#include <windows.h>
#include <stdio.h>

struct Pair {
    int length;
    unsigned char value[1024];
};

int main()
{
    const char* s = "hahaha";
    HANDLE handle = CreateFileMappingW(INVALID_HANDLE_VALUE, NULL, PAGE_READWRITE, 0, sizeof(Pair), L"MySharedMemory");
    struct Pair* p = (struct Pair*) MapViewOfFile(handle, FILE_MAP_READ|FILE_MAP_WRITE, 0, 0, sizeof(Pair));
    if (p != 0) {
        p->length = lstrlenA(s);
        lstrcpyA((char*)p->value, s);
        puts("plz start c# program");
        getchar();
    } else {
        puts("create shared memory error");
    }
    if (handle != NULL)
        CloseHandle(handle);
    return 0;
}
and C# code:
using System;
using System.IO.MemoryMappedFiles;

class Program
{
    static void Main(string[] args)
    {
        MemoryMappedFile mmf = MemoryMappedFile.OpenExisting("MySharedMemory");
        MemoryMappedViewStream mmfvs = mmf.CreateViewStream();
        byte[] blen = new byte[4];
        mmfvs.Read(blen, 0, 4);
        // Assemble the little-endian int32 length field.
        int len = blen[0] + blen[1] * 256 + blen[2] * 65536 + blen[3] * 16777216;
        byte[] strbuf = new byte[len];
        mmfvs.Read(strbuf, 0, len);
        string s = System.Text.Encoding.Default.GetString(strbuf);
        Console.WriteLine(s);
    }
}
This is just an example; you may also want to add error checking.
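As a variant, a MemoryMappedViewAccessor can read the struct fields without assembling the length by hand (a sketch assuming the same "MySharedMemory" name and struct layout as above):
using System;
using System.IO.MemoryMappedFiles;
using System.Text;

class Program
{
    static void Main()
    {
        using (var mmf = MemoryMappedFile.OpenExisting("MySharedMemory"))
        using (var accessor = mmf.CreateViewAccessor())
        {
            int len = accessor.ReadInt32(0);    // the struct's length field
            byte[] buf = new byte[len];
            accessor.ReadArray(4, buf, 0, len); // the value field starts at offset 4
            Console.WriteLine(Encoding.Default.GetString(buf));
        }
    }
}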

Sending an image from a C# client to a C server

If I send plain text there is no problem; everything is OK. However, if I send an image from the C# client, the server receives the correct number of bytes, but when I save the buffer to a file (in binary mode, "wb"), the file always has 4 bytes.
I send the file from the C# client using File.ReadAllBytes().
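For reference, the sending side is presumably something like this minimal sketch (stream and path are assumed names; the 4-byte little-endian length prefix must match the prefix the server's recv() reads):
using System;
using System.IO;
using System.Net.Sockets;

class Sender
{
    static void SendFile(NetworkStream stream, string path)
    {
        byte[] data = File.ReadAllBytes(path);
        // 4-byte little-endian length prefix, then the raw image bytes.
        stream.Write(BitConverter.GetBytes(data.Length), 0, 4);
        stream.Write(data, 0, data.Length);
    }
}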
My saving code looks like this:
FILE * pFile;
char *buf = ReceiveMessage(s);
pFile = fopen (fileName , "wb");
fwrite(buf, sizeof(buf[0]), sizeof(buf)/sizeof(buf[0]), pFile);
fclose (pFile);
free(buf);
My receiving function looks like this:
static unsigned char *ReceiveMessage(int s)
{
    int prefix;
    recv(s, &prefix, 4, 0);
    int len = prefix;
    char *buffer = (char*)malloc(len + 1);
    int received = 0, totalReceived = 0;
    buffer[len] = '\0';
    while (totalReceived < len)
    {
        if (len - totalReceived > BUFFER_SIZE)
        {
            received = recv(s, buffer + totalReceived, BUFFER_SIZE, 0);
        }
        else
        {
            received = recv(s, buffer + totalReceived, len - totalReceived, 0);
        }
        totalReceived += received;
    }
    return buffer;
}
Your C code needs to pass len back from the ReceiveMessage() function:
char *buf = ReceiveMessage(s); // buf is a char*
... sizeof(buf)                // sizeof(char*) is 4 or 8
So you'll need something like:
static unsigned char *ReceiveMessage(int s, int *lenOut)
{
    ...
    *lenOut = totalReceived;
}
You've made a beginner's mistake in using sizeof(buf). It doesn't return the number of bytes in the buffer but the size of the pointer (which is four or eight depending on whether you run a 32- or 64-bit platform).
You need to change the ReceiveMessage function to also "return" the size of the received data.
You do not get the size of an array with sizeof. Change to e.g.:
int len = 0;
char *buf;
buf = ReceiveMessage(s, &len);
/* then use len to calculate the write length */

static unsigned char *ReceiveMessage(int s, int *len)
/* or return len and pass a ptr to buf */
{
    ...
}

How to calculate CRC_B in C#

How to calculate CRC_B encoding in C# as described in ISO 14443?
Here is some background info:
CRC_B encoding
This annex is provided for explanatory purposes and indicates the bit patterns that will
exist in the physical layer. It is included for the purpose of checking an ISO/IEC
14443-3 Type B implementation of CRC_B encoding. Refer to ISO/IEC 3309 and CCITT X.25
2.2.7 and V.42 8.1.1.6.1 for further details. Initial Value = 'FFFF'
Example 1: for 0x00 0x00 0x00 you should end up with CRC_B of 0xCC 0xC6
Example 2: for 0x0F 0xAA 0xFF you should end up with CRC_B of 0xFC 0xD1
I tried some random CRC16 libraries, but they aren't giving me the same results. I didn't get the same results from online checkers either.
I reverse-engineered this from the C code in ISO/IEC JTC1/SC17 N 3497, so it's not pretty, but it does what you need:
public class CrcB
{
    const ushort __crcBDefault = 0xffff;

    private static ushort UpdateCrc(byte b, ushort crc)
    {
        unchecked
        {
            byte ch = (byte)(b ^ (byte)(crc & 0x00ff));
            ch = (byte)(ch ^ (ch << 4));
            return (ushort)((crc >> 8) ^ (ch << 8) ^ (ch << 3) ^ (ch >> 4));
        }
    }

    public static ushort ComputeCrc(byte[] bytes)
    {
        var res = __crcBDefault;
        foreach (var b in bytes)
            res = UpdateCrc(b, res);
        return (ushort)~res;
    }
}
As a test, try the code below:
public static void Main(string[] args)
{
    // test case 1: expected 0xFC, 0xD1
    var bytes = new byte[] { 0x0F, 0xAA, 0xFF };
    var crc = CrcB.ComputeCrc(bytes);
    var cbytes = BitConverter.GetBytes(crc);
    Console.WriteLine("First (0xFC): {0:X}\tSecond (0xD1): {1:X}", cbytes[0], cbytes[1]);

    // test case 2: expected 0xCC, 0xC6
    bytes = new byte[] { 0x00, 0x00, 0x00 };
    crc = CrcB.ComputeCrc(bytes);
    cbytes = BitConverter.GetBytes(crc);
    Console.WriteLine("First (0xCC): {0:X}\tSecond (0xC6): {1:X}", cbytes[0], cbytes[1]);
    Console.ReadLine();
}
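For reference, UpdateCrc() is the table-free byte-wise form of the reflected CRC-16 used by X.25/HDLC (polynomial 0x8408, initial value 0xFFFF, final complement), so a plain bit-by-bit sketch should produce the same values:
public static ushort ComputeCrcBitwise(byte[] bytes)
{
    ushort crc = 0xFFFF;            // initial value per ISO/IEC 14443-3 Type B
    foreach (var b in bytes)
    {
        crc ^= b;
        for (int i = 0; i < 8; i++) // LSB-first bit processing
            crc = (crc & 1) != 0
                ? (ushort)((crc >> 1) ^ 0x8408)
                : (ushort)(crc >> 1);
    }
    return (ushort)~crc;            // final complement
}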
